CN106504280A - A kind of method and terminal for browsing video - Google Patents
Method and terminal for browsing video
- Publication number
- CN106504280A (Application CN201610958157.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- layer
- target
- video
- original
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Studio Devices (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a method for browsing video, including: when a browsing instruction is received, calling a first layer and displaying, in the first layer, the original image captured when the video was shot; zooming the image corresponding to a sub-region of the original image to generate a target image, calling a second layer, and displaying the target image in the second layer; and displaying, through the superposition of the first layer and the second layer, a video picture containing the original image and the target image. An embodiment of the present invention also provides a terminal for realizing the method.
Description
Technical field
The present invention relates to electronic technology, and in particular to a method and terminal for browsing video.
Background technology
With the popularization of smartphones and the mobile Internet, shooting video no longer requires a professional camera: anyone can conveniently shoot video with a mobile phone and then upload it to the major social networking sites for sharing. However, the visual effect of video shot in the ordinary way is often unsatisfactory, so users hope the videos they shoot can be more personalized. As this demand grows, various video-processing apps have emerged one after another, such as Tencent Weishi and Meipai, which can add scenes to a video, adjust its tone, and so on. Yet almost all of these existing techniques apply a uniform treatment to the whole video picture; the space for user operation is limited, and every user's individual needs cannot be fully satisfied.
In addition, with existing video software, while a video is being recorded or played the terminal can only display to the user the entire raw picture that was captured; a local part of the picture cannot be presented with emphasis, so the user can only browse the whole captured raw picture. A technical scheme for browsing video is therefore urgently needed that can present a local part of the picture with emphasis while recording and playing video.
Summary of the invention
In view of this, an embodiment of the present invention provides a method and terminal for browsing video, which can present a local part of the picture with emphasis while recording and playing video.
The technical scheme of the embodiment of the present invention is realized as follows:
In one aspect, an embodiment of the present invention provides a method for browsing video. The method includes: when a browsing instruction is received, calling a first layer and displaying, in the first layer, the original image captured when the video was shot; zooming the image corresponding to a sub-region of the original image to generate a target image, calling a second layer, and displaying the target image in the second layer; and displaying, through the superposition of the first layer and the second layer, a video picture containing the original image and the target image.
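The zoom step described above can be sketched in pure Python. This is a minimal illustration, not the patented implementation: the row-of-pixels image representation, the function name, and the choice of nearest-neighbour scaling are all assumptions made for the example.

```python
def zoom_subregion(image, x, y, w, h, factor):
    """Nearest-neighbour zoom of the w*h sub-region of `image` at (x, y).

    `image` is a list of rows (lists) of pixel values; the returned
    target image has factor*h rows and factor*w columns, each target
    pixel copied from the source pixel it maps back onto.
    """
    target = []
    for ty in range(h * factor):
        row = []
        for tx in range(w * factor):
            # integer division maps each target pixel back to its source pixel
            row.append(image[y + ty // factor][x + tx // factor])
        target.append(row)
    return target

# A 4x4 "original image" with distinct pixel values (value 10*row + col)
frame = [[10 * r + c for c in range(4)] for r in range(4)]
# Zoom the 2x2 sub-region at (1, 1) by a factor of 2 -> a 4x4 target image
target = zoom_subregion(frame, 1, 1, 2, 2, 2)
```

Under this sketch, the target image shown in the second layer is simply a magnified copy of the selected sub-region, while the first layer continues to hold the untouched original frame.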
In another aspect, an embodiment of the present invention provides a terminal for browsing video, including an original-image module, a target-image module and a display module. The original-image module is configured, when a browsing instruction is received, to call a first layer and display, in the first layer, the original image captured when the video was shot. The target-image module is configured to zoom the image corresponding to a sub-region of the original image to generate a target image, call a second layer, and display the target image in the second layer. The display module is configured to display, through the superposition of the first layer and the second layer, a video picture containing the original image and the target image.
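The superposition performed by the display module can be illustrated with a small sketch. The compositing function, its name, and the pixel-grid representation are hypothetical; the point is only that the second (target) layer is drawn over the first (original) layer at some position, yielding one composited video picture.

```python
def overlay_layers(original, target, pos):
    """Composite the target-image layer over the original-image layer.

    `original` and `target` are lists of rows of pixel values; `pos` is
    the (x, y) offset of the target layer within the original layer.
    The returned frame is what the superposed display would show.
    """
    x0, y0 = pos
    frame = [row[:] for row in original]   # copy the first (base) layer
    for dy, row in enumerate(target):      # draw the second layer on top
        for dx, pixel in enumerate(row):
            frame[y0 + dy][x0 + dx] = pixel
    return frame

base = [[0] * 4 for _ in range(4)]         # original image layer (all zeros)
zoomed = [[9, 9], [9, 9]]                  # zoomed target image layer
composite = overlay_layers(base, zoomed, (2, 0))
```

Note that the base layer is copied rather than modified, mirroring the idea that the original image in the first layer is left intact and only the composited output changes.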
With the method and terminal for browsing video provided by the embodiments of the present invention, the first layer and the second layer correspond respectively to the captured original image and to the target image produced by zooming: the original image is displayed through the first layer, the target image through the second layer, and the two layers are superimposed, so that their superposition displays a video picture containing both the original image and the target image. Thus, when the terminal browses video in the course of recording or playing it, the use of the first and second layers displays a picture that includes both the original image and the target image, so that the captured picture and the local picture needing emphasis can be presented on the terminal's display interface at the same time. A local part of the picture is thereby presented with emphasis during recording and playback, the user's individual browsing needs are met, and the user's multimedia experience is improved.
Description of the drawings
Fig. 1-1 is a hardware architecture diagram of an optional mobile terminal for realizing each embodiment of the present invention;
Fig. 1-2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1-1;
Fig. 1-3 is a schematic flow chart of the method for browsing video in embodiment one of the present invention;
Fig. 2 is a schematic diagram of an original image and a target image provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the effect of the displayed shooting picture provided by embodiment one of the present invention;
Fig. 4 is a schematic flow chart of the method for browsing video in embodiment two of the present invention;
Fig. 5 is a schematic flow chart of the method for browsing video in embodiment three of the present invention;
Fig. 6 is a schematic flow chart of the method for browsing video in embodiment four of the present invention;
Fig. 7 is a schematic flow chart of the method for browsing video in embodiment five of the present invention;
Fig. 8 is a schematic diagram of the effect of the display interface in embodiment six of the present invention;
Fig. 9 is a schematic structural diagram of a terminal in embodiment seven of the present invention;
Fig. 10 is a schematic structural diagram of another terminal in embodiment seven of the present invention.
Specific embodiment
It should be appreciated that the specific embodiments described herein are only intended to explain the technical scheme of the present invention and are not used to limit its scope of protection.
The mobile terminal realizing each embodiment of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are only intended to aid the explanation of the present invention and have no specific meaning in themselves; "module" and "part" may therefore be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1-1 illustrates the hardware configuration of an optional mobile terminal for realizing each embodiment of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1-1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are discussed in more detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal and so on, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H), and so on. The broadcast receiving module 111 can receive signals broadcast using various types of broadcast systems. In particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the forward link media (MediaFLO) data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for the various broadcast systems providing broadcast signals as well as the above-mentioned digital broadcast systems. The broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless local area network, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module supporting short-range communication. Examples of short-range communication technology include Bluetooth™, radio-frequency identification (RFID), the Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal. A typical example of the location information module is GPS (global positioning system). According to current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information to accurately calculate three-dimensional current location information according to longitude, latitude and altitude. Currently, the method of calculating position and time information uses three satellites and corrects the error of the calculated position and time information using a further satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location in real time.
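The idea of deriving speed from continuously calculated positions can be sketched as follows. This is an illustrative great-circle (haversine) estimate from two successive fixes; the function name and the two-fix approach are assumptions for the example, not the GPS module's actual algorithm.

```python
import math

def speed_from_fixes(fix1, fix2, dt):
    """Estimate speed (m/s) from two successive (lat, lon) fixes, in
    degrees, taken dt seconds apart, using the haversine distance."""
    r = 6371000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*fix1, *fix2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    distance = 2 * r * math.asin(math.sqrt(a))  # great-circle distance
    return distance / dt

# One degree of latitude (~111 km) covered in one hour
v = speed_from_fixes((0.0, 0.0), (1.0, 0.0), 3600.0)
```

A real receiver would typically smooth over many fixes (or use Doppler measurements), but the principle is the same: distance between consecutive positions divided by the time between them.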
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by the image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 151, stored in the memory 160 (or another storage medium), or sent via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 can receive sound (audio data) via the microphone in operating modes such as a telephone call mode, a recording mode and a speech recognition mode, and can process such sound into audio data. In the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112. The microphone 122 can implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) the noise or interference produced in the course of receiving and sending audio signals.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component detecting changes of resistance, pressure, capacitance and so on caused by being touched), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen can be formed.
The processing unit 140 can receive and process the data input by the user input unit 130, can interact with the memory 160 to read and write data, and can display the read or processed data through the display unit 151.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device having the identification module (hereinafter referred to as the "identifying device") may take the form of a smart card, so the identifying device can be connected with the mobile terminal 100 via a port or other connecting means. The interface unit 170 can be used to receive input (for example, data information, electric power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which electric power is provided from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. The various command signals or the electric power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals in a visual, audio and/or tactile manner (for example, audio signals, video signals, vibration signals, etc.). The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the telephone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in the video call mode or the image capture mode, the display unit 151 may display captured images and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays may be constructed to be transparent to allow the user to view through them from the outside; these may be termed transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. According to the specific desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect the touch input pressure as well as the touch input position and the touch input area.
The audio output module 152 can, when the mobile terminal is in modes such as a call-signal reception mode, a call mode, a recording mode, a speech recognition mode or a broadcast reception mode, convert the audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function executed by the mobile terminal 100 (for example, a call-signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The memory 160 can store software programs for the processing and control operations executed by the controller 180, or temporarily store data that has been output or will be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data about the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disc and so on. Moreover, the mobile terminal 100 can cooperate, over a network connection, with a network storage device that executes the storage function of the memory 160.
The controller 180 usually controls the overall operation of the mobile terminal. For example, the controller 180 executes the control and processing related to voice calls, data communication, video calls and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing or playing back multimedia data; the multimedia module 181 may be constructed within the controller 180, or may be constructed separately from the controller 180. The controller 180 can also execute pattern recognition processing to recognize the handwriting input or picture-drawing input executed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate electric power required to operate each element and component.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to execute the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, an embodiment such as a process or function may be implemented with a separate software module that allows at least one function or operation to be executed. The software code may be implemented by a software application (or program) written in any appropriate programming language, and may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal among the various types of mobile terminals, such as folder-type, bar-type, swing-type and slide-type mobile terminals, will be taken as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in Fig. 1-1 may be constructed to operate with communication systems that transmit data via frames or packets, such as wired and wireless communication systems and satellite-based communication systems.
The communication systems in which the mobile terminal according to the present invention is operable are now described with reference to Fig. 1-2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include, for example, frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA) and the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching is equally applicable to other types of systems.
With reference to Fig. 1-2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be appreciated that the system as shown in Fig. 1-2 may include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or by an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, each frequency assignment having a specific spectrum (for example, 1.25 MHz, 5 MHz, etc.).
Intersecting that subregion and frequency are distributed can be referred to as CDMA Channel.BS270 can also be referred to as base station transceiver
System (BTS) or other equivalent terms.In this case, term " base station " can be used for broadly representing single
BSC275 and at least one BS270.Base station can also be referred to as " cellular station ".Or, each subregion of specific BS270 can be claimed
For multiple cellular stations.
As shown in Fig. 1-2, a broadcasting transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1-1 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. In Fig. 1-2, several Global Positioning System (GPS) satellites 300 are shown. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 1-2, a plurality of satellites 300 are depicted, but it is understood that useful position information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1-1 is typically configured to cooperate with the satellites 300 to obtain the desired position information. Instead of or in addition to GPS tracking techniques, other technologies capable of tracking the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may alternatively or additionally process satellite DMB transmissions.
As one typical operation of the wireless communication system, the BSs 270 receive reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communications. Each reverse-link signal received by a given base station 270 is processed within that BS 270, and the resulting data is forwarded to an associated BSC 275. The BSC provides call resource allocation and mobility management functionality, including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100.
Based on the above mobile terminal hardware configuration and communication system, the technical solutions of the embodiments of the method of the present invention are elaborated below.
Embodiment one:
Based on the foregoing embodiments, an embodiment of the present invention provides a method for browsing video. The method is applied to a terminal, and the functions implemented by the method can be realized by a processor in the terminal calling program code; the program code can be stored in a computer storage medium. It can thus be seen that the terminal includes at least a processor and a storage medium.
Fig. 1-3 is a schematic flowchart of the method for browsing video in the first embodiment of the present invention. The method can be applied to a terminal and, as shown in Fig. 1-3, includes:
S101: when a browsing instruction is received, calling a first layer and displaying the original image collected when shooting the video on the first layer;
When a user browses video through the terminal in the course of recording or playing a video, the user's operation issues a browsing instruction to the terminal. Here, the browsing instruction may include any instruction to display a video picture on the display interface, triggered by operations such as an operation of starting video recording or an operation of starting playback. For example, when the terminal receives the user's operation of starting video recording, the browsing instruction is received.
When the terminal receives the browsing instruction, the first layer is called, the original image collected when shooting the video is associated with the called first layer, the original image is attached to the first layer, and the original image is displayed through the first layer. Here, before the first layer is called, it may be determined whether the layer information of the first layer is stored in the terminal. When the layer information of the first layer is stored in the terminal, the layer information of the first layer is obtained, and the first layer is generated according to the obtained layer information; when the layer information of the first layer is not stored, the first layer is created. The layer information of the first layer is generated when the first layer is created, which may occur when video recording starts or during a previously recorded video.
Here, the original image collected when shooting the video is an image of the complete picture captured by the camera of the terminal. The size of the original image may match the size of the display screen of the terminal, or match the picture interface of the video software used to browse the video.
S102: performing zoom processing on the image corresponding to a partial region of the original image to generate a target image, calling a second layer, and displaying the target image on the second layer;
Here, while the original image is displayed through the first layer, the image corresponding to the partial region of the original image is subjected to zoom processing to obtain the target image, and the target image is attached to the called second layer.
It should be noted that when the second layer is called, it may be determined whether layer information of a second layer corresponding to the video currently being shot is stored in the terminal. When the layer information of the second layer is stored, it can be obtained directly and the second layer generated directly from it; when the layer information of the second layer is not stored, a new layer can be created and used as the second layer. Here, the size of the second layer may be the same as that of the first layer, or smaller than the first layer.
The target image is an image obtained by performing zoom processing on a local part of the original image. As the image corresponding to a partial region of the original image, its size is smaller than that of the original image. The partial region may be any local part of the original image, and its shape may be circular, square, or any other shape; here, the partial region is a local part of the display area corresponding to the original image, and its shape and size are not limited.
The zoom processing performed on the image corresponding to the partial region of the original image may be digital zoom. During digital zoom, each pixel area in the image of the partial region of the original image is enlarged, so that the image corresponding to the partial region is magnified; that is, the target image is the image obtained after magnifying the local part of the original image.
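Digital zoom of this kind can be sketched as a crop followed by pixel duplication. The sketch below uses plain 2-D lists and nearest-neighbour enlargement purely for illustration; a real terminal would use hardware-accelerated scaling.

```python
# Minimal digital-zoom sketch: crop a partial region of the original image and
# enlarge it by an integer factor via nearest-neighbour duplication.
# Images here are plain 2-D lists of pixel values (an assumption for clarity).

def digital_zoom(image, top, left, height, width, factor):
    """Crop image[top:top+height, left:left+width] and enlarge it factor-x."""
    crop = [row[left:left + width] for row in image[top:top + height]]
    zoomed = []
    for row in crop:
        expanded = [p for p in row for _ in range(factor)]   # widen each pixel
        zoomed.extend([list(expanded) for _ in range(factor)])  # repeat rows
    return zoomed
```

For example, zooming the top row of a 2x2 image by a factor of 2 yields a 2x4 target image in which each source pixel occupies a 2x2 block.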
In practical applications, when zoom processing is performed, zoom options may be provided through the display interface to offer the user a degree of zoom, and the magnification of the target image relative to the unprocessed image of the partial region is determined according to the user's selection. The degree of magnification may also be determined by the user's magnification operation, which may include a two-finger slide, a pressing operation, and so on; the specific form of the magnification operation is not limited here.
Here, the original image and the target image are illustrated in Fig. 2, where the original image includes a person's head and a hand holding a mobile phone. The hand holding the mobile phone is the partial region of the original image; performing digital zoom processing on the image of this region yields a target image showing the magnified hand holding the phone.
S103: displaying, through the superposition of the first layer and the second layer, a video picture that includes the original image and the target image.
After the first layer and the second layer have been called, the original image associated with the first layer, and the target image (obtained from the original image) associated with the second layer, the first layer and the second layer are superposed. Here, when the layers are superposed, the second layer may be adjusted according to the size of the target image. Specifically, the second layer may be resized according to the size of the target image so that the two sizes are consistent; alternatively, the regions of the second layer other than the region corresponding to the target image may be made transparent, so that after the superposition of the first layer and the second layer, when the second layer covers the first layer to display the original image and the target image simultaneously, the regions of the original image outside the target image are not obscured.
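The transparency-based superposition above can be sketched as a simple per-pixel overlay: wherever the second layer is transparent, the first layer's original image shows through. The pixel representation below, (value, alpha) pairs with binary alpha, is an illustrative assumption, not the disclosed format.

```python
# Sketch of layer superposition: the second layer is fully transparent outside
# the target-image region, so the first layer's original image shows through
# everywhere except under the target image. Pixels are (value, alpha) tuples
# with alpha 0 (transparent) or 1 (opaque); this representation is assumed.

def composite(first_layer, second_layer):
    """Overlay second_layer on first_layer; transparent pixels pass through."""
    out = []
    for row1, row2 in zip(first_layer, second_layer):
        out.append([p2 if a2 == 1 else p1
                    for (p1, _), (p2, a2) in zip(row1, row2)])
    return out
```

With a target image occupying only the top-left pixel of the second layer, the superposed picture shows that pixel from the second layer and the original image everywhere else.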
Here, in the browsing display interface, the region of the target image in the second layer may be referred to as the magnifier region. The magnifier region may be located at any position on the display interface and can be set according to the user's needs. When the magnifier region is located over the display area corresponding to the target image, it covers the image of the original partial region from which the target image was taken. When the magnifier region is located over a display area that does not cover the target object, as illustrated by the shot picture in Fig. 3 containing the original image and the target image, the target image after zoom processing, i.e., the magnifier region, is displayed in the upper-right corner of the display interface, so that it does not cover the corresponding unzoomed image. The position of the magnifier region on the interface can be moved on the display interface according to the user's operation.
In the embodiments of the present invention, when previewing a video, the local part of the display interface that needs to be emphasized can be taken as the partial region of the original image, zoom processing is performed on the image of that region to obtain the magnified target image, the first layer and the second layer are called, the original image and the target image are associated with the first layer and the second layer respectively, and through the superposition of the first layer and the second layer, a video picture containing both the original image and the target image can be displayed simultaneously.
In practical applications, the method for browsing video is applicable to displaying the video captured at the terminal interface during video recording, and is equally applicable to displaying the video shown at the terminal interface during playback. Here, a magnifier switch may be provided. When the magnifier switch is off, under its control only the original image collected during recording is displayed through the first layer on the terminal's display interface, so that the video the user sees is the unprocessed original video. When the magnifier switch is turned on, a magnifier instruction is generated, the second layer is called, the first layer and the second layer are superposed, and with the target image in the second layer, the original image and the target image are displayed simultaneously on the terminal's display interface through the first layer and the second layer, so that the video picture the user sees is composed of the original image and the target image. The magnifier switch may be a switch implemented in software, or a physical hardware switch.
Embodiment two:
Based on the foregoing embodiments, an embodiment of the present invention provides a method for browsing video. The method is applied to a terminal, and the functions implemented by the method can be realized by a processor in the terminal calling program code; the program code can be stored in a computer storage medium. It can thus be seen that the terminal includes at least a processor and a storage medium.
Fig. 4 is a schematic flowchart of the method for browsing video in the second embodiment of the present invention. As shown in Fig. 4, the method includes:
S401: when a browsing instruction is received, calling the first layer and displaying the original image collected when shooting the video on the first layer;
S402: when a preset target image operation is received, determining the target area according to the coordinate position corresponding to the preset target image operation;
Here, the preset target image operation may be a double-click operation, a slide operation with a certain track, or the like; the specific form of the operation is not limited.
When the preset target image operation is received, the terminal determines the target area according to the preset target operation. The target area can be determined according to the position of the target image operation. For example, when the preset target image operation is a double-click operation, the position of the target area can be determined from the coordinate at which the double-click operation corresponds to the display interface. Here, the size of the target area may be determined according to the pressure of the double-click operation; a fixed area size may be set and the target area determined centered on the coordinate of the double-click operation; or the image corresponding to the double-click operation may be automatically recognized, the complete image corresponding to the double-click operation determined through automatic recognition, and the region of that complete image taken as the target area. As another example, when the preset target operation is a slide operation with a certain track, the position of the target area can be determined according to the track of the slide operation, and the size of the target area determined according to the range corresponding to the track.
Here, the determined target area is a partial region of the original image.
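The fixed-size variant above, a region centred on the double-tap coordinate, can be sketched as follows. Clamping the region to the image bounds is an added detail assumed here so the target area always stays a partial region of the original image.

```python
# Hedged sketch: derive a fixed-size target area centred on the coordinate of
# a double-tap, clamped to stay inside the original image. The fixed region
# size and the clamping are assumptions drawn from the options described above.

def target_area_from_tap(x, y, region_w, region_h, image_w, image_h):
    """Return (left, top, width, height) of a region centred on (x, y)."""
    left = min(max(x - region_w // 2, 0), image_w - region_w)
    top = min(max(y - region_h // 2, 0), image_h - region_h)
    return (left, top, region_w, region_h)
```

A tap near a corner simply slides the region inward rather than letting it spill off the image.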
S403: taking the image within the target area of the original image as the image to be processed, performing zoom processing on the image to be processed to generate the target image, calling the second layer, and displaying the target image on the second layer;
After the target area is determined, the original image within the target area is extracted, the extracted image is taken as the image to be processed, and zoom processing is performed on the image to be processed to generate the target image. When the image to be processed is extracted, no processing is applied to the original image displayed through the first layer, so the display of the original image on the first layer is not affected. The second layer is called, and the target image is displayed on the second layer.
S404: displaying, through the superposition of the first layer and the second layer, a video picture that includes the original image and the target image.
With the method for browsing video provided by this embodiment of the present invention, when a key area is emphasized, the emphasized area is a region determined according to the user's selection. As the original image collected by the camera changes, the objects within the target area may change. Since the image to be processed for zoom processing is determined based on the target area, the image to be processed changes with the objects in the target area, making it possible to show the user the concrete picture at a fixed position within the camera picture.
Embodiment three:
Based on the foregoing embodiments, an embodiment of the present invention provides a method for browsing video. The method is applied to a terminal, and the functions implemented by the method can be realized by a processor in the terminal calling program code; the program code can be stored in a computer storage medium. It can thus be seen that the terminal includes at least a processor and a storage medium.
Fig. 5 is a schematic flowchart of the method for browsing video in the third embodiment of the present invention. As shown in Fig. 5, the method includes:
S501: when a preview instruction is received, calling the first layer and displaying the original image collected when shooting the video on the first layer;
S502: when a preset target image operation is received, determining the target object according to the position corresponding to the preset target image operation, calling the second layer, and associating the target image with the second layer;
Here, the preset target image operation may be a double-click operation, a slide operation with a certain track, or the like; the specific form of the operation is not limited.
When the preset target image operation is received, the terminal determines the target object according to the preset target operation. The target object can be determined according to the position of the target image operation. For example, when the preset target image operation is a double-click operation, the object displayed at the coordinate at which the double-click operation corresponds to the display interface can be determined and taken as the target object; here, when the object is determined, the target object corresponding to the preset target operation is determined within the currently displayed original image. The complete object corresponding to the double-click operation may be determined through automatic recognition and taken as the target object. As another example, when the preset target operation is a slide operation with a certain track, the region corresponding to the track can be determined according to the track of the slide operation, and the image within that region taken as the target object.
Here, the region to which the determined target object belongs is a partial region of the original image.
S503: taking the original image within the region to which the target object belongs as the image to be processed, performing zoom processing on the image to be processed to generate the target image, calling the second layer, and displaying the target image on the second layer;
After the target object is determined, the original image within the region to which the target object belongs is extracted, the extracted image is taken as the image to be processed, and zoom processing is performed on the image to be processed to generate the target image. When the image to be processed is extracted, no processing is applied to the original image displayed through the first layer, so the display of the original image on the first layer is not affected. The second layer is called, and the target image is displayed on the second layer.
S504: displaying, through the superposition of the first layer and the second layer, a video picture that includes the original image and the target image.
With the method for browsing video provided by this embodiment of the present invention, when a key area is emphasized, the emphasized area is the region to which the target object determined by the user's selection belongs. As the original image collected by the camera changes, the position of the target object within the original image may change; when the key area to be presented is determined based on the region to which the target object belongs, the target object to be highlighted is always present in the presented region. Therefore, when the image to be processed for zoom processing is determined based on the target object, the object in the image to be processed does not change as the original image changes, making it possible to show the user the concrete situation of a given fixed object within the camera picture.
In practical use, the target area determination of the second embodiment and the target object determination of the third embodiment may be combined to determine the image to be processed, so as to meet the personalized needs of users when browsing video.
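The contrast between the two selection modes can be sketched directly: a fixed target area keeps the same crop coordinates every frame, while a target-object region is recomputed from the object's position in each frame. Object recognition and tracking themselves are out of scope here; per-frame object positions are supplied as an assumption.

```python
# Sketch contrasting the two embodiments' selection modes. Frame contents and
# object positions are supplied externally; the tracking step is assumed.

def fixed_region(frames, region):
    """Second-embodiment style: the same crop coordinates in every frame."""
    return [region for _ in frames]

def object_region(object_positions, region_w, region_h):
    """Third-embodiment style: the crop follows the object's centre per frame."""
    return [(x - region_w // 2, y - region_h // 2, region_w, region_h)
            for (x, y) in object_positions]
```

In the first mode the magnified content changes as objects move through the fixed region; in the second, the region moves so the same object stays magnified.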
Embodiment four:
Based on the foregoing embodiments, an embodiment of the present invention provides a method for browsing video. The method is applied to a terminal, and the functions implemented by the method can be realized by a processor in the terminal calling program code; the program code can be stored in a computer storage medium. It can thus be seen that the terminal includes at least a processor and a storage medium.
Fig. 6 is a schematic flowchart of the method for browsing video in the fourth embodiment of the present invention. As shown in Fig. 6, the method includes:
S601: when a browsing instruction is received, calling the first layer and displaying the original image collected when shooting the video on the first layer;
S602: performing zoom processing on the image corresponding to the partial region of the original image to generate the target image, calling the second layer, and displaying the target image on the second layer;
S603: displaying, through the superposition of the first layer and the second layer, a video picture that includes the original image and the target image;
S604: when a store instruction is received, binding the layer information of the first layer with the layer information of the second layer, and separately storing the original video generated from the original image corresponding to the first layer and the target video generated from the target image corresponding to the second layer.
When the terminal receives a storage operation, the store instruction is issued according to the user's storage operation. Here, the storage operation may be a software switch operation, a gesture operation, a voice instruction, or the like, and is not specifically limited here.
When the store instruction is received, the layer information of the first layer is bound with the layer information of the second layer. Here, binding forms a correspondence between the layer information of the first layer and the layer information of the second layer, ensuring that the two layers can be displayed simultaneously during display, and characterizing the relationship between the image displayed by the first layer and the image displayed by the second layer as that between the original image and the target image. The layer information may include information related to the layer such as its size, the displayed video information, and timeline information.
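The binding step can be sketched as a record that associates the two layers' information so the pairing can be recovered later. The field names and dict layout below are assumptions for illustration, not the disclosed data format.

```python
# Illustrative sketch of binding the two layers' information on a store
# instruction: a record links the first layer's information (including its
# original-video file) with the second layer's information (including its
# target-video file). Field names are assumptions.

def bind_layers(first_info, second_info):
    """Produce a binding record associating the two layers' information."""
    return {
        "first_layer": first_info,    # e.g. size, timeline, video file
        "second_layer": second_info,
        "bound": True,
    }
```

The two video files themselves remain separate; only the record ties them together for simultaneous display.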
In the course of forming video streams from the displayed images, the video stream corresponding to the original image displayed by the first layer and the video stream corresponding to the target image displayed by the second layer are kept separate, so that two independent video files are obtained, the original video and the target video, which are stored separately.
Here, the picture of the target video is the magnified version of a local part of the picture of the original video. For example, when the picture of the original video shows a bee gathering nectar on a flower, the picture of the target video may show in detail how the bee gathers nectar during the process.
In this embodiment, the original video corresponding to the original picture and the target video corresponding to the target picture are stored separately, so the original picture of the video is not affected and can be played independently.
Embodiment five:
Based on the foregoing embodiments, an embodiment of the present invention provides a method for browsing video. The method is applied to a terminal, and the functions implemented by the method can be realized by a processor in the terminal calling program code; the program code can be stored in a computer storage medium. It can thus be seen that the terminal includes at least a processor and a storage medium.
Fig. 7 is a schematic flowchart of the method for browsing video in the fifth embodiment of the present invention. Here, the concrete application scenario of video playback after video recording is taken as an example. As shown in Fig. 7, the method includes:
S701: when a browsing instruction is received, calling the first layer and displaying the original image collected when shooting the video on the first layer;
S702: performing zoom processing on the image corresponding to the partial region of the original image to generate the target image, calling the second layer, and displaying the target image on the second layer;
S703: displaying, through the superposition of the first layer and the second layer, a video picture that includes the original image and the target image;
S704: when a store instruction is received, binding the layer information of the first layer with the layer information of the second layer, and separately storing the original video generated from the original image corresponding to the first layer and the target video generated from the target image corresponding to the second layer;
S705: when a play instruction is received, obtaining the layer information of the first layer and the layer information of the second layer bound with the layer information of the first layer; determining, respectively, the original video and the target video to be displayed according to the layer information of the first layer and the layer information of the second layer; and displaying, through the superposition of the first layer and the second layer, the original picture of the original video and the target picture of the target video.
Here, through steps S701 to S704, the video has been recorded and stored. When a play operation is received, a play instruction is issued. The video software used for playing the video may be the same as, or different from, the video software used for recording it.
When the play instruction is received, the video is played; here, the object played in response to the play instruction is the video shot in steps S701 to S704. During video playback, while the layer information of the first layer is obtained, the layer information of the second layer bound with the layer information of the first layer is also obtained. After the layer information of both layers is obtained, the original video to be displayed through the first layer and the target video to be displayed through the second layer can be determined; the original picture of the original video is displayed through the first layer, the target image of the target video is displayed through the second layer, and a video picture including the original image and the target image is displayed through the superposition of the first layer and the second layer.
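The playback path in S705 can be sketched as a lookup: from a stored binding record, recover both layers' information, resolve the two video files, and superpose them. The record layout (a dict with per-layer information holding a video file name) is an assumption for illustration.

```python
# Hedged sketch of the S705 playback path: resolve the original and target
# videos from a binding record that pairs the two layers' information.
# The record layout and field names are assumptions, not the disclosed format.

def prepare_playback(binding):
    """Resolve which video each layer plays from a binding record."""
    original = binding["first_layer"]["video"]
    target = binding["second_layer"]["video"]
    return {"first_layer_plays": original,
            "second_layer_plays": target,
            "display": "superposed"}
```

Because the pairing is recovered from the stored binding, the superposed playback picture matches what was previewed during recording.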
In this embodiment of the present invention, when the video is played, by applying the first layer and the second layer again, the same video picture that was previewed during recording can be played back, so that the picture of the video playback is consistent with the preview picture at the time of recording.
Embodiment six:
In this embodiment, the method for browsing video provided by the embodiments of the invention is described through a specific application scenario. Specifically, the original image captured by the camera is a picture of a bee gathering nectar, in which the picture includes objects such as the bee and a flower. When the specific action of the bee during nectar gathering needs to be highlighted, the region where the selected head is located can be taken as the object to be processed, zoom processing is performed on the object to be processed to obtain the target image, and the resulting target image is shown in the circular region in the upper-right corner of the display interface in Fig. 8. Here, the first layer displaying the original image and the second layer displaying the target image are superposed, a shot picture including the original image and the target image is displayed on the terminal's display interface, and the effect of the target area serving as a magnifier over the magnified region is shown on the display interface. Thus, in the process of shooting a video, a region of interest or a region whose features need highlighting can be locally magnified, and after the video recording is completed and saved, the local features are retained during playback.
Embodiment seven:
Based on the foregoing method embodiments, an embodiment of the present invention provides a terminal. As shown in Fig. 9, the terminal 900 includes an original image module 901, a target image module 902, and a display module 903, where:
the original image module 901 is configured to, when a browsing instruction is received, call the first layer and display the original image collected when shooting the video on the first layer;
the target image module 902 is configured to perform zoom processing on the image corresponding to the partial region of the original image to generate the target image, call the second layer, and display the target image on the second layer;
the display module 903 is configured to display, through the superposition of the first layer and the second layer, a video picture that includes the original image and the target image.
As shown in Fig. 10, the terminal 900 further includes a target area determining module 904, configured to, when the preset target image operation is received, determine the target area according to the coordinate position corresponding to the preset target image operation.
In this case, the target image module 902 performing zoom processing on the image corresponding to the partial region of the original image to generate the target image includes: taking the image within the target area of the original image as the image to be processed, and performing zoom processing on the image to be processed to generate the target image.
As shown in Fig. 10, the terminal 900 further includes a target object determining module 905, configured to, when the preset target image operation is received, determine the target object according to the position corresponding to the preset target image operation.
In this case, the target image module 902 performing zoom processing on the image corresponding to the partial region of the original image to generate the target image includes: taking the original image within the region to which the target object belongs as the image to be processed, and performing zoom processing on the image to be processed to generate the target image.
As shown in Figure 10, the terminal 900 further includes a storage module 906, configured to:
when a store instruction is received, bind the layer information of the first layer with the layer information of the second layer; and
store, respectively, the original video generated from the original images corresponding to the first layer and the target video generated from the target images corresponding to the second layer.
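The binding of the two layers' layer information can be sketched as serialising a record that ties them together, so the two separately stored videos can later be matched up (the JSON format, field names, and function names are assumptions for illustration only):

```python
import json

def bind_layer_info(first_layer_info, second_layer_info):
    """Produce a binding record that ties the first layer's information
    to the second layer's, e.g. which video file each layer produced."""
    return json.dumps({"first_layer": first_layer_info,
                       "second_layer": second_layer_info})

def resolve_binding(record):
    """Recover the bound pair of layer information from a stored record."""
    binding = json.loads(record)
    return binding["first_layer"], binding["second_layer"]
```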
As shown in Figure 10, the terminal 900 further includes a playing module 907, configured to:
when a play instruction is received, obtain the layer information of the first layer and the layer information of the second layer bound to the layer information of the first layer;
determine, respectively, the original video and the target video to be displayed according to the layer information of the first layer and the layer information of the second layer; and
display, by overlaying the first layer and the second layer, a video picture that includes the original image of the original video and the target image of the target video.
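Playback can be sketched as looking up the bound pair of layer information to locate the two stored videos that are then overlaid for display (the store layout, with each layer recording its bound partner and its video path, is an assumption for illustration):

```python
def videos_for_playback(layer_store, first_layer_id):
    """Given a store mapping layer ids to {'bound_to': id, 'video': path},
    return the original and target video paths to overlay during playback."""
    first = layer_store[first_layer_id]
    second = layer_store[first["bound_to"]]
    return first["video"], second["video"]
```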
It should be noted that the original image module 901 and the target image module 902 involved in the embodiments of the present invention may be implemented by the processing unit 140 shown in Fig. 1; the display module 903 may be implemented by the display unit 151 shown in Fig. 1; the target area determining module 904 and the destination object determining module 905 may be implemented by the user input unit 130 and the processing unit 140 shown in Fig. 1; the storage module 906 may be implemented by the memory 160 shown in Fig. 1; and the playing module 907 may be implemented by the processing unit 140 and the multimedia module 181 shown in Fig. 1.
It should also be noted that the above description of the apparatus embodiments is similar to the description of the method embodiments, and the apparatus embodiments have beneficial effects similar to those of the method embodiments, which are therefore not repeated. For technical details not disclosed in the apparatus embodiments of the present invention, those skilled in the art may refer to the description of the method embodiments of the present invention; to save space, they are not repeated here.
It should be understood that references throughout this specification to "one embodiment" or "an embodiment" mean that a particular feature, structure, or characteristic related to the embodiment is included in at least one embodiment of the present invention. Therefore, the appearances of "in one embodiment" or "in an embodiment" in various places throughout this specification do not necessarily refer to the same embodiment. Furthermore, these particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should also be understood that, in the various embodiments of the present invention, the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention. The sequence numbers of the embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
It should be noted that, herein, the terms "comprising", "including", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of units is only a division of logical functions, and other division modes are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units; some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, all functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may serve individually as one unit, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (Read-Only Memory, ROM), a magnetic disk, or an optical disc.
Alternatively, if the above integrated unit of the present invention is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present invention, in essence, or the part contributing to the prior art, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a ROM, a magnetic disk, or an optical disc.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can easily conceive of changes or replacements within the technical scope disclosed by the present invention, and all such changes or replacements shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.
Claims (10)
1. A method for browsing video, characterised in that the method comprises:
when a browse instruction is received, invoking a first layer, and displaying, in the first layer, the original image captured during video shooting;
performing zoom processing on the image corresponding to a partial region of the original image to generate a target image, invoking a second layer, and displaying the target image in the second layer; and
displaying, by overlaying the first layer and the second layer, a video picture comprising the original image and the target image.
2. The method according to claim 1, characterised in that the method further comprises:
when a preset target image operation is received, determining the target area according to the coordinate position corresponding to the preset target image operation;
wherein performing zoom processing on the image corresponding to the partial region of the original image to generate the target image comprises:
taking the image within the target area of the original image as the image to be processed, and performing zoom processing on the image to be processed to generate the target image.
3. The method according to claim 1, characterised in that the method further comprises:
when a preset target image operation is received, determining the destination object according to the position corresponding to the preset target image operation;
wherein performing zoom processing on the image corresponding to the partial region of the original image to generate the target image comprises:
taking the original image in the region to which the destination object belongs as the image to be processed, and performing zoom processing on the image to be processed to generate the target image.
4. The method according to claim 1, characterised in that the method further comprises:
when a store instruction is received, binding the layer information of the first layer with the layer information of the second layer; and
storing, respectively, the original video generated from the original images corresponding to the first layer and the target video generated from the target images corresponding to the second layer.
5. The method according to claim 4, characterised in that the method further comprises:
when a play instruction is received, obtaining the layer information of the first layer and the layer information of the second layer bound to the layer information of the first layer;
determining, respectively, the original video and the target video to be displayed according to the layer information of the first layer and the layer information of the second layer; and
displaying, by overlaying the first layer and the second layer, a video picture comprising the original image of the original video and the target image of the target video.
6. A terminal, characterised in that the terminal comprises: an original image module, a target image module, and a display module; wherein,
the original image module is configured to, when a browse instruction is received, invoke a first layer and display, in the first layer, the original image captured during video shooting;
the target image module is configured to perform zoom processing on the image corresponding to a partial region of the original image to generate a target image, invoke a second layer, and display the target image in the second layer; and
the display module is configured to display, by overlaying the first layer and the second layer, a video picture comprising the original image and the target image.
7. The terminal according to claim 6, characterised in that the terminal further comprises: a target area determining module, configured to, when a preset target image operation is received, determine the target area according to the coordinate position corresponding to the preset target image operation;
wherein the target image module performing zoom processing on the image corresponding to the partial region of the original image to generate the target image comprises:
taking the image within the target area of the original image as the image to be processed, and performing zoom processing on the image to be processed to generate the target image.
8. The terminal according to claim 6, characterised in that the terminal further comprises: a destination object determining module, configured to, when a preset target image operation is received, determine the destination object according to the position corresponding to the preset target image operation;
wherein the target image module performing zoom processing on the image corresponding to the partial region of the original image to generate the target image comprises:
taking the original image in the region to which the destination object belongs as the image to be processed, and performing zoom processing on the image to be processed to generate the target image.
9. The terminal according to claim 6, characterised in that the terminal further comprises a storage module, configured to:
when a store instruction is received, bind the layer information of the first layer with the layer information of the second layer; and
store, respectively, the original video generated from the original images corresponding to the first layer and the target video generated from the target images corresponding to the second layer.
10. The terminal according to claim 9, characterised in that the terminal further comprises a playing module, configured to:
when a play instruction is received, obtain the layer information of the first layer and the layer information of the second layer bound to the layer information of the first layer;
determine, respectively, the original video and the target video to be displayed according to the layer information of the first layer and the layer information of the second layer; and
display, by overlaying the first layer and the second layer, a video picture comprising the original image of the original video and the target image of the target video.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2016109072964 | 2016-10-17 | ||
CN201610907296 | 2016-10-17 | ||
CN201610941145 | 2016-10-25 | ||
CN2016109411450 | 2016-10-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106504280A true CN106504280A (en) | 2017-03-15 |
Family
ID=57892765
Family Applications (11)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610946035.3A Pending CN106534674A (en) | 2016-10-17 | 2016-11-02 | Method for displaying focus area and mobile terminal |
CN201610946668.4A Active CN106502693B (en) | 2016-10-17 | 2016-11-02 | A kind of image display method and device |
CN201610946494.1A Active CN106572302B (en) | 2016-10-17 | 2016-11-02 | A kind of image information processing method and equipment |
CN201610945947.9A Pending CN106375595A (en) | 2016-10-17 | 2016-11-02 | Auxiliary focusing apparatus and method |
CN201610958160.6A Pending CN106572249A (en) | 2016-10-17 | 2016-11-02 | Region enlargement method and apparatus |
CN201610946623.7A Active CN106375596B (en) | 2016-10-17 | 2016-11-02 | Device and method for prompting focusing object |
CN201610944748.6A Active CN106453924B (en) | 2016-10-17 | 2016-11-02 | A kind of image capturing method and device |
CN201610947313.7A Pending CN106534675A (en) | 2016-10-17 | 2016-11-02 | Method and terminal for microphotography background blurring |
CN201610952598.3A Active CN106412324B (en) | 2016-10-17 | 2016-11-02 | Device and method for prompting focusing object |
CN201610947314.1A Active CN106572303B (en) | 2016-10-17 | 2016-11-02 | Picture processing method and terminal |
CN201610958157.4A Pending CN106504280A (en) | 2016-10-17 | 2016-11-02 | A kind of method and terminal for browsing video |
Country Status (1)
Country | Link |
---|---|
CN (11) | CN106534674A (en) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106909274B (en) * | 2017-02-27 | 2020-12-15 | 南京车链科技有限公司 | Image display method and device |
CN106973164B (en) * | 2017-03-30 | 2019-03-01 | 维沃移动通信有限公司 | A kind of take pictures weakening method and the mobile terminal of mobile terminal |
CN107145285B (en) * | 2017-05-12 | 2019-12-03 | 维沃移动通信有限公司 | A kind of information extracting method and terminal |
CN107222676B (en) | 2017-05-26 | 2020-06-02 | Tcl移动通信科技(宁波)有限公司 | Blurred picture generation method, storage device and mobile terminal |
CN107247535B (en) * | 2017-05-31 | 2021-11-30 | 北京小米移动软件有限公司 | Intelligent mirror adjusting method and device and computer readable storage medium |
JP6856914B2 (en) * | 2017-07-18 | 2021-04-14 | ハンジョウ タロ ポジショニング テクノロジー カンパニー リミテッドHangzhou Taro Positioning Technology Co.,Ltd. | Intelligent object tracking |
CN107613202B (en) * | 2017-09-21 | 2020-03-10 | 维沃移动通信有限公司 | Shooting method and mobile terminal |
CN107807770A (en) * | 2017-09-27 | 2018-03-16 | 阿里巴巴集团控股有限公司 | A kind of screenshot method, device and electronic equipment |
WO2019113746A1 (en) * | 2017-12-11 | 2019-06-20 | 深圳市大疆创新科技有限公司 | Manual-focus prompt method, control apparatus, photography device, and controller |
CN108536364A (en) * | 2017-12-28 | 2018-09-14 | 努比亚技术有限公司 | A kind of image pickup method, terminal and computer readable storage medium |
CN108093181B (en) * | 2018-01-16 | 2021-03-30 | 奇酷互联网络科技(深圳)有限公司 | Picture shooting method and device, readable storage medium and mobile terminal |
CN108471524B (en) * | 2018-02-28 | 2020-08-07 | 北京小米移动软件有限公司 | Focusing method and device and storage medium |
CN108495029B (en) | 2018-03-15 | 2020-03-31 | 维沃移动通信有限公司 | Photographing method and mobile terminal |
CN110349223B (en) * | 2018-04-08 | 2021-04-30 | 中兴通讯股份有限公司 | Image processing method and device |
CN108876782A (en) * | 2018-06-27 | 2018-11-23 | Oppo广东移动通信有限公司 | Recall video creation method and relevant apparatus |
CN109816485B (en) * | 2019-01-17 | 2021-06-15 | 口碑(上海)信息技术有限公司 | Page display method and device |
CN109648568B (en) * | 2019-01-30 | 2022-01-04 | 深圳镁伽科技有限公司 | Robot control method, system and storage medium |
CN110908558B (en) * | 2019-10-30 | 2022-10-18 | 维沃移动通信(杭州)有限公司 | Image display method and electronic equipment |
CN112770042B (en) * | 2019-11-05 | 2022-11-15 | RealMe重庆移动通信有限公司 | Image processing method and device, computer readable medium, wireless communication terminal |
CN110896451B (en) * | 2019-11-20 | 2022-01-28 | 维沃移动通信有限公司 | Preview picture display method, electronic device and computer readable storage medium |
CN111026316A (en) * | 2019-11-25 | 2020-04-17 | 维沃移动通信有限公司 | Image display method and electronic equipment |
CN111182211B (en) * | 2019-12-31 | 2021-09-24 | 维沃移动通信有限公司 | Shooting method, image processing method and electronic equipment |
CN112584043B (en) * | 2020-12-08 | 2023-03-24 | 维沃移动通信有限公司 | Auxiliary focusing method and device, electronic equipment and storage medium |
CN114666490B (en) * | 2020-12-23 | 2024-02-09 | 北京小米移动软件有限公司 | Focusing method, focusing device, electronic equipment and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101202873A (en) * | 2006-12-13 | 2008-06-18 | 株式会社日立制作所 | Method and device for information record reproduction |
CN101247489A (en) * | 2008-03-20 | 2008-08-20 | 南京大学 | Method for detail real-time replay of digital television |
CN104618627A (en) * | 2014-12-31 | 2015-05-13 | 小米科技有限责任公司 | Video processing method and device |
CN104836956A (en) * | 2015-05-09 | 2015-08-12 | 陈包容 | Processing method and device for cellphone video |
CN104883619A (en) * | 2015-05-12 | 2015-09-02 | 广州酷狗计算机科技有限公司 | System, method and device for recommending audio and video content |
CN105512136A (en) * | 2014-09-25 | 2016-04-20 | 中兴通讯股份有限公司 | Method and device for processing based on layer |
CN105578275A (en) * | 2015-12-16 | 2016-05-11 | 小米科技有限责任公司 | Video display method and apparatus |
CN105611145A (en) * | 2015-09-21 | 2016-05-25 | 宇龙计算机通信科技(深圳)有限公司 | Multi-graphic layer shooting method, multi-graphic layer shooting apparatus and terminal |
Family Cites Families (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN87200129U (en) * | 1987-01-08 | 1988-01-27 | 李传琪 | Multifunction enlarger |
JP4233624B2 (en) * | 1997-12-26 | 2009-03-04 | カシオ計算機株式会社 | Electronic camera device |
JP2003143144A (en) * | 2001-11-01 | 2003-05-16 | Matsushita Electric Ind Co Ltd | Transmission system and method for detecting delay amount of signal propagation |
JP2004064259A (en) * | 2002-07-26 | 2004-02-26 | Kyocera Corp | System for confirming focus of digital camera |
JP4012015B2 (en) * | 2002-08-29 | 2007-11-21 | キヤノン株式会社 | Image forming apparatus |
JP2006295242A (en) * | 2005-04-05 | 2006-10-26 | Olympus Imaging Corp | Digital camera |
JP4678603B2 (en) * | 2007-04-20 | 2011-04-27 | 富士フイルム株式会社 | Imaging apparatus and imaging method |
JP4961282B2 (en) * | 2007-07-03 | 2012-06-27 | キヤノン株式会社 | Display control apparatus and control method thereof |
CN101398527B (en) * | 2007-09-27 | 2011-09-21 | 联想(北京)有限公司 | Method for implementing zooming-in function on photo terminal and photo terminal thereof |
JP5173453B2 (en) * | 2008-01-22 | 2013-04-03 | キヤノン株式会社 | Imaging device and display control method of imaging device |
JP2010041175A (en) * | 2008-08-01 | 2010-02-18 | Olympus Imaging Corp | Image reproducing apparatus, image reproducing method, and program |
CN101778214B (en) * | 2009-01-09 | 2011-08-31 | 华晶科技股份有限公司 | Digital image pick-up device having brightness and focusing compensation function and image compensation method thereof |
JP5361528B2 (en) * | 2009-05-15 | 2013-12-04 | キヤノン株式会社 | Imaging apparatus and program |
CN101895723A (en) * | 2009-05-22 | 2010-11-24 | 深圳市菲特数码技术有限公司 | Monitoring device |
JP5460173B2 (en) * | 2009-08-13 | 2014-04-02 | 富士フイルム株式会社 | Image processing method, image processing apparatus, image processing program, and imaging apparatus |
JP5538992B2 (en) * | 2010-04-27 | 2014-07-02 | キヤノン株式会社 | Imaging apparatus and control method thereof |
CN102289336A (en) * | 2010-06-17 | 2011-12-21 | 昆达电脑科技(昆山)有限公司 | picture management system and method |
JP5546692B2 (en) * | 2011-09-30 | 2014-07-09 | 富士フイルム株式会社 | Imaging apparatus, imaging method, and program |
CN103842907A (en) * | 2011-09-30 | 2014-06-04 | 富士胶片株式会社 | Imaging device for three-dimensional image and image display method for focus state confirmation |
JP2013093819A (en) * | 2011-10-05 | 2013-05-16 | Sanyo Electric Co Ltd | Electronic camera |
JP5936404B2 (en) * | 2012-03-23 | 2016-06-22 | キヤノン株式会社 | Imaging apparatus, control method thereof, and program |
CN103366352B (en) * | 2012-03-30 | 2017-09-22 | 北京三星通信技术研究有限公司 | Apparatus and method for producing the image that background is blurred |
CN102932541A (en) * | 2012-10-25 | 2013-02-13 | 广东欧珀移动通信有限公司 | Mobile phone photographing method and system |
CN106027910B (en) * | 2013-01-22 | 2019-08-16 | 华为终端有限公司 | Preview screen rendering method, device and terminal |
CN103135927B (en) * | 2013-01-25 | 2015-09-30 | 广东欧珀移动通信有限公司 | A kind of mobile terminal rapid focus photographic method and system |
CN104104787B (en) * | 2013-04-12 | 2016-12-28 | 上海果壳电子有限公司 | Photographic method, system and handheld device |
CN103211621B (en) * | 2013-04-27 | 2015-07-15 | 上海市杨浦区中心医院 | Ultrasound directed texture quantitative measuring instrument and method thereof |
WO2015058381A1 (en) * | 2013-10-23 | 2015-04-30 | 华为终端有限公司 | Method and terminal for selecting image from continuous images |
CN103595919B (en) * | 2013-11-15 | 2015-08-26 | 努比亚技术有限公司 | Manual focus method and filming apparatus |
CN103631599B (en) * | 2013-12-11 | 2017-12-12 | Tcl通讯(宁波)有限公司 | One kind is taken pictures processing method, system and mobile terminal |
CN104731494B (en) * | 2013-12-23 | 2019-05-31 | 中兴通讯股份有限公司 | A kind of method and apparatus of preview interface selection area amplification |
JP6151176B2 (en) * | 2013-12-27 | 2017-06-21 | 株式会社 日立産業制御ソリューションズ | Focus control apparatus and method |
CN103777865A (en) * | 2014-02-21 | 2014-05-07 | 联想(北京)有限公司 | Method, device, processor and electronic device for displaying information |
CN104333689A (en) * | 2014-03-05 | 2015-02-04 | 广州三星通信技术研究有限公司 | Method and device for displaying preview image during shooting |
CN103929596B (en) * | 2014-04-30 | 2016-09-14 | 努比亚技术有限公司 | Guide the method and device of shooting composition |
CN104038699B (en) * | 2014-06-27 | 2016-04-06 | 努比亚技术有限公司 | The reminding method of focusing state and filming apparatus |
CN104023172A (en) * | 2014-06-27 | 2014-09-03 | 深圳市中兴移动通信有限公司 | Shooting method and shooting device of dynamic image |
CN104243825B (en) * | 2014-09-22 | 2017-11-14 | 广东欧珀移动通信有限公司 | A kind of mobile terminal Atomatic focusing method and system |
CN104243827A (en) * | 2014-09-23 | 2014-12-24 | 深圳市中兴移动通信有限公司 | Shooting method and device |
EP3018892A1 (en) * | 2014-10-31 | 2016-05-11 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
CN105872349A (en) * | 2015-01-23 | 2016-08-17 | 中兴通讯股份有限公司 | Photographing method, photographing device and mobile terminal |
CN104660913B (en) * | 2015-03-18 | 2016-08-24 | 努比亚技术有限公司 | Focus adjustment method and apparatus |
CN104702846B (en) * | 2015-03-20 | 2018-05-08 | 惠州Tcl移动通信有限公司 | Mobile terminal camera preview image processing method and system |
CN104754227A (en) * | 2015-03-26 | 2015-07-01 | 广东欧珀移动通信有限公司 | Method and device for shooting video |
CN104954672B (en) * | 2015-06-10 | 2020-06-02 | 惠州Tcl移动通信有限公司 | Manual focusing method of mobile terminal and mobile terminal |
CN105100615B (en) * | 2015-07-24 | 2019-02-26 | 青岛海信移动通信技术股份有限公司 | A kind of method for previewing of image, device and terminal |
CN105141858B (en) * | 2015-08-13 | 2018-10-12 | 上海斐讯数据通信技术有限公司 | The background blurring system and method for photo |
CN105843501B (en) * | 2016-02-03 | 2019-11-29 | 维沃移动通信有限公司 | A kind of method of adjustment and mobile terminal of parameter of taking pictures |
CN105979165B (en) * | 2016-06-02 | 2019-02-05 | Oppo广东移动通信有限公司 | Blur photograph generation method, device and mobile terminal |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109963200A (en) * | 2017-12-25 | 2019-07-02 | 上海全土豆文化传播有限公司 | Video broadcasting method and device |
CN108989674A (en) * | 2018-07-26 | 2018-12-11 | 努比亚技术有限公司 | A kind of browsing video method, terminal and computer readable storage medium |
CN109525888A (en) * | 2018-09-28 | 2019-03-26 | Oppo广东移动通信有限公司 | Image display method, device, electronic equipment and storage medium |
CN110333813A (en) * | 2019-05-30 | 2019-10-15 | 平安科技(深圳)有限公司 | Method, electronic device and the computer readable storage medium of invoice picture presentation |
CN111355998A (en) * | 2019-07-23 | 2020-06-30 | 杭州海康威视数字技术股份有限公司 | Video processing method and device |
CN113132618A (en) * | 2019-12-31 | 2021-07-16 | 华为技术有限公司 | Auxiliary photographing method and device, terminal equipment and storage medium |
CN111526425A (en) * | 2020-04-26 | 2020-08-11 | 北京字节跳动网络技术有限公司 | Video playing method and device, readable medium and electronic equipment |
CN111526425B (en) * | 2020-04-26 | 2022-08-09 | 北京字节跳动网络技术有限公司 | Video playing method and device, readable medium and electronic equipment |
CN111722775A (en) * | 2020-06-24 | 2020-09-29 | 维沃移动通信(杭州)有限公司 | Image processing method, device, equipment and readable storage medium |
WO2021259185A1 (en) * | 2020-06-24 | 2021-12-30 | 维沃移动通信有限公司 | Image processing method and apparatus, device, and readable storage medium |
CN112188260A (en) * | 2020-10-26 | 2021-01-05 | 咪咕文化科技有限公司 | Video sharing method, electronic device and readable storage medium |
CN116055869A (en) * | 2022-05-30 | 2023-05-02 | 荣耀终端有限公司 | Video processing method and terminal |
CN116055869B (en) * | 2022-05-30 | 2023-10-20 | 荣耀终端有限公司 | Video processing method and terminal |
Also Published As
Publication number | Publication date |
---|---|
CN106375596A (en) | 2017-02-01 |
CN106412324A (en) | 2017-02-15 |
CN106502693B (en) | 2019-07-19 |
CN106572249A (en) | 2017-04-19 |
CN106572302B (en) | 2019-07-30 |
CN106572303B (en) | 2020-02-18 |
CN106453924B (en) | 2019-11-15 |
CN106412324B (en) | 2020-02-14 |
CN106375595A (en) | 2017-02-01 |
CN106534675A (en) | 2017-03-22 |
CN106453924A (en) | 2017-02-22 |
CN106375596B (en) | 2020-04-24 |
CN106502693A (en) | 2017-03-15 |
CN106572302A (en) | 2017-04-19 |
CN106534674A (en) | 2017-03-22 |
CN106572303A (en) | 2017-04-19 |
Similar Documents
Publication | Title |
---|---|
CN106504280A (en) | A kind of method and terminal for browsing video | |
CN106453538A (en) | Screen sharing apparatus and method | |
CN106878464A (en) | A kind of document display method and device | |
CN106888349A (en) | A kind of image pickup method and device | |
CN106909274A (en) | A kind of method for displaying image and device | |
CN106155694A (en) | A kind of display method and device for an application clone | |
CN106201252A (en) | A kind of map display device and method for a mobile terminal | |
CN106686213A (en) | Shooting method and apparatus thereof | |
CN107071329A (en) | The method and device of automatic switchover camera in video call process | |
CN106791455A (en) | Panorama shooting method and device | |
CN106658159A (en) | Control method and first electronic equipment, and target equipment | |
CN106850941A (en) | Photo taking method and device | |
CN106909681A (en) | A kind of information processing method and its device | |
CN106506858A (en) | Satellite orbit prediction method and device | |
CN107018334A (en) | A kind of applied program processing method and device based on dual camera | |
CN106372264A (en) | Map data migration device and method | |
CN106373110A (en) | Method and device for image fusion | |
CN106302992A (en) | A kind of mobile terminal and screen lighting method | |
CN105183830B (en) | picture browsing method and device | |
CN104731484B (en) | Method and device for viewing pictures | |
CN106453542A (en) | Screen sharing apparatus and method | |
CN106843684A (en) | A kind of device and method for editing on-screen text, and mobile terminal | |
CN106657619A (en) | Screenshot method and device | |
CN106603859A (en) | Photo filter processing method, device and terminal | |
CN106777251A (en) | A kind of file management method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20170315 |