WO2022156473A1 - Video playback method and electronic device - Google Patents

Video playback method and electronic device

Info

Publication number
WO2022156473A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
video clip
window
electronic device
playback
Prior art date
Application number
PCT/CN2021/140541
Other languages
English (en)
Chinese (zh)
Inventor
苏达
张韵叠
于远灏
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022156473A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85: Assembly of content; Generation of multimedia applications
    • H04N21/854: Content authoring
    • H04N21/8549: Creating video summaries, e.g. movie trailer
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations

Definitions

  • the present application relates to the field of electronic technology, and in particular, to a method and electronic device for playing video.
  • a variety of shooting techniques may be used in the process of video shooting.
  • the process of video shooting is often accompanied by lens movement and transformation, commonly referred to as "camera movement".
  • common camera movement methods include pushing in, pulling out, panning, tracking, and so on.
  • in the video, the effects of different camera movement methods appear as zooming of the picture, the subject approaching or receding, translation, rotation, and the like, which can enhance the atmosphere and emotion of the video.
  • when a user plays a target video, the target video is usually displayed in a pop-up window.
  • the pop-up window may occupy a specific display area of the display screen, or may be displayed in full screen on the display screen.
  • in either case, the window is expanded in a single way, and the user experience is poor.
  • the present application provides a method and electronic device for playing video.
  • the electronic device may include mobile phones, tablets, computers, and other devices with display screens.
  • the method can provide users with a coherent immersive experience and improve the user's visual experience.
  • a method for playing a video is provided, applied to an electronic device including a display screen. The method includes: displaying a video list interface, the video list interface including thumbnails of one or more video clips; receiving a user's playback operation on a thumbnail of a first video clip, where the first video clip is any one of the one or more video clips; in response to the playback operation, obtaining a first camera movement type corresponding to the first video clip within a first duration; and, according to the first camera movement type, expanding a play window of the first video clip in a first expansion manner and playing the first video clip in the play window.
  • the embodiments of the present application can be applied to the video playback process: by detecting the camera movement type used within a specific duration at the beginning of the target video, and matching an associated opening animation according to that camera movement type, the playback window of the target video is expanded in an associated expansion manner during playback, presenting different visual effects to the user.
  • in this way, the playback window of the target video has a dynamic change effect and can be accompanied by dynamic changes in the size and transparency of the background picture, which further provides users with a coherent immersive experience and improves the user's visual experience.
  • the playback effect of the opening animation is associated with the playback effect of the segment of a specific duration at the beginning of the target video; that is, different shooting methods may produce different opening-animation playback effects.
  • the target video selected by the user is played in a video playback window, and the video playback window can be presented to the user in different expansion manners (the expansion process is called the "opening animation"); finally, the target video is played in a video playback window of a fixed size. This avoids the limitation in existing solutions that the video playback window can only pop up as a fixed window.
  • the content played in the video playback window during the opening animation may be the picture of a specific duration (that is, the first duration) at the beginning of the target video; in other words, the content played by the "opening animation" is the first S seconds, or the first N frames, of the target video, and the "first duration" is the duration corresponding to the first S seconds or the first N frames of the first video clip.
  • S and N may be preset fixed values or values set by the user; the embodiments of the present application do not limit the duration of the opening animation.
  • for ease of description, the window that plays the "opening animation" is referred to as the "play window".
  • the play window may be the same window as the window that plays the target video; in that case the opening animation can be understood as the shape-change process of the playback window of the target video. Alternatively, the play window may be a different window from the window that plays the target video, and after the opening animation is played, the display jumps to the playback window and continues to play the target video. This is not limited in the embodiments of the present application.
  • the playback effect of the segment of a specific duration at the beginning of the target video can be understood as the effect presented to the user during playback after the photographer used different camera movement methods and other shooting skills within that duration while shooting the target video.
  • the first expansion manner includes a change in the size of the playback window of the first video clip and/or a change in the position of the playback window of the first video clip.
  • the method further includes: acquiring motion type information corresponding to each of the one or more video clips on the video list interface within the first duration; and determining the first camera movement type corresponding to the first video clip within the first duration in response to the playback operation, including: in response to the playback operation, determining the first video clip from the one or more video clips, and determining the motion type information corresponding to the first video clip within the first duration as the first camera movement type.
  • the motion type information corresponding to each video clip within the first duration is obtained by the electronic device through real-time or periodic motion detection; and/or the motion type information corresponding to each video clip within the first duration is information carried in the tag details of each video clip.
  • the electronic device may perform real-time or periodic motion detection on locally stored videos; for example, the electronic device may perform motion detection during the user's rest time at night (for example, between 24:00 and 06:00) on the videos stored locally, to obtain the motion type information of each video, so as to reduce the impact on the user's use of the electronic device.
  • alternatively, the electronic device starts to detect the camera movement type information of the first video clip only when needed, without additionally detecting the movement types of other video clips, thereby reducing the data processing of the electronic device, its power consumption, and so on.
  • the motion type information corresponding to each video clip within the first duration may be information carried in the tag details of each video clip; that is, each video clip has its own unique tag information, and the tag information may carry camera movement type information and the like.
  • the electronic device can detect, in real time, the motion type information of the target video clip selected by the user according to the user's selection operation; or, when the electronic device caches a video clip, it may start to detect the motion type information of the cached video clip. This is not limited in the embodiments of the present application.
  • the corresponding low-level features and high-level features in a video segment can be extracted simultaneously through time-series pixel changes in the video data and structured motion based on key-frame matching.
  • motion detection of structured key points (high-level features) in the video is prioritized to judge the lens motion effectively, and then determine the camera movement type of the first N seconds of the first video clip. If the scene is too complex and the key-point detection and matching errors are large, a time-series pixel change histogram (low-level feature) can be used instead to judge the lens motion and determine the camera movement type of the first N seconds of the first video clip.
  • this process can be applied to videos using different camera movement techniques in more shooting scenarios, which improves the accuracy of camera movement detection.
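The two-stage judgment above (structured key points first, with a fall-back to the time-series pixel change histogram) is not spelled out in code in this application. The following is a minimal illustrative sketch of the high-level-feature branch only: it assumes key-point matching between an early and a late frame of the first duration has already been produced by some feature matcher, and the function name and thresholds are hypothetical.

```python
from math import hypot

def classify_camera_motion(pts_prev, pts_curr, scale_thresh=0.05, shift_thresh=2.0):
    """Classify lens movement between two frames from matched key points.

    pts_prev, pts_curr: lists of matched (x, y) coordinates in the earlier
    and later frame. Returns "push" (zoom/dolly in), "pull" (zoom/dolly
    out), "pan/tilt" (translation), or "static".
    """
    n = len(pts_prev)
    # Dominant translation: mean displacement of all matched points.
    dx = sum(c[0] - p[0] for p, c in zip(pts_prev, pts_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(pts_prev, pts_curr)) / n

    def spread(pts):
        # Mean distance of the points from their centroid: a proxy for
        # the apparent size of the structured scene content.
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        return sum(hypot(x - cx, y - cy) for x, y in pts) / len(pts)

    scale = spread(pts_curr) / spread(pts_prev)
    if scale > 1 + scale_thresh:
        return "push"      # points spread outward: subject grows
    if scale < 1 - scale_thresh:
        return "pull"      # points contract inward: subject shrinks
    if hypot(dx, dy) > shift_thresh:
        return "pan/tilt"  # whole scene translates uniformly
    return "static"
```

A real implementation would additionally distinguish pan from tilt by the direction of the mean shift, and would fall back to the histogram branch when too few key points match reliably.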
  • the method further includes: taking a picture obtained by taking a screenshot of the video list interface as the background picture of the playback window of the first video clip; and expanding the play window of the first video clip in the first expansion manner on that background picture.
  • the picture obtained from the screenshot of the video list interface is used as the background picture of the playback window of the first video clip, and the background picture is scaled to match the playback effect of the opening animation. This process can reduce the amount of data processing on the mobile phone and ensure its operating performance.
  • the electronic device may also directly zoom the background elements on the video list interface and use the zoomed background elements as the background of the playback window of the first video clip.
  • if the mobile phone directly zooms the background elements themselves, that is, zooms the thumbnails of the video clips on the video list interface separately, the mobile phone also needs to control the arrangement order of each video clip and the displacement of each video clip on the background interface, to ensure that the playback effect of the opening animation can be highlighted after the background elements are zoomed.
  • in the process of expanding the playback window of the first video clip in the first expansion manner, the size of the background picture remains unchanged and/or the transparency of the background picture remains unchanged; or the size of the background picture changes according to a first preset rule and/or the transparency of the background picture changes according to a second preset rule.
  • the change in the size of the playback window of the first video clip may be a change in at least one of length and width.
  • in the process of expanding the playback window, the background picture of the play window of the first video clip also changes dynamically; specifically, the display size and transparency of the background picture of the playback window of the first video clip can change dynamically.
  • the background picture of the playback window may remain unchanged, or the background picture of the playback window may also be dynamically changed.
  • the background picture can also be enlarged as the playback window grows, or reduced as the playback window shrinks. It should be understood that the embodiments of the present application do not limit the enlargement or reduction rate of the background picture.
  • the transparency of the background picture of the play window may gradually change from high to low, or from low to high.
  • in the process of expanding the playback window of the first video clip, along with the gradual enlargement or reduction of the background picture and the dynamic change of its transparency, the user can experience the change process of the video picture and the photographed subject more deeply and vividly, which improves the user's visual experience.
  • in the process of expanding the playing window of the first video clip in the first expansion manner, the initial display position of the playing window of the first video clip is determined according to the position of the thumbnail of the first video clip.
  • in this way, the initial display position of the playback window of the first video clip changes according to the position of the thumbnail clicked by the user, which is more in line with the user's usage habits and improves the user experience.
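The behavior described above, with the play window starting at the clicked thumbnail and expanding to its final geometry, can be sketched as an interpolation over animation progress. The cubic ease-out curve is an assumed choice for illustration; the application does not specify a timing curve.

```python
def ease_out(t):
    # Decelerating cubic easing, a common choice for window animations.
    return 1.0 - (1.0 - t) ** 3

def window_rect(t, thumb_rect, final_rect):
    """Interpolate the play window geometry during the opening animation.

    t: animation progress in [0, 1]; thumb_rect and final_rect are
    (x, y, width, height). At t=0 the window coincides with the clicked
    thumbnail; at t=1 it reaches its final playback size and position.
    """
    p = ease_out(t)
    return tuple(a + (b - a) * p for a, b in zip(thumb_rect, final_rect))
```

The same interpolation covers both parts of the first expansion manner: a change in size (width/height components) and a change in position (x/y components).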
  • the first duration is the duration corresponding to the first S seconds, or the first N frames, of the first video clip.
  • the play window of the first video clip includes a picture display area and/or a menu control area.
  • if the "pushing in" camera movement technique was used for a specific duration at the beginning of the first video clip selected by the user, then, during the playback of the opening animation, the size of the playback window of the first video clip also has a dynamic change effect from small to large, and the first video clip played in the playback window shows the shot gradually converting from a larger scene to a local close-up, with the visual effect of the photographed subject growing from small to large.
  • in some scenarios, the lens position remains unchanged while the photographed subject moves from far to near; although the pushing technique is not used in this process and the range of the scene remains unchanged, the photographed subject still shows a dynamic change effect from small to large. Such a scene can also be classified into the "pushing" category and matched, during video playback, with the associated playback effect shown in FIG. 4, which is not repeated here.
  • in this way, the playback window has a dynamic change effect from small to large, and the target video played in the playback window gradually converts from a larger scene to a local close-up, with the photographed subject growing from small to large. Therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user; that is, the user can experience the process of the photographed subject moving from far to near in a more profound and vivid way, which improves the user's visual experience.
  • if the "pulling out" camera movement technique was used for a specific duration at the beginning of the first video clip selected by the user, then, during the playing of the opening animation, the size of the playback window of the first video clip has a dynamic change effect from large to small, and the video played in the playback window gradually converts from a local close-up to a larger scene, with the visual effect of the photographed subject shrinking from large to small.
  • similarly, when the lens position remains unchanged and the photographed subject, a car, moves from near to far, although the pulling technique is not used in this process and the range of the scene remains unchanged, the photographed subject still shows a dynamic change effect from large to small. Such a scene can also be classified into the "pulling" category and matched with the playback effect associated with "pulling" during video playback, which is not repeated here.
  • for a target video that uses the "pulling" shooting technique within a certain duration at its beginning, after the user clicks to play it, it is displayed to the user in the form of an opening animation: the playback window has a dynamic change effect from large to small, and the target video played in the playback window gradually converts from a local close-up to a larger scene, with the photographed subject shrinking from large to small. Therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user; that is, the user can experience the process of the photographed subject moving from near to far in a more profound and vivid way, which improves the user's visual experience.
  • correspondingly, during the playing of the opening animation, the playing window of the first video clip can have a dynamic change effect of moving from bottom to top, and the video clip played in the playing window of the first video clip then has the visual effect of the lens moving from bottom to top.
  • if the camera movement technique of a shot from top to bottom (a downward tilt) is used during the shooting of a specific duration at the beginning of the first video clip selected by the user, then, during the playing of the opening animation, the playing window of the first video clip has the corresponding dynamic change effect of moving from top to bottom, and the video clip played in the playing window has the visual effect of the lens moving from top to bottom.
  • the playback process of the opening animation matches the playing effect of the camera movement; that is, the playback window also presents the animation effect of moving up or down. Therefore, the playback process of the opening animation can further provide a coherent immersive experience for the user: the user can experience the process of the lens moving up or down more deeply and vividly, which improves the user's visual experience.
  • an electronic device is provided, comprising: a display screen; one or more processors; one or more memories; and a module installed with a plurality of application programs. The memory stores one or more programs, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: displaying a video list interface, the video list interface including thumbnails of one or more video clips; receiving a user's playback operation on a thumbnail of a first video clip, where the first video clip is any one of the one or more video clips; in response to the playback operation, obtaining a first camera movement type corresponding to the first video clip within a first duration; and, according to the first camera movement type, expanding the play window of the first video clip in a first expansion manner and playing the first video clip in the play window.
  • the first expansion manner includes a change in the size of the playback window of the first video clip and/or a change in the position of the playback window of the first video clip.
  • when the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: taking a screenshot of the video list interface and using the resulting picture as the background picture of the play window of the first video clip; and, on the background picture, expanding the play window of the first video clip in the first expansion manner.
  • in the process of expanding the playback window of the first video clip in the first expansion manner, the size of the background picture remains unchanged and/or the transparency of the background picture remains unchanged; or the size of the background picture changes according to a first preset rule and/or the transparency of the background picture changes according to a second preset rule.
  • in the process of expanding the playback window of the first video clip in the first expansion manner, when the one or more programs are executed by the processor, the electronic device is caused to perform the following step: determining the initial display position of the playback window of the first video clip according to the position of the thumbnail of the first video clip.
  • the first duration is the duration corresponding to the first S seconds, or the first N frames, of the first video clip.
  • the play window of the first video clip includes a picture display area and/or a menu control area.
  • when the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: acquiring motion type information corresponding to each of the one or more video clips on the video list interface within the first duration; and, in response to the playback operation, determining the first video clip from the one or more video clips, and determining the motion type information corresponding to the first video clip within the first duration as the first camera movement type.
  • the motion type information corresponding to each video clip within the first duration is obtained by the electronic device through real-time or periodic motion detection; and/or the motion type information corresponding to each video clip within the first duration is information carried in the tag details of each video clip.
  • the present application provides an apparatus, the apparatus is included in an electronic device, and the apparatus has a function of implementing the behavior of the electronic device in the above-mentioned aspect and possible implementations of the above-mentioned aspect.
  • the functions can be implemented by hardware, or by executing corresponding software by hardware.
  • the hardware or software includes one or more modules or units corresponding to the above functions. For example, a display module or unit, a detection module or unit, a processing module or unit, and the like.
  • the present application provides an electronic device, comprising: a touch display screen, wherein the touch display screen includes a touch-sensitive surface and a display; one or more audio devices; a camera; one or more processors; a memory; an application program; and one or more computer programs.
  • when the one or more computer programs are executed by the one or more processors, the electronic device is caused to execute the method for playing a video in any possible implementation of any one of the above aspects.
  • the present application provides an electronic device including one or more processors and one or more memories.
  • the one or more memories are coupled to the one or more processors and are used for storing computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method for playing a video in any possible implementation of any one of the foregoing aspects.
  • the present application provides a computer-readable storage medium, including computer instructions, which, when the computer instructions are executed on an electronic device, enable the electronic device to perform any of the possible video playback methods in any of the foregoing aspects.
  • the present application provides a computer program product that, when the computer program product runs on an electronic device, enables the electronic device to perform any of the possible methods for playing a video in any of the above aspects.
  • FIG. 1 is a schematic diagram of a scene of playing a video on a mobile phone.
  • FIG. 2 is a schematic structural diagram of an example of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • FIG. 4 is a schematic diagram of a playback process of an example of an opening animation provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an example of a playback window provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a playback process of another example of an opening animation provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another example of a playback process of an opening animation provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another example of a playback process of an opening animation provided by an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of an example of a method for playing a video provided by an embodiment of the present application.
  • FIG. 10 is a flowchart of an example of moving mirror detection provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an implementation process of an example of playing a video provided by an embodiment of the present application.
  • the terms "first" and "second" are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • FIG. 1 is a schematic diagram of a scene of playing a video on a mobile phone.
  • (a) of FIG. 1 shows the main interface 101 currently displayed on the mobile phone in the unlocked state; the main interface 101 displays a variety of application programs (applications, apps), such as Memo, Video, and Music.
  • the main interface 101 may also include other more application programs, which are not limited in this embodiment of the present application.
  • the user can click the "Gallery" application on the main interface 101; in response to the user's click operation, the mobile phone enters the interface of the gallery application, and the bottom area of that interface can display the gallery app's main menus, such as the Photos menu, Albums menu, Moments menu, and Discovery menu.
  • a list of album categories is displayed on the interface 102 corresponding to the Albums menu shown in (b) of FIG. 1.
  • the category list of albums may include submenus such as All Photos, Videos, Camera Photos, and Screenshots stored locally on the mobile phone, as well as submenus such as More Albums and New Album; the All Photos, Videos, Camera Photos, and Screenshots/Screen Recording submenus are displayed in the form of thumbnails.
  • the thumbnails of submenus such as All Photos, Camera Photos, and Screenshots can be the thumbnail of the photo with the latest date in that category (that is, the date saved to the mobile phone closest to the current time); the thumbnail of the Videos submenu can be the thumbnail of any picture of the video clip with the latest date (that is, the date saved locally on the mobile phone closest to the current time) in the video category, which is not limited in this embodiment of the present application.
  • each video clip may be displayed in the form of a thumbnail, and the thumbnail may include information such as the playback control button 10 and the playback duration of the video clip.
  • the thumbnail of each video clip may be the thumbnail of the first video frame of the video clip, or the thumbnail of any frame of the video clip, which is not limited in this embodiment of the present application.
  • if the user desires to play a certain target video, the user can click the play control button 10 of the target video, or click any area of the thumbnail of the target video.
  • for example, the user clicks the playback control button 10 of the target video in the second row and second column, whose playback duration is 1 minute and 30 seconds; in response, the mobile phone pops up the video playing window 20 shown in (d) of FIG. 1, and the target video is played in the video playing window 20.
  • the style of the video playback window 20 can be adapted according to the current landscape or portrait state of the mobile phone and the size of the target video, and displayed in the form of a small window or a full-screen window.
  • the video playback window 20 can automatically pop up and be displayed in the middle area of the display screen of the mobile phone, with the long side of the video playback window 20 sized to the width of the displayable area of the screen, and the length of the video playback window 20 adapted to the long-side size.
  • the video play window 20 can be automatically popped up and displayed in full screen on the display screen of the mobile phone, which is not limited in this embodiment of the present application.
  • the user can also play the target video through various video APPs, and the video APP generally also presents a video list.
  • after the user selects a target video in such an app, a window is generally also popped up in the manner shown in (d) or (e) of FIG. 1, and the target video is played in the pop-up window; the process of playing the target video through various video apps is not repeated here.
  • in each of the playback processes described above, the video is popped up and expanded in the form of a window displayed directly at its final target size; that is, the window expansion method is single.
  • the shooting process can be accompanied by different lens movement techniques, such as moving and transforming the lens.
  • the lens movement techniques may include zooming in, zooming out, panning, tilting, bird's-eye view, focus transfer, and the like; during video playback, their effects appear as zooming of the video picture, the subject approaching or receding, translation and rotation, and so on.
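  • As an illustrative sketch (not the patented method), a "zoom in" lens-movement effect of the kind described above can be reproduced in playback by cropping each frame to a gradually shrinking, centered rectangle and scaling it back up; the frame count and end scale below are arbitrary example values:

```python
def zoom_in_crops(width, height, frames, end_scale=0.5):
    """Return a list of (x, y, w, h) centered crop rectangles.

    The first rectangle covers the full frame; the last covers
    end_scale of each dimension, simulating a camera push-in.
    """
    crops = []
    for i in range(frames):
        t = i / (frames - 1) if frames > 1 else 1.0
        s = 1.0 - (1.0 - end_scale) * t  # interpolate scale from 1.0 to end_scale
        w, h = round(width * s), round(height * s)
        x, y = (width - w) // 2, (height - h) // 2
        crops.append((x, y, w, h))
    return crops
```

  • Scaling each crop back to the full frame size produces the visual effect of the subject approaching; reversing the list would instead simulate zooming out.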
  • the embodiments of the present application provide a video playback method, which can improve the video playback effect and create an immersive viewing experience for users.
  • the method for playing video provided in the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of the present application do not impose any restriction on the specific type of the electronic device.
  • FIG. 2 is a schematic structural diagram of an example of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or less components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel forms.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 may provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy, and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert an audio electrical signal into a sound signal.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mike" or "mic", is used to convert a sound signal into an electrical signal.
  • when making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
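  • The threshold-based behavior described above can be sketched as a simple dispatcher; this is illustrative only, and the normalized threshold value and action names are hypothetical, not part of the disclosed embodiments:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized touch intensity

def dispatch_message_icon_touch(intensity):
    """Map touch intensity on the short-message icon to an instruction.

    Below the first pressure threshold the message is viewed; at or
    above it a new message is created, as described above.
    """
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_new_short_message"
```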
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • in some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
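  • The compensation distance in the anti-shake step can be approximated from the shake angle and the lens focal length; the following is a simplified pinhole-model sketch for illustration, not the device's actual compensation algorithm:

```python
import math

def ois_compensation_mm(focal_length_mm, shake_angle_deg):
    """Approximate the lens shift (in mm) needed to offset a shake
    angle, using the pinhole model: shift = f * tan(theta).

    The optical image stabilization then moves the lens by this
    amount in the opposite direction of the detected shake.
    """
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```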
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
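  • The pressure-to-altitude calculation can follow the standard international barometric formula; this is a generic sketch valid in the troposphere, and not necessarily the exact formula used by the device:

```python
def altitude_from_pressure(p_pa, p0_pa=101325.0):
    """Estimate altitude in meters from air pressure (in Pa) using
    the international barometric formula, with p0 the sea-level
    reference pressure."""
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** (1.0 / 5.255))
```

  • For example, a reading of roughly 89875 Pa corresponds to an altitude near 1000 m, which can assist positioning and navigation as described above.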
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • when the electronic device 100 is a flip machine, the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D; further, characteristics such as automatic unlocking upon flip opening are set according to the detected opening/closing state of the holster or of the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
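  • The landscape/portrait switching mentioned above reduces to comparing the gravity components reported by the acceleration sensor; a minimal illustrative sketch follows, with the axis convention (x along the short side, y along the long side) assumed rather than taken from the disclosure:

```python
def screen_orientation(ax, ay):
    """Classify orientation from the gravity components (in m/s^2)
    along the device's x (short side) and y (long side) axes.

    Gravity dominating the long-side axis means the device is held
    upright (portrait); otherwise it is held sideways (landscape).
    """
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```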
  • the distance sensor 180F is used to measure distance; the electronic device 100 can measure distance through infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
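  • The detection logic above amounts to a threshold test on the photodiode reading; the threshold value below is a placeholder for illustration only:

```python
REFLECTION_THRESHOLD = 100  # placeholder raw photodiode (ADC) reading

def object_nearby(reflected_light):
    """Return True when sufficient infrared reflection is detected,
    i.e. an object is near the electronic device; False otherwise."""
    return reflected_light >= REFLECTION_THRESHOLD
```

  • A True result can then trigger actions such as turning off the screen during a call, as described below.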
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • Proximity light sensor 180G can also be used in holster mode, pocket mode automatically unlocks and locks the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by the low temperature.
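  • The temperature processing strategy described above can be summarized as threshold bands; the numeric thresholds and action names in this sketch are illustrative assumptions, not values from the disclosure:

```python
def thermal_policy(temp_c, high=45.0, low=0.0, very_low=-10.0):
    """Return the action for a reported temperature, mirroring the
    strategy described above: throttle the processor when hot, heat
    the battery when cold, and boost the battery's output voltage
    when very cold."""
    if temp_c > high:
        return "reduce_processor_performance"
    if temp_c < very_low:
        return "boost_battery_output_voltage"
    if temp_c < low:
        return "heat_battery"
    return "normal"
```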
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state and power changes, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take the Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 3 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the The system is divided into four layers, from top to bottom are the application layer, the application framework layer, the Android runtime ( runtime) and system libraries, as well as the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as camera, music, gallery, video, etc.
  • the gallery application may include some locally stored video resources, and the video application may also include locally stored video resources, online video resources, and the like, and the source of the video resources is not limited in this embodiment of the present application.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window managers, content providers, view systems, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether the current display screen has a status bar, and perform operations such as locking the screen and taking screenshots.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the stored data may include video data, image data, audio data, etc., which will not be repeated here.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface may include application icons, views for displaying text, views for displaying pictures, and the like.
  • the Android runtime includes core libraries and a virtual machine.
  • the Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional (three dimensional, 3D) graphics processing library (eg: OpenGL ES), two-dimensional (two dimensional, 2D) graphics engine, etc.
  • Surface Manager is used to manage the display subsystem of an electronic device and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio, video, and image encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • a 2D graphics engine is a drawing engine for 2D drawing.
  • the image processing library can provide analysis of various image data and provide a variety of image processing algorithms, such as image cutting, image fusion, image blur, image sharpening and other processing, which will not be repeated here.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, audio drivers, sensor drivers, etc.
  • the surface manager may acquire video data, image data, and the like from different applications in the application layer.
  • the surface manager can also provide layer synthesis services.
  • the 3D graphics processing library and the 2D graphics engine can draw and render the synthesized layers, and send them to the display screen for display after the rendering is completed.
  • data refresh, layer synthesis, rendering and display can be performed according to the video data, so as to keep the display content of the video playback window updated in real time.
  • FIG. 4 is a schematic diagram of a playback process of an opening animation provided by an embodiment of the present application.
  • the mobile phone displays a video list interface 401
  • the video list interface 401 displays thumbnails of 15 video clips
  • the 15 thumbnails correspond to the 15 video clips stored in the mobile phone.
  • the 15 video clips are numbered 1-15. It should be understood that each video clip on the video list interface 401 may be displayed in the form of a thumbnail as shown in (c) of FIG. 1 , and details are not repeated here.
  • the thumbnail of each video clip may be the thumbnail of the first video frame of the video clip, or the thumbnail of any frame of the video clip, which is not limited in this embodiment of the present application.
  • the user can perform the operation shown in (a) of FIG. 4 , that is, click the playback control button of the target video or any area of the thumbnail of the target video; in response to the user's click operation, the mobile phone can play the opening animation shown in (b)-(c)-(d)-(e) of FIG. 4 , where the playback effect of the opening animation is associated with the playback effect of the title clip of the target video. The following describes the playback effects of different opening animations in combination with different shooting methods.
  • the "opening animation” in the embodiment of this application can be understood as “the unfolding process of the video playback window", in other words, in this application, the target video selected by the user will be played in the video playback window, and the video playback window can be It is presented to the user in different expansion methods (the expansion process is called “opening animation”), and finally the target video is played in a video playback window of a specific size, avoiding that the video playback window in the existing solution can only be displayed by popping up a fixed window. Way.
  • the content played in the video playback window may be a picture of a specific duration (a preset duration) of the title clip of the target video; in other words, the content played by the "opening animation" is the title clip of the target video.
  • S and N may be preset fixed values or values set by the user, and the embodiment of the present application does not limit the duration of the opening animation.
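The relationship between the preset values S (seconds) and N (frames) can be sketched as a small helper. This is purely illustrative, since the patent describes behavior rather than code; the names `s_seconds`, `n_frames`, and `fps`, and the clamping to the clip length, are assumptions:

```python
def opening_duration_seconds(clip_seconds, fps, s_seconds=None, n_frames=None):
    """Return the opening-animation duration: the first S seconds or the
    first N frames of the title clip, clamped to the clip length."""
    if s_seconds is not None:
        d = s_seconds
    elif n_frames is not None:
        d = n_frames / fps          # convert a frame count to seconds
    else:
        raise ValueError("either s_seconds or n_frames must be preset")
    return min(d, clip_seconds)     # never longer than the clip itself

print(opening_duration_seconds(60.0, 30.0, s_seconds=3.0))   # 3.0
print(opening_duration_seconds(60.0, 30.0, n_frames=90))     # 3.0
print(opening_duration_seconds(2.0, 30.0, s_seconds=3.0))    # 2.0
```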
  • the window for playing the "opening animation" is referred to as a "playing window". The playing window may be the same window as the window that plays the target video, that is, the opening animation can be understood as the shape change process of the playback window of the target video; alternatively, the playing window may be a different window from the window that plays the target video, and after the opening animation is played, playback jumps to the window that plays the target video and the target video continues to play. This is not limited in this embodiment of the present application.
  • the playback effect of the specific-duration title clip of the target video can be understood as the playback effect that can be presented to the user during playback after the photographer, within the specified duration of the title clip, used shooting skills such as different lens movement methods in the process of shooting the target video.
  • if the shooting technique of "push mirror movement" is used during the shooting of the target video, it can be understood that the "push mirror movement" shooting technique is used in the specific-duration title clip of the target video; this will not be repeated one by one in the following description.
  • the following takes different mirror movement techniques used in the process of video shooting as an example, and introduces several corresponding video playback effects for different mirror movement methods.
  • Push-pull mirror movement is the most frequently used technique in video shooting, and can include "push mirror movement" and "pull mirror movement".
  • the "push mirror movement" here can be understood as involving zooming of the lens during the shooting process; therefore, the push mirror movement can show the process of gradually transforming from a larger scene to a local close-up scene, and the photographed subject can present a dynamic change process from small to large.
  • the photographed subject may be in a motion state from far to near, or a motion state from near to far.
  • when the position of the lens remains unchanged and the photographed subject is in a state of motion from far to near, although the "push mirror movement" is not used in the process and the scene range of the photographed subject remains unchanged, the photographed subject can still show a dynamic change effect from small to large; therefore, this scene can also be classified into the category of "push mirror movement", and during video playback is matched with the opening animation of the playback effect associated with the "push mirror movement".
  • similarly, when the photographed subject shows a dynamic change effect from large to small, the scene can also be classified into the category of "pull mirror movement", and is matched with the playback effect associated with the "pull mirror movement" during video playback.
  • the opening animation will be described in detail with reference to the accompanying drawings in subsequent embodiments.
  • The moving mirror movement is similar to the push-pull mirror movement, but the movement trajectory is different.
  • the push-pull mirror movement is the back-and-forth movement of the lens, whereas the moving mirror movement is the one-way movement of the lens along a fixed trajectory. It should be understood that the purpose of the moving mirror movement is mainly to express the spatial relationship between the characters in the scene.
  • the fixed trajectory may be a movement trajectory in an up-down direction, a movement trajectory in a left-right direction, or a movement trajectory in a composite direction between the up-down direction and the left-right direction, or the like.
  • the fixed trajectory is not limited to a straight trajectory or a curved trajectory.
  • The panning mirror movement is similar to the moving mirror movement, except that the panning movement is mainly a reciprocating motion of the lens along a fixed trajectory, and is mainly used to express the detail changes of different positions of the photographed subject in the scene.
  • the panning movement of the mirror may include possible ways such as the up and down alternate movement of the lens, the left and right alternate movement, and the like, which will not be repeated here.
  • Following the mirror movement means that the lens moves with the subject being photographed.
  • the lifting mirror movement is a method in which the lens is raised and lowered during shooting with the help of a lifting device.
  • the raising and lowering of the lens brings about expansion and contraction of the field of view of the picture, thereby showing various parts of tall objects, point-to-surface relationships in depth space, the scale, momentum and atmosphere of events or scenes, and changes of emotional states in the picture content.
  • for example, the lens starts at a low position and is slowly moved to a high position with the help of the lifting device; the presented picture is visually impactful, giving people a novel and profound feeling.
  • the playback effect of the opening animation is associated with the playback effect of the target video.
  • an opening animation with an associated playback effect can be matched to the target video according to the camera movement method.
  • when two or more camera movement methods are used in a mixture within the specific duration of the title clip, the target video can be matched with an opening animation whose playback effect is associated with any one of the two or more camera movement methods; alternatively, the opening animation can be matched only according to a certain camera movement method, which is not limited in this embodiment of the present application.
  • the mobile phone can detect, by means of artificial intelligence (AI) or other methods, the two or more types of camera movements that are mixed within the specific duration of the title clip of the target video, and determine the main camera movement, that is, the visually dominant camera movement; the opening animation is then matched according to the main camera movement to provide the user with a better visual experience.
  • if the shooting technique of "push mirror movement" is used during the shooting of the specific-duration title clip of the video (video clip 8) selected by the user, then, during the playback of the opening animation, the size of the playback window 30 can have a dynamic change effect from small to large, and the video clip 8 played in the playback window 30 also presents the visual effect in which the shot gradually changes from a larger scene to a local close-up scene and the photographed subject grows from small to large.
  • the position of the subject to be shot may remain unchanged or be in a moving state, which is not limited in this embodiment of the present application.
  • the content of the target video 8 is: a car on the road.
  • the photographed subject, the car may be in a moving state or a stationary state.
  • in response to the user's click operation, as shown in (b) of FIG. 4 , the playback window 30 is initially displayed in a smaller size, and in the picture, the car is located on the road at a position far from the shooting lens.
  • the display size of the playback window 30 gradually increases, and in the opening animation the car is gradually enlarged to show the effect of an approaching shot.
  • the display size of the playback window 30 continues to increase until the width of the playback window 30 is close to the width of the display screen of the mobile phone, and in the opening animation the car gets closer and closer to the camera while the size of the car continues to increase.
  • finally, the playback window 30 reaches the maximum size; for example, the width of the playback window 30 is equal to the width of the display screen of the mobile phone.
  • the opening animation ends, and the target video continues to be played.
  • the size of the playback window 30 also gradually increases on the interface of the mobile phone; during the playback process, the display size of the car in the video picture in the playback window 30 also increases gradually, and the display position gradually approaches the shooting lens.
  • the playback effect of the opening animation matches the shooting process of the specific-duration (first S seconds or first N frames) title clip of the target video 8, that is, the opening animation and the title clip of the target video 8 have the same playback effect; therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user, that is, the user can experience the driving process of the car from far to near in a more profound and vivid way, which improves the user's visual experience.
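The small-to-large expansion of the playback window described above can be modeled as a linear interpolation between the thumbnail's bounds and the final window bounds. This is a rough sketch, not the patented implementation; the `(x, y, w, h)` rectangle convention and all pixel values are assumptions for illustration:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b at progress t."""
    return a + (b - a) * t

def window_rect(start, end, t):
    """Interpolate an (x, y, w, h) rect from the thumbnail bounds to the
    final playback-window bounds; t runs from 0.0 to 1.0."""
    return tuple(lerp(s, e, t) for s, e in zip(start, end))

thumb = (120, 300, 80, 60)    # hypothetical thumbnail bounds (x, y, w, h)
final = (0, 200, 360, 270)    # hypothetical final window, full screen width
print(window_rect(thumb, final, 0.0))   # starts at the thumbnail bounds
print(window_rect(thumb, final, 0.5))   # halfway through the opening animation
print(window_rect(thumb, final, 1.0))   # ends at the final window bounds
```

Each axis interpolates independently, so the same helper also covers the case where the window grows from a point centered on the thumbnail area.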
  • the mobile phone may also be in a horizontal screen state.
  • the playback window may be displayed in a full-screen or non-full-screen state.
  • the playback window 30 can be gradually enlarged from small until it is displayed in full screen on the display screen of the mobile phone, so as to be more in line with the user's usage habits.
  • if the user turns the mobile phone to switch it from the vertical-screen display to the horizontal-screen display, the mobile phone can detect the horizontal-screen state through the gyroscope, the gravity sensor, and the like, and adaptively adjust the display of the window, for example, adapting it to the horizontal-screen display state.
  • (d) in FIG. 4 shows that the window continues to increase in the horizontal-screen state, and (e) in FIG. 4 changes to a full-screen display of the playback window 30 on the display screen of the mobile phone. It should be understood that when the mobile phone is displayed in the horizontal-screen state, the full-screen display of the playback window 30 is more in line with the user's usage habits, which is not limited in this embodiment of the present application.
  • the target video may be a video shot in a vertical screen, for example, a video shot by a user in a vertical screen of a mobile phone.
  • the magnification rate of the vertical side of the playback window 30 can be greater than the magnification rate of the horizontal side of the playback window 30 (parallel to the horizontal short frame of the mobile phone), finally reaching the final display state shown in (e) of FIG. 4 .
  • This process can be combined with the horizontal and vertical screen display modes of the target video to match the user with a more reasonable window style, which is more in line with the user's visual habits.
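The claim that a portrait-shot video's vertical side magnifies faster than its horizontal side can be illustrated with a per-axis growth-rate calculation. All pixel values and the duration below are hypothetical:

```python
def per_axis_rates(init_w, init_h, final_w, final_h, duration_s):
    """Average growth rate (pixels per second) of each side of the window."""
    return ((final_w - init_w) / duration_s,
            (final_h - init_h) / duration_s)

# hypothetical portrait-shot video: the window must end much taller than wide,
# so the vertical side grows faster over the same animation duration
rw, rh = per_axis_rates(80, 60, 360, 640, 2.0)
print(rw, rh)      # horizontal vs. vertical growth rate
print(rh > rw)     # True: vertical side magnifies faster
```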
  • when the playback window 30 is initially displayed, it may be a small window that automatically pops up on the video list interface 402. The initial size of the playback window 30 may be the same as the size of the thumbnail of the target video clip shown in (a) of FIG. 4 ; or the initial size of the playback window 30 may be obtained by scaling the size of the thumbnail of the target video clip according to a certain ratio; or the initial size of the playback window 30 may be a predefined fixed size, which is not limited in this embodiment of the present application.
  • the initial display position may be determined according to the position of the target video thumbnail selected by the user. Specifically, as shown in (a) of FIG. 4, when the thumbnail of the target video 8 selected by the user is located in the central area of the video list interface 401, the playback window 30 is initially displayed in the area where the thumbnail of the target video 8 is located, As shown in (b) of FIG. 4 , the playback window 30 is centered on the area where the thumbnail image is located, and changes gradually and dynamically, which will not be repeated here.
  • the playback window 30 takes the area where the thumbnail in the upper left corner is located as the center, and gradually changes dynamically toward the lower right corner, which will not be repeated here.
  • the background picture of the play window 30 may be the mobile phone interface when the user clicks the target video 8 .
  • the background picture of the playback window 30 may remain unchanged, or the background picture of the playback window 30 may also be dynamically changed.
  • the background picture can also be enlarged and displayed as the playback window 30 increases, or reduced and displayed as the playback window 30 shrinks. It should be understood that the embodiments of the present application do not limit the enlargement rate or the reduction rate of the background picture.
  • as the play window 30 gradually increases, the background picture also has a gradually enlarged display effect, and the enlargement rate of the background picture is smaller than the enlargement rate of the play window 30 .
  • the picture obtained by taking a screenshot of the video list interface is used as the background picture of the playback window of the first video clip, and the background picture is scaled to match the playback effect of the opening animation; this process can reduce the data processing amount of the mobile phone and ensure the operating performance of the mobile phone.
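The idea that the screenshot background enlarges at a smaller rate than the playback window can be sketched with two scale factors driven by the same animation progress `t`. The gain values here are illustrative assumptions, not values from the patent:

```python
def scales(t, window_gain=1.0, bg_gain=0.2):
    """Scale factors at animation progress t in [0, 1]: the background
    enlarges at a smaller rate than the playback window."""
    return 1.0 + window_gain * t, 1.0 + bg_gain * t

w_scale, b_scale = scales(1.0)
print(w_scale, b_scale)   # window has doubled, background grew only 20%
```

Driving both factors from one progress value keeps the window and background animations synchronized, which matches the coordinated expansion the text describes.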
  • the user clicks the target video 8 on the video list interface 401.
  • the mobile phone can directly use the currently displayed video list interface 401 as the background picture, and the background picture includes the thumbnails of the 15 video clips.
  • the thumbnails of the 15 video clips also have a gradually enlarged display effect.
  • the playback window 30 continues to increase, and as shown in (d) of FIG. 4 , the thumbnails of the 15 video clips continue to be gradually enlarged until the playback window 30 is displayed in the final state shown in (e) of FIG. 4 ; thereafter, the playback window 30 continues to play the target video 8 .
  • the playback window 30 displays the opening animation of the first S seconds or the first N frames of the target video 8, and then continues to play the video content after the first S seconds or the first N frames, which will not be repeated in the following description.
  • that the playback window 30 is displayed in the final state can be understood as at least one feature, such as the maximum size, the display position, the display form (for example, floating display), and the background picture of the playback window 30 when the video is played in the vertical-screen or horizontal-screen state of the mobile phone, which is not limited in this embodiment of the present application.
  • the window width of the playback window 30 in the final state can be equal to or approximately equal to the width of the short side of the mobile phone display screen, and the window length can be adaptively determined according to the window width.
  • the playback window 30 can finally be displayed in a floating state in the middle area of the display screen of the mobile phone; furthermore, the background picture of the final state of the playback window 30 can be hidden, or a blank background, a black background, or the like can be presented, which is not limited in the embodiments of the present application.
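The final-state sizing rule (window width equal to or close to the screen's short side, length adapted to that width) can be sketched as follows. Deriving the length from the video's aspect ratio is an assumption made for illustration:

```python
def final_window_size(screen_short_px, video_w, video_h):
    """Final playback-window size: width equals the screen's short side,
    length adapted to the width via the video's aspect ratio."""
    width = screen_short_px
    length = round(width * video_h / video_w)
    return width, length

print(final_window_size(1080, 1920, 1080))   # hypothetical 16:9 landscape clip
print(final_window_size(1080, 1080, 1920))   # hypothetical 9:16 portrait clip
```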
  • the electronic device may also directly zoom the background elements on the video list interface, and use the zoomed background elements as the background of the playback window of the first video clip.
  • in this implementation, the mobile phone directly zooms the background elements themselves, that is, the mobile phone respectively zooms the thumbnails of video clips 1-15 shown in (a) of FIG. 4 ; in addition, the mobile phone also needs to control the arrangement order of video clips 1-15 and the displacement change process of each video clip on the background interface, so as to ensure that the playback effects in (b)-(e) of FIG. 4 can be achieved after the background elements are zoomed.
  • when the position of the lens remains unchanged and the photographed subject is in a state of motion from far to near, although the "push mirror movement" is not used in this process and the range of the scene where the photographed subject is located remains unchanged, the photographed subject can still show a dynamic change effect from small to large; therefore, the scene can also be classified into the category of "push mirror movement" and matched, during video playback, with the associated playback effect shown in FIG. 4 , which is not repeated here.
  • the playback window has a dynamic change effect from small to large, and the target video played in the playback window is also gradually transformed from a larger scene to a local close-up scene, with the photographed subject growing from small to large.
  • the playing process of the opening animation can further provide a coherent immersive experience for the user.
  • the user can experience the driving process of the car from far to near in a more profound and vivid way, which improves the user's visual experience.
  • the transparency of the background picture of the playback window 30 can be dynamically changed.
  • the playback window has a dynamic change effect from small to large, and the target video played in the playback window is gradually transformed from a larger scene to a local close-up scene, with the photographed subject growing from small to large; therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user, that is, the user can experience the driving process of the car from far to near in a more profound and vivid way, which improves the user's visual experience.
  • the play window 30 of the opening animation may include only a screen display area for displaying a picture of the specific duration of the title clip of the target video; alternatively, in addition to the screen display area, the play window 30 may also include a menu control area for the process of playing the target video, and the menu control area may include one or more buttons, controls, and the like.
  • FIG. 5 is a schematic diagram of an example of a playback window provided by an embodiment of the present application.
  • the play window 30 of the opening animation can be displayed as shown in (a) of FIG. 5 , including only a picture display area for displaying a picture of the specific duration of the title clip of the target video, and not including any menu options, controls, or the like that the user can manipulate.
  • the play window 30 of the opening animation may include a screen display area, and the play window 30 may also include a menu control area for playing the target video process.
  • for example, the playback control button, the next-video control button, the duration control, and the full-screen control displayed in menu control area I shown in (b) of FIG. 5 , and the movie name menu, the bullet screen control, the close control, and the like displayed in menu control area II. The embodiment of the present application does not limit the style of the playback window.
  • correspondingly, when the "pull mirror movement" technique is used, the size of the playback window 30 has a dynamic change effect from large to small, and the video played in the playback window 30 gradually changes from a local close-up scene to a larger scene, with the photographed subject presenting a visual effect of changing from large to small.
  • FIG. 6 is a schematic diagram of a playback process of another example of an opening animation provided by an embodiment of the present application.
  • the mobile phone displays a video list interface 601 as shown in (a) of FIG. 6
  • the video list interface 601 displays thumbnails of 15 video clips stored on the mobile phone.
  • assume that the target video selected by the user is the video clip numbered 1, the thumbnail of video clip 1 is located in the upper left corner of the mobile phone screen, and the "pull mirror movement" shooting technique is used during the shooting of the specific-duration title clip of the target video 1. In this scene, it is also assumed that the photographed subject in the target video 1, the car, is in motion.
  • the play window 30 pops up on the video list interface 602, and the play window 30 is initially displayed with the largest size in the top area of the display screen of the mobile phone, that is, covering the upper-left area where the thumbnail of video clip 1 is located.
  • the playback window 30 initially has the maximum size; for example, the width of the playback window 30 is equal to the width of the display screen of the mobile phone, and in the initial picture of the opening animation in the playback window 30, the car has the maximum size of a close-up shot, and the range of the scene captured by the lens is small.
  • as the playback time continues, as shown in (d) of FIG. 6 , on the video list interface 604, the display size of the playback window 30 continues to shrink; in the opening animation, the range of the scene captured by the lens continues to grow, the car gets farther and farther from the camera, and the size of the car continues to shrink. Finally, the playback of the opening animation ends; as shown in (e) of FIG. 6 , on the interface 605 , the mobile phone can display the play window 30 in the middle position of the interface 605 and continue to play the target video 1 .
  • in addition to the dynamic change process of the play window 30 described above, during the play process of the opening animation, the background picture of the play window 30 may also change dynamically.
  • the background picture of the playback window 30 may remain unchanged, or the background picture of the playback window 30 may also be dynamically changed.
  • the background picture can also be enlarged and displayed as the playback window 30 increases, or reduced and displayed as the playback window 30 shrinks. It should be understood that the embodiments of the present application do not limit the enlargement rate or the reduction rate of the background picture.
  • the background picture can also be gradually reduced as the playback window 30 is reduced, and the reduction rate of the background picture is greater than the reduction rate of the playback window 30 .
  • the reduction process of the background image is also accompanied by the change of transparency, that is, the background image gradually changes from low transparency to high transparency.
  • the target video 1 continues to be played in the play window 30 .
• when the position of the lens remains unchanged and the subject being photographed, the car, moves from near to far, although the "pulling mirror" technique is not actually used in this process and the scope of the scene remains unchanged, the photographed subject can still show a dynamic change effect from large to small. Such a scene can also be classified into the "pulling mirror" category and matched with the playback effect associated with the "pulling mirror" during video playback, which will not be repeated here.
• in summary, for a target video that uses the "pulling mirror" shooting technique within a certain duration of the title, after the user clicks to play, it is displayed to the user in the form of an opening animation.
• the playback window has a dynamic change effect from large to small, and the target video played in the playback window is gradually converted from a local close-up scene to a larger scene, with the photographed subject changing from large to small. Therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user, that is, the user can experience the driving process of the car from near to far in a more profound and vivid way, which improves the user's visual experience.
  • the size change of the playback window 30 may be a change in at least one of length and width.
  • the size of the playback window 30 parallel to the short side of the mobile phone is called “width”, and the size parallel to the long side of the mobile phone is called “length”.
  • the length and width of the playback window 30 gradually increase until the width reaches the width of the short side of the mobile phone.
  • the length of the playback window 30 is gradually reduced, while the width is the width of the short side of the mobile phone and remains unchanged.
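The two size rules above (push: grow until the width reaches the short side; pull: keep the short-side width and shrink the length) can be sketched in Python. The function names, the animation-progress parameter `t`, and all pixel values are illustrative assumptions, not part of the embodiment:

```python
def push_window_size(t, screen_w, start_w, start_l, full_l):
    """'Push mirror' opening animation: both width and length of the
    playback window grow with progress t in [0, 1] until the width
    reaches the width of the phone's short side (screen_w)."""
    w = start_w + (screen_w - start_w) * t
    l = start_l + (full_l - start_l) * t
    return round(w), round(l)

def pull_window_size(t, screen_w, start_l, final_l):
    """'Pull mirror' opening animation: the width stays equal to the
    short side of the screen while the length gradually shrinks."""
    l = start_l + (final_l - start_l) * t
    return screen_w, round(l)
```

A linear easing is used here only for brevity; the embodiment does not constrain the interpolation curve.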
  • FIG. 7 is a schematic diagram of another example of a playback process of an opening animation provided by an embodiment of the present application.
  • the mobile phone displays a video list interface 701 as shown in (a) of FIG. 7
  • the video list interface 701 displays thumbnails of 15 video clips stored on the mobile phone.
  • the playback window 30 also has a corresponding dynamic change effect of moving from bottom to top, and the video clip 11 played in the playback window 30 also has the visual effect of the camera moving up and down.
  • the content of the target video 11 is: gradually move up from the bottom of the iron tower to the top of the iron tower.
• the user clicks on the target video 11; in response to the user's click operation, as shown in (b) in FIG. 7, the play window 30 pops up on the video list interface 702, and the playback window 30 is initially displayed at the fixed size of its final state when the video is played in the vertical screen state of the mobile phone, that is, the width of the window is equal to or approximately equal to the width of the short side of the mobile phone display, and the length of the window can be adapted to the width of the window.
  • the play window 30 is located at the bottom area of the display screen of the mobile phone, the background picture of the play window 30 is a video list interface, and the background picture gradually begins to appear in a transparent state.
  • the picture of the opening animation in the play window 30 is displayed as the bottom end of the iron tower.
• the picture in (b) in FIG. 7 changes gradually, as shown in (c) in FIG. 7: the position of the playback window 30 gradually moves upward from the bottom area of the mobile phone display screen, the size of the playback window 30 can remain unchanged, and the transparency of the background picture of the playback window 30 increases, that is, the background picture gradually becomes more transparent.
  • the lens picture of the opening animation of the play window 30 gradually moves upwards from the bottom end of the iron tower to display the waist area of the iron tower as shown in (c) of FIG. 7 .
  • the gradual change is shown in (d) in FIG. 7 .
  • the position of the playback window 30 continues to move upward, and the size of the playback window 30 can remain unchanged.
• the playback window 30 is suspended and displayed in the middle area of the interface 704, and the transparency of the background picture increases to a certain threshold until the background disappears, that is, the background can be fully transparent or black.
• the lens picture of the opening animation of the play window 30 moves upward from the waist area of the iron tower to the top area of the iron tower, as shown in (d) in FIG. 7.
  • the playing process of the opening animation matches the playing effect of the moving mirror, that is, the playing window 30 also presents the animation effect of moving up and down. Therefore, the opening animation
  • the animation playback process can further provide users with a coherent immersive experience, that is, users can experience the moving process of the camera from the bottom of the tower to the top of the tower in a more profound and vivid way, which improves the user's visual experience.
  • FIG. 8 is a schematic diagram of another example of a playback process of an opening animation provided by an embodiment of the present application.
  • the target video 5 in FIG. 8 and the target video 11 in FIG. 7 use different moving mirrors.
• the title of the target video (video clip 5) is shot for a specific length of time using the top-to-bottom shooting technique of the "moving mirror".
• therefore, during the playback of the opening animation, the playback window 30 also has a corresponding dynamic change effect of moving from top to bottom, and the video clip 5 played in the playback window 30 also has the visual effect of the camera moving from top to bottom.
• the content of the target video 5 is: gradually move down from the top of the iron tower to the bottom of the iron tower.
  • the play window 30 pops up on the video list interface 802
  • the playback window 30 is initially displayed as the fixed size of the final state when the video is played in the vertical screen state of the mobile phone, that is, the window width is equal to or approximately equal to the width of the short side of the mobile phone display, and the window length can be adapted to the window width.
  • the play window 30 is located at the top area of the display screen of the mobile phone, the background picture of the play window 30 is a video list interface, and the background picture gradually begins to appear in a transparent state.
  • the picture of the opening animation in the play window 30 is displayed as the spire of an iron tower.
• the picture in (b) in FIG. 8 changes gradually, as shown in (c) in FIG. 8: the position of the playback window 30 gradually moves downward from the top area of the display screen of the mobile phone, the size of the playback window 30 can remain unchanged, and the transparency of the background picture of the playback window 30 increases, that is, the background picture gradually becomes more transparent.
  • the lens picture of the opening animation of the playing window 30 gradually moves downward from the top of the iron tower to display the waist area of the iron tower as shown in (c) of FIG. 8 .
  • the gradual change is shown in (d) in FIG. 8 .
• the position of the playback window 30 continues to move downward, and the size of the playback window 30 can remain unchanged.
• the playback window 30 is suspended and displayed in the bottom area of the interface 804, and the transparency of the background picture increases to a certain threshold until the background disappears, that is, the background can be fully transparent or black.
• the lens picture of the opening animation of the play window 30 moves down from the waist area of the iron tower to the bottom end area of the iron tower, as shown in (d) in FIG. 8; thereafter, the play window 30 continues to play the target video 5.
• the playing process of the opening animation matches the playing effect of the moving mirror, that is, the playing window 30 also presents the animation effect of moving from top to bottom. Therefore, the playing process of the opening animation can further provide users with a coherent immersive experience, that is, users can experience the moving process of the camera from the top of the tower to the bottom of the tower in a more profound and vivid way, which improves the user's visual experience.
  • the thumbnails of the video clips on the video list interface displayed on the mobile phone are displayed in a certain order based on a preset rule.
  • the mobile phone can determine the camera movement skill of a specific duration (the first S seconds or the first N frames) of the title of each video clip, and sort all the video clips according to the camera movement skill corresponding to each video.
  • the mobile phone may be arranged according to the time when each video clip is saved locally on the mobile phone, which is not limited in this embodiment of the present application.
• for example, the 15 video clips displayed on the video list interface 401 of the mobile phone correspond to different camera-movement techniques: the video clips that use the push-pull technique (such as video clip 8) are arranged in the middle area of the list, and the video clips that use the moving-mirror technique (such as video clips 5 and 11) are arranged in the first row or the last row.
• since the opening animation corresponding to a video clip shot with the push-pull technique (e.g. video clip 8) mainly corresponds to an enlarging or shrinking playback window 30, displaying such clips in the middle area is more in line with the user's visual habits; the opening animation corresponding to video clips shot with the moving-mirror technique (such as video clips 5 and 11) mainly corresponds to an up-or-down movement of the playback window 30, so the video clips with an opening animation that moves from top to bottom are displayed in the first row or second row, and the video clips with an opening animation that moves from bottom to top are displayed in the bottom row, so that the playback process of the opening animation is more in line with the user's visual habits, further improving the user's visual experience.
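The ordering rule above can be sketched with a simple priority table; the motion-type labels (`move_down`, `push`, `pull`, `move_up`) and the priority values are hypothetical names chosen for illustration:

```python
# Hypothetical row priorities: clips whose opening animation moves top-to-bottom
# go to the first rows, push/pull clips to the middle, bottom-to-top clips last.
ROW_PRIORITY = {"move_down": 0, "push": 1, "pull": 1, "move_up": 2}

def arrange_clips(clips):
    """clips: list of (name, motion_type) pairs; returns them reordered so
    the list layout matches the direction of each clip's opening animation."""
    return sorted(clips, key=lambda clip: ROW_PRIORITY.get(clip[1], 1))
```

The sort is stable, so clips sharing a priority (e.g. push and pull) keep their original relative order, which the embodiment leaves unspecified.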
• the playback process and display method of the opening animation are introduced above. It should be understood that the above method is also applicable to a mobile phone displayed in landscape (horizontal screen) orientation, and the playback window can be displayed in a full-screen state.
  • the playback window 30 can gradually increase from small to large until it is displayed in full screen on the display screen of the mobile phone; or, in combination with the scene in FIG. 6 , the playback window 30 can be played at the end of the opening animation. Then, it is displayed in a full-screen state in (e) in FIG. 6; or, in conjunction with the scene in FIG. 7 or FIG. 8, the playback window 30 can move up and down in the form of a small window, and after the opening animation is played, in FIG. Figure (d) in Figure 7 or Figure (d) in Figure 8 is displayed in a full screen state.
  • the size, display position, display mode, etc. of the playback window may be determined according to the horizontal and vertical screen display status of the mobile phone and the shooting mode within a specific duration of the title video of the target video played by the user. Adjustments and changes are not limited in this embodiment of the present application.
• for target videos that use different lens-moving shooting techniques within a certain duration of the title, the video playback window can be expanded in different ways.
• electronic devices such as mobile phones can detect the movement type of the target video and match different opening animations according to that type, so that the playing process of the opening animation matches the camera-movement technique, enhances the user's sense of visual impact, and further provides users with a coherent immersive experience that improves the user's visual experience.
  • the method for playing video is introduced from the user interaction level.
  • the following describes the method for playing video provided by the embodiment of the present application from the perspective of software implementation strategy with reference to FIGS. 9 to 11.
  • FIG. 9 is a schematic flowchart of an example of a method for playing a video provided by an embodiment of the present application. It should be understood that the method can be used in electronic devices (such as mobile phones, tablet computers, etc.) having a structure such as a touch screen as shown in FIGS. 2 and 3 . ) is implemented. Taking a mobile phone as an example, as shown in Figure 9, the method may include the following steps:
  • the mobile phone displays a video list interface 401
  • the video list interface 401 displays thumbnails of 15 video clips
• the 15 thumbnails correspond to 15 video clips stored in the mobile phone.
  • the thumbnail of each video clip may be the thumbnail of the first video frame of the video clip, or the thumbnail of any frame of the video clip, which is not limited in this embodiment of the present application.
  • the process of this step 920 may correspond to different processing methods.
• the electronic device may acquire the mirror-movement type information corresponding to each of the one or more video clips on the video list interface within the first duration, and, in response to the user's operation on the video list interface, determine the first video clip from the one or more video clips; the mirror-movement type information corresponding to the first video clip within the first duration is determined as the first mirror-movement type.
  • the electronic device can perform real-time motion detection on the locally stored video, or periodic motion detection, to obtain motion type information of each video; or after the user clicks on the first video segment, The electronic device starts to detect the motion type information of the first video segment without additionally detecting motion types of other video segments, thereby reducing the data processing process of the electronic device and reducing the power consumption of the electronic device.
  • the motion type information corresponding to each video clip within the first duration is the information carried in the tag details of each video clip, that is, each video clip has its own unique tag information, and the
  • the label information may carry mirror movement type information and the like.
• the electronic device can detect the motion type information of the target video clip selected by the user in real time according to the user's selection operation, or, when the electronic device caches a video clip, start detecting the motion type information of the cached video clip, which is not limited in this embodiment of the present application.
  • the user performs the operation as shown in (a) of FIG. 4, clicks the playback control button of the thumbnail of the first video clip or any area of the thumbnail of the first video clip, and responds to the user's Clicking the operation, the mobile phone can expand the play window of the first video clip in the expansion process shown in (b)-(c)-(d) in FIG. 4 .
  • the playback effect of the opening animation is associated with the playback effect of the segment of the target video with a specific duration at the title, that is, for different shooting methods, there may be different playback effects of the opening animation.
  • the "first duration" may be S seconds before the title of the first video clip or a duration corresponding to N frames before the title of the first video clip.
  • the play window of the first video clip includes a picture display area and/or a menu control area.
• the playback window of the first video clip may be as shown in (a) of FIG. 5, including only a screen display area, which is used to display a picture of the specific-duration title segment of the target video and excludes any menu options, controls, etc. that the user can manipulate.
• alternatively, the play window of the first video clip can be as shown in (b) in FIG. 5: in addition to a screen display area, the play window 30 may also include a menu control area for the playing process of the target video, for example, a movie name menu, a bullet-screen control, a close control, and the like displayed in the menu control area.
  • the embodiment of the present application does not limit the style of the playback window.
  • the first expansion manner includes a variation manner of the size of the playback window of the first video segment; and/or a position variation manner of the playback window of the first video segment.
  • the size of the background picture remains unchanged, and/or the transparency of the background picture remains unchanged.
  • the size of the background picture changes according to the first preset rule, and/or the transparency of the background picture changes according to the second preset rule.
  • the change in the size of the playback window of the first video clip may be a change in at least one of length and width.
• the electronic device may also take a picture obtained by taking a screenshot of the video list interface as the background picture of the playback window of the first video segment; and on the background picture, the playback window of the first video segment is expanded in the first expansion manner.
• the picture obtained by taking the screenshot of the video list interface is used as the background picture of the playback window of the first video clip, and the background picture is scaled to match the playback effect of the opening animation; this process can reduce the data processing amount of the mobile phone and ensure its operating performance.
• the electronic device may also directly zoom the background elements on the video list interface and use the zoomed background elements as the background of the playback window of the first video clip.
• in this manner, the mobile phone directly zooms the background elements themselves, that is, the mobile phone separately zooms the thumbnails of video clips 1-15 shown in (a) in FIG. 4, and the mobile phone also needs to control the arrangement order of video clips 1-15 and the displacement change process of each video clip on the background interface, to ensure that the playback effect in figures (b)-(e) in FIG. 4 can be achieved after zooming the background elements.
  • the background picture of the play window of the first video clip also changes dynamically. Specifically, the display size and transparency of the background picture of the playback window of the first video clip can be dynamically changed.
  • the background picture of the playback window may remain unchanged, or the background picture of the playback window may also be dynamically changed.
  • the background picture can also be enlarged and displayed as the playback window grows, or reduced and displayed as the playback window shrinks. It should be understood that the embodiments of the present application do not limit the enlargement rate or the reduction rate of the background picture.
  • the transparency of the background picture of the play window may gradually change from high to low, or from low to high.
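The coupled scale-and-transparency change of the background picture could be modeled as below. This is a minimal sketch: the linear easing, the `scale_rate` value, and the alpha convention (1.0 opaque, 0.0 fully transparent) are assumptions for illustration, not values specified by the embodiment:

```python
def background_state(t, grow=True, scale_rate=0.2):
    """State of the background picture at animation progress t in [0, 1]:
    it scales up (or down) together with the playback window while its
    transparency changes from low to high (alpha 1.0 -> 0.0)."""
    scale = 1 + scale_rate * t if grow else 1 - scale_rate * t
    alpha = 1.0 - t  # opacity fades as transparency increases
    return round(scale, 3), round(alpha, 3)
```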
• in the process of expanding the playback window of the first video clip, along with the gradual enlargement or reduction of the background picture and the dynamic change of its transparency, the user can experience more deeply and vividly the change process of the photographed subject in the video picture, which improves the user's visual experience.
• the initial display position of the play window of the first video clip is determined based on the position of the thumbnail of the first video clip. Specifically, when the playback window of the first video clip is initially displayed, the initial display position may be determined according to the position of the thumbnail of the first video clip selected by the user.
  • the playback window 30 is initially displayed on the thumbnail of the target video 8.
• as shown in (b) of FIG. 4, the play window 30 takes the region where the thumbnail is located as the center and changes gradually and dynamically, which will not be repeated here.
  • the playback window 30 is centered on the area where the thumbnail in the upper left corner is located, and gradually changes dynamically, which will not be repeated here.
  • the initial display position of the playback window of the first video clip can be changed according to the position of the thumbnail clicked by the user, which is more in line with the user's usage habits and improves the user experience.
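One way to compute such a thumbnail-anchored initial position is sketched below. The clamping to the screen edge is an added assumption (the embodiment does not state how off-screen overflow is handled), and the rectangle convention `(x, y, w, h)` is hypothetical:

```python
def initial_window_rect(thumb_rect, win_w, win_h, screen_w, screen_h):
    """Center the initial playback window on the tapped thumbnail
    (thumb_rect = (x, y, w, h)), clamping the result so the window
    stays fully on screen."""
    tx, ty, tw, th = thumb_rect
    x = tx + tw // 2 - win_w // 2
    y = ty + th // 2 - win_h // 2
    x = max(0, min(x, screen_w - win_w))
    y = max(0, min(y, screen_h - win_h))
    return x, y
```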
  • the shooting technique of "moving the mirror” is used in the shooting process of the specific duration of the title of the first video clip to be played selected by the user, then, during the playback of the opening animation, the playback window of the first video clip
  • the corresponding size can also have a dynamic change effect from small to large, and the first video clip played in the playback window of the first video clip also has a scene in which the shot is gradually converted from a larger scene to a local close-up scene. Visual effects that grow from small to large.
• the size of the playback window 30 also gradually increases on the interface of the mobile phone, and during the playback process, the display size of the car in the video picture in the playback window 30 also gradually increases, and the display position gradually approaches the shooting lens.
• when the playback window 30 is initially displayed, it can be a small window that automatically pops up on the video list interface 402, and the initial size of the initially displayed playback window 30 can be the same as that of the thumbnail of the target video clip shown in (a) in FIG. 4.
  • the initial size of the initially displayed playback window 30 may be the size of the thumbnails of the target video clips scaled according to a certain ratio; or , the initial size of the initially displayed playback window 30 may be a predefined fixed size, which is not limited in this embodiment of the present application.
• the playback window has a dynamic change effect from small to large, and the target video played in the playback window is gradually converted from a larger scene to a local close-up scene, with the photographed subject growing from small to large. Therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user, that is, the user can experience the driving process of the photographed subject from far to near in a more profound and vivid way, which improves the user's visual experience.
  • the shooting technique of "pulling the mirror” is used in the shooting process of the specific duration of the title of the first video clip to be played selected by the user, then, during the playback of the opening animation, the playback window of the first video clip
• the corresponding size can also have a dynamic change effect from large to small, and the video played in the playback window of the first video clip is gradually converted from a local close-up scene to a larger scene, with the photographed subject showing a visual effect of changing from large to small.
• the size of the playback window 30 displayed on the interface of the mobile phone is gradually reduced, and in the opening animation, the car gradually moves away from the shooting lens, and the scope of the scene captured by the lens gradually increases.
• in summary, for a target video that uses the "pulling mirror" shooting technique within a certain duration of the title, after the user clicks to play, it is displayed to the user in the form of an opening animation.
• the playback window has a dynamic change effect from large to small, and the target video played in the playback window is gradually converted from a local close-up scene to a larger scene, with the photographed subject changing from large to small. Therefore, the playing process of the opening animation can further provide a coherent immersive experience for the user, that is, the user can experience the driving process of the photographed subject from near to far in a more profound and vivid way, which improves the user's visual experience.
  • the shooting technique from bottom to top in the "moving mirror” is used during the shooting of the title of the first video clip selected by the user for a specific length of time, then, during the playback of the opening animation, the first The play window of the video clip also has a corresponding dynamic change effect of moving from bottom to top, and the video clip played in the play window of the first video clip also has the visual effect of the camera moving up and down.
  • the top-to-bottom shooting technique in the "moving mirror” is used during the shooting of the title of the first video clip selected by the user for a specific length of time, then, during the playback of the opening animation, the first The play window of the video clip also has a corresponding dynamic change effect of moving up and down, and the video clip played in the play window of the first video clip also has a visual effect of the lens moving from top to bottom.
  • the playback process of the opening animation matches the playing effect of the moving mirror, that is, the playback window also presents the animation effect of moving up and down. Therefore, the playback of the opening animation
  • the process can further provide a coherent immersive experience for the user, that is, the user can experience the process of moving the camera up and down more deeply and vividly, which improves the user's visual experience.
  • FIG. 10 is a flowchart of an example of motion detection provided by an embodiment of the present application. As shown in FIG. 10 , taking the first N seconds of the title of the first video clip as an example, the motion detection process 1000 may include:
  • each segment is n seconds long.
• for example, the video clips are segmented into segments of 1 second each; the segmentation rule is not limited in this embodiment of the present application.
  • key frame may refer to the frame where the key action in the process of movement or change of the photographed object is located, and may also be referred to as a "transition frame” or an "intermediate frame”.
  • the maximum movement distance of the key points is calculated by perspective transformation as the time series feature of the n-second video clip.
  • steps 1001-1008 describe the process of determining the motion type of the first N seconds of the first video segment through motion detection analysis of structured key points (high-level features) in the video.
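As a rough illustration of this high-level-feature path, the sketch below classifies the camera movement from matched key points using their mean radial and vertical displacement. The real steps 1001-1008 use a perspective transformation, which is omitted here for brevity; the function name, thresholds, and motion labels are all hypothetical:

```python
import math

def classify_motion(prev_pts, curr_pts, center, tol=1.0):
    """Given matched key points of two key frames, classify the camera
    movement: points diverging from the frame center suggest a 'push'
    (zoom-in), converging points a 'pull' (zoom-out), and a common vertical
    shift of the scene suggests an upward or downward 'moving mirror'."""
    cx, cy = center
    n = len(prev_pts)
    # mean vertical displacement of the matched key points
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    # mean change of each point's distance from the frame center
    radial = sum(
        math.hypot(c[0] - cx, c[1] - cy) - math.hypot(p[0] - cx, p[1] - cy)
        for p, c in zip(prev_pts, curr_pts)
    ) / n
    if abs(radial) > abs(dy) and abs(radial) > tol:
        return "push" if radial > 0 else "pull"
    if abs(dy) > tol:
        # scene content drifting down on screen implies the camera tilts up
        return "move_up" if dy > 0 else "move_down"
    return "static"
```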
• the process of steps 1010-1013 can be performed; or, for step 1006, when the key points of the picture structure features extracted from other key frames do not match the key points of the first key frame, the process of steps 1010-1013 may also be performed.
  • the classifier performs classification based on the difference between the grayscale pixel histograms, and determines the motion type of the n-second video clip.
  • steps 1010-1013 describe the process of determining the motion type of the first N seconds of the first video segment according to the time-series pixel change histogram (underlying feature).
• this process classifies the extracted underlying pixel features using a machine-learning classifier (logistic regression) trained with offline data, which ensures the accuracy of the shot motion detection method and the reliability of the detection process in unknown scenes.
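A toy version of the underlying feature is shown below: the absolute difference between the grayscale pixel histograms of two consecutive frames, the kind of scalar a trained classifier would consume. The helper names are invented, and the classifier itself is not shown:

```python
def gray_histogram(gray_pixels):
    """256-bin histogram of 8-bit grayscale pixel values."""
    hist = [0] * 256
    for value in gray_pixels:
        hist[value] += 1
    return hist

def hist_diff_feature(frame_a, frame_b):
    """Total absolute difference between the grayscale histograms of two
    consecutive frames; large values indicate strong temporal change."""
    ha, hb = gray_histogram(frame_a), gray_histogram(frame_b)
    return sum(abs(a - b) for a, b in zip(ha, hb))
```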
  • the corresponding low-level features and high-level features in the video clip can be simultaneously extracted through the time-series pixel changes in the video data and the structured motion based on key frame matching.
• the process of steps 1001-1008 is preferentially selected: through the motion detection of structured key points (high-level features) in the video, an effective lens motion judgment is performed, and the motion type of the first N seconds of the first video segment is determined.
• if the scene is too complex and the key point detection and matching errors are large, the method automatically switches to the process of steps 1010-1013 and uses the time-series pixel change histogram (underlying feature) to effectively judge the lens motion, and then determines the motion type of the first N seconds of the first video segment. This process can be applied to videos using different mirror-movement techniques in more shooting scenarios, which improves the accuracy of video motion detection.
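The prefer-keypoints-then-fall-back logic can be sketched as follows; the error threshold, the result tuple shape, and the callable interface are invented for illustration only:

```python
def detect_motion_type(keypoint_result, histogram_classify, max_error=0.5):
    """keypoint_result: (label, match_error) from the key-point path, where
    label is None when key-point matching failed outright; histogram_classify
    is a fallback callable implementing the histogram path."""
    label, match_error = keypoint_result
    if label is not None and match_error <= max_error:
        return label, "keypoints"  # high-level features were reliable
    return histogram_classify(), "histogram"  # fall back to low-level features
```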
  • FIG. 11 is a schematic diagram of an implementation process of playing a video provided by an embodiment of the present application. As shown in FIG. 11 , the process 1100 includes:
  • the user selects a first video segment to be played on the video list interface.
  • the process may be shown in (a) in FIG. 4 , (a) in FIG. 6 , (a) in FIG. 7 , or (a) in FIG. 8 , which is not repeated here.
  • the user clicks on the display screen of the mobile phone to select the first video clip to be played, and the touch sensor 180K of the display screen detects the user's operation, and transmits the operation to the processor of the mobile phone, which is processed by The controller determines the first video segment to be played.
  • the video player is initialized.
• the process of this step 1102-1 can be understood as follows: the player of the mobile phone initializes and loads some controls, buttons, menus, etc. of the playback window; as shown in (b) in FIG. 5, during video playback, play control buttons, next-video control buttons, duration controls, full-screen controls, etc. may be displayed, and this step initializes the display of the subsequent video playback window.
• the components or controls initialized in step 1102-1 are generally in a hidden state or an uninitialized state, and can be directly displayed during the playback of the opening animation, so that the connection between the user interface and the playback interface of the opening animation can be more compact.
  • the screenshot of the video list interface can be used as a background picture of the unfolding process of the playback window, and the process can be implemented by a window manager, a content provider, etc., which will not be repeated here.
• the video list interface can be captured via a screenshot mechanism and saved as a picture frame; in addition, the first N seconds of the first video clip to be played can be decoded and extracted to be saved as a set of sequence frames.
  • the decoder acquires the file corresponding to the first video segment to be played, and decodes the file to determine the motion type.
  • steps 1102-1, 1102-2 and 1102-3 may be performed simultaneously; the embodiment of the present application does not limit the execution order of these internal processing steps.
  • the unfolding strategy of the first video segment is matched according to the result of the camera movement detection (the camera movement type), that is, the strategy of the opening animation is determined.
  • the strategy of the opening animation can be listed as follows:
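As one hedged way to picture the matching step, the detected camera movement type could index a table of window-unfolding strategies. The type names and strategy descriptions below are assumptions for illustration; the patent's own enumeration of strategies is not reproduced here:

```python
# Illustrative mapping from detected camera-movement type to an
# opening-animation (window-unfolding) strategy.
OPENING_ANIMATION_STRATEGY = {
    "pan_left":  "expand the window from the right edge toward the left",
    "pan_right": "expand the window from the left edge toward the right",
    "zoom_in":   "scale the window up from the selected thumbnail",
    "zoom_out":  "shrink the window onto the clip while fading the background",
    "static":    "fade the window in at its final size",
}

def match_strategy(movement_type: str) -> str:
    # fall back to a plain fade when the movement type is unrecognized
    return OPENING_ANIMATION_STRATEGY.get(
        movement_type, OPENING_ANIMATION_STRATEGY["static"])
```

A table lookup keeps the strategy selection decoupled from the camera-movement detector, so new movement types only require a new table entry.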
  • the renderer obtains the decoded file corresponding to the first video clip and the background image obtained from the screenshot, and performs drawing and rendering to synthesize a layer that can be displayed on the display screen.
  • step 1103 may be completed by a renderer of a video application or by a renderer built into the mobile phone system; the embodiment of the present application does not limit the execution subject of this process.
  • the process can be as shown in (b)-(c)-(d)-(e) in FIG. 4, (b)-(c)-(d)-(e) in FIG. 6, (b)-(c)-(d) in FIG. 7, or (b)-(c)-(d) in FIG. 8, and will not be repeated here.
  • the decoder continues to decode the video from the Nth second and sends it to the display control to play the video content; that is, the playback window can continue to play the first video clip from the Nth second, which will not be repeated here.
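The handoff described above can be sketched as a split of the clip timeline: the opening animation covers the pre-decoded sequence frames up to second N, and live decoding resumes from N. This is a sketch under assumed names, not the patent's implementation:

```python
def playback_plan(n_seconds: float, clip_duration_s: float):
    """Split the clip timeline between the opening animation (played from
    the saved sequence frames) and live decoding, which resumes at the
    Nth second so playback continues without a gap."""
    boundary = min(n_seconds, clip_duration_s)
    return {
        "animation_range": (0.0, boundary),          # saved sequence frames
        "live_range": (boundary, clip_duration_s),   # decoder resumes here
    }

plan = playback_plan(n_seconds=2.0, clip_duration_s=30.0)
```

Because the animation range ends exactly where the live range begins, the user sees one continuous playback rather than an intro followed by a restart.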
  • for a target video that uses different camera movement shooting techniques within a specific duration of the title video, the embodiment of the present application can detect the camera movement type of the target video and match different opening animations according to that type; that is, during playback of the target video, the playback window of the target video is expanded in different ways to present different visual effects to the user.
  • the playback window may also have a dynamic change effect; at the same time, along with the dynamic change of the size and transparency of the background image, the user can experience the dynamic change of the photographed object more deeply and vividly, further providing the user with a coherent immersive experience and improving the user's visual experience.
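The dynamic change of window size and background transparency can be pictured as simple interpolation over the animation's duration. The start/end sizes and the linear easing below are assumptions for illustration; a production animation would typically use a platform animation framework with a non-linear easing curve:

```python
def animation_state(t: float, duration: float,
                    start_size=(320.0, 180.0), end_size=(1080.0, 608.0)):
    """Return the playback-window size and background-image opacity at time
    t of the opening animation, linearly interpolated (sizes illustrative)."""
    p = min(max(t / duration, 0.0), 1.0)       # clamp progress to [0, 1]
    lerp = lambda a, b: a + (b - a) * p
    size = (lerp(start_size[0], end_size[0]),
            lerp(start_size[1], end_size[1]))
    background_alpha = 1.0 - p                 # background fades as the window unfolds
    return size, background_alpha
```

Driving both the window size and the background opacity from the same progress value keeps the two effects synchronized for the whole transition.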
  • the electronic device includes corresponding hardware and/or software modules for executing each function.
  • the present application can be implemented in hardware, or in a combination of hardware and computer software, in conjunction with the algorithm steps of each example described in the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functionality for each particular application, but such implementations should not be considered beyond the scope of this application.
  • the electronic device can be divided into functional modules according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is only a logical function division; there may be other division manners in actual implementation.
  • the electronic device may include: a display unit, a detection unit, a processing unit, and the like.
  • the display unit, the detection unit and the processing unit cooperate with each other to realize the steps and processes involved in the above method embodiments, which will not be repeated here.
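The cooperation of the three units can be sketched as a simple pipeline; the unit names follow the description above, but the interfaces, method names and event shape are assumptions for illustration:

```python
class DetectionUnit:
    """Detects the user operation, e.g. a tap on a video thumbnail."""
    def detect_tap(self, clip_id: str) -> dict:
        return {"event": "select_clip", "clip_id": clip_id}

class ProcessingUnit:
    """Determines the first video clip to play from the detected event."""
    def handle(self, event: dict) -> str:
        assert event["event"] == "select_clip"
        return event["clip_id"]

class DisplayUnit:
    """Plays the opening animation and the clip in the playback window."""
    def __init__(self):
        self.playing = None
    def play(self, clip_id: str):
        self.playing = clip_id

# cooperation: detection -> processing -> display
detection, processing, display = DetectionUnit(), ProcessingUnit(), DisplayUnit()
display.play(processing.handle(detection.detect_tap("clip_1")))
```

The one-way flow mirrors the method embodiments: the detection unit reports the operation, the processing unit decides what to play, and the display unit renders it.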
  • the electronic device provided in this embodiment is used to execute the above method for playing a video, so the same effect as the above implementation method can be achieved.
  • the electronic device may include a processing module, a memory module and a communication module.
  • the processing module may be used to control and manage the actions of the electronic device, for example, may be used to support the electronic device to perform the steps performed by the display unit, the detection unit and the processing unit.
  • the storage module may be used to support the electronic device to execute stored program codes and data, and the like.
  • the communication module can be used to support the communication between the electronic device and other devices.
  • the processing module may be a processor or a controller. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with this disclosure.
  • the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a digital signal processor (DSP) and a microprocessor, and the like.
  • the storage module may be a memory.
  • the communication module may specifically be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the electronic device involved in this embodiment may be a device having the structure shown in FIG. 2 .
  • This embodiment also provides a computer-readable storage medium in which computer instructions are stored; when the computer instructions are executed on the electronic device, the electronic device executes the above related method steps to implement the method for playing a video in the above embodiments.
  • This embodiment also provides a computer program product which, when run on a computer, causes the computer to execute the above related steps, so as to implement the method for playing a video in the above embodiment.
  • the embodiments of the present application also provide an apparatus, which may specifically be a chip, a component or a module; the apparatus may include a processor and a memory connected to each other, wherein the memory is used for storing computer-executable instructions. When the apparatus is running, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the method for playing a video in the foregoing method embodiments.
  • the electronic device, computer-readable storage medium, computer program product or chip provided in this embodiment are all used to execute the corresponding method provided above; therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the corresponding method provided above, which will not be repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling, direct coupling or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or units may be in electrical, mechanical or other forms.
  • Units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium.
  • a readable storage medium includes several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a video playback method and an electronic device. The electronic device may be a device that includes a display screen, such as a mobile phone, a tablet or a computer. The method is applied to a video playback process. The different camera movement types used within a specific duration of the title of a target video are detected, and matching is performed according to the camera movement types used in the target video so as to obtain an associated opening animation; that is, during playback of the target video, a playback window of the target video is expanded in an associated expansion manner, thereby presenting different visual effects to a user. In particular, the playback window of the target video may have a dynamic change effect, and a coherent immersive experience can further be provided for the user in conjunction with the dynamic change process of the size, transparency, etc. of a background image, thereby improving the user's visual experience.
PCT/CN2021/140541 2021-01-20 2021-12-22 Procédé de lecture de vidéo et dispositif électronique WO2022156473A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110080633.8 2021-01-20
CN202110080633.8A CN114866860B (zh) 2021-01-20 2021-01-20 一种播放视频的方法及电子设备

Publications (1)

Publication Number Publication Date
WO2022156473A1 true WO2022156473A1 (fr) 2022-07-28

Family

ID=82549276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/140541 WO2022156473A1 (fr) 2021-01-20 2021-12-22 Procédé de lecture de vidéo et dispositif électronique

Country Status (2)

Country Link
CN (1) CN114866860B (fr)
WO (1) WO2022156473A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117724783A (zh) * 2023-07-12 2024-03-19 荣耀终端有限公司 动效显示方法及电子设备

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115695889A (zh) * 2022-09-30 2023-02-03 聚好看科技股份有限公司 显示设备及悬浮窗显示方法
CN115967831B (zh) * 2022-10-28 2023-08-22 北京优酷科技有限公司 视频显示方法、装置、电子设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040179814A1 (en) * 2003-03-13 2004-09-16 Yoon Kyoung Ro Video reproducing method and apparatus and system using the same
US20080152297A1 (en) * 2006-12-22 2008-06-26 Apple Inc. Select Drag and Drop Operations on Video Thumbnails Across Clip Boundaries
CN103839562A (zh) * 2014-03-17 2014-06-04 杨雅 一种视频创作***
CN105169703A (zh) * 2014-06-09 2015-12-23 掌赢信息科技(上海)有限公司 智能手持设备点对点视频和游戏的互动融合方法、装置及***
CN105554553A (zh) * 2015-12-15 2016-05-04 腾讯科技(深圳)有限公司 通过悬浮窗口播放视频的方法及装置
CN105677159A (zh) * 2016-01-14 2016-06-15 深圳市至壹科技开发有限公司 视频显示方法和视频显示装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8020100B2 (en) * 2006-12-22 2011-09-13 Apple Inc. Fast creation of video segments
CN108337497B (zh) * 2018-02-07 2020-10-16 刘智勇 一种虚拟现实视频/图像格式以及拍摄、处理、播放方法和装置
CN110913136A (zh) * 2019-11-27 2020-03-24 维沃移动通信有限公司 视频拍摄方法、装置、电子设备及介质
CN111491183B (zh) * 2020-04-23 2022-07-12 百度在线网络技术(北京)有限公司 一种视频处理方法、装置、设备及存储介质



Also Published As

Publication number Publication date
CN114866860A (zh) 2022-08-05
CN114866860B (zh) 2023-07-11

Similar Documents

Publication Publication Date Title
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
WO2021136050A1 (fr) Procédé de photographie d'image et appareil associé
CN113645351B (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021000881A1 (fr) Procédé de division d'écran et dispositif électronique
WO2021104485A1 (fr) Procédé de photographie et dispositif électronique
WO2021036770A1 (fr) Procédé de traitement d'écran partagé et dispositif terminal
WO2022156473A1 (fr) Procédé de lecture de vidéo et dispositif électronique
WO2022127787A1 (fr) Procédé d'affichage d'image et dispositif électronique
WO2022017261A1 (fr) Procédé de synthèse d'image et dispositif électronique
CN113170037B (zh) 一种拍摄长曝光图像的方法和电子设备
WO2021258814A1 (fr) Procédé et appareil de synthèse vidéo, dispositif électronique, et support de stockage
WO2020192761A1 (fr) Procédé permettant d'enregistrer une émotion d'utilisateur et appareil associé
WO2021082815A1 (fr) Procédé d'affichage d'élément d'affichage et dispositif électronique
CN113935898A (zh) 图像处理方法、***、电子设备及计算机可读存储介质
WO2023273323A1 (fr) Procédé de mise au point et dispositif électronique
WO2021238370A1 (fr) Procédé de commande d'affichage, dispositif électronique, et support de stockage lisible par ordinateur
CN112068907A (zh) 一种界面显示方法和电子设备
CN115484380A (zh) 拍摄方法、图形用户界面及电子设备
CN113986070A (zh) 一种应用卡片的快速查看方法及电子设备
WO2022267783A1 (fr) Procédé de détermination d'une scène recommandée, et dispositif électronique
CN115115679A (zh) 一种图像配准方法及相关设备
CN115150542B (zh) 一种视频防抖方法及相关设备
CN114979457B (zh) 一种图像处理方法及相关装置
WO2021204103A1 (fr) Procédé de prévisualisation d'images, dispositif électronique et support de stockage
CN115808997A (zh) 一种预览方法、电子设备及***

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21920847

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21920847

Country of ref document: EP

Kind code of ref document: A1