WO2019084769A1 - Method and device for tracking shooting - Google Patents

Method and device for tracking shooting Download PDF

Info

Publication number
WO2019084769A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
mode
specified target
camera
image stream
Prior art date
Application number
PCT/CN2017/108574
Other languages
English (en)
French (fr)
Inventor
翁超
陈洪晶
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2017/108574 priority Critical patent/WO2019084769A1/zh
Priority to CN201780004508.0A priority patent/CN108496363A/zh
Publication of WO2019084769A1 publication Critical patent/WO2019084769A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Definitions

  • the present disclosure relates to the field of image processing and, more particularly, to a method and device for tracking shooting.
  • currently, mobile devices equipped with cameras are widely used. For example, a drone with a camera can track a desired subject and transmit the captured images and/or video to the user in real time.
  • tracking of a specified target can be achieved during the tracking process. That is, the user can select a target of interest, and the drone can then track the selected target as the specified target.
  • however, existing tracking modes involve only tracking under visible light, which cannot meet the needs of certain situations. For example, under insufficient illumination or at night, tracking by visible light alone may not guarantee tracking accuracy, or tracking may fail altogether.
  • the present disclosure proposes a scheme of controlling a tracking mode of a mobile device.
  • the target can be tracked in various tracking modes including visible light tracking and infrared tracking, thereby improving the accuracy of tracking and being applicable to a wider range of scenes.
  • according to one aspect of the present disclosure, a control device for controlling a tracking mode of a mobile device is provided.
  • the control device includes: an image mode selection unit configured to receive a user selection of a visible-light image stream mode and/or an infrared image stream mode; a tracking mode selection unit configured to receive a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and a sending unit configured to send information related to the selected image mode and tracking mode to the mobile device.
  • according to another aspect of the present disclosure, a method for controlling a tracking mode of a mobile device is provided. The method includes: selecting a visible-light image stream mode and/or an infrared image stream mode; receiving a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and sending information related to the selected image mode and tracking mode to the mobile device.
  • according to another aspect of the present disclosure, a mobile device includes a body, a camera, and a controller.
  • the body is used to carry the components.
  • a camera is carried on the body and is configured to capture a stream of visible light images and/or a stream of infrared images.
  • a controller is mounted to the body and configured to receive information related to an image mode and a tracking mode, thereby controlling shooting and posture of the camera.
  • the image mode includes a visible-light image stream mode and/or an infrared image stream mode. Tracking modes include tracking disabled, specified-target tracking, and hottest-spot tracking.
  • according to another aspect of the present disclosure, a method performed by a mobile device includes: receiving information related to an image mode and a tracking mode; and tracking and capturing a visible-light image stream and/or an infrared image stream according to the information related to the image mode and the tracking mode.
  • the image mode includes a visible-light image stream mode and/or an infrared image stream mode.
  • tracking modes include tracking disabled, specified-target tracking, and hottest-spot tracking.
  • a computer-readable storage medium stores a computer program which, when executed by at least one processor, causes the at least one processor to perform the method described above.
  • multiple tracking modes, including visible-light tracking and infrared tracking, can be realized through simple interactions. Therefore, the tracking accuracy of the mobile device is improved, and the application scenarios of the mobile device are broadened.
  • FIG. 1A is a block diagram showing a mobile device in accordance with one embodiment of the present disclosure.
  • FIG. 1B is a block diagram showing a mobile device in accordance with one embodiment of the present disclosure.
  • FIG. 2 is a flow chart showing a method performed by a mobile device in accordance with one embodiment of the present disclosure.
  • FIG. 3 is a block diagram showing a control device according to an embodiment of the present disclosure.
  • FIG. 4 is a flow chart showing a method performed by a control device in accordance with one embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram showing a computer readable storage medium in accordance with one embodiment of the present disclosure.
  • FIG. 1A is a block diagram showing a mobile device 10 in accordance with one embodiment of the present disclosure.
  • the mobile device 10 includes a body 110, a camera 120, and a controller 130.
  • the mobile device 10 described herein can be a variety of devices including, but not limited to, drones, unmanned vehicles, or unmanned vessels.
  • the principles of the present disclosure are described with the drone as an example of the mobile device 10.
  • those skilled in the art will appreciate that the described principles are equally applicable to other forms of mobile devices.
  • the body 110 is used to carry various components of the mobile device.
  • camera 120 and controller 130 can be carried (mounted) on body 110.
  • the camera 120 is carried on the body 110, and the controller 130 controls the posture of the camera 120 by controlling the posture of the body 110.
  • camera 120 may be configured to capture a visible-light image stream and/or an infrared image stream. It can be understood that when the camera 120 includes a visible-light camera, a visible-light image stream can be captured; when the camera 120 includes an infrared camera, an infrared image stream can be captured; and when the camera 120 includes a dual-lens camera (for example, a camera having a visible-light lens and an infrared lens), both a visible-light image stream and an infrared image stream can be captured.
  • image stream refers to a stream of data including images and/or video.
  • the controller 130 is mounted on the body 110 and can be configured to receive information related to the image mode and the tracking mode, thereby controlling the shooting and posture of the camera 120.
  • the image mode may include a visible-light image stream mode and/or an infrared image stream mode
  • the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking. Details of the image mode and the tracking mode are described below.
  • the body 110 may be a housing structure, and the controller 130 may be mounted inside or outside the housing.
  • FIG. 1B is a block diagram showing a mobile device 10 in accordance with one embodiment of the present disclosure.
  • compared with FIG. 1A, the mobile device in FIG. 1B further includes a gimbal 140, which can be mounted on the body 110.
  • the camera 120 is mounted on the body 110 of the mobile device 10 via the gimbal 140.
  • in this case, the controller 130 can control the posture of the camera 120 by controlling the posture of the body 110 and/or the gimbal 140 of the mobile device 10.
  • otherwise, the structure of the mobile device 10 in FIG. 1B and the functions of its components are the same as those of the mobile device 10 in FIG. 1A.
  • the controller 130 may receive tracking mode information indicating specified-target tracking together with position information of the specified target, extract feature parameters of the specified target according to the position information, and control the posture of the camera 120 based on the feature parameters so that the camera 120 tracks and shoots the specified target.
  • "specified target" refers to a target object that is of interest to the user and needs to be tracked and shot.
  • if the current tracking mode is specified-target tracking, the controller 130 may control the camera 120 to capture a visible-light image stream of the specified target (the image mode is the visible-light image stream mode).
  • alternatively, the controller 130 may control the camera 120 to capture a fused image stream of the visible-light and infrared image streams of the specified target (the image mode is the visible-light and infrared image stream mode; it can be understood that in this case the camera 120 must have infrared imaging capability).
  • fusing the visible-light and infrared image streams can improve tracking accuracy; in particular, when light is insufficient, the visible-light image stream alone may not track the target object accurately.
  • if the current tracking mode is hottest-spot tracking, the controller 130 obtains the hottest spot collected by the camera 120 by infrared imaging, together with its position information, and thereby controls the posture of the camera 120 so that the camera 120 tracks the hottest spot.
  • in this case, the camera 120 can capture a fused image stream of the visible-light and infrared image streams containing the hottest spot (the image mode is the visible-light and infrared image stream mode), or capture only the infrared image stream containing the hottest spot (the image mode is the infrared image stream mode). For example, when monitoring a fire at night, only the fire point may be of interest. In that case, only the infrared image stream of the fire point needs to be tracked and captured, to assist disaster relief.
  • if the current tracking mode is tracking disabled, the controller 130 may stop tracking control of the camera 120.
  • with the above, multiple tracking modes of the mobile device can be implemented, including tracking under visible light and under infrared. Therefore, the tracking accuracy of the mobile device is improved, and the application scenarios of the mobile device are broadened.
  • FIG. 2 is a flowchart showing a method performed by a mobile device in accordance with one embodiment of the present disclosure.
  • the method can be performed by the mobile device 10 shown in FIG. 1A or 1B.
  • the description of the details of the mobile device 10 is omitted below for the sake of brevity.
  • in step S210, information related to the image mode and the tracking mode is received.
  • the image mode may include a visible-light image stream mode and/or an infrared image stream mode
  • the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking.
  • in step S220, a visible-light image stream and/or an infrared image stream is tracked and captured according to the information related to the image mode and the tracking mode.
  • tracking mode information indicating specified-target tracking and position information of the specified target are received.
  • the feature parameters of the specified target are extracted based on the position information, and tracking and shooting of the specified target are controlled based on the feature parameters.
  • the visible-light image stream can be captured, or a fused image stream of the visible-light and infrared image streams can be captured.
  • the hottest spot collected by infrared imaging, together with its position information, can be obtained, and the hottest spot is thereby tracked.
  • FIG. 3 is a block diagram showing a control device according to an embodiment of the present disclosure.
  • the control device 30 includes an image mode selection unit 310, a tracking mode selection unit 320, and a transmission unit 330.
  • control device 30 may also include a display unit 340 (shown in a dashed box). Next, the operation of each component in the control device 30 will be described in detail.
  • control device 30 can be various forms of devices including, but not limited to, a smartphone, a control terminal, a personal digital assistant (PDA), and the like.
  • the principle of the present disclosure will be described with a smartphone as an example of the control device 30.
  • Those skilled in the art will appreciate that the principles described are equally applicable to other forms of control devices.
  • the image mode selection unit 310 is configured to receive a user selection of a visible light image stream mode and/or an infrared image stream mode.
  • as described above, the image mode may include a visible-light image stream mode and/or an infrared image stream mode, and the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking.
  • tracking mode selection unit 320 is configured to receive a user selection of a tracking mode.
  • the tracking mode can include tracking disabled, specified-target tracking, and hottest-spot tracking.
  • the sending unit 330 is configured to send information related to the selected image mode and tracking mode to the mobile device (e.g., the mobile device 10 illustrated in FIGS. 1A and 1B).
  • the tracking mode selection unit 320 may be configured to further receive, if the user selects the specified-target tracking mode, the user's selection of the specified target in the user interface displayed on the display unit 340.
  • the sending unit 330 may be further configured to: if the user selects the specified-target tracking mode, send the tracking mode information indicating specified-target tracking, together with the position information of the specified target in the user interface, to the mobile device (for example, the mobile device 10 shown in FIGS. 1A and 1B).
  • the display unit 340 is further configured to receive the visible light image stream and/or the infrared image stream captured by the mobile device according to the selection of the image mode by the user (via the image mode selection unit 310).
  • in the specified-target tracking mode, a visible-light image containing the specified target, or a fused image (visible-light image and infrared image) containing the specified target, may be displayed on the display unit 340.
  • in the hottest-spot tracking mode, an infrared image containing the hottest spot, or a fused image containing the hottest spot, may be displayed on the display unit 340.
  • one example of the control device 30 is a smartphone running a particular application, the particular application including a display interface.
  • in the display interface, there is a tracking mode selection button.
  • the user can tap the button repeatedly to cycle among the three tracking modes (i.e., tracking disabled, specified-target tracking, and hottest-spot tracking).
  • the display interface can also include buttons for the user to select a visible-light stream and/or an infrared stream.
  • the mobile device does not perform any tracking behavior when the user selects the tracking-disabled mode.
  • when the user selects the specified-target tracking mode and has selected the visible-light stream, the user can select the tracking target on the display interface.
  • the particular application sends the coordinates of the tracking target selected by the user and the current tracking mode to the mobile device. In this way, the mobile device extracts the target features based on the visible-light stream and the coordinate parameters and controls the camera to track the specified target.
  • when the user selects the specified-target tracking mode and has selected the fused infrared and visible-light stream, the user can select the tracking target on the display interface.
  • the particular application sends the coordinates of the tracking target selected by the user and the current tracking mode to the mobile device. In this way, the mobile device extracts the target features based on the fused stream and the coordinate parameters and controls the camera to track the specified target.
  • when the user selects the hottest-spot tracking mode, the display interface must show a pure infrared stream or a fused infrared and visible-light stream (i.e., a visible-light-only stream will not appear). The particular application sends the current tracking mode to the mobile device.
  • the camera of the mobile device collects, in real time by infrared imaging, the temperature and coordinates of the hottest spot (the point of highest temperature), and the camera is controlled to follow and shoot the hottest spot according to that temperature and those coordinates.
  • the image stream captured by the mobile device can be displayed on the display interface of the smartphone.
  • FIG. 4 is a flow chart showing a method performed by a control device in accordance with one embodiment of the present disclosure.
  • the method can be performed by the control device 30 shown in FIG. 3.
  • the description of the details of the control device 30 is omitted below for the sake of brevity.
  • in step S410, a visible-light image stream mode and/or an infrared image stream mode is selected.
  • in step S420, a user selection of the tracking mode is received; the tracking mode includes tracking disabled, specified-target tracking, and hottest-spot tracking.
  • in step S430, information related to the selected image mode and tracking mode is sent to the mobile device.
  • if the user selects the specified-target tracking mode, the user's selection of the specified target in the user interface of the control device is further received, and the tracking mode information indicating specified-target tracking is sent to the mobile device together with the position information of the specified target in the user interface.
  • the visible-light image stream and/or the infrared image stream captured by the mobile device is received according to the user's selection of the image mode.
  • embodiments of the present disclosure may be implemented by means of a computer program product.
  • the computer program product can be a computer readable storage medium.
  • a computer program is stored on the computer-readable storage medium; when executed on a computing device, it can perform related operations to implement the above-described technical solutions of the present disclosure.
  • FIG. 5 is a block diagram showing a computer-readable storage medium 50 in accordance with one embodiment of the present disclosure.
  • computer-readable storage medium 50 includes a computer program 510.
  • the computer program 510, when executed by at least one processor, causes the at least one processor to perform the steps of the methods described with reference to FIGS. 2 and 4.
  • examples of computer readable storage medium 50 include, but are not limited to, a semiconductor storage medium, an optical storage medium, a magnetic storage medium, or any other form of computer readable storage medium.
  • such arrangements of the present disclosure are typically provided as software, code, and/or other data structures arranged or encoded on a computer-readable medium such as an optical medium (e.g., CD-ROM), a floppy disk, or a hard disk; as firmware or microcode on one or more ROM, RAM, or PROM chips; or as a downloadable software image or shared database in one or more modules.
  • Software or firmware or such a configuration may be installed on the computing device such that one or more processors in the computing device perform the technical solutions described in the embodiments of the present disclosure.
  • each functional module or individual feature of the device used in each of the above embodiments may be implemented or executed by circuitry, typically one or more integrated circuits.
  • circuitry designed to perform the various functions described in this specification can include general-purpose processors, digital signal processors (DSPs), application-specific integrated circuits (ASICs) or general-purpose integrated circuits, field-programmable gate arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic, discrete hardware components, or any combination of the above.
  • a general-purpose processor may be a microprocessor, or the processor may be an existing processor, controller, microcontroller, or state machine.
  • the above general-purpose processor or each circuit may be configured by a digital circuit or may be configured by a logic circuit.
  • when advances in semiconductor technology produce advanced technologies capable of replacing current integrated circuits, the present disclosure may also use integrated circuits obtained with such advanced technologies.
  • the program running on the device may be a program that causes a computer to implement the functions of the embodiments of the present invention by controlling a central processing unit (CPU).
  • the program or information processed by the program may be temporarily stored in a volatile memory (such as a random access memory RAM), a hard disk drive (HDD), a non-volatile memory (such as a flash memory), or other memory system.
  • a program for realizing the functions of the embodiments of the present invention can be recorded on a computer readable recording medium.
  • the corresponding functions can be realized by causing a computer system to read programs recorded on the recording medium and execute the programs.
  • the so-called "computer system” herein may be a computer system embedded in the device, and may include an operating system or hardware (such as a peripheral device).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Telephone Function (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Provided is a control device for controlling a tracking mode of a mobile device. The control device includes an image mode selection unit configured to receive a user selection of a visible-light image stream mode and/or an infrared image stream mode. The control device further includes a tracking mode selection unit configured to receive a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking. The control device further includes a sending unit configured to send information related to the selected image mode and tracking mode to the mobile device. A method for controlling a tracking mode of a mobile device is also provided.

Description

Method and device for tracking shooting
Technical Field
The present disclosure relates to the field of image processing and, more particularly, to a method and device for tracking shooting.
Background
Currently, mobile devices equipped with cameras are widely used. For example, a drone with a camera can track and shoot a desired subject and transmit the captured images and/or video to the user in real time.
During tracking, tracking of a specified target can be achieved. That is, the user can select a target of interest, and the drone can then track the selected target as the specified target.
However, existing tracking modes involve only tracking under visible light, which cannot meet the needs of certain situations. For example, under insufficient illumination or at night, tracking by visible light alone may not guarantee tracking accuracy, or tracking may fail altogether.
Summary
To solve at least some of the above problems, the present disclosure proposes a scheme for controlling a tracking mode of a mobile device. In this technical solution, a target can be tracked in multiple tracking modes, including visible-light tracking and infrared tracking, thereby improving tracking accuracy and enabling application to a wider range of scenes.
According to one aspect of the present disclosure, a control device for controlling a tracking mode of a mobile device is provided. The control device includes: an image mode selection unit configured to receive a user selection of a visible-light image stream mode and/or an infrared image stream mode; a tracking mode selection unit configured to receive a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and a sending unit configured to send information related to the selected image mode and tracking mode to the mobile device.
According to another aspect of the present disclosure, a method for controlling a tracking mode of a mobile device is provided. The method includes: selecting a visible-light image stream mode and/or an infrared image stream mode; receiving a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and sending information related to the selected image mode and tracking mode to the mobile device.
According to another aspect of the present disclosure, a mobile device is provided, including a body, a camera, and a controller. The body is used to carry components. The camera is carried on the body and is configured to capture a visible-light image stream and/or an infrared image stream. The controller is mounted on the body and is configured to receive information related to an image mode and a tracking mode, thereby controlling the shooting and the posture of the camera. The image mode includes a visible-light image stream mode and/or an infrared image stream mode. The tracking mode includes tracking disabled, specified-target tracking, and hottest-spot tracking.
According to another aspect of the present disclosure, a method performed by a mobile device is provided, including: receiving information related to an image mode and a tracking mode; and tracking and capturing a visible-light image stream and/or an infrared image stream according to the information related to the image mode and the tracking mode. The image mode includes a visible-light image stream mode and/or an infrared image stream mode. The tracking mode includes tracking disabled, specified-target tracking, and hottest-spot tracking.
According to another aspect of the present disclosure, a computer-readable storage medium storing a computer program is provided. When the computer program is run by at least one processor, it causes the at least one processor to perform the method described above.
With the technical solutions of the present disclosure, multiple tracking modes, including visible-light tracking and infrared tracking, can be realized through simple interactions. Therefore, the tracking accuracy of the mobile device is improved, and the application scenarios of the mobile device are broadened.
Brief Description of the Drawings
The above and other features of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1A is a block diagram showing a mobile device according to one embodiment of the present disclosure.
FIG. 1B is a block diagram showing a mobile device according to one embodiment of the present disclosure.
FIG. 2 is a flowchart showing a method performed by a mobile device according to one embodiment of the present disclosure.
FIG. 3 is a block diagram showing a control device according to one embodiment of the present disclosure.
FIG. 4 is a flowchart showing a method performed by a control device according to one embodiment of the present disclosure.
FIG. 5 is a schematic diagram showing a computer-readable storage medium according to one embodiment of the present disclosure.
Note that the drawings are not necessarily drawn to scale; the emphasis is on illustrating the principles of the technology disclosed herein. In addition, for clarity, like reference numerals refer to like elements throughout the drawings.
Detailed Description
The present disclosure is described in detail below with reference to the drawings and specific embodiments. It should be noted that the present disclosure is not limited to the specific embodiments described below. In addition, for brevity, detailed descriptions of well-known technologies not directly related to the present disclosure are omitted to avoid obscuring the understanding of the present disclosure.
FIG. 1A is a block diagram showing a mobile device 10 according to one embodiment of the present disclosure. As shown in FIG. 1A, the mobile device 10 includes a body 110, a camera 120, and a controller 130.
It should be noted that the mobile device 10 described herein can take various forms, including, but not limited to, a drone, an unmanned vehicle, or an unmanned vessel. In the following, a drone is used as an example of the mobile device 10 to describe the principles of the present disclosure. However, those skilled in the art will appreciate that the described principles are equally applicable to other forms of mobile devices.
The body 110 is used to carry the various components of the mobile device. For example, the camera 120 and the controller 130 can be carried (mounted) on the body 110.
The camera 120 is carried on the body 110, and the controller 130 controls the posture of the camera 120 by controlling the posture of the body 110. In this application, the camera 120 may be configured to capture a visible-light image stream and/or an infrared image stream. It can be understood that when the camera 120 includes a visible-light camera, a visible-light image stream can be captured; when the camera 120 includes an infrared camera, an infrared image stream can be captured; and when the camera 120 includes a dual-lens camera (for example, a camera with a visible-light lens and an infrared lens), both a visible-light image stream and an infrared image stream can be captured. Here, an "image stream" refers to a data stream including images and/or video.
The controller 130 is mounted on the body 110 and may be configured to receive information related to the image mode and the tracking mode, thereby controlling the shooting and the posture of the camera 120. Herein, the image mode may include a visible-light image stream mode and/or an infrared image stream mode, and the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking. Details of the image mode and the tracking mode are described below.
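For illustration only, the mode information that the controller receives can be modeled as a small message type. All names and the encoding below are hypothetical sketches and are not part of the disclosure:

```python
from dataclasses import dataclass
from enum import Enum, Flag, auto
from typing import Optional, Tuple

class ImageMode(Flag):
    VISIBLE = auto()    # visible-light image stream
    INFRARED = auto()   # infrared image stream; VISIBLE | INFRARED models the fused stream

class TrackingMode(Enum):
    DISABLED = 0          # tracking disabled
    SPECIFIED_TARGET = 1  # specified-target tracking
    HOTTEST_SPOT = 2      # hottest-spot tracking

@dataclass
class ModeMessage:
    """One message bundling the image mode, the tracking mode, and, for
    specified-target tracking, the target's position in the displayed image."""
    image_mode: ImageMode
    tracking_mode: TrackingMode
    target_xy: Optional[Tuple[int, int]] = None  # pixel coordinates, if any

# Example: fused visible-light and infrared stream, tracking a user-selected point.
msg = ModeMessage(ImageMode.VISIBLE | ImageMode.INFRARED,
                  TrackingMode.SPECIFIED_TARGET, target_xy=(320, 240))
```

A message like this would be produced by the control device and consumed by the controller 130; the actual wire format used by the disclosure is not specified.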
Specifically, the body 110 may be a housing structure, and the controller 130 may be mounted inside or outside the housing.
FIG. 1B is a block diagram showing a mobile device 10 according to one embodiment of the present disclosure. Compared with FIG. 1A, the mobile device in FIG. 1B further includes a gimbal 140, which can be mounted on the body 110. The camera 120 is mounted on the body 110 of the mobile device 10 via the gimbal 140. In this case, the controller 130 can control the posture of the camera 120 by controlling the posture of the body 110 and/or the gimbal 140 of the mobile device 10. Otherwise, the structure of the mobile device 10 in FIG. 1B and the functions of its components are the same as those of the mobile device 10 in FIG. 1A.
The controller 130 may receive tracking mode information indicating specified-target tracking together with position information of the specified target, extract feature parameters of the specified target according to the position information, and control the posture of the camera 120 based on the feature parameters so that the camera 120 tracks and shoots the specified target. Here, a "specified target" refers to a target object that is of interest to the user and needs to be tracked and shot.
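As a rough sketch of the "extract feature parameters according to the position information" step, a minimal tracker could crop an appearance template around the user-selected position and relocate it in later frames by sum-of-squared-differences matching. Real trackers use far stronger features; every name here is illustrative, and frames are plain row-major lists of lists of intensities:

```python
def extract_template(frame, cx, cy, r=2):
    """Crop a (2r+1) x (2r+1) patch around the user-selected position (cx, cy);
    this patch plays the role of the target's 'feature parameters' in this sketch."""
    return [row[cx - r: cx + r + 1] for row in frame[cy - r: cy + r + 1]]

def locate(frame, template):
    """Slide the template over the frame and return the top-left (x, y)
    position minimizing the sum of squared differences."""
    th, tw = len(template), len(template[0])
    best, best_xy = float("inf"), (0, 0)
    for y in range(len(frame) - th + 1):
        for x in range(len(frame[0]) - tw + 1):
            ssd = sum((frame[y + j][x + i] - template[j][i]) ** 2
                      for j in range(th) for i in range(tw))
            if ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy
```

The located position would then drive the posture control of the body and/or gimbal so that the target stays framed.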
If the current tracking mode is specified-target tracking, the controller 130 may control the camera 120 to capture a visible-light image stream of the specified target (the image mode is the visible-light image stream mode). Alternatively, the controller 130 may control the camera 120 to capture a fused image stream of the visible-light and infrared image streams of the specified target (the image mode is the visible-light and infrared image stream mode; it can be understood that in this case the camera 120 must have infrared imaging capability). Using a fused stream of visible-light and infrared images can improve tracking accuracy; in particular, when light is insufficient, the visible-light image stream alone may not track the target object accurately.
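The fused stream mentioned above can be approximated, under the assumption of co-registered and equally sized frames, by a simple pixel-wise weighted blend. Practical systems use more elaborate fusion (e.g., multi-scale methods), so this is only a sketch with an illustrative weighting:

```python
def fuse(visible, infrared, alpha=0.6):
    """Blend a visible-light frame and an infrared frame pixel by pixel.
    alpha weights the visible-light contribution; (1 - alpha) weights infrared.
    Both frames are assumed co-registered lists of lists of the same shape."""
    return [[alpha * v + (1 - alpha) * t for v, t in zip(vrow, trow)]
            for vrow, trow in zip(visible, infrared)]
```

Feature extraction and matching would then run on the fused frames instead of the visible-light frames alone.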
If the current tracking mode is hottest-spot tracking, the controller 130 obtains the hottest spot collected by the camera 120 by infrared imaging, together with its position information, and thereby controls the posture of the camera 120 so that the camera 120 tracks and shoots the hottest spot. In this case, the camera 120 may capture a fused image stream of the visible-light and infrared image streams containing the hottest spot (the image mode is the visible-light and infrared image stream mode), or capture only the infrared image stream containing the hottest spot (the image mode is the infrared image stream mode). For example, when monitoring a fire at night, only the fire point may be of interest. In that case, only the infrared image stream of the fire point needs to be tracked and captured, to assist disaster relief.
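Hottest-spot selection reduces to finding the maximum of a thermal frame. A minimal sketch, taking a row-major list of lists of temperatures and using illustrative names:

```python
def hottest_spot(thermal):
    """Return ((x, y), temperature) for the hottest pixel in a 2-D thermal frame."""
    best_t, best_xy = float("-inf"), (0, 0)
    for y, row in enumerate(thermal):
        for x, t in enumerate(row):
            if t > best_t:          # strict '>' keeps the first occurrence on ties
                best_t, best_xy = t, (x, y)
    return best_xy, best_t
```

The returned coordinates are the "position information" that the controller would use to steer the camera toward the hottest spot.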
If the current tracking mode is tracking disabled, the controller 130 may stop tracking control of the camera 120.
With the technical solutions in the above embodiments, multiple tracking modes of the mobile device can be implemented, including various tracking modes under visible light and under infrared. Therefore, the tracking accuracy of the mobile device is improved, and the application scenarios of the mobile device are broadened.
FIG. 2 is a flowchart showing a method performed by a mobile device according to one embodiment of the present disclosure. For example, the method can be performed by the mobile device 10 shown in FIG. 1A or 1B. For brevity, the details of the mobile device 10 are not repeated below.
As shown in FIG. 2, in step S210, information related to an image mode and a tracking mode is received. As described above, the image mode may include a visible-light image stream mode and/or an infrared image stream mode, and the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking.
In step S220, a visible-light image stream and/or an infrared image stream is tracked and captured according to the information related to the image mode and the tracking mode.
In one example, tracking mode information indicating specified-target tracking and position information of the specified target are received. Feature parameters of the specified target are extracted according to the position information, and tracking and shooting of the specified target are controlled based on the feature parameters.
For example, if the current tracking mode is specified-target tracking, a visible-light image stream can be captured, or a fused image stream of the visible-light and infrared image streams can be captured.
For example, if the current tracking mode is hottest-spot tracking, the hottest spot collected by infrared imaging and its position information can be obtained, and the hottest spot is thereby tracked and shot. Alternatively, in the hottest-spot tracking mode, only the infrared image stream may be captured.
The above describes a mobile device according to one embodiment of the present disclosure and the method it performs. Next, the control device operating in correspondence with the mobile device, and its method, are described in detail.
FIG. 3 is a block diagram showing a control device according to one embodiment of the present disclosure. As shown in FIG. 3, the control device 30 includes an image mode selection unit 310, a tracking mode selection unit 320, and a sending unit 330. Alternatively, the control device 30 may further include a display unit 340 (shown in a dashed box). The operation of each component of the control device 30 is described in detail below.
It should be noted that the control device 30 can take various forms, including, but not limited to, a smartphone, a control terminal, a personal digital assistant (PDA), and the like. In the following, a smartphone is used as an example of the control device 30 to describe the principles of the present disclosure. Those skilled in the art will appreciate that the described principles are equally applicable to other forms of control devices.
The image mode selection unit 310 is configured to receive a user selection of a visible-light image stream mode and/or an infrared image stream mode. As described above, the image mode may include a visible-light image stream mode and/or an infrared image stream mode, and the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking.
The tracking mode selection unit 320 is configured to receive a user selection of a tracking mode. As described above, the tracking mode may include tracking disabled, specified-target tracking, and hottest-spot tracking.
The sending unit 330 is configured to send information related to the selected image mode and tracking mode to the mobile device (for example, the mobile device 10 shown in FIGS. 1A and 1B).
Preferably, when the control device 30 includes the display unit 340, the tracking mode selection unit 320 may be configured to further receive, if the user selects the specified-target tracking mode, a user selection of the specified target in the user interface displayed on the display unit 340. In addition, the sending unit 330 may be further configured to send, if the user selects the specified-target tracking mode, tracking mode information indicating specified-target tracking, together with position information of the specified target in the user interface, to the mobile device (for example, the mobile device 10 shown in FIGS. 1A and 1B).
Preferably, the display unit 340 may be further configured to receive, according to the user's selection of the image mode (via the image mode selection unit 310), the visible-light image stream and/or the infrared image stream captured by the mobile device. For example, in the specified-target tracking mode, a visible-light image containing the specified target, or a fused image (visible-light image and infrared image) containing the specified target, may be displayed on the display unit 340. Alternatively, in the hottest-spot tracking mode, an infrared image containing the hottest spot, or a fused image containing the hottest spot, may be displayed on the display unit 340.
For example, one example of the control device 30 is a smartphone running a particular application that includes a display interface. The display interface has a tracking mode selection button. For example, the user can tap the button repeatedly to cycle among the three tracking modes (i.e., tracking disabled, specified-target tracking, and hottest-spot tracking). The display interface may also include buttons for the user to select a visible-light stream and/or an infrared stream.
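The single-button interaction described above, in which repeated taps cycle through the three tracking modes, can be sketched as follows (class and mode names are illustrative):

```python
MODES = ["tracking disabled", "specified-target tracking", "hottest-spot tracking"]

class ModeButton:
    """Each tap advances to the next tracking mode and wraps around,
    mirroring the single-button cycling described in the text."""
    def __init__(self):
        self.index = 0  # start in the tracking-disabled mode

    def tap(self):
        self.index = (self.index + 1) % len(MODES)
        return MODES[self.index]
```

In a real application, each tap would also trigger the sending unit to transmit the newly selected mode to the mobile device.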
- When the user selects the tracking-disabled mode, the mobile device does not perform any tracking behavior.
- When the user selects the specified-target tracking mode and has selected the visible-light stream, the user can select a tracking target on the display interface. The application sends the coordinates of the tracking target selected by the user and the current tracking mode to the mobile device. The mobile device then extracts target features based on the visible-light stream and the coordinate parameters and controls the camera to track the specified target.
- When the user selects the specified-target tracking mode and has selected the fused infrared and visible-light stream, the user can select a tracking target on the display interface. The application sends the coordinates of the tracking target selected by the user and the current tracking mode to the mobile device. The mobile device then extracts target features based on the fused stream and the coordinate parameters and controls the camera to track the specified target.
- When the user selects the hottest-spot tracking mode, the display interface must show a pure infrared stream or a fused infrared and visible-light stream (i.e., a visible-light-only stream will not appear). The application sends the current tracking mode to the mobile device. The camera of the mobile device collects, in real time by infrared imaging, the temperature and coordinates of the hottest spot (the point of highest temperature), and the camera is controlled to follow and shoot the hottest spot according to that temperature and those coordinates.
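Recentering the camera on the hottest spot requires converting its pixel coordinates into angular offsets for the body and/or gimbal. A proportional sketch is below; the field-of-view values are placeholders and are not taken from the disclosure:

```python
def follow_offsets(spot_xy, frame_size, hfov_deg=57.0, vfov_deg=44.0):
    """Convert the hottest spot's pixel position into approximate yaw/pitch
    offsets in degrees that would recenter it. Positive yaw means the spot is
    right of center; positive pitch means it is below center."""
    (x, y), (w, h) = spot_xy, frame_size
    yaw = (x / w - 0.5) * hfov_deg
    pitch = (y / h - 0.5) * vfov_deg
    return yaw, pitch
```

A controller would apply these offsets (usually scaled by a gain and rate-limited) every frame, so the hottest spot converges to the image center.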
Alternatively, the image stream captured by the mobile device can be displayed on the display interface of the smartphone.
FIG. 4 is a flowchart showing a method performed by a control device according to one embodiment of the present disclosure. For example, the method can be performed by the control device 30 shown in FIG. 3. For brevity, the details of the control device 30 are not repeated below.
As shown in FIG. 4, in step S410, a visible-light image stream mode and/or an infrared image stream mode is selected.
In step S420, a user selection of a tracking mode is received. As described above, the tracking mode includes tracking disabled, specified-target tracking, and hottest-spot tracking.
In step S430, information related to the selected image mode and tracking mode is sent to the mobile device.
Preferably, if the user selects the specified-target tracking mode, a user selection of the specified target in the user interface of the control device is further received. If the user selects the specified-target tracking mode, tracking mode information indicating specified-target tracking is sent to the mobile device together with position information of the specified target in the user interface.
Preferably, the visible-light image stream and/or the infrared image stream captured by the mobile device is received according to the user's selection of the image mode.
In addition, the embodiments of the present disclosure may be implemented by means of a computer program product. For example, the computer program product may be a computer-readable storage medium. A computer program is stored on the computer-readable storage medium and, when executed on a computing device, can perform related operations to implement the above technical solutions of the present disclosure.
For example, FIG. 5 is a block diagram showing a computer-readable storage medium 50 according to one embodiment of the present disclosure. As shown in FIG. 5, the computer-readable storage medium 50 includes a computer program 510. When run by at least one processor, the computer program 510 causes the at least one processor to perform the steps of, for example, the methods described with reference to FIGS. 2 and 4. Those skilled in the art will appreciate that examples of the computer-readable storage medium 50 include, but are not limited to, a semiconductor storage medium, an optical storage medium, a magnetic storage medium, or any other form of computer-readable storage medium.
The method of the present disclosure and the devices involved have been described above in conjunction with preferred embodiments. Those skilled in the art will appreciate that the methods shown above are merely exemplary. The method of the present disclosure is not limited to the steps and order shown above.
It should be understood that the above embodiments of the present disclosure may be implemented by software, hardware, or a combination of both. Such arrangements of the present disclosure are typically provided as software, code, and/or other data structures arranged or encoded on a computer-readable medium such as an optical medium (e.g., CD-ROM), a floppy disk, or a hard disk; as firmware or microcode on one or more ROM, RAM, or PROM chips; or as a downloadable software image or shared database in one or more modules. The software or firmware or such a configuration may be installed on a computing device so that one or more processors in the computing device perform the technical solutions described in the embodiments of the present disclosure.
In addition, each functional module or feature of the devices used in each of the above embodiments may be implemented or executed by circuitry, typically one or more integrated circuits. Circuitry designed to perform the functions described in this specification may include a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC) or general-purpose integrated circuit, a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of the above. A general-purpose processor may be a microprocessor, or the processor may be an existing processor, controller, microcontroller, or state machine. The above general-purpose processor or each circuit may be configured by a digital circuit or by a logic circuit. In addition, when advances in semiconductor technology produce advanced technologies capable of replacing current integrated circuits, the present disclosure may also use integrated circuits obtained with such advanced technologies.
A program running on a device according to the present invention may be a program that causes a computer to realize the functions of the embodiments of the present invention by controlling a central processing unit (CPU). The program, or information processed by the program, may be temporarily stored in a volatile memory (such as a random-access memory (RAM)), a hard disk drive (HDD), a non-volatile memory (such as a flash memory), or another memory system. A program for realizing the functions of the embodiments of the present invention may be recorded on a computer-readable recording medium. The corresponding functions can be realized by having a computer system read and execute the programs recorded on the recording medium. The so-called "computer system" here may be a computer system embedded in the device and may include an operating system or hardware (such as peripheral devices).
The embodiments of the present invention have been described in detail above with reference to the drawings. However, the specific structure is not limited to the above embodiments, and the present invention also includes any design modifications that do not depart from the gist of the present invention. In addition, the present invention can be modified in various ways within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, components described in the above embodiments that have the same effects may be substituted for one another.

Claims (22)

  1. A control device for controlling a tracking mode of a mobile device, the control device comprising:
    an image mode selection unit configured to receive a user selection of a visible-light image stream mode and/or an infrared image stream mode;
    a tracking mode selection unit configured to receive a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and
    a sending unit configured to send information related to the selected image mode and tracking mode to the mobile device.
  2. The control device according to claim 1, further comprising a display unit;
    wherein the tracking mode selection unit is further configured to: if the user selects the specified-target tracking mode, further receive a user selection of a specified target in a user interface of the display unit; and
    wherein the sending unit is further configured to: if the user selects the specified-target tracking mode, send tracking mode information indicating specified-target tracking, together with position information of the specified target in the user interface, to the mobile device.
  3. The control device according to claim 2, wherein
    the display unit is further configured to receive, according to the user's selection of the image mode, a visible-light image stream and/or an infrared image stream captured by the mobile device.
  4. A method for controlling a tracking mode of a mobile device, the method comprising:
    selecting a visible-light image stream mode and/or an infrared image stream mode;
    receiving a user selection of a tracking mode, the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and
    sending information related to the selected image mode and tracking mode to the mobile device.
  5. The method according to claim 4, wherein
    if the user selects the specified-target tracking mode, a user selection of a specified target in a user interface is further received; and
    if the user selects the specified-target tracking mode, tracking mode information indicating specified-target tracking is sent, together with position information of the specified target in the user interface, to the mobile device.
  6. The method according to claim 5, wherein a visible-light image stream and/or an infrared image stream captured by the mobile device is received according to the user's selection of the image mode.
  7. A mobile device, comprising:
    a body for carrying components;
    a camera carried on the body, the camera being configured to capture a visible-light image stream and/or an infrared image stream; and
    a controller mounted on the body, the controller being configured to receive information related to an image mode and a tracking mode and thereby control the shooting and the posture of the camera;
    wherein the image mode includes a visible-light image stream mode and/or an infrared image stream mode, and the tracking mode includes tracking disabled, specified-target tracking, and hottest-spot tracking.
  8. The mobile device according to claim 7, further comprising a gimbal, the camera being mounted on the body of the mobile device via the gimbal,
    wherein the controller is further configured to control the posture of the camera by controlling the posture of the body and/or the gimbal of the mobile device.
  9. The mobile device according to claim 7, wherein the controller is configured to:
    receive tracking mode information indicating specified-target tracking and position information of a specified target; and
    extract feature parameters of the specified target according to the position information, and control the posture of the camera based on the feature parameters so that the camera tracks and shoots the specified target.
  10. The mobile device according to claim 7, wherein the controller is configured to: if the current tracking mode is hottest-spot tracking, obtain the hottest spot collected by the camera by infrared imaging, together with its position information, and thereby control the posture of the camera so that the camera tracks and shoots the hottest spot.
  11. The mobile device according to claim 7, wherein the controller is configured to: if the current tracking mode is specified-target tracking, control the camera to capture a visible-light image stream, or a fused image stream of the visible-light image stream and the infrared image stream.
  12. The mobile device according to claim 7, wherein the controller is configured to: if the current tracking mode is hottest-spot tracking, control the camera to capture only an infrared image stream.
  13. The mobile device according to claim 7, wherein the camera comprises a dual-lens camera.
  14. The mobile device according to claim 7, wherein the mobile device comprises a drone.
  15. A method performed by a movable device, comprising:
    receiving information related to an image mode and a tracking mode, the image mode including a visible-light image stream mode and/or an infrared image stream mode, and the tracking mode including tracking disabled, specified-target tracking, and hottest-spot tracking; and
    according to the information related to the image mode and the tracking mode, tracking and capturing a visible-light image stream and/or an infrared image stream.
  16. The method according to claim 15, wherein:
    receiving tracking mode information indicating specified-target tracking and position information of a specified target; and
    extracting feature parameters of the specified target according to the position information, and controlling the tracking and capturing of the specified target based on the feature parameters.
  17. The method according to claim 15, wherein, if the current tracking mode is hottest-spot tracking, acquiring the hottest spot collected in infrared together with its position information, and thereby tracking and capturing the hottest spot.
  18. The method according to claim 15, wherein, if the current tracking mode is specified-target tracking, capturing a visible-light image stream, or a fused image stream of a visible-light image stream and an infrared image stream.
  19. The method according to claim 15, wherein, if the current tracking mode is hottest-spot tracking, capturing only an infrared image stream.
  20. The method according to claim 15, wherein the movable device comprises a dual-lens camera for capturing a visible-light image stream and/or an infrared image stream.
  21. The method according to claim 15, wherein the movable device comprises an unmanned aerial vehicle.
  22. A computer-readable storage medium storing a computer program which, when run by at least one processor, causes the at least one processor to perform the method according to any one of claims 4-6 or 15-21.
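Claims 1-6 describe a controller-side message that bundles the selected image mode, the selected tracking mode, and, for specified-target tracking, the target's position in the user interface (claims 2 and 5). As a rough illustration only, not part of the claimed subject matter, such a command could be modeled as below; the names (`ImageMode`, `TrackingMode`, `TrackingCommand`) and the normalised-coordinate convention are assumptions of this sketch:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class ImageMode(Enum):
    VISIBLE = "visible"                      # visible-light image stream mode
    INFRARED = "infrared"                    # infrared image stream mode
    VISIBLE_AND_INFRARED = "visible+infrared"

class TrackingMode(Enum):
    DISABLED = "disabled"            # tracking disabled
    SPECIFIED_TARGET = "target"      # specified-target tracking
    HOTTEST_SPOT = "hottest_spot"    # hottest-spot tracking

@dataclass
class TrackingCommand:
    """Information sent from the control device to the movable device."""
    image_mode: ImageMode
    tracking_mode: TrackingMode
    # Normalised (x, y) of the user-selected target in the UI;
    # only meaningful for specified-target tracking (cf. claims 2 and 5).
    target_position: Optional[Tuple[float, float]] = None

    def __post_init__(self) -> None:
        if (self.tracking_mode is TrackingMode.SPECIFIED_TARGET
                and self.target_position is None):
            raise ValueError("specified-target tracking needs a target position")

# Example: request hottest-spot tracking on the infrared stream
cmd = TrackingCommand(ImageMode.INFRARED, TrackingMode.HOTTEST_SPOT)
```

The validation in `__post_init__` mirrors the dependency in claims 2 and 5: the target position accompanies the tracking-mode information only when specified-target tracking is selected.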
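Claims 10 and 17 cover the hottest-spot mode: the device locates the hottest point collected in infrared and steers the camera toward it. A minimal per-frame sketch follows, assuming the infrared frame is a NumPy array of temperature readings and using a simple proportional controller; the `gain` value and the pitch/yaw sign convention are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def hottest_spot(ir_frame: np.ndarray) -> tuple:
    """Return the (row, col) position of the hottest pixel in an infrared frame."""
    flat_index = np.argmax(ir_frame)
    return np.unravel_index(flat_index, ir_frame.shape)

def gimbal_correction(spot: tuple, frame_shape: tuple, gain: float = 0.1) -> tuple:
    """Proportional (pitch, yaw) correction that steers the camera so the
    hottest spot moves toward the frame centre."""
    row, col = spot
    height, width = frame_shape
    err_y = (row - height / 2) / height  # vertical offset, normalised
    err_x = (col - width / 2) / width    # horizontal offset, normalised
    return -gain * err_y, gain * err_x   # (pitch, yaw) in arbitrary units

# Example: a 4x4 "thermal" frame with one hot reading
frame = np.zeros((4, 4))
frame[1, 3] = 350.0                      # hottest pixel at row 1, col 3
spot = hottest_spot(frame)
pitch, yaw = gimbal_correction(spot, frame.shape)
```

Per claim 8, the resulting correction could be applied to the gimbal, to the body of the movable device, or to both; this sketch leaves that actuation step out.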
PCT/CN2017/108574 2017-10-31 2017-10-31 Method and device for tracking and capturing WO2019084769A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/108574 WO2019084769A1 (zh) 2017-10-31 2017-10-31 Method and device for tracking and capturing
CN201780004508.0A CN108496363A (zh) 2017-10-31 2017-10-31 Method and device for tracking and capturing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/108574 WO2019084769A1 (zh) 2017-10-31 2017-10-31 Method and device for tracking and capturing

Publications (1)

Publication Number Publication Date
WO2019084769A1 true WO2019084769A1 (zh) 2019-05-09

Family

ID=63344772

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/108574 WO2019084769A1 (zh) 2017-10-31 2017-10-31 Method and device for tracking and capturing

Country Status (2)

Country Link
CN (1) CN108496363A (zh)
WO (1) WO2019084769A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111901563A (zh) * 2020-07-21 2020-11-06 河南中光学集团有限公司 Multifunctional intelligent monitoring system and monitoring method
CN116758117A (zh) * 2023-06-28 2023-09-15 云南大学 Target tracking method and system under visible-light and infrared images

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103402044A (zh) * 2013-08-07 2013-11-20 重庆大学 Target recognition and tracking system based on multi-source video fusion
CN104796611A (zh) * 2015-04-20 2015-07-22 零度智控(北京)智能科技有限公司 Method and system for intelligent flight photographing by a mobile terminal remotely controlling an unmanned aerial vehicle
US20160171330A1 (en) * 2014-12-15 2016-06-16 Reflex Robotics, Inc. Vision based real-time object tracking system for robotic gimbal control
CN106254836A (zh) * 2016-09-19 2016-12-21 南京航空航天大学 Unmanned aerial vehicle infrared image target tracking system and method
CN106970639A (zh) * 2017-03-15 2017-07-21 武汉理工大学 Port real-scene monitoring system and method based on an unmanned aerial vehicle platform

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104597912A (zh) * 2014-12-12 2015-05-06 南京航空航天大学 Tracking flight control system and method for a six-rotor unmanned helicopter
US9922659B2 (en) * 2015-05-11 2018-03-20 LR Acquisition LLC External microphone for an unmanned aerial vehicle
CN106274617B (zh) * 2016-08-26 2019-01-22 湖南航天远望科技有限公司 Ship, ground, and aircraft combined monitoring method and monitoring system
CN106986042A (zh) * 2017-04-27 2017-07-28 诸城宏远安防科技有限公司 Fire-prevention infrared thermal imaging unmanned aerial vehicle for open-air warehouses


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111901563A (zh) * 2020-07-21 2020-11-06 河南中光学集团有限公司 Multifunctional intelligent monitoring system and monitoring method
CN116758117A (zh) * 2023-06-28 2023-09-15 云南大学 Target tracking method and system under visible-light and infrared images
CN116758117B (zh) * 2023-06-28 2024-02-09 云南大学 Target tracking method and system under visible-light and infrared images

Also Published As

Publication number Publication date
CN108496363A (zh) 2018-09-04

Similar Documents

Publication Publication Date Title
JP6103948B2 (ja) Imaging apparatus, remote operation terminal, camera system, control method and program for imaging apparatus, and control method and program for remote operation terminal
US20150350523A1 (en) Image processing device, image processing method, and program
JP5430428B2 (ja) Imaging device
TW200930098A (en) Camera control system capable of positioning and tracking object in space and method thereof
WO2019240988A1 (en) Camera area locking
US11637968B2 (en) Image photographing method of electronic device and electronic device
WO2019104569A1 (zh) Focusing method, device, and readable storage medium
US20160084932A1 (en) Image processing apparatus, image processing method, image processing system, and storage medium
EP3381180B1 (en) Photographing device and method of controlling the same
JP6892524B2 (ja) Slow-motion video capture based on object tracking
WO2019084769A1 (zh) Method and device for tracking and capturing
US20160073020A1 (en) Image processing device, image processing method, and program
TW201741938A (zh) Motion sensing method and device
JP6326892B2 (ja) Imaging system, imaging method, image reproduction device, and program
KR102475994B1 (ko) Information processing apparatus, information processing method, and storage medium
WO2019056312A1 (zh) Method and device for tracking and capturing
JP2015029188A (ja) Electronic device and control method therefor
JP2015161748A5 (ja) Projection apparatus, image processing apparatus, control methods therefor, and program
KR102021363B1 (ko) Curved display device and operating method thereof
US11750912B2 (en) Main subject tracking and prioritization using depth and successive temporal location proximity
JP7342883B2 (ja) Imaging control device, imaging device, and imaging control method
TWI671684B (zh) Video surveillance method and system
TW201500967A (zh) Anti-theft method and computer system thereof
US20140226023A1 (en) Imaging apparatus, control method, and program
WO2019183746A1 (zh) Tracking processing method for unmanned aerial vehicle and control terminal

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17930499

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 17930499

Country of ref document: EP

Kind code of ref document: A1