

United States Patent No.

9922659

Inventor(s)

Bradlow et al.

Date of Patent

March 20, 2018


External microphone for an unmanned aerial vehicle



ABSTRACT

Several embodiments include a remote tracker for a videography drone. The remote tracker can include a spatial information sensor and a microphone configured to capture audio data surrounding the remote tracker. The remote tracker can also include a logic control component configured to decorate the audio data with location-based metadata or temporal metadata. A network interface of the remote tracker can communicate with the videography drone, including streaming the audio data captured by the microphone to the videography drone.


Inventors:

Henry W. Bradlow (Berkeley, CA), Antoine Balaresque (Berkeley, CA)

Assignee:

LR Acquisition, LLC (New York, NY, US)

Applicant:

LR Acquisition LLC (New York, NY)

Family ID:

57277607

Appl. No.:

14/875,268

Filed:

October 5, 2015

Prior Publication Data

Document Identifier     Publication Date
US 20160336020 A1       Nov 17, 2016

Related U.S. Patent Documents


Application Number     Filing Date      Patent Number     Issue Date
62159794               May 11, 2015

Current U.S. Class:

1/1

Current CPC Class:

G01S 19/49 (20130101); G10L 19/167 (20130101); B64C 39/024 (20130101); G05D 1/0038 (20130101); G05D 1/005 (20130101); G01S 19/14 (20130101); G05D 1/0094 (20130101); H04N 7/185 (20130101); H04R 1/086 (20130101); B64C 2201/127 (20130101); G10L 2015/223 (20130101); H04R 3/005 (20130101); H04R 2420/01 (20130101); H04R 2420/07 (20130101); H04R 2460/07 (20130101); H04R 2499/13 (20130101); B64C 2201/027 (20130101); B64C 2201/145 (20130101); G10L 25/57 (20130101); B64C 2201/12 (20130101)

Current International Class (IPC):

H04B 15/00 (20060101); G10L 19/16 (20130101); B64C 39/02 (20060101); G05D 1/00 (20060101); G01S 19/14 (20100101); G01S 19/49 (20100101); H04N 7/18 (20060101); H04R 1/08 (20060101); G10L 15/22 (20060101); H04R 3/00 (20060101)

Field of Search:

381/94.1

References Cited


U.S. Patent Documents

5479351         December 1995     Woo
7817905         October 2010      Shellshear
8606079         December 2013     Shibata et al.
8903568         December 2014     Wang et al.
8948575         February 2015     Webb et al.
9111580         August 2015       Kirsch
9332160         May 2016          Chenillo et al.
2006/0104616    May 2006          Shellshear
2009/0171902    July 2009         MacLaurin et al.
2009/0207277    August 2009       Kurihara et al.
2009/0226149    September 2009    Shibata et al.
2010/0224732    September 2010    Olson et al.
2013/0077805    March 2013        Kirsch et al.
2014/0169768    June 2014         Webb et al.
2015/0104151    April 2015        Webb et al.
2015/0370250    December 2015     Bachrach
2016/0054737    February 2016     Soll et al.
2016/0055883    February 2016     Soll
2016/0063987    March 2016        Xu
2016/0161946    June 2016         Wuth Sepulveda et al.

Foreign Patent Documents

2370829    Oct 2009    RU
116628     May 2012    RU

Other References


PCT Search Report and Written Opinion dated Sep. 15, 2016, PCT Application No. PCT/US2016/031482. Cited by applicant.
Non-Final Office Action dated Jun. 21, 2016 of U.S. Appl. No. 15/094,796 by Bradlow, H.W., et al., filed Apr. 8, 2016. Cited by applicant.

Primary Examiner: Chin; Vivian
Assistant Examiner: Hamid; Ammar
Attorney, Agent or Firm: Hard IP LLC

Parent Case Text




CROSS-REFERENCE TO RELATED APPLICATION(S)



This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/159,794, entitled "EXTERNAL MICROPHONE FOR AN UNMANNED AERIAL VEHICLE," filed May 11, 2015, which is incorporated by reference herein in its entirety.

CLAIMS



What is claimed is:

1. A video drone control system, comprising: an unmanned aerial vehicle (UAV) that comprises: a camera configured to capture video data, a processor configured to process the video data, a drone-side microphone configured to capture background audio data surrounding the UAV, and a drone-side network interface configured to communicate with a remote tracker; and the remote tracker that comprises: a spatial information sensor configured to generate location-based metadata by continuously determining spatial location of the remote tracker, a tracker-side microphone configured to capture audio data surrounding the remote tracker, a logic control component configured to decorate the audio data with the location-based metadata and temporal metadata, and a tracker-side network interface configured to communicate with the drone-side network interface, wherein the tracker-side network interface is configured to stream the audio data decorated by the logic control component to the drone-side network interface, and wherein, upon receiving the audio data from the remote tracker, the processor of the UAV is configured to: produce filtered audio data by filtering the background audio data from the audio data, and generate audio/video (A/V) data by synchronizing the video data and the filtered audio data based on the temporal metadata or the location-based metadata.

2. The video drone control system of claim 1, wherein the UAV further comprises a first global positioning system (GPS) module, wherein the remote tracker further comprises a second GPS module, and wherein the first GPS module and the second GPS module are of a same type and model.

3. The video drone control system of claim 1, wherein the processor is configured to synchronize, in real-time, a video stream captured from the camera and an audio stream captured from the tracker-side microphone and streamed from the remote tracker to the UAV via the tracker-side network interface.

4. The video drone control system of claim 3, wherein the processor is configured to synchronize the video stream and the audio stream based on at least a timestamp entry associated with the video stream and a timestamp entry associated with the audio stream.

5. The video drone control system of claim 4, wherein the timestamp entry associated with the video stream and the timestamp entry associated with the audio stream are global positioning system (GPS) timestamps from a single GPS module or GPS modules of a same type and model.

6. The video drone control system of claim 1, wherein the logic control component is configured to analyze the audio data from the tracker-side microphone to select a voice command by matching against one or more voice patterns associated with one or more voice commands.

7. The video drone control system of claim 6, wherein the tracker-side network interface is configured to send the selected voice command to the drone-side network interface.

8. The video drone control system of claim 1, wherein the logic control component is configured to analyze the audio data to identify a high noise event, and wherein the tracker-side network interface is configured to notify the drone-side network interface regarding the high noise event.

9. The video drone control system of claim 8, wherein the processor is configured to process the video data differently in response to the drone-side network interface receiving a message indicative of the high noise event.

10. The video drone control system of claim 8, wherein the processor is configured to process the video data differently by processing the video data in slow motion.

11. The video drone control system of claim 1, wherein the processor is configured to filter propeller noise from the streamed audio data received from the remote tracker.

12. The video drone control system of claim 1, wherein the tracker-side microphone is configured to start recording the audio data when the drone-side network interface notifies the tracker-side network interface that the UAV is in flight or the UAV is on.

13. The video drone control system of claim 1, wherein the tracker-side microphone is configured to start recording when the tracker-side network interface receives a command from a mobile device separate from the UAV.

14. A portable device configured to communicate with a videography drone, the portable device comprising: a spatial information sensor configured to generate location-based metadata by continuously determining spatial location of the portable device; a microphone configured to capture audio data surrounding the portable device; a logic control component configured to: generate temporal metadata using a digital clock, and decorate the audio data with the location-based metadata and the temporal metadata; and a tracker-side network interface configured to communicate with the videography drone, wherein the tracker-side network interface is configured to stream the audio data decorated by the logic control component to the videography drone, wherein the videography drone is configured to capture background audio data, and wherein the videography drone is further configured to, upon receiving the stream of the audio data, produce filtered audio data by filtering the background audio data from the audio data, and generate audio/video (A/V) data by synchronizing video data captured by the videography drone and the filtered audio data based on the temporal metadata or the location-based metadata.

15. The portable device of claim 14, further comprising a shockproof casing enclosing the spatial information sensor or the microphone.

16. The portable device of claim 14, further comprising: a protective shell partially enclosing the microphone and exposing a portion of the microphone.

17. The portable device of claim 16, further comprising an impact dampener between the protective shell and the microphone.

18. The portable device of claim 14, wherein the tracker-side network interface is configured to automatically discover a network interface of the videography drone when the videography drone is within wireless communication radius from the portable device.

19. The portable device of claim 14, wherein the logic control component is configured to convert the audio data to text via a voice recognition process and annotate the audio data with a caption based on the text.

20. The portable device of claim 14, wherein the logic control component is configured to remove propeller noise from the audio data prior to streaming the audio data to the videography drone.

21. The portable device of claim 14, further comprising a speaker configured to play a sound in response to a command or an alert received from the videography drone via the tracker-side network interface.

22. The portable device of claim 14, further comprising a camera flash or a light source to facilitate the videography drone to photograph a user carrying the portable device.

23. A method comprising: recording background noise data with a microphone of a videography drone; receiving a stream of location data and audio data from a user tracker device, the user tracker device separate from the videography drone, wherein the audio data is decorated with location-based metadata or temporal metadata; navigating the videography drone based on the location data; capturing video data with a camera in the videography drone; producing filtered audio data by filtering the background noise data from the audio data; and generating audio/video (A/V) data by synchronizing the filtered audio data and the video data based on the temporal metadata or the location-based metadata.


DESCRIPTION




TECHNICAL FIELD



At least one embodiment of this disclosure relates generally to unmanned aerial vehicles (UAVs).


BACKGROUND



UAVs for consumers have traditionally been limited to entertainment as toys or as hobbyist collector items. Recently, however, UAVs have been used for personal photography and videography. A UAV can be equipped with a portable power source and an image capturing apparatus, such as a camera or other types of sensors. For example, a photographer or a videographer can use a UAV to photograph or film athletes participating in outdoor activities when there are no overhead obstructions. The UAV can also be used to document special occasions, such as weddings, engagement proposals, and other activities that may occur in an open field. These applications require video recording along with audio to fully capture the moment. Conventional UAVs carry a camera and capture audio from the air, which is very low quality because of noise from the propellers and distance from the user.

DISCLOSURE OVERVIEW

Disclosed is a design of a UAV with a camera and an external microphone that records audio directly from the user. The noise created by propellers on a UAV, as well as the typical distance a UAV flies from its subject, makes audio collected by the UAV useless. Adding an external microphone in a remote tracker carried by the subject enables a UAV to combine and synchronize audio from the remote tracker with the video captured by the UAV.

In some embodiments, the audio is streamed via electromagnetic signals (e.g., WiFi, Bluetooth, Bluetooth low energy, infrared, laser, other radiofrequency, etc.) to the main camera system in the UAV. In real time, the audio is streamed to the main system to ensure that audio is recorded in the event that the microphone is lost or damaged. This also reduces the need for a large memory storage solution on the microphone device.
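As a rough illustration of such streaming, the sketch below pushes fixed-size PCM chunks over UDP as they are captured, prefixing each packet with a sequence number and capture time. The chunk size, address, and `capture_chunk` helper are hypothetical; the patent does not specify a transport or packet format.

```python
import socket
import struct
import time

CHUNK_SAMPLES = 1024                 # samples per packet (assumed)
DRONE_ADDR = ("192.168.4.1", 5005)   # hypothetical UAV address and port

def stream_audio(capture_chunk):
    """Continuously send captured PCM chunks to the UAV over UDP.

    `capture_chunk` is a stand-in for any microphone driver; it is
    assumed to return CHUNK_SAMPLES of 16-bit PCM as raw bytes.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seq = 0
    while True:
        pcm = capture_chunk(CHUNK_SAMPLES)
        # Prefix each packet with a sequence number and capture time so
        # the drone can reorder packets and detect gaps.
        header = struct.pack("!Id", seq, time.time())
        sock.sendto(header + pcm, DRONE_ADDR)
        seq += 1
```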

In some embodiments, audio is saved on the microphone device. Audio can be saved in raw or encoded format (e.g., MP3) on the microphone device and can be later synchronized with the video. This can be used if a wireless connection with the main video system is not possible, due to interference or unreliability. This also reduces the need for an RF connection between the two devices.

In some embodiments, the microphone device can be clipped onto clothing to better capture user speech. The microphone device can be part of various kinds of accessories (e.g., clips, plastic cases, remote trackers, etc.) and various kinds of form factors.

For applications that require user speech to be recorded, proper placement of a microphone is important to the quality of the audio. A special clip can be used to ensure that the device is mounted near the subject's mouth. The attachment mechanism can be a necklace, a clip to the shirt, a headband, an armband, or any combination thereof. For example, the attachment mechanism can be modularly detachable to facilitate convenient switching of attachment mechanism types. Similar mechanical mounts can be used on machines or other parts of a subject to capture specific types of sounds: for example, hard mounting to a skateboard to capture the sound of the wheels rolling.

In some embodiments, the microphone device is waterproof and can capture underwater audio. Ruggedizing the microphone device can enable the user to be recorded in more extreme environments, which can yield more interesting content. In some embodiments, a plastic case is provided for the microphone that protects the device from dust and water. This reduces the cost and complexity of the device, and allows for a smaller device that can be used when waterproofing and dustproofing are not required.

In some embodiments, a Global Positioning System (GPS) timestamp is used to synchronize the audio with the video. Both the UAV and the microphone device have internal GPS modules that periodically record the GPS timestamp. The audio and video are later integrated by aligning these timestamps. In some embodiments, a system can be used to synchronize the audio and video by sharing a unique event or time based data between the two devices.
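In code, aligning the two recordings reduces to computing the offset between their GPS start timestamps. A minimal sketch, assuming each device logs a GPS timestamp at the start of its recording and the audio sample rate is known (names and the 48 kHz rate are illustrative):

```python
def align_audio_to_video(video_start_ts, audio_start_ts, audio_rate=48000):
    """Return the audio sample index that coincides with the first video frame.

    Both timestamps come from GPS modules, so they can be treated as
    sharing a common time base.
    """
    offset_seconds = video_start_ts - audio_start_ts
    if offset_seconds < 0:
        raise ValueError("audio recording started after the video")
    return int(offset_seconds * audio_rate)

# Example: the video began 2.5 s after the audio recording started,
# so 120,000 samples of audio are trimmed before muxing at 48 kHz.
start_sample = align_audio_to_video(video_start_ts=1000.0, audio_start_ts=997.5)
```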

In some embodiments, the camera on the UAV is mounted on a vibration isolation system. The vibration isolation system can reduce vibration from the propellers to ensure sharper video. The vibration isolation system can protect the glass lens from impacts. The vibration isolation system can enable the UAV to be more rugged than conventional drone-camera systems. The camera lens may be one of the most fragile parts. In some embodiments, the vibration isolation system involves a hard shell that surrounds the camera. For example, the shell can be made of rubber so that the dampening is softer, which leaves more room to absorb an impact.

Some embodiments of this disclosure have other aspects, elements, features, and steps in addition to or in place of what is described above. These potential additions and replacements are described throughout the rest of the specification.


BRIEF DESCRIPTION OF THE DRAWINGS



FIG. 1 is a perspective view of an unmanned aerial vehicle (UAV), in accordance with various embodiments.

FIG. 2A is a top view of a remote tracker of a UAV, in accordance with various embodiments.

FIG. 2B is a side view of the remote tracker of FIG. 2A.

FIG. 3 is a block diagram illustrating components of a UAV, in accordance with various embodiments.

FIG. 4 is a block diagram illustrating components of a remote tracker of a UAV, in accordance with various embodiments.

FIG. 5 is a flowchart illustrating a method of recording a video utilizing a UAV and a remote tracker, in accordance with various embodiments.

The figures depict various embodiments of this disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of embodiments described herein.


DETAILED DESCRIPTION



FIG. 1 is a perspective view of an unmanned aerial vehicle (UAV) 100, in accordance with various embodiments. In several embodiments, the UAV 100 is a videography drone that includes a camera 104. The camera 104 can be for filming and/or for photographing. The UAV 100 can be a copter. For example, the UAV 100 includes one or more propellers 108. In various embodiments, the UAV 100 is controlled by one or more operator devices, such as a remote tracker (see FIG. 2) and/or a drone control application running on a general-purpose device (e.g., a mobile device, such as a smart phone, a laptop, or a wearable device).

FIG. 2A is a top view of a remote tracker 200 of a UAV (e.g., the UAV 100), in accordance with various embodiments. FIG. 2B is a side view of the remote tracker 200 of FIG. 2A. The remote tracker 200 can be coupled wirelessly to the UAV. The remote tracker 200 can be a portable device separate from the UAV. For example, the remote tracker 200 can be shaped as a puck or a disk. In the illustrated top view, the remote tracker 200 is circular. In other embodiments, the remote tracker 200 can have a rectangular or oval top view. In the illustrated side view, the remote tracker 200 can have a rounded side profile.

The remote tracker 200 can include a microphone 202, a first input button 206, a second input button 210, a power port 214, or any combination thereof. The remote tracker 200 can include a protective case 218 that encloses various components (e.g., as described in FIG. 4) and exposes the first input button 206, the second input button 210, and the power port 214. The protective case 218 can at least partially enclose the microphone 202. For example, the protective case 218 can expose at least a portion of the microphone 202 to record external sound. In some embodiments, the remote tracker 200 can include multiple microphones. For example, the remote tracker 200 can include four microphones spaced equally apart (e.g., 90° apart and along the same radius from the center).

The first input button 206 can be a round shaped button in the center of the remote tracker 200. The second input button 210 can be a ring-shaped button (e.g., a complete ring or a segment of a ring) surrounding the center of the remote tracker 200. The input buttons enable a user carrying the remote tracker 200 to interact with a logic component therein. For example, clicking on or holding down one of the input buttons can turn the remote tracker 200 on or turn the UAV on. In another example, clicking on or holding down one of the input buttons can mute, start, pause, or stop an audio recording of the microphone 202 or start, pause, stop, or censor a video recording of a camera (e.g., the camera 104) of the UAV.

The power port 214 can be a universal serial bus (USB) port. The power port 214 can accept a cable with an adapter head that plugs into the power port 214. The cable can deliver electrical power (e.g., direct current (DC) power) to charge the remote tracker 200. In some embodiments, the power port 214 can also be a communication port that enables a wired interconnection with an external computing device. For example, the wired interconnection can be used to download data stored in a memory of the remote tracker 200 and/or to update or debug logical/functional components within the remote tracker 200.

FIG. 3 is a block diagram illustrating components of a UAV 300 (e.g., the UAV 100), in accordance with various embodiments. The UAV 300 can include a camera 302, a vibration isolation system 304 for the camera 302, a processor 306, a memory 308, a network interface 310, or any combination thereof. Optionally, the UAV 300 can include a light source 314 (e.g., camera flash or a flashlight). The light source 314 can provide illumination to the subject of the camera 302. The camera 302 can be the camera 104 of FIG. 1. In some embodiments, the UAV 300 can include a spatial information sensor 318 (e.g., an accelerometer, a GPS module, a motion detector, a gyroscope, a cellular triangulation module, other inertial sensors, etc.). The processor 306 can implement various logical and functional components (e.g., stored as processor-executable instructions in the memory 308) to control the UAV 300 in real-time in absence of explicit real-time commands from an authorized user. However, in several embodiments, the authorized user can configure (e.g., via a drone control application) the operating modes of the UAV 300 prior to or during its flight. The drone control application can implement an interactive user interface to configure the UAV 300 and/or a remote tracker of the UAV 300. The drone control application can be a mobile application.

The network interface 310 can enable wireless communication of the UAV 300 with other devices. For example, the network interface 310 enables the UAV 300 to communicate wirelessly with a computing device (e.g., a mobile device) running the drone control application (e.g., a mobile application). In several embodiments, the network interface 310 can also enable the UAV 300 to communicate with a remote tracker (e.g., the remote tracker 200 of FIG. 2 and/or the remote tracker 400 of FIG. 4). In some embodiments, the network interface 310 enables a computing device to update firmware or software of the UAV 300 (e.g., stored in the memory 308).

In several embodiments, the UAV 300 can also include an energy storage 324 and a driver circuit 326. The energy storage 324, for example, can be a battery, a fuel cell, a fuel tank, or any combination thereof. The driver circuit 326 can be configured to drive propellers (e.g., the propellers 108 of FIG. 1) of the UAV 300. The processor 306 can control the driver circuit 326. The driver circuit 326, in turn, can individually control the driving power and speed of each propeller.

FIG. 4 is a block diagram illustrating components of a remote tracker 400 (e.g., the remote tracker 200) of a UAV (e.g., the UAV 100 and/or the UAV 300), in accordance with various embodiments. The components of the remote tracker 400 can be enclosed by a protective shell (e.g., the protective case 218 of FIG. 2). In some embodiments, the remote tracker 400 includes an impact dampener 404 between the protective shell (e.g., the protective case 218) and the components (e.g., a spatial information sensor 402, a logic control component 406, a memory 408, and a microphone 410) of the remote tracker 400.

The remote tracker 400 can include the spatial information sensor 402. For example, the spatial information sensor 402 can be a global positioning system (GPS) module, an accelerometer, a gyroscope, a cellular triangulation module, other inertial motion sensors, or any combination thereof. In some embodiments, the spatial information sensor 402 is a GPS module. The spatial information sensor 402 can be a GPS module of the same model and type as the spatial information sensor 318 of the UAV 300.

The remote tracker 400 can be a portable device to be carried by a user of the UAV. The remote tracker 400 further includes the logic control component 406, the memory 408, the microphone 410, a network interface 414, a light source 418, or any combination thereof. In some embodiments, the remote tracker 400 includes a wearable attachment mechanism 420 (e.g., a belt, a strap, a fastener, a clip, a hook, a headband, an armband, or any combination thereof). The logic control component 406 can implement various logical and functional components (e.g., stored as machine executable instructions in the memory 408) of the remote tracker 400.

In several embodiments, the remote tracker 400 can passively control the UAV 300 in real-time without the user's direct involvement or input in real-time. For example, the user can configure the UAV 300 to follow the remote tracker 400. That is, the user does not control the movement of the UAV 300, but the UAV 300 tracks the user movement via the spatial information sensor 402 of the remote tracker 400. The network interface 414 can send the spatial information captured by the spatial information sensor 402 to the UAV 300 such that the UAV 300 navigates within a constant distance (and/or constant direction/angle) from the remote tracker 400 and points the camera 302 toward the remote tracker 400. In some embodiments, the remote tracker 400 includes an input component 422 (e.g., the first input button 206 and/or the second input button 210) such that the user can actively interact with the remote tracker 400.
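As a sketch of how such passive following could be computed, the function below derives a UAV position setpoint at a constant distance and bearing from the tracker's GPS fix, using a small-offset flat-earth approximation. The names and the approximation itself are illustrative; the patent does not prescribe a navigation algorithm.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def follow_setpoint(tracker_lat, tracker_lon, distance_m, bearing_deg):
    """Compute a UAV position at a constant distance/bearing from the tracker.

    Uses a flat-earth approximation, which is adequate for the tens of
    meters a videography drone typically keeps from its subject.
    """
    bearing = math.radians(bearing_deg)
    d_north = distance_m * math.cos(bearing)
    d_east = distance_m * math.sin(bearing)
    lat = tracker_lat + math.degrees(d_north / EARTH_RADIUS_M)
    lon = tracker_lon + math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(tracker_lat))))
    return lat, lon

# Hold station 15 m due south of the user so the camera faces north.
setpoint = follow_setpoint(37.8716, -122.2727, distance_m=15.0, bearing_deg=180.0)
```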

The microphone 410 can be configured to capture audio data surrounding the remote tracker 400. The logic control component 406 can be configured to decorate the audio data with location-based metadata (e.g., derived from the spatial information sensor 402) and temporal metadata (e.g., from a digital clock implemented by the logic control component 406 or from the spatial information sensor 402). For example, the temporal metadata can be a GPS timestamp from a GPS module. In some embodiments, the logic control component 406 is configured to convert the audio data to text via a voice recognition process and annotate the audio data with a caption based on the text.
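A decorated audio record might carry fields like the ones in this sketch; the structure and field names are illustrative assumptions, since the patent does not define a storage format:

```python
from dataclasses import dataclass

@dataclass
class DecoratedAudioChunk:
    """One audio chunk decorated with spatial and temporal metadata."""
    pcm: bytes            # raw samples from the tracker-side microphone
    gps_timestamp: float  # seconds, from the GPS module
    latitude: float
    longitude: float
    caption: str = ""     # optional text from a voice recognition pass

def decorate(pcm, gps_fix):
    """Attach the latest GPS fix to a freshly captured chunk.

    `gps_fix` is assumed to expose timestamp/lat/lon attributes.
    """
    return DecoratedAudioChunk(
        pcm=pcm,
        gps_timestamp=gps_fix.timestamp,
        latitude=gps_fix.lat,
        longitude=gps_fix.lon,
    )
```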

The network interface 414 can be configured to communicate with the network interface 310. In some embodiments, the network interface 414 is configured to automatically discover a network interface (e.g., the network interface 310) of a videography drone when the videography drone is within wireless communication radius from the remote tracker 400.
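One common way to implement such discovery is a UDP broadcast handshake, sketched below. The port and probe/acknowledgment payloads are hypothetical; the patent does not describe the pairing protocol.

```python
import socket

DISCOVERY_PORT = 5006          # hypothetical discovery port
PROBE = b"TRACKER_DISCOVERY"   # hypothetical probe payload

def discover_drone(timeout_s=2.0):
    """Broadcast a probe and wait for the drone to answer with its address."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout_s)
    sock.sendto(PROBE, ("255.255.255.255", DISCOVERY_PORT))
    try:
        reply, addr = sock.recvfrom(1024)
        return addr if reply == b"DRONE_ACK" else None
    except socket.timeout:
        return None  # no drone within wireless range
```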

The network interface 414 can be configured to stream the audio data captured by the microphone 410 to the network interface 310. In various embodiments, when the network interface 310 receives the streamed audio data, the processor 306 stores the streamed audio data in the memory 308, or other buffer, cache, and/or data storage space. In some embodiments, the processor 306 synchronizes a video file captured from the camera 302 with an audio file from the microphone 410 (e.g., in the memory 308). In these embodiments, the processor 306 stitches the video file together with the audio file. The stitching can occur after the streamed audio data is saved as the audio file. In some embodiments, the processor 306 is configured to synchronize, in real-time, a video stream captured from the camera 302 and the stream of audio data. That is, the processor 306 can generate and append to a video file, with the streamed audio data integrated therein, in real-time. The processor 306 can save the generated video file into the memory 308. For example, synchronization of the video stream and the audio stream can be based on at least a timestamp entry associated with the video stream and a timestamp entry associated with the audio stream. These timestamps can be GPS timestamps from the same GPS module or from GPS modules of the same type and model.

In some embodiments, the logic control component 406 is configured to analyze the audio data from the microphone 410 to select a voice command by matching against one or more voice patterns associated with one or more voice commands. The memory 408 can store the voice patterns and associations between the voice patterns and the voice commands. The network interface 414 can be configured to send the selected voice command (e.g., a command to start/stop/pause/censor the video recording by the camera 302 or to switch between operating modes of the UAV 300) to the network interface 310, in response to selecting the voice command based on the audio data analysis. The logic control component 406 can be configured to execute the selected command (e.g., a command to start/stop/pause/mute the audio recording by the microphone 410).
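The matching step could be as simple as scoring the utterance against stored reference waveforms, as in the sketch below. This naive normalized-correlation template matcher is only a stand-in, and the stored patterns are hypothetical; a real implementation would use a trained speech recognizer.

```python
import numpy as np

# Hypothetical stored voice patterns: command name -> reference waveform.
VOICE_PATTERNS = {
    "start_recording": np.load("start.npy"),
    "stop_recording": np.load("stop.npy"),
}

def select_voice_command(utterance, threshold=0.7):
    """Return the command whose stored pattern best matches the utterance."""
    best_cmd, best_score = None, threshold
    u = (utterance - utterance.mean()) / (utterance.std() + 1e-9)
    for cmd, pattern in VOICE_PATTERNS.items():
        p = (pattern - pattern.mean()) / (pattern.std() + 1e-9)
        n = min(len(u), len(p))
        score = float(np.dot(u[:n], p[:n]) / n)  # normalized correlation
        if score > best_score:
            best_cmd, best_score = cmd, score
    return best_cmd  # None if nothing matches well enough
```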

In some embodiments, the logic control component 406 is configured to analyze the audio data to identify a high noise event. The network interface 414 can be configured to notify the network interface 310 regarding the high noise event. The processor 306 can be configured to process the video data differently in response to the network interface 310 receiving a message indicating the high noise event. For example, processing the video data differently can include processing the video data in slow motion.

In some embodiments, the processor 306 is configured to filter propeller noise from the streamed audio data received from the remote tracker 400. In one example, the UAV 300 includes a microphone 322. The processor 306 can subtract the propeller noise recorded by the microphone 322 from the streamed audio data from the remote tracker 400. In some embodiments, the logic control component 406 is configured to remove propeller noise from the audio data prior to streaming the audio data to the videography drone.
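A conventional way to realize this subtraction is spectral subtraction: estimate the average propeller-noise spectrum from the drone-side recording and subtract it, frame by frame, from the tracker audio. The numpy sketch below assumes both signals share a sample rate; it is a classic textbook method, not the patent's (unspecified) algorithm.

```python
import numpy as np

def spectral_subtract(audio, noise, frame=1024):
    """Remove stationary propeller noise from tracker audio.

    `noise` is a recording from the drone-side microphone (e.g., the
    microphone 322) used to estimate the noise magnitude spectrum.
    """
    audio = np.asarray(audio, dtype=float)
    noise = np.asarray(noise, dtype=float)

    # Average noise magnitude spectrum over whole frames.
    noise_frames = noise[: len(noise) // frame * frame].reshape(-1, frame)
    noise_mag = np.abs(np.fft.rfft(noise_frames, axis=1)).mean(axis=0)

    out = np.zeros_like(audio)
    for start in range(0, len(audio) - frame + 1, frame):
        spec = np.fft.rfft(audio[start:start + frame])
        mag = np.maximum(np.abs(spec) - noise_mag, 0.0)  # floor at zero
        out[start:start + frame] = np.fft.irfft(mag * np.exp(1j * np.angle(spec)))
    return out
```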

In some embodiments, the microphone 410 is configured to start recording the audio data when the network interface 310 notifies the network interface 414 that the UAV 300 is in flight or the UAV 300 is on. In some embodiments, the microphone 410 is configured to start recording when the network interface 414 receives a command from the computing device implementing the drone control application. The drone control application, in response to a user interaction with the computing device, can send a command to stop or pause the recording. In some embodiments, the drone control application, in response to a user interaction with the computing device, can add an audio filter, audio transformer, and/or data compressor to process the audio data captured by the microphone 410.

In some embodiments, the remote tracker 400 includes a speaker 428. The speaker 428 can be configured to play a sound in response to a command or an alert received via the network interface 414 from the videography drone (e.g., the UAV 300). For example, the received alert can be an indication that an energy storage (e.g., the energy storage 324) of the UAV 300 is running low.

In some embodiments, the remote tracker 400 includes the light source 418 to illuminate an area surrounding the remote tracker 400. Because the remote tracker 400 is designed to track the movement of a target subject of the camera 302, the light source 418 can help the UAV 300 photograph or film the target subject.

Components (e.g., physical or functional) associated with the UAV 300 and/or the remote tracker 400 can be implemented as devices, modules, circuitry, firmware, software, or other functional instructions. For example, the functional components can be implemented in the form of special-purpose circuitry, in the form of one or more appropriately programmed processors, a single board chip, a field programmable gate array, a network-capable computing device, a virtual machine, a cloud computing environment, or any combination thereof. For example, the functional components described can be implemented as instructions on a tangible storage memory capable of being executed by a processor or other integrated circuit chip. The tangible storage memory may be volatile or non-volatile memory. In some embodiments, the volatile memory may be considered "non-transitory" in the sense that it is not a transitory signal. Memory space and storages described in the figures can be implemented with the tangible storage memory as well, including volatile or non-volatile memory.

Each of the components may operate individually and independently of other components. Some or all of the components may be executed on the same host device or on separate devices. The separate devices can be coupled through one or more communication channels (e.g., wireless or wired channels) to coordinate their operations. Some or all of the components may be combined as one component. A single component may be divided into sub-components, each sub-component performing a separate method step or steps of the single component.

In some embodiments, at least some of the components share access to a memory space. For example, one component may access data accessed by or transformed by another component. The components may be considered "coupled" to one another if they share a physical connection or a virtual connection, directly or indirectly, allowing data accessed or modified by one component to be accessed in another component. In some embodiments, at least some of the components can be upgraded or modified remotely (e.g., by reconfiguring executable instructions that implement a portion of the functional components). The systems, engines, or devices described herein may include additional, fewer, or different components for various applications.

FIG. 5 is a flowchart illustrating a method 500 of recording a video utilizing a UAV (e.g., the UAV 100 and/or the UAV 300) and a remote tracker (e.g., the remote tracker 200 and/or the remote tracker 400), in accordance with various embodiments. The UAV can be a videography drone. At step 502, the remote tracker can record its location data (e.g., via the spatial information sensor 402) and audio data (e.g., via the microphone 410) of its environment. At step 504, the remote tracker can decorate the audio data with location-based metadata and/or temporal metadata. At step 506, the remote tracker can process the audio data according to one or more gesture-triggered or voice-triggered commands.

For example, the spatial information sensor 402 can provide motion vector information that tracks the movement of the remote tracker. The remote tracker can then match the motion vector information against movement patterns associated with gesture-triggered commands. When there is a match, the matching gesture-triggered command is executed by the remote tracker and/or delivered to the UAV for execution. In one example, the spatial information sensor (e.g., an accelerometer) can detect a jumping motion to trigger a slow-motion mode for the video capture at the UAV. In another example, a logic control component in the remote tracker can process the audio data to recognize audio patterns associated with voice-triggered commands. When there is a match, the matching voice-triggered command is executed by the remote tracker and/or delivered to the UAV for execution. The gesture-triggered command or the voice-triggered command can include turning on/off the UAV, starting/stopping/pausing/muting an audio recording by the microphone of the remote tracker, starting/stopping/pausing/censoring a video recording by the camera of the UAV, initiating a slow-motion video capture at the UAV and a corresponding slow audio recording at the remote tracker, a preset data transformation of the audio data or the video data, or any combination thereof.
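As an illustration of the gesture side, the sketch below detects the jump example from vertical accelerometer samples: a push-off spike followed by near free fall. The thresholds and sampling rate are illustrative assumptions, not values from the patent.

```python
import numpy as np

def detect_jump(accel_z, sample_rate=100, threshold_g=1.8):
    """Detect a jump from vertical acceleration samples (in g)."""
    accel_z = np.asarray(accel_z, dtype=float)
    spikes = np.where(accel_z > threshold_g)[0]  # push-off candidates
    for i in spikes:
        window = accel_z[i : i + sample_rate // 2]  # next 0.5 s
        if len(window) and window.min() < 0.3:      # near free fall while airborne
            return True  # e.g., trigger the slow-motion capture mode
    return False
```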

At step 508, the UAV can receive, wirelessly and continuously, a stream of the location data and the audio data from the remote tracker. At step 510, the UAV can navigate to a position based on the received location data (e.g., at a preset distance and/or angle/direction from the remote tracker). At step 512, the UAV can capture video data with a camera pointing toward the remote tracker based on the location data of the remote tracker. At step 514, a processor of the UAV can stitch the audio data with the video data based on the temporal metadata of the audio data and/or the location-based metadata of the audio data. For example, the stitching can include matching a segment of the audio data and a segment of the video data when both segments share the same timestamp and/or the same location tag (e.g., after shifting at least one of the location tags by the constant distance and/or constant direction designated as the preset positioning of the UAV relative to the remote tracker).
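The segment-matching step might look like the sketch below, which pairs video and audio segments whose timestamps agree within a tolerance. Segments are assumed to be time-sorted (gps_timestamp, payload) tuples; the pairing strategy is illustrative.

```python
def stitch(video_segments, audio_segments, tolerance_s=0.02):
    """Pair video and audio segments whose timestamps agree within tolerance."""
    paired, j = [], 0
    for v_ts, v_frames in video_segments:
        # Skip audio that ended before this video segment began.
        while j < len(audio_segments) and audio_segments[j][0] < v_ts - tolerance_s:
            j += 1
        if j < len(audio_segments) and abs(audio_segments[j][0] - v_ts) <= tolerance_s:
            paired.append((v_frames, audio_segments[j][1]))
        else:
            paired.append((v_frames, None))  # no matching audio; keep video only
    return paired
```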

While processes or methods are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. In addition, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times.

Some embodiments of the disclosure have other aspects, elements, features, and steps in addition to or in place of what is described above. These potential additions and replacements are described throughout the rest of the specification. Reference in this specification to "various embodiments," "several embodiments," or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.


* * * * *