Embodiments relate to tracking one or more objects using a sensor (for example, a camera).
One embodiment discloses an object tracking system including a base, a platform rotatably coupled to the base via an actuator, a sensor, and a controller having a memory and an electronic processor. The platform is configured to receive an accessory device. The sensor is configured to sense movement of an object. The controller is configured to receive, via the sensor, data indicative of the movement of the object, and to control the actuator based on the data indicative of the movement of the object.
Another embodiment discloses a method of operating an object tracking apparatus. The object tracking apparatus includes a base and a platform rotatably coupled to the base via an actuator. The method includes sensing, via a sensor, a first image of an object at a first time, and sensing, via the sensor, a second image of the object at a second time. The method further includes receiving, via a controller, the first image and the second image, determining, via the controller, a delta between the first image and the second image, and determining, via the controller, motion of the object based on the delta. The method further includes determining, via the controller, if the motion of the object is valid, determining, via the controller, a position of the object based on the motion when the motion is valid, and controlling, via the controller, the actuator to move the platform in a direction of the position.
Other aspects of the application will become apparent by consideration of the detailed description and accompanying drawings.
Before any embodiments of the application are explained in detail, it is to be understood that the application is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The application is capable of other embodiments and of being practiced or of being carried out in various ways.
As discussed in more detail below, the sensor 115 is configured to sense one or more objects (for example, a human). In some embodiments, the sensor 115 includes one or more cameras or other sensors. In the illustrated embodiment, the sensor 115 is located within a support 120 of the base 105. However, in other embodiments, the sensor 115 may be located in other areas of the base 105 or other areas of the system 100. Additionally, in some embodiments, one or more sensors may be located at multiple points of the system 100 (for example, a first sensor located within the base 105, a second sensor located within the platform 110, and/or a third sensor located externally from a housing of the system 100).
In general operation, as the one or more objects move, the sensor 115 senses the movement of the object or objects, including a current object location. In response, the platform 110, and thus the accessory 135 coupled to the platform 110 via the accessory coupler 130, moves to direct the accessory 135 toward the current object location.
The printed-circuit board 200 may include control circuitry (such as, but not limited to, the control system 400 described below).
The actuator 205 may be any actuator that applies a force. In some embodiments, the actuator 205 is a motor configured to provide a rotational force in the x-direction, including, but not limited to, one or more of the following: an alternating-current motor, an alternating-current synchronous motor, an alternating-current induction motor, a direct-current motor, a commutator direct-current motor (for example, a permanent-magnet direct-current motor, a wound-field direct-current motor, etc.), a reluctance motor (for example, a switched reluctance motor), and a hydraulic motor. In some embodiments, the actuator 205 may be configured to provide rotational force in the x-direction, as well as rotational force in the y-direction.
In some embodiments, the system 100 includes a second power supply (for example, an alternating-current (AC) power supply). In such an embodiment, the second power supply may be in addition to, or in lieu of, the battery receptacle 210 and battery 300. For example, in an embodiment in which the system 100 is incorporated into a ceiling light and/or fan assembly, the system 100 may be powered from an AC power supply (for example, a mains voltage).
In some embodiments, the system 100 includes a battery charger configured to charge battery 300. In such an embodiment, the second power supply supplies power to the battery charger to charge battery 300. When a second power supply source (for example, an AC source) is not available (for example, at jobsites, campsites, etc.), the system 100 may be powered by battery 300. In some embodiments, such as, but not limited to, embodiments in which system 100 is incorporated into a ceiling light and/or fan assembly, the second power supply may be the main power supply, while battery 300 may be used as a battery backup. In such an embodiment, the main power supply may provide power to a battery charger to charge battery 300.
The controller 405 includes an electronic processor 425 and memory 430. The memory 430 stores instructions executable by the electronic processor 425. In some instances, the controller 405 includes one or more of a microprocessor, digital signal processor (DSP), field programmable gate array (FPGA), application specific integrated circuit (ASIC), or the like. The control system 400, via the controller 405, is electrically and/or communicatively coupled to the sensor 115, the actuator 205, the battery receptacle 210, and the switch 215.
The power supply apparatus 410 receives power and outputs a nominal power to the controller 405. In the illustrated embodiment, the power supply apparatus 410 receives power from the battery 300 via the battery receptacle 210. As discussed above, in other embodiments, the power supply apparatus 410 may receive power from the second power supply. The I/O apparatus 415 provides wired and/or wireless communication between controller 405 and an external device (for example, a smartphone, a tablet, an external computer, etc.).
The user-interface 420 provides information to, and/or receives input from, a user. The user-interface 420 may include one or more of the following: switch 215; a display (for example, a liquid crystal display (LCD)); one or more light emitting diodes (LEDs) or other illumination devices; speakers for audible feedback (for example, beeps, spoken messages, etc.); or other feedback devices.
As illustrated, operation 600 begins (block 605) with the sensor 115 sensing a first image of the object 500 at a first time and a second image of the object 500 at a second time. The controller 405 then receives the first image and the second image and determines a delta image between them.
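By way of illustration only, the following Python sketch (using the OpenCV library, which is merely one possible implementation choice) shows how a binary delta image may be computed from two sensed frames. The function name, blur kernel, and threshold value are illustrative assumptions and are not taken from the disclosure.

```python
# Illustrative sketch (not the claimed implementation): computing a
# binary "delta image" from two frames using OpenCV. The blur kernel
# and threshold value are arbitrary example parameters.
import cv2

def delta_image(first_frame, second_frame, threshold=25):
    # Convert both frames to grayscale and smooth them slightly so that
    # sensor noise is not mistaken for motion.
    gray1 = cv2.GaussianBlur(cv2.cvtColor(first_frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    gray2 = cv2.GaussianBlur(cv2.cvtColor(second_frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)

    # Per-pixel absolute difference; large values indicate change/motion.
    diff = cv2.absdiff(gray1, gray2)

    # Binarize: pixels that changed more than the threshold become white.
    _, delta = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return delta
```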
The controller 405 may then filter the delta image (block 620). In some embodiments, the controller 405 filters the delta image by applying contour detection to the pixels indicating motion (for example, the pixels illustrated as white). In such an embodiment, the contour detection finds one or more outlines of one or more pixel groups that indicate motion, while filtering out small irrelevant contours. In some embodiments, filtering the delta image may also include excluding changes in light (for example, changes in daylight due to cloud coverage) from the delta image. In such an embodiment, the controller 405 may exclude light from the delta image by removing areas (for example, an area of pixels) that have brightness over a predetermined threshold. In such an embodiment, the predetermined threshold may be a calculated average brightness of the first and/or second images. In another embodiment, the controller 405 may exclude light from the delta image by first determining movement of light and excluding that movement from the delta image. In another embodiment, the controller 405 may exclude light, or the movement of light, from the delta image by selectively filtering, and/or removing areas of, the delta image that are suspected to contain movements caused by light or other extraneous artifacts. The area of the delta image to filter and/or exclude, along with the filter parameters, may be calculated by determining vector(s) of motion for one or more objects 500 in the present delta image, along with previously determined vector(s) of motion from previous delta images. In some embodiments, the data may not be filtered.
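By way of illustration only, the following sketch shows one way the filtering described above might be implemented: small, irrelevant contours are discarded, and regions brighter than the average brightness of the underlying frame are masked out as suspected lighting changes. The minimum contour area and the use of the frame's mean brightness as the predetermined threshold are example assumptions.

```python
# Illustrative sketch of filtering a delta image: bright regions are
# masked out as suspected lighting changes, then contour detection keeps
# only contours large enough to be relevant. OpenCV 4.x API assumed.
import cv2

def filter_delta(delta, gray_frame, min_area=500):
    # Mask out areas whose brightness exceeds the average brightness of
    # the underlying grayscale frame (example threshold choice).
    brightness_threshold = gray_frame.mean()
    delta = delta.copy()
    delta[gray_frame > brightness_threshold] = 0

    # Find outlines of the remaining pixel groups that indicate motion
    # and drop small, irrelevant contours.
    contours, _ = cv2.findContours(delta, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]
```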
Motion of the object 500 is then tracked (block 625). In some embodiments, motion of the object 500 is tracked by finding the largest contour in the delta image. A rectangle may then be created around the largest contour in the delta image. The area of the rectangle may be measured and compared to previous measurements (for example, previous area measurements of rectangles corresponding to the object 500 from previous operations) to determine that the same object 500 is being tracked over time. The center point of the rectangle may then be determined. Motion may then be tracked by determining the change of the center point relative to previous operations. In some embodiments, various delta image characteristics (for example, rectangle areas and center point locations) from previous operations may be stored in memory 430 of control system 400.
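By way of illustration only, the following sketch shows one way the largest contour might be tracked: a bounding rectangle is created around it, and its area and center point are compared against values from the previous operation. The dictionary-based record of previous characteristics stands in for data that the disclosure stores in memory 430.

```python
# Illustrative sketch of tracking the largest moving contour via a
# bounding rectangle, its area, and its center point.
import cv2

def track_largest_contour(contours, previous):
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    area = w * h
    center = (x + w // 2, y + h // 2)

    # Motion is the displacement of the center point since the previous
    # operation (None on the first pass).
    motion = None
    if previous is not None:
        motion = (center[0] - previous["center"][0],
                  center[1] - previous["center"][1])
    return {"area": area, "center": center, "motion": motion}
```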
The controller 405 next determines if the motion is valid (block 630). In some embodiments, the motion is determined to be valid by comparing the area and location (for example, the location of the center point) of the rectangle from the current operation to those from previous operations. For example, if the area is not approximately the same between the current delta image and previous delta images, or if the current center point is more than a predetermined distance away from the previous center point, then the motion may be determined to be invalid. If the motion is invalid, operation 600 cycles back to block 605.
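By way of illustration only, the validity check described above might be sketched as follows. The 30% area tolerance and 100-pixel distance limit are arbitrary example values for "approximately the same" area and the predetermined distance.

```python
# Illustrative validity check: the rectangle area must be approximately
# the same as in the previous operation, and the center point must not
# have jumped farther than a predetermined distance.
import math

def motion_is_valid(current, previous, area_tolerance=0.30, max_distance=100):
    if previous is None:
        return False
    area_ratio = abs(current["area"] - previous["area"]) / max(previous["area"], 1)
    distance = math.dist(current["center"], previous["center"])
    return area_ratio <= area_tolerance and distance <= max_distance
```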
If the motion is found to be valid, the controller 405 determines the new position of the object 500 (for example, based on the center point of the rectangle) (block 635). The controller 405 then controls the actuator 205 to move the platform 110 in the direction of the position of the object 500 (block 640). Operation 600 then cycles back to block 605.
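By way of illustration only, the following sketch converts the object's center point into a simple pan command for moving the platform toward the object. The signed-command interface, dead band, and gain are hypothetical; the actual system drives the actuator 205 through its own control circuitry.

```python
# Illustrative sketch of steering toward the object's new position: the
# horizontal offset of the center point from the middle of the frame is
# turned into a signed pan command. A small dead band avoids hunting.
def pan_command(center_x, frame_width, dead_band=20, gain=0.1):
    # Horizontal error of the object's center point, in pixels.
    error = center_x - frame_width // 2
    if abs(error) <= dead_band:
        return 0.0                # close enough: do not move the platform
    return gain * error           # sign selects the direction of rotation
```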
The controller 405 may then filter the delta image (block 820). In some embodiments, the controller 405 filters the delta image by applying contour detection to the pixels indicating motion (for example, the pixels illustrated as white). In such an embodiment, the contour detection finds one or more outlines of one or more pixel groups that indicate motion, while filtering out small irrelevant contours. In some embodiments, filtering the delta image may also include excluding changes in light (for example, changes in daylight due to cloud coverage) from the delta image. In such an embodiment, the controller 405 may exclude light from the delta image by removing areas (for example, an area of pixels) that have brightness over a predetermined threshold. In such an embodiment, the predetermined threshold may be a calculated average brightness of the first and/or second images. In another embodiment, the controller 405 may exclude light from the delta image by first determining movement of light and excluding that movement from the delta image. In some embodiments, the data may not be filtered.
Motion of one or more objects 500 (for example, objects 500a, 500b) is then tracked (block 825). In some embodiments, motion of each object 500 is tracked in a manner similar to that described above with respect to block 625: a rectangle is created around each relevant contour in the delta image, and the area and center point of each rectangle are compared to measurements from previous operations stored in memory 430.
Additionally, in some embodiments, tracking motion of the objects 500 includes determining if an object other than the objects 500a, 500b already being tracked (that is, a new object 500) has entered the area sensed by the sensor 115.
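By way of illustration only, the following sketch tracks several objects at once under the same rectangle/center-point scheme and flags any rectangle that cannot be matched to a previously tracked object as a new object 500. Matching by nearest previous center point within a distance limit is an example association strategy, not necessarily the one used by the disclosure.

```python
# Illustrative sketch of multi-object tracking: each contour's bounding
# rectangle is associated with the closest previously tracked object; an
# unmatched rectangle is treated as a new object.
import cv2
import math

def track_objects(contours, previous_tracks, match_distance=100):
    tracks, new_objects = [], []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        center = (x + w // 2, y + h // 2)
        record = {"area": w * h, "center": center}

        # Find the nearest previously tracked center point, if any.
        nearest = min(previous_tracks,
                      key=lambda t: math.dist(center, t["center"]),
                      default=None)
        if nearest is None or math.dist(center, nearest["center"]) > match_distance:
            new_objects.append(record)    # no close match: a new object
        else:
            tracks.append(record)         # continuation of a tracked object
    return tracks, new_objects
```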
The controller 405 next determines if the motion (for example, motion of objects 500a, 500b and/or motion of a new object 500) is valid (block 830). In some embodiments, the motion is determined to be valid by comparing the area and location (for example, the location of the center point) of each rectangle from the current operation to those from previous operations. For example, if the area is not approximately the same between the current delta image and previous delta images, or if the current center point is more than a predetermined distance away from the previous center point, then the motion may be determined to be invalid. If the motion is invalid, operation 800 cycles back to block 805.
If the motion is valid, the new positions of the objects 500 are stored and/or updated in memory 430 (block 835). A weighted location 700 of the objects 500 may then be determined (block 840). In some embodiments, the weighted location 700 is weighted based on one or more factors, including: the area of the rectangle(s) around the one or more objects 500, the distance from the system 100 to the one or more objects 500, an aspect ratio of the rectangle(s) around the one or more objects 500 (for example, a change in aspect ratio), the frequency of motion of the one or more objects 500, the distance of motion of the one or more objects 500 (for example, relatively small movements (such as painting a wall) versus relatively large movements (such as sweeping a floor)), and the speed of motion of the one or more objects 500.
In other embodiments, the weighted location 700 is determined as the midpoint between the center point of the first object 500a and the center point of the second object 500b. The controller 405 then controls the actuator 205 to move the platform 110 in the direction of the weighted location 700 (block 845). Operation 800 then cycles back to block 805.
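By way of illustration only, the following sketch computes a weighted location 700 using the rectangle area as the weighting factor, which is only one of the factors listed above. With two objects and equal weights, this reduces to the midpoint between the two center points.

```python
# Illustrative sketch of a weighted location: an area-weighted average of
# the tracked objects' center points (area is one example weight factor).
def weighted_location(tracks):
    if not tracks:
        return None
    total_weight = sum(t["area"] for t in tracks)
    if total_weight == 0:
        return None
    x = sum(t["center"][0] * t["area"] for t in tracks) / total_weight
    y = sum(t["center"][1] * t["area"] for t in tracks) / total_weight
    return (x, y)
```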
Thus, the application provides, among other things, an object tracking system and method of tracking an object. Various features and advantages of the application are set forth in the following claims.
This application claims priority to U.S. Provisional Patent Application No. 62/666,852, filed May 4, 2018, the entire contents of which are hereby incorporated by reference.