1. Technical Field
The present disclosure relates to electronic devices and, particularly, to an electronic device capable of detecting user motion and executing a function corresponding to the user motion detected.
2. Description of Related Art
Nowadays, electronic devices such as TV sets and air conditioners are very popular. Usually, the TV sets and the air conditioners are controlled by a remote controller, and if the remote controller is lost, controlling the device becomes very inconvenient, if not impossible.
Therefore, it is desirable to provide an electronic device which overcomes the aforementioned limitations.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
An embodiment of the present disclosure will now be described in detail, with reference to the accompanying drawings.
Referring to
The motion detection module 10 includes a heat detection unit 101, an analysis unit 102, and a recognition unit 103. The heat detection unit 101 is used to detect the presence of one or more humans around the electronic device 1 within a predetermined area (e.g., 2 meters in radius), and to produce a series of detection signals when the human presence is detected. In the embodiment, the heat detection unit 101 is an infrared detector or other heat detector capable of detecting the heat emitted by a human body. Each detection signal is a grayscale image (a grayscale, black-and-white, or monochrome image is composed exclusively of shades of gray, ranging from black, representing the darkest color, to white, representing the lightest color). The analysis unit 102 receives the grayscale images produced by the heat detection unit 101 and converts the grayscale images into a series of corresponding binary images using the method of Nobuyuki Otsu or a similar method. The recognition unit 103 compares the binary images, determines the changes between these binary images, and so determines the particular human motion. In the embodiment, if the recognition unit 103 determines a lack of significant movement from one binary image to the next over a predetermined time (e.g., 10 seconds), the recognition unit 103 determines that the particular human motion has been completed.
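The following sketch illustrates, by way of non-limiting example, one way the binarization and motion-completion check described above could be implemented. The function names, the frame rate, and the change tolerance are assumptions introduced for illustration and are not part of the embodiment; Otsu's method is reimplemented directly with NumPy here.

```python
import numpy as np

def otsu_threshold(gray):
    """Compute Otsu's threshold for an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum_count = np.cumsum(hist)                   # pixels at or below each level
    cum_sum = np.cumsum(hist * np.arange(256))    # intensity sum at or below each level
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0 = cum_count[t - 1] / total             # background class weight
        w1 = 1.0 - w0                             # foreground class weight
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = cum_sum[t - 1] / cum_count[t - 1]
        mu1 = (cum_sum[-1] - cum_sum[t - 1]) / (total - cum_count[t - 1])
        between_var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

def binarize(gray):
    """Convert a grayscale detection signal into a binary image."""
    return (gray >= otsu_threshold(gray)).astype(np.uint8)

def motion_completed(binary_frames, fps=10, idle_seconds=10, tol=0.005):
    """Return True when consecutive binary images show no significant
    movement over the predetermined time (e.g., 10 seconds)."""
    idle_frames = fps * idle_seconds
    if len(binary_frames) < idle_frames + 1:
        return False
    recent = binary_frames[-(idle_frames + 1):]
    return all(np.mean(a != b) <= tol for a, b in zip(recent, recent[1:]))
```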
When a particular human motion has been determined, the processing unit 30 determines the function corresponding to the particular human motion determined by the recognition unit 103 according to the relationship table stored in the storage unit 20, and executes the corresponding function. The relationship between different particular human motions and corresponding functions recorded in the relationship table can be set and reset according to the type of the electronic device 1. For example, if the electronic device 1 is a TV set, the functions recorded in the relationship table are suitable for controlling a TV set, such as changing channels, increasing the volume, and so on. If the electronic device 1 is an air conditioner, the functions recorded in the relationship table are suitable for controlling the air conditioner, such as increasing the fan speed, lowering the required temperature, and so on.
In the embodiment, the processing unit 30 allows user input to set or reset the relationships between particular human motions and particular functions, the predetermined area viewed, and any predetermined times.
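A minimal sketch of the relationship table described above is given below, assuming the table is keyed by motion names and stores callbacks for the device-specific functions; the class name and the motion names are illustrative only, and the mappings can be set or reset as described.

```python
from typing import Callable, Dict

class RelationshipTable:
    """Maps particular human motions to device-specific functions."""

    def __init__(self) -> None:
        self._table: Dict[str, Callable[[], None]] = {}

    def set_relation(self, motion: str, function: Callable[[], None]) -> None:
        """Set or reset the function executed for a particular motion."""
        self._table[motion] = function

    def execute(self, motion: str) -> None:
        """Execute the function corresponding to a recognized motion, if any."""
        action = self._table.get(motion)
        if action is not None:
            action()

# Illustrative configuration for a TV set; for an air conditioner the same
# motions could be reset to fan-speed or temperature functions instead.
table = RelationshipTable()
table.set_relation("raise_hand", lambda: print("channel up"))
table.set_relation("wave_hand", lambda: print("volume up"))
table.execute("wave_hand")  # prints "volume up"
```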
Referring to
The recognition unit 103 defines the center of each binary image as the base coordinates, and generates a rectangular coordinate system built on the base coordinates. The x-axis and y-axis of the rectangular coordinate system divide each binary image into four parts: a first quadrant, a second quadrant, a third quadrant, and a fourth quadrant. The recognition unit 103 compares each binary image with the next, determines the quadrant in which human movement, if any, has taken place, and further tracks the movement according to the changes in the binary images; the particular human motion can then be determined from the data relevant to these two characteristics.
In detail, the rectangular coordinate system utilized by the recognition unit 103 records the position of each pixel in any binary image, so that any change in pixel positions between these binary images can be recognized and mapped in relation to the four quadrants.
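One plausible sketch of this quadrant mapping follows, assuming the pixels that change between two successive binary images are counted per quadrant; the function name and the convention that the y-axis points upward are assumptions introduced for illustration.

```python
import numpy as np

def quadrant_of_change(prev_binary, curr_binary):
    """Return the quadrant (1-4) containing most of the pixel changes between
    two successive binary images, or None if nothing changed."""
    h, w = curr_binary.shape
    cy, cx = h / 2.0, w / 2.0                 # base coordinates (image center)
    ys, xs = np.nonzero(prev_binary != curr_binary)
    if ys.size == 0:
        return None                           # no movement between the images
    x = xs - cx                               # x-axis points right
    y = cy - ys                               # y-axis points up (row 0 is the top)
    counts = {
        1: int(np.sum((x >= 0) & (y >= 0))),  # first quadrant: upper right
        2: int(np.sum((x < 0) & (y >= 0))),   # second quadrant: upper left
        3: int(np.sum((x < 0) & (y < 0))),    # third quadrant: lower left
        4: int(np.sum((x >= 0) & (y < 0))),   # fourth quadrant: lower right
    }
    return max(counts, key=counts.get)
```

Tracking the quadrant returned for each successive pair of binary images yields a sequence such as 2, 2, 1, 1, from which a movement from the upper left toward the upper right could be inferred.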
For example, as shown in
In the embodiment, the heat detection unit 101 can detect a number of users, and effectively isolate and convey images of each user. In detail, the heat detection unit 101 produces a series of grayscale images including the number of users, and the analysis unit 102 converts the grayscale images into a series of corresponding binary images using the method of Nobuyuki Otsu or a similar method. The recognition unit 103 can recognize the motion of each user, and the processing unit 30 can execute the function corresponding to each recognized motion. In detail, the recognition unit 103 compares the binary images, identifies each user in these binary images, determines the changes associated with each user by comparing the changes between these binary images, and so determines the particular human motion of the corresponding user. In the embodiment, if the recognition unit 103 determines at least two simultaneous motions, the processing unit 30 executes the function corresponding to the motion first recognized by the recognition unit 103.
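The embodiment does not specify how each user is isolated in the binary images; one plausible sketch, assuming each user appears as a separate connected foreground region, uses connected-component labeling from SciPy (the minimum-area filter and the function names are assumptions for illustration).

```python
import numpy as np
from scipy import ndimage

def split_users(binary, min_area=50):
    """Isolate each user as one connected foreground region of a binary image."""
    labels, count = ndimage.label(binary)     # label connected components
    masks = []
    for i in range(1, count + 1):
        mask = (labels == i).astype(np.uint8)
        if mask.sum() >= min_area:            # ignore small noise blobs
            masks.append(mask)
    return masks

def pick_motion(recognized_motions):
    """When two or more motions are recognized simultaneously, only the motion
    recognized first is executed, as in the embodiment."""
    return recognized_motions[0] if recognized_motions else None
```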
In step S301, the heat detection unit 101 detects the presence of one or more humans within the predetermined area and produces a series of grayscale images when the human presence is detected.

In step S302, the analysis unit 102 converts the grayscale images into a corresponding series of binary images.
In step S303, the recognition unit 103 compares each binary image with the next, determines any changes between the binary images, and then determines the particular human motion according to the changes. In the embodiment, the recognition unit 103 establishes the center of each binary image as the base coordinates and creates a rectangular coordinate system built on the base coordinates. The rectangular coordinate system applies an x-axis and a y-axis to divide each binary image into four parts: a first quadrant, a second quadrant, a third quadrant, and a fourth quadrant. The recognition unit 103 compares each binary image with the next, determines the part of the human body which is moving and how it is moving, and further determines the quadrant in which the movement is taking place.
In step S304, the processing unit 30 determines the function corresponding to the particular human motion according to the relationship table stored in the storage unit 20, and executes the function.
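A minimal end-to-end sketch of steps S301 through S304 is given below, reusing the helper functions sketched earlier; capture_grayscale and classify_motion are hypothetical placeholders for the heat detection unit 101 and for a trajectory-based motion classifier, neither of which is specified by the embodiment.

```python
def classify_motion(quadrants):
    """Hypothetical classifier: name the motion by its quadrant trajectory."""
    return "motion_" + "-".join(str(q) for q in quadrants[:4])

def run_pipeline(capture_grayscale, table, fps=10):
    """Loop over detection signals and execute the mapped function once a
    particular human motion has been completed."""
    binary_frames, quadrants = [], []
    while True:
        gray = capture_grayscale()                 # S301: grayscale detection signal
        binary_frames.append(binarize(gray))       # S302: Otsu binarization
        if len(binary_frames) >= 2:                # S303: compare successive images
            q = quadrant_of_change(binary_frames[-2], binary_frames[-1])
            if q is not None:
                quadrants.append(q)
        if motion_completed(binary_frames, fps=fps) and quadrants:
            table.execute(classify_motion(quadrants))  # S304: execute mapped function
            binary_frames.clear()
            quadrants.clear()
```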
It is believed that the present embodiments and their advantages will be understood from the foregoing description, and it will be apparent that various changes may be made thereto without departing from the spirit and scope of the disclosure or sacrificing all of its material advantages, the examples hereinbefore described merely being exemplary embodiments of the present disclosure.
Foreign application priority data: Number 99145677; Date: Dec. 2010; Country: TW; Kind: national.