The patent relates generally to control systems and, more particularly, to an intelligent smart room control system based on three-dimensional sensing.
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
Embodiments of the disclosure relate to a room control system that includes a user interface for displaying at least one of an event, an activity, or a status of an environment, and a camera for monitoring at least one of the event, the activity, or the status of the environment, the camera configured to capture a gesture rendered by a person within an interactive area as an input and transmit information associated with the gesture input, wherein the information is at least one of control information, warning information, acknowledgment information, rejection information, or response information. The user interface is configured to control an external device such as a home appliance, an entertainment system, a gaming device, an HVAC system, a lighting system, a client device, or a video device. For example, the client device may be a phablet, a cellular phone, a tablet, a personal digital assistant (PDA), a laptop, a computer, or a wearable device. The room control system further includes a second camera configured to track a change in biometric position of a person, and the user interface is configured to display a dynamic angle of view associated with the change in biometric position of the person, wherein each of the first and second cameras may be a video camera, an analog camera, a digital video camera, a color camera, a monochrome camera, a camcorder, a PC camera, a webcam, or a CCTV camera. At least one of a depth sensor or an RGB sensor is integrated into the first and second cameras.
According to another exemplary embodiment of the disclosure, a system comprises a non-transitory computer-readable medium for carrying or having computer-executable instructions to monitor at least one of an event, an activity, or a status of an environment, the instructions causing a machine to modulate at least one beam with electronic image information based on the at least one of the event, the activity, or the status of the environment and to form the at least one beam with the electronic image information into a visualization image for display. The visualization image comprises weather information, time and date information, utility usage and cost information, traffic information, electronic mail message information, meeting schedule information, incoming call information, and contact list information. The non-transitory computer-readable medium further comprises instructions for capturing a gesture rendered by a person within an interactive area as an input and transmitting information associated with the gesture input, wherein the information is at least one of control information, warning information, acknowledgment information, rejection information, or response information. The instructions to display further comprise tracking a change in biometric position of a person and displaying a dynamic angle of view associated with the change in biometric position of the person.
Another exemplary embodiment of the disclosure relates to systems and methods for displaying information of an event, an activity, or a status of an environment. For example, a room control system includes a sensing device or camera, a projector, and a communication link communicatively coupling the camera and the projector. One or more computing devices, home appliances, HVAC systems, and lighting systems may be linked to at least one of the sensing device and the projector via one or more communication links, where tasks are performed by at least one of the computing device, the sensing device, or the projector, linked over a network through one or more servers.
According to another exemplary embodiment of the disclosure, the room control system is configured to control and monitor one or more of the home appliances, HVAC systems, and lighting systems, and to display an event, an activity, or a status of one or more of the home appliances, HVAC systems, and lighting systems.
These and other features, aspects, and advantages of this disclosure will become better understood when the following detailed description of certain exemplary embodiments is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
The following description is presented to enable any person skilled in the art to make and use the described embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the described embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the described embodiments. Thus, the described embodiments are not limited to the embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein.
As another example, as the person 106 walks toward or approaches an interactive area, the camera 102 detects the person 106. The detected signal, in the form of an imaging signal, is then transmitted via a communication link to the projector 104. The projector 104 receives the detected signal, processes the detected signal, modulates at least one beam with electronic image information, forms the at least one beam with electronic image information into an image or a visualization display 118, and displays the image or the visualization display 118 on the graphical user interface 110 for interaction. The image includes, but is not limited to, weather information, time and date information, utility usage and costs, a calendar, traffic information, electronic mail message information, meeting schedule information, incoming call information, and so forth. Once the person 106 is within the interactive area, any body gesture rendered by the person 106 is captured and tracked by the camera 102 as an input of at least one of the event, the activity, or the status of the environment, for operation. For example, if the person 106 points to at least one area of the image 118 that indicates the electronic mail message information, the input is captured and transmitted to the projector 104 for an operation, which may be to open the electronic mail message, read the content of the message, and either respond to the message, save the message for later action, or delete the message. In one embodiment, the camera that detects the person approaching the interactive area may not be the same camera that captures and tracks the body gestures rendered by the person 106. As an example, the first camera that detects the person approaching the interactive area may be located behind the graphical user interface 110 while the second camera for capturing and tracking the body gesture may be located in front of the graphical user interface 110.
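By way of illustration only, the detect-display-interact flow described above can be sketched as follows. Every name in the sketch (Gesture, build_visualization, handle_email, interaction_step) is a hypothetical stand-in; the disclosure does not prescribe any particular software interface.

```python
# Hypothetical sketch of the detect/display/interact flow; all names are
# invented stand-ins, since the disclosure defines no software API.
from dataclasses import dataclass

@dataclass
class Gesture:
    target: str   # region of the displayed image the person points at
    action: str   # e.g., "open", "respond", "save", "delete"

def build_visualization() -> dict:
    # Weather, time/date, utility usage, traffic, e-mail, meetings, calls, ...
    return {"weather": "sunny", "email": ["message-1", "message-2"]}

def handle_email(action: str, mailbox: list) -> None:
    if action == "open" and mailbox:
        print("opening:", mailbox[0])
    elif action == "delete" and mailbox:
        mailbox.pop(0)

def interaction_step(person_in_area: bool, gesture: Gesture, view: dict) -> None:
    """One pass of the loop: display for the approaching person, then act on
    a body gesture captured while the person is in the interactive area."""
    print("displaying:", view)   # projector forms the beam into an image
    if person_in_area and gesture.target == "email":
        handle_email(gesture.action, view["email"])

# Example: a person in the interactive area points at the e-mail region.
interaction_step(True, Gesture("email", "open"), build_visualization())
```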
The camera or sensing device may be a video camera, an analog camera, a digital video camera, a color camera, a monochrome camera, a camcorder, a PC camera, a webcam, an infrared (IR) video camera, a CCTV camera, a depth camera, a Red Green Blue-Depth (RGB-D) camera, and the like. To compute the distance of an object such as a person 214 from the camera or sensing device, at least a pair of stereo cameras, one or more stereo cameras in a network, or one or more depth cameras 202 may be used. A processor, either integrated into the camera or communicatively coupled to the camera, measures the distance or depth information, modulates at least one beam with electronic image information including the depth information, forms the at least one beam with electronic image information into an image or the visualization display 212, and displays or projects the image or visualization display 212 based on the perspective of a person 206.
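For a calibrated stereo pair, the depth of a point follows the standard pinhole relation Z = f·B/d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity of the same point between the two views. A minimal sketch of this computation, with illustrative numbers only:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Standard pinhole stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 600 px focal length, 12 cm baseline, 24 px disparity.
print(depth_from_disparity(600.0, 0.12, 24.0))   # -> 3.0 (meters)
```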
The camera 302 includes an imaging sensor 308, a video interface 310, and a communication interface 312 communicatively coupled to the sensor 308 and the interface 310 via one or more interface buses 314. Other computer implemented devices, such as a computer readable medium, a processor, or a transducer (microphone and/or speaker), for performing other features not defined herein may be incorporated into the camera 302. The projector 304 includes a processor 316, a computer implemented device such as an input/output interface 318, a communication interface 320, a computer readable medium 324, and an acoustic device 326. These various computer implemented devices 316, 318, 320, 324, 326 are communicatively coupled to each other by one or more interface buses 328. Other computer implemented devices for performing certain features not defined herein may be incorporated into the projector 304. Depending on the desired configuration, the processor 316 may be of any type, including but not limited to a microprocessor, a microcontroller, a digital signal processor, or any combination thereof. The processor 316 may include one or more levels of caching, such as a level cache memory, one or more processor cores, and registers. The computer readable medium 324 may be of any type, including but not limited to volatile memory such as RAM, non-volatile memory such as ROM, removable media, non-removable media, or any combination thereof, implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. The acoustic device 326 may be a microphone, a speaker, or a combination thereof for capturing surrounding sound and facilitating control of other interfaced computer implemented devices or other external computer implemented devices.
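As a rough, purely schematic model of the component layout just described (the field names are placeholders and do not correspond to any actual driver or hardware interface):

```python
# Schematic model of the projector's devices coupled over interface bus 328;
# placeholder names only, not an actual hardware or driver interface.
from dataclasses import dataclass, field

@dataclass
class ProjectorModel:
    processor: str = "processor 316"          # microprocessor, MCU, or DSP
    io_interface: str = "input/output interface 318"
    comm_interface: str = "communication interface 320"
    readable_medium: str = "computer readable medium 324"
    acoustic_device: str = "acoustic device 326"
    bus_328: list = field(default_factory=list)

    def couple(self) -> None:
        # All devices are communicatively coupled via the interface bus.
        self.bus_328 = [self.processor, self.io_interface, self.comm_interface,
                        self.readable_medium, self.acoustic_device]

model = ProjectorModel()
model.couple()
print(model.bus_328)
```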
The computing device 352 may be a personal computer or desktop computer, a laptop, a cellular or smart phone, a tablet, a personal digital assistant (PDA), a gaming console, an audio device, a video device, an entertainment device such as a television, a vehicle infotainment system, or the like. One or more computing devices 352 may be linked to at least one of the sensing device 300 and the projector 304 via one or more communication links 306, where tasks are performed by at least one of the computing device 352, the sensing device 300, or the projector 304, linked over the network 350 through one or more servers. The computing device 352 can in some embodiments be referred to as a single client machine or a single group of client machines, while the server may be referred to as a single server or a single group of servers. In one embodiment a single computing device 352 communicates with more than one server, while in another embodiment a single server communicates with more than one computing device 352. In yet another embodiment, a single computing device 352 communicates with a single server. The server may be an application server, a certificate server, a mobile information server, an e-commerce server, an FTP server, a directory server, a CMS server, a printer server, a management server, a mail server, a public/private access server, a real-time communication server, a database server, a proxy server, a streaming media server, or the like. In some embodiments, a cloud computing device may be implemented as one or more servers which may be communicated with via the Internet and which may be co-located or geographically distributed, wherein shared resources, software, and information are provided to computers and other devices on demand, for example, as will be appreciated by those skilled in the art.
The network 350 can comprise one or more sub-networks and can be installed between any combination of the computing devices 352, the server, and the computing machines and appliances included within the room control system 300. In some embodiments, the network 350 can be, for example, a local-area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a primary network comprised of multiple sub-networks located between the computing device 352 and the server, a primary public network with a private sub-network, a primary private network with a public sub-network, or a primary private network with a private sub-network. Still further embodiments include a network 350 that can be any network type, such as a point-to-point network, a broadcast network, a telecommunication network, a data communication network, a computer network, an ATM (Asynchronous Transfer Mode) network, a SONET (Synchronous Optical Network) network, an SDH (Synchronous Digital Hierarchy) network, a wireless network, a wireline network, and the like. Depending on the application, other networks may be used so that data exchanged between the computing device and the server can be transmitted over the network. The network topology of the network 350 can differ among embodiments, which may include a bus network topology, a star network topology, a ring network topology, a repeater-based network topology, or a tiered-star network topology. Additional embodiments may include a network of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be, for example, AMPS, TDMA, CDMA, GSM, GPRS, UMTS, LTE, or any other protocol able to transmit data among mobile devices.
The communication interface 320 allows software and data to be transferred between the computer system and other external electronic devices in the form of signals, which may be, for example, electronic, electromagnetic, optical, or other signals capable of being received by the communication interface 320. The communication interface 320 may be, for example, a modem, a network interface, a communication port, a PCMCIA slot and card, or the like. The computer implemented device 318 incorporated in the projector 304 may include a video adapter, a digitizer, a printer, a gesture recognition module, a camera, and the like. The camera 318 detects the user's viewing location and direction to assist the visualization and enable user interactions when the image is displayed on the graphical user interface 110.
Through the communication links 306, the system 300 is capable of communicating with the electronic devices 352, home appliances 354, and the like, to enable more functionalities such as room temperature control, personal email notification, visitor identification, and so forth.
In one example, the indoor sensor delivers the RGBD information of the indoor scene to the processor. The processor determines whether there is a user in the scene, for example using a detection module. The detection module may be a 2D based human detection module, a 3D based human detection module, a 2D based face detection module, or a 3D based head detection module. For initial setup, the head position, orientation, or a combination thereof of a person is detected by the indoor camera of the system 400. The head position, orientation, or combination thereof, for example in the form of data, is transmitted to a coordinate module and is then converted into visualization parameters. The coordinate module may be integrated into the system 400. In some embodiments, the coordinate module may be integrated into one of the indoor camera, a processor, or a projector. In another embodiment, the coordinate module, as a separate device, may be communicatively coupled to the system 400. Once a head pose viewing at an angle 430 is identified, the system 400 performs tracking using the visualization parameters and displays the image 406 through the graphical user interface 404 based on the angle of view 430.
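One plausible way to turn a detected head pose into visualization parameters is to take the vector from the head position to the display center and express it as a viewing angle; the parameterization and names below are assumptions for illustration, not taken from the disclosure.

```python
# Hypothetical head-pose-to-view-parameter conversion; the disclosure does
# not specify the parameterization, so azimuth/elevation is an assumption.
import math

def view_parameters(head_xyz, display_center_xyz):
    """Return (azimuth, elevation) in degrees of the display as seen
    from the detected head position."""
    dx = display_center_xyz[0] - head_xyz[0]
    dy = display_center_xyz[1] - head_xyz[1]
    dz = display_center_xyz[2] - head_xyz[2]
    azimuth = math.degrees(math.atan2(dx, dz))
    elevation = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return azimuth, elevation

# Head half a meter to the left of, and level with, a display 2 m away:
print(view_parameters((-0.5, 0.0, 0.0), (0.0, 0.0, 2.0)))  # ~ (14.0, 0.0)
```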
If multiple depth sensors are used, a more complete 3D model can be reconstructed using various algorithms programmed into or stored in the system 400 or the coordinate module. One example algorithm uses a volumetric representation to fuse the depth data from multiple calibrated sensors. In this example, a 3D volume, for example a 3D distance field, is used to represent the scene and can be initialized with the depth map of one depth sensor. The coordinate system, also referred to as a world coordinate system, can either be borrowed from the camera used for initialization or taken from another camera with a known relative position and orientation with respect to this camera. The volume can be updated according to the data of the remaining depth sensors, which is first transformed to the world coordinate system using calibration parameters obtained beforehand. The update can be performed either sequentially or simultaneously via volumetric fusion techniques. Color information should also be recorded in the meantime. In the example of a point or mesh representation, a texture image can be reconstructed. Alternatively, if a volumetric field is used, a single color or a color distribution for each voxel can be recorded.
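One concrete form of the distance-field update described above is a truncated signed distance field (TSDF) maintained as a running weighted average per voxel. The sketch below folds a single depth observation into a single voxel and is schematic only: it assumes the voxel depth has already been expressed in the world coordinate system via the beforehand calibration, and it omits the camera model entirely.

```python
# Schematic single-voxel TSDF update (running weighted average); assumes
# the calibration transform to world coordinates was already applied.
import numpy as np

def fuse_depth_sample(tsdf, weight, idx, voxel_depth, observed_depth,
                      trunc=0.05):
    """Fold one depth observation into one voxel of the distance field."""
    sdf = observed_depth - voxel_depth      # signed distance to the surface
    if sdf < -trunc:
        return                              # voxel is hidden behind the surface
    d = min(1.0, sdf / trunc)               # truncate to the band [-1, 1]
    w = weight[idx]
    tsdf[idx] = (tsdf[idx] * w + d) / (w + 1.0)   # running weighted average
    weight[idx] = w + 1.0

tsdf = np.zeros((64, 64, 64))
weight = np.zeros((64, 64, 64))
fuse_depth_sample(tsdf, weight, (10, 10, 10),
                  voxel_depth=1.00, observed_depth=1.02)
print(tsdf[10, 10, 10], weight[10, 10, 10])   # -> 0.4 1.0
```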
While the room control systems 100, 200, 300, 400 provide an interface for viewing information or configuring a security system, the systems are also configured to facilitate communication with, and control of, at least one of the home appliances, HVAC system, lighting system, and so forth available at a site. In one example, as the user approaches the door to leave, the system shows an alert that the oven light is turned on. The user can activate a camera installed in the kitchen to display an event or status in the kitchen and then turn off the power to the oven.
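The oven scenario might be expressed as a simple condition checked when the system detects the user approaching the door; the rule format and device names below are invented for illustration, as the disclosure defines no such interface.

```python
# Invented appliance-alert rule for illustration; the disclosure does not
# define a rule format or device naming scheme.
APPLIANCE_STATE = {"oven_light": "on", "kitchen_camera": "idle"}

def alerts_on_leaving(state: dict) -> list:
    """Alerts to display when the user approaches the door to leave."""
    alerts = []
    if state.get("oven_light") == "on":
        alerts.append("Oven light is on - activate kitchen camera to check?")
    return alerts

def turn_off(device: str, state: dict) -> None:
    state[device] = "off"

for alert in alerts_on_leaving(APPLIANCE_STATE):
    print(alert)
turn_off("oven_light", APPLIANCE_STATE)   # user confirms via the interface
print(APPLIANCE_STATE["oven_light"])      # -> off
```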
The embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
Embodiments within the scope of the disclosure may also include non-transitory computer-readable storage media or machine-readable medium for carrying or having computer-executable instructions or data structures stored thereon. Such non-transitory computer-readable storage media or machine-readable medium may be any available media that can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such non-transitory computer-readable storage media or machine-readable medium can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions or data structures. Combinations of the above should also be included within the scope of the non-transitory computer-readable storage media or machine-readable medium.
Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network.
Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments. Generally, program modules include routines, programs, objects, components, and data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
While the patent has been described with reference to various embodiments, it will be understood that these embodiments are illustrative and that the scope of the disclosure is not limited to them. Many variations, modifications, additions, and improvements are possible. More generally, embodiments in accordance with the patent have been described in the context of particular embodiments. Functionality may be separated or combined in blocks differently in various embodiments of the disclosure or described with different terminology. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure as defined in the claims that follow.
This application claims priority to U.S. provisional patent application Ser. No. 62/273,620, filed Dec. 31, 2015, the contents of which are incorporated herein by reference as if fully set forth herein.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2016/082905 | 12/30/2016 | WO | 00
Number | Date | Country
---|---|---
62273620 | Dec 2015 | US