The present disclosure relates to systems and methods for identifying the real-time location of a personal electronic device within a structure, such as a building, having a plurality of user workspaces or work desks. The location is identified by recognizing a variety of different visual features within a field of view of a camera of the device, which in turn helps track usage of each of the workspaces or work desks.
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
Shared workspaces or work desks are becoming increasingly common in buildings and shared work areas. A shared workspace is a space that may have a power outlet or docking station where a user can charge a personal electronic device (PED), and where the user can log on to a local area network for communication with a local computing system or to obtain access to the Internet. Often, such workspaces are spread out within a building and may be in two or more distinct buildings in a campus-like setting, for example a corporate campus or college campus. In such applications, there is often a strong interest in monitoring the usage of the available workspaces to determine which workspaces are being heavily used and which are being only lightly used. This can help the entity providing the workspaces to allocate the available workstations more efficiently and make maximum use of each workstation.
In buildings where the workstations may be relatively closely located, or possibly located on several different floors of a single building, a challenge arises in accurately identifying which workstation a user is using. Simple GPS signals may not work reliably inside a building because of attenuated signal strength, especially if the workstations are almost directly above one another on different floors of the same building.
Another challenge is being able to reliably identify which workstation a user is using without the need for complex and expensive additional equipment to be installed on all of the workstations that are available for use.
This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
In one aspect the present disclosure relates to a method for identifying a location of a personal electronic device carried and used by a user within a structure, and wherein the personal electronic device has a camera. The method may comprise using the camera of the personal electronic device to obtain at least one image of surroundings where the personal electronic device is being used. The method may further involve using a processor to perform a comparison of the at least one image with a plurality of differing predetermined features stored in a memory, wherein the memory is accessible by the processor. The method may further include identifying, from the comparison, a specific location within the structure where the personal electronic device is located.
In another aspect the present disclosure relates to a method for identifying a location of a portable personal electronic device carried and used by a user within a structure, and wherein the personal electronic device has a camera. The method may comprise using the camera of the personal electronic device to obtain at least one image of surroundings where the personal electronic device is being used. The method may further involve using a processor to access a look-up table containing a plurality of predetermined features. The plurality of predetermined features may include at least one of a specific color, a window, a light fixture, a patterned wall covering, and a heating/ventilation/air conditioning (HVAC) register. The method may further involve using the processor to use the at least one predetermined feature obtained from the look-up table to perform a comparison of the at least one image with the at least one predetermined feature and, from the performed comparison, to determine a specific location within the structure where the personal electronic device is located.
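The look-up table of predetermined features described above may be sketched as a simple mapping from candidate locations to the feature sets expected in view at each location. The sketch below is illustrative only; the location names and feature labels are assumptions, not taken from the disclosure.

```python
# Hypothetical look-up table: workstation location -> set of predetermined
# features expected within the camera's field of view at that location.
FEATURE_TABLE = {
    ("Building A", "Floor 2", "Desk 12"): {"window", "light_fixture", "hvac_register"},
    ("Building A", "Floor 2", "Desk 14"): {"patterned_wall", "light_fixture"},
    ("Building B", "Floor 1", "Desk 3"):  {"window", "blue_wall", "column"},
}

def locate(observed_features):
    """Return the location whose expected feature set best overlaps the
    features observed in the captured image, or None if nothing matches."""
    best, best_score = None, 0
    for location, expected in FEATURE_TABLE.items():
        score = len(expected & observed_features)
        if score > best_score:
            best, best_score = location, score
    return best

# Example: an image found to contain a window, a light fixture and an
# HVAC register resolves to Desk 12.
print(locate({"window", "light_fixture", "hvac_register"}))
```

In practice the comparison would tolerate partial matches, since not every expected feature will be visible from every seating position.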
In still another aspect the present disclosure relates to a system for identifying a location of a personal electronic device carried and used by a user within a structure. The system may comprise a portable personal electronic device used by the user, and a camera operably associated with the portable personal electronic device. The camera may be configured to obtain at least one image of surroundings inside the structure where the personal electronic device is being used. A memory may be included for storing images of a plurality of differing predetermined features present in various locations within the structure. A processor may be included which is configured to perform a comparison of the at least one image with the plurality of differing predetermined features and, from the comparison, to identify a specific location within the structure where the personal electronic device is located.
Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
Referring to
The system 10 makes use of a camera 16 of the laptop 14 and identification software 18 loaded onto the laptop. The identification software 18 works in connection with the camera 16 to identify features within a field of view of the camera 16 that enable a management system 20a or 20b to determine exactly at which workstation, from among a plurality of workstations, the laptop 14 is located. This determination may be made in real time or in near real time (i.e., possibly within a few minutes) of the user setting up the laptop 14 and powering it on. Management systems 20a and 20b may be identical in construction or they may be different. Management system 20a is shown located in a Cloud environment and in communication with the laptop 14 via a wide area network (e.g., Internet) connection. Management system 20a may include its own processor 21 (hardware and associated software) for performing a comparison of image features and other processing tasks. Management system 20b may be located at a facility (e.g., building or campus) where the workstation 12 is located and may be in communication with the laptop 14 via a local area network (LAN) connection or even a wide area network connection. Management system 20b may likewise include a processor 23 (hardware and software) for performing image comparison and other processing tasks. It will be appreciated that if management system 20a is included, then management system 20b may not be needed, and vice versa. Also, the identification software 18 running in the laptop 14 could instead be included on either of the management systems 20a or 20b, in which case the camera 16 would simply transmit an image to the management system 20a or 20b and the feature comparisons would be performed by that management system.
The identification software 18 may be loaded onto the laptop 14 and configured to start automatically when the laptop 14 boots up, or the user may be required to start it manually. It is believed that in most instances it will be preferable for the identification software 18 to start automatically. The laptop 14 may also include a docking station 22 and/or a USB charging port 24 and/or an AC outlet 26 for charging the laptop 14.
The identification software 18 uses the camera 16 to obtain an image within a field of view of the camera 16, where the field of view is indicated by dashed lines 16a. A portion of the field of view 16a will cover the user seated in front of the laptop 14, as well as other portions surrounding the user (i.e., to the left, right and above the user). The identification software 18 identifies various features within the image such as wall colors, light fixtures 28, windows 30, structures such as bookshelves 32, architectural features such as columns 34 or ledges, patterns such as wainscot wall paneling, wall paper patterns, heating registers, and possibly electrical conduit(s) or HVAC ductwork, just to name a few. Preferably, the features that the identification software 18 is designed to identify are relatively permanent features or fixtures (e.g., windows and architectural features) that are not likely to change over time.
The identification software 18 may use the image obtained via the camera 16 to compare the image to various stored image features (e.g., images of architectural features, images of windows, images of wall covering patterns, images of light fixtures, etc.) in a stored image database 18a to determine when a certain feature is present within the image obtained by the camera 16. The stored image features may be constructed and saved in the stored image database 18a based on known features present within the building (e.g., windows, architectural features, etc.). The stored images may show various features (e.g., windows, light fixtures, etc.) from different angles that would be expected to be encountered based on where the workstations are located within the building or environment.
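One simple way to sketch the comparison of a captured image against the stored image database 18a is with coarse color histograms, where a high histogram overlap suggests a stored feature is present. This is only a minimal illustration under assumed data structures; a practical implementation would use a robust feature detector rather than raw color statistics.

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize (r, g, b) pixel tuples into coarse bins and count them."""
    step = 256 // bins
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def similarity(hist_a, hist_b):
    """Histogram intersection, normalized to the range [0, 1]."""
    overlap = sum(min(hist_a[k], hist_b[k]) for k in hist_a)
    total = max(sum(hist_a.values()), sum(hist_b.values()), 1)
    return overlap / total

def best_match(captured, stored_db, threshold=0.5):
    """Return the name of the stored feature image most similar to the
    captured image, or None if nothing clears the threshold."""
    cap_hist = color_histogram(captured)
    scored = ((similarity(cap_hist, color_histogram(px)), name)
              for name, px in stored_db.items())
    score, name = max(scored)
    return name if score >= threshold else None
```

Storing each feature from several viewing angles, as the passage above describes, would simply add more entries to the stored database, all mapping back to the same feature name.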
Once the identification software 18 identifies the various features present within the field of view 16a of the camera 16, the identification software may compare the identified features in the field of view 16a against a lookup table which lists various known features for each different location where the various workstations are located within a building (or buildings) or within some other predefined area. By determining which of the features are present in the field of view 16a of the laptop's camera 16, the identification software 18 may determine the exact location of the laptop 14, for example the specific building, floor of the building, room of the building, and specific workstation within the room, and may transmit this information to the management system 20a or 20b. The management system 20a or 20b may then update a real time log to note that the specific workstation 12 where the laptop 14 is present is currently being used. This information may be used by the management system 20a or 20b to track the usage of a plurality of workstations in a given building or other type of structure or environment.
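The real time usage log maintained by the management system might be sketched as follows, assuming the identification software reports a resolved workstation identifier on check-in. The class and method names are illustrative assumptions.

```python
from datetime import datetime, timezone

class UsageLog:
    """Tracks which workstations are currently in use and counts total
    sessions per workstation for utilization reporting."""
    def __init__(self):
        self.in_use = {}         # workstation -> session start time
        self.session_count = {}  # workstation -> number of sessions seen

    def check_in(self, workstation):
        # Record that this workstation is now occupied.
        self.in_use[workstation] = datetime.now(timezone.utc)
        self.session_count[workstation] = self.session_count.get(workstation, 0) + 1

    def check_out(self, workstation):
        # Mark the workstation as free again.
        self.in_use.pop(workstation, None)

    def currently_used(self):
        return sorted(self.in_use)
```

Aggregating `session_count` over days or weeks would yield the heavily-used versus lightly-used breakdown described in the background section.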
As noted in
Referring to
With further reference to
It will also be appreciated that the system 10 may be configured to obtain “hints” as to where the laptop 14 might be located, such as from inconclusive GPS information, nearby WiFi networks, what hardware monitor device the laptop 14 is connected to, etc. These one or more initial “hints” can be taken into account to make the final attribute search more tractable, faster and even more reliable. For example, it may be known that weak GPS signals are only receivable at those workstation locations that are adjacent to a window. So the identification software 18 could include a portion that reports any real time GPS information that the laptop 14 has acquired as to its real time location, and if this GPS information is reported, then the identification software 18 will exclude those features that are not present in proximity to any window. Conversely, the identification software may look at only those features that are known to be adjacent to a window. Still further, the system 10 may even use the GPS information to eliminate workstations from consideration that may be in other buildings where reception of the GPS signal is known to be impossible. Taking into account one or more of the above considerations may significantly simplify and speed up completion of the comparison operations. Such a feature may necessitate the use of one or more additional databases to group specific features in relation to one another (e.g., all features present around the windows of a structure, or all features present in connection with a unique color identified in a scene).
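The hint-based pruning described above can be sketched as cheap filters applied before the expensive image comparison. The workstation attributes and hint names below are illustrative assumptions only.

```python
# Hypothetical workstation metadata used for hint-based pruning.
WORKSTATIONS = {
    "Desk 1": {"near_window": True,  "building": "A"},
    "Desk 2": {"near_window": False, "building": "A"},
    "Desk 3": {"near_window": True,  "building": "B"},
}

def prune_candidates(stations, gps_fix_seen=False, building_hint=None):
    """Apply cheap hints before the expensive image comparison.

    A weak GPS fix implies a window-adjacent workstation; a building
    hint (e.g., from a recognized WiFi network) eliminates other
    buildings entirely.
    """
    candidates = dict(stations)
    if gps_fix_seen:
        candidates = {k: v for k, v in candidates.items() if v["near_window"]}
    if building_hint is not None:
        candidates = {k: v for k, v in candidates.items()
                      if v["building"] == building_hint}
    return sorted(candidates)
```

With no hints, every workstation's feature set must be compared; each hint shrinks the candidate list and thus the number of comparisons.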
Still further, the time of day and calendar day could be used to further limit the features that would need to be searched. For example, in the month of December at 8:00 p.m., the system 10 may determine that certain features visible through a window will not be present (whereas they otherwise would be present in June), and that the window may appear as a solid black color at 8:00 p.m. in the month of December. This information could be used to eliminate a large number of workstation features from consideration that are known to not be in the same field of view of a window. It would also be possible to use both front and rear facing cameras of PEDs such as computing tablets to obtain even more image information about the surroundings of a given workstation location, and thus even further enhance the accuracy of the system 10 in identifying the exact location of a workstation.
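The time-of-day and calendar filtering above might be sketched as follows, assuming a simplified table of sunset hours by month (the hours shown are illustrative, not from the disclosure) and a naming convention that marks through-the-window features.

```python
# Assumed sunset hours by month (24-hour clock); months not listed fall
# back to a default. December ~5 p.m., June ~9 p.m. are illustrative.
SUNSET_HOUR_BY_MONTH = {6: 21, 12: 17}

def window_features_visible(month, hour):
    """Return True if daylight features seen through a window are
    expected to be visible at this month and hour."""
    sunset = SUNSET_HOUR_BY_MONTH.get(month, 19)
    return hour < sunset

def searchable_features(all_features, month, hour):
    """Drop through-the-window (outdoor) features from the search set
    after dark, when a window may image as solid black."""
    if window_features_visible(month, hour):
        return set(all_features)
    return {f for f in all_features if not f.startswith("outdoor_")}
```

At 8:00 p.m. in December this removes every `outdoor_`-prefixed feature from consideration, shrinking the comparison workload exactly as the passage describes.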
Paint color may be a particularly helpful identifying feature. Even static objects that are outside of the building that the laptop 14 is located in, but still within the field of view 16a of the laptop's camera 16, may be used to help identify the location of a workstation 12.
While various embodiments have been described, those skilled in the art will recognize modifications or variations which might be made without departing from the present disclosure. The examples illustrate the various embodiments and are not intended to limit the present disclosure. Therefore, the description and any claims should be interpreted liberally with only such limitation as is necessary in view of the pertinent prior art.
This application claims the benefit of U.S. Provisional Application No. 62/422,301, filed on Nov. 15, 2016. The entire disclosure of the above application is incorporated herein by reference.
Number | Date | Country
---|---|---
62422301 | Nov 2016 | US