This application is related to U.S. Design Appl. Nos. 29/718,126, 29/718,127, and 29/718,129, which are incorporated herein by reference in their entireties.
When working with large datasets, such as business data, being able to easily visualize the information is necessary. Typically, approaches to interact with the data involve flat, table views of the data. In such views, individual records are generally shown as rows of a table, and the various fields of the records are shown as columns of the table.
In order to understand the data contained in the table, users may be able to search records through the use of an input field. For certain types of values, users may be able to use the input field to limit the displayed data to certain ranges of the values. By interacting with other parts of the table, such as column headings, users may be able to reorder the data based on the corresponding field for that column. Other user interface (UI) elements, like checkboxes, may also be applied to limit the data displayed in the table and aid in comprehension.
These traditional UI controls, while they ultimately allow users to see the data that matches their selection, may not provide an adequate understanding of what the data actually represents. And getting to the needed data selection in this manner may not be intuitive for many data sets.
In addition to viewing large tables, some presentation software packages can present data in graphical form such as pie charts, bar charts, line charts, etc. These applications can display data in a visual form, but they are limited in the dimensions in which data can be presented at one time. In addition, even when the data is presented in a visually appealing manner, the ability to manipulate those visualizations is very limited, which further restricts the user's ability to obtain a deeper understanding of the underlying data.
Accordingly, approaches are needed to improve the visualization and interactivity of large datasets.
The accompanying drawings are incorporated herein and form a part of the specification.
In the drawings, like reference numbers generally indicate identical or similar elements. Additionally, generally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
Provided herein are system, apparatus, device, method and/or computer program product embodiments, and/or combinations and sub-combinations thereof, for visualizing and interacting with large datasets from a database.
When reviewing large datasets, such as business data, having access to unique ways of visualizing the data can help impart new understandings regarding the data. Business data in particular is typically visualized in two-dimensional (2D) representations of records (such as sales records). For example, a table or spreadsheet may be used to view sales information, with rows representing individual transaction records and columns representing various fields for the records. By way of non-limiting example, sales records could include fields relating to a price and a manufacturer.
In accordance with an embodiment, these datasets may be visualized in a three-dimensional (3D) representation that allows for the study and manipulation of the data. By way of non-limiting example, this 3D representation is a cuboid, but one skilled in the relevant arts will appreciate that the 3D representation may include any number of sides (e.g., any polyhedron shape), with a variety of height, width, and depth measures for each side as appropriate for presentation and interactivity purposes. In accordance with an exemplary embodiment discussed in further detail herein, the cuboid is specifically a cube.
In the 3D representation, each side corresponds to a ‘dimension’ of data—separate from the three-dimensionality of the representation itself, the ‘dimensions’ in this context represent fields of underlying data records. In the aforementioned sales record example, the price field would correspond to a first dimension, the manufacturer field would correspond to a second dimension, and so on.
Visualization client 102 is able to send requests for data in request 112 to database server 104, and receive responses with the requested data in response 114. One skilled in the relevant arts will appreciate that database server 104 and database 110 need not be implemented strictly as databases (or even relational databases), and may instead be any storage structure permitting data retrieval in accordance with the requirements detailed herein.
As discussed in further detail below, UI input 108 may include any number of 2D or 3D UI input instructions. For example, UI input 108 may be provided by way of a 3D controller (e.g., an instruction to rotate a scene of UI render output 106 about an axis, an instruction to pan the scene, and so on). UI input 108 may also be provided by way of traditional 2D control mechanisms, such as a mouse and/or keyboard.
And as also discussed in further detail below, UI render output 106 is a 3D scene including the 3D representation (e.g., a cube) of the data. The 3D scene may be visualized in a number of different ways as would be appreciated by a person skilled in the relevant art. For example, a user of visualization client 102 may wear a virtual reality headset to view the 3D scene in a 3D manner. In another example, the 3D representation may be shown in an augmented reality (AR) representation, such as through the use of AR glasses, an AR app, or other AR display approaches. Or, in yet another example, the 3D representation may be shown on a flat 2D display, such as a standard computer monitor.
The 3D representation (the cube 202 shown in scenes 200A and 200B in this case) has multiple faces (or sides), each corresponding to a dimension of the underlying data records. For example, in the case of the vehicle business data depicted in scenes 200A and 200B, dimensions may include vehicle make, price, and model year. In the case of a cube, the ability to render multi-dimensional data on the faces of the cube allows for easy visualization and manipulation across six dimensions of data. For different numbers of dimensions, a shape other than a cuboid may be used. Further, the particular 3D representation may be selected on the basis of the number of needed dimensions (e.g., an eight- or twenty-sided shape). A dimension may be duplicated on several faces if desired, in particular in the event that there are fewer dimensions needed or available than there are faces to the 3D representation. Additionally, a face may be left blank if not needed.
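The assignment of dimensions to faces described above, including duplicating dimensions or leaving faces blank when there are fewer dimensions than faces, can be sketched as follows. This is a minimal illustration; the function name and behavior are hypothetical, not taken from the disclosure.

```python
def assign_dimensions(num_faces, dimensions, duplicate=True):
    """Map data dimensions onto the faces of a polyhedron.

    If there are fewer dimensions than faces, either cycle through the
    dimensions again (duplicate=True) or leave the remaining faces
    blank (None), as described in the text above.
    """
    if len(dimensions) >= num_faces:
        return list(dimensions[:num_faces])
    faces = list(dimensions)
    while len(faces) < num_faces:
        if duplicate:
            faces.append(dimensions[len(faces) % len(dimensions)])
        else:
            faces.append(None)
    return faces

# A cube (6 faces) showing three vehicle dimensions, each duplicated once:
print(assign_dimensions(6, ["make", "price", "year"]))
# With duplication disabled, three faces are left blank:
print(assign_dimensions(6, ["make", "price", "year"], duplicate=False))
```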
A 3D scene is viewed from the perspective of one or more virtual cameras (referred to herein as simply a ‘camera’), which determine how the scene is rendered and shown to a user (such as through UI render output 106).
In this example, a user of visualization client 102 views and interacts with the 3D representation in the rendered scene.
In accordance with an embodiment, a label 218A may be shown in a position near or adjacent to the closest face of the 3D representation, such as cube 202. This label 218A may indicate a name for the dimension (e.g., ‘manufacturer’). In an embodiment, only the label for the face closest to the camera is shown, although additional labels may be shown concurrently if desired. In addition, the location of the label 218A does not need to be at the left of the side closest to the virtual camera but could be displayed anywhere such as the bottom, top, or right edges of that face.
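One plausible way to determine the primary (closest) face, i.e. the face whose label is shown and whose slices become drawers, is to compare the distance from the camera to each face's center. The sketch below assumes a unit cube centered at the origin; the face-to-dimension mapping and the `closest_face` helper are hypothetical illustrations, not the disclosure's implementation.

```python
import math

# Face centers of a unit cube centered at the origin, keyed by the
# dimension shown on each face (mapping is illustrative only).
FACE_CENTERS = {
    "make":   ( 0.5,  0.0,  0.0),
    "price":  (-0.5,  0.0,  0.0),
    "year":   ( 0.0,  0.5,  0.0),
    "style":  ( 0.0, -0.5,  0.0),
    "color":  ( 0.0,  0.0,  0.5),
    "dealer": ( 0.0,  0.0, -0.5),
}

def closest_face(camera_pos, face_centers=FACE_CENTERS):
    """Return the dimension name of the face nearest the camera."""
    return min(face_centers,
               key=lambda name: math.dist(camera_pos, face_centers[name]))

# With the camera out along +x, the 'make' face is closest:
print(closest_face((3.0, 0.0, 0.0)))  # make
```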
In accordance with a further embodiment, UI input 108 may be used to rotate the 3D representation so that a different face becomes the closest face to the camera.
When a new/different face of the 3D representation becomes the closest face to the camera, that new face takes on a similar representation to what is shown in scene 200A with respect to the ‘style’ dimension—slices of data within that new face are depicted as 3D ‘drawers’ nested within the 3D representation. All of the other faces (including any previously closest face) have their corresponding slices of data rendered only on the surface of those faces of cube 202.
By way of non-limiting example, rendering drawers within the cube is accomplished by creating each slice within each face of the 3D representation as a separate 3D object. These slices are part of a 2D face of the 3D representation, and have no depth (or some limited depth selected to promote visibility, but without interfering with the depth of the drawers). When a face of the 3D representation becomes the closest face to a camera, the slices within that face are given depth—the slices become drawers. The depth may be, for example, equivalent to a size of the overall 3D representation in the same direction as the depth of the slice. Looking at scene 200A, the slices of the face closest to the camera are rendered as drawers in this manner, while the slices of the other faces remain flat.
When cube 202 is rotated as shown in scene 200B, however, the former 2D slices become drawers (e.g., drawers 210B, 212B, 214B, and 216B) through the addition of depth as discussed above, while the former drawers become 2D slices (e.g., slices 204B, 206B, and 208B) by removing their depth.
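The slice-to-drawer transition described above can be sketched as a depth toggle applied after each rotation. The data layout (slices as dictionaries grouped by face) and the price labels are hypothetical; only the depth rule itself follows the text.

```python
def update_depths(slices_by_face, primary_face, cube_extent=1.0):
    """Give the primary face's slices depth (making them drawers) and
    flatten the slices of every other face, as described above.

    The drawer depth matches the overall extent of the 3D
    representation in the direction of the slice.
    """
    for face, slices in slices_by_face.items():
        depth = cube_extent if face == primary_face else 0.0
        for s in slices:
            s["depth"] = depth
    return slices_by_face

scene = {
    "style": [{"label": "compact"}, {"label": "coupe"}, {"label": "SUV"}],
    "price": [{"label": "under 20k"}, {"label": "20k and up"}],
}
update_depths(scene, "style")
print([s["depth"] for s in scene["style"]])  # [1.0, 1.0, 1.0]
print([s["depth"] for s in scene["price"]])  # [0.0, 0.0]
```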
For example, as shown in scene 300A, cube 302A has a face corresponding to the vehicle ‘style’ dimension closest to the camera. Therefore, several vehicle styles (e.g., ‘compact’ 304A, ‘coupe’ 306A, and ‘SUV’ 308A) are depicted as nested drawers (i.e., slices) within cube 302A. In accordance with an embodiment, a label 310A may be shown in a position near or adjacent to the closest face of the 3D representation, such as cube 302A.
The animated sequence of scenes 300A through 300D illustrates this interaction.
In an exemplary table of such vehicle records, each record (shown on individual rows) corresponds to a particular vehicle (say, in a sales inventory). The columns—style, price, year—each correspond to a dimension of data populated in the individual records.
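With records held as simple row dictionaries, the data behind any one drawer is a plain selection over the corresponding dimension. The sample records below are illustrative only, not data from the disclosure.

```python
records = [
    {"style": "coupe",   "price": 32000, "year": 2019},
    {"style": "SUV",     "price": 41000, "year": 2020},
    {"style": "coupe",   "price": 28500, "year": 2018},
    {"style": "compact", "price": 19000, "year": 2020},
]

def filter_by_drawer(rows, dimension, value):
    """Return only the records belonging to one drawer of a face."""
    return [r for r in rows if r[dimension] == value]

# Selecting the 'coupe' drawer of the 'style' face yields two records:
coupes = filter_by_drawer(records, "style", "coupe")
print(len(coupes))  # 2
```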
The filter interaction allows a user to isolate a subset of the underlying records for closer study.
This visual representation provides many advantages to an end user. First, in this example, six dimensions are easily visible to the end user by simply rotating cube 302A within the scene.
As shown in the animated sequence, a drawer may be pulled out of the 3D representation. For example, initially cube 302A depicts the vehicle styles as drawers; when a user extracts the ‘coupe’ drawer, it is rendered as a separate cuboid 312D.
Cuboid 312D, being derived from cube 302D and the same underlying records, also has the same dimensions. For example, the label 310D attached to cuboid 312D is the same (e.g., ‘style’) as the label corresponding to the same dimension from cube 302D (e.g., label 310A in scene 300A).
By way of non-limiting example, cuboid 312D is maintained as a non-cube cuboid corresponding to the dimensions of drawer 306D as it existed when part of cube 302D. This is done to avoid confusing a user—it is generally easier for a user to understand that cuboid 312D is derived from the filtered data of cube 302D if dimensions are maintained. However, one skilled in the relevant arts will appreciate that other approaches are equally viable, including automatically resizing cuboid 312D to be a cube. Similar approaches may be used for other polyhedron shapes.
In one embodiment, when a drawer like 306D is extracted from a cube 302D, the drawer 306D maintains its general shape when it is in the same view as the cube from which it was extracted. However, in this further embodiment, if the end user drags (translates) drawer 306D such that cube 302D is no longer in view, the processor may automatically resize drawer 306D into a cube. Similarly, having translated drawer 306D out of view of cube 302D, if the end user pans back to review cube 302D, the gap left by removing drawer 306D can be automatically filled by resizing drawers 304D and 308D. If the end user later translates either cube 302D or the resized drawer 306D into the same view, the process will automatically reopen the gap in cube 302D and resize the cube drawer back into the non-cube prism drawer 306D.
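The gap-filling and drawer-resizing behavior described above might be computed along these lines. Both helpers are hypothetical sketches: the disclosure does not specify how the gap is redistributed or how the cube's side is chosen, so this sketch assumes proportional redistribution and a volume-preserving resize.

```python
def fill_gap(drawer_depths, removed_index):
    """After removing one drawer, scale the remaining drawers
    proportionally so they again fill the full extent of the cube."""
    remaining = [d for i, d in enumerate(drawer_depths) if i != removed_index]
    full = sum(drawer_depths)
    total = sum(remaining)
    return [d * full / total for d in remaining]

def cube_side(width, height, depth):
    """Side length of the cube a detached drawer could be resized
    into, assuming its volume is preserved."""
    return (width * height * depth) ** (1.0 / 3.0)

# Three drawers of relative depth 0.5, 0.3, 0.2; remove the middle one
# and the other two expand to fill the cube again:
print(fill_gap([0.5, 0.3, 0.2], 1))
```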
Even though different dimensions are depicted as the closest faces to the camera for cubes 402A and 412A, the underlying dimensions are the same. For example, both cubes are understood to have a matching table structure, with records corresponding to individual vehicles each having style, price, and model year dimensions. As a result, a unified table may be constructed by combining the records underlying cube 402A and cube 412A, with a new cube generated based on the full table.
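Constructing the unified table can be sketched as a schema check followed by concatenation of the two record sets. The function name, error handling, and sample records are hypothetical.

```python
def merge_cubes(records_a, records_b):
    """Combine the records behind two 3D representations into one
    unified table, provided both share the same set of dimensions."""
    dims_a = set(records_a[0]) if records_a else set()
    dims_b = set(records_b[0]) if records_b else set()
    if dims_a != dims_b:
        raise ValueError(f"dimension mismatch: {dims_a} vs {dims_b}")
    return records_a + records_b

cube_a = [{"style": "coupe", "price": 30000, "year": 2019}]
cube_b = [{"style": "SUV",   "price": 45000, "year": 2021}]
print(len(merge_cubes(cube_a, cube_b)))  # 2
```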
At step 604, a position of a camera within a scene is determined, relative to where the 3D representation (e.g., a cuboid) is being rendered. By making this determination, it is possible at step 606 to render a primary face of the cuboid, defined as the face that is closest to the camera. The primary face is rendered with M slices, but these slices are represented as drawers. The number of drawers M to render may be determined by however many distinct elements are present within the dimension corresponding to the primary face (e.g., for manufacturers, as many unique manufacturers as are present in the underlying data). In an additional embodiment, the number of drawers M may be selected according to algorithms that group underlying elements (e.g., price ranges). One skilled in the art will recognize that any approach to group or restrict the elements selected for association with the M drawers may be used, and is not limited to the foregoing approaches.
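The number of drawers M (distinct values for a categorical dimension, or grouped ranges for a numeric dimension such as price) might be computed along these lines. The equal-width binning strategy shown is one illustrative choice among the grouping algorithms the text alludes to.

```python
def drawer_labels(values, max_drawers=6):
    """Distinct values become drawers; if a numeric dimension has too
    many distinct values, group them into equal-width ranges instead."""
    distinct = sorted(set(values))
    if len(distinct) <= max_drawers:
        return [str(v) for v in distinct]
    lo, hi = min(distinct), max(distinct)
    width = (hi - lo) / max_drawers
    return [f"{lo + i * width:.0f}-{lo + (i + 1) * width:.0f}"
            for i in range(max_drawers)]

# Few distinct manufacturers: one drawer per manufacturer.
print(drawer_labels(["Audi", "BMW", "Audi"]))  # ['Audi', 'BMW']
# Many distinct prices: grouped into three ranges.
print(drawer_labels([10000, 20000, 90000, 15000, 55000, 70000, 30000], 3))
```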
At step 608, a secondary face of the cuboid is rendered, in accordance with an embodiment. This secondary face has P slices, again corresponding to the number of underlying elements (by the same process above for slicing the M drawers). Step 608 may be repeated for additional secondary faces as needed.
At step 704, a determination is made as to whether the interaction is being made with respect to a drawer of the cuboid, or a frame of the cuboid (e.g., anything other than the drawer). This decision point determines whether the interaction is resolved with respect to the drawer or the cuboid as a whole. One skilled in the relevant art will appreciate that other decisions may be used to determine whether the interaction is intended to be with respect to a drawer or the cuboid as a whole, and this particular decision is provided by way of non-limiting example.
If the interaction is with respect to the drawer, then the process proceeds to step 708 where the drawer is translated within the scene depicting the cuboid, in accordance with an embodiment. This translation may be given by the interaction in a number of different ways, such as a click-drag-and-drop interaction or similar, by way of non-limiting example. At step 710, the drawer is rendered as a new cuboid at the translated location within the scene. In accordance with an embodiment, the original cuboid may be re-rendered without the removed drawer.
If the interaction is with respect to the cuboid frame, then the process proceeds to step 706 where the cuboid is translated or rotated within the scene. The translation may be given by the interaction in a number of different ways, such as a click-drag-and-drop interaction or similar, by way of non-limiting example.
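Steps 704 through 710 amount to a dispatch on the interaction target, which can be sketched as follows. The event shape and returned action records are hypothetical illustrations of the decision point, not the disclosure's actual data structures.

```python
def handle_interaction(target, translation):
    """Resolve a drag interaction against a drawer or the cuboid frame.

    Dragging a drawer extracts it as a new cuboid at the translated
    location; dragging the frame moves the whole cuboid.
    """
    if target["kind"] == "drawer":
        return {"action": "extract_drawer",
                "drawer": target["id"],
                "position": translation}
    return {"action": "move_cuboid", "position": translation}

print(handle_interaction({"kind": "drawer", "id": "coupe"}, (1, 0, 0)))
print(handle_interaction({"kind": "frame"}, (0, 2, 0)))
```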
To understand the remaining steps, the rotation is described further in view of the initial rendering steps of flowchart 600.
A detailed non-limiting example of the appearance of such rotation is provided above with respect to scenes 200A and 200B.
Various embodiments may be implemented, for example, using one or more well-known computer systems, such as computer system 1000 described below.
Computer system 1000 may include one or more processors (also called central processing units, or CPUs), such as a processor 1004. Processor 1004 may be connected to a communication infrastructure or bus 1006.
Computer system 1000 may also include user input/output device(s) 1003, such as monitors, keyboards, pointing devices, etc., which may communicate with communication infrastructure 1006 through user input/output interface(s) 1002.
One or more of processors 1004 may be a graphics processing unit (GPU). In an embodiment, a GPU may be a processor that is a specialized electronic circuit designed to process mathematically intensive applications. The GPU may have a parallel structure that is efficient for parallel processing of large blocks of data, such as mathematically intensive data common to computer graphics applications, images, videos, etc.
Computer system 1000 may also include a main or primary memory 1008, such as random access memory (RAM). Main memory 1008 may include one or more levels of cache. Main memory 1008 may have stored therein control logic (i.e., computer software) and/or data.
Computer system 1000 may also include one or more secondary storage devices or memory 1010. Secondary memory 1010 may include, for example, a hard disk drive 1012 and/or a removable storage device or drive 1014. Removable storage drive 1014 may be a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup device, and/or any other storage device/drive.
Removable storage drive 1014 may interact with a removable storage unit 1018. Removable storage unit 1018 may include a computer usable or readable storage device having stored thereon computer software (control logic) and/or data. Removable storage unit 1018 may be a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, and/or any other computer data storage device. Removable storage drive 1014 may read from and/or write to removable storage unit 1018.
Secondary memory 1010 may include other means, devices, components, instrumentalities or other approaches for allowing computer programs and/or other instructions and/or data to be accessed by computer system 1000. Such means, devices, components, instrumentalities or other approaches may include, for example, a removable storage unit 1022 and an interface 1020. Examples of the removable storage unit 1022 and the interface 1020 may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM or PROM) and associated socket, a memory stick and USB port, a memory card and associated memory card slot, and/or any other removable storage unit and associated interface.
Computer system 1000 may further include a communication or network interface 1024. Communication interface 1024 may enable computer system 1000 to communicate and interact with any combination of external devices, external networks, external entities, etc. (individually and collectively referenced by reference number 1028). For example, communication interface 1024 may allow computer system 1000 to communicate with external or remote devices 1028 over communications path 1026, which may be wired and/or wireless (or a combination thereof), and which may include any combination of LANs, WANs, the Internet, etc. Control logic and/or data may be transmitted to and from computer system 1000 via communication path 1026.
Computer system 1000 may also be any of a personal digital assistant (PDA), desktop workstation, laptop or notebook computer, netbook, tablet, smart phone, smart watch or other wearable, appliance, part of the Internet-of-Things, and/or embedded system, to name a few non-limiting examples, or any combination thereof.
Computer system 1000 may be a client or server, accessing or hosting any applications and/or data through any delivery paradigm, including but not limited to remote or distributed cloud computing solutions; local or on-premises software (“on-premise” cloud-based solutions); “as a service” models (e.g., content as a service (CaaS), digital content as a service (DCaaS), software as a service (SaaS), managed software as a service (MSaaS), platform as a service (PaaS), desktop as a service (DaaS), framework as a service (FaaS), backend as a service (BaaS), mobile backend as a service (MBaaS), infrastructure as a service (IaaS), etc.); and/or a hybrid model including any combination of the foregoing examples or other services or delivery paradigms.
Any applicable data structures, file formats, and schemas in computer system 1000 may be derived from standards including but not limited to JavaScript Object Notation (JSON), Extensible Markup Language (XML), Yet Another Markup Language (YAML), Extensible Hypertext Markup Language (XHTML), Wireless Markup Language (WML), MessagePack, XML User Interface Language (XUL), or any other functionally similar representations alone or in combination. Alternatively, proprietary data structures, formats or schemas may be used, either exclusively or in combination with known or open standards.
In some embodiments, a tangible, non-transitory apparatus or article of manufacture comprising a tangible, non-transitory computer useable or readable medium having control logic (software) stored thereon may also be referred to herein as a computer program product or program storage device. This includes, but is not limited to, computer system 1000, main memory 1008, secondary memory 1010, and removable storage units 1018 and 1022, as well as tangible articles of manufacture embodying any combination of the foregoing. Such control logic, when executed by one or more data processing devices (such as computer system 1000), may cause such data processing devices to operate as described herein.
Based on the teachings contained in this disclosure, it will be apparent to persons skilled in the relevant art(s) how to make and use embodiments of this disclosure using data processing devices, computer systems and/or computer architectures other than those described herein.
It is to be appreciated that the Detailed Description section, and not any other section, is intended to be used to interpret the claims. Other sections can set forth one or more but not all exemplary embodiments as contemplated by the inventor(s), and thus, are not intended to limit this disclosure or the appended claims in any way.
While this disclosure describes exemplary embodiments for exemplary fields and applications, it should be understood that the disclosure is not limited thereto. Other embodiments and modifications thereto are possible, and are within the scope and spirit of this disclosure. For example, and without limiting the generality of this paragraph, embodiments are not limited to the software, hardware, firmware, and/or entities illustrated in the figures and/or described herein. Further, embodiments (whether or not explicitly described herein) have significant utility to fields and applications beyond the examples described herein.
Embodiments have been described herein with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined as long as the specified functions and relationships (or equivalents thereof) are appropriately performed. Also, alternative embodiments can perform functional blocks, steps, operations, methods, etc. using orderings different than those described herein.
References herein to “one embodiment,” “an embodiment,” “an example embodiment,” or similar phrases, indicate that the embodiment described can include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it would be within the knowledge of persons skilled in the relevant art(s) to incorporate such feature, structure, or characteristic into other embodiments whether or not explicitly mentioned or described herein. Additionally, some embodiments can be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments can be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, can also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
The breadth and scope of this disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Number | Date | Country | |
---|---|---|---|
20210192832 A1 | Jun 2021 | US |