The invention relates generally to user interface applications for navigation and autonomous driving systems. More specifically, it relates to user interfaces for displaying information related to vehicles within the vicinity of an automobile.
Autonomous vehicles use various computing systems to transport passengers from one location to another. Some autonomous vehicles may require initial input from an operator, such as a pilot, driver, or passenger, while other systems may require continuous input. Other systems, for example autopilot systems, may be used only when the system has been engaged; thus, the operator may switch from a manual mode to an autonomous mode in which the vehicle drives itself.
A key component of an autonomous vehicle is the perception system, which allows the vehicle to perceive and interpret its surroundings during a trip. When the autonomous system is engaged, the system will make various decisions during the trip, for example, speed up, slow down, stop, etc. The operator may be unaware of the calculations or “reasoning” behind why the autonomous vehicle is taking some particular action. In order to feel safe and confident, the operator may want to know what the vehicle is planning to do in the immediate future and to be informed as to at least some of the factors influencing the system's reasoning.
Navigation systems may include electronic displays which appear to zoom in or out according to a vehicle's speed of travel to enable the user to identify where the vehicle may be within the next few seconds. Some of these systems provide real-time traffic information received via radio or satellite signals. However, these systems do not provide for the display of the speed or actual location of other vehicles or obstacles, or of other useful information related to such vehicles or obstacles.
An aspect of the present disclosure relates to a vehicle having a plurality of control apparatuses including a braking apparatus, an acceleration apparatus, and a steering apparatus. The vehicle further has a user input device for inputting destination information, a geographic position component for determining the current location of the vehicle, an object detection apparatus for detecting and identifying a type of an object in or proximate to a roadway, memory for storing a detailed roadway map including roadways, traffic signals, and intersections, and an electronic display for displaying information to a passenger. A processor is also included in the vehicle and is programmed to receive the destination information, identify a route to the destination, and determine, from location information received from the geographic position component and the stored map information, the current geographic location of the vehicle. The processor is also programmed to identify an object and object type based on object information received from the object detection apparatus and to determine at least one warning characteristic of the identified object based on at least one of: the object type, a detected proximity of the detected object to the vehicle, the location of the detected object relative to predetermined peripheral areas of the vehicle, the current geographic location of the vehicle, and the route. The processor is also configured to select an object warning image to be displayed based on the at least one warning characteristic and display the selected object warning image on the electronic display.
The processor can be further configured to identify a change in position of the identified object over time. In such an example, the at least one warning characteristic can relate to deceleration of the identified object when the identified object is positioned in front of the subject vehicle. In another example, the at least one warning characteristic can relate to one of the presence of the identified object toward the rear of the vehicle and within a predetermined distance thereof and the presence of the object within a blind spot of the vehicle.
Another aspect of the present disclosure relates to a vehicle having an object detection apparatus for detecting and identifying a type of an object in or proximate to a roadway and a location of the object and an electronic display for displaying information to a passenger. The vehicle also includes a processor programmed to identify an object and object type based on object information received from the object detection apparatus and determine a relative position of the object to the vehicle. The processor is further programmed to determine at least one warning characteristic of the identified object based on at least one of: the object type, a detected proximity of the detected object to the vehicle, and the relative position of the detected object. The processor is further programmed to determine one of a plurality of predetermined peripheral areas of the vehicle to associate with the warning characteristic based on the relative position of the detected object and to select an object warning image and a warning location image to be displayed based on the at least one warning characteristic and the predetermined peripheral area associated with the warning characteristic. The processor is further configured to display the selected object warning image and the selected warning location image on the electronic display and a preselected safe indication image associated with the predetermined peripheral areas not associated with the warning characteristic.
Another aspect of the present disclosure relates to a method for selecting images for display on a display apparatus of a vehicle. The method includes receiving destination information from a user input device, identifying a route to the destination, receiving location information from a geographic position component, accessing stored map information including roadways, traffic signals, and intersections, and determining, from the location information and the stored map information, the current geographic location of the vehicle. The method also includes identifying an object of a roadway and an object type based on object information received from an object detection apparatus, and determining at least one warning characteristic of the identified object based on at least one of: the object type, a detected proximity of the detected object to the vehicle, the location of the detected object relative to predetermined peripheral areas of the vehicle, the current geographic location of the vehicle, and the route. The method also includes selecting an object warning image to be displayed based on the at least one warning characteristic and displaying the selected object warning image on the electronic display.
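The selection flow recited in these aspects can be summarized procedurally. The following is a minimal, hypothetical Python sketch of that flow; every name in it (DetectedObject, WarningCharacteristic, select_warning_image, the numeric thresholds, and so on) is an illustrative assumption rather than an element recited in the claims.

```python
from dataclasses import dataclass
from enum import Enum, auto

class WarningCharacteristic(Enum):
    """Hypothetical warning characteristics derived from an identified object."""
    DECELERATING_AHEAD = auto()
    FOLLOWING_TOO_CLOSELY = auto()
    IN_BLIND_SPOT = auto()

@dataclass
class DetectedObject:
    object_type: str          # e.g., "car", "bicycle", "pedestrian"
    distance_m: float         # detected proximity to the subject vehicle
    peripheral_area: str      # e.g., "front", "rear", "left", "right"
    closing_speed_mps: float  # positive when the gap to the object is shrinking

def determine_warning_characteristics(obj: DetectedObject) -> list[WarningCharacteristic]:
    """Map one identified object to zero or more warning characteristics.

    The thresholds below are placeholders; the disclosure leaves the exact
    criteria to the implementation.
    """
    characteristics = []
    if obj.peripheral_area == "front" and obj.closing_speed_mps > 3.0:
        characteristics.append(WarningCharacteristic.DECELERATING_AHEAD)
    if obj.peripheral_area == "rear" and obj.distance_m < 5.0:
        characteristics.append(WarningCharacteristic.FOLLOWING_TOO_CLOSELY)
    if obj.peripheral_area in ("left", "right") and obj.distance_m < 3.0:
        characteristics.append(WarningCharacteristic.IN_BLIND_SPOT)
    return characteristics

def select_warning_image(characteristics: list[WarningCharacteristic]) -> str:
    """Choose an image identifier for the electronic display."""
    if not characteristics:
        return "safe_indication"
    # In this sketch, the characteristic found last simply wins.
    return "warning_" + characteristics[-1].name.lower()
```

In an actual implementation the route, the current geographic location, and the stored map information would also feed into the determination of warning characteristics, as the aspects above describe; they are omitted here to keep the sketch short.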
Aspects, features and advantages of the invention will be appreciated when considered with reference to the following description of exemplary embodiments and accompanying figures. The same reference numbers in different drawings may identify the same or similar elements. Furthermore, the following description is not limiting; the scope of the invention is defined by the appended claims and equivalents.
As shown in
The memory 122 stores information accessible by processor 120, including instructions 124 and data 126 that may be executed or otherwise used by the processor 120. The memory 122 may be of any type capable of storing information accessible by the processor, including a computer-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
The instructions 124 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computer code on the computer-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computer language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
The data 126 may be retrieved, stored or modified by processor 120 in accordance with the instructions 124. For instance, although the system and method are not limited by any particular data structure, the data may be stored in computer registers, in a relational database as a table having a plurality of different fields and records, in XML documents, or in flat files. The data may also be formatted in any computer-readable format. By further way of example only, image data may be stored as bitmaps composed of grids of pixels that are stored in accordance with formats that are compressed or uncompressed, lossless (e.g., BMP) or lossy (e.g., JPEG), and bitmap or vector-based (e.g., SVG), as well as computer instructions for drawing graphics. The data may comprise any information sufficient to identify the relevant information, such as numbers, descriptive text, proprietary codes, references to data stored in other areas of the same memory or different memories (including other network locations), or information that is used by a function to calculate the relevant data.
The processor 120 may be any conventional processor, such as processors from Intel Corporation or Advanced Micro Devices. Alternatively, the processor may be a dedicated device such as an ASIC. Although
Computer 110 may include all of the components normally used in connection with a computer such as a central processing unit (CPU), memory (e.g., RAM and internal hard drives) storing data 126 and instructions such as a web browser, an electronic display 134 (e.g., a monitor having a screen, a small LCD touch-screen or any other electrical device that is operable to display information), and user input (e.g., a mouse, keyboard, touch-screen and/or microphone).
Computer 110 may also include a geographic position component 136 to determine the geographic location of the device. For example, computer 110 may include a GPS receiver to determine the device's latitude, longitude and/or altitude position. Other location systems such as laser-based localization systems, inertial-aided GPS, WiFi or cellular signal aided GPS, or camera-based localization may also be used.
Computer 110 may also include other features, such as an accelerometer, gyroscope or other acceleration device 140 to determine the direction in which the device is oriented. By way of example only, the acceleration device may determine its pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. In that regard, it will be understood that the location and orientation data described herein may be provided automatically to the user, to other computers of the network, or to both.
Computer 110 may also include an object detection component 142 to detect and identify the location and movement (e.g. relative speed) of objects such as other vehicles, obstacles in the roadway, traffic signals, signs, etc. The detection system may include lasers, sonar, radar, cameras or any other such detection methods. For example, the object detector may include an imaging device to identify the state of a particular traffic signal as yellow or another color. In use, computer 110 may use this information to instruct the braking system of the vehicle to apply the brakes and to provide information regarding such objects to the passenger of the vehicle, as described further below.
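As a concrete illustration of how detections from such a component might be represented and consumed, the following is a short hypothetical sketch; the record fields, the braking thresholds, and the function names are assumptions made for illustration, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """One object reported by the detection component (lasers, sonar, radar, cameras)."""
    kind: str                   # "vehicle", "obstacle", "traffic_signal", "sign", ...
    range_m: float              # distance from the subject vehicle
    relative_speed_mps: float   # negative when the object is approaching
    signal_state: Optional[str] = None  # e.g., "yellow" for a traffic signal

def should_apply_brakes(detections: list[Detection]) -> bool:
    """Rough placeholder policy: brake for a yellow signal or a fast-closing object."""
    for d in detections:
        if d.kind == "traffic_signal" and d.signal_state == "yellow":
            return True
        if d.range_m < 20.0 and d.relative_speed_mps < -5.0:
            return True
    return False
```

The same detection records could also be forwarded to the display logic described below so that the passenger is shown the objects the vehicle is reacting to.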
Data 126 may include various types of information used by computer 110. Detailed map information 136 may include maps identifying lane lines, intersections, speed limits, traffic signals, buildings, signs, or other such information. For example, computer 110 may access detailed map information 136 in order to determine where the lane lines should be located on a particular highway and adjust the speed or direction of vehicle 101 accordingly. Computer 110 may also access display images 130, such as roadways, intersections, and other objects in order to provide a passenger of vehicle 101 with an understanding of what actions vehicle 101 will take in the immediate future.
In one example, computer 110 may be an autonomous driving computing system capable of communicating with a vehicle's internal computer such as computer 160. Computer 160 may be configured similarly to computer 110, for example, including a processor 170, memory 172, instructions 174, and data 176. Computer 110 may send and receive information from the various systems of vehicle 101, for example the braking 180, acceleration 182, signaling 184, and navigation 186 systems, in order to control the movement, speed, etc. of vehicle 101. It will be understood that although various systems and computers 110 and 160 are shown within vehicle 101, these elements may be external to vehicle 101 or physically separated by large distances.
Vehicle 101 may include one or more user input devices, such as device 132, for inputting information into the autonomous driving computer 110. For example, a user may input a destination (e.g., 123 Oak Street) into the navigation system. The navigation system may generate a route between the present location of the vehicle and the destination. If the autonomous driving system is engaged, computer 110 may request or automatically receive the route information from the navigation system. Once a route has been determined, the autonomous driving system may drive the vehicle to the destination.
As shown in
As vehicle 101 moves along the roadway, the locations of objects detected by the vehicle and the features of the roadway may change. These changes may be displayed in order to allow the user to understand that vehicle 101 is continuously monitoring the state of the vehicles, roadway and other objects. For example, as shown in
In addition to displaying representations (i.e., subject vehicle icon 112) of the vehicle 101, and additional representations of other vehicles, such as vehicle boxes 113, the computer 110 can be configured to provide warning information, via either display 134 or second display 188, regarding vehicles, or other objects, that are identified as being potentially problematic for vehicle 101. Potentially problematic objects can include vehicles determined to be following too closely, vehicles positioned in front of vehicle 101 that are rapidly decelerating (e.g., braking abruptly), and other vehicles that are within a projected path of vehicle 101 or are entering the path of vehicle 101 in such a way that they may collide with vehicle 101.
Other objects, such as vehicles, bicycles, or pedestrians, can be identified as potentially problematic simply based on their location relative to the vehicle 101. For example, a potentially problematic vehicle can be one that is driving in the blind spot of vehicle 101. The blind spot of a vehicle is generally identified as a location alongside the vehicle that is between the line of sight of the rearview mirror and the adjacent side-view mirror. In one example, a warning relating to a vehicle in the blind spot of vehicle 101 can be beneficial while the passenger of the vehicle is driving, to help prevent a collision should the driver wish to change lanes. In another example, a blind-spot warning can be beneficial to the passenger while the car is in an autonomous driving mode to notify the passenger of, for example, the vehicle's delay in changing lanes according to an otherwise predetermined autodriving path. Further, simply communicating such notifications to the passenger can give the passenger comfort in the fact that the computer is monitoring for such conditions, giving the passenger confidence in the autonomous driving system 100.
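The criteria above (following too closely, abrupt deceleration ahead, presence in a blind spot) can be viewed as predicates evaluated against each tracked object. The sketch below expresses them in one hypothetical way; the blind-spot geometry and the numeric thresholds are assumed solely for illustration.

```python
def is_following_too_closely(gap_m: float, own_speed_mps: float) -> bool:
    # Assumed rule of thumb: less than roughly one second of headway behind the subject vehicle.
    return gap_m < max(own_speed_mps, 1.0)

def is_braking_abruptly(deceleration_mps2: float) -> bool:
    # Assumed threshold for a rapidly decelerating vehicle ahead.
    return deceleration_mps2 > 4.0

def is_in_blind_spot(bearing_deg: float, range_m: float) -> bool:
    # Assumed blind-spot wedge: alongside the vehicle, roughly 90 to 150 degrees
    # from straight ahead on either side, within a short range.
    return 90.0 <= abs(bearing_deg) <= 150.0 and range_m < 6.0
```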
As shown in
The segments 148 of the notification image 146 can be present, whether or not a specific warning is being communicated to the passenger by computer 110. In an example, when no warning is being communicated, the segments 148 can have an appearance to notify the passenger that no problematic objects are present. This can include presenting segments 148 in a color that is accepted to indicate a positive or neutral system status, such as green or blue, for example.
When a warning is to be communicated to the passenger of the vehicle 101, the segment 148 corresponding to the position of the object identified as potentially problematic (the “identified object”) can change to a warning color, such as yellow, orange, or red, for example.
The manner in which the segments 148a-148d are associated with the predetermined peripheral positions around the vehicle 101 can vary. In one example, the divisions between the segments 148 can extend radially outward from the center of subject vehicle icon 112 to divide the area surrounding vehicle 101 into quadrants. In this example, the segment 148 associated with such a quadrant can change in appearance when a warning is to be presented in connection with an identified object in that quadrant. In another example, the areas associated with the segments 148 can overlap, such that an identified object that is, for example, positioned in front of the vehicle 101 to the passenger side thereof, can be associated with both segments 148a and 148d such that if that vehicle is an identified object for which a warning condition is to be presented, both segments 148a and 148d can change in appearance (e.g., color). In these or other possible configurations, multiple segments can change in appearance simultaneously to indicate multiple warnings associated with separate identified objects, such as a braking car to the front of the vehicle 101 and a car in a blind spot of the vehicle 101.
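A minimal sketch of one way the segment association just described could be computed follows, assuming segments 148a through 148d cover the front, passenger-side, rear, and driver-side quadrants measured about the center of subject vehicle icon 112, and that each quadrant is widened slightly so that objects near a boundary map to both adjacent segments. The names, angles, and overlap margin are illustrative assumptions.

```python
import math

# Hypothetical mapping of quadrant centers (degrees, 0 = straight ahead,
# positive toward the passenger side) to segments of notification image 146.
SEGMENT_CENTERS = {"148a": 0.0, "148d": 90.0, "148c": 180.0, "148b": 270.0}

def segments_for_object(dx_m: float, dy_m: float, overlap_deg: float = 15.0) -> set[str]:
    """Return the segment(s) associated with an object at (dx, dy) relative to the vehicle.

    dx is the lateral offset (positive toward the passenger side) and dy is the
    longitudinal offset (positive ahead of the vehicle). Each quadrant spans 90
    degrees and is widened by overlap_deg on each side so that an object near a
    boundary lights up both adjacent segments.
    """
    bearing = math.degrees(math.atan2(dx_m, dy_m)) % 360.0
    hits = set()
    for name, center in SEGMENT_CENTERS.items():
        diff = abs((bearing - center + 180.0) % 360.0 - 180.0)  # shortest angular distance
        if diff <= 45.0 + overlap_deg:
            hits.add(name)
    return hits

# Example: an object ahead of the vehicle and off to the passenger side at 45 degrees
# maps to both segments, i.e. segments_for_object(8.0, 8.0) == {"148a", "148d"}.
```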
Computer 110 can be further configured to present a warning image in connection with the change of appearance of a segment, such as segment 148d in the example above. For example, as shown in
As shown in
In another example,
In another example,
Any of the various notification images shown in
As shown in
In one example, the above-described warning can be used to alert the driver of vehicle 101 to a condition to help the driver avoid an accident or the like. In another example, when the system 100 is engaged in autonomous driving, such notifications or warning images can help communicate to a passenger the reasons for the system taking various actions. In the example shown in
In another example, should the vehicle represented by box 113a brake or otherwise slow abruptly, vehicle 101, if engaged in autonomous driving, can automatically do the same. In connection with this action, a notification can be made (and can, in an example, be presented in an urgent manner, such as in red or the like) so that the passenger is made aware of the reason for the abrupt braking by vehicle 101. As shown in
In another example, a similar procedure to that shown in
Computer 110 may also display warnings or indications as discussed above, using notification image 146 and optionally warning images 150 in connection with identified objects in order to inform the passenger that the computer will not take a particular action because of the presence of that identified object. As shown in
Computer 110 may also use the display to indicate to the passenger that the vehicle 101 will be changing lanes, but is waiting to do so because of an identified object that makes changing lanes unsafe. As shown in
Additional warning information, such as an escalated warning state as described above, can further be presented in the above scenario, for example, if the identified object indicated by box 113d itself begins to unsafely change lanes into the lane in which vehicle 101 is traveling. This can be done in connection with computer 110 causing vehicle 101 to take action to avoid a collision with the identified object. Other similar notifications and warnings can also be presented by computer 110 according to other criteria, such as a vehicle moving unsafely into the path of the subject vehicle 101 (either determined from the projected route or inferred by the system 100 during non-autonomous driving) or otherwise being detected within the path. Further, the notification scheme described herein can also alert the driver to the ending of the lane of travel, for example, while driving on a highway. It can also alert the driver to unsafe or illegal actions about to be taken (inadvertently or by error) by the driver during non-autonomous driving. In one example, vehicle 101 could present such a warning to the driver when the driver begins to make an illegal turn or the like.
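One hypothetical way to organize the escalation described above is as ordered warning levels tied to the displayed color, with an identified object promoted to a more urgent level as its behavior changes (for example, when a vehicle in an adjacent lane begins to move into the lane of vehicle 101). The levels, colors, and promotion rules below are assumptions made for illustration only.

```python
from enum import IntEnum

class WarningLevel(IntEnum):
    SAFE = 0      # e.g., segment drawn in green or blue
    NOTICE = 1    # e.g., yellow: object present and being monitored
    WARNING = 2   # e.g., orange: object is the reason an action is being delayed
    URGENT = 3    # e.g., red: object requires abrupt or evasive action

LEVEL_COLORS = {
    WarningLevel.SAFE: "green",
    WarningLevel.NOTICE: "yellow",
    WarningLevel.WARNING: "orange",
    WarningLevel.URGENT: "red",
}

def escalate(current: WarningLevel, entering_lane: bool, blocking_lane_change: bool) -> WarningLevel:
    """Promote the warning level as the identified object's behavior worsens."""
    if entering_lane:
        return max(current, WarningLevel.URGENT)
    if blocking_lane_change:
        return max(current, WarningLevel.WARNING)
    return current
```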
As these and other variations and combinations of the features discussed above can be utilized without departing from the invention as defined by the claims, the foregoing description of exemplary embodiments should be taken by way of illustration rather than by way of limitation of the invention as defined by the claims. It will also be understood that the provision of examples of the invention (as well as clauses phrased as “such as,” “e.g.”, “including” and the like) should not be interpreted as limiting the invention to the specific examples; rather, the examples are intended to illustrate only some of many possible aspects.
The sample values, icons, types and configurations of data described and shown in the figures are for the purposes of illustration only. In that regard, systems and methods in accordance with aspects of the invention may include different physical attributes, data values, data types and configurations, and may be provided and received at different times and by different entities (e.g., some values may be pre-suggested or provided from different sources).
The present application is a continuation of U.S. patent application Ser. No. 16/166,876, filed Oct. 22, 2018, which is a continuation of U.S. patent application Ser. No. 15/295,433, filed Oct. 17, 2016, now U.S. Pat. No. 10,139,829, and is a continuation of U.S. patent application Ser. No. 15/602,423, filed May 23, 2017, now U.S. Pat. No. 10,168,710, which is also a continuation of U.S. patent application Ser. No. 15/295,433. U.S. patent application Ser. No. 15/295,433 is a continuation of U.S. patent application Ser. No. 14/542,799, filed Nov. 17, 2014, now U.S. Pat. No. 9,501,058, which is a continuation of U.S. patent application Ser. No. 14/171,904, filed Feb. 4, 2014, now U.S. Pat. No. 8,903,592, which is a continuation of U.S. patent application Ser. No. 13/796,037, filed Mar. 12, 2013, now U.S. Pat. No. 8,676,431, the entire disclosures of which are incorporated herein by reference.
Relation | Number | Date | Country
---|---|---|---
Parent | 16166876 | Oct 2018 | US
Child | 17079810 | | US
Parent | 15602423 | May 2017 | US
Child | 16166876 | | US
Parent | 15295433 | Oct 2016 | US
Child | 15602423 | | US
Parent | 14542799 | Nov 2014 | US
Child | 15295433 | | US
Parent | 14171904 | Feb 2014 | US
Child | 14542799 | | US
Parent | 13796037 | Mar 2013 | US
Child | 14171904 | | US