This disclosure generally relates to interactions of Near Field Communication (NFC) tags with devices (e.g., mobile devices). More particularly, this disclosure relates to devices and methods for NFC-based interactions that involve generating one or more commands by the devices based on detecting one or more NFC tags by the devices and determining one or more qualifying events by the devices.
The use of NFC tags, such as those that interact with smartphones, has become common in many aspects of human life as they enhance the functionality of a device (e.g., a mobile device). For example, NFC tags can be used to gain access to a building, grant an entitlement, direct a user to a website, make a payment, open an application, etc. An NFC tag is typically write protected and includes unique information that determines the action performed when a mobile device or reader interacts with the tag. Devices will typically read an NFC tag once and perform one action, limiting the functionality of the NFC tag. In addition, NFC chips are being built into an increasing number of devices and can also be implanted into individuals. Thus, improvements to interactions with NFC tags and NFC chips are needed.
This section is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description section. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The following concepts and aspects of the disclosure are designed to solve at least some drawbacks known in the art. As used herein, the discussion of devices includes mobile devices, and the discussion of applications includes mobile applications.
In one embodiment of this disclosure, there is provided a method for NFC-based interactions. The method can comprise receiving, by a device, an electromagnetic signal of an NFC tag when the device is within a first predetermined distance from the NFC tag, retrieving, by the device, an identifier of the NFC tag from the electromagnetic signal of the NFC tag, and, based on the identifier, generating, by the device, a first command, wherein the first command is configured to cause performing one or more of the following: change at least one setting of the device, switch at least one of the operating modes of the device, activate at least one application, deactivate at least one application, enable at least one feature of at least one application, disable at least one feature of at least one application, and send a first instruction to at least one remote electronic device.
The method can further include identifying, by the device, a qualifying event before or after the first command is generated, and, based on the identification of the qualifying event, generating, by the device, a second command, wherein the second command is generated after the first command, and wherein the second command is configured to cause performing one or more of the following: change at least one of the settings of the device, switch at least one of the operating modes of the device, activate at least one of the applications, deactivate at least one of the applications, enable at least one feature of at least one of the applications, disable at least one feature of at least one of the applications, and send a second instruction to at least one of the remote electronic devices.
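The two-command sequence described above can be illustrated with a minimal sketch. All class, method, and tag names below are illustrative assumptions for exposition only and are not part of the disclosed method:

```python
from dataclasses import dataclass, field

@dataclass
class NfcInteraction:
    # Pre-programmed mapping: tag identifier -> (first_command, second_command).
    actions: dict = field(default_factory=dict)
    log: list = field(default_factory=list)

    def on_tag_read(self, identifier, qualifying_event=False):
        """Generate the first command from the identifier; if a qualifying
        event is identified, generate the second command after the first."""
        first, second = self.actions[identifier]
        self.log.append(first)        # first command, based on the identifier
        if qualifying_event:          # e.g., the device is held over the tag
            self.log.append(second)   # second command follows the first
        return list(self.log)

device = NfcInteraction(actions={"tag-01": ("add_cup_of_water", "open_food_form")})
print(device.on_tag_read("tag-01", qualifying_event=True))
# -> ['add_cup_of_water', 'open_food_form']
```

A brief tap without a qualifying event would yield only the first command; the embodiment in which the first command is suppressed is a straightforward variant of the conditional above.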
In certain embodiments, the method may further comprise suppressing, by the device, the generation or use of the first command when the qualifying event is identified. In some embodiments, the identification of the qualifying event includes determining, by the device, that the device is located within the first predetermined distance from the NFC tag for a predetermined period. In other embodiments, the identification of the qualifying event can include determining, by the device, that the device is orientated substantially upside down such that a display of the device is substantially parallel to the ground and faces the ground.
In yet other embodiments, the identification of the qualifying event can include determining, by the device, that the device is locked. In yet other embodiments, the identification of the qualifying event can include determining, by the device, that the device is unlocked. In yet other embodiments, the identification of the qualifying event can include determining, by the device, that the device is located within a second predetermined distance from a predetermined geographical location associated with the identifier of the NFC tag.
In certain embodiments, the method may further comprise providing, by the device, a first notification to a user when the first command is generated or about to be generated by the device. In certain embodiments, the method may further comprise providing, by the device, a second notification to the user when the second command is generated or about to be generated by the device. Moreover, in some embodiments, the first notification is a vibrational notification generated by the device, and the second notification is a double vibrational notification generated by the device.
In another aspect of this disclosure, there is provided a device. The device comprises an NFC communication module configured to wirelessly communicate with one or more NFC tags, a processor, and a memory storing processor-executable codes. The processor can be configured to implement the following operations upon executing the processor-executable codes stored by the memory: cause reading of an electromagnetic signal of one of the NFC tags when the device is within a first predetermined distance from the NFC tag; cause retrieving an identifier of the NFC tag from the electromagnetic signal of the NFC tag; based on the identifier, cause generating a first command, wherein the first command is configured to cause performing one or more of the following: change at least one setting of the device, switch at least one of the operating modes of the device, activate at least one application, deactivate at least one application, enable at least one feature of at least one application, disable at least one feature of at least one application, and send a first instruction to at least one remote electronic device; cause identifying a qualifying event before or after the first command is generated; and, based on the identification of the qualifying event, cause generating a second command, wherein the second command is generated after the first command, and wherein the second command is configured to cause performing one or more of the following: change at least one of the settings of the device, switch at least one of the operating modes of the device, activate at least one of the applications, deactivate at least one of the applications, enable at least one feature of at least one of the applications, disable at least one feature of at least one of the applications, and send a second instruction to at least one of the remote electronic devices.
In other embodiments, an NFC chip can be carried on the person as an adhesive or wearable device. An NFC chip can also be implanted within a person. An NFC tag can also be used to control certain actions within a video game, depending on the location of a controller relative to the user with the NFC tag, as well as on the manner in which the controller is being handled by the user.
In another embodiment, the implanted or wearable NFC chip can be used as an authentication device for a laptop, a device, a door lock, or a portable lock such as one for a bicycle or locker, in combination with a series of interactions that can be used to enforce dual factor authentication.
In yet another aspect of the disclosure, there is provided a non-transitory processor-readable medium having instructions stored thereon, which when executed by one or more processors, cause the one or more processors to implement the above-outlined method for NFC-based interaction.
Additional objects, advantages, and novel features of the examples will be set forth in part in the description, which follows, and in part will become apparent to those skilled in the art upon examination of the following description and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the concepts may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out in the appended claims.
Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
Corresponding reference characters indicate corresponding components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present disclosure. In addition, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure.
The present disclosure generally relates to systems and methods for providing or improving interaction with a device based on NFC tags. NFC tags can also be referred to as “smart tags,” Radio-Frequency Identification (RFID) tags, and the like. NFC tags can be implemented as stickers, key fobs, or in any other similar manner so that they can be fixed, mounted, connected, or otherwise coupled to certain items indoors or outdoors. For example, NFC stickers can be placed near a bed, on a kitchen table, on a door, inside a vehicle, and the like. As used herein, a device includes mobile devices, and an application includes mobile applications.
To interact with NFC tags, a user can use a device, such as a smartphone, which has an NFC communication module configured to wirelessly communicate with one or more NFC tags when the device is in the vicinity of an NFC tag (e.g., at a distance less than about 2 inches). Each NFC tag can be uniquely identified as it bears a unique identifier. When the device reads an electromagnetic signal generated by the NFC tag, the device can detect or retrieve the unique identifier of the NFC tag. Furthermore, as a response to the identification of the unique identifier of the NFC tag, the device can perform pre-programmed operations associated with the unique identifier. The pre-programmed operations can cause a change in settings of the device, cause switching between operating modes of the device, cause activating/deactivating of an application, cause enabling/disabling a feature of the application, or cause sending an instruction to a remote electronic device. For example, placing the device in the vicinity of an NFC tag coupled to a couch could activate a remote control application on the device, enabling the user to control a television (TV) device. In another example, placing the device close to another NFC tag attached to a refrigerator can activate a tracking feature of the device or application to track the amount of water that the user drinks.
Each NFC tag can include a passively emitting device that does not require electrical power, as it emits signals in response to a wireless signal received from the device. In some example embodiments, the longest side of the NFC tag can optionally be about one inch or less. NFC tags can be of different colors such as white, black, red, blue, green, yellow, metallic, transparent, and so forth. Furthermore, each tag can include a chip and a communication module having a coil or an antenna. The chip can be configured to cause the communication module to emit a certain electromagnetic signal bearing a unique identifier of the NFC tag, for example, as a response to a received signal from a device. Each NFC tag can include an adhesive layer so that it can be used as a sticker. Moreover, users can secure NFC tags with magnets, pins, or clips so that they can be easily transported from one location to another.
The device can be programmed before it can interact with NFC tags. To this end, after placing the NFC tag at a desired location, the user can open a certain application and place the device near the NFC tag. The user would need to pre-program and indicate what actions or operations the device is to perform upon sensing the electromagnetic signal bearing the unique identifier of the NFC tag. For example, the user can set up IFTTT (If This Then That) commands to be executed upon sensing the electromagnetic signal bearing the unique identifier of the NFC tag. The user can also set up the device such that the presence of the NFC tag would open and perform actions by one or more specific applications of the device. Moreover, two or more actions can be available for every NFC tag. For example, a first action can initiate with a tap or placement of the device in a vicinity of an NFC tag. The device can send a notification such as a vibrational notification when the first action is successfully initiated or completed or about to be initiated.
Furthermore, upon detection of a certain qualifying event, a second action can be initiated by the device. For instance, if the user holds the device over the NFC tag after the first notification for a predetermined period, the first action can be suppressed and the second action can be initiated instead of the first action. In this case, the device can notify the user with a second notification such as a double vibration. Confirmations (vibrations or audible beeps) can be enabled or disabled for each NFC tag individually. After the NFC tag is registered with the user's device, the user can move the NFC tag and/or set it to perform other operations without the need to re-register the NFC tag with the device. In some embodiments, users can also initiate actions directly on the application for all of their registered NFC tags.
The following describes some example scenarios of using NFC tags to cause implementations of certain predetermined actions by a device. In one example, an NFC tag can be added to a magnet to be placed on a refrigerator. When a user taps a device on the NFC tag, it causes a fitness tracking application or nutrition tracking application to add a cup of water to a user's account. However, if the user holds the device near the NFC tag for a predetermined period, the device causes the opening of a food tracking form of the nutrition tracking application instead of adding a cup of water. In another example, an NFC tag can be added to a pin to be placed on a shirt of a user such that when the user places the mobile device close to the NFC tag, it causes the opening of a favorite mobile game or another application.
In an alternative embodiment, when the NFC tag is placed on a couch, the device can open a remote control application if the device comes into close contact with the NFC tag. In yet another example, an NFC tag can be glued to a clip. The clip can be attached to food packaging such as a snack box. When the user taps the mobile device over the NFC tag, the device can automatically record the number of snacks the user eats. If the NFC tag is placed inside a car, the application can automatically cause opening a garage door when the device comes into close proximity of the NFC tag.
According to some embodiments, there can be some additional ways to interact with NFC tags, which can cause implementation of one or more actions upon detecting NFC tags by a device and upon identifying one of a plurality of qualifying events. The qualifying events can be as follows: a tap of an NFC tag by the device (e.g., a brief touch of the NFC tag by the device), an orientation of the device (e.g., upside down), a state of the device (e.g., locked or unlocked), a period of holding the device close to the NFC tag, and a geographical location of the device (e.g., it can be determined whether the device is close to a predetermined location or item).
The following detailed description of embodiments includes references to the accompanying drawings, which form a part of the detailed description. Approaches described in this section are not prior art to the claims and are not admitted to be prior art by inclusion in this section. Reference throughout this specification to “one embodiment,” “an embodiment,” “some embodiments,” “some implementations” or similar language means that a particular feature, structure, or characteristic described in connection with an example implementation is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” “in some embodiments,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
Furthermore, the described features, structures, or characteristics of embodiments may be combined in any suitable manner in one or more implementations. In the following description, numerous specific details are provided, such as examples of programming, software modules, user selections, network transactions, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
Aspects of the embodiments of this disclosure will now be presented with reference to accompanying drawings which show blocks, components, circuits, steps, operations, processes, algorithms, and the like, collectively referred to as “elements” for simplicity. These elements may be implemented using electronic hardware, computer software, or any combination thereof. Whether such elements are implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. By way of example, an element, or any portion of an element, or any combination of elements may be implemented with a “processing system” that includes one or more processors. Examples of processors include microprocessors, microcontrollers, Central Processing Units (CPUs), digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform various functions described throughout this disclosure. One or more processors in the processing system may execute software, firmware, or middleware (collectively referred to as “software”). The term “software” shall be construed broadly to mean processor-executable instructions, instruction sets, code segments, program code, programs, subprograms, software components, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
Accordingly, in one or more embodiments, the functions described herein may be implemented in hardware, software, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a non-transitory computer-readable medium. Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise a random-access memory (RAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM) or other optical disk storage, magnetic disk storage, solid state memory, or any other data storage devices, combinations of the aforementioned types of computer-readable media, or any other medium that can be used to store computer executable code in the form of instructions or data structures that can be accessed by a computer.
For purposes of this patent document, the terms “or” and “and” shall mean “and/or” unless stated otherwise or clearly intended otherwise by the context of their use. The term “a” shall mean “one or more” unless stated otherwise or where the use of “one or more” is clearly inappropriate. The terms “comprise,” “comprising,” “include,” and “including” are interchangeable and not intended to be limiting. For example, the term “including” shall be interpreted to mean “including, but not limited to.”
The term “mobile device” shall be construed to mean any electronic device configured to process digital data and wirelessly communicate with one or more NFC tags using NFC or similar technology. By way of example, and not limitation, some examples of mobile devices include a user device, cellular phone, user equipment, terminal, mobile phone, smart phone, Internet phone, tablet computer, laptop computer, personal digital assistant, music player, multimedia player, gaming device, game console, and so forth.
The term “remote electronic device” shall be construed to mean any electronic device configured to implement one or more actions based on instructions or commands generated by a mobile device. By way of example, and not limitation, some examples of remote electronic devices include a computing device, navigation system, television (TV) device, set-top box, multimedia system, multimedia player, display, speaker, network device, modem, network router, network hub, home gateway, broadband gateway, network appliance, access gateway, networking switch, entertainment system, infotainment system, in-vehicle computer, home appliance (e.g., a dish washing machine, refrigerator, oven, etc.), “smart” appliance, electronically operated door, access control system, security system, programmable electrical socket, programmable electrical switch, dimmer, intercom, telephone, fax machine, server, web service, online service, electrical elements of “smart” home, Internet-of-Things devices, and the like.
The term “NFC tag” shall be construed to mean an electronic apparatus configured to actively or passively emit an electromagnetic signal bearing a unique identifier of the NFC tag and, optionally, other information. Passive NFC tags are powered by and read at short ranges via magnetic fields so that they emit electromagnetic waves conveying certain information. In other words, the passive NFC tag, when powered, emits an electromagnetic signal. The passive NFC tag can be powered by an electromagnetic field generated by a mobile device. Active NFC tags are powered by a local power source, such as a battery, and repeatedly emit an electromagnetic signal bearing a unique identifier of the NFC tag and, optionally, other information. Some examples of NFC tags can include, but are not limited to, RFID tags, RFID labels, transponders, Wi-Fi emitter tags, Bluetooth emitter tags, Wi-Fi beacons, Bluetooth beacons, and so forth.
Referring now to the drawings, example embodiments are described. The drawings are schematic illustrations of idealized example embodiments. Thus, the example embodiments discussed herein should not be construed as limited to the particular illustrations presented herein, rather these example embodiments can include deviations and differ from the illustrations presented herein.
Device 105 includes at least an NFC communication module 135, a processor 140, and a memory 145 for storing processor-executable instructions. Other elements of device 105 shall be evident from
Processor 140 can also identify one or more qualifying events, such as a period of presence of device 105 near a particular NFC tag 110, determination of a current geographical location of device 105, determination of a current orientation of device 105, determination of a current status of device 105 (e.g., whether it is locked or unlocked), and so forth. The determination of the geographical location of device 105 can be implemented with the help of a Global Positioning System (GPS) module or another common device for determining a geographical location, which can be installed in device 105. The current orientation of device 105 can be determined with the help of inertial sensors, gyroscopes, or motion sensors, which can be installed in device 105. Determination of a period indicating how long device 105 is located near a particular NFC tag 110 can be executed with the help of a clock, which can be installed in device 105.
As indicated above, device 105 can execute one or more actions as a response to detection of NFC tag 110 or detection of one of the qualifying events. Some examples of these “actions” include, but are not limited to, causation of changing a setting of the device, causation of switching an operating mode of the device, causation of activating/deactivating the application, causation of enabling/disabling a feature of the application, and the like. In some embodiments, an action can be a causation of sending a predetermined instruction to one or more remote electronic devices 150. The instruction can be sent via a communications network 155. Communications network 155 can refer to any wired, wireless, or optical networks including, for example, the Internet, intranet, local area network (LAN), Personal Area Network (PAN), Wide Area Network (WAN), Virtual Private Network (VPN), cellular phone networks (e.g., packet switching communications network, circuit switching communications network), Bluetooth radio, Ethernet network, an IEEE 802.11-based radio frequency network, IP communications network, or any other data communication network utilizing physical layers, link layer capability, or network layer to carry data packets, or any combinations of the above-listed data networks.
At operation 205, device 105 receives an electromagnetic signal of NFC tag 110 when device 105 is within a first predetermined distance from the NFC tag (e.g., less than two inches). The electromagnetic signal of NFC tag 110 can be received in response to another electromagnetic signal generated by device 105.
At operation 210, device 105 retrieves an identifier of NFC tag 110 from the electromagnetic signal received at operation 205. The identifier can uniquely identify NFC tag 110. Device 105 can include one or more settings associated with the identifier of NFC tag 110.
At operation 215, device 105 generates a first command based on the identifier. The first command is configured to cause performing one or more of the following: change at least one setting of device 105, switch at least one of the operating modes of device 105, activate at least one of the applications of device 105, deactivate at least one of the applications of device 105, enable at least one feature of at least one of the applications of device 105, disable at least one feature of at least one of the applications of device 105, and send a first instruction to at least one of the remote electronic devices 150. Further, at operation 215, device 105 can generate a first notification (such as an audio alert, displayable message, or vibration alert) and deliver it to the user prior to or after the generation of the first command.
At operation 220, device 105 identifies a qualifying event before or after the first command is generated. In some embodiments, the identification of the qualifying event can include determining, by device 105, that device 105 is located within the first predetermined distance from NFC tag 110 for a predetermined period (e.g., 1, 2, 3, 4, 5 or more seconds). In some embodiments, the identification of the qualifying event can include determining, by device 105, that device 105 is orientated substantially upside down such that a display of device 105 is substantially parallel to and faces the ground. In some embodiments, the identification of the qualifying event can include determining, by device 105, that device 105 is locked. In some embodiments, the identification of the qualifying event can include determining, by device 105, that device 105 is unlocked. In some embodiments, the identification of the qualifying event can include determining, by device 105, that device 105 is located within a second predetermined distance (e.g., less than 5 feet) from a predetermined geographical location associated with the identifier of NFC tag 110. This predetermined geographical location associated with the identifier can be, for example, a garage door, piece of furniture, a particular room or office in a building, outdoor location, vehicle, and so forth.
At operation 225, device 105 generates a second command based on the identification of the qualifying event. In one embodiment, the second command is generated after the first command is generated. In another embodiment, the second command is generated and the first command is suppressed upon identification of the qualifying event. The second command can be configured to cause performing one or more of the following: change at least one setting of device 105, switch at least one of the operating modes of device 105, activate at least one of the applications of device 105, deactivate at least one of the applications of device 105, enable at least one feature of at least one of the applications of device 105, disable at least one feature of at least one of the applications of device 105, authenticate the identity of an individual, enable one or more additional immersive interactions with the at least one application, and send a second instruction to at least one of the remote electronic devices 150. Further, at operation 225, device 105 can generate a second notification (such as an audio alert, displayable message, or a double vibration notification) and deliver it to the user prior to or after the generation of the second command.
As shown in
Memory 320, according to one example, is configured to store information within computing device 300 during operation. For example, memory 320 can store settings or pre-programmed commands associated with one or more NFC tags 110. Memory 320, in some example embodiments, may refer to a non-transitory computer-readable storage medium or a computer-readable storage device. In some examples, memory 320 is a temporary memory, meaning that a primary purpose of memory 320 may not be long-term storage. Memory 320 may also refer to a volatile memory, meaning that memory 320 does not maintain stored contents when memory 320 is not receiving power. Examples of volatile memories include RAM, dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art. In some examples, memory 320 is used to store program instructions for execution by processor(s) 310. Memory 320, in one example, is used by software applications or mobile applications. Generally, software or mobile applications refer to software applications suitable for implementing at least some operations of the methods as described herein.
Mass storage devices 330 can also include one or more transitory or non-transitory computer-readable storage media or computer-readable storage devices. For example, mass storage devices 330 can store instructions for processor(s) 310 and settings or pre-programmed commands associated with one or more NFC tags 110. In some embodiments, mass storage devices 330 may be configured to store greater amounts of information than memory 320. Mass storage devices 330 may be also configured for long-term storage of information. In some examples, mass storage devices 330 include non-volatile storage elements. Examples of such non-volatile storage elements include magnetic hard discs, optical discs, solid-state discs, flash memories, forms of electrically programmable memories (EPROM) or electrically erasable and programmable memories, and other forms of non-volatile memories known in the art.
Computing device 300 may also include one or more optional input devices 360. Input device(s) 360 may be configured to receive input from a user through tactile, audio, video, or biometric channels. Examples of input device(s) 360 may include a keyboard, keypad, mouse, trackball, touchscreen, touchpad, microphone, video camera, image sensor, fingerprint sensor, or any other device capable of detecting an input from a user or other source, and relaying the input to computing device 300 or components thereof.
Optional output device(s) 350 may be configured to provide output to a user through visual or auditory channels. Output device(s) 350 may include a video graphics adapter card, a display, such as a liquid crystal display (LCD) monitor, a light emitting diode (LED) monitor, or an organic LED (OLED) monitor, a sound card, a speaker, a lighting device, a projector, or any other device capable of generating output that may be intelligible to a user. Output device(s) 350 may also include a touchscreen, presence-sensitive display, or other input/output capable displays known in the art.
Computing device 300 can also include network interface(s) 370. Network interface(s) 370 can be utilized to communicate with external devices via one or more communications networks such as communications network 155 or any other wired, wireless, or optical networks. Network interface(s) 370 may be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information. Network interface(s) 370 can also include an NFC communication module 135 for communicating with, sensing, or receiving data from NFC tags 110.
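The flow from NFC communication module 135 detecting a tag to the device generating a command, as summarized earlier in this disclosure, could be sketched as follows. The payload format, identifiers, and actions are assumptions for illustration, not part of the disclosure:

```python
# Hypothetical sketch: NFC communication module 135 relays a detected
# tag's payload, an identifier is retrieved, and a command is generated.
COMMANDS = {
    "04:A2:19:B3": {"action": "enable_feature", "feature": "silent_mode"},
}


def retrieve_identifier(payload):
    # Assume the tag's electromagnetic signal carries a UTF-8 identifier.
    return payload.decode("utf-8")


def on_tag_detected(payload):
    """Generate a command based on the identifier retrieved from the tag."""
    tag_id = retrieve_identifier(payload)
    return COMMANDS.get(tag_id, {"action": "none"})
```

In this sketch, a recognized identifier yields its pre-programmed command (for instance, enabling a feature of an application), while an unrecognized tag produces no action.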
An operating system of computing device 300 may control one or more functionalities of computing device 300 or components thereof. For example, the operating system may interact with the software or mobile applications and may facilitate one or more interactions between the software/mobile applications and processor(s) 310, memory 320, mass storage devices 330, input device(s) 360, output device(s) 350, and network interface(s) 370. The operating system may interact with or be otherwise coupled to software applications or components thereof. In some embodiments, software or mobile applications may be included in the operating system.
Thus, methods and systems for NFC-based interactions have been described. Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes can be made to these example embodiments without departing from the broader spirit and scope of the present application. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.