Uniquely Identifiable Articles of Fabric Configured for Data Communication

Information

  • Patent Application
  • Publication Number
    20210192302
  • Date Filed
    December 01, 2020
  • Date Published
    June 24, 2021
Abstract
A fabric or article has a pattern on an exposed surface that encodes a unique identification code, wherein the pattern is configured to be read and decoded by a mobile computing device in a manner wherein the selected article is contextually recognizable. A two-dimensional plaid pattern may be used to carry the identification code, which can be decoded according to described methods. The pattern may be a plaid code or may be incorporated into a representational aesthetic environment. Fabrics may include optical transmitters embedded therein and configured to transmit information that can be detected by a mobile computing device directed to the wearable article and executing a suitable application. Fabrics may include optical receivers configured to receive information, such as from a free-space optical communication system of certain embodiments.
Description
TECHNICAL FIELD

The present invention relates to methods and systems for unique identification of articles of fabric, and more particularly to use of such methods and systems in the context of social networking.


SUMMARY OF THE EMBODIMENTS

In accordance with one embodiment of the invention, there is provided an article, the article being a selected one of a set of articles. In this embodiment, each article of the set includes a fabric and is associated with a unique identification code. Additionally, the selected article has a pattern distributed over at least 10% of an exposed surface of the selected article. In this embodiment, the pattern encodes the identification code associated with the selected article, wherein the pattern is configured to be read and decoded by a mobile computing device in a manner wherein the selected article is contextually recognizable.


In a related embodiment, the identification code is represented in the pattern in a large format, wherein each item of information content in the identification code is represented in the pattern by a set of attributes, each attribute of the set of attributes having a minimum dimension of 1 mm.


In yet another related embodiment, each attribute of the set of attributes has a minimum dimension of 2 mm. Optionally, each attribute of the set of attributes has a minimum dimension of 3 mm. Optionally, the pattern is distributed over at least 30% of the exposed surface of the selected article.


In a further related embodiment, the exposed surface of the selected article includes a front and a back of the selected article. Optionally, the pattern includes an error correction code. Alternatively or in addition, the error correction code is a forward error correction code.


Optionally, the pattern includes a repetition of encoding of the identification code. Further optionally, the pattern encodes a minimum of 24 bits of information comprising the identification code. Optionally, the pattern encodes a minimum of 32 bits of information comprising the identification code. Also optionally, the pattern encodes a minimum of 64 bits of information comprising the identification code.
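The repetition and bit-width options above can be illustrated with a minimal sketch. The following Python is a hypothetical stand-in, not the patent's encoding: it shows how many distinct articles a 24-bit code can identify, and uses a trivial three-copy repetition code with majority voting as a simple example of the repetition/error-correction idea.

```python
# Hypothetical sketch: capacity of a 24-bit identification code, plus a
# trivial repetition code with bitwise majority voting. All function names
# are illustrative, not drawn from the patent.

def code_capacity(bits: int) -> int:
    """Number of distinct articles a code of the given width can identify."""
    return 2 ** bits

def encode_with_repetition(code: int, bits: int = 24, copies: int = 3) -> str:
    """Render the code as a bit string repeated several times."""
    if code >= 2 ** bits:
        raise ValueError("code does not fit in the stated bit width")
    word = format(code, f"0{bits}b")
    return word * copies

def decode_with_repetition(pattern: str, bits: int = 24, copies: int = 3) -> int:
    """Majority-vote each bit position across the repeated copies."""
    words = [pattern[i * bits:(i + 1) * bits] for i in range(copies)]
    voted = "".join(
        "1" if sum(w[i] == "1" for w in words) > copies // 2 else "0"
        for i in range(bits)
    )
    return int(voted, 2)

print(code_capacity(24))  # 16777216 distinct 24-bit codes
garbled = list(encode_with_repetition(0xABCDEF))
garbled[5] = "0" if garbled[5] == "1" else "1"  # flip one read-error bit
print(hex(decode_with_repetition("".join(garbled))))  # 0xabcdef, recovered
```

A real forward error correction code (the patent leaves the choice open) would correct multiple errors far more efficiently than naive repetition.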


In a related embodiment, the unique identification code, encoded by the pattern, was transmitted by a server system for use in manufacturing of the article. In a further related embodiment, the unique identification code has been associated with an owner of the article by updating information in the server system in connection with a sale of the article to the owner.


In yet another related embodiment, the pattern is not discernible to an ordinary unaided human observer.


Optionally, the pattern includes a plurality of horizontal lines having varying thickness, spacing, and color, wherein the plurality of horizontal lines extends over at least 80% of a first dimension of the exposed surface of the article of clothing. Also optionally, the article of clothing is selected from the group consisting of a shirt, jacket, sweater, and vest, wherein the first dimension is parallel to a line drawn from shoulder to shoulder of the article of clothing.


In another embodiment, the invention provides a server-based method for identifying a specific article of fabric in a social context, the method comprising computer processes including:


receiving at a server system a request message, from a first instance of a fabric identification application executing on a mobile computing device of a regarding individual who has caused the mobile computing device to capture an image in which at least a part of the specific article appears, the request message containing identity data corresponding to a pattern on the article, the pattern encoding a unique identification code associated with the specific article and the pattern configured to render the article contextually recognizable,


processing by the server system the identity data, in relation to a database system storing identification codes for a set of articles in relation to corresponding user information, to identify a specific user associated with the specific article of fabric; and


sending by the server system a reply message to the application executing on the mobile computing device that, consistent with permissions of the specific user, includes user-defined content, such content defined by the specific user.
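The three server processes above (receive request, resolve the code to a user, reply with permitted content) can be sketched in a few lines. This is a minimal illustration; the data structures and names (`ARTICLES`, `USERS`, `handle_request`) are hypothetical, not taken from the patent.

```python
# Minimal sketch of the server-side flow: identity data decoded from the
# fabric pattern is resolved to a user, and the reply carries only content
# that user's permissions allow. All names here are illustrative.

ARTICLES = {"CODE-1234": "user-42"}          # identification code -> user id
USERS = {
    "user-42": {
        "permissions": {"share_profile": True},
        "content": {"name": "Jane Doe", "song": "favorite-track.mp3"},
    }
}

def handle_request(identity_data: str) -> dict:
    """Process one request message and build the reply message."""
    user_id = ARTICLES.get(identity_data)
    if user_id is None:
        return {"status": "unknown_article"}
    user = USERS[user_id]
    if not user["permissions"]["share_profile"]:
        return {"status": "denied"}
    return {"status": "ok", "content": user["content"]}

print(handle_request("CODE-1234")["status"])   # ok
print(handle_request("CODE-9999")["status"])   # unknown_article
```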


Optionally, the user-defined content includes personal information concerning the specific user. Also optionally, the user-defined content includes music selected according to preferences of the specific user.


In a related embodiment, the invention provides a method that further includes, before sending the reply message by the server system:


receiving by the server system, from the first instance of the fabric identification application executing on the mobile computing device of the regarding individual, first geolocation data defining a location of the computing device of the regarding individual;


receiving, by the server system, from a second instance of the fabric identification application executing on a mobile computing device of the specific user, second geolocation data defining a location of the specific user's computing device;


processing by the server system the first and second geolocation data to determine if the mobile computing device of the regarding individual is within a predetermined distance from the specific user's mobile computing device, and, if not within the predetermined distance, configuring the reply message to convey denial of permission to provide personal information about the specific user.
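The proximity gate described above can be sketched as a great-circle distance check. The haversine formula and the 100 m threshold below are illustrative assumptions; the patent specifies only "a predetermined distance".

```python
# Sketch of the geolocation gate: the server compares the two devices'
# reported positions and denies personal information when they are farther
# apart than a predetermined distance (100 m here, an assumed value).
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def build_reply(loc_regarding, loc_specific, max_distance_m=100.0):
    """Configure the reply to deny personal information when out of range."""
    if haversine_m(*loc_regarding, *loc_specific) > max_distance_m:
        return {"status": "denied", "reason": "out_of_range"}
    return {"status": "ok"}

near = (40.7580, -73.9855)   # regarding individual's device
close = (40.7581, -73.9855)  # specific user, roughly 11 m away
far = (40.7580, -73.9000)    # specific user, several kilometers away
print(build_reply(near, close)["status"])  # ok
print(build_reply(near, far)["status"])    # denied
```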


In a further related embodiment, the method includes sending by the server system, after receiving the identity data, to the application executing on the mobile computing device, a confirmatory message including validity information associated with the identity data.


In yet a further related embodiment, the identity data has been derived by the application executing on the mobile computing device from a processed version of the image, the processed version being a result of processing of the image on the mobile computing device.


In another related embodiment, the method includes receiving by the server system the image captured by the mobile computing device and processing by the server system the image to derive the identity data. Optionally, the method further includes configuring by the server system the reply message to the application to initiate a request to a third-party application executing on the mobile computing device using the identity data.


In yet another embodiment, the method includes, if the permissions of the specific user prevent the personal information about the specific user from being included in the reply message, configuring, by the server system, the reply message to redirect the application executing on the mobile computing device of the regarding individual to cause other appropriate content to be displayed thereon.


In another related embodiment, the user-defined content includes a plurality of content items, a specific one thereof being selected, for transmission to the mobile computing device of the regarding individual, according to a set of selection criteria specified by the specific user. Optionally, the set of selection criteria includes an identity of the regarding individual. Also optionally, the set of selection criteria includes an item selected from the group consisting of time of day of receipt by the server system of the request message, date of such receipt, geolocation of the mobile computing device of the regarding individual, and combinations thereof.


In yet another related embodiment, the reply message includes at least some third party-defined content. Optionally, the at least some third party-defined content includes advertising.


In another related embodiment, the fabric identification application is a portion of a social network application, wherein a first instance of the social network application is executing on the mobile computing device of the regarding individual.


In a further related embodiment, the fabric identification application is a portion of a social network application, wherein a first instance of the social network application is executing on the mobile computing device of the regarding individual and a second instance of the social network application is executing on the mobile computing device of the specific user.


In another embodiment, the invention provides a server-based method for identifying a specific article of fabric in a social context, the method comprising computer processes including:


receiving at a server system a request message, from a first instance of a fabric identification application executing on a mobile computing device of a regarding individual who has caused the mobile computing device to capture an image in which at least a part of the specific article appears, the request message containing identity data corresponding to a pattern on the article, the pattern encoding a unique identification code associated with the specific article and the pattern configured to render the article contextually recognizable,


processing by the server system the identity data, in relation to a database system storing identification codes for a set of articles in relation to corresponding user information, to identify a specific user associated with the specific article of fabric; and


sending by the server system a reply message to the application executing on the mobile computing device that, consistent with permissions of the specific user, includes third party-defined content.


In another embodiment, the invention provides a method for alerting a regarding individual having a first mobile computing device that an encoded pattern is present in an article of clothing of a specific user having a second mobile computing device, the encoded pattern not discernible to an ordinary unaided human observer, the method comprising initiating wireless communication from the first mobile computing device to the second mobile computing device, the wireless communication including an alert viewable on a fabric identification application executing on the first mobile computing device that the encoded pattern is not discernible to the ordinary unaided human observer.


In another embodiment, the invention provides a fabric onto which has been impressed a pattern, the pattern including at least one repeatable unit wherein the repeatable unit includes, in a first direction, a first leading strip and a first set of associated data strips and, in a second direction, a second leading strip and a second set of associated data strips, the second direction distinct from the first direction, each data strip having a set of stripes shaped to convey data, each stripe defined by a first transition edge from a first color to a second color and a second transition edge from the second color to a third color, the first transition having a distance D1 from a leading edge of the data strip and the second transition having a distance D2 from the leading edge of the data strip, wherein D2>D1, and D1 and D2 collectively encode data. Optionally, this embodiment includes all of the features associated with the embodiment described in the first paragraph of this Summary of Embodiments. Also optionally, the first and second directions correspond to local directions associated with a warp and a weft of the fabric respectively, and the warp and weft directions vary over at least a portion of the fabric.


In a related embodiment, the first and second directions are orthogonal to one another. Optionally, the first direction is vertical and the second direction is horizontal.


In another related embodiment, the first set encodes data distinct from the data encoded by the second set.


In yet another related embodiment, each of the first and second leading strips comprises stripes of the first and second color, the first leading strip having a stripe of the first color with a minimum width W1 and the second leading strip having a stripe of the first color with a minimum width W2, wherein W1 does not equal W2.


In another related embodiment, the repeatable unit has no more than three data strips in the first set and no more than three data strips in the second set. Optionally, the repeatable unit has a dimension of at least 75 mm in the first direction and a dimension of at least 75 mm in the second direction.


In yet another related embodiment, the repeatable unit has a dimension of at least 75 mm in the first direction and a dimension of at least 75 mm in the second direction.


In another related embodiment, the surface has a dimension and the pattern includes no more than five repeatable units along the dimension. Optionally, the surface has a dimension and the pattern includes no more than ten repeatable units along the dimension.


In yet another related embodiment, the repeatable unit has no more than five data strips in the first set and no more than five strips in the second set. Optionally, the repeatable unit has no more than eight data strips in the first set and no more than eight strips in the second set.


In another embodiment, the invention provides a tangible item onto which has been applied a pattern, the pattern including at least one repeatable unit wherein the repeatable unit includes, in a first direction, a first leading strip and a first set of associated data strips and, in a second direction, a second leading strip and a second set of associated data strips, the second direction distinct from the first direction, each data strip having a plurality of stripes shaped to convey data, each stripe defined by a first transition edge from a first color to a second color and a second transition edge from the second color to a third color, the first transition having a distance D1 from a leading edge of the data strip and the second transition having a distance D2 from the leading edge of the data strip, wherein D2>D1, and D1 and D2 collectively encode data.


In another related embodiment, the repeatable unit has no more than three data strips in the first set and no more than three data strips in the second set. Optionally, the repeatable unit has a dimension of at least 75 mm in the first direction and a dimension of at least 75 mm in the second direction.


In yet another related embodiment, the repeatable unit has a dimension of at least 75 mm in the first direction and a dimension of at least 75 mm in the second direction.


In another related embodiment, the surface has a dimension and the pattern includes no more than five repeatable units along the dimension. Optionally, the surface has a dimension and the pattern includes no more than ten repeatable units along the dimension.


In yet another related embodiment, the repeatable unit has no more than five data strips in the first set and no more than five strips in the second set. Optionally, the repeatable unit has no more than eight data strips in the first set and no more than eight strips in the second set.


In another embodiment, there is provided a method of decoding an image of a pattern in fabric, the pattern encoding text, the image having been captured by a camera, the method carried out to recover the encoded text and employing computer processes. In this embodiment, the method includes:


selecting a first portion of the image for analysis;


slicing the image into sub-images; and


processing each of the sub-images.


In turn, processing each of the sub-images includes:


locating edge boundaries of the sub-image;


mapping the edge boundaries into a set of symbols; and


determining, for each decoded symbol set, whether it applies to a weft or to a warp.


In a further related embodiment, processing each of the sub-images further comprises validating each symbol set for closeness of fit and validating each symbol set against a set of design rules. In another related embodiment, processing each of the sub-images further comprises collecting votes for symbol candidates for each sub-image and determining a winning symbol candidate for each sub-image.
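The voting step above can be sketched concisely: each overlapping sub-image contributes a candidate symbol for each position, and a per-position majority vote picks the winner. Edge location and symbol mapping are replaced here by pre-extracted guesses; all names are illustrative.

```python
# Sketch of the symbol-voting stage of the decoding pipeline: candidates
# decoded from overlapping sub-images are tallied per position and the
# majority candidate wins. Hypothetical names, for illustration only.
from collections import Counter

def winning_symbols(candidates_per_position: list) -> list:
    """candidates_per_position[i] holds the symbol decoded at position i by
    each sub-image covering that position; majority vote picks the winner."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in candidates_per_position]

# Three overlapping sub-images each decoded a 4-symbol sequence; one
# sub-image misread position 2 ('B' instead of 'C').
votes = [["A", "A", "A"], ["7", "7", "7"], ["C", "B", "C"], ["X", "X", "X"]]
print(winning_symbols(votes))  # ['A', '7', 'C', 'X']
```

Voting across redundant sub-images is what lets the decoder tolerate local read errors from folds, shadows, or motion blur in any single slice.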


In another embodiment there is provided an article, the article being a selected one of a set of articles, each article of the set comprising a fabric and being associated with a unique identification code, the selected article having a pattern, distributed over at least 10% of an exposed surface of the selected article, the pattern encoding the identification code associated with the selected article, wherein the pattern is incorporated into a representational aesthetic environment.


In another embodiment there is provided a wearable article comprising a fabric, the fabric including a set of fiber transmitters embedded therein, the transmitters operating in a set of wavelengths selected from the group consisting of visible, invisible, and combinations thereof, the transmitters configured to transmit information that can be detected by a mobile computing device directed to the wearable article and executing a suitable application.


In yet another embodiment there is provided a method of decoding data transmitted from a set of fiber transmitters embedded in a fabric of a wearable article, the data being captured in a set of images by a camera of a mobile computing device, the method utilizing computer processes carried out by an application executing on the mobile computing device, the processes comprising: selecting a first portion of the image for analysis; dividing the image into sub-images; and processing each of the sub-images, such processing including: determining possible transmitting sources within the sub-image; assigning a specific tracker to follow each source through subsequent images to read the data; and determining, for each tracker, whether a valid data stream is present, and, if so, decoding the stream; otherwise, discarding the data and returning the tracker to the resource pool.


In a further related embodiment, determining whether a valid data stream is present includes validating each received symbol set against a set of design rules.
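The tracker process above can be sketched as follows: a tracker samples its assigned source's brightness once per frame, accumulates a bit stream, and the stream is kept only if it passes a design-rule check. The preamble rule and all names below are assumptions for illustration, not the patent's actual design rules.

```python
# Sketch of the tracker loop: one tracker per candidate light source, one bit
# per frame (bright -> 1, dim -> 0), and a design-rule check before the
# stream is accepted. The preamble rule is a hypothetical design rule.

VALID_PREAMBLE = "1010"  # assumed rule: every valid stream starts with this

class Tracker:
    def __init__(self):
        self.bits = ""

    def observe(self, frame_brightness: int, threshold: int = 128) -> None:
        """Append one bit per frame for this tracker's source."""
        self.bits += "1" if frame_brightness >= threshold else "0"

    def valid(self) -> bool:
        return self.bits.startswith(VALID_PREAMBLE)

def decode_source(frames: list):
    """Run one tracker over a source's per-frame brightness samples; return
    the payload after the preamble if the stream validates, else None (the
    tracker would then go back to the resource pool)."""
    t = Tracker()
    for b in frames:
        t.observe(b)
    return t.bits[len(VALID_PREAMBLE):] if t.valid() else None

print(decode_source([200, 30, 220, 25, 210, 210, 20]))  # '110' payload
print(decode_source([30, 30, 200, 25]))                 # None: no preamble
```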


In another embodiment, there is provided a computer-implemented method of authenticating communication between a first mobile computing device, which detects an encoded sequence from an article of fabric having embedded therein a set of fiber transmitters, and a second mobile computing device of a specific user wearing the article, the method utilizing computer processes executed by a server system. The processes of this embodiment include: receiving by the server system a connection request from the first mobile computing device, such connection request including the encoded sequence; transmitting by the server system a follow-on connection request to the second mobile computing device with a one-time verification code that is downloadable by the second mobile computing device, the second mobile computing device configured to cause the set of fiber transmitters to transmit the verification code for detection by the first mobile computing device; receiving by the server system from the first mobile computing device a signal corresponding to the verification code detected by the first mobile computing device; determining by the server system whether there is a match between the verification code transmitted to the second mobile computing device and the verification code received by the server system from the first mobile computing device, and, in the event of a match, sending a connection signal to at least the first mobile computing device to authenticate communication between the first mobile computing device and the second mobile computing device.
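The server side of the handshake above can be sketched as issuing and later matching a one-time code. The `AuthServer` class and its method names are hypothetical; the optical transmission and detection steps happen off-server and are omitted here.

```python
# Sketch of the server's role in the optical authentication handshake: issue
# a one-time verification code for the wearer's (second) device to transmit
# via the fiber transmitters, then match it against what the regarding
# individual's (first) device reports detecting. Names are illustrative.
import secrets

class AuthServer:
    def __init__(self):
        self.pending = {}  # connection id -> issued verification code

    def start_connection(self, connection_id: str) -> str:
        """Issue a one-time code for the second device to transmit optically."""
        code = secrets.token_hex(4)
        self.pending[connection_id] = code
        return code

    def verify(self, connection_id: str, detected_code: str) -> bool:
        """Compare the code detected by the first device with the issued one;
        the code is consumed whether or not it matches (single use)."""
        issued = self.pending.pop(connection_id, None)
        return issued is not None and secrets.compare_digest(issued, detected_code)

server = AuthServer()
issued = server.start_connection("conn-1")
print(server.verify("conn-1", issued))  # True: match, send connection signal
print(server.verify("conn-1", issued))  # False: code already consumed
```

Using `secrets.compare_digest` for the match avoids leaking information through comparison timing, a standard precaution for code verification.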





BRIEF DESCRIPTION OF THE DRAWINGS

Unless the context otherwise suggests, in these drawings, like elements in related figures are indicated by like numerals. The drawings and the elements depicted therein are not necessarily drawn to consistent scale or to any scale.


The foregoing features of embodiments will be more readily understood by reference to the following detailed description, taken with reference to the accompanying drawings, in which:



FIG. 1 is a diagram of an exemplary embodiment of a fabric-based article of clothing having a pattern and a mobile computing device interacting with the article of clothing.



FIG. 2 is a diagram of an exemplary embodiment of an article of clothing having a pattern embedded with a unique identification code in the form of a barcode.



FIG. 3 is a diagram of an exemplary embodiment of a system network including an article of clothing and a mobile computing device.



FIG. 4 is a diagram of an exemplary embodiment of a process for embedding a unique identification code in the article of clothing at the point of manufacturing.



FIG. 5 is a diagram of an exemplary embodiment of a process for transferring ownership of the article of clothing from a first owner, such as the manufacturer, to another entity, such as a distributor.



FIG. 6 is a diagram of an exemplary embodiment of a process of transactions taking place at the point of sale.



FIG. 7 is a diagram of an exemplary embodiment of a process for a user registering his or her purchase of the article of clothing with the server system.



FIG. 8 is a diagram of an exemplary embodiment of a process for accessing authorized user information, encoded in an article of clothing, by a mobile computing device.



FIG. 9 is a diagram of an exemplary embodiment of a process for redirecting the user of a mobile computing device in the case where the user does not authorize the sharing of user information.



FIG. 10A shows a photograph of an exemplary embodiment of a pattern on an article of clothing in which a unique identification code is embedded in the form of a barcode. FIGS. 10B-10C show graphic representations (front and back, respectively) of a pattern on the article of clothing.



FIG. 11 is a graphical representation of an exemplary embodiment of an article of clothing with a pattern that encodes information in the orientation(s) of a set of symbols.



FIG. 12 is a graphical representation of an exemplary embodiment of a pattern encoding information in the form of font modification.



FIG. 13 is an exemplary embodiment of a QR code that can be used to encode a unique identification code in a pattern on an article of clothing.



FIG. 14 shows front and back views of an exemplary embodiment of an article of clothing having encoded information in the form of a two-dimensional pattern.



FIGS. 15A-15C are graphical representations of exemplary embodiments of patterns encoding information in the form of color content.



FIG. 16 is a graphical representation of an exemplary embodiment of an article of clothing having a pattern encoding information in the positions of a set of symbols.



FIGS. 17A-17B are graphical representations of an exemplary embodiment of a pattern to encode a unique identification code.



FIGS. 18A-18B are diagrams of exemplary embodiments of a network including a user of an article of clothing, an app user having a mobile computing device, and a server system.



FIG. 19 is a diagram of an exemplary embodiment of a network including a user of an article of clothing, an app user having a mobile computing device, and a server system.



FIGS. 20A-20B show screenshots of an exemplary embodiment of an application executing on a mobile computing device.



FIG. 21 is a diagram of an exemplary embodiment of architecture of the application in the form of a storyboard.



FIG. 22 is a flowchart of an exemplary embodiment of a server-based method for identifying a specific article of clothing in a social context.



FIG. 23 is a diagram of exemplary stadium seating in which two users are viewing a sporting match between two teams.



FIG. 24 is an exemplary embodiment of a repeatable unit of a plaid code that is encoded with a two-dimensional code.



FIG. 25 shows a backpack having an exemplary plaid pattern.



FIG. 26A is an exemplary first set of strips in the first direction for a repeatable unit and FIG. 26B is an exemplary second set of strips in the second direction for a repeatable unit.



FIG. 27 is a plot of distances x and y illustrating constraints on distances x1, y1, x2, and y2 to encode the plaid pattern.



FIG. 28 is an exemplary representation of symbols for encoding information in a plot.



FIGS. 29A-29B are exemplary representations of symbols (such as those of FIG. 28) overlaid onto the plot of distances x and y (such as that of FIG. 27).



FIG. 30A is a diagram showing exemplary sets of strips in the first direction.



FIG. 30B is a plot illustrating the positions of a valid symbol (x1c, y1c) and an invalid symbol (x1a′, y1a′) in the x-y space.



FIGS. 31-33 are flowcharts of exemplary methods of decoding the plaid code described herein.



FIG. 34A is an exemplary start screen for an exemplary application interface.



FIG. 34B is an exemplary scan screen for the exemplary application interface. FIG. 34C is an exemplary timeline screen for the exemplary application interface.



FIG. 35 is an exemplary profile screen for the exemplary application interface.



FIGS. 36A-36D are exemplary article screens for the exemplary application interface.



FIG. 37A is an exemplary claim scan screen for the exemplary application interface. FIG. 37B is an exemplary confirmation screen following the scanning of an eligible article (referred to as a “Lookable”).



FIGS. 38A-38B are exemplary profile edit screens for the exemplary application interface.



FIG. 39 is a shared song screen for the exemplary application interface.



FIG. 40 is a connection screen for the exemplary application interface.



FIGS. 41A, 41B, and 41C illustrate an enhanced embodiment of the present invention for decoding fabric patterns in which repetition of the pattern is not strictly necessary and, when there is repetition, the repeated units can be arbitrarily positioned with respect to one another;



FIG. 41A illustrates a fabric in which there has been embedded, in two distinct dimensions, a unit of the pattern in accordance with an embodiment of the present invention;



FIG. 41B illustrates another embodiment of a fabric in which the pattern has been altered so that the two distinct components are presented in two distinct horizontal rows and yet the pattern remains decodable;



FIG. 41C illustrates yet another embodiment of a fabric in which the pattern has been further altered so that the components are presented in a manner having an arbitrarily oriented repetition and yet the pattern remains decodable;



FIG. 42 is a logical flow diagram showing how processing of image data from the fabric pattern is achieved in accordance with the embodiments of FIGS. 41A, 41B, and 41C wherein repetition of the pattern is not necessary and, when there is repetition, the repeated units can be arbitrarily positioned with respect to one another;



FIG. 43 illustrates processes 4204, 4206, and 4208 of FIG. 42, in which the image is converted to gray-scale, a portion of the converted image is chosen for analysis, and then sliced into sub-images;



FIG. 44 illustrates processes 4210, 4212, 4214, 4216, 4218, 4220, and 4222, in which parallel processing, carried out for each sub-image, involves locating edge boundaries, attempting to map the boundaries to valid codes, and, if the mapping is successful, decoding the pattern into a weft or warp symbol as the case may be, validating the resulting symbol for closeness of fit, and further validating the resulting symbol against design rules;



FIGS. 45A through 45E illustrate examples of augmented reality experiences provided through graphical elements (e.g., names, avatars, text) overlaid on image frames based on codes identified in encoded articles, in accordance with embodiments of the present invention;


In FIG. 45A, the fabric of a chair is shown to have impressed thereon an encoded pattern, and in FIG. 45B, this code is used to produce an augmented reality experience in which an avatar and name associated with the code are caused to overlie the image of the pattern;



FIG. 45C shows that the avatar can follow the fabric pattern even when the wearer of the fabric is in motion;


In FIG. 45D, the fabric pattern is associated with a backpack and the avatar, name, and greeting are displayed over the pattern of the backpack.


In FIG. 45E, there are present three different backpacks, each with a different pattern, and the augmented reality system overlays on each distinct pattern a distinct name and avatar associated with the distinct pattern;



FIGS. 46A and 46B illustrate that, in different image sizes of the patterned backpack (e.g., owing to different distances between the smartphone camera and the backpack or different zoom settings of the smartphone camera), the augmented reality system overlays on the pattern a correspondingly scaled name and avatar associated with the pattern;



FIG. 47 is a logical flow diagram illustrating processes for a basic encoded pattern system (in the first column) and for an augmented reality system (occupying both columns) in accordance with embodiments of the present invention;



FIG. 48 is an embodiment wherein a pattern is incorporated into a representational aesthetic environment;



FIG. 49 is a plaid-encoded article, in this case a backpack, that may be used in connection with an application executing on a mobile computing device to provide identification of a wearer of the article, based on the plaid code embedded in the article;



FIGS. 50 through 53 are representations of display screens on a mobile computing device executing an application providing identification of the wearer of the plaid-encoded article of FIG. 49, as well as providing a platform for social interaction, with an e-commerce extension;



FIG. 50 is a screen displayed to a third-party user (who will later be identified as John Smith) of the application, after the application has recognized the wearer as Dr. Jane Doe based on the plaid code, and the user of the application has invoked the “share a coffee” functionality of the application, accessed by icon 31;



FIG. 51 is a screen displayed by the same application, this time executing on the phone of the wearer of the plaid-encoded article, in this case Dr. Jane Doe, after the user in FIG. 50 has invoked the “share a coffee” functionality of the application;



FIG. 52 is a screen displayed by the application executing on the same phone as in FIG. 51, after Dr. Jane Doe has graphically invoked the message icon 41 of FIG. 51, informing Dr. Doe in text region 51 that John Smith has shared a coffee with her, and providing graphical button 52 by which she can redeem the shared coffee;



FIG. 53 is a screen displayed by the application executing on the same phone as in FIG. 51, after Dr. Jane Doe has graphically invoked the graphical button 52 to redeem the shared coffee;



FIGS. 54A, 54B, and 54C illustrate embodiments of the present invention wherein a set of fiber transmitters is embedded in a fabric of a wearable article, wherein the transmitters operate in visible or invisible wavelengths;


In FIG. 54A, the set of fibers is embedded in the fabric of a hat;


In FIG. 54B, the set of fibers is embedded in the fabric of an outer garment;


In FIG. 54C, there is provided detail of a light-emitting portion of the fiber in FIG. 54B;



FIG. 55 is a block diagram of logical flow used in decoding data transmitted by a set of fiber transmitters embedded in a fabric of a wearable article in accordance with an embodiment of the present invention;



FIG. 56 is a block diagram of logical flow used in a tracker process in the logical flow of FIG. 55;



FIG. 57 is a schematic diagram showing a cloth-covered shank button, in accordance with one exemplary embodiment;



FIG. 58 is a schematic diagram showing two alternative configurations for a cloth-covered button, in accordance with various alternative embodiments;



FIG. 59 shows exemplary shanks and corresponding rivets, in accordance with one exemplary embodiment;



FIG. 60 shows an exemplary shank cap that mates with the shank of FIG. 59, in accordance with one exemplary embodiment;



FIG. 61 is a schematic diagram showing an exemplary shank cap onto which a number of LEDs have been placed, in accordance with one exemplary embodiment;



FIG. 62 is a schematic diagram showing three exemplary cloth-covered shank buttons of the type that can be formed as described herein;



FIG. 63 is a schematic diagram showing a hat incorporating a button of the type described herein, in accordance with one exemplary embodiment; and



FIG. 64 is a photograph of an obfuscated coded pattern, in accordance with one exemplary embodiment.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Definitions. As used in this description and the accompanying claims, the following terms shall have the meanings indicated, unless the context otherwise requires:


A “plaid code” is a two-dimensional plaid pattern, encoding information in a fabric article, based in part on geometry of strips incorporated into the pattern, as explained in paragraphs 186-209 and 217-227 of the present patent application and in paragraphs 158-181 and 189-199 of the Prior Application.


A “representational aesthetic environment” is a graphical rendering (as in the case of art that is “representational”) that evokes objects that exist in the physical world.


A “fabric” is a flexible material that is made of woven or knitted yarn or threads or filaments or of felted fibers.


A “set” includes at least one member.


The “exposed surface” of an article is a portion of the entire surface of an article that is normally visible to an observer when the article is in normal use. In cases where the article is a garment worn by a human, the exposed surface of the article includes the front, back, and sides of the garment, but only those portions thereof that are normally visible to an ordinary observer when the garment is worn. (For this definition, we assume that the wearer of the article is standing and that the observer has an opportunity to walk 360 degrees around the wearer, and is approximately the same height as the wearer.) In a case where the garment is a coat, for example, the exposed surface of the garment includes the front, back, and sides of the coat, but excludes (a) the under-arm region of the coat, (b) the contiguous portion of the sleeve of the coat that faces the coat when the arms of the wearer of the coat are hanging straight down, and (c) the portion of the side of the coat that is covered when the arms of the wearer of the coat are hanging straight down. The exposed surface of the coat also excludes, for example, the entire interior of the coat.


A pattern encodes, in an article, an identification code in a “large format” wherein each item of information content in the identification code is represented in the pattern by a set of attributes wherein each attribute has a minimum dimension of 1 mm.


A pattern that is “configured to be readable and decodable by a mobile computing device” need not in practice actually be read and decoded by the mobile computing device as long as the pattern is capable of being read and decoded by the mobile computing device. Thus, in some applications, equipment other than a mobile computing device can be used to read and decode the pattern, such as a computer coupled to a digital imaging device having a telephoto lens.


A specific article of fabric is “contextually recognizable” if a mobile computing device having a typical camera with a resolution of at least 8 megapixels can read, while the article is in normal use and at a distance of at least 2 m from the mobile computing device, a pattern on the article that enables identifying the article in a manner that distinguishes the specific article from other articles.


A “mobile computing device” is any device selected from the group consisting of a smartphone, a tablet computer, and a portable computer, such device including a camera.


A pattern on an article of fabric encoding a unique identification code is “not discernable” to an ordinary unaided human observer if (a) this observer is unable to see any features of the pattern (as, for example, when the pattern exists only in the infrared region of the spectrum) or (b) features of the pattern do not appear to this observer as encoding information (as, for example, when the pattern is visible to this observer but appears to be either random or regular).


A “computer process” is the performance of a described function in a computer using computer hardware (such as a processor, field-programmable gate array or other electronic combinatorial logic, or similar device), which may be operating under control of software or firmware or a combination of any of these or operating outside control of any of the foregoing. All or part of the described function may be performed by active or passive electronic components, such as transistors or resistors. In using the term “computer process” we do not necessarily require a schedulable entity, or operation of a computer program or a part thereof, although, in some embodiments, a computer process may be implemented by such a schedulable entity, or operation of a computer program or a part thereof. Furthermore, unless the context otherwise requires, a “process” may be implemented using more than one processor or more than one (single- or multi-processor) computer.


To “impress a pattern” onto fabric includes establishing a pattern on the fabric by weaving or knitting, applying the pattern to the fabric by printing or embossing or other means, and adhering to the fabric a decal having the pattern. To “impress a pattern” onto a tangible item includes establishing a pattern on the tangible item by weaving or knitting (when compatible with the nature of the item), applying the pattern to the item by printing or embossing or other means, and adhering to the item a decal having the pattern.


A “leading strip” of a repeatable unit of a pattern impressed on a tangible item is a strip having spatial properties that mark the presence of the repeatable unit on the item. The leading strip is used to identify a set of associated data strips. Although a leading strip can be considered to fall at the boundary of the repeatable unit, in fact, it is valuable, for decoding purposes, to consider that the actual position of the leading strip in the repeatable unit is arbitrarily located. The reason for this conceptualization is to address a circumstance wherein the leading strip cannot conveniently be recognized at a boundary of the unit.


A “color” of a pattern refers to a characteristic spectral content of the pattern involving electromagnetic energy in any or all of the visible, ultraviolet, or infrared spectra, and any other spectra of electromagnetic energy, in which the energy is subjected by the pattern to a process selected from the group consisting of reflection, absorption, transmission, and combinations thereof.


In the case of “a stripe defined by a first transition edge from a first color to a second color and a second transition edge from the second color to a third color,” the third color need not be distinct from the first color. In such a case, the stripe would have a width defined by the second color and appear against a background defined by the first color. On the other hand, the first color can also be distinct from the third color.


A “tangible item” includes any item having a physical existence, including an item of fabric, a display of a computing device, a billboard, a pamphlet, a briefcase, a backpack, a messenger bag, a piece of luggage, etc.


A unit of a pattern is “repeatable” when (i) the unit includes a first set of components of which each component includes a leading strip and a set of associated data strips conforming to a symbol constellation distinct to such set of components and (ii) further instances of the set of components may optionally be present, but, if present, need not be oriented in the same direction or have the same overall size as the first set of components.



FIG. 1 is a diagram of an exemplary embodiment of a fabric-based article of clothing 102 having a pattern 104 and a mobile computing device 106 interacting with the article of clothing 102. (In a manner similar to an article of clothing, other tangible items may be fabric-based and may similarly have such a pattern. Furthermore the tangible items need not be fabric-based.) The article of clothing 102 is associated with a unique identification code 108 (the “looks_id”). The unique identification code 108 is embedded in the article 102 in the form of a pattern 104 that can be optically read by the mobile computing device 106. The number of unique identification codes that can exist is a function of the number of bits in the code. For example, for a code with n=32 bits, there are 2^32 ≈ 4.3 billion unique combinations. For a code with n=64 bits, there are 2^64 ≈ 1.8×10^19 unique combinations. Note that the pattern 104 on each article of clothing 102 is encoded with a unique identification code 108. This unique identification code 108 can be linked to user-determined information corresponding to the user or owner of the article of clothing 102. The user information can include user name, user picture, user email, etc.
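The relationship between code length and the number of available identification codes can be checked with a short calculation; the function name below is illustrative only:

```python
# Number of distinct identification codes available for an n-bit code is 2**n.
def unique_codes(n_bits: int) -> int:
    return 2 ** n_bits

# The two bit widths discussed above:
assert unique_codes(32) == 4_294_967_296                # about 4.3 billion
assert unique_codes(64) == 18_446_744_073_709_551_616   # about 1.8 x 10^19
```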



FIG. 2 is a diagram of an exemplary embodiment of an article of clothing 102 having a pattern 202 embedded with a unique identification code in form of a barcode. This barcode subscribes to the Code 128 barcode symbology. Note that this barcode pattern 202 is distributed over at least 10% of the exposed surface of the article of clothing 102, allowing the pattern 202 to be scanned by a mobile computing device 106 at a distance to return information about the user or wearer of the article of clothing 102. In embodiments, for a pattern 202 to be resolved by the camera of mobile computing device 106, the minimum dimension of the attributes of the barcode is at least 1 mm. For example, the bars of the barcode pattern 202 have widths of at least 1 mm (drawing in FIG. 2 not to scale). In other embodiments, for a pattern 202 to be resolved by the camera of mobile computing device 106, the minimum dimension of the attributes of the barcode is at least 2 mm. In yet other embodiments, for a pattern 202 to be resolved by the camera of mobile computing device 106, the minimum dimension of the attributes of the barcode is at least 3 mm.



FIG. 3 is a diagram of an exemplary embodiment of a system network 300 including an article of clothing 102, a mobile computing device 106, and a system server 302. The article of clothing 102 is coupled to a first user 304 (in this case, its owner) as will be further described below. A second user 306 (“app user”) can use an application, such as a fabric identification application, executing on their mobile computing device 106 to scan the article of clothing 102 and decode its unique identification code. The application, typically connected to the Internet, is connected with a server system 302 (“Central Authority”) to retrieve information that the first user 304 has previously determined that he or she wants to share. In some embodiments, the functionalities of the fabric identification application may be packaged as an “add-on” to third-party applications such as Facebook, Instagram, Snapchat, and the like.


Registration of Unique Identification Codes


FIG. 4 is a diagram of an exemplary embodiment of a process 400 for embedding a unique identification code in a pattern 402 in the article of clothing 102 at the point of manufacturing 403. As first step 404, a manufacturer 403, registered with the Central Authority server system 302, sends a request to the server system 302 for a new unique identification code. In a second step 405, the server system 302 generates a new unique identification code (in this example, an eight-digit code “43229921”) and logs the request event in a database 406. In step 408, the server system 302 sends the newly generated unique identification code to the manufacturer 403. In some embodiments, this unique identification code can be sent with other information, such as instructions on how to embed the code into a pattern on the article of clothing 102 that can be optically read, specifications for the pattern, and/or a pattern to encode the accompanying unique identification code. In step 410, the manufacturer 403 creates an article of clothing 102 with the unique identification code embedded in a pattern 402. The manufacturer 403 can repeat this process 400 for subsequent articles of clothing. Each time the manufacturer 403 issues a request, the Central Authority server system 302 returns a new unique identification code. In other embodiments, for increased efficiency, the manufacturer 403 can request a batch of unique identification codes. Each unique identification code can then be embedded in one article of clothing.
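The request/issue loop of FIG. 4 can be sketched as follows. This is an illustrative model only; the `CentralAuthority` class, its in-memory log, and the eight-digit code format are assumptions drawn from the example code "43229921" above, not a definitive implementation of the server system 302:

```python
import secrets

class CentralAuthority:
    """Minimal stand-in for the server system 302: issues unique codes and logs requests."""

    def __init__(self):
        self.issued = set()   # stand-in for database 406
        self.log = []         # request-event log

    def new_code(self, manufacturer_id: str) -> str:
        # Draw random eight-digit codes until an unused one is found.
        while True:
            code = f"{secrets.randbelow(10**8):08d}"
            if code not in self.issued:
                self.issued.add(code)
                self.log.append((manufacturer_id, code))
                return code

    def new_batch(self, manufacturer_id: str, n: int) -> list:
        # Batch issuance, for the increased-efficiency case described above.
        return [self.new_code(manufacturer_id) for _ in range(n)]

authority = CentralAuthority()
batch = authority.new_batch("manufacturer-403", 5)
assert len(set(batch)) == 5   # every article receives its own code
```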



FIG. 5 is a diagram of an exemplary embodiment of a process 500 for transferring ownership of the article of clothing 102 from a first entity, such as the manufacturer 403, to a second entity, such as a distributor 501. In step 502, the first entity 403 sends the Central Authority server system 302 a notification of the transfer of ownership of the article of clothing 102. In step 504, the server system 302 logs the transfer of ownership into database 506. In step 508, the second entity 501 receives the ownership information from the server system 302. In some embodiments, the second entity 501 can first request the ownership information and then receive the requested information via step 508. Note that in a parallel or serial step 510, the first entity 403 can provide the one or more articles of clothing 102 to the second entity 501. In step 512, the process of transfer of ownership can continue until the article arrives at the point of sale 514.



FIG. 6 is a diagram of an exemplary embodiment of a process 600 of transactions taking place at the point of sale. In step 604, the seller 602 sends a sale authorization to the Central Authority server system 302. In step 606, the server system 302 logs the sale authorization in database 608 with respect to the unique identification code associated with the article of clothing 102. The server system 302 generates a key pair, Key A and Key B, corresponding to the unique identification code, stored in database 610. The keys can be, for example, generated using a cryptosystem such as RSA. In step 612, the server system 302 returns a first key, “Key B”, (while holding back the second key, “Key A”) to the seller 602. In step 614, Key B is provided to user 616 at the time of purchase or thereafter. In step 615, in parallel or in serial with step 614, the article of clothing 102 having a pattern 202 is provided to a user 616 at the time of purchase. Key B of the key pair allows the user 616 to prove ownership to the server system 302 and initiate registration of the article of clothing 102.
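The point-of-sale key handling can be modeled with a short sketch. The text describes an RSA key pair (Key A held by the server, Key B given to the buyer); purely for illustration, the sketch below substitutes a random secret for Key B and stores its SHA-256 digest in place of Key A, so the server can verify a presented Key B. The class and method names are assumptions:

```python
import hashlib
import secrets

class SaleRegistry:
    """Stand-in for the Central Authority's key handling at the point of sale."""

    def __init__(self):
        self.key_a_by_code = {}   # stand-in for database 610

    def authorize_sale(self, looks_id: str) -> str:
        # Generate Key B and retain only a digest of it server-side.
        key_b = secrets.token_hex(16)
        self.key_a_by_code[looks_id] = hashlib.sha256(key_b.encode()).hexdigest()
        return key_b  # returned to the seller, then handed to the buyer

    def prove_ownership(self, looks_id: str, key_b: str) -> bool:
        # The buyer proves ownership by presenting Key B.
        expected = self.key_a_by_code.get(looks_id)
        return expected == hashlib.sha256(key_b.encode()).hexdigest()

registry = SaleRegistry()
key_b = registry.authorize_sale("43229921")
assert registry.prove_ownership("43229921", key_b)
assert not registry.prove_ownership("43229921", "wrong-key")
```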



FIG. 7 is a diagram of an exemplary embodiment of a process 700 for a user 616 registering his or her purchase of the article of clothing 102 with the server system 302. After purchasing the article of clothing 102 (with Key B), the user 616 can use the key to prove ownership to the server system 302. In step 702, the user 616 may enter a web portal to provide Key B in addition to other information, such as the user details. The web portal can be under the control of the server system 302. User details can include user name, password, email, phone number, user preferences, user picture, and the like. At step 704, the server system 302 validates ownership of the article of clothing 102 by checking database 610 for the corresponding key information. Having validated the ownership, the server system 302 can associate the unique identification code (“looks_id”) with user input details at the web portal and add the user input details into a database 705, which can include when the ownership of the particular article of clothing starts and ends. For example, an owner of the article of clothing may sell or gift the article of clothing, at which time the ownership of the previous owner ends. Optionally, in step 706, the user 616, at the same time or another time, can link other applications to the server system 302. For example, the user 616 may authorize the server system 302 to interact with his or her social media (such as Facebook, Twitter, Snapchat, Instagram, etc.) application by inputting application specific data, such as username, links, etc. This application specific data can be stored in a database 708.


Scanning an Encoded Pattern on an Article of Clothing


FIG. 8 is a diagram of an exemplary embodiment of a process 800 for accessing authorized user information, encoded in an article of clothing 102, by a mobile computing device 106. In step 802, a user (the “regarding individual” 803) of the mobile computing device 106 scans the article of clothing 102 using an application, such as a fabric identification application, executing on the mobile computing device 106. For example, the regarding individual may see a person in a social setting, such as a bar or a conference, wearing a patterned shirt and access the application to scan the article having the pattern. In some embodiments, once the mobile computing device 106 has scanned the article of clothing 102, the server system 302 or application executing on the mobile computing device 106 can confirm to the regarding individual 803 (the user having the mobile computing device 106) that a valid pattern has been scanned. In some embodiments, the application itself may decode the embedded code and send the decoded information to the server 302. In step 804, the application requests authorized information about the user 616 by sending the decoded unique identification code to the server system 302. In step 806, the server system 302 checks one or more databases, to ensure the user 616 has authorized the release of his or her user information. Once validated, the server system can retrieve user-specific information from a social media database 708. In step 808, the server system 302 returns only the data that the user 616 has authorized for sharing.
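The authorization check in steps 806 and 808 can be modeled with a short sketch, covering both the authorized-release case and the "Not found" case of FIG. 9. The `UserDirectory` class, field names, and sample profile values are illustrative assumptions, not part of the disclosed server:

```python
class UserDirectory:
    """Illustrative model of the lookup in FIG. 8: the server releases only
    the fields the owner has authorized for sharing."""

    def __init__(self):
        self.profiles = {}   # looks_id -> (profile dict, set of authorized fields)

    def register(self, looks_id, profile, authorized_fields):
        self.profiles[looks_id] = (profile, set(authorized_fields))

    def lookup(self, looks_id):
        if looks_id not in self.profiles:
            return {"error": "Not found"}
        profile, authorized = self.profiles[looks_id]
        # Return only the data the owner has authorized for sharing.
        return {k: v for k, v in profile.items() if k in authorized}

directory = UserDirectory()
directory.register(
    "43229921",
    {"name": "Jane Doe", "email": "jane@example.com", "song": "Example Song"},
    authorized_fields={"name", "song"},   # email is withheld
)
assert directory.lookup("43229921") == {"name": "Jane Doe", "song": "Example Song"}
assert directory.lookup("00000000") == {"error": "Not found"}
```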


In some embodiments, the fabric identification application can be a module of another application, including social network applications such as Facebook, Twitter, Instagram, Snapchat, and the like. For example, the fabric identification module of a social network application, such as Facebook, can scan and process the pattern on the article of clothing 102 to read the embedded code in the pattern. The module can then send the decoded information to a server system connected to the specific network application (in this case, a Facebook server system). In some embodiments, the Facebook server can communicate with the server system 302 to request authorized information related to the article of clothing 102. If the user 616 has authorized information to be shared, the information can be transmitted back to the mobile computing device 106 of the regarding individual 803.


In some embodiments, the regarding individual 803 using the mobile computing device 106 gives the fabric identification application access to certain data from a social network account, such as his or her Facebook account. The fabric identification application can send, to the server system 302, the embedded code in the pattern of the article of clothing 102 together with data from the Facebook account of the regarding individual 803. The server system 302 can determine what data related to the article of clothing 102 to return to the mobile computing device of the regarding individual based on the specifics of the data from the Facebook account of the regarding individual 803.


The returned authorized data can be provided to a third-party application 810, such as Facebook, and in step 812, the user of the mobile computing device 106 may be redirected to the third-party application 810. In some embodiments, a third-party application 810 (such as Facebook) itself can be used for authentication (using standard third-party authentication methods). Optionally, in step 813, the scanned image of the article of clothing 102 can be processed, by an application specific to the server system 302 or by a third-party application 810 (in this example, Facebook).



FIG. 9 is a diagram of an exemplary embodiment of a process 900 for redirecting the application executing on the mobile computing device 106 in the case where user 616 does not authorize the sharing of user information. At step 902, the regarding individual 803 uses the mobile computing device 106 to scan the article of clothing 102 into an application executing on the mobile computing device 106. The scanned image of the article of clothing can be processed, at step 904, by an application specific to the server system 302 or by a third-party application 906 (in this example, Twitter). At step 908, the application can request, from the server system 302, the user information of user 616. At step 910, the server system 302 can check the authorization related to the unique identification code (“looks_id”) in database 708. If the user 616 has not previously provided authorization (for example, through a user interface or web portal connected with the server system 302 as described for FIG. 7), then, at step 912, a message from the server system 302 can be provided back to the application, such as “Not found” or “Not authorized to share user information”. In some embodiments, in addition to or instead of the “Not found” message, at step 914, the application may be redirected to retrieve manufacturer information and/or to an advertisement.


Examples of Encoded Patterns in Articles of Clothing


FIG. 10A shows a photograph of an exemplary embodiment of a pattern 1002 on an article of clothing 1004 in which a unique identification code is embedded in the form of a barcode. FIGS. 10B and 10C show front and back graphic representations of patterns 1006 and 1008 on the article of clothing 1004. Note that in this example, the pattern is distributed on both the front and back of the article of clothing 1004. In embodiments, the unique identification code is repeated within the encoded pattern so that the mobile computing device can successfully capture the pattern into the application from multiple angles relative to the article of clothing. In patterns 1006 and 1008, the barcode uses standard Code 39 encoding (for further details, see standard ANSI/AIM BC1/1995, Uniform Symbology Specification—Code 39). In this example, the red and blue colors in the bars of the code were added for aesthetic reasons and do not affect the information content. The information is encoded in the presence and width of dark bars and light spaces. The format of information content is 7 alpha-numeric characters (e.g. “ATZMLZB”). Each character can take one of 39 possible values and therefore the information content is equivalent to approximately 37 bits (log2(39^7) ≈ 36.99).


Note that this pattern 1006, 1008 is distributed over at least 10% of the exposed surface of the article of clothing 1004, allowing the pattern 1006, 1008 to be scanned by a mobile computing device at a distance to return information about the user or wearer of the article of clothing 1004. In embodiments, for a pattern to be resolved by the camera of mobile computing device, the minimum dimension of the attributes of the barcode is at least 1 mm. In other embodiments, for a pattern to be resolved by the camera of mobile computing device, the minimum dimension of the attributes of the barcode is at least 2 mm. In yet other embodiments, for a pattern to be resolved by the camera of mobile computing device, the minimum dimension of the attributes of the barcode is at least 3 mm. Note that the width of the narrow bars of the pattern 1006, 1008 in FIG. 10A is approximately 3 mm.


In some embodiments, other standard one-dimensional barcode symbologies can be used, such as Code 128, UPC, EAN, etc. Further, barcodes can be customized such that the widths of bars and spaces could be chosen to improve the aesthetics of the garment. For example, the spaces between the bars can be made wider to create a more subtle design that looks less like a standard barcode. In some embodiments, the barcodes can be further customized such that the color of the bars and spaces can be used to encode information.


Note that the patterns may be applied to the fabric of the articles of clothing described herein in a variety of manners, including by weaving into the fabric in a manner so as to be integral with the fabric, by printing, and/or by applying a carrier in which the pattern has been embedded to the fabric by means of an adhesive. Note also that articles of clothing can include standard forms of dress, including shirts, jackets, pants, shorts, dresses, skirts, outerwear, and accessories such as hats, bags, umbrellas, and the like.



FIG. 11 is a graphical representation of an exemplary embodiment of an article of clothing 1102 with pattern 1104 that encodes information in the orientation(s) of a set of symbols. Two types of symbols (in this case, hearts and droplets) are arranged on an 8-by-8 rectangular grid. Each symbol has 4 possible orientations: for example, the narrow tip of the heart symbol can be pointing left, right, top, or bottom. This corresponds to two bits of information per symbol (for example: “00” is right, “01” is top, “10” is left, and “11” is bottom). Since there are 64 symbols in the 8-by-8 grid, a total of 64×2=128 bits of information can be encoded in this fashion. Note that only the orientation of the symbols encodes information. The color and shape of each symbol can be chosen to improve the aesthetics of the garment. In some embodiments, the color and shape of the symbols themselves can also be used to encode information.
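The orientation scheme of FIG. 11 can be sketched as a simple encoder/decoder pair; the orientation names and two-bit mapping below follow the example in the text, while the function names are assumptions:

```python
# Each symbol's orientation carries two bits ("00" right, "01" top,
# "10" left, "11" bottom); an 8-by-8 grid of symbols carries 128 bits.
ORIENTATIONS = ["right", "top", "left", "bottom"]   # list index = 2-bit value

def encode(bits: str) -> list:
    assert len(bits) == 128 and set(bits) <= {"0", "1"}
    # Consume the bit string two bits at a time, one symbol per pair.
    return [ORIENTATIONS[int(bits[i:i + 2], 2)] for i in range(0, 128, 2)]

def decode(grid: list) -> str:
    # Map each observed orientation back to its two-bit value.
    return "".join(f"{ORIENTATIONS.index(o):02b}" for o in grid)

payload = "01" * 64          # any 128-bit identification code
grid = encode(payload)       # 64 symbol orientations for the 8-by-8 grid
assert len(grid) == 64
assert decode(grid) == payload
```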


In some embodiments, the pattern in or on the article of clothing can include an error correction code (ECC), such as a forward error correction (FEC) code. This can ensure that any errors that may occur during a scan of the pattern by the mobile computing device can be detected and corrected. FEC can be used to encode the unique identification code in the pattern in a redundant manner to limit errors in scanning and decoding. Further, in some embodiments, the pattern may be repeated to ensure that scanning is more reliable in less than ideal situations, such as when the camera of the mobile computing device can scan only a portion of the pattern on the article of clothing. In the example used in FIG. 11, a unique identification code may be repeated in the top and bottom four rows of the 8-by-8 grid pattern to ensure robust scanning. In embodiments, the 8-by-8 grid pattern can be repeated on the exposed surface of the article of clothing so that the mobile computing device can successfully capture the pattern into the application from multiple angles relative to the article of clothing.
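The text does not specify a particular FEC scheme; the minimal sketch below uses triple repetition with majority-vote decoding, simply to illustrate how redundancy lets a scan survive bit errors:

```python
def fec_encode(bits: str) -> str:
    # Repeat every bit three times.
    return "".join(b * 3 for b in bits)

def fec_decode(coded: str) -> str:
    # Majority vote within each group of three received bits.
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append("1" if triple.count("1") >= 2 else "0")
    return "".join(out)

code = "10110010"
transmitted = fec_encode(code)
# Flip one bit to simulate a scanning error; decoding still recovers the code.
corrupted = transmitted[:5] + ("0" if transmitted[5] == "1" else "1") + transmitted[6:]
assert fec_decode(corrupted) == code
```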



FIG. 12 is a graphical representation of an exemplary embodiment of pattern 1202 encoding information in the form of font modification. For example, logo text (“affoa” provided in gray font) is displayed on a garment. Within each letter of the text, there are five white squares positioned in five areas 1204a-1204e along the letter. Each white square can have four positions relative to the area of the letter in which it is embedded. The four possible positions are shown here by the larger black square around each white square. The white square could be in the top right corner of the black square (encoding bits “00”), in the top left corner (encoding “01”), in the bottom left corner (encoding “10”), or in the bottom right corner (encoding “11”). Each white square can thus encode two bits, and since there are 5 such white squares in a letter, a letter can encode 10 bits. The four letters that carry encoded information (“a”, “f”, “f”, and “a” in this example) can carry a total of 40 bits of information. Note that black squares are shown here only for ease of understanding the concept. In some embodiments, the area shown in black can have the same color as the rest of the letter (gray) while the white squares are visible in contrast. In this type of encoding, the processor analyzing the image can be pre-programmed to recognize the four possible positions available to a white square in each of the five areas 1204a-1204e in each letter of the alphabet. This is specific to the exact shape of the letter. This structure can be stored in the processor in the form of an enhanced font that defines both the shape of each letter and the possible positions of the information encoding squares. Note that the processor may be in the mobile computing device or in the server system.



FIG. 13 is an exemplary embodiment of a QR code that can be used to encode a unique identification code in a pattern on an article of clothing. QR Code is established as an international standard (ISO/IEC 18004). The Version 1 QR code, shown in this example, can encode 128 bits when using error correction level M (up to 15% damage to the QR code).



FIG. 14 illustrates front and back views of an exemplary embodiment of an article of clothing 1402 having encoded information in the form of a two-dimensional pattern. Note that in this example, the pattern covers at least 30% of the exposed surface of the article of clothing 1402. The pattern of the lighter color (beige in this instance) triangles 1404 provides a reference two-dimensional grid of locations for the darker color triangles 1406. In this example code, there is a fixed or pre-programmed set of locations where darker color triangles are found. Note that, in this embodiment, the size of each triangle is approximately 25 mm by 25 mm. Each darker color triangle 1406 can have two orientations (90-degree angle pointing towards bottom-right or top-left) and thus can each encode one bit of information. In some embodiments, the color of the darker color triangles 1406 can also encode information. For example, two additional bits of information per triangle position can be encoded if four possible colors are used.



FIGS. 15A-15C are graphical representations of exemplary embodiments of patterns encoding information in the form of color content. In each example, a pre-defined standard set of distinguishable colors is first chosen. Each color corresponds to a bit. If the bit equals 1, then the corresponding color is present on the article of clothing. If the bit equals 0, then the color is absent from the article. A subset of colors that need to be present for each unique identification code is determined and an article is designed with a pattern that includes the “present” colors and none of the “absent” colors. Note that, in this example, the spatial location of each color is not constrained, which can provide flexibility to the designer to optimize aesthetics of the article of clothing. Decoding the information includes extracting a histogram of the colors present in an image of the garment. In some embodiments, to make decoding more robust to noise that may interfere with the decoding, the set of colors can be chosen to have a significant amount of contrast relative to one another. In particular, the colors can be chosen based on a rectangular grid of points in the three-dimensional RGB color space. In the examples shown in FIGS. 15A-15B, 14 out of a possible 27 colors are present (corresponding to a 27-bit binary code with 14 “1”s and 13 “0”s). These colors can be spatially arranged in many ways. FIGS. 15A-15B show colors arranged in rectangular shapes with different aspect ratios 1502 and 1504. FIG. 15C shows colors used in a non-geometric or free-form arrangement 1506.
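The histogram-based decoding of the color-presence scheme can be sketched as follows. The 3x3x3 grid values in RGB space and the nearest-neighbor matching rule are assumptions for illustration, consistent with the rectangular-grid approach described above:

```python
from itertools import product

# 27 reference colors on a 3x3x3 grid in RGB space; each maps to one bit
# of a 27-bit code.
PALETTE = list(product((0, 128, 255), repeat=3))

def nearest(color):
    # Snap an observed RGB color to its nearest reference color; the wide
    # spacing of the grid makes this robust to moderate noise.
    return min(PALETTE, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, color)))

def decode(observed_colors) -> str:
    # Build the set of reference colors present in the image, then emit one
    # bit per palette entry: 1 if present, 0 if absent.
    present = {nearest(c) for c in observed_colors}
    return "".join("1" if p in present else "0" for p in PALETTE)

# A garment showing noisy versions of three reference colors:
observed = [(3, 2, 250), (130, 126, 0), (255, 250, 255)]
code = decode(observed)
assert len(code) == 27
assert code.count("1") == 3
```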



FIG. 16 is a graphical representation of an exemplary embodiment of an article of clothing 1602 having a pattern 1604 encoding information in the positions of a set of symbols. In this example, each circular dot 1605 can have four possible positions 1606. These positions are shown for one of the dots as three dashed circles where the dot could be located in addition to its current position (solid gray dot). In some embodiments, the size and color of each dot 1605 can be used to encode information in addition to the position.
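Since each dot in FIG. 16 can occupy one of four positions, each dot carries two bits. The offset vectors and bit ordering below are illustrative assumptions; the patent specifies only that four positions are available.

```python
# Sketch of the symbol-position encoding of FIG. 16: each dot can sit in
# one of four candidate positions around its nominal grid point, so each
# dot carries two bits. Offsets and bit order are assumptions.

OFFSETS = [(0, 0), (1, 0), (0, 1), (1, 1)]  # four possible positions

def decode_dot(nominal, observed):
    """Map a dot's observed offset from its nominal grid point to 2 bits."""
    offset = (observed[0] - nominal[0], observed[1] - nominal[1])
    index = OFFSETS.index(offset)
    return (index >> 1) & 1, index & 1

bits = decode_dot((10, 10), (11, 11))   # offset (1, 1) -> index 3
```

As the text notes, dot size and color could contribute further bits per symbol on top of these two positional bits.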



FIGS. 17A-17B are graphical representations of an exemplary embodiment of a pattern to encode a unique identification code. The difference between FIGS. 17A and 17B illustrates how specially-tailored spectral content in the garment can be used to enhance code readability with a mobile computing device, such as a smartphone, while keeping the appearance of the code more subtle to the human eye. Human eyes and smartphone charge-coupled-device (CCD) cameras have similar but different sensitivities to different wavelengths of light. For example, a CCD camera is more sensitive than the human eye to wavelengths around 700 nm, which are at the border between the visible and near-infrared ranges. If an article of clothing contains a pattern of bright and dark regions at the 700 nm wavelength, such a pattern may not be easily visible to the human eye. A CCD camera, however, can detect a relatively larger contrast in the pattern. In embodiments, this contrast effect can be further enhanced by processing in the mobile computing device or in the server system.


Security Features


FIGS. 18A-18B are diagrams of exemplary embodiments of a network 1800 including a user 1802 (“owner”) of an article of clothing 1804, an app user 1806 (“regarding individual”) having a mobile computing device 1808, and a server system 1810 (“Central Authority”). The interaction in the physical world (the app user 1806 scanning the owner's 1802 article of clothing 1804) can trigger an interaction with the server system 1810. The chain of events after an initial scan can depend on settings chosen by the owner 1802 and the app user 1806. FIG. 18A shows a first scenario in which the owner 1802 can determine application settings such that he or she will share the song he or she is currently listening to with any app user 1806 who is scanning his or her article of clothing 1804. FIG. 18B shows a second scenario in which the owner 1802 can determine settings such that he or she will not share any information with any app user 1806. In another scenario, the owner 1802 can determine settings such that he or she will share the song he or she is listening to only if the app user 1806 also has this particular setting enabled in the app user's instance of the application.
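The three sharing scenarios above amount to a small decision function on the server. A minimal sketch follows; the setting names ("share_song", "require_reciprocity") are hypothetical, since the text describes the behavior rather than a concrete settings schema.

```python
# Sketch of the owner-controlled sharing logic of FIGS. 18A-18B.
# Setting keys are illustrative assumptions.

def shared_content(owner_settings, app_user_settings):
    """Decide what, if anything, the server reveals after a scan."""
    if not owner_settings.get("share_song"):
        return None                                  # FIG. 18B: share nothing
    if owner_settings.get("require_reciprocity"):
        # Third scenario: share only if the scanning user has the
        # same setting enabled in his or her instance of the app.
        if not app_user_settings.get("share_song"):
            return None
    return owner_settings.get("current_song")        # FIG. 18A: share the song

song = shared_content({"share_song": True, "current_song": "Track 7"}, {})
```

The Central Authority would evaluate this logic per scan, so the same physical code yields different outcomes depending on both parties' settings.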



FIG. 19 is a diagram of an exemplary embodiment of a network including a user 1802 (“owner”) of an article of clothing 1804, an app user 1806 (“regarding individual”) having a mobile computing device 1808, and a server system 1810 (“Central Authority”). The interaction between the owner 1802 and the app user 1806 can be controlled by the server system 1810 by using the geolocations of the mobile computing devices 1902 and 1808 of the owner 1802 and the app user 1806, respectively. For example, the owner 1802 can determine a setting to share information with an app user 1806 only if the app user 1806 has a geolocation feature enabled on the app user's mobile computing device 1808 and the app user 1806 is located in close proximity to the owner 1802 (e.g. within a distance of 10 meters or 30 meters) as determined by their geolocations. In this example, the server system 1810, before sending a reply message to the mobile computing device of the regarding individual, can determine if the app user should be denied permission to access user information by collecting the geolocation data from each mobile computing device 1902 and 1808 and processing the location data to determine if the app user is within a predetermined distance (either set by the owner of the article of clothing or the server system). If the app user is at a distance greater than the predetermined distance, the server system 1810 can send a message to the app user's instance of the application executing on mobile device 1808 that the owner 1802 of the article of clothing 1804 denied permission to access user information. This measure can, for example, prevent an interaction that is triggered by an app user 1806 scanning a photograph of the owner's article of clothing 1804 when the app user 1806 is in reality far away from the owner 1802.
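The proximity gate described above can be sketched with the standard haversine great-circle formula. The formula itself is well established; the 10-meter threshold is one of the example values from the text.

```python
import math

# Sketch of the server-side proximity check of FIG. 19: the reply is
# sent only if the scanning user's device is within a predetermined
# distance of the owner's device. Uses the standard haversine formula.

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def allow_access(owner_loc, scanner_loc, max_distance_m=10):
    """Grant access only if the two devices report nearby geolocations."""
    return haversine_m(*owner_loc, *scanner_loc) <= max_distance_m

nearby = allow_access((40.7128, -74.0060), (40.71281, -74.00601))
```

A scan of a photograph taken far from the owner would fail this check, which is exactly the attack the geolocation comparison is meant to prevent.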


Exemplary User Interfaces


FIGS. 20A-20B show screenshots of an exemplary embodiment of an application executing on a mobile computing device. FIG. 20A is an application interface 2002 showing an image of an article of clothing 2004 having a pattern 2006 similar to that shown in FIG. 10 as seen by the camera of a mobile computing device while the device is moved around to see different parts of a room. After the pattern 2006 is identified, the code “AF2TZLA” is decoded from the article of clothing 2004. The app user can then choose to tap the “Look!” button 2008 on the interface 2002. This action changes interface 2002 to the interface 2010 as shown in FIG. 20B. In FIG. 20B, interface 2010 shows the profile page for the owner of the article of clothing 2004. The profile page can include the name 2012 of the owner, a picture 2014 of the owner, etc. The various types of information that the owner has chosen to share can be linked to the icons 2016 at the bottom of the profile page.



FIG. 21 is a diagram of an exemplary embodiment of architecture of the fabric identification application in the form of a storyboard. There are two major branches that separate from the main window 2102. The top branch 2104 allows a user that owns at least one article of clothing with an encoded pattern, for example, to manage the customization of owned articles in view 2106. In view 2108, the owner can determine the information to associate with each article of clothing 2110 in their collection and adjust settings to control the information 2112 shared with different types of users. The bottom branch 2114 allows a user to manage the scanning and decoding of an article of clothing. In view 2116, when a user scans an article of clothing, a query is sent to the Central Authority. Based on current settings of the owner of the article of clothing, a set of authorized information is displayed. In view 2118, the user can then choose to access that set of information in more detail. In some embodiments, the set of information can lead to a third-party application, such as a social media site, a music player, video player, and the like.


Server-Based Method for Identifying a Specific Article of Clothing


FIG. 22 is a flowchart of an exemplary embodiment of a server-based method for identifying a specific article of clothing in a social context. In step 2202, the server system receives a message from the fabric identification application executing on the mobile computing device. The message can include identity data corresponding to a pattern on a specific article of clothing that was captured by the mobile computing device. The pattern encodes a unique identification code associated with the specific article. In step 2204, the identity data is processed by the server system in relation to a database system. The database system stores identification codes for a set of articles of clothing in relation to corresponding user information. The identity data is processed to identify a specific user associated with the specific article of clothing. In step 2206, the server system sends a reply message to the application executing on the mobile computing device that, if the user of the article of clothing authorizes, includes information about the user of the article of clothing.
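Steps 2202-2206 can be sketched as a single request handler. An in-memory dictionary stands in for the database system, and the record fields and example code "AF2TZLA" (the code shown in FIG. 20A) are illustrative.

```python
# Minimal sketch of the server-side flow of FIG. 22: look up the decoded
# identification code in a database of code -> user records and return
# profile information only if the owner has authorized sharing.

DATABASE = {
    "AF2TZLA": {"name": "Alice", "authorized": True},
    "QX9PLRD": {"name": "Bob", "authorized": False},
}

def handle_scan(identity_data):
    """Step 2202: receive identity data; step 2204: resolve the user;
    step 2206: build the reply message, subject to authorization."""
    record = DATABASE.get(identity_data)
    if record is None:
        return {"status": "unknown_code"}
    if not record["authorized"]:
        return {"status": "denied"}
    return {"status": "ok", "user": record["name"]}

reply = handle_scan("AF2TZLA")
```

In a deployed system the dictionary lookup would be a database query, and the authorization check would consult the owner's per-viewer settings discussed earlier.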


Other Embodiments

In some embodiments, the unique identification code can be linked to third party-defined content provided by marketers, campaigns, organizations, institutions, and the like. For example, the content that appears on the mobile computing device of a regarding individual as a result of scanning a pattern on an article of clothing can be an advertisement to a product or service. In some embodiments, the unique identification code can be linked to a plurality of content items including user-defined content and third party-defined content. For example, a first portion of the interface to the fabric identification application can be reserved for the profile of the specific user wearing the article of clothing with the scanned pattern while a second portion can be dedicated to an advertisement.


In some embodiments, the network includes a first mobile computing device of the specific user wearing the article of clothing and a second mobile computing device of the regarding individual. The network can allow unique isomorphic exchange of customized messages between the first mobile computing device and the second mobile computing device.


In some embodiments, the server system 302 can track usage by the regarding individual of the fabric identification application or module. This usage can be logged in a database by the server system. The log can have usage information such as when the regarding individual scanned a specific user's article of clothing. The specific user can view the log of scans of his or her article of clothing. In some embodiments, the regarding individual may not be able to view decoded information related to the scanned article of clothing unless the specific user has released his or her information. For example, the specific user can review the log after some time of wearing his or her article of clothing in a social setting and determine if each regarding individual can view the decoded information encoded in the pattern of the article of clothing.


Example Use Cases: Encoded Apparel in Social Settings

In an exemplary embodiment, codes that are embedded within clothing can be decoded differently based on the profile type of the app user that scans the code. For example, such a code may be particularly suited for a conference or tradeshow environment. A user wearing an encoded shirt or jacket can be scanned by people in a conference, such as professional contacts, friends, or members of the public. The user can pre-set the types of information that each of these categories of people may see. For example, the user may categorize the conference attendee list (or a subset thereof) as “professional contacts”. When a professional contact scans the encoded shirt or jacket, he or she may be linked to the user's LinkedIn page, prompted with a meeting invite, provided with a booth location at a trade show, and the like. In an embodiment, the user may link LinkedIn contacts to a “professional contacts” list in his or her profile so that anyone from the user's LinkedIn community can view the same material. Similarly, the user can add lists of friends or friend groups which, upon decoding the encoded clothing, may be privy to the user's Facebook profile, music playlists, and the like. For a member of the public, the user may link to a website, advertisement, blog, charity organization, and the like. Thus, in the above described example, the same code can be decoded in three different ways depending on who scans the code.


In some embodiments, the profile of the scanning user may be automatically provided to the server that is coupled to the scanning user's mobile device. The server may then determine to which category the scanning user's profile belongs based on, for example, a lookup table (such as the conference attendee list in the above example). The categories may be pre-set by the user of encoded clothing as described above or may be determined by the server. In some embodiments, the user of the encoded clothing may receive a notification on his or her mobile device to categorize the profile of a particular scanning user. In some embodiments, once a profile of a scanning user has been categorized (for example, into an “acquaintance” category), the user of the encoded clothing can re-categorize (or be prompted to re-categorize) the profile of the scanning user to another category (such as “close friend” category).


Returning to the example of a conference or tradeshow environment, the user may be prompted to register his or her encoded clothing to a particular profile at the start or registration phase of the conference. In an exemplary embodiment, even if a user does not have encoded clothing, his or her profile may be registered at a conference to facilitate social connections between attendees. For example, the registrations may be used in a game at a gathering or conference, such as a game to encourage attendees to scan as many people as possible and collect rewards.


Example Use Cases: Communication and Encoded Fabrics

In an exemplary embodiment, information can be communicated to users of encoded clothing. FIG. 23 is a diagram of exemplary stadium seating in which two users, user 2302a and user 2302b, are viewing a sporting match between two teams, Team A and Team B. User 2302a has headwear A with the logo, pattern, or color combination of Team A while user 2302b has headwear B with the logo, pattern, or color combination of Team B. Near each of the seats 2304a, 2304b, and 2304c is a corresponding data transmitter 2306a, 2306b, and 2306c, respectively. The data transmitter includes a scanner to scan the headwear of the user sitting in the corresponding seat and transmit data to the user depending on the team associated with the headwear. The user, having a data receiver and headphones, can receive information related to the sporting match via the headphones coupled to the data receiver. In this embodiment, because the users have headwear of opposing teams, transmitter 2306a transmits broadcasting by a pro-Team A sportscaster to user 2302a while transmitter 2306b transmits broadcasting by a pro-Team B sportscaster to user 2302b. In some embodiments, the data transmitter-receiver system is a Bluetooth or WiFi system. In an exemplary embodiment, a data transmitter equipped with LiFi (light-based, line-of-sight wireless communication), modulated LEDs, or other optical communication device can transmit data to headwear having photovoltaic fibers. The photovoltaic fibers, for example, can receive optical signals from the optical transmitter to provide the user with the appropriate broadcasting. In some embodiments, a single data transmitter can be configured to transmit data to two or more users within signal range.


In an exemplary embodiment, encoded apparel can be used for navigation purposes. One or more users wearing encoded apparel can be tracked and provided information to move around a space or a building based on the encoding. For example, indoor navigation has been a particular challenge for conventional means such as GPS. By positioning scanners around a building, a person wearing encoded apparel can be scanned and information can be provided (by wireless communication such as WiFi, Bluetooth, cellular, radio, etc.) in real time or near real time to a mobile device. The information can include directions, maps, items of interest, and the like. A further advantage of such a scheme is that different information can be provided to different people. For example, advertisements can be targeted to specific users wearing encoded apparel in a commercial space (such as a shopping mall, downtown area of a city, etc.). An article of encoded apparel may be encoded with the profile of the user having information such as age, gender, residence, previous purchases, likes, dislikes, etc. This information can be used to better guide the particular user to areas of a commercial space suited to his or her tastes.


In another exemplary embodiment, encoded apparel can be scanned by aerial vehicles equipped with a scanner such as helicopters, planes, unmanned aerial vehicles (UAVs), drones, and the like. These aerial vehicles can provide information to the wearer of the encoded apparel. For example, in a rescue situation, a UAV may be able to scan a side of a mountain or the woods for encoded apparel that may be otherwise difficult to detect. The UAV can attempt to transmit information to the wearer or use the location of the encoded apparel to aid a rescue effort.


In another exemplary embodiment, encoded apparel can be used for collision prevention. A vehicle equipped with a scanner may be able to detect the presence of a pedestrian having encoded apparel before the driver is able to see the pedestrian. For example, encoded apparel can include reflective materials or materials with optical properties outside of the visible spectrum. Such clothing can be scanned by a scanner on a vehicle and processed to provide feedback to the driver or control system of the vehicle to avoid a collision with the pedestrian.


In some embodiments, the encoding of an article of clothing can be changed by the user in real time or near real time. For example, the user can access a portal to change profile information linked with the particular article to convey different information at different times. In this way, a user can communicate with a scanner at will.


In some embodiments, certain encoded fabrics can be “public use” articles in that they may be used by a first user and encoded with a first information and subsequently released to be used by a second user and encoded with a second information, and so on. For example, an encodable life vest may be rented by a kayaker for a duration of time. At the rental registration, the life vest may be encoded with the kayaker's personal information in the case of an emergency such that, as previously described, a rescue vehicle is able to locate the kayaker. In another example, a user can communicate messages to a scanner. A user may change the encoding of his or her clothing based on the types of social settings he or she is in on a particular day. For example, the user may encode a first message or profile during the daytime or in a professional setting and encode a second message or profile in the evening or a casual setting. In another example, a user can pre-program messages and profiles such that the encoding of apparel changes over time or based on the location of the user. The server system having the encoded information for a particular article of clothing may be triggered to change the encoded information based on feedback from the user or the user's mobile device. The mobile device may provide information such as the location of the user. The location of the user, as described, can determine the type of information that is encoded in an article of clothing.


In an exemplary embodiment, military apparel can be encoded. For example, clothing with a camouflage pattern may be encoded such that a soldier wearing such clothing can be scanned in the field. A scanner used in the field can be configured to detect “friend v. foe” status, identity, or other information about the soldier. The scanner may also be used for rescue or recovery missions as described above. In some embodiments, weapons can be outfitted with a scanner to determine whether the weapon should fire upon trigger. In other words, the scanner may provide a control signal to allow a user to trigger the weapon.


In an exemplary embodiment, encoded apparel can be used to get through security checkpoints. For example, queues for security checkpoints at airports and stations are regularly held up due to the processing time needed to scan a person and their belongings. However, a person can don encoded apparel that provides sufficient information to an agency such as the Transportation Security Administration (TSA) of the United States to process them at a faster rate than a person without encoded apparel.


In another exemplary embodiment, a unique code can be assigned to a group of articles owned by a single user instead of a unique code per article. For example, a hiker may choose to encode all of his or her clothing or accessories with the same code. In another example, a unique code can be assigned per user, such as a soldier, such that the particular code is used in the camouflaging pattern in any or all of the soldier's gear. In an exemplary embodiment, a particular code can consist of a user code (which can be the same for a group of articles) and an article code (which can be unique per article of clothing or gear).
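The composite code described above can be sketched as a simple bit-field layout. The field widths (16 bits for the user code, 8 bits for the article code) are illustrative assumptions; the text specifies only that the full code has a user component and an article component.

```python
# Sketch of a composite code: a user code shared across one person's
# gear, concatenated with a per-article code. Field widths are assumed.

USER_BITS, ARTICLE_BITS = 16, 8

def compose(user_code, article_code):
    """Pack user and article codes into one identification code."""
    return (user_code << ARTICLE_BITS) | article_code

def split(full_code):
    """Recover (user_code, article_code) from a full code."""
    return full_code >> ARTICLE_BITS, full_code & ((1 << ARTICLE_BITS) - 1)

full = compose(0x1234, 0x56)
```

A decoder can then treat any article bearing the same high-order bits as belonging to the same user, which supports the hiker and soldier examples above.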


In an exemplary embodiment, the code can be stitched on or beaded onto an article. For example, a pattern of beads can encode information in similar ways to other methods of impressing codes described herein. In some embodiments, a craftsperson can make and encode his or her own apparel or accessories.


In an exemplary embodiment, an animal-skin pattern may be encoded with information. This can include fashion patterns such as leopard print, zebra print, snakeskin print, alligator skin print, and the like.


In some exemplary embodiments, apparel can be encoded with active fibers or materials, such as fibers that change color or properties based on heat exposure (thermochromic fibers) or electric current, voltage, or power (electrochromic fibers). For example, a fiber affected by temperature can be used to encode at least two different data values. The fiber in this example can be configured to have a first state at temperature T1 and a second state at temperature T2. In another example, encoded apparel having electrochromic fibers can be coupled to a controller that can determine amounts of current, voltage, or power to be applied to the fibers. A user wearing the encoded apparel can determine the encoding by manipulating the controller. In some embodiments, the controller of the encoded apparel may be coupled, with or without a wire, to the user's mobile device, through which the user may effect changes to the fibers.


Note that any two or more of the methods or technologies of encoding of apparel may be combined or layered. In this way, the encoding may be enhanced or more data can be encoded in an article of clothing.


Exemplary Encoded Plaid in Apparel and Goods

Plaid (also referred to as ‘tartan’) is an exemplary aesthetic pattern that can be encoded with information. Typically, a plaid pattern includes horizontal and vertical bars of different thicknesses, spacing, colors, and opacities. In an exemplary embodiment, a two-dimensional code can be incorporated as part of a plaid pattern. In other words, a plaid pattern can constitute a two-dimensional code carrying information. In some embodiments, one or more aspects of a plaid pattern can be used to encode information such that the code is not discernable to a human but is decodable by a processor. Such an encoded plaid pattern may be hereinafter referred to as a “plaid code”. An exemplary plaid code can include a first set of two or more strips in a first direction and a second set of two or more strips in a second direction, where the first set and second set are positioned relative to one another in the plane of a material, such as a fabric or a surface of a tangible item. For instance, the first set and second set can be positioned orthogonally relative to one another. In another instance, the first set and second set can be positioned at any angle greater than 0 degrees and less than 90 degrees. For instance, an encoded argyle pattern can include a first set of strips at an angle of 30-60 degrees relative to the second set of strips.


In an exemplary embodiment of a plaid code, a plaid pattern is established by use of a “repeatable unit” that is repeated contiguously in or over a fabric. A plaid code implemented in this way can be virtually undetectable by the untrained human eye. The subtlety of such encoding promotes aesthetic desirability of the encoded article of clothing. FIG. 24 is an exemplary embodiment of a repeatable unit 2402 that is encoded with a two-dimensional code. Note that, in this example, not many strips are needed for encoding. In some ways, the sparseness of the plaid code (its low spatial frequency) allows a human eye to pass over the pattern without suspecting the presence of the code. For example, to enforce “low spatial frequency” in a particular plaid code, the repeatable unit may have no more than three strips in a first direction and no more than three strips in a second direction. For a repeatable unit with a dimension of at least 75 mm in the first direction and at least 75 mm in the second direction, each strip would, in its minimum configuration, occupy 25 mm. There are at least three benefits associated with these constraints: (1) the pattern can be conveniently implemented in weaving or printing; (2) a pattern within those constraints can be designed to have aesthetic value; and (3) a camera on a smartphone at a suitable distance can read the encoded pattern. In another embodiment, however, it may be desirable to have a higher spatial frequency based on particular aesthetic choices or technical implementation details.


The distribution of the repeatable unit of the plaid code can further the goal of inconspicuousness. For example, the repeatable unit may be distributed over at least 70% of the visible portions of an article of clothing or tangible item (such as a purse, blanket, or furniture cover). In other cases, the repeatable unit may be distributed over at least 50% of the visible portions of an article of clothing or tangible item.


In an exemplary embodiment, the aesthetic component of the plaid code can be inspired initially by an existing plaid pattern or be composed by a plaid designer. From an established aesthetic component, the code component can be implemented. FIG. 25 shows a backpack 2502 having an exemplary plaid pattern. A portion 2504 of the plaid is isolated. From this isolated portion 2504, the repeatable unit 2402 can be derived. The repeatable unit 2402 can be divided into a first set 2404a of strips in a first direction 2406a and a second set 2404b of strips in a second direction 2406b. Note that once a collection of unique repeatable units has been produced based on an aesthetic goal, the repeatable units may undergo further scrutiny as a “quality check” on the attractiveness of a particular coded plaid. In this way, there may exist continuous feedback between the aesthetic and utility components of the plaid code.



FIG. 26A is an exemplary first set 2404a of strips in the first direction for a repeatable unit and FIG. 26B is an exemplary second set 2404b of strips in the second direction for a repeatable unit. In an exemplary embodiment, if the plaid code is implemented using weaving technology, the first direction corresponds to the ‘weft’ while the second direction corresponds to the ‘warp’. Each of the first and second sets 2404a, 2404b can be broken down further into a leading strip 2602 and 2604, respectively, and a set of associated strips 2606a-2606c and 2608a-2608c. Note that while this example includes three associated strips per leading strip, there can be as few as one associated strip. Each leading strip acts as a marker for the decoding of the associated strips in a particular direction. Note that, while the leading strip is shown to one side of the associated strips, the leading strip can be in any position within a repeatable unit. In the example shown, the leading strips 2602, 2604 each can include a dark stripe 2610, 2612, respectively, and a white stripe 2614, 2616, respectively. The dark stripe in a leading strip marks a “start” point that enables a scanner to identify the leading strip. The optional white stripe in a leading strip has the purposes of (i) allowing aesthetic adjustments to the plaid pattern and (ii) breaking the symmetry between the first set 2404a and second set 2404b. Breaking the symmetry refers to the notion that the first set 2404a and second set 2404b are implemented such that they are not exactly the same. This allows a scanner, which can be configured to read the first set independently from the second set, to identify the proper direction of a set of strips. In some embodiments, the scanner may not need to determine the direction of the strips before decoding. In an embodiment, each of the first and second directions of the plaid code may encode the same data.


In this example, each of the associated strips 2606a-2606c and 2608a-2608c corresponds to a “symbol” having two transition edges. The first transition edge 2618 marks the line of transition from a first color to a second color. Note that x1 corresponds to a distance from a leading edge 2620 of the associated strip to the first transition edge 2618. The second transition edge 2622 marks the line of transition from a second color to a third color. Note that y1 corresponds to a distance from the leading edge 2620 of the associated strip to the second transition edge 2622. A first stripe in an associated strip may be defined by the leading edge 2620 and the first transition edge 2618. A second stripe in an associated strip may be defined by the first transition edge 2618 and the second transition edge 2622. In the example shown in FIGS. 26A-26B, the first color is white, the second color is gray (or a shade of black), and the third color is white. In other examples, the colors can be different from each other, shades or tints of the same color, and the like. In some embodiments, each stripe may contain multiple colors or patterns. In some instances, stripes that contain multiple colors or patterns may be configured such that their presence does not interfere with the detection of the transition edges, as detailed below. In some embodiments, y1 is required to be greater than x1 (y1 > x1). In an example, x1=0 and y1=W1, causing the strip to be made up of one color. Note that the embodiments described herein for x1 and y1 also pertain to x2 and y2.
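The symbol geometry above can be sketched as a function mapping a symbol (x, y) to the color runs of its strip: the first color from the leading edge to x, the second from x to y, and the third from y to the strip width W. Concrete widths in the example are illustrative.

```python
# Sketch of the associated-strip geometry: an associated strip of width
# W carries a symbol (x, y), where x locates the first transition edge
# and y the second, both measured from the leading edge, with y > x.

def strip_segments(x, y, width):
    """Return (color, length) runs for one associated strip:
    first color up to x, second color from x to y, third from y on."""
    if not (0 <= x < y <= width):
        raise ValueError("require 0 <= x < y <= width")
    return [("first", x), ("second", y - x), ("third", width - y)]

runs = strip_segments(5, 20, 25)
```

The boundary case x=0, y=W from the text collapses the first and third runs to zero length, so the strip shows only the second color.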



FIG. 27 is a plot of distances x and y illustrating constraints on distances x1, y1, x2, and y2 to encode the plaid pattern. Generally, the x-axis corresponds to either x1 or x2 and the y-axis corresponds to the respective y1 or y2. The region 2704 of the plot 2702 for which y is less than x (y&lt;x) is not considered to be valid coding space. In addition to the region 2704, the area (“min dark linewidth”) 2705 defined between line 2706 and region 2704 further restricts the coding region to be within region 2710. This area 2705 is enforced to be greater than twice the width of the dark stripe 2610 in the leading strip 2602 to prevent an error in decoding. Specifically, this enforcement prevents the scanner from identifying a stripe in an associated strip (intended to carry encoded information) as the dark stripe 2610 of the leading strip 2602 (which, in contrast, marks the leading position of the associated strips). Note that region 2710 is further defined by lines 2708a and 2708b. Lines 2708a, 2708b correspond to minimum light stripe widths of the optional light stripe 2614 within the leading strip 2602. In some embodiments, the minimum light stripe widths can be the same or different based on a favored aesthetic. For example, a minimum light stripe width may be greater than or equal to 10%, 30%, or 50% of the symbol width W1 (for the first direction) or W2 (for the second direction) with the goal of breaking symmetry between the strip(s) of the first direction relative to the strip(s) of the second direction, as discussed above.
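The constraints of FIG. 27 can be stated as a predicate on a candidate symbol (x, y). The exact margins are illustrative assumptions; the text requires y &gt; x, a "min dark linewidth" band wider than twice the leading strip's dark stripe, and minimum light-stripe widths bounding the valid region.

```python
# Sketch of the coding-region test of FIG. 27 for a candidate (x, y).

def in_coding_region(x, y, width, dark_width, min_light):
    """True iff (x, y) lies inside region 2710 of the plot."""
    if y <= x:                         # region 2704: y must exceed x
        return False
    if y - x <= 2 * dark_width:        # band 2705: keep the data stripe
        return False                   # distinguishable from the marker
    if x < min_light or width - y < min_light:   # margins at lines 2708a/b
        return False
    return True

ok = in_coding_region(5, 18, 25, dark_width=2, min_light=3)
```

Only (x, y) pairs passing this test are eligible symbol positions; the hexagonal grid of FIG. 28 is then laid over exactly this region.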



FIG. 28 is an exemplary representation of symbols for encoding information in plot 2702. For maximum packing density and protection against decoding errors, symbols are chosen from a hexagonal grid. For instance, the position of the origin 2702 of a hexagon 2704 encodes specific information. The hexagonal space 2704 prevents confusion with a neighboring symbol 2706.
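The hexagonal grid serves decoding robustness: a measured (x, y) pair is quantized to the nearest symbol center, so measurement noise smaller than half the center spacing cannot cause confusion with a neighboring symbol. The sketch below generates a hexagonally packed set of centers and snaps a noisy measurement to one; the grid spacing is illustrative.

```python
import math

# Sketch of the hexagonal symbol grid of FIG. 28 and nearest-center
# quantization of a noisy (x, y) measurement.

def hex_centers(spacing, cols, rows):
    """Centers of a hexagonally packed grid (odd rows offset)."""
    centers = []
    row_height = spacing * math.sqrt(3) / 2
    for r in range(rows):
        offset = spacing / 2 if r % 2 else 0.0
        for c in range(cols):
            centers.append((offset + c * spacing, r * row_height))
    return centers

def snap(point, centers):
    """Quantize a noisy measurement to the nearest symbol center."""
    return min(centers, key=lambda c: math.dist(c, point))

centers = hex_centers(4.0, cols=3, rows=3)
symbol = snap((4.3, 0.2), centers)
```

Hexagonal packing maximizes the number of centers that fit in the coding region for a given minimum separation, which is the "maximum packing" property the figure relies on.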



FIGS. 29A-29B are exemplary representations of symbols (such as those of FIG. 28) overlaid onto the plot of distances x and y (such as that of FIG. 27). In each of FIGS. 29A-29B, the overlaying of symbols onto the plot shows the number of possible codes that are encodable in an associated strip based on the various constraints described above. Those symbols that are encodable are shown to the top and left of the exemplary line 2706. Thus, for example, in FIG. 29A, symbols 2902 are encodable while symbol 2904 is excluded. Each of symbols 2902 has an x-distance and a y-distance that results in specific positions of the transition edges and thus one or more stripes of the associated strips.



FIG. 29A shows the symbols available for encoding in the first direction and FIG. 29B shows the symbols available for encoding in the second direction. Note that the symbols 2906 belonging to the second direction are shifted relative to the symbols 2902 belonging to the first direction by an amount 2908. This has the effect of breaking symmetry that may exist between the encoding of the strips in the first direction and the second direction. In some embodiments, the shifting amount 2908 can be half the size of the hexagonal space 2704.


In an exemplary embodiment, the shifting of the first set of strips in the first direction relative to the second set of strips in the second direction can encode information. In another exemplary embodiment, the colors, and their shades, used within the strips can further encode information.


Exemplary Decoding of Encoded Plaid

The above-described encoding has the advantage of being latent or otherwise hidden from detection by the human eye. However, in many embodiments, to enable proper decoding of the plaid code, the following exemplary guidelines are used:

    • i) the repeatable unit is isolatable by the scanner;
    • ii) the strips of the first direction are distinguishable from the strips in the second direction; and/or
    • iii) the plaid code can be read in any orientation.


In some embodiments, to address the guideline (i) of isolatability of the repeatable unit, the dark stripe 2610 (or a stripe of a first color) of the leading strip 2602 is restricted to have a narrow width compared to the stripes in the associated strips 2606a-2606c (see FIGS. 26A-26B). Alternatively, the stripes of the associated strips 2606a-2606c have widths wider than the width of the dark stripe 2610.


In some embodiments, to address the guideline (ii) of distinguishability of the strips of the first direction from the strips of the second direction, a light stripe 2614 of the leading strip 2602 can be included. Further, the width of the light stripe 2614 of the first direction 2404a is different from the width of the light stripe 2616 of the second direction 2404b (see FIGS. 26A-26B). In some embodiments, to address guideline (ii), the grid of encoded symbols of the associated strips 2606a-2606c in the first direction 2404a is shifted with respect to the grid of encoded symbols of the associated strips 2608a-2608c in the second direction 2404b (see FIGS. 29A-29B).
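The distinguishability of directions by light stripe width can be sketched as a nearest-width test. The nominal widths below are hypothetical values chosen only for illustration.

```python
def classify_direction(light_stripe_width, width_first, width_second):
    """Classify a leading strip as first- or second-direction.

    The light stripe 2614 of the first direction and the light stripe 2616
    of the second direction have different widths by design, so the
    measured width (normalized units) is assigned to whichever nominal
    width it is closer to.
    """
    if abs(light_stripe_width - width_first) <= abs(light_stripe_width - width_second):
        return "first"
    return "second"
```

For example, with nominal widths of 0.10 and 0.30, a measured width of 0.12 classifies as the first direction and 0.28 as the second.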


As previously discussed, the leading strip may be positioned in any part of a repeatable unit. For decoding, however, such positioning may present a challenge in correctly identifying the direction of the associated strips. Thus, in some embodiments, to address the guideline (iii) of the plaid code's being read in any orientation, the validity of the encoded symbols can be assessed by the processor coupled to the scanner. FIG. 30A is a diagram showing exemplary sets 3000a, 3000b of strips 2606a-2606c in the first direction 2404a (where 3000a=3000b). The diagram also shows a set 3001 of strips flipped horizontally to the left such that set 3001 is a mirror image of 3000a or 3000b. This scenario may happen if the fabric having the encoded plaid is flipped in the process of manufacturing apparel or a tangible item (such as a backpack, messenger bag, luggage, and the like) and the plaid visible to the scanner is a mirror image of the intended orientation of the plaid code.


The transition edges x1a′ and y1c′ of set 3001 can be defined by the following relationships:






x1a′=(1−b)−y1c

y1c′=(1−b)−x1c


where b is the width of the light padding stripe within the leading strip 2602. The processing of the plaid code ensures that if (x1c, y1c) is a valid symbol, then the possible symbol (x1a′, y1c′) is not a valid symbol. This ensures that a valid code exists in one direction (such as the direction along the set 3000) and not in the mirror direction (such as the direction along set 3001). FIG. 30B is a plot illustrating the positions of valid symbol (x1c, y1c) and invalid symbol (x1a′, y1c′) in the x-y space.
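The mirror relationships above lend themselves to a small programmatic check. The sketch below is illustrative; it assumes a caller-supplied validity predicate over normalized (x, y) distances.

```python
def mirrored(x, y, b):
    """Return the (x', y') transition distances the same symbol would
    produce if the fabric were flipped, where b is the width of the light
    padding stripe within the leading strip (normalized units)."""
    return (1.0 - b) - y, (1.0 - b) - x

def code_is_orientation_safe(symbols, valid, b):
    """True if every valid symbol maps, under a mirror flip, to an invalid
    position, so the scanner can always tell the true orientation from the
    mirror image."""
    return all(not valid(*mirrored(x, y, b)) for (x, y) in symbols)
```

A code set is orientation-safe only if no symbol's mirror image lands on another valid symbol; a set containing both a symbol and its mirror would be ambiguous.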



FIG. 31 is a flowchart of an exemplary method of decoding the plaid code described herein. Process 3102 of the decoding method is the detecting, by a scanner, of the repeatable unit of the plaid code. This can be executed by identifying the leading strip as described in detail above. Process 3104 is the detecting, by the scanner, of the orientation of the repeatable unit based on the leading strip. Process 3106 is the detecting, by the scanner, of the transition edges of the two or more associated strips belonging to the leading strip. The transition edges can be detected by edge detection algorithms such as Hough, Canny, Deriche, differential, Sobel, Prewitt, Roberts cross, and the like. Process 3108 is the calculating, by a processor coupled to the scanner, of the x and y distances of each of the transition edges of the associated strips that encode information. Process 3110 is the determining, by the processor, of the information corresponding to the x,y pair. The information can be determined by accessing a lookup table of codes. The lookup table may be stored in a server system to which the processor has access.
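Processes 3106-3110 can be sketched as follows. A simple threshold-crossing detector stands in for the edge detection algorithms named above, and the lookup table contents are hypothetical.

```python
def transition_edges(scanline, threshold=128):
    """Return indices where a 1-D intensity profile crosses a light/dark
    threshold (a simplified stand-in for Canny, Sobel, and the like)."""
    binary = [v >= threshold for v in scanline]
    return [i for i in range(1, len(binary)) if binary[i] != binary[i - 1]]

def decode_strip(scanline, lookup, unit_width, threshold=128):
    """Measure the x and y transition distances of one associated strip,
    normalize them by the repeatable-unit width, and consult a lookup
    table mapping (x, y) pairs to encoded values."""
    edges = transition_edges(scanline, threshold)
    if len(edges) < 2:
        return None
    x, y = edges[0] / unit_width, edges[1] / unit_width
    # snap the measured pair to the nearest tabulated symbol
    key = min(lookup, key=lambda k: (k[0] - x) ** 2 + (k[1] - y) ** 2)
    return lookup[key]
```

In practice the lookup table would be stored server-side as described; here it is an in-memory dictionary for illustration.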



FIG. 32 is a flowchart of an exemplary method of decoding the plaid code described herein. At the start of the decoding process (process 3202), an image of a plaid code has been captured by a camera or other image capture device. At process 3204, the image is encoded, by a processor coupled to the camera, in grayscale or color. At process 3208, one or more portions of the image are analyzed in parallel. If, at process 3210, a repeatable unit is identified, then control passes to process 3214. In process 3214, each set of strips of the plaid is decoded. In other words, the first set of strips in the first direction is decoded and the second set of strips in the second direction is decoded. In process 3216, each of the symbols is validated for ‘goodness of fit’ and the code is validated against design rules by the processor. Exemplary ‘goodness of fit’ checks include: (i) each strip (or symbol corresponding to the strip) should be more than 60% matched with an ideal shape stored in a database accessed by the processor; (ii) the light-to-dark contrast within a strip should be greater than 4%; and (iii) the best-fitting symbol should have a matching percentage greater than that of other symbols by more than 2/33.
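The three 'goodness of fit' checks can be sketched directly. The 60%, 4%, and 2/33 thresholds come from the text; the contrast formula (a Michelson-style ratio) and the function's interface are assumptions for illustration.

```python
def passes_goodness_of_fit(shape_match, light, dark, best_score, runner_up_score):
    """Apply the three exemplary 'goodness of fit' checks.

    shape_match      -- fraction [0..1] of agreement with the ideal stored shape
    light, dark      -- mean intensities of light and dark stripes (0..255)
    best_score,
    runner_up_score  -- matching percentages of the two best symbol candidates
    """
    if shape_match <= 0.60:                       # (i) >60% match with ideal shape
        return False
    contrast = (light - dark) / max(light + dark, 1e-9)
    if contrast <= 0.04:                          # (ii) >4% light/dark contrast
        return False
    if best_score - runner_up_score <= 2 / 33:    # (iii) margin over next-best symbol
        return False
    return True
```

A symbol failing any one of the three checks is rejected before the code is validated against design rules.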


Design rules can be characterized as the aesthetic component of a plaid code, as described in detail above. In an embodiment, design rules can include constraints on choosing certain symbols in the x-y space for one or more of the strips of the repeatable unit. In some embodiments, the results of the decoded strips may be checked against a database of aesthetically acceptable plaid patterns. For example, an exemplary database entry may include the following: a set of strips in a direction of a repeatable unit containing a first strip with a first stripe having a width of 2 mm of a first color, a second stripe having a width of 10 mm of a second color, and a third stripe having a width of 21 mm of a third color; a second strip with a first stripe having a width of 7 mm of a first color, a second stripe having a width of 25 mm of a second color, and a third stripe of 1 mm of a third color; and a third strip with a first stripe having a width of 1 mm of a first color, a second stripe having a width of 25 mm of a second color, and a third stripe having a width of 7 mm of a third color. Such entries create visually pleasing dimensions while maintaining readability. In process 3218, the results of the decoded strips are combined to determine if there is a ‘winning code’. In an embodiment, the ‘winning code’ can be the result that is the closest match to an entry in a lookup table. If, at process 3220, a winning code is determined, then the winning code is returned to the calling function in order to trigger additional application behavior, in process 3222. If no unit cells are located at process 3210 or no winning code is determined at process 3220, then the message “Code Not Found” is provided to the calling function at process 3212, and control passes back to the start of the decoding process (process 3202). In some embodiments, the calling function provides the message to the user interface to provoke the user to rescan the image.



FIG. 33 is a flowchart of an exemplary method of decoding the plaid code described herein. At the start of the decoding process (process 3302), an image of a plaid code has been captured by a camera or other image capture device. At process 3304, the image is encoded, by a processor coupled to the camera, in grayscale or color. At process 3306, one or more portions of the image are chosen for analysis by the processor. The portions can be chosen by the processor at random or cycled through for analysis. At process 3308, the dominant direction of the image can be determined by the processor and designated as the analysis direction. The dominant direction may be either the first or second direction. Subsequently, the image can be rotated to the analysis direction or the analysis direction can be aligned to the image by the processor. At process 3310, the image can be divided into portions by the processor. At process 3312, a repeatable unit can be identified by the processor. In some cases, the leading strips can be initially identified, followed by the identification of the associated strips that constitute the repeatable unit. If, at process 3314, the repeatable unit is identified, control passes to process 3318. If, at process 3314, the repeatable unit is not identified, then, at process 3316, the message “Code Not Found” is provided to the calling function and control passes back to the start of the decoding process (process 3302). In some embodiments, the calling function provides the message to the user interface to provoke the user to rescan the image. Note that, in the above embodiments of decoding methods, one or more processes can be removed or added to the methods to achieve an acceptable outcome.


At process 3318, the strips in each direction of the repeatable unit are decoded, by the processor, to determine the corresponding symbols. At process 3320, the symbols are validated for ‘goodness of fit’ and, at process 3322, the code is validated against design rules by the processor. At process 3324, the processor collects votes per portion of the image. The votes are cast by features of the image seeking compatible model parameters. This scheme is used in an edge detection technique called the Hough transform. At process 3326, the processor applies voting rules (of the edge detection technique) to determine if there is a winning code. If, at process 3328, a winner is not selected, then control passes to process 3316. If, at process 3328, a winner is selected, then, at process 3330, the winning code is returned to the calling function to trigger additional application behavior. In an embodiment, additional application behavior can include displaying the decoded message of the plaid code on the user interface. In another embodiment, the behavior can include providing the winning code for further processing by, for example, a social media server system.


In an exemplary embodiment, the plaid code can be configured such that a scanner, configured to scan such a code, can collect portions of two or more repeatable units to successfully decode the plaid code. For example, the scanner may collect a first portion of a first repeatable code and a second portion of a second repeatable code. If the portions have any overlap or can be pieced together side-by-side to reconstruct a single repeatable unit, then the collected portions can be processed to enable efficacious decoding. If the features, such as folds, pockets, wrinkles, etc., of an article of clothing or tangible item force the repeatable unit to be represented in portions less than one whole unit, the scanner may be able to “piece” together the repeatable unit for decoding. Returning to the example of the backpack 2502, some features of the backpack may not contain a whole single repeatable unit (such as the sides of the backpack) and thus may require the “piecing” together of the unit for successful decoding.


In another exemplary embodiment, the scanner of the plaid code is configured to decode the first set of strips in the first direction separately from the second set of strips in the second direction. In another exemplary embodiment, the scanner is configured to decode the combination of the first set and the second set.


In another exemplary embodiment, the size of the repeatable unit can be determined by a minimum distance required for imaging, and ultimately decoding, the plaid code. For example, a mobile phone camera positioned too close to the article of clothing bearing the repeatable unit may not capture a whole unit and, therefore, may not be able to successfully decode the plaid code. In an exemplary repeatable unit, there can be four strips having equivalent widths (1 leading strip and 3 data strips). The leading dark stripe is about 10% of the symbol width, which makes it 2.5% of the repeatable unit width (in one direction); that is, the repeatable unit is 40 times the width of the leading dark stripe. If the leading dark stripe is 25 pixels, the repeatable unit would be 1000 pixels, approximately filling the display of a smartphone. This is unlikely in a real use case. On the other hand, if the leading dark stripe is 1 pixel, the repeatable unit is 40 pixels. This is the resolution limit for the display and would result in a repeatable unit that is barely 4% of the display width. Thus, the actual operating range is in between these two extremes.
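The arithmetic above can be sketched directly. The strip count and the 10% dark-stripe fraction come from the text; the function name is illustrative.

```python
def repeatable_unit_pixels(dark_stripe_px, strips_per_unit=4, dark_fraction=0.10):
    """Width of a repeatable unit, in pixels, derived from the width of the
    leading dark stripe.

    With four equal-width strips per unit and a leading dark stripe that is
    10% of one symbol width, the dark stripe is 2.5% of the unit width, so
    the unit is 40 times the dark stripe width.
    """
    symbol_px = dark_stripe_px / dark_fraction      # one strip (symbol) width
    return symbol_px * strips_per_unit              # whole repeatable unit
```

With a 25-pixel dark stripe this gives roughly 1000 pixels, and with a 1-pixel stripe roughly 40 pixels, matching the two extremes discussed above.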


Exemplary Application for Interacting with Encoded Apparel and Tangible Items


A user can be enabled to interact with encoded articles, such as apparel or tangible items, via a user interface coupled to a scanner, such as a camera. An application executing on a mobile electronic device, such as a smartphone, tablet, laptop, notebook computer, and the like, can be used to interact with the encoded articles.



FIG. 34A is an exemplary start screen for an exemplary application interface. The start screen 3402 includes a logo 3404 and login 3406 options for the user to input login credentials. FIG. 34B is an exemplary scan screen for the exemplary application interface. The scan screen 3408 includes the view through the coupled camera lens. The view includes a scan frame 3410 prompting the user to align an encoded article 3412 within the bounds of the frame 3410 (the bounds, in this case, indicated by the corner lines of the frame). The scan screen 3408 includes a profile button 3414 and a prompt button 3416 labelled “Go and find some Lookables”. The profile button 3414 leads to a profile screen 3502 and the prompt button 3416 leads to one or more “lookable” screens 3602a-3602d describing the types of articles to scan. The scan screen 3408 also includes a timeline button 3418 to access a timeline screen 3420 of events in chronological order, as illustrated in FIG. 34C.



FIG. 35 is an exemplary profile screen for the exemplary application interface. The profile screen 3502 can be accessed from the profile button 3414 on screen 3408. The profile screen 3502 includes an editable portion 3504 to include user profile details such as a name, title, alma mater, song, email address, and the like. The profile screen 3502 also includes an article related portion 3506 prompting the user to claim or register one or more scannable articles with a button 3508 labelled “Claim a Lookable”. Button 3508 leads to a claim scan screen 3702 as illustrated in FIG. 37A and described further below.



FIGS. 36A-36D are exemplary article screens for the exemplary application interface. The article screens 3602a-3602d can be accessed from the prompt button 3416 on screen 3408. Each of the article screens 3602a-3602d show a type of article, such as a hoodie 3602a, a jacket 3602b, a tote or laptop sleeve 3602c, or a backpack 3602d.



FIG. 37A is an exemplary claim scan screen for the exemplary application interface. The claim scan screen 3702 includes a scan frame 3704 with a prompt 3706 to scan the QR code tagged to an article with the goal of claiming or registering the particular article with the profile information provided in the profile screen 3502. Note that, other than scanning the QR code of a particular article, the process for claiming the article can also be achieved by scanning the encoded pattern of the article (see, e.g., FIGS. 6-7 for examples of associating an article with a user). FIG. 37B is an exemplary confirmation screen following the scanning of an eligible article (referred to in this embodiment as a “Lookable”). Once the article has been scanned, a “Lookable found” confirmation screen 3708 includes a message 3710 that confirms that a “Lookable” such as a sweatshirt has been added to the user's profile. The confirmation screen 3708 includes a prompt button 3712 to scan another article or to complete the task (“Done” button 3714). FIGS. 38A-38B are exemplary profile edit screens for the exemplary application interface. Each of the screens 3802a-3802b can be the result of selecting “Edit” in screen 3502. Screen 3802a shows the editable fields related to a user's profile before a user has filled the fields. The “Claim your first Lookable” button 3804 leads to the claim scan screen 3702. Once an article has been scanned, the button 3804 changes to a plus button 3806 indicating the number of articles claimed by a particular user (in this case, three articles). Screen 3802b illustrates some filled-out text fields, such as the name and job title of the user. Screen 3802b also includes a button 3808 to add a song to the user's profile. In some embodiments, the screen can include a text field for a message or a mixed media field to share images, audio, or video.



FIG. 39 is a shared song screen for the exemplary application interface. The shared song screen 3902 has a search field 3904 for searching for music from a database or web based application, such as YouTube or Vevo. Once a song is selected, the button 3906 can be used to add the song to the profile, as shown on screens 3802a, 3802b.



FIG. 40 is a connection screen for the exemplary application interface. The connection screen 4002 can result on the user interface when the mobile device is used to scan an encoded article, such as the backpack 3412 in screen 3408. The connection screen 4002 can include a picture 4004 of the user associated with encoded article, a shared song 4006, and alma mater 4008.


Enhanced Pattern Decoding


FIGS. 41A, 41B, and 41C illustrate an enhanced embodiment of the present invention for decoding fabric patterns in which repetition of the pattern is not strictly necessary and, when there is repetition, the repeated units can be arbitrarily positioned with respect to one another.



FIG. 41A illustrates a fabric in which there has been embedded, in a manner as discussed above, in two distinct dimensions, a unit of the pattern in accordance with an embodiment of the present invention.



FIG. 41B illustrates another embodiment of a fabric in which the pattern has been altered so that the two distinct components are presented in two distinct horizontal rows and yet, using the processes discussed below in connection with FIG. 42, the pattern remains decodable.


Similarly, FIG. 41C illustrates yet another embodiment of a fabric in which the pattern has been further altered beyond the extent of alteration shown in FIG. 41B, so that the components are presented in a manner having an arbitrarily oriented repetition. In this embodiment, the pattern remains decodable using the processes discussed below in connection with FIG. 42.



FIG. 42 is a logical flow diagram showing how processing of image data from the fabric pattern is achieved in accordance with the embodiments of FIGS. 41A, 41B, and 41C wherein repetition of the pattern is not necessary and, when there is repetition, the repeated units can be arbitrarily positioned with respect to one another. At the start of the decoding process (process 4202), an image of a pattern in fabric has been captured by a camera in a smartphone or by another image capture device. At process 4204, the image is encoded, by a processor coupled to the camera, in grayscale or color. At process 4206, a first portion of the image is chosen for analysis by the processor. The portions can be chosen by the processor at random or cycled through for analysis. At process 4208, the image is sliced into sub-images. Thereafter each sub-image is subject to processing, and the processing of the sub-images can be carried out in parallel. Although we next discuss the process for a given sub-image, this processing is carried out until all sub-images are processed. At process 4210, the processor attempts to locate edge boundaries of the sub-image. At process 4212, the processor attempts to map the edge boundaries into valid codes (which we sometimes call “symbols”). If, at process 4214, one or more codes are located, control passes to process 4218. If, at process 4214, no codes are located, then, at process 4216, the message “Code Not Found” is provided to the calling function and control passes back to the start of the decoding process (process 4202). In some embodiments, the calling function provides the message to the user interface to provoke the user to rescan the image. Note that, in the above embodiments of decoding methods, one or more processes can be removed or added to the methods to achieve an acceptable outcome.
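Processes 4204-4214 (slicing the image and processing sub-images in parallel) can be sketched as follows. The tiling scheme and thread-based parallelism are illustrative assumptions, and `decode_one` stands in for whatever per-sub-image decoder is used.

```python
from concurrent.futures import ThreadPoolExecutor

def slice_into_subimages(image, tile):
    """Slice a 2-D image (a list of pixel rows) into tile x tile
    sub-images, as in process 4208."""
    h = len(image)
    tiles = []
    for r in range(0, h, tile):
        for c in range(0, len(image[0]), tile):
            tiles.append([row[c:c + tile] for row in image[r:r + tile]])
    return tiles

def process_all(subimages, decode_one):
    """Run a per-sub-image decoder over every sub-image; the sub-images
    are independent, so they can be processed in parallel (processes
    4210-4214)."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(decode_one, subimages))
```

Each sub-image result would then feed the edge-boundary mapping and voting stages described below.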


At process 4218, it is determined whether the decoded symbol is for the weft or for the warp. Because this embodiment does not require a fixed set of orientation directions for successful decoding, the directions for weft and warp need not necessarily be constant. In other words, the local directions associated with a warp and a weft of the fabric can vary over at least a portion of the fabric.


At process 4220, the symbols are validated for closeness of fit (to assure that a symbol is not selected that is outside a range of dimensional tolerances for selection) and, at process 4222, the code is validated against design rules by the processor (to assure that selection of a given code complies with system context requirements, such as check-sum or other error checking methods). At process 4224, the processor collects “votes” for symbol candidates for each sub-image. (Although up to this point we have discussed the processes as if only a single symbol candidate is presented as part of the decoding processes, in fact the system allows for a plurality of symbol candidates.) Criteria giving rise to selection of a given symbol candidate as the decoded result of a given pattern element are evaluated to produce a number of “votes” or weights in favor of selection of a given symbol candidate.


At process 4226, the processor applies voting rules to determine if there is a winning symbol set. If, at process 4228, a winner is not selected, then control passes to process 4216. If, at process 4228, a winner is selected, then, at process 4230, the winning symbol set is returned to the calling function to trigger additional application behavior. In an embodiment, additional application behavior can include displaying the decoded message of the code on the user interface. In another embodiment, the behavior can include providing the winning code for further processing by, for example, a social media server system. In a further related embodiment, the processor is configured to handle a situation, as illustrated in FIG. 45E, wherein a plurality of distinct patterns are found in a single image; in that case, each pattern is first identified, and then each pattern is subject to separate processing, in each case as described above in connection with FIG. 42.



FIG. 43 illustrates processes 4204, 4206, and 4208 of FIG. 42, in which the image is converted to gray-scale, a portion of the converted image is chosen for analysis, and then sliced into sub-images. In FIG. 43, the superimposed grid lines and diagonal line indicate how the image may be sliced into sub-images.



FIG. 44 illustrates processes 4210, 4212, 4214, 4216, 4218, 4220, and 4222, in which parallel processing, carried out for each sub-image, involves locating edge boundaries, attempting to map the boundaries to valid codes, and, if the mapping is successful, decoding the pattern into a weft or warp symbol as the case may be, validating the resulting symbol for closeness of fit, and further validating the resulting symbol against design rules. In FIG. 44, on the left is shown a sub-image 442 and on the right a corresponding set of boundaries 444 determined from the sub-image. These boundaries can be further processed to yield a set of symbol candidates as described previously in connection with FIG. 42.


An example of evaluating the votes provided for symbol candidates is provided in Table 1:










TABLE 1

Collect results from all sub-image analyses     Determine winning code, if any

(warp)  044   12 votes                          ✓ warp = 044
(weft)  AD2   10 votes                          ✓ weft = AD2
(weft)  0FD    1 vote
(weft)  BDD    2 votes
(warp)  1D1    2 votes

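The vote tally of Table 1 can be sketched as a simple counting routine. The function name and the list-of-pairs input format are assumptions for illustration.

```python
from collections import Counter

def winning_codes(votes):
    """Tally per-sub-image votes and pick a winning code per direction.

    votes is a list of (direction, code) pairs collected from all
    sub-image analyses; the most-voted code wins in each direction.
    """
    tallies = {"warp": Counter(), "weft": Counter()}
    for direction, code in votes:
        tallies[direction][code] += 1
    return {d: (t.most_common(1)[0][0] if t else None)
            for d, t in tallies.items()}
```

Replaying the votes of Table 1 through this routine selects warp 044 (12 votes) and weft AD2 (10 votes), as in the table.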


Exemplary Augmented and Virtual Reality Applications

In various embodiments, one or more image frames capturing one or more encoded articles can be modified based on one or more codes contained in the encoded article(s) to provide an augmented or virtual reality experience. For example, a graphical element can be overlayed over part or all of an image frame, where the graphical element is selected based on one or more codes contained in the encoded article(s). The graphical element can include any type of graphic that can be displayed in an image frame, such as, for example, textual information, a digital image in any suitable format (e.g., gif, tiff, etc.), a computer-generated icon, an avatar, an animation (with or without sound), an “emoji,” or other type of graphic. In appropriate contexts, the graphical element may be fixed (e.g., pre-selected and stored) or may be generated or updated in real-time. Multiple graphical elements may be displayed simultaneously based on multiple codes captured in the image frame, e.g., graphical elements for multiple people or items identified in the image frame.


In some embodiments, the owner of an encoded article (e.g., an article of clothing, a backpack, etc.) may be permitted to specify the graphical element to be displayed when the encoded article is captured in one or more image frames. For example, the owner of the encoded article may specify a photograph or avatar to be displayed.


In some embodiments, the graphical element can be interactive. For example, a user may be permitted to select or otherwise interact with a graphical element, e.g., to contact or obtain additional information about the owner of the encoded article associated with the graphical element. Additionally or alternatively, when multiple graphical elements are displayed, the graphical elements can be programmed to interact either under the control of a user or independently in order to provide additional augmented or virtual reality experiences.


In some embodiments, the position and/or size of a graphical element can be updated in real-time, such as to reflect changes across a sequence of image frames (e.g., changes in the position of a person wearing an encoded article). In this respect, the image processor may be configured to identify the location and visible size of a code in each image frame, select an appropriate graphical element based on the code (e.g., obtain a user-specified graphical element from a database), resize the selected graphical element if needed based on the visible size of the code in the image frame, and overlay the selected and optionally resized graphical element on the image frame based on the location of the code in the image frame.
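The overlay placement described above can be sketched as a small geometry helper. Scaling the element to the code's visible width (preserving aspect ratio) and centering it on the code are assumptions about the placement policy.

```python
def place_overlay(code_box, element_size):
    """Scale and position a graphical element over a detected code.

    code_box     -- (x, y, w, h) of the code in the image frame, in pixels
    element_size -- (w, h) natural size of the graphical element

    Returns (x, y, w, h) for the overlay: the element is scaled so its
    width matches the visible width of the code, and it is centered
    vertically on the code.
    """
    cx, cy, cw, ch = code_box
    ew, eh = element_size
    scale = cw / ew                  # match overlay width to code width
    oh = eh * scale                  # height scales to preserve aspect ratio
    return (cx, cy + (ch - oh) / 2, cw, oh)
```

Recomputing this per frame keeps the avatar following the code as the wearer moves or as the camera distance and zoom change, as illustrated in FIGS. 45C and 46A-46B.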


It should be noted that augmented and virtual reality experiences can be used in a wide range of applications, including, without limitation, entertainment applications (e.g., social media and video games), security/tracking applications (e.g., identifying security personnel in security videos), and military applications (e.g. monitoring soldiers on a battlefield).



FIGS. 45A through 45E illustrate examples of augmented reality experiences provided through graphical elements (e.g., names, avatars, text) overlayed on image frames based on codes identified in encoded articles, in accordance with embodiments of the present invention. (In FIGS. 45B through 45E and 46A and B, the names and some of the avatars have been obscured to protect privacy.)


In FIG. 45A, the fabric of a chair is shown to have impressed thereon an encoded pattern 452, and in FIG. 45B, this decoded pattern is used to produce an augmented reality experience in which an avatar 454 and name 456 associated with the code are caused to overlie the image of the pattern.



FIG. 45C shows that the avatar 458 can follow the fabric pattern even when the wearer of the fabric is in motion.


In FIG. 45D, the fabric pattern is associated with a backpack and the avatar 459, name 4591, and greeting 4592 are displayed over the pattern of the backpack.


In FIG. 45E, there are present three different backpacks, each with a different pattern, and the augmented reality system overlays on each distinct pattern a distinct name and avatar associated with the distinct pattern.



FIGS. 46A and 46B illustrate that, in different image sizes of the patterned backpack 461 (e.g., owing to different distances between the smartphone camera and the backpack or different zoom settings of the smartphone camera), the augmented reality system overlays on the pattern a correspondingly scaled name and avatar associated with the pattern.



FIG. 47 is a logical flow diagram illustrating processes for a basic encoded pattern system (in the first column) and for an augmented reality system (occupying both columns) in accordance with embodiments of the present invention. The image frame is processed to detect a code in a pattern in the image (in block 4702) and also (in block 4704) to return information such as the location and visual size of the pattern in the image frame. The decoded pattern can be used to retrieve relevant information including the identity of the owner and an owner-selected graphical element (block 4706). For operation of the augmented reality aspect of the system, the processing of the image data also returns the location and size of the pattern (block 4708). The owner-selected and optionally resized graphical element can then be overlayed (in block 4710) on the image frame based on the location of the code in the image frame and updated with any augmented or virtual reality interactions (block 4712) that are presented to the user.


Pattern in Aesthetic Environment



FIG. 48 is an embodiment wherein a pattern is incorporated into a representational aesthetic environment. In this environment, the pattern operates just as described herein and in the Prior Application, for example, by incorporating strips oriented in one or more directions (such as, for example, associated with the warp and/or weft of the fabric, although other orientations are possible). As with patterns described herein and in the Prior Application, the pattern can encode information based on characteristics of the strips, such as, for example, the relative widths, lengths, and/or positions of the strips. Strips can be linear or non-linear (e.g., an encoded pattern could incorporate circular or other non-linear strips having different characteristics such as widths). Thus, a “strip” can include virtually any type of visual element including elements with linear, non-linear, regular, and/or irregular shapes. As described in paragraphs 217-227 of the present patent application and in paragraphs 189-199 of the Prior Application, repetition of the pattern is not essential (see paragraph 221 of the present patent application and paragraph 193 of the Prior Application) and, when repetition is present, the repeated units can be arbitrarily positioned with respect to one another. Furthermore (see paragraph 222 of the present patent application and paragraph 194 of the Prior Application), the directions for weft and warp need not necessarily be constant, and the local directions associated with a warp and a weft of the fabric can vary over the fabric.


Accordingly, the representational aesthetic environment of FIG. 48, in the context of the above-discussed flexibility of the pattern, is used to further disguise the presence of the plaid code. In this case, the representational aesthetic environment evokes plants, and specifically grasses, thereby providing an aesthetically-pleasing decoration for the item (perhaps evoking springtime) while hiding the fact that an encoded pattern is present. Here, information can be encoded in the relative widths, lengths, positions, and/or colors of the grasses. Thus, for example, the pattern can be a one-dimensional code or a two-dimensional code (e.g., akin to a plaid code), although other encodings are possible. Other types of representational aesthetic environments can be contemplated. For example, an embodiment evoking summer could encode information in representations of ocean waves or in representations of beach umbrellas, an embodiment evoking fall could encode information in representations of tree branches or in representations of leaves that have fallen from a tree, and an embodiment evoking winter could encode information in representations of snowflakes. These are non-limiting examples, and encodings of the types described herein can be applied to many other representational aesthetic environments.


In another set of embodiments, the present invention can be used to inspire, not simply social interaction using social networks and the like as discussed herein and in the Prior Application, but also e-commerce. We begin with FIG. 49, in which is depicted a plaid-encoded article, in this case a backpack, that may be used in connection with an application executing on a mobile computing device to provide identification of a wearer of the article, based on the plaid code embedded in the article.



FIGS. 50 through 53 are representations of display screens on a mobile computing device executing an application providing identification of the wearer of the plaid-encoded article of FIG. 49, as well as providing a platform for social interaction, with an e-commerce extension.



FIG. 50 is a screen displayed to a third-party user (who will later be identified as John Smith) of the application, after the application has recognized the wearer as Dr. Jane Doe based on the plaid code, and the user of the application has invoked the “share a coffee” functionality of the application, accessed by icon 31.



FIG. 51 is a screen displayed by the same application as in FIG. 50, this time executing on the phone of the wearer of the plaid-encoded article, in this case Dr. Jane Doe, after the user in FIG. 50 has invoked the “share a coffee” functionality of the application.



FIG. 52 is a screen displayed by the application executing on the same phone as in FIG. 51, after Dr. Jane Doe has graphically invoked the message icon 41 of FIG. 51, informing Dr. Doe in text region 51 that John Smith has shared a coffee with her, and providing graphical button 52 by which she can redeem the shared coffee.



FIG. 53 is a screen displayed by the application executing on the same phone as in FIG. 51, after Dr. Jane Doe has graphically invoked the graphical button 52 to redeem the shared coffee.


Fabrics with Embedded Fiber Transmitters


In another set of embodiments, there is provided an article including a fabric in which is embedded a set of fiber transmitters, operating in visible or invisible wavelengths, that are configured to transmit information. In one embodiment, a single fiber can include one or more light-emitting diodes (LEDs) that typically are no wider than the width of the fiber itself, thereby making the LED(s) virtually undetectable when not turned on, and one or more of such fibers can be embedded in a fabric. In another embodiment, a single fiber can change a visual property such as color or opacity, and one or more of such fibers can be embedded in a fabric. In one embodiment, as described herein and in the Prior Application, the article is a selected one of a set of articles, each article of the set comprising a fabric and being associated with a unique identification code; in that embodiment, the transmitted information includes the unique identification code. However, because the fiber transmitters are active elements, the transmitted information can be changed at will according to needs or context. Virtually any type of data can be encoded in a transmission, e.g., a unique identification code (e.g., identifying the fabric or the wearer), an emergency beacon (e.g., indicating that the wearer is in distress or needs assistance), streaming media (e.g., a video or music stream), a file identifier, a web address, a secret message, etc. Signals can be transmitted on a continual basis or only upon the occurrence of a particular event (e.g., activation by the user, such as in an emergency, or activated automatically, such as in a “person down” situation). Different codes can be transmitted for different types of events under user control or otherwise. In any case, the transmitted data are configured to be read and decoded by a receiver such as a mobile computing device, typically in a manner wherein the selected article is contextually recognizable. 
In a related embodiment, the encoded information is represented in the on-off pattern of the fiber transmitters, wherein the fiber transmitter typically operates at 60 Hz to 120 Hz and can be varied so fast that the on-off pattern will be undetectable by the naked eye (e.g., above around 80 Hz). In a further related embodiment, the fiber transmitters of the selected article are visible from the front and the back of the selected article.
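The on-off framing concept can be sketched as follows. This is a minimal assumption-laden sketch, not the actual transmission format: the preamble, fixed payload length, and single parity bit are invented stand-ins for the framing and error correction a real system would use.

```python
# Illustrative on-off framing: a code is wrapped in a start preamble and a
# parity bit, then emitted one symbol per transmitter cycle (e.g., at 120 Hz).
# The receiver scans the sampled on/off sequence for the preamble and
# validates the parity before accepting the payload.

PREAMBLE = [1, 1, 1, 0]  # assumed start marker

def frame_bits(code: int, n_bits: int) -> list:
    bits = [(code >> (n_bits - 1 - i)) & 1 for i in range(n_bits)]
    parity = sum(bits) % 2
    return PREAMBLE + bits + [parity]

def deframe_bits(seq: list, n_bits: int):
    """Locate the preamble, read the payload, and verify parity."""
    for i in range(len(seq) - len(PREAMBLE) - n_bits):
        if seq[i:i + len(PREAMBLE)] == PREAMBLE:
            payload = seq[i + len(PREAMBLE): i + len(PREAMBLE) + n_bits]
            parity = seq[i + len(PREAMBLE) + n_bits]
            if sum(payload) % 2 == parity:
                return int("".join(map(str, payload)), 2)
    return None  # no valid frame found

assert deframe_bits(frame_bits(0b1011, 4), 4) == 0b1011
```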



FIGS. 54A, 54B, and 54C thus illustrate embodiments of the present invention wherein a set of fiber transmitters is embedded in a fabric of a wearable article, wherein the transmitters operate in visible or invisible wavelengths.


In FIG. 54A, the set of fibers is embedded in the fabric of a hat.


In FIG. 54B, the set of fibers is embedded in the fabric of an outer garment.


In FIG. 54C, there is provided detail of a light-emitting portion of the fiber in FIG. 54B.



FIG. 55 is a block diagram of logical flow used in decoding data transmitted by a set of fiber transmitters embedded in a fabric of a wearable article in accordance with an embodiment of the present invention, and FIG. 56 is a block diagram of logical flow used in a tracker process in the logical flow of FIG. 55. Image data are collected at a frame rate sufficiently fast to ensure that on-off frame sequence information from the transmitter is captured without significant loss of frames. In particular, a capture frame rate of 240 frames per second is used with a source transmitting speed of 120 Hz. The image frame containing the fabric is split into a set of sub-images. A set of trackers is kept in a resource pool for decoding needs. For each incoming image frame, each active tracker is updated with the sub-image it is tracking. The on-off sequence is accumulated. The exact location of the transmitting source is updated so that the tracker will continue to follow a potentially moving source. If there is a reading error, the tracker is retired and returned to the resource pool. Once the message is read completely and validated against design rules including error correction codes, it is passed to the calling function, triggering additional app behavior. For each sub-image on which no active tracker is working, a test is performed to determine if there is a potential transmitting source in the sub-image. If one is found, a tracker is assigned to the sub-image. Specifically, in one exemplary embodiment, the light intensity in the sub-image is compared with that from the same location of the previous image frame, and if the change is above a threshold, the sub-image is considered to contain a potential transmitting source.
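The tracker-pool logic of FIGS. 55 and 56 can be sketched as follows. This is a simplified, hedged sketch: all class and variable names, the intensity-change threshold, and the on/off decision level are assumptions for illustration; the actual modules are those of Exhibits A through D.

```python
# Sketch of the tracker pool: each frame is split into sub-images; a jump in
# a sub-image's brightness relative to the previous frame flags a candidate
# transmitting source, and a tracker from the pool begins accumulating that
# sub-image's on/off sequence. Threshold values are illustrative.

THRESHOLD = 40   # assumed minimum intensity change marking a candidate source
ON_LEVEL = 128   # assumed brightness above which a sample is read as "on"

class Tracker:
    def __init__(self):
        self.region = None    # index of the sub-image being followed
        self.sequence = []    # accumulated on/off samples

    def update(self, intensity):
        self.sequence.append(1 if intensity >= ON_LEVEL else 0)

class TrackerPool:
    def __init__(self, size=4):
        self.idle = [Tracker() for _ in range(size)]  # resource pool
        self.active = {}                              # region -> Tracker

    def process_frame(self, intensities, previous):
        """intensities/previous: per-sub-image mean brightness lists."""
        for region, level in enumerate(intensities):
            if region in self.active:
                # An active tracker keeps accumulating its on/off sequence.
                self.active[region].update(level)
            elif self.idle and abs(level - previous[region]) > THRESHOLD:
                # New candidate source: assign a tracker from the pool.
                tracker = self.idle.pop()
                tracker.region, tracker.sequence = region, []
                tracker.update(level)
                self.active[region] = tracker

pool = TrackerPool()
pool.process_frame([200, 10], previous=[10, 10])   # region 0 switches on
pool.process_frame([0, 10], previous=[200, 10])    # region 0 switches off
assert pool.active[0].sequence == [1, 0]
```

Once a tracker's accumulated sequence forms a complete message, it would be validated against the design rules (e.g., with the framing/parity check sketched earlier) and either passed to the calling function or discarded, with the tracker returned to the pool.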


Exemplary basic modules executing on an iPhone to implement the foregoing logical flow are attached as Exhibits A and B (for the light code reader) and Exhibits C and D (for the light code tracker). Each of these Exhibits is hereby incorporated herein physically and by reference in its entirety.


Accordingly, in various of these embodiments, there is provided a method of decoding data transmitted by fiber transmitters embedded in fabric from images captured by a camera, the method employing computer processes to recover the encoded data. In this embodiment, the method includes: selecting a first portion of the image for analysis; dividing the image into sub-images; and processing each of the sub-images. In turn, processing each of the sub-images includes: determining a possible transmitting source within the sub-image; assigning a specific tracker to follow the source through subsequent images to read the data; and determining, for each tracker, whether a valid data stream is present and, if so, decoding the stream; otherwise, discarding the data and returning the tracker to the resource pool.


In a further related embodiment, processing each tracker further comprises validating the received symbol set and validating each symbol set against a set of design rules.


In some embodiments, at least one fiber embedded in a fabric can sense light and therefore can act as a receiver, such as for receiving light-based signals generated from another nearby fabric as discussed above. Among other things, the use of such receiver fibers can allow for fabric-to-fabric communications. Fabrics can include both transmitter and receiver fibers (or a combination transmitter/receiver fiber) to allow for communication in one or both directions. Such fabrics can be made to monitor for a specific signal and transmit an appropriate reply signal (e.g., fabrics can be individually polled by a central controller).


Fibers that can change visual properties can be used in fabrics that include an encoded pattern in order to allow for dynamically changing the encoded pattern. For example, when such fibers are used as part of an encoded pattern in a fabric (e.g., in the warp or weft direction), the ability to change visual properties of the fibers can be used to program an encoded pattern (e.g., multiple fabrics can be produced using the same configuration of fibers but programmed to encode different patterns) and/or to dynamically change an encoded pattern (e.g., by changing the color or opacity of one or more fibers to dynamically increase or decrease the effective width of a stripe within an encoded pattern).
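The programmable-pattern idea can be made concrete with a small sketch. This is an illustrative assumption, not the described hardware: the fiber counts and the on/off representation of fiber states are invented for the example.

```python
# Sketch of programming an encoded pattern with color-changeable fibers:
# the same physical layout of fibers yields different effective stripe
# widths depending on which fibers are switched to the stripe color.

def effective_stripes(fiber_states):
    """Collapse a row of fiber states into (value, width) runs."""
    runs = []
    for state in fiber_states:
        if runs and runs[-1][0] == state:
            runs[-1][1] += 1          # extend the current stripe
        else:
            runs.append([state, 1])   # start a new stripe
    return [tuple(r) for r in runs]

# The same 8-fiber layout programmed two different ways:
pattern_a = [1, 1, 0, 0, 0, 1, 1, 1]
pattern_b = [1, 1, 1, 1, 0, 0, 1, 1]
assert effective_stripes(pattern_a) == [(1, 2), (0, 3), (1, 3)]
assert effective_stripes(pattern_b) == [(1, 4), (0, 2), (1, 2)]
```

Here, toggling two fibers changes the effective widths of all three stripes, which is the mechanism by which one physical fabric could carry different encoded patterns over time.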


Novel Authentication Arrangement


In another embodiment, the invention provides a method in which a first mobile computing device detects the encoded pattern from an article of clothing of a specific user having a second mobile computing device. The first mobile computing device sends the detected code to a server, requesting connection. The server forwards the request to the second mobile computing device, and the two determine a one-time verification code that the second mobile computing device downloads to the article of clothing for transmission. The first mobile computing device reads this one-time verification code and sends it to the server to complete the verification that it is indeed looking at that article of clothing in real time. This real-time, in-person confirmation of the digital/physical link is not possible in the static-pattern system described herein and in the Prior Application: because a static code never changes, a recorded picture of the article could equally be used for authentication in that context.
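The server's side of this handshake can be sketched as a simple challenge-response exchange. This is a hypothetical sketch under stated assumptions: the message names, the code format, and the use of Python's `secrets` module are illustrative, not taken from the specification.

```python
# Sketch of the one-time verification handshake: the server issues a
# single-use code for the wearer's device to load into the garment's
# transmitters, then checks that the viewing device actually observed
# that code being transmitted.
import secrets

class Server:
    def __init__(self):
        self.pending = {}   # garment_id -> expected one-time code

    def request_connection(self, garment_id):
        # Issue a one-time code, to be delivered to the wearer's (second)
        # device and downloaded to the garment for transmission.
        code = secrets.token_hex(4)
        self.pending[garment_id] = code
        return code

    def verify(self, garment_id, observed_code):
        # The viewing (first) device reports the code it saw the garment
        # transmit; the code is consumed whether or not it matches.
        return self.pending.pop(garment_id, None) == observed_code

server = Server()
one_time = server.request_connection("garment-123")
# ...wearer's phone loads `one_time` into the garment; viewer reads it...
assert server.verify("garment-123", one_time) is True
assert server.verify("garment-123", one_time) is False  # single-use
```

Consuming the code on the first verification attempt is what makes a replayed recording useless, in contrast to the static-pattern case.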


In another embodiment, which does not depend on whether the transmitted code is active or passive, the device reading the code need not be another individual holding a mobile computing device. The reading device may be a set of cameras as part of a vendor's infrastructure, such as in a store or in another commerce venue. In these cases, the reading of the identification on an article of clothing allows the vendor to provide an individualized experience to each user wearing an article of clothing having a uniquely identifiable code in accordance with various embodiments of the present invention.


More generally, one or more transmitters and/or receivers (referred to generally herein as “active elements”) operating in visible or invisible wavelengths can be integrated into an article or fabric. In some exemplary embodiments described above, transmitters and/or receivers in the form of fiber transmitters/receivers are embedded into an article or fabric such as by weaving the fiber into the article or fabric, although transmitters and/or receivers of other forms can be integrated into an article or fabric in other ways. For example, individual transmitters and/or receivers (e.g., LEDs) can be integrated into an article or fabric such as by attaching the individual transmitters/receivers to the article or fabric (e.g., by gluing, sewing, etc.), or one or more transmitters and/or receivers can be integrated into a separate device that in turn is integrated into an article or fabric.


In certain additional embodiments, one or more transmitters and/or receivers operating in visible or invisible wavelengths are integrated into a button that can be placed on an article or fabric. For example, one or more transmitters and/or receivers can be embedded in, attached to, or otherwise made part of a button. Generally speaking, buttons of typical embodiments include a button shank that is covered by a covering material such as a cloth or other fabric, although some buttons are not covered. The button shank may be a single piece of material or may be formed from multiple pieces of material. In some embodiments, active elements are molded into the button shank, such as during a plastics molding process.


In various exemplary embodiments, active elements can be placed under the covering material, can be placed on or over the covering material, or can be woven into the covering material. In embodiments where transmitters are placed under the covering material, the transmitters and covering material may be selected so that the button appears to be a regular button (i.e., without active elements) when the transmitters are off but light from the transmitters passes through the covering material when the transmitters are on such that transmitted signals can be detected and decoded by an outside device such as described above. Similarly, the covering material can be chosen to allow light to pass through and be received by receivers under the covering material.


The button can be attached to an article or fabric using any appropriate mechanism, such as by sewing or by attachment using a rivet, grommet, pin, clip, or other attachment mechanism. The button can include other elements for operating the transmitters and/or receivers, such as, for example, a power source (e.g., a battery, an electromagnetically-actuated energy source such as used in passive RFID tags, a light-actuated energy source such as a photocell or photoresistor, etc.), a processor such as for controlling transmitters and/or processing signals from receivers, a wireless receiver or transceiver such as for communication with an external device, various electrical leads (e.g., wire, thin conductive yarn, fiber transmitter or receiver, etc.) that can be attached to a power source or controller, and/or other elements.



FIG. 57 is a schematic diagram showing a cloth-covered shank button, in accordance with one exemplary embodiment. Shown from left to right are a top view 1002 of the cloth-covered shank button with LEDs on so as to show through the fabric, a bottom view showing a shank insert 1004 and surrounding cloth cover 1006, and a rivet 1008 that can be used to attach the cloth-covered shank button to an article or fabric such as a hat. In this example, the rivet 1008 includes a fastening element that mates with a corresponding receptacle of the shank 1004 such that the shank 1004 can be placed on one side of an article or fabric and then the rivet 1008 can be inserted through the article or fabric from the reverse side and then mated with the shank 1004. In this example, the LED transmitters are placed along the outer periphery and top center of the button shank such that transmitted light can be seen from virtually any direction around and above the button. Electrical leads for the LEDs (not shown for convenience) can be placed under the cloth cover 1006 or can be included as part of the cloth cover 1006 (e.g., fibers woven into the cloth). FIG. 58 is a schematic diagram showing two alternative configurations for a cloth-covered button, in accordance with various alternative embodiments.


Shank buttons of the types shown in FIGS. 57-58 are typically formed from two pieces, specifically a shank insert and a shank cap that mates with the shank insert. In order to form a cloth-covered button, the shank cap is covered with cloth, and then the cloth-covered shank cap is mated with the shank insert to secure the cloth and form a unitary shank button that then can be secured onto an article or fabric (e.g., using a rivet or other appropriate fastener). FIG. 59 shows exemplary shank inserts 1202 and corresponding rivets 1204, in accordance with one exemplary embodiment. FIG. 60 shows an exemplary shank cap 1302 that mates with the shank insert 1202, specifically by inserting the top portion of the shank insert 1202 into the opening of the cloth-covered shank cap 1302. In certain exemplary embodiments, active elements can be placed on the shank cap 1302, after which the shank cap 1302 can be covered with cloth and then mated with the shank insert 1202 to form a unitary button. As mentioned above, electrical leads for the active elements can be placed under the cloth cover 1006 or can be included as part of the cloth cover 1006 (e.g., fibers woven into the cloth). In either case, the electrical leads can be made to protrude from the button such as to allow for connection to an external device such as a power source or controller. FIG. 61 is a schematic diagram showing an exemplary shank cap 1402 onto which a number of LEDs have been placed, such as by using an adhesive to attach the LEDs to the shank cap 1402, in accordance with one exemplary embodiment. Alternatively, active elements could be placed on the shank insert 1202 or integrated inside of the shank cap 1302, and openings could be included in the shank cap 1302 to expose the active elements as needed. FIG. 62 is a schematic diagram showing three exemplary cloth-covered shank buttons of the type that can be formed as described herein, where button 1502 is shown from the bottom side with no LEDs showing through the cloth covering, button 1504 is shown from the top side with a number of red LEDs showing through the cloth covering, and button 1506 is shown from the top side with no LEDs showing through the cloth covering. It should be noted that active elements can be used with other types of buttons and the present invention is not limited to any particular type of button. Thus, for example, embodiments can include other types of buttons that are configured to be attached to an article or fabric in other ways.


As discussed above, buttons can include other elements for operating the transmitters and/or receivers. It is envisioned that such elements can be included in or on the shank insert, the shank cap, and/or the rivet or other connector of buttons of the types described herein.



FIG. 63 is a schematic diagram showing a hat incorporating a button of the type described herein, in accordance with one exemplary embodiment. In this example, the hat is a baseball-cap style hat with a button 1602 (sometimes referred to as a “squatchee”) containing LEDs and potentially other elements as discussed herein. The hat can include other elements, such as a power source and/or controller, which in some cases may be hidden under the hat or within a seam of the hat. Among other things, having LEDs in or on the button on top of a hat can provide all-around visibility by LED-reading enabled cameras.


It should be noted that, while various exemplary embodiments are described herein with reference to buttons, other types of devices having active elements may be produced. Devices of the type described herein can be permanently or temporarily affixed to an article or fabric. For example, devices that are sewn on or attached by rivet might be considered permanently affixed, while buttons that are pinned or clipped onto an article or fabric might be considered temporarily affixed, which may be desirable for certain uses (e.g., distributing devices for a particular event in which participants can be tracked using the active devices). Such devices can be used on virtually any type of product, such as, for example, baseball caps, beanies, handbags, apparel, home furnishings, and consumer soft goods products, to name but a few.


As with other exemplary embodiments described above, each button or other device may be associated with a unique identification code, which may be hard-coded or programmable, and the information transmitted by the button may include the unique identification code. In some cases, because the transmitters are active elements, the transmitted information can be changed at will according to needs or context. Virtually any type of data can be encoded in a transmission, e.g., a unique identification code (e.g., identifying the fabric or the wearer), an emergency beacon (e.g., indicating that the wearer is in distress or needs assistance), streaming media (e.g., a video or music stream), a file identifier, a web address, a secret message, etc. Signals can be transmitted on a continual basis or only upon the occurrence of a particular event (e.g., activation by the user, such as in an emergency, or activated automatically, such as in a “person down” situation). Different codes can be transmitted for different types of events under user control or otherwise. In any case, the transmitted data are configured to be read and decoded by a receiver such as a mobile computing device, typically in a manner wherein the selected article is contextually recognizable. In a related embodiment, the encoded information is represented in the on-off pattern of the transmitters, wherein the transmitter typically operates at 60 Hz to 120 Hz and can be varied so fast that the on-off pattern will be undetectable by the naked eye (e.g., above around 80 Hz). In a further related embodiment, when the button or other device is placed on an article such as at the top of a hat (e.g., a baseball cap or similar type of hat), the transmitters are visible from the front and the back of the article and also from the top of the article, allowing transmitted information to be read from virtually any direction.


One potential advantage of a button or other device that includes the active transmitter and/or receiver elements and can be attached to an article or garment is that the active elements can be protected from potentially harmful fabrication processes involved in production of the article or fabric, such as, for example, molding, heat setting, sewing, welding, weaving, or knitting processes. The article or fabric can be produced, and then the button or other device can be added to the article or fabric in order to integrate the active elements.


Obfuscated Codes


In other embodiments, a coded pattern that is woven into a fabric as discussed herein and in the Prior Application can be obfuscated such as by applying additional graphics onto the fabric (e.g., by printing) in such a way that the coded pattern can still be read and decoded by a reader through the additional graphics. FIG. 64 is a photograph of an obfuscated coded pattern, in accordance with one exemplary embodiment. In this example, a barcode-type pattern is woven into the fabric, and then additional graphics are printed onto the fabric such that the underlying coded pattern can still be discerned by a reader (and in this case also visually) but is obfuscated by the overlying graphics. Of course, other types of coded patterns of the types described herein and in the Prior Application (e.g., plaid codes, two dimensional codes, etc.) may be woven into the fabric and obfuscated. In this way, fabrics with various types of aesthetics can be produced while still allowing coded patterns to be present and read. In order to read an obfuscated coded pattern, the reader may be configured to adjust the contrast of the image of the fabric so as to enhance the underlying coded pattern.
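The contrast adjustment suggested for reading obfuscated codes can be illustrated with a linear contrast stretch. This is a sketch of one common technique, not necessarily the reader's actual method, and the sample pixel values are invented for illustration.

```python
# Sketch of reading an obfuscated code via contrast enhancement: a linear
# stretch spreads the faint brightness difference between woven stripes
# apart so that simple thresholding recovers the underlying bar pattern.

def contrast_stretch(pixels):
    """Linearly rescale pixel values to the full 0-255 range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return [0] * len(pixels)
    return [round(255 * (p - lo) / (hi - lo)) for p in pixels]

def threshold(pixels, cutoff=128):
    """Binarize the enhanced row into stripe (1) / background (0)."""
    return [1 if p >= cutoff else 0 for p in pixels]

# A faint woven stripe pattern muddied by overprinted graphics:
row = [110, 112, 140, 142, 111, 141]
assert threshold(contrast_stretch(row)) == [0, 0, 1, 1, 0, 1]
```

In practice the enhancement would be applied over a two-dimensional image region, and more robust methods (e.g., adaptive thresholding) could be substituted, but the principle of amplifying the woven pattern relative to the overlay is the same.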


Providing Personalized Experiences


In certain exemplary embodiments, light-based communications of the type described above are used to provide personalized experiences to visitors in museums, sports stadiums, stores, shopping centers, and other facilities such as to tailor an experience to a person's individual needs or tastes (e.g., based on a user profile or user responses to a questionnaire) or to provide location-specific information (e.g., exhibit information, advertising, wayfinding/navigation, etc.). For purposes of the following discussion, the term “facility” is virtually unlimited and can include any area or space in which cameras and optionally also lighting can be placed, including enclosed areas (e.g., within a building or vehicle) and open areas (e.g., in a park). Generally speaking, such experiences have been provided through private tour guides or have been limited to the small fraction of visitors able to pay premium prices for specialized equipment.


In exemplary embodiments, visitors to a facility are equipped with a user device that transmits information to the facility infrastructure via light-based communications, e.g., from an LED. The user device may be provided by the facility or may be provided by the visitor or other source. The information transmitted by the user device to the infrastructure typically includes a device identifier and additionally or alternatively could include other information such as, for example, information about the device, information about the device user, requests or other inputs from the device user, etc. Cameras located in the facility infrastructure allow for receiving light-based communications from such devices, allowing the infrastructure to identify the devices, the locations of the devices (e.g., at a particular exhibit, entrance/exit, floor, etc.), the orientations of the devices (e.g., facing toward or away from a particular exhibit), and/or other parameters, enabling precision locational and directional guidance not easily achievable with other technology, such as Bluetooth, GPS, WiFi, etc.


Once the infrastructure identifies the devices via light-based communications, the infrastructure can then transmit personalized information to each device using the reception capabilities of the devices, e.g., BLE RSSI, Wi-Fi, Bluetooth, etc. However, the aforementioned approaches generally require the use of a smart phone or other similar communication device in order to function. Moreover, each has limitations with respect to spatial resolution and orientation of the visitor.


Therefore, in certain exemplary embodiments, the devices are equipped with a light-based receiver for receiving information from the infrastructure as well as an output device through which personalized information received from the infrastructure can be conveyed to the device user. Typically, the device would include an audio output device such as a speaker, headphones/earphones, or a headphone/earphone interface (e.g., audio jack, Bluetooth, etc.), although the device additionally or alternatively could include other types of output devices such as a display device, a tactile output device for vision-impaired users, etc. In this way, the infrastructure can transmit personalized information to the device for output to the user such as using existing LED lighting infrastructure and/or supplemental LED lighting that can be modulated to transmit information to the user devices. The light-based transmissions from the infrastructure to the user devices can be carried on separate logical or physical communication channels (e.g., using wavelength-division multiplexing) or can be carried over a common communication channel and addressed for individual devices (e.g., using packet-based communications). Broadcast and multicast communications also could be supported, e.g., for providing the same information to a group of users (e.g., members of a particular tour group), or to broadcast a message to all users (e.g., in case of an emergency or a public announcement).
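The "common channel with per-device addressing" option can be sketched as a simple packet format. The frame layout here (one address byte, one length byte, then the payload) is an assumption for illustration, not a format from the specification.

```python
# Sketch of packet-based addressing over a shared light channel: every
# receiver sees every packet but delivers only those addressed to it or
# to the broadcast address.

BROADCAST = 0xFF  # assumed all-devices address

def make_packet(address: int, payload: bytes) -> bytes:
    """Frame a downlink message: [address][length][payload]."""
    return bytes([address, len(payload)]) + payload

def receive(packet: bytes, my_address: int):
    """Return the payload if the packet is for this device, else None."""
    address, length = packet[0], packet[1]
    if address in (my_address, BROADCAST):
        return packet[2:2 + length]
    return None

pkt = make_packet(0x2A, b"exhibit 7 audio")
assert receive(pkt, 0x2A) == b"exhibit 7 audio"   # addressed device
assert receive(pkt, 0x03) is None                  # other devices ignore it
assert receive(make_packet(BROADCAST, b"closing soon"), 0x03) == b"closing soon"
```

Multicast (e.g., a tour group) could be handled the same way with a group address, while the wavelength-division alternative mentioned above would instead separate streams physically rather than by address.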


Thus, exemplary embodiments can provide a free space optical communication system that can leverage existing LED lighting infrastructure in conjunction with cameras to create a bidirectional optical communication stream between the infrastructure and the customer such as for indoor navigation or personalized experience streaming. Each visitor to a facility can enjoy a personalized experience by simply donning a typically low-profile, light-weight device (e.g., upgraded versions of commercially-available wireless earphones, or a button or hat of the types described above to be carried or worn by the user or attached to the user's clothing). Such embodiments do not require expensive private human guides and would not require the visitor to use any other devices such as smart phones, smart watches, etc.


The infrastructure generally includes a processor system (which may include one or more physical or virtual machines and may be cloud-based) to control the cameras and decode light-based transmissions received from the user devices, to generate personalized information to transmit to the user devices, and to control the infrastructure lights to transmit the personalized information to the user devices, e.g., using wavelength division multiplexing or by modulating the lights to transmit information. Many LED lighting providers are adding modulation capabilities to their lights and lighting systems, and such modulation capabilities can be leveraged in exemplary embodiments described herein. Generally speaking, the modulation circuitry modulates the LED at a rate that is not perceivable by human vision.


The personalized information transmitted to a given device can be based on various types of inputs, such as, for example, the user's identity (e.g., based on a user login or a correlation between the user and the device), the user's location and directional orientation (e.g., as determined by the infrastructure from light-based communications), the user's intentions or desires (e.g., based on a user profile or user inputs such as responses to questions), information from social media and other online sources (e.g., personal posts, browsing history, purchasing history, etc.), and decisions made by the infrastructure as to what information to present to each user (e.g., advertiser-driven information). Thus, for example, visitors to a museum may be provided with different experiences even when viewing the same exhibit based on the users' individual intentions/desires (e.g., one visitor might be interested in a particular artist and therefore might receive information about the artist including instructions for locating other artworks by the same artist, while another visitor might be interested in a particular artwork genre and therefore might receive information about the genre including instructions for locating other artworks of the same genre) or may be provided with exhibit-specific information selected by the infrastructure based on the direction the user is facing (e.g., from one location, a visitor may view multiple different exhibits, and the information presented to the visitor may be selected based on which exhibit the visitor is facing).


The following scenario describes various aspects and capabilities of an exemplary free-space optical communication system.


Visitor experience: As visitors enter a facility equipped with this technology, they may be greeted by an automatic Kiosk Stand, where they are invited by displayed words and/or audio to pick up a Mobile Unit.


Mobile Unit: The mobile unit may be a lightweight, low-profile headphone or speaker unit that contains a free-space optical communication capability and an LED for identification transmission. When the visitor dons the unit, the lights above the user begin to stream audio directions to the Mobile Unit (and hence to the visitor) optically. These instructions can orient the visitor on how optical communications work and how to enter information into the kiosk. The mobile unit preferably is invisible from the perspective of the visitor, e.g., it should not impede the movement of the visitor or detract from the visitor's experience in any way.


Kiosk Stand: The kiosk stand may be an iPad or other electronic device that may be connected to the cloud and contains a camera to read the LED identification transmission from the mobile unit. The kiosk stand may contain a software app that allows the visitor to select a custom experience from a list of available experiences. In the context of indoor navigation, for example, the custom experience may be to select a destination or tour within the building. Once the experience has been selected, the camera at the kiosk will read the unique ID being transmitted by the LED on the mobile unit. This information will be coupled with the visitor's intentions and transmitted to the cloud.


Infrastructure units: The infrastructure units consist of two elements—LED and camera. In the context of existing LED infrastructure, the LEDs can be dual-purpose, e.g., to provide necessary lighting within the facility and to facilitate optical communications through modulation of the LED lighting. The LED unit preferably disappears into the infrastructure, e.g., it can look like (and in fact can be) a standard ceiling light in the building. The camera element associated with the LED is used to read the unique IDs being transmitted by mobile units and to ascertain the orientation of the visitor. Each infrastructure unit typically possesses a unique address within the infrastructure such that when a camera reads the ID of a mobile unit, information specific to the visitor's intention and local address of the infrastructure unit can be transmitted to the cloud.
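The infrastructure-unit flow described above may be illustrated schematically: a camera samples the brightness of a mobile unit's LED once per frame, thresholds the samples into bits, and couples the recovered ID with the unit's own address and the visitor's stated intention before forwarding to the cloud. All names, the thresholding scheme, and the report format below are illustrative assumptions.

```python
# Illustrative sketch (assumed names/format) of an infrastructure unit
# decoding a mobile-unit ID and bundling it with its own local address.

def bits_from_frames(samples, threshold=0.5):
    """Threshold per-frame LED brightness samples into a bit sequence."""
    return [1 if s >= threshold else 0 for s in samples]

def make_cloud_report(unit_address, samples, visitor_intent):
    """Couple the decoded mobile-unit ID with this unit's unique address
    and the visitor's intention, for transmission to the cloud."""
    bits = bits_from_frames(samples)
    mobile_id = int("".join(str(b) for b in bits), 2)
    return {
        "unit_address": unit_address,   # unique address within the infrastructure
        "mobile_id": mobile_id,         # ID read from the mobile unit's LED
        "intent": visitor_intent,       # e.g., destination selected at the kiosk
    }

report = make_cloud_report("ceiling-07", [0.9, 0.1, 0.8, 0.85], "tour:impressionism")
```

Because each report carries the infrastructure unit's address, the cloud can infer the visitor's location (and, with multiple cameras, orientation) from which unit observed the transmission.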


Devices for use in the free space optical communication system can be equipped with ancillary input devices to allow additional information to be conveyed to the infrastructure. As but one simple example, the devices can include one or more pushbutton switches that can be pressed by the user in response to a prompt from the infrastructure, e.g., to answer “yes” or “no” to a question or to convey user intention. Of course, more complex user interfaces, such as, for example, graphical user interfaces, can be included in devices. User inputs can be conveyed by the device to the infrastructure via the light-based transmitter of the device and received by the infrastructure camera, similar to the transmission of a device identifier.


In some exemplary embodiments, a smart phone or similar smart device may be used as the user device. For example, the “flash” of the smart device could be used to transmit information to the infrastructure, while the camera of the smart device could be used to receive light-based communications from the infrastructure using the type of camera-based reception and decoding of light-based communications described above. In such a scenario, a special app may be developed to run on the device to operate the flash and camera.
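For a smart-phone flash transmission of this kind, the receiver must locate the start of a message within a continuous stream of on/off samples. One conventional approach, sketched below under stated assumptions, is to prefix each payload with a fixed preamble; the preamble choice and helper names are illustrative, and a real system would also guard against the preamble pattern occurring inside the payload (e.g., via bit stuffing).

```python
# Illustrative sketch (assumed framing): marking message boundaries with a
# preamble so a camera-based receiver can find the payload in a sample stream.

PREAMBLE = [1, 1, 1, 0]  # assumed start-of-frame marker

def frame(payload_bits):
    """Prefix the payload with the preamble before flashing it out."""
    return PREAMBLE + payload_bits

def find_payload(stream, payload_len):
    """Scan received samples for the preamble and return the payload bits
    that follow it, or None if no complete frame is present."""
    n = len(PREAMBLE)
    for i in range(len(stream) - n - payload_len + 1):
        if stream[i:i + n] == PREAMBLE:
            return stream[i + n:i + n + payload_len]
    return None

received = [0, 0] + frame([1, 0, 1]) + [0]
payload = find_payload(received, 3)
```

The same framing would apply symmetrically to the infrastructure-to-device direction received through the smart device's camera.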


Exemplary embodiments of the free space optical communication system can be used for virtually unlimited purposes. The following are but a few additional examples.


The free space optical communication system can be used to provide personalized advertising. For example, different shoppers may receive different offers when entering a particular store.


The free space optical communication system can be used to provide information about a specific product or exhibit based on the location and directional orientation of the user, as mentioned above.


The free space optical communication system can be used to provide personalized instructions to the user based on the location and directional orientation of the user, e.g., directions to get to a particular store within a shopping mall or to a particular exhibition booth at a convention, directions to the nearest exit in the event of an emergency, or directions back to a safe place if the user is lost. Thus, for example, the free space optical communication system can be used to provide real-time wayfinding directions as the user moves within a facility.


The free space optical communication system can be used to provide security and monitoring of people in a facility; information that the infrastructure transmits based on the identity of a particular user or user device need not be directed to that particular user or user device. For example, the infrastructure can be configured to monitor the location of users based on user devices carried or worn by the users (e.g., children at a daycare center or on a field trip), and the infrastructure can be configured to transmit information to another user or user device (e.g., transmitting the last known location of a lost child to an administrator or tour guide).


MISCELLANEOUS

Aspects of the present invention may be embodied in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof.


Computer program logic implementing all or part of the functionality previously described herein may be embodied in various forms, including, but in no way limited to, a source code form, a computer executable form, and various intermediate forms (e.g., forms generated by an assembler, compiler, linker, or locator). Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., an object code, an assembly language, or a high-level language such as Fortran, C, C++, JAVA, or HTML) for use with various operating systems or operating environments. The source code may define and use various data structures and communication messages. The source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.


The computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device. The computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies, networking technologies, and internetworking technologies. The computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software or a magnetic tape), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web).


Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL).


POTENTIAL CLAIMS

Various embodiments of the present invention may be characterized by the potential claims listed in the paragraphs following this paragraph (and before the actual claims provided at the end of the application). These potential claims form a part of the written description of the application. Accordingly, subject matter of the following potential claims may be presented as actual claims in later proceedings involving this application or any application claiming priority based on this application. Inclusion of such potential claims should not be construed to mean that the actual claims do not cover the subject matter of the potential claims. Thus, a decision to not present these potential claims in later proceedings should not be construed as a donation of the subject matter to the public.


Without limitation, potential subject matter that may be claimed (prefaced with the letter “P” so as to avoid confusion with the actual claims presented below) includes:


P1. An article, the article being a selected one of a set of articles, each article of the set comprising a fabric and being associated with a unique identification code, the selected article having a pattern, distributed over at least 10% of an exposed surface of the selected article, the pattern encoding the identification code associated with the selected article, wherein the pattern is configured to be readable and decodable by a mobile computing device in a manner wherein the selected article is contextually recognizable.


P2. An article according to claim P1, wherein the identification code is represented in the pattern in a large format, wherein each item of information content in the identification code is represented in the pattern by a set of attributes, each attribute of the set of attributes having a minimum dimension of 1 mm.


P3. An article according to claim P2, wherein each attribute of the set of attributes has a minimum dimension of 2 mm.


P4. An article according to claim P2, wherein each attribute of the set of attributes has a minimum dimension of 3 mm.


P5. An article according to claim P1, wherein the pattern is distributed over at least 30% of the exposed surface of the selected article.


P6. An article according to claim P1, wherein the exposed surface of the selected article includes a front and a back of the selected article.


P7. An article according to claim P1, wherein the pattern includes an error correction code.


P8. An article according to claim P7, wherein the error correction code is a forward error correction code.


P9. An article according to claim P1, wherein the pattern includes a repetition of encoding of the identification code.


P10. An article according to claim P1, wherein the pattern encodes a minimum of 24 bits of information comprising the identification code.


P11. An article according to claim P1, wherein the pattern encodes a minimum of 32 bits of information comprising the identification code.


P12. An article according to claim P1, wherein the unique identification code, encoded by the pattern, was transmitted by a server system for use in manufacturing of the article.


P13. An article according to claim P12, wherein the unique identification code has been associated with an owner of the article by updating information in the server system in connection with a sale of the article to the owner.


P14. An article according to claim P1, wherein the pattern is not discernible to an ordinary unaided human observer.


P15. An article according to claim P1, wherein the pattern comprises a plurality of horizontal lines having varying thickness, spacing, and color, the plurality of horizontal lines extending over at least 80% of a first dimension of the exposed surface of the article of clothing.


P16. An article according to claim P15, wherein the article of clothing is from a group consisting of a shirt, jacket, sweater, and vest, and wherein the first dimension is parallel to a line drawn from shoulder-to-shoulder of the article of clothing.


P17. A server-based method for identifying a specific article of fabric in a social context, the method comprising computer processes including: receiving at a server system a request message, from a first instance of a fabric identification application executing on a mobile computing device of a regarding individual who has caused the mobile computing device to capture an image in which at least a part of the specific article appears, the request message containing identity data corresponding to a pattern on the article, the pattern encoding a unique identification code associated with the specific article and the pattern configured to render the article contextually recognizable; processing by the server system the identity data, in relation to a database system storing identification codes for a set of articles in relation to corresponding user information, to identify a specific user associated with the specific article of fabric; and sending by the server system a reply message to the application executing on the mobile computing device that, consistent with permissions of the specific user, includes user-defined content, such content defined by the specific user.


P18. A method according to claim P17, wherein the user-defined content includes personal information concerning the specific user.


P19. A method according to claim P18, further comprising: before sending the reply message by the server system: receiving by the server system, from the first instance of the fabric identification application executing on the mobile computing device of the regarding individual, first geolocation data defining a location of the computing device of the regarding individual; receiving, by the server system, from a second instance of the fabric identification application executing on a mobile computing device of the specific user, second geolocation data defining a location of the specific user's computing device; processing by the server system the first and second geolocation data to determine if the mobile computing device of the regarding individual is within a predetermined distance from the specific user's mobile computing device, and, if not within the predetermined distance, configuring the reply message to convey denial of permission to provide the personal information about the specific user.


P20. A method according to claim P17, further comprising sending, by the server system, after receiving the identity data, to the application executing on the mobile computing device, a confirmatory message including validity information associated with the identity data.


P21. A method according to claim P17, wherein the identity data has been derived by the application executing on the mobile computing device from a processed version of the image, the processed version being a result of processing of the image on the mobile computing device.


P22. A method according to claim P17, further comprising receiving by the server system the image captured by the mobile computing device and processing by the server system the image to derive the identity data.


P23. A method according to claim P17, further comprising configuring by the server system the reply message to the application to initiate a request to a third-party application executing on the mobile computing device using the identity data.


P24. A method according to claim P17, further comprising, if the permissions of the specific user prevent the personal information about the specific user from being included in the reply message, configuring, by the server system, the reply message to redirect the application executing on the mobile computing device of the regarding individual to cause other appropriate content to be displayed thereon.


P25. A method according to claim P17, wherein the user-defined content includes a plurality of content items, a specific one thereof being selected, for transmission to the mobile computing device of the regarding individual, according to a set of selection criteria specified by the specific user.


P26. A method according to claim P25, wherein the set of selection criteria includes an identity of the regarding individual.


P27. A method according to claim P25, wherein the set of selection criteria includes an item selected from the group consisting of time of day of receipt by the server system of the request message, date of such receipt, geolocation of the mobile computing device of the regarding individual, and combinations thereof.


P28. A method according to claim P17, wherein the reply message includes at least some third party-defined content.


P29. A method according to claim P28, wherein the at least some third party-defined content includes advertising.


P30. A method according to claim P17, wherein the fabric identification application is a portion of a social network application, wherein a first instance of the social network application is executing on the mobile computing device of the regarding individual.


P31. A method according to claim P19, wherein the fabric identification application is a portion of a social network application, wherein a first instance of the social network application is executing on the mobile computing device of the regarding individual and a second instance of the social network application is executing on the mobile computing device of the specific user.


P32. A method according to claim P17, wherein the user-defined content includes music selected according to preferences of the specific user.


P33. A server-based method for identifying a specific article of fabric in a social context, the method comprising computer processes including: receiving at a server system a request message, from a first instance of a fabric identification application executing on a mobile computing device of a regarding individual who has caused the mobile computing device to capture an image in which at least a part of the specific article appears, the request message containing identity data corresponding to a pattern on the article, the pattern encoding a unique identification code associated with the specific article and the pattern configured to render the article contextually recognizable; processing by the server system the identity data, in relation to a database system storing identification codes for a set of articles in relation to corresponding user information, to identify a specific user associated with the specific article of fabric; and sending by the server system a reply message to the application executing on the mobile computing device that, consistent with permissions of the specific user, includes third party-defined content.


P34. A method for alerting a regarding individual having a first mobile computing device that an encoded pattern is present in an article of clothing of a specific user having a second mobile computing device, the encoded pattern not discernible to an ordinary unaided human observer, the method comprising initiating wireless communication from the first mobile computing device to the second mobile computing device, the wireless communication including an alert viewable on a fabric identification application executing on the first mobile computing device that the encoded pattern is not discernible to the ordinary unaided human observer.


P35. An article according to claim P1, wherein the pattern includes at least one repeatable unit having, in a first direction, a first leading strip and a first set of associated data strips and, in a second direction, a second leading strip and a second set of associated data strips, the second direction distinct from the first direction, each data strip having a set of stripes shaped to convey data, each stripe defined by a first transition edge from a first color to a second color and a second transition edge from the second color to a third color, the first transition having a distance D1 from a leading edge of the data strip and the second transition having a distance D2 from the leading edge of the data strip, wherein D2>D1, and D1 and D2 collectively encode data.
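By way of a non-limiting illustration of the transition-distance scheme recited in potential claim P35, the pair (D1, D2) of edge distances within a data strip might encode a symbol as follows. The 2 mm unit dimension and the two-bit symbol table are illustrative assumptions, not limitations of the claim.

```python
# Illustrative sketch (assumed unit size and symbol table) of encoding data
# in the distances D1 and D2 of the two color-transition edges of a stripe.

UNIT_MM = 2  # assumed minimum attribute dimension (cf. potential claim P3)

def encode_symbol(value):
    """Encode a 2-bit value as transition distances (D1, D2), with D2 > D1.
    The low bit sets the first edge position; the high bit sets the gap
    between the first and second edges."""
    d1 = UNIT_MM * (1 + (value & 0b01))
    d2 = d1 + UNIT_MM * (1 + ((value >> 1) & 0b01))
    return d1, d2

def decode_symbol(d1, d2):
    """Recover the 2-bit value from measured transition distances."""
    low = d1 // UNIT_MM - 1
    high = (d2 - d1) // UNIT_MM - 1
    return (high << 1) | low
```

Measuring edge positions relative to a strip's leading edge, rather than absolute stripe widths, would make the code tolerant of perspective scaling in a captured image.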


P36. An article according to claim P35, wherein the first and second directions correspond to local directions associated with a warp and a weft of the fabric respectively, and wherein the warp and weft directions vary over at least a portion of the fabric.


P37. An article according to claim P36, wherein the first and second directions are orthogonal to one another.


P38. An article according to claim P37, wherein the first direction is vertical and the second direction is horizontal.


P39. An article according to claim P36, wherein the first set encodes data distinct from the data encoded by the second set.


P40. An article according to claim P36, wherein each of the first and second leading strips comprises stripes of the first and second color, the first leading strip having a stripe of the first color with a minimum width W1 and the second leading strip having a stripe of the first color with a minimum width W2, wherein W1 does not equal W2.


P41. An article according to claim P36, wherein the repeatable unit has no more than three data strips in the first set and no more than three data strips in the second set.


P42. An article according to claim P41, wherein the repeatable unit has a dimension of at least 75 mm in the first direction and a dimension of at least 75 mm in the second direction.


P43. An article according to claim P36, wherein the repeatable unit has a dimension of at least 75 mm in the first direction and a dimension of at least 75 mm in the second direction.


P44. An article according to claim P36, wherein the surface has a dimension and the pattern includes no more than five repeatable units along the dimension.


P45. An article according to claim P36, wherein the surface has a dimension and the pattern includes no more than ten repeatable units along the dimension.


P46. An article according to claim P36, wherein the repeatable unit has no more than five data strips in the first set and no more than five strips in the second set.


P47. An article according to claim P36, wherein the repeatable unit has no more than eight data strips in the first set and no more than eight strips in the second set.


P48. A method for providing an augmented or virtual reality experience, the method comprising: capturing an image including an article according to any of claims P1-P16 and P35-P47; decoding the pattern to recover the unique identification code; identifying a user associated with the article based on the unique identification code; selecting a graphical element based on the identity of the user; and producing a display including the selected graphical element on a computer display device.


P49. A method according to claim P48, wherein the display includes a portion of the captured image overlaid with the selected graphical element.


P50. A method according to claim P49, wherein the selected graphical element is overlaid at a location corresponding to a location of the pattern in the displayed portion of the captured image.


P51. A method according to claim P49, wherein the selected graphical element is sized according to a size of the pattern in the displayed portion of the captured image.


P52. A method according to claim P48, wherein the selected graphical element is an interactive graphical element.


P53. A method according to claim P48, wherein the selected graphical element is an animated graphical element.


P54. A method according to claim P48, wherein the selected graphical element includes an identifier associated with the user.


P55. A method of decoding an image of a pattern in fabric, the pattern encoding text, the image having been captured by a camera, the method carried out to capture the encoded text and employing computer processes comprising: selecting a first portion of the image for analysis; slicing the image into sub-images; processing each of the sub-images by: locating edge boundaries of the sub-image; mapping the edge boundaries into a set of symbols; and, for each decoded symbol set, determining whether it applies to a weft or to a warp.


P56. A method according to claim P55, wherein processing each of the sub-images further comprises validating each symbol set for closeness of fit and validating each symbol set against a set of design rules.


P57. A method according to claim P55, wherein processing each of the sub-images further comprises collecting votes for symbol candidates for each sub-image and determining a winning symbol candidate for each sub-image.
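The vote-collection step recited in potential claim P57 may be illustrated, under the assumption of a simple majority vote over the candidates decoded from overlapping sub-images; the function name and tie-breaking rule are illustrative assumptions.

```python
# Illustrative sketch (assumed majority-vote rule): selecting the winning
# symbol candidate for a sub-image from the votes collected for it.

from collections import Counter

def winning_symbol(candidates):
    """Return the most frequently decoded symbol candidate; ties are broken
    in favor of the candidate seen first."""
    return Counter(candidates).most_common(1)[0][0]

votes = ["A", "B", "A", "A", "C"]  # candidates decoded from one sub-image region
winner = winning_symbol(votes)
```

Voting of this kind makes the decode robust to individual sub-images that are blurred, occluded, or distorted by fabric drape.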


CONCLUSION

While the invention has been particularly shown and described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. While some of these embodiments have been described in the claims by process steps, an apparatus comprising a computer with associated display capable of executing the process steps in the claims below is also included in the present invention. Likewise, a computer program product including computer executable instructions for executing the process steps in the claims below and stored on a computer readable medium is included within the present invention.

Claims
  • 1. An article, the article being a selected one of a set of articles, each article of the set comprising a fabric and being associated with a unique identification code, the selected article having a representational aesthetic environment, distributed over at least 10% of an exposed surface of the fabric that is normally visible to an observer when the article is in normal use, the representational aesthetic environment encoding the identification code associated with the selected article, such that the presence of the encoded identification code is disguised or hidden by virtue of the representational aesthetic environment.
  • 2. An article according to claim 1, wherein the article is a wearable article.
  • 3. An article according to claim 1, wherein the identification code is a one-dimensional code.
  • 4. An article according to claim 1, wherein the identification code is a two-dimensional code.
  • 5. An article according to claim 1, wherein the identification code includes a plurality of elements, wherein the identification code is encoded based on at least one of the width, length, position, size, shape, or color of the elements.
  • 6. An article according to claim 5, wherein at least one of the elements is linear.
  • 7. An article according to claim 5, wherein at least one of the elements is non-linear.
  • 8. An article according to claim 5, wherein at least one of the elements is regularly shaped.
  • 9. An article according to claim 5, wherein at least one of the elements is irregularly shaped.
  • 10. An article according to claim 5, wherein at least one of the elements is curved or circular.
  • 11. An article according to claim 5, wherein the elements are arranged in a pattern that appears random.
  • 12. An article according to claim 5, wherein the elements are arranged in a pattern that appears regular.
  • 13. An article according to claim 5, wherein the representational aesthetic environment evokes a nature environment.
  • 14. An article according to claim 1, wherein the representational aesthetic environment comprises a coded pattern woven into the fabric and a graphic applied over at least a portion of the coded pattern so as to obfuscate the coded pattern while still allowing the coded fabric to be read and decoded by a reader through the graphic.
  • 15. An article according to claim 14, wherein the graphic is printed onto the fabric.
  • 16. An article according to claim 1, wherein the fabric includes a set of fiber transmitters embedded therein, the transmitters operating in a set of wavelengths selected from the group consisting of visible, invisible, and combinations thereof, the transmitters configured to transmit information that can be detected by a mobile computing device, directed to the wearable article, and executing a suitable application.
  • 17. An article according to claim 1, wherein the fabric includes a set of fibers that can change visual properties, and wherein the visual properties of such fibers are configurable to encode at least part of the identification code associated with the selected article.
  • 18. An article according to claim 17, wherein each article of the set includes the same configuration of fibers, and wherein the visual properties of such fibers are used to encode different identification codes for different articles.
  • 19. An article according to claim 17, wherein the visual properties of such fibers are configurable to dynamically change an encoded pattern associated with the selected article to allow for representation of different identification codes for the selected article.
  • 20. An article according to claim 1, wherein the fabric includes a set of fiber receivers embedded therein.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This patent application is a continuation-in-part of, and therefore claims priority from, international patent application no. PCT/US2019/026549 entitled Uniquely Identifiable Articles of Fabric Configured for Data Communication having an International Filing Date of Apr. 9, 2019, which claims priority from U.S. Provisional Patent Application No. 62/682,975 entitled Uniquely Identifiable Articles of Fabric Configured for Data Communication filed on Jun. 10, 2018, U.S. Provisional Patent Application No. 62/743,913 entitled Uniquely Identifiable Articles of Fabric Configured for Data Communication filed on Oct. 10, 2018, and U.S. Provisional Patent Application No. 62/781,437 entitled Uniquely Identifiable Articles of Fabric Configured for Data Communication filed Dec. 18, 2018. Each of these patent applications is hereby incorporated herein by reference in its entirety. This patent application is also a continuation-in-part of, and therefore claims priority from, U.S. patent application Ser. No. 16/457,075 filed Jun. 28, 2019, which is a continuation of PCT Patent Application No. PCT/US2018/012193 having an international filing date of Jan. 3, 2018, which claims priority from U.S. Provisional Patent Application No. 62/442,283 filed Jan. 4, 2017 and also claims priority from U.S. Provisional Patent Application No. 62/521,150 filed Jun. 16, 2017. Each of these patent applications is hereby incorporated herein by reference in its entirety. Also, the contents of PCT Patent Application No. PCT/US2018/012193 (referred to herein as the “Prior Application”) are physically incorporated into this patent application (Detailed Description paragraphs 123-239 of the present patent application correspond to paragraphs 95-211 of PCT Patent Application No. PCT/US2018/012193).

STATEMENT AS TO FEDERALLY FUNDED RESEARCH

This invention was made with U.S. Government support under Agreement No. W15QKN-16-3-0001 awarded by the ACC-NJ. The Government has certain rights in the invention.

Provisional Applications (5)
Number Date Country
62682975 Jun 2018 US
62743913 Oct 2018 US
62781437 Dec 2018 US
62442283 Jan 2017 US
62521150 Jun 2017 US
Continuations (1)
Number Date Country
Parent PCT/US18/12193 Jan 2018 US
Child 16457075 US
Continuation in Parts (2)
Number Date Country
Parent PCT/US2019/026549 Apr 2019 US
Child 17109037 US
Parent 16457075 Jun 2019 US
Child PCT/US2019/026549 US