This disclosure generally includes methods and apparatus for creating 3D objects and digital 3D objects with viewpoint dependent optical illusions, and more particularly, methods and apparatus for producing lens-covered and barrier-based 3D objects with lenticular or changeable picture appearance.
Many technologies currently exist for producing flat or simply curved objects and displays with depth illusions (three-dimensional effects) or for revealing independent images from different viewpoints. Two broad approaches are currently used to produce parallax and multi-view displays, namely lens-based methods and barrier-based methods.
A lens-based method typically includes a lens array positioned on top of an image layer. When viewed from various viewpoints, the light refracted by the lenses allows individuals to perceive different portions of the interlaced image beneath the lenses. Two main categories of lens configurations, based on lens geometry and distribution, exist: lenticular lenses and fly's eye lenses (also known as integral lenses). Lenticular lenses are typically long, narrow cylindrical lenses arranged in an array, while fly's eye lenses are typically spherical lenses arranged in an array. Both lenticular and integral imaging techniques may be used in the 3D display industry for parallax displays. Displays using cylindrical lens arrays (known as lenticular panoramagrams) display only horizontal parallax, while displays using spherical lens arrays (known as integral photographs or integrams) provide both horizontal and vertical directional information to create a full parallax image (Halle, M., "Autostereoscopic Displays and Computer Graphics," Computer Graphics, ACM SIGGRAPH, vol. 31, no. 2, 1997, pp. 58-62). In the display industry, lenticular imaging is more commonly used due to its higher spatial resolution, while the directional information provided by integral imaging is less important in typical use cases (Ibid). In the hardcopy printing industry, lenticular methods are more widely adopted than integral methods. A popular technique is lenticular printing, which typically involves combining pre-made flat lenticular sheets with 2D color patterns made up of multiple images. Fly's eye lens sheets (which can be used to create 3D integral images) are less common due to the spatial resolution they sacrifice for directional information. As used herein, the term "lenticular" is broadly defined and can refer to both cylindrical and spherical lens arrays.
The term “lenticular effect” is also used to describe any optical illusions created by individual lenses (lenticules) placed on top of a composite of two or more interlaced graphics. Some of the most popular lenticular effects include, for example, 3D, morph, animation, flip, and zoom.
A barrier-based method typically involves using an opaque layer or slits to block certain parts of an underlying image. In the 3D display industry, autostereoscopic effects can be achieved through the use of a parallax barrier. A parallax barrier may comprise a layer with tiny, precisely spaced slits that separate two sets of pixels. Each eye sees a different set of pixels, creating the illusion of depth through the effect of parallax. In the movie and printing industries, moving-picture or color-shifting effects can be achieved through barrier-grid animation. Barrier-grid animation is typically created by sliding a transparent overlay with stripes over an interlaced image. In the art and printing industries, the "Agamograph" is a form of kinetic art that uses optical illusion to create dynamic artwork that appears to change when viewed from different angles. Inspired by the barrier grid, Agamographs do not typically use lenses, but instead place different colors and images on surfaces facing different angles to produce radically different images from various viewpoints.
The methods and techniques described above are all used to create optical illusions on flat or simply curved surfaces, displays, or objects. These techniques are suitable for creating depth illusions or autostereoscopic displays on 2D surfaces. However, none of these existing techniques can be used to produce kinetic optical illusions on 3D objects or doubly curved surfaces in the context of creating multi-view changeable images.
Conventional 2D lenticular printing has limitations. For example, the color/pattern shifting effect and the quality of optical aberrations are often highly restricted by current manufacturability in optical lenses and by printer resolution. Mass-produced lenticular sheets are typically rigid and flat, with production standards such as fixed lens curvature, thickness and refractive index. A sheet may be bent to cover simply curved surfaces, but it cannot be mapped onto doubly curved ones. Moreover, the making of a lenticular sheet and the content images are separate processes handled by different people or companies. The designer/end user has to create the 2D patterns following the exact format or instructions that work with the lenticular sheets. The number of embedded images, image size and quality must all fit into presets or templates, and are limited by lens size and printer resolution. Additionally, image data can only be designed and produced in 2D, and the resulting lenticular effect must be viewed along a linear viewing path; otherwise the end user may see a broken or discontinuous lenticular effect.
Conventional barrier-based methods (such as parallax barriers and barrier-grid animation) have the limitation of halving the horizontal pixel count viewable by each eye, which reduces the overall horizontal resolution of the image. Agamographs have a limited number of image frames, typically two interlaced images inserted into two different viewpoints, due to the accordion-shaped image layer.
Both lens-based and barrier-based methods for creating multi-view optical illusions have yet to be implemented on 3D objects due to technical barriers and the lack of an efficient workflow. One example of a technical barrier relates to manufacturability. In lenticular printing, no suitable fabrication or manufacturing technology could produce highly transparent geometry and high-resolution color patterns on doubly curved surfaces until the recent development of multi-material 3D printing (such as PolyJet technology, which can print voxel-level clear materials as well as CMYK materials). Similarly, methods like Agamographs that rely on cutting or folding patterned flat paper to create multiple surfaces facing different angles are difficult to apply to complex 3D geometries. Applying images or colors to fully cover the exposed surfaces of complex 3D geometries with convex or concave patterns can also be challenging, and existing techniques such as spraying or printing may be difficult to use in this context. Additionally, there was no workflow or method to compute and digitally model 3D objects or surfaces with multi-view optical illusions. For lens-based methods, no prior workflow exists to generate lenticular or fly's eye lenses on a 3D surface to achieve desired effects; it was therefore not possible to map lenses and apply color-shifting optical effects onto free-form 3D objects, whether physical objects or digital CAD models. For barrier-based methods, there is currently no workflow or software specifically designed to assign different colors or interlace images onto specific areas of a complex 3D geometry in a way that allows people to view different images from different perspectives.
A strong need exists for a 3D object or surface with changeable multi-view displays. A strong need also exists for a workflow or method for creating a 3D object or surface with desired optical illusion effects.
The disclosure includes methods and systems for applying optical illusion effects (e.g. lenticular effects, reflective and light-distorting appearance) to any surface of any object. By manipulating the optical properties of an object, the disclosure allows for the creation of a range of visual effects, such as changes in color, depth, or perspective. In various embodiments, the disclosure is particularly useful for creating 3D objects with complex, doubly curved surfaces. The ability to apply the lenticular effect on various forms opens up a wide range of possibilities for designers and creators.
In various embodiments, the disclosure allows the whole design of the front layer and backing layer to be handled simultaneously in the same system or by the same company or person, which provides greater control and flexibility for designers and ensures that the final product is consistent and meets their specific requirements. In various embodiments, the disclosure also provides designers with more control over the lens geometry, size, and fabrication method, enabling designers to experiment with different shapes and depths of the lens, surface forms, and maximum viewing angles of the image frames. The disclosure also allows for more freedom in locating and designing each patterned region or content under the lens layer, as well as in controlling the transparency and hardness of the image data. Additionally, the ability to design three-dimensional image data gives designers even more freedom in their creative process.
In various embodiments, the disclosure pertains to defining structures and properties to create 3D objects with multi-view displays, as well as to methods for designing and producing such objects. Examples of multi-view 3D objects include lens-based and barrier-based displays that create optical illusions.
In various embodiments, a method is provided for determining lens geometry and image data using ray tracing. The method includes steps for determining lens locations, which may be based on various distribution methods such as parallel, concentric, or following surface curvatures (such as using UV curves or mesh vertices). The method also includes steps for determining lens geometry and applying patterns to the area under each lens using ray tracing from multiple viewpoints.
In various embodiments, the disclosure includes a ray tracing method to determine the 3D element geometry and surface pattern of a barrier-based 3D object. The method includes determining 3D element locations. This step may be based on different 3D element distribution methods, such as parallel distributions, concentric distributions, or distributions that follow surface curvatures (such as UV curves or mesh vertices). The method also includes determining 3D element geometry, and determining and applying patterns to the area under each 3D element using ray tracing from different viewpoints.
In various embodiments, the disclosure includes a method for creating viewpoint-dependent objects that reveal different contents based on a variety of pre-defined patterns at different viewing angles. For lens-based 3D objects, in various embodiments, the method includes steps for determining a set of 2D images to be revealed at different viewpoints, placing virtual cameras to represent these viewpoints, and using ray tracing to determine the sizes, shapes, and locations of focal windows under each lens at each viewpoint. In various embodiments, the method includes a step for patterning the defined focal windows with images derived from the pre-selected 2D images. For barrier-based 3D objects, the method includes projecting an image onto the surfaces of 3D elements from a specific viewpoint.
In various embodiments, the methods and systems are provided for customizing the geometry and properties of an object to create unique and visually striking designs using a wider range of materials, including soft and rigid materials. In various embodiments, the disclosure may be used in conjunction with 3D printing to provide greater flexibility and customization in the design process. In other embodiments, the disclosure may be used in conjunction with precision glass moulding and CNC techniques to enhance the accuracy and quality of the final design. In other embodiments, the disclosure may be used for printing directly on fabrics.
In various embodiments, a method is provided for designing fibers on a fabric with an optical illusion display. The method includes steps for determining the locations, distributions, geometries, orientations, and patterns of the 3D fibers. The method also includes a step for generating the 3D fibers. In various embodiments, the fibers may be designed to follow the curvature of a specific shape, such as a human body shape, using techniques such as UV mapping and unwrapping.
In various embodiments, the disclosure provides a lenticular candy design. The candy may include a front layer made from a transparent material (e.g., sugar glass) that covers a backing layer. The front layer comprises an array of elongated or integral lenses with different heights, curvatures and shapes that provide different refractive behaviors. Different color pixels/patterns/strips embedded in the backing layer are revealed at different viewpoints.
The disclosure includes methods for designing and producing the structures and properties of 3D objects with multi-view displays. The 3D objects may include one or more of viewpoint dependent displays, multi-view displays, optical illusions, kinetic optical displays, integral displays and/or lenticular displays. The disclosure may also include creating kinetic optical 3D objects. Examples of multi-view 3D objects include lens-based and barrier-based displays that create optical illusions. The disclosure also includes methods for designing and distributing elements that contribute to the optical effects on a 3D geometry, and to methods for producing physical 3D objects or textiles with optical illusion displays. Additionally, the disclosure pertains to the creation of viewpoint-dependent 3D objects that display a specific image at desired viewpoints. As used herein, “object” includes one or more of any item, sculpture, vase, food, candy, lollipop, textile or digital object of any shape or size.
With respect to the types of lens-based 3D objects for multi-view displays, as set forth in
As set forth in more detail in
As set forth in
As set forth in
In various embodiments, a lenticular textile 402 is created by applying a plurality of lenses to the surface of a fabric 408. The disclosure allows designers to program lens density into the fabric, which affects the drape of the fabric. The size and arrangement of the lenses on the fabric affect the flexibility and visual appearance of the fabric. For instance, areas with lower spatial density and larger, coarser lenses are more rigid and have a calmer visual appearance, while areas with higher spatial density and smaller, finer lenses are more flexible and have a more dynamic visual appearance. Patterns and image data may be embedded either in the fabric or in the front lens layer. Section view 404 of the textile 402 shows that patterns 412 may be embedded in the fabric 408. In various embodiments, the patterns 412 may be sewn or printed onto the fabric 408. The front layer 406 on top of the fabric 408 may include a plurality of individual lenses with varying sizes, heights, stiffness and spatial density. Section view 414 of the textile 402 shows that patterns 416 are embedded in the lenses 420; a plurality of lenses embedded with patterns constitutes a front layer 418. In various embodiments, the patterns 416 and the transparent parts of the lenses 420 may be produced in a single material layer using techniques like multi-material 3D printing.
In various embodiments, the textile 402 may be further made into clothing, garments, accessories, etc., to display a unique visual dynamism when the fabric drapes.
With respect to structures and rules of lens-based 3D objects for multi-view displays, as set forth in
Each standard spherical lenticule 502 may comprise a thin spherical section and a solid cuboid;
Each standard cylindrical lenticule 504 may comprise a thin cylindrical section and a solid cuboid. The path of a ray 508 bends when it travels from one transparent substance (e.g. air) into another (e.g. resin, glass). Light traveling through optics follows Snell's law, which states that the ratio of the sines of the angle of incidence (θ1) and the angle of refraction (θ2) is equal to the ratio of the refractive indices (n2/n1) of the two media, where "n2" is the refractive index of the lens and "n1" is the refractive index of the air.
Snell's law of refraction may be applied to understand how light travels through a lens and further help to define rules for lens design. In
A lens with a different geometry may have a focal window 604 in various shapes and sizes. For example, a standard spherical lenticule may have a nearly circular focal window 606, while a standard cylindrical lenticule may have a stripe-shaped focal window 608.
These rules may be useful in the design of a viewpoint-dependent object. For example, designers may wish to increase the number of patterned regions available under each lens by reducing the width of each focal window 604, thereby allowing for the inclusion of more image frames to be revealed at different viewpoints. By utilizing Snell's law of refraction, it is possible to calculate the size and shape of each focal window 604 corresponding to each viewpoint and visualize the patterned region under the lens at that viewpoint.
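For illustration, the Snell's-law relationship above can be sketched in a few lines of code. This is a simplified single-refraction model; the function names, default refractive indices, and lens thickness are assumptions for illustration, not values from the disclosure:

```python
import math

def snell_refraction_angle(theta1_deg, n1=1.0, n2=1.5):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2).
    Returns the refraction angle theta2 in degrees."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(s))

def focal_window_offset(theta1_deg, t, n1=1.0, n2=1.5):
    """Lateral offset of a focal window's centre on a flat image plane a
    thickness t below the lens surface, for a ray arriving at incidence
    angle theta1 (a single flat refraction is assumed for simplicity)."""
    theta2_deg = snell_refraction_angle(theta1_deg, n1, n2)
    return t * math.tan(math.radians(theta2_deg))
```

For example, a ray entering at 30° into a material with n2 = 1.5 refracts to about 19.5°, shifting the focal window by roughly 0.35·t; narrower focal windows leave room for more patterned regions, and thus more image frames, under each lens.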
In various embodiments, designers may wish to design lenticule geometry that enables the maximum viewing angle. In order to achieve this, the lenticule geometry is designed such that the full range of the vignetting angle (γ) displays the entire image sequence (full aperture). In various embodiments, geometrically, this means the extreme ray 1002 entering from one edge of the lenticule is refracted to reach the opposite edge of the image strip, as shown in
Where p ≤ 2r,
nair is the refractive index of air;
nlens material is the refractive index of the lens material;
R is the refraction angle of the extreme ray; and
γ is the vignetting angle.
Using the above formulas, the lenticule thickness (t) can then be derived based on a given value of r and p. In various embodiments, the designed lenticular effects can be validated and applied to the final product by adjusting the parameters of the control variables.
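Because the formulas themselves appear in the referenced figures, the derivation of t from r and p can only be sketched here under a stated assumption: a common paraxial relation for flat-backed plano-convex lenticules places the focal plane on the back face at t = n·r/(n − 1). The helper names and default values below are illustrative, not from the disclosure:

```python
import math

def lenticule_thickness(r, n_lens, n_air=1.0):
    """Paraxial thickness t that places the focal plane on the flat back
    face of a plano-convex lenticule of radius r: t = n*r/(n - n_air).
    (An assumed closed form standing in for the figure formulas.)"""
    return n_lens * r / (n_lens - n_air)

def aperture_half_angle(p, r):
    """Half-angle subtended by a lenticule of pitch p at its centre of
    curvature; valid only when p <= 2r (the full-aperture condition)."""
    if p > 2 * r:
        raise ValueError("pitch p must not exceed the lens diameter 2r")
    return math.degrees(math.asin(p / (2 * r)))
```

For r = 1 mm and n = 1.5 this gives t = 3 mm, and a pitch equal to the radius gives a 30° aperture half-angle.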
With respect to the types of barrier-based 3D objects for multi-view displays, in various embodiments, kinetic optical illusions may be created without the use of lenses. Barrier-based 3D objects rely on images on different orientation surfaces and the movement of the base material to create color shifting effects, rather than using lens-based methods.
As depicted in
The proposed system may also enable a plurality of 3D stripes to be wrapped around a three-dimensional geometry. Each 3D stripe may be a long, narrow element with different colors or patterns assigned to each of its faces.
As depicted in
As set forth in
With respect to structures and rules of barrier-based 3D objects for multi-view displays, in various embodiments,
3D elements on a 3D object may take various patterns and shapes.
By following the basic rules outlined in the system, a wide range of optical illusions and visually dynamic 3D objects can be generated, such as objects that reveal an image only at certain viewpoints and produce color-shifting effects when the viewpoint changes. The system also allows for the creation of visually dynamic 3D objects by combining a plurality of 3D elements in various shapes, sizes, colors, gradients, patterns, and flexibilities.
With respect to a computer system to create a digital model of a 3D object for multi-view displays, the design of a 3D object for multi-view displays may involve using a computer system.
The design of a 3D object with optical illusion may include certain user inputs 2004 to be provided to the design program 2006. These user inputs 2004 may include a set of parameters or attributes, or files, that help the system define the geometry, surface patterns, material properties, etc. of the object based on the rules and examples discussed above, such as in
The graphic processing engine 2008 receives data from the design software 2006 and combines the data into a stream. This data stream is then transmitted to a processing unit, such as a GPU, which processes the data and sends the results to a display 2010. The display 2010 is configured to display a graphical user interface (GUI) 2012, which serves as an interface between a user and the operating system or applications running behind it. The GUI 2012 may comprise various graphical elements such as cursors, buttons, menus, windows, and design spaces/views. These graphical elements may also include a visualization of the current design choices created by specific actions taken by the user. The user may view the design from different angles using a 3D view in the design space. Once the design is finalized by the user, the design program 2006 may generate an output file 2014 that can be viewed, printed, rendered, or processed by other software.
In various embodiments, the design program 2006 may be capable of volumetric modeling and is optimized for voxel printing. Unlike traditional CAD tools, which can only create a hollow shell of geometry, the design program 2006 with volumetric modeling capability allows the user to create the interior of a geometry and specify the properties of each individual voxel (3D pixel) throughout the entire volume of the model. For example, the user may define a color and material property for each voxel. The models generated by the proposed system may be full of material information that can be transmitted to a 3D printer for printing.
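The voxel-level control described above can be illustrated with a minimal data structure. Real voxel-printing pipelines use vendor-specific slice formats, so the names and layout below are purely hypothetical:

```python
# Minimal voxel-model sketch: every voxel in the volume carries its own
# colour and material, rather than the model being a hollow surface shell.

def make_voxel_model(nx, ny, nz, fill=("white", "rigid")):
    """Create a dense voxel volume with a default (colour, material)."""
    return {(x, y, z): fill
            for x in range(nx) for y in range(ny) for z in range(nz)}

def set_voxel(model, xyz, color, material):
    """Assign a colour and a material property to one voxel."""
    model[xyz] = (color, material)

model = make_voxel_model(4, 4, 4)
set_voxel(model, (0, 0, 0), "cyan", "clear-resin")
```

A model built this way is "full of material information" in the sense above: every interior position, not just the shell, has printable attributes.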
With respect to designing and creating lens-based 3D objects for multi-view displays,
The disclosure includes the mapping of two broad types of lenses onto 3D surfaces. The first type of lens is an elongated lens, such as a lenticular lens, which is typically long and narrow and arranged in an array. The second type of lens is an integral lens, such as a fly's eye lens, which is typically a dotted lens dispersed over a surface. The disclosure further includes the design of lens-based 3D objects using different lens distribution methods. The workflow described below may utilize any of the 3D element geometries introduced in previous sections, including but not limited to the 3D element geometries depicted in
With respect to the design with elongated lenses, in various embodiments, the system may create lenticular 3D objects with long and narrow lens arrays or with lenses in line distributions. In various embodiments, a 3D lenticular object may have lenses arranged parallel to each other.
In various embodiments, the original geometry and offset geometry may be segmented using a set of parallel cutting planes, such as cutting planes 2302 and 2308. The gap between these cutting planes may be constant or varied as desired.
In various embodiments, the front layer 2206 may include n slices, and the backing layer 2306 may include n*m slices in order to create an optimal optical effect for viewing multiple underlying images (where m represents the number of embedded images).
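The n·m slicing of the backing layer amounts to a simple interlacing rule. The sketch below assumes each embedded image is supplied as a list of n columns (an illustrative data layout, not the disclosure's file format):

```python
def interlace_slices(images, n):
    """Order backing-layer slices for n lenticules and m = len(images)
    embedded images: lenticule i sits over one strip from each image, so
    slice k (of n*m total) takes column i = k // m of image j = k % m."""
    m = len(images)
    return [images[k % m][k // m] for k in range(n * m)]
```

Two 2-column images A and B interlace as A0, B0, A1, B1, so each lenticule covers exactly one strip from every embedded image.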
In various embodiments, generating elongated structures 2314 on each offset layer slice 2312 may be performed using tools and features such as "surface loft".
In various embodiments, a 3D lenticular object may have lenses arranged in a spiral or concentric configuration.
In various embodiments, a 3D lenticular object may have a backing layer split with intersecting cutting planes, such as cutting planes distributed in circular or curve-driven patterns instead of linear patterns.
In various embodiments, the intersecting cutting planes may intersect in one single line or multiple lines. The gap between these cutting planes may be constant or varied as desired.
In various embodiments, the front layer 2714 may include n slices, and the backing layer 2706 may include n*m slices in order to create an optimal optical effect for viewing multiple underlying images (where m represents the number of embedded images).
In various embodiments, generating elongated structures 2716 on each offset layer slice 2712 may be performed using tools and features such as "2-rail sweeping loft".
A digital model of an even more complex lenticular object, as shown in
In various embodiments, design workflow 2730 may apply to a sweep object (object created by taking a closed section profile and moving it along a defined path curve). As shown in
In various embodiments, design workflow 2730 may be applied on a spread surface rather than around a hollow geometry, as depicted in
In various embodiments, a 3D lenticular object may have lenses and textures that are mapped to follow surface curvatures, such as UV curves. The design process 3332 for an exemplary 3D lenticular object 3318 with UV mapped lenses is shown in
In various embodiments,
With respect to design with integral lenses, in various embodiments, the system may include creating designs of lens-covered 3D objects with lenses arranged in a grid, lens-covered 3D objects with dotted lenses dispersed over a surface, lens-covered 3D objects with standalone lenses, lens-covered 3D objects with scattered or clustered lenses, and/or lenticular 3D objects with lenses in dot distributions. In various embodiments, a 3D object may have standalone lenses dispersed over the surface.
With respect to designing a viewpoint-dependent lens-covered object, in various embodiments, designers may wish to precisely define a plurality of images to be revealed at desired viewpoints. This can be achieved by applying basic rules of ray tracing and light refraction. As set forth in
In various embodiments, it is desired to specify the display of a set of images at specific viewpoints. This can be accomplished through the application of fundamental principles of ray tracing and light refraction. As illustrated in
In various embodiments, the overall workflow for creating a lens object with viewpoint-dependent display may involve one or more of the steps outlined in
In various embodiments, an image region from the 2D image may be further divided into a number of smaller image sectors. At each viewpoint, each lens may display an image sector rather than an image region. In this case, the resolution of the lens, or the lenticular display resolution, does not need to be the same as the resolution of the segmented image.
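When the lenticular display resolution differs from the segmented-image resolution, each lens can be mapped to an image sector by proportional position. The following is one simple resampling rule, offered as an illustration rather than as the disclosure's method:

```python
def sector_for_lens(lens_index, n_lenses, n_sectors):
    """Map lens i (0-based) of an n_lenses display to one of n_sectors
    image sectors: choose the sector at the proportionally matching
    position, so lens and image resolutions need not be equal."""
    return (lens_index * n_sectors) // n_lenses
```

With 10 lenses over 5 sectors, for instance, lenses 0-1 show sector 0 and lenses 8-9 show sector 4.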
With respect to designing and creating barrier-based 3D objects for multi-view displays, like lens-based objects, there are also two broad types of 3D elements on barrier-based 3D objects for multi-view displays. The first type is an elongated 3D strip, which is typically long and narrow and arranged in an array. The second type is a standalone 3D element dispersed over a surface. The disclosure further includes the design of barrier-based 3D objects using different 3D element distribution methods. The workflow described below may utilize any of the 3D element geometries introduced in previous sections, including but not limited to the 3D element geometries depicted in
With respect to designing with 3D elongated stripes, in various embodiments, the process of creating a barrier-based object with 3D elongated stripes may involve one or more steps as outlined in
As set forth in
With respect to designing with standalone 3D elements, in various embodiments, a barrier-based 3D object may have standalone 3D elements dispersed over the surface. The process of creating an exemplary barrier-based object 4408 with standalone 3D elements may involve one or more steps as outlined in
The design process 4430 may begin by creating or receiving a digital model of a geometry 4402 (block 4432). The process may proceed to block 4434, which involves converting the geometry into a polygon mesh 4406 with a plurality of vertices, edges and faces, and determining the locations of a plurality of 3D elements 4410 using the vertices of the mesh 4406. In various embodiments, 3D element locations may be determined using intersecting points of UV lines on the geometry 4404. The system may then determine a 3D element's 4410 geometry with a set of parameters (block 4436) and generate the 3D elements 4410 on the geometry 4402 using the defined 3D element centers and geometry (block 4438). The system may then apply patterns to the surfaces of each 3D element (block 4440).
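The vertex-based placement step can be sketched as follows; the element record is a hypothetical structure (vertex tuples in, centre/size records out), not the disclosure's internal representation:

```python
def place_elements(vertices, base_size=1.0, spacing=1):
    """Determine 3D-element centres from polygon-mesh vertices: every
    `spacing`-th vertex becomes an element centre, each carrying a set
    of geometry parameters (here just a uniform size)."""
    return [{"center": v, "size": base_size}
            for v in vertices[::spacing]]

mesh_vertices = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
elements = place_elements(mesh_vertices, base_size=0.5, spacing=2)
```

The same pattern applies when the centres come from intersecting UV lines instead of mesh vertices: only the point list changes.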
With respect to designing viewpoint-dependent barrier-based 3D objects, in various embodiments, designers may wish to define a plurality of images to be revealed at desired viewpoints. This can be achieved in multiple ways by applying basic rules of ray tracing.
In one embodiment, if the structure and distribution of the 3D elements are already defined, view-dependent display may be achieved by assigning colors/patterns to the 3D elements. Using an exemplary 3D object in
In various embodiments, the structure and distribution of the 3D elements may be undefined. In this case, view-dependent display may be achieved by generating the geometry and orientation of the 3D elements in a manner that optimizes the display, and subsequently assigning colors or patterns to the 3D elements. A system is disclosed for precisely controlling the orientation of each face of each 3D element in order to achieve a viewpoint-dependent appearance of an object. The process may include placing a virtual camera to represent a viewpoint; determining the quantity and locations of the 3D elements, and the number of faces in each 3D element, using a set of parameters; and generating a face of each 3D element such that all generated faces share the same orientation. The 3D element locations and distribution may be determined using UV mapping or mesh vertices. These steps may be repeated to generate all faces of each 3D element, and colors or patterns may be assigned to the faces sharing the same orientation.
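The orientation-grouping step can be sketched with two small helpers. The face-id, normal-vector, and palette representations are assumptions for illustration only:

```python
def visible_faces(face_normals, view_dir):
    """Return ids of faces whose normals point toward the viewer
    (positive dot product with the view direction): these are the faces
    that receive the projected image for that viewpoint."""
    return [fid for fid, n in face_normals.items()
            if sum(a * b for a, b in zip(n, view_dir)) > 0]

def assign_by_orientation(face_orientation, palette):
    """Give every face sharing an orientation label the same colour, so
    that all faces facing one way jointly display one image/colour."""
    return {fid: palette[o] for fid, o in face_orientation.items()}
```

For example, two faces labeled "left" and one labeled "right" receive two colors in total, so a viewer positioned to the left sees only the "left" image.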
With respect to manufacturing 3D objects for multi-view displays, in various embodiments, after completing the design of a multi-view 3D object using the aforementioned methods in a digital modeling program, a series of fabrication files can be exported for manufacturing purposes.
The production of lens-based or barrier-based objects often involves the creation of doubly curved transparent and colored material layers, which may be securely attached to one another with matched curvatures in certain embodiments. This process can be difficult to achieve using traditional manufacturing methods, and costly to validate and iterate. To address this, various embodiments of the disclosure provide for the use of 3D printing, particularly multi-material 3D printing, to produce lenticular objects and to prototype designs, so that the lenticular effects can be tested and verified before resorting to other manufacturing methods for mass production.
In various embodiments, the disclosure may be used in conjunction with 3D printing to provide greater flexibility and customization in the design process. In other embodiments, the disclosure may be used in conjunction with precision glass moulding and CNC techniques to enhance the accuracy and quality of the final design.
In various embodiments, the front layer (comprising either lenses or barrier-based 3D elements) and the backing layer (having a geometric shape) may be produced separately using a combination of 3D printing, CNC, molding, and other manufacturing techniques. For example, the front layer of a lens-based object may be produced using CNC in glass or acrylic, while the backing layer may be 3D printed. In various embodiments, the front lens layer may be produced through 3D printing with a transparent material such as resin, and then combined with a backing layer produced through other manufacturing methods. The different layers of the object may then be combined to form a full 3D object with a multi-view display.
In various embodiments, multi-view display 3D objects may be integrated with moving mechanisms, such as robotic surfaces, so that the dynamic appearance of the surface can be controlled by the machine without requiring the user to change viewpoints manually.
With respect to creating a textile with optical illusion display, in various embodiments, the design system discussed in previous sections may be utilized in textile design to create flexible materials with optical illusions.
In various embodiments, a textile may be designed and fabricated in 3D format using the design system described above. This can be achieved through the use of robotic arms to map the 3D structure onto a 3D surface for wearable design. In various embodiments, the textile may be produced in 3D segmented pieces that are sewn together to form a full garment when worn on the body.
In various embodiments, the exported design file includes 3D fibers arranged on a flat surface and can be produced using flat manufacturing techniques that are widely available. When the resulting fabric is draped or bent into a 3D form, it may reveal dynamic colors due to the distribution of transparent 3D fibers with varying sizes, heights, transparency, stiffness, and spatial density, as illustrated in
In various embodiments, users may desire for the design to conform to a specific curvature or shape.
There are several methods for producing a textile with an optical illusion display, including but not limited to the following. In various embodiments, 3D fibers (either lens-based or barrier-based) may be 3D printed directly onto a fabric that is lying flat on a printer bed. This may involve the user securing the fabric onto the printer bed before starting the printing process. Some printer materials are designed to adhere to fabric without the need for additional adhesives; the user may calibrate the printer bed in advance for this to be successful. This technique can be applied to a wide range of fabrics, including both synthetic and natural fibers. In other embodiments, the 3D fibers may be produced separately using techniques such as CNC machining or molding, and then attached to the fabric through sewing or adhesion. For barrier-based fibers, the fibers may be produced first and then colored, or produced in color, before being attached to the fabric through sewing or adhesion.
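The flat fiber layout described above, with per-fiber size, height, transparency, and stiffness and with spatially varying density, could be generated computationally before fabrication. A minimal sketch follows; all parameter names and numeric ranges are illustrative assumptions, not values specified by the disclosure:

```python
import random

random.seed(7)  # fixed seed so the illustrative layout is reproducible

def sample_fibers(width_mm, length_mm, base_density=0.5):
    """Sample a flat layout of transparent 3D fibers on a 1 mm grid.

    Returns a list of fiber records with varying size, height,
    transparency, and stiffness. All ranges are assumptions.
    """
    fibers = []
    for y in range(length_mm):
        for x in range(width_mm):
            # Spatial density varies across the sheet, so draping or
            # bending the fabric reveals different color mixes per region.
            density = base_density * (0.5 + x / (2 * width_mm))
            if random.random() < density:
                fibers.append({
                    "x": x, "y": y,
                    "radius_mm": random.uniform(0.2, 0.6),
                    "height_mm": random.uniform(0.5, 2.0),
                    "transparency": random.uniform(0.6, 1.0),
                    "stiffness": random.choice(["soft", "medium", "rigid"]),
                })
    return fibers

layout = sample_fibers(40, 40)
```

A layout like this could then be exported for flat manufacturing, with each fiber record driving one printed or machined lens or barrier element.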
With respect to commercial applications, the disclosure includes a solution for designers, engineers, and brands in the fashion, arts, interior décor, automotive design, aerospace design and consumer products industries to create materials with embedded interactions (e.g. view-based interaction and touch-sensitive interaction). This solution is particularly useful for users in the fields of computer-aided design (CAD), 3D printing, and digital rendering, as it allows them to easily translate digital designs into physical reality without being constrained by manufacturing limitations. With this disclosure, designers are able to create a wide range of digital effects in physical form.
In the fashion, art, and design industries, the computational design workflow introduced in this disclosure may be used to mimic the dynamism and colors of nature. The disclosure allows for the creation of timeless pieces that engage with the environment and promote the longevity of our surroundings and the integrity of humanity, rather than being focused on short-term trends or single-use items. The disclosure also allows for the manipulation of light refraction on textiles in a way that is similar to the way light is refracted on animal skins. The disclosure uses an algorithmic design flow, or a sequence of simple elements, to create unique patterns that cannot be replicated by hand or with traditional technology. The resulting textiles would be able to change appearance based on the angle and intensity of the light source, creating a dynamic and visually interesting effect. The disclosure demonstrates the potential for future customization and physical materiality in these industries.
Authentication is often used for many products, particularly luxury goods. This is typically achieved through the use of a signature mark, pattern, or label on the item that is difficult to copy perfectly. However, some authentication marks can be easily replicated if they are visible or captured in a photograph and then reproduced using standard manufacturing techniques. The disclosure introduces a new method for authenticating goods using a lenticular design that is impossible to duplicate or reproduce without the use of a specific computational model and manufacturing technique. This authentication pattern may be applied to a variety of items, including wine bottles, bags, tags, and other goods, and is designed to be revealed at a precise angle. The disclosure allows for the creation of a secure and unique authentication system that can be used to protect against counterfeiting and ensure the authenticity of luxury products.
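The "revealed at a precise angle" behavior can be estimated at design time. The sketch below uses a simplified paraxial Snell's-law model relating the lateral offset of an interlaced strip under a lenticule to the external viewing angle at which it becomes visible; the lens thickness, strip offset, and refractive index are assumed example values, and a real design would ray-trace the actual lens profile:

```python
import math

def reveal_angle_deg(strip_offset_mm, lens_thickness_mm, n=1.5):
    """Estimate the external viewing angle at which an interlaced strip
    under a lenticule becomes visible.

    Simplified model: the ray from the strip to the lens apex travels
    inside the lens at angle atan(offset / thickness), then refracts at
    the surface per Snell's law (n * sin(theta_in) = sin(theta_out)).
    """
    theta_in = math.atan2(strip_offset_mm, lens_thickness_mm)
    s = n * math.sin(theta_in)
    if s >= 1.0:
        return None  # total internal reflection: strip is never revealed
    return math.degrees(math.asin(s))

# A strip offset 0.1 mm under a 0.5 mm thick lenticule (n = 1.5):
angle = reveal_angle_deg(0.1, 0.5)
```

Inverting this relation (choosing the strip offset that yields a target reveal angle) is one way the computational model could place a hidden authentication pattern so it appears only from a precise viewpoint.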
The disclosure may open a new era of color, material and/or finish design on consumer products. Instead of relying on digital screens, designers may use physical materials to create dynamic, moving pictures, interactive skins, and content without the need for electrical input.
In various embodiments, the disclosure involves the use of viewpoint-dependent appearance manipulation for medical or user behavior correction. The disclosure allows for customization of the appearance of an object's skin based on the user's height, viewing angle, and habits, by revealing information at desired positions or providing feedback as a therapeutic aid. This may be useful in many medical applications and physical rehabilitation.
One potential application of this disclosure is to train patients with back pain to adopt diaphragmatic breathing instead of chest breathing. A lightweight textile or wearable product could be designed to guide patients in the rhythm of diaphragmatic breathing using a visual guide that shows the movement of the abdomen at different angles. Previously, this type of training relied on sensors and accelerometers with embedded electronics; this disclosure allows for the elimination of electrical components and customization of the wearable to fit the specific body shape of each patient.
Another example of how to correct a person's sitting posture while writing is provided. As set forth in
In various embodiments, the disclosure includes a method of creating hidden information to be revealed only at specific angles. The hidden information may be used to protect sensitive information or to show the full nature of a product without any obstructions while maintaining essential context. As shown in
The disclosure may also put a playful spin on remarkably untouched industries, such as traditional food design, to provide textural experiences and color that can only be created with a digital skin. As shown in
The making of this kind of candy may comprise producing a backing layer in a variety of textures and shapes; making a mold wherein the cavity has the shape of the lenses; fixing the backing layer inside the mold; and pouring a transparent syrup into the mold and letting it caramelize.
The proposed disclosure may also be used to create touch-sensitive interactions without the need for an electrical input, such as a button that changes color when pressed.
The detailed description of various embodiments herein makes reference to the accompanying drawings and pictures, which show various embodiments by way of illustration. While these various embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, it should be understood that other embodiments may be realized and that logical and mechanical changes may be made without departing from the spirit and scope of the disclosure. Thus, the detailed description herein is presented for purposes of illustration only and not for purposes of limitation. For example, the steps recited in any of the method or process descriptions may be executed in any order and are not limited to the order presented. Moreover, any of the functions or steps may be outsourced to or performed by one or more third parties. Modifications, additions, or omissions may be made to the systems, apparatuses, and methods described herein without departing from the scope of the disclosure. For example, the components of the systems and apparatuses may be integrated or separated. Moreover, the operations of the systems and apparatuses disclosed herein may be performed by more, fewer, or other components and the methods described may include more, fewer, or other steps. Additionally, steps may be performed in any suitable order. As used in this document, “each” refers to each member of a set or each member of a subset of a set. Furthermore, any reference to singular includes plural embodiments, and any reference to more than one component may include a singular embodiment. Although specific advantages have been enumerated herein, various embodiments may include some, none, or all of the enumerated advantages.
Systems, methods, and computer program products are provided. In the detailed description herein, references to “various embodiments,” “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. After reading the description, it will be apparent to one skilled in the relevant art(s) how to implement the disclosure in alternative embodiments.
The system may allow users to access data, and receive updated data in real time from other users. The system may store the data (e.g., in a standardized format) in a plurality of storage devices, provide remote access over a network so that users may update the data in a non-standardized format (e.g., dependent on the hardware and software platform used by the user) in real time through a GUI, convert the updated data that was input (e.g., by a user) in a non-standardized form to the standardized format, automatically generate a message (e.g., containing the updated data) whenever the updated data is stored, and transmit the message to the users over a computer network in real time, so that the user has immediate access to the up-to-date data. The system allows remote users to share data in real time in a standardized format, regardless of the format (e.g., non-standardized) in which the information was input by the user. The system may also include a filtering tool that is remote from the end user and provides customizable filtering features to each end user. The filtering tool may provide customizable filtering by filtering access to the data. The filtering tool may identify data or accounts that communicate with the server and may associate a request for content with the individual account. The system may include a filter on a local computer and a filter on a server.
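As a minimal sketch of the standardization and messaging steps described above (the field names, alias map, and message schema are invented for illustration and are not part of the disclosure):

```python
import json

STANDARD_FIELDS = ("user", "field", "value")  # assumed standardized schema

def standardize(record):
    """Map a non-standardized record (whose keys vary by the client's
    hardware/software platform) onto the standardized format.

    The alias table is a hypothetical example.
    """
    aliases = {"usr": "user", "name": "user",
               "key": "field", "attr": "field",
               "val": "value", "data": "value"}
    renamed = {aliases.get(k, k): v for k, v in record.items()}
    return {k: renamed.get(k) for k in STANDARD_FIELDS}

def update_message(record):
    """Generate the message broadcast to users whenever updated data
    is stored, carrying the data in the standardized format."""
    return json.dumps({"event": "update", "payload": standardize(record)})

# A non-standardized update from one client platform:
msg = update_message({"usr": "alice", "attr": "height", "val": 172})
```

In a full system this message would be transmitted over the network to subscribed users, giving them immediate access to the up-to-date data regardless of the input format.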
As used herein, “satisfy,” “meet,” “match,” “associated with”, or similar phrases may include an identical match, a partial match, meeting certain criteria, matching a subset of data, a correlation, satisfying certain criteria, a correspondence, an association, an algorithmic relationship, and/or the like. Similarly, as used herein, “authenticate” or similar terms may include an exact authentication, a partial authentication, authenticating a subset of data, a correspondence, satisfying certain criteria, an association, an algorithmic relationship, and/or the like.
The term “non-transitory” is to be understood to remove only propagating transitory signals per se from the claim scope and does not relinquish rights to all standard computer-readable media that are not only propagating transitory signals per se. Stated another way, the meaning of the term “non-transitory computer-readable medium” and “non-transitory computer-readable storage medium” should be construed to exclude only those types of transitory computer-readable media which were found in In re Nuijten to fall outside the scope of patentable subject matter under 35 U.S.C. § 101.
Benefits, other advantages, and solutions to problems have been described herein with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any elements that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of the disclosure. The scope of the disclosure is accordingly limited by nothing other than the appended claims, in which reference to an element in the singular is not intended to mean “one and only one” unless explicitly so stated, but rather “one or more.” Moreover, where a phrase similar to ‘at least one of A, B, and C’ or ‘at least one of A, B, or C’ is used in the claims or specification, it is intended that the phrase be interpreted to mean that A alone may be present in an embodiment, B alone may be present in an embodiment, C alone may be present in an embodiment, or that any combination of the elements A, B and C may be present in a single embodiment; for example, A and B, A and C, B and C, or A and B and C. Although the disclosure includes a method, it is contemplated that it may be embodied as computer program instructions on a tangible computer-readable carrier, such as a magnetic or optical memory or a magnetic or optical disk. All structural, chemical, and functional equivalents to the elements of the above-described various embodiments are expressly incorporated herein by reference and are intended to be encompassed by the present claims. Moreover, it is not necessary for a device or method to address each and every problem sought to be solved by the disclosure for it to be encompassed by the present claims. Furthermore, no element, component, or method step in the disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims. No claim element is intended to invoke 35 U.S.C. § 112(f) unless the element is expressly recited using the phrase “means for” or “step for”. As used herein, the terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
The process flows and screenshots depicted are merely embodiments and are not intended to limit the scope of the disclosure. For example, the steps recited in any of the method or process descriptions may be executed in any order and are not limited to the order presented. It will be appreciated that the following description makes appropriate references not only to the steps and user interface elements, but also to the various system components as described herein. It should be understood that, although exemplary embodiments are illustrated in the figures and described herein, the principles of the disclosure may be implemented using any number of techniques, whether currently known or not. The disclosure should in no way be limited to the exemplary implementations and techniques illustrated in the drawings and described below. Unless otherwise specifically noted, articles depicted in the drawings are not necessarily drawn to scale.
Computer programs (also referred to as computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via communications interface. Such computer programs, when executed, enable the computer system to perform the features as discussed herein. In particular, the computer programs, when executed, enable the processor to perform the features of various embodiments. Accordingly, such computer programs represent controllers of the computer system.
These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
In various embodiments, software may be stored in a computer program product and loaded into a computer system using a removable storage drive, hard disk drive, or communications interface. The control logic (software), when executed by the processor, causes the processor to perform the functions of various embodiments as described herein. In various embodiments, hardware components may take the form of application specific integrated circuits (ASICs). Implementation of the hardware so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s).
As will be appreciated by one of ordinary skill in the art, the system may be embodied as a customization of an existing system, an add-on product, a processing apparatus executing upgraded software, a stand-alone system, a distributed system, a method, a data processing system, a device for data processing, and/or a computer program product. Accordingly, any portion of the system or a module may take the form of a processing apparatus executing code, an internet based embodiment, an entirely hardware embodiment, or an embodiment combining aspects of the internet, software, and hardware. Furthermore, the system may take the form of a computer program product on a computer-readable storage medium having computer-readable program code means embodied in the storage medium. Any suitable computer-readable storage medium may be utilized, including hard disks, CD-ROM, BLU-RAY DISC®, optical storage devices, magnetic storage devices, and/or the like.
In various embodiments, components, modules, and/or engines of system 100 may be implemented as micro-applications or micro-apps. Micro-apps are typically deployed in the context of a mobile operating system, including for example, a WINDOWS® mobile operating system, an ANDROID® operating system, an APPLE® iOS operating system, a BLACKBERRY® company's operating system, and the like. The micro-app may be configured to leverage the resources of the larger operating system and associated hardware via a set of predetermined rules which govern the operations of various operating systems and hardware resources. For example, where a micro-app desires to communicate with a device or network other than the mobile device or mobile operating system, the micro-app may leverage the communication protocol of the operating system and associated device hardware under the predetermined rules of the mobile operating system. Moreover, where the micro-app desires an input from a user, the micro-app may be configured to request a response from the operating system which monitors various hardware components and then communicates a detected input from the hardware to the micro-app.
The system and method may be described herein in terms of functional block components, screen shots, optional selections, and various processing steps. It should be appreciated that such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions. For example, the system may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly, the software elements of the system may be implemented with any programming or scripting language such as C, C++, C#, JAVA®, JAVASCRIPT®, JAVASCRIPT® Object Notation (JSON), VBScript, Macromedia COLD FUSION, COBOL, MICROSOFT® company's Active Server Pages, assembly, PERL®, PHP, awk, PYTHON®, Visual Basic, SQL Stored Procedures, PL/SQL, any UNIX® shell script, and extensible markup language (XML) with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements. Further, it should be noted that the system may employ any number of techniques for data transmission, signaling, data processing, network control, and the like. Still further, the system could be used to detect or prevent security issues with a client-side scripting language, such as JAVASCRIPT®, VBScript, or the like.
The system and method are described herein with reference to screen shots, block diagrams and flowchart illustrations of methods, apparatus, and computer program products according to various embodiments. It will be understood that each functional block of the block diagrams and the flowchart illustrations, and combinations of functional blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by computer program instructions.
Accordingly, functional blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, and program instruction means for performing the specified functions. It will also be understood that each functional block of the block diagrams and flowchart illustrations, and combinations of functional blocks in the block diagrams and flowchart illustrations, can be implemented by either special purpose hardware-based computer systems which perform the specified functions or steps, or suitable combinations of special purpose hardware and computer instructions. Further, illustrations of the process flows and the descriptions thereof may make reference to user WINDOWS® applications, webpages, websites, web forms, prompts, etc. Practitioners will appreciate that the illustrated steps described herein may comprise, in any number of configurations, including the use of WINDOWS® applications, webpages, web forms, popup WINDOWS® applications, prompts, and the like. It should be further appreciated that the multiple steps as illustrated and described may be combined into single webpages and/or WINDOWS® applications but have been expanded for the sake of simplicity. In other cases, steps illustrated and described as single process steps may be separated into multiple webpages and/or WINDOWS® applications but have been combined for simplicity.
In various embodiments, the software elements of the system may also be implemented using a JAVASCRIPT® run-time environment configured to execute JAVASCRIPT® code outside of a web browser. For example, the software elements of the system may also be implemented using NODE.JS® components. NODE.JS® programs may implement several modules to handle various core functionalities. For example, a package management module, such as NPM®, may be implemented as an open source library to aid in organizing the installation and management of third-party NODE.JS® programs. NODE.JS® programs may also implement a process manager, such as, for example, Parallel Multithreaded Machine (“PM2”); a resource and performance monitoring tool, such as, for example, Node Application Metrics (“appmetrics”); a library module for building user interfaces, and/or any other suitable and/or desired module.
Middleware may include any hardware and/or software suitably configured to facilitate communications between disparate computing systems. Any number of middleware components may be contemplated. Middleware may be implemented through commercially available hardware and/or software, through custom hardware and/or software components, or through a combination thereof. Middleware may reside in a variety of configurations and may exist as a standalone system or may be a software component residing on the internet server. Middleware may be configured to communicate between the various components of an application server and any number of internal or external systems for any of the purposes disclosed herein. WEBSPHERE® MQ™ (formerly MQSeries) by IBM®, Inc. (Armonk, N.Y.) is an example of a commercially available middleware product. An Enterprise Service Bus (“ESB”) application is another example of middleware.
The computers discussed herein may provide a suitable website or other internet-based graphical user interface which is accessible by users. In one embodiment, MICROSOFT® company's Internet Information Services (IIS), Transaction Server (MTS) service, and an SQL SERVER® database, are used in conjunction with MICROSOFT® operating systems, WINDOWS NT® web server software, SQL SERVER® database, and MICROSOFT® Commerce Server. Additionally, components such as ACCESS® software, SQL SERVER® database, ORACLE® software, SYBASE® software, INFORMIX® software, MYSQL® software, INTERBASE® software, etc., may be used to provide an Active Data Object (ADO) compliant database management system. In one embodiment, the APACHE® web server is used in conjunction with a LINUX® operating system, a MYSQL® database, and PERL®, PHP, Ruby, and/or PYTHON® programming languages.
For the sake of brevity, data networking, application development, and other functional aspects of the systems (and components of the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent exemplary functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in a practical system.
In various embodiments, the methods described herein are implemented using the various particular machines described herein. The methods described herein may be implemented using the below particular machines, and those hereinafter developed, in any suitable combination, as would be appreciated immediately by one skilled in the art. Further, as is apparent from this disclosure, the methods described herein may result in various transformations of certain articles.
In various embodiments, the system and various components may integrate with one or more smart digital assistant technologies. For example, exemplary smart digital assistant technologies may include the ALEXA® system developed by the AMAZON® company, the GOOGLE HOME® system developed by Alphabet, Inc., the HOMEPOD® system of the APPLE® company, and/or similar digital assistant technologies. The ALEXA® system, GOOGLE HOME® system, and HOMEPOD® system, may each provide cloud-based voice activation services that can assist with tasks, entertainment, general information, and more. All the ALEXA® devices, such as the AMAZON ECHO®, AMAZON ECHO DOT®, AMAZON TAP®, and AMAZON FIRE® TV, have access to the ALEXA® system. The ALEXA® system, GOOGLE HOME® system, and HOMEPOD® system may receive voice commands via its voice activation technology, activate other functions, control smart devices, and/or gather information. For example, the smart digital assistant technologies may be used to interact with music, emails, texts, phone calls, question answering, home improvement information, smart home communication/activation, games, shopping, making to-do lists, setting alarms, streaming podcasts, playing audiobooks, and providing weather, traffic, and other real time information, such as news. The ALEXA®, GOOGLE HOME®, and HOMEPOD® systems may also allow the user to access information about eligible transaction accounts linked to an online account across all digital assistant-enabled devices.
The various system components discussed herein may include one or more of the following: a host server or other computing systems including a processor for processing digital data; a memory coupled to the processor for storing digital data; an input digitizer coupled to the processor for inputting digital data; an application program stored in the memory and accessible by the processor for directing processing of digital data by the processor; a display device coupled to the processor and memory for displaying information derived from digital data processed by the processor; and a plurality of databases. Various databases used herein may include: client data; merchant data; financial institution data; and/or like data useful in the operation of the system. As those skilled in the art will appreciate, user computer may include an operating system (e.g., WINDOWS®, UNIX®, LINUX®, SOLARIS®, MACOS®, etc.) as well as various support software and drivers typically associated with computers.
The present system or any part(s) or function(s) thereof may be implemented using hardware, software, or a combination thereof and may be implemented in one or more computer systems or other processing systems. However, the manipulations performed by embodiments may be referred to in terms, such as matching or selecting, which are commonly associated with mental operations performed by a human operator. No such capability of a human operator is necessary, or desirable, in most cases, in any of the operations described herein. Rather, the operations may be machine operations or any of the operations may be conducted or enhanced by artificial intelligence (AI) or machine learning. AI may refer generally to the study of agents (e.g., machines, computer-based systems, etc.) that perceive the world around them, form plans, and make decisions to achieve their goals. Foundations of AI include mathematics, logic, philosophy, probability, linguistics, neuroscience, and decision theory. Many fields fall under the umbrella of AI, such as computer vision, robotics, machine learning, and natural language processing. Useful machines for performing the various embodiments include general purpose digital computers or similar devices. The AI or ML may store data in a decision tree in a novel way.
In various embodiments, the embodiments are directed toward one or more computer systems capable of carrying out the functionalities described herein. The computer system includes one or more processors. The processor is connected to a communication infrastructure (e.g., a communications bus, cross-over bar, network, etc.). Various software embodiments are described in terms of this exemplary computer system. After reading this description, it will become apparent to a person skilled in the relevant art(s) how to implement various embodiments using other computer systems and/or architectures. The computer system can include a display interface that forwards graphics, text, and other data from the communication infrastructure (or from a frame buffer not shown) for display on a display unit.
The computer system also includes a main memory, such as random access memory (RAM), and may also include a secondary memory. The secondary memory may include, for example, a hard disk drive, a solid-state drive, and/or a removable storage drive. The removable storage drive reads from and/or writes to a removable storage unit. As will be appreciated, the removable storage unit includes a computer usable storage medium having stored therein computer software and/or data.
In various embodiments, secondary memory may include other similar devices for allowing computer programs or other instructions to be loaded into a computer system. Such devices may include, for example, a removable storage unit and an interface. Examples of such may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an erasable programmable read only memory (EPROM), programmable read only memory (PROM)) and associated socket, or other removable storage units and interfaces, which allow software and data to be transferred from the removable storage unit to a computer system.
The terms “computer program medium,” “computer usable medium,” and “computer readable medium” are used to generally refer to media such as a removable storage drive and a hard disk installed in a hard disk drive. These computer program products provide software to a computer system.
The computer system may also include a communications interface. A communications interface allows software and data to be transferred between the computer system and external devices. Examples of such a communications interface may include a modem, a network interface (such as an Ethernet card), a communications port, etc. Software and data transferred via the communications interface are in the form of signals which may be electronic, electromagnetic, optical, or other signals capable of being received by communications interface. These signals are provided to communications interface via a communications path (e.g., channel). This channel carries signals and may be implemented using wire, cable, fiber optics, a telephone line, a cellular link, a radio frequency (RF) link, wireless and other communications channels.
As used herein, an “identifier” may be any suitable identifier that uniquely identifies an item. For example, the identifier may be a globally unique identifier (“GUID”). The GUID may be an identifier created and/or implemented under the universally unique identifier standard. Moreover, the GUID may be stored as a 128-bit value that can be displayed as 32 hexadecimal digits. The identifier may also include a major number and a minor number. The major number and minor number may each be 16-bit integers.
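The identifier structure above can be sketched with the standard library; the major/minor packing shown here is a hypothetical illustration of two 16-bit integers combined into a single field, not a requirement of any particular embodiment.

```python
import uuid

# A GUID under the universally unique identifier standard: a 128-bit
# value that can be displayed as 32 hexadecimal digits.
guid = uuid.uuid4()
hex_digits = guid.hex                  # 32 hexadecimal characters
bits = guid.int.bit_length()           # at most 128 bits

# Hypothetical major/minor numbers, each a 16-bit integer, packed into
# one 32-bit field and recovered by shifting and masking.
major, minor = 1, 7
packed = (major << 16) | minor
unpacked = (packed >> 16, packed & 0xFFFF)
```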
In various embodiments, the server may include application servers (e.g., WEBSPHERE®, WEBLOGIC®, JBOSS®, POSTGRES PLUS ADVANCED SERVER®, etc.). In various embodiments, the server may include web servers (e.g., Apache, IIS, GOOGLE® Web Server, SUN JAVA® System Web Server, JAVA® Virtual Machine running on LINUX® or WINDOWS® operating systems).
A web client includes any device or software which communicates via any network, such as, for example any device or software discussed herein. The web client may include internet browsing software installed within a computing unit or system to conduct online communications. These computing units or systems may take the form of a computer or set of computers, although other types of computing units or systems may be used, including personal computers, laptops, notebooks, tablets, smart phones, cellular phones, personal digital assistants, servers, pooled servers, mainframe computers, distributed computing clusters, kiosks, terminals, point of sale (POS) devices or terminals, televisions, or any other device capable of receiving data over a network. The web client may include an operating system (e.g., WINDOWS®, WINDOWS MOBILE® operating systems, UNIX® operating system, LINUX® operating systems, APPLE® OS® operating systems, etc.) as well as various support software and drivers typically associated with computers. The web-client may also run MICROSOFT® INTERNET EXPLORER® software, MOZILLA® FIREFOX® software, GOOGLE CHROME™ software, APPLE® SAFARI® software, or any other of the myriad software packages available for browsing the internet.
As those skilled in the art will appreciate, the web client may or may not be in direct contact with the server (e.g., application server, web server, etc., as discussed herein). For example, the web client may access the services of the server through another server and/or hardware component, which may have a direct or indirect connection to an internet server. For example, the web client may communicate with the server via a load balancer. In various embodiments, web client access is through a network or the internet through a commercially-available web-browser software package. In that regard, the web client may be in a home or business environment with access to the network or the internet. The web client may implement security protocols such as Secure Sockets Layer (SSL) and Transport Layer Security (TLS). A web client may implement several application layer protocols including HTTP, HTTPS, FTP, and SFTP.
The various system components may be independently, separately, or collectively suitably coupled to the network via data links, which include, for example, a connection to an Internet Service Provider (ISP) over the local loop as is typically used in connection with standard modem communication, cable modem, DISH NETWORK®, ISDN, Digital Subscriber Line (DSL), or various wireless communication methods. It is noted that the network may be implemented as other types of networks, such as an interactive television (ITV) network. Moreover, the system contemplates the use, sale, or distribution of any goods, services, or information over any network having similar functionality described herein.
The system contemplates uses in association with web services, utility computing, pervasive and individualized computing, security and identity solutions, autonomic computing, cloud computing, commodity computing, mobility and wireless solutions, open source, biometrics, grid computing, and/or mesh computing.
Any of the communications, inputs, storage, databases or displays discussed herein may be facilitated through a website having web pages. The term “web page” as it is used herein is not meant to limit the type of documents and applications that might be used to interact with the user. For example, a typical website might include, in addition to standard HTML documents, various forms, JAVA® applets, JAVASCRIPT® programs, active server pages (ASP), common gateway interface scripts (CGI), extensible markup language (XML), dynamic HTML, cascading style sheets (CSS), AJAX (Asynchronous JAVASCRIPT And XML) programs, helper applications, plug-ins, and the like. A server may include a web service that receives a request from a web server, the request including a URL and an IP address (e.g., 192.168.1.1). The web server retrieves the appropriate web pages and sends the data or applications for the web pages to the IP address. Web services are applications that are capable of interacting with other applications over a communications means, such as the internet. Web services are typically based on standards or protocols such as XML, SOAP, AJAX, WSDL and UDDI. For example, representational state transfer (REST), or RESTful, web services may provide one way of enabling interoperability between applications.
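A RESTful web service of the kind described above can be sketched as a dispatch from a method and resource path to a handler; the resource names and handler below are illustrative assumptions, not part of any specific system.

```python
# Hypothetical handler returning a representation of a resource.
def get_account(resource_id):
    return {"id": resource_id, "type": "account"}

# Route table mapping (HTTP method, resource collection) to a handler.
ROUTES = {("GET", "accounts"): get_account}

def dispatch(method, path):
    """Map a 'GET /accounts/42' style request onto a handler."""
    _, resource, resource_id = path.split("/")
    handler = ROUTES.get((method, resource))
    if handler is None:
        return 404, None
    return 200, handler(resource_id)
```

A request for an unknown resource falls through to a 404 status, mirroring how a web server responds when it cannot retrieve the appropriate page.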
The computing unit of the web client may be further equipped with an internet browser connected to the internet or an intranet using standard dial-up, cable, DSL, or any other internet protocol. Communications originating at a web client may pass through a firewall in order to prevent unauthorized access from users of other networks. Further, additional firewalls may be deployed between the varying components of CMS to further enhance security.
Encryption may be performed by way of any of the techniques now available in the art or which may become available—e.g., Twofish, RSA, ElGamal, Schnorr signature, DSA, PGP, PKI, GPG (GnuPG), HPE Format-Preserving Encryption (FPE), Voltage, Triple DES, Blowfish, AES, MD5, HMAC, IDEA, RC6, and symmetric and asymmetric cryptosystems. The systems and methods may also incorporate SHA series cryptographic methods, elliptic curve cryptography (e.g., ECC, ECDH, ECDSA, etc.), and/or other post-quantum cryptography algorithms under development.
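Of the techniques listed, the SHA-series and HMAC methods are available directly in the standard library; the key and message below are illustrative placeholders.

```python
import hashlib
import hmac

message = b"data set payload"          # illustrative message
key = b"shared-secret"                 # hypothetical shared key

# SHA-256 digest: 256 bits rendered as 64 hexadecimal characters.
digest = hashlib.sha256(message).hexdigest()

# HMAC tag binding the message to the shared key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    """Recompute the tag and compare in constant time (timing-safe)."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```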
The firewall may include any hardware and/or software suitably configured to protect CMS components and/or enterprise computing resources from users of other networks. Further, a firewall may be configured to limit or restrict access to various systems and components behind the firewall for web clients connecting through a web server. Firewall may reside in varying configurations including Stateful Inspection, Proxy based, access control lists, and Packet Filtering among others. Firewall may be integrated within a web server or any other CMS components or may further reside as a separate entity. A firewall may implement network address translation (“NAT”) and/or network address port translation (“NAPT”). A firewall may accommodate various tunneling protocols to facilitate secure communications, such as those used in virtual private networking. A firewall may implement a demilitarized zone (“DMZ”) to facilitate communications with a public network such as the internet. A firewall may be integrated as software within an internet server or any other application server components, reside within another computing device, or take the form of a standalone hardware component.
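The packet-filtering configuration mentioned above can be sketched as a first-match access control list with a default-deny posture; the networks, ports, and rule shapes are illustrative assumptions.

```python
import ipaddress

# Hypothetical access control list: (source network, destination port,
# action). Rules are evaluated in order; the first match wins.
ACL = [
    (ipaddress.ip_network("10.0.0.0/8"), 443, "allow"),   # internal HTTPS
    (ipaddress.ip_network("0.0.0.0/0"), 23, "deny"),      # block telnet
]

def filter_packet(src_ip, dst_port):
    """Return the ACL action for a packet, defaulting to deny."""
    addr = ipaddress.ip_address(src_ip)
    for network, port, action in ACL:
        if addr in network and dst_port == port:
            return action
    return "deny"  # default-deny when no rule matches
```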
Any databases discussed herein may include relational, hierarchical, graphical, blockchain, object-oriented structure, and/or any other database configurations. Any database may also include a flat file structure wherein data may be stored in a single file in the form of rows and columns, with no structure for indexing and no structural relationships between records. For example, a flat file structure may include a delimited text file, a CSV (comma-separated values) file, and/or any other suitable flat file structure. Common database products that may be used to implement the databases include DB2® by IBM® (Armonk, N.Y.), various database products available from ORACLE® Corporation (Redwood Shores, Calif.), MICROSOFT ACCESS® or MICROSOFT SQL SERVER® by MICROSOFT® Corporation (Redmond, Wash.), MYSQL® by MySQL AB (Uppsala, Sweden), MONGODB®, Redis, APACHE CASSANDRA®, HBASE® by APACHE®, MapR-DB by the MAPR® corporation, or any other suitable database product. Moreover, any database may be organized in any suitable manner, for example, as data tables or lookup tables. Each record may be a single file, a series of files, a linked series of data fields, or any other data structure.
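The flat file structure described above—rows and columns in a delimited text file with no indexing and no structural relationships between records—can be sketched with the standard library; the column names and contents are illustrative.

```python
import csv
import io

# A CSV flat file: each line is one record, columns separated by commas,
# with no indexes or relationships between records.
flat_file = io.StringIO("id,name,status\n1,alpha,READY\n2,beta,LOADED\n")

# Reading requires a sequential scan; lookups cannot use an index.
records = list(csv.DictReader(flat_file))
ready = [r["name"] for r in records if r["status"] == "READY"]
```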
As used herein, big data may refer to partially or fully structured, semi-structured, or unstructured data sets including millions of rows and hundreds of thousands of columns. A big data set may be compiled, for example, from a history of purchase transactions over time, from web registrations, from social media, from records of charge (ROC), from summaries of charges (SOC), from internal data, or from other suitable sources. Big data sets may be compiled without descriptive metadata such as column types, counts, percentiles, or other interpretive-aid data points.
Association of certain data may be accomplished through various data association techniques. For example, the association may be accomplished either manually or automatically. Automatic association techniques may include, for example, a database search, a database merge, GREP, AGREP, SQL, using a key field in the tables to speed searches, sequential searches through all the tables and files, sorting records in the file according to a known order to simplify lookup, and/or the like. The association step may be accomplished by a database merge function, for example, using a “key field” in pre-selected databases or data sectors. Various database tuning steps are contemplated to optimize database performance. For example, frequently used files such as indexes may be placed on separate file systems to reduce Input/Output (“I/O”) bottlenecks.
More particularly, a “key field” partitions the database according to the high-level class of objects defined by the key field. For example, certain types of data may be designated as a key field in a plurality of related data tables and the data tables may then be linked on the basis of the type of data in the key field. The data corresponding to the key field in each of the linked data tables is preferably the same or of the same type. However, data tables having similar, though not identical, data in the key fields may also be linked by using AGREP, for example. In accordance with various embodiments, any suitable data storage technique may be utilized to store data without a standard format. Data sets may be stored using any suitable technique, including, for example, storing individual files using an ISO/IEC 7816-4 file structure; implementing a domain whereby a dedicated file is selected that exposes one or more elementary files containing one or more data sets; using data sets stored in individual files using a hierarchical filing system; data sets stored as records in a single file (including compression, SQL accessible, hashed via one or more keys, numeric, alphabetical by first tuple, etc.); data stored as Binary Large Object (BLOB); data stored as ungrouped data elements encoded using ISO/IEC 7816-6 data elements; data stored as ungrouped data elements encoded using ISO/IEC Abstract Syntax Notation (ASN.1) as in ISO/IEC 8824 and 8825; other proprietary techniques that may include fractal compression methods, image compression methods, etc.
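Linking data tables on a key field, as described above, can be sketched as an in-memory merge; the table names, columns, and contents are illustrative.

```python
# Hypothetical data tables sharing the key field "account_id".
accounts = [{"account_id": "A1", "owner": "first party"},
            {"account_id": "A2", "owner": "second party"}]
charges = [{"account_id": "A1", "amount": 10},
           {"account_id": "A1", "amount": 5},
           {"account_id": "A2", "amount": 7}]

def link_on_key(left, right, key):
    """Join two tables on a shared key field, like a database merge."""
    index = {}
    for row in right:                        # index the right table by key
        index.setdefault(row[key], []).append(row)
    return [{**l, **r}                       # one merged row per match
            for l in left for r in index.get(l[key], [])]

linked = link_on_key(accounts, charges, "account_id")
```

Fuzzy linking of similar-but-not-identical keys (the AGREP case above) would replace the exact dictionary lookup with an approximate string match.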
In various embodiments, the ability to store a wide variety of information in different formats is facilitated by storing the information as a BLOB. Thus, any binary information can be stored in a storage space associated with a data set. As discussed above, the binary information may be stored in association with the system or external to but affiliated with the system. The BLOB method may store data sets as ungrouped data elements formatted as a block of binary via a fixed memory offset using either fixed storage allocation, circular queue techniques, or best practices with respect to memory management (e.g., paged memory, least recently used, etc.). By using BLOB methods, the ability to store various data sets that have different formats facilitates the storage of data, in the database or associated with the system, by multiple and unrelated owners of the data sets. For example, a first data set which may be stored may be provided by a first party, a second data set which may be stored may be provided by an unrelated second party, and yet a third data set which may be stored may be provided by a third party unrelated to the first and second party. Each of these three exemplary data sets may contain different information that is stored using different data storage formats and/or techniques. Further, each data set may contain subsets of data that also may be distinct from other subsets.
As stated above, in various embodiments, the data can be stored without regard to a common format. However, the data set (e.g., BLOB) may be annotated in a standard manner when provided for manipulating the data in the database or system. The annotation may comprise a short header, trailer, or other appropriate indicator related to each data set that is configured to convey information useful in managing the various data sets. For example, the annotation may be called a “condition header,” “header,” “trailer,” or “status,” herein, and may comprise an indication of the status of the data set or may include an identifier correlated to a specific issuer or owner of the data. In one example, the first three bytes of each data set BLOB may be configured or configurable to indicate the status of that particular data set; e.g., LOADED, INITIALIZED, READY, BLOCKED, REMOVABLE, or DELETED. Subsequent bytes of data may be used to indicate for example, the identity of the issuer, user, transaction/membership account identifier or the like. Each of these condition annotations are further discussed herein.
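The condition-header scheme described above—the first bytes of a BLOB indicating status, subsequent bytes identifying the issuer—can be sketched as follows; the three-byte status codes and the eight-byte issuer field are hypothetical layout choices.

```python
# Hypothetical three-byte status tags for the condition header.
STATUS = {b"LDD": "LOADED", b"INI": "INITIALIZED", b"RDY": "READY",
          b"BLK": "BLOCKED", b"RMV": "REMOVABLE", b"DEL": "DELETED"}

def read_annotation(blob):
    """Parse the condition header from the front of a data set BLOB."""
    status = STATUS.get(blob[:3], "UNKNOWN")       # first three bytes
    issuer = blob[3:11].rstrip(b"\x00").decode("ascii")  # next 8 bytes
    return status, issuer

# A BLOB whose header marks it READY and names a hypothetical issuer,
# followed by the opaque payload stored without regard to format.
blob = b"RDY" + b"ISSUER01" + b"payload bytes..."
```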
The data set annotation may also be used for other types of status information as well as various other purposes. For example, the data set annotation may include security information establishing access levels. The access levels may, for example, be configured to permit only certain individuals, levels of employees, companies, or other entities to access data sets, or to permit access to specific data sets based on the user or other data. Furthermore, the security information may restrict/permit only certain actions, such as accessing, modifying, and/or deleting data sets. In one example, the data set annotation indicates that only the data set owner or the user are permitted to delete a data set, various identified users may be permitted to access the data set for reading, and others are altogether excluded from accessing the data set. However, other access restriction parameters may also be used allowing various entities to access a data set with various permission levels as appropriate.
The data, including the header or trailer, may be received by a standalone interaction device configured to add, delete, modify, or augment the data in accordance with the header or trailer. As such, in one embodiment, the header or trailer is not stored on the device along with the associated data, but instead the appropriate action may be taken by providing to the user, at the standalone device, the appropriate option for the action to be taken. The system may contemplate a data storage arrangement wherein the header or trailer, or header or trailer history, of the data is stored on the system or device in relation to the appropriate data.
One skilled in the art will also appreciate that, for security reasons, any databases, systems, devices, servers, or other components of the system may comprise any combination thereof at a single location or at multiple locations, wherein each database or system includes any of various suitable security features, such as firewalls, access codes, encryption, decryption, compression, decompression, and/or the like.
Practitioners will also appreciate that there are a number of methods for displaying data within a browser-based document. Data may be represented as standard text or within a fixed list, scrollable list, drop-down list, editable text field, fixed text field, pop-up window, and the like. Likewise, there are a number of methods available for modifying data in a web page such as, for example, free text entry using a keyboard, selection of menu items, check boxes, option boxes, and the like.
The data may be big data that is processed by a distributed computing cluster. The distributed computing cluster may be, for example, a HADOOP® software cluster configured to process and store big data sets, with some of the nodes comprising a distributed storage system and some of the nodes comprising a distributed processing system. In that regard, the distributed computing cluster may be configured to support a HADOOP® software distributed file system (HDFS) as specified by the Apache Software Foundation at www.hadoop.apache.org/docs.
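The processing such a cluster distributes can be sketched locally as the classic map and reduce phases; the corpus is illustrative, and a real HADOOP® job would shard these phases across nodes.

```python
from collections import Counter
from itertools import chain

corpus = ["big data sets", "big data columns", "data"]  # illustrative records

def map_phase(record):
    # Emit (word, 1) pairs for each record, as a mapper node would.
    return [(word, 1) for word in record.split()]

def reduce_phase(pairs):
    # Sum counts per key, as reducer nodes would after the shuffle.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

word_counts = reduce_phase(chain.from_iterable(map_phase(r) for r in corpus))
```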
As used herein, the term “network” includes any cloud, cloud computing system, or electronic communications system or method which incorporates hardware and/or software components. Communication among the parties may be accomplished through any suitable communication channels, such as, for example, a telephone network, an extranet, an intranet, internet, point of interaction device (point of sale device, personal digital assistant (e.g., an IPHONE® device, a BLACKBERRY® device), cellular phone, kiosk, etc.), online communications, satellite communications, off-line communications, wireless communications, transponder communications, local area network (LAN), wide area network (WAN), virtual private network (VPN), networked or linked devices, keyboard, mouse, and/or any suitable communication or data input modality. Moreover, although the system is frequently described herein as being implemented with TCP/IP communications protocols, the system may also be implemented using IPX, APPLETALK® program, IPv6, NetBIOS, OSI, any tunneling protocol (e.g. IPsec, SSH, etc.), or any number of existing or future protocols. If the network is in the nature of a public network, such as the internet, it may be advantageous to presume the network to be insecure and open to eavesdroppers. Specific information related to the protocols, standards, and application software utilized in connection with the internet is generally known to those skilled in the relevant art and, as such, need not be detailed herein.
“Cloud” or “Cloud computing” includes a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. Cloud computing may include location-independent computing, whereby shared servers provide resources, software, and data to computers and other devices on demand.
As used herein, “transmit” may include sending electronic data from one system component to another over a network connection. Additionally, as used herein, “data” may encompass information such as commands, queries, files, data for storage, and the like, in digital or any other form.
Any database discussed herein may comprise a distributed ledger maintained by a plurality of computing devices (e.g., nodes) over a peer-to-peer network. Each computing device maintains a copy and/or partial copy of the distributed ledger and communicates with one or more other computing devices in the network to validate and write data to the distributed ledger. The distributed ledger may use features and functionality of blockchain technology, including, for example, consensus-based validation, immutability, and cryptographically chained blocks of data. The blockchain may comprise a ledger of interconnected blocks containing data. The blockchain may provide enhanced security because each block may hold individual transactions and the results of any blockchain executables. Each block may link to the previous block and may include a timestamp. Blocks may be linked because each block may include the hash of the prior block in the blockchain. The linked blocks form a chain, with each successor block in a single chain linking to exactly one predecessor block. Forks may be possible where divergent chains are established from a previously uniform blockchain, though typically only one of the divergent chains will be maintained as the consensus chain. In various embodiments, the blockchain may implement smart contracts that enforce data workflows in a decentralized manner. The system may also include applications deployed on user devices such as, for example, computers, tablets, smartphones, Internet of Things devices (“IoT” devices), etc. The applications may communicate with the blockchain (e.g., directly or via a blockchain node) to transmit and retrieve data. In various embodiments, a governing organization or consortium may control access to data stored on the blockchain. Registration with the governing organization(s) may enable participation in the blockchain network.
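The hash-linked block structure described above can be sketched minimally as follows; the block layout and field names are illustrative, not those of any particular blockchain platform.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash a block's canonical JSON form with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    # Each block carries a timestamp and the hash of the prior block.
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

def chain_is_valid(chain):
    # Each successor must hold the hash of its single predecessor, so
    # tampering with any earlier block breaks every later link.
    return all(chain[i + 1]["prev_hash"] == block_hash(chain[i])
               for i in range(len(chain) - 1))

genesis = make_block("genesis", "0" * 64)
second = make_block("txn: A->B", block_hash(genesis))
```

Because altering a block changes its hash, a node comparing `prev_hash` links can quickly detect a tampered or missing block, which is the immutability property relied on above.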
Data transfers performed through the blockchain-based system may propagate to the connected peers within the blockchain network within a duration that may be determined by the block creation time of the specific blockchain technology implemented. For example, on an ETHEREUM®-based network, a new data entry may become available within about 13-20 seconds as of this writing. On a HYPERLEDGER® Fabric 1.0 based platform, the duration is driven by the specific consensus algorithm that is chosen, and may be performed within seconds. In that respect, propagation times in the system may be improved compared to existing systems, and implementation costs and time to market may also be drastically reduced. The system also offers increased security at least partially due to the immutable nature of data that is stored in the blockchain, reducing the probability of tampering with various data inputs and outputs. Moreover, the system may also offer increased security of data by performing cryptographic processes on the data prior to storing the data on the blockchain. Therefore, by transmitting, storing, and accessing data using the system described herein, the security of the data is improved, which decreases the risk of the computer or network being compromised.
In various embodiments, the system may also reduce database synchronization errors by providing a common data structure, thus at least partially improving the integrity of stored data. The system also offers increased reliability and fault tolerance over traditional databases (e.g., relational databases, distributed databases, etc.) as each node operates with a full copy of the stored data, thus at least partially reducing downtime due to localized network outages and hardware failures. The system may also increase the reliability of data transfers in a network environment having reliable and unreliable peers, as each node broadcasts messages to all connected peers, and, as each block comprises a link to a previous block, a node may quickly detect a missing block and propagate a request for the missing block to the other nodes in the blockchain network.
The particular blockchain implementation described herein provides improvements over conventional technology by using a decentralized database and improved processing environments. In particular, the blockchain implementation improves computer performance by, for example, leveraging decentralized resources (e.g., lower latency). The distributed computational resources improve computer performance by, for example, reducing processing times. Furthermore, the distributed computational resources improve computer security using, for example, cryptographic protocols.
This application claims priority to, and the benefit of, U.S. Provisional Ser. No. 63/297,075 filed on Jan. 6, 2022 and entitled “Method and Apparatus for Creating 3D Objects with Lenticular Surfaces,” the entire contents of which is hereby incorporated by reference in its entirety for all purposes.
Number | Date | Country
---|---|---
63/297,075 | Jan. 6, 2022 | US