One of the most critical anthropometric measurements for bra product development is breast size, which is associated with bra cup size. The boundary of the breast(s) is important in determining the breast size. This boundary information, which requires identifying where the fat tissue of the breast ceases, is not available on a 3D scan. Without this information, there is no way to correctly determine the breast volume or predict a breast size. It is therefore no wonder that bra fit has troubled women for decades and that as many as 85% of women wear a wrong-size bra on a daily basis. In addition, an ill-fitting bra can bring health concerns to the wearer, such as back pain, shoulder pain and neck pain.
Currently, it is common practice to find the boundary through physical manipulation of the breasts, such as by pushing the entire breast upward to reveal the folding line. However, for obvious reasons, this method may be an unpleasant experience for the individual.
Accordingly, disclosed is a non-contact method for determining a boundary of breasts. The method may comprise receiving, by a processor, a plurality of three-dimensional (3D) images. The three-dimensional images may be successive 3D images. The 3D images may include the breasts of the same individual. The 3D images may be acquired while the individual is moving. The method may further comprise receiving, by the processor, a three-dimensional (3D) image acquired while the individual is stationary, defining a number of datapoints on the surface of the breasts in the 3D image acquired while the individual is stationary and a number of datapoints on the surface of an alignment region and selecting a subset of the 3D images acquired while the individual is moving. For each selected 3D image in the subset, the method may comprise pre-processing the selected 3D image to at least remove image data outside a predetermined region and rotate the selected 3D image to have each selected 3D image in the same orientation as the 3D image acquired while the individual is stationary, aligning the selected 3D image with respect to the alignment region of the 3D image acquired when the individual is stationary, defining a number of datapoints on the surface of the breasts in the selected 3D image and comparing the selected 3D image with the 3D image acquired when the individual is stationary by determining for each defined datapoint a vertical displacement. The method may further comprise determining, for each defined datapoint, a displacement parameter based on the determined vertical displacement for each 3D image in the subset of 3D images with respect to the 3D image acquired when the individual is stationary for the same defined datapoint, generating a mapping based on the displacement parameter for each defined datapoint and determining the boundary of the breasts using a threshold based on the mapping.
In an aspect of the disclosure, the subset of 3D images may comprise 3D images showing at least one complete gait cycle. The subset of 3D images may also comprise at least a predetermined number of 3D images. In an aspect of the disclosure, the subset of 3D images may comprise 3D images acquired after a preset number of 3D images and before a preset number of 3D images.
In an aspect of the disclosure, the predetermined region may include the torso. In an aspect of the disclosure, the preprocessing may further comprise identifying an underbust level and bust point for each breast and removing image data below the identified underbust level.
In an aspect of the disclosure, the alignment region may be at an upper back area of the individual. In an aspect of the disclosure, the alignment may comprise minimizing a shape discrepancy between each selected 3D image and the 3D image acquired while the individual is stationary by iteratively moving a selected 3D image and calculating the shape discrepancy.
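The iterative alignment step can be sketched as below. This is a non-authoritative illustration that assumes the upper-back datapoints of the selected image and of the stationary image are already in point-to-point correspondence, and minimizes their mean squared discrepancy over a pure translation by gradient descent; a real implementation might instead use an ICP-style registration.

```python
import numpy as np

def align_by_back(points, ref_points, steps=200, lr=0.1):
    """Illustrative translation-only alignment (names are assumptions).

    points, ref_points : (K, 3) arrays of corresponding upper-back
    datapoints from the selected image and the stationary image.
    Iteratively moves the selected image (via `shift`) while computing
    the shape discrepancy (mean squared point-to-point distance).
    """
    shift = np.zeros(3)
    for _ in range(steps):
        residual = (points + shift) - ref_points
        grad = 2.0 * residual.mean(axis=0)  # gradient of mean squared error
        shift -= lr * grad                  # move the image to reduce discrepancy
    return points + shift, shift
```

Restricting the motion to a translation reflects that the torso is roughly rigid at the upper back; a fuller registration would also estimate rotation.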
In an aspect of the disclosure, the pre-processing may further comprise determining whether another body part is covering a surface of the breast and torso region and in response to determining that another body part is covering a surface of the breast or torso region, removing image data associated with the another body part and filling in a space corresponding to the removed image data with surface image points predicted for the space to maintain a curvature with a surrounding surface of the breast or maintain the curvature of the torso region.
In an aspect of the disclosure, the pre-processing may further comprise determining a first average value of image points in the predetermined region in a first direction, determining a second average value of image points in the predetermined region in a second direction orthogonal to the first direction and orthogonal to the longitudinal axis of the body; and defining the central axis of the predetermined region as intersecting by the first average value and the second average value and parallel to the longitudinal axis of the body. The first direction may be orthogonal to a longitudinal axis of the individual. The pre-processing may further comprise shifting the selected 3D image such that the central axis intersects an origin.
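A minimal sketch of this centering step follows, assuming the z-axis is the longitudinal axis of the body and the image is stored as an array of (x, y, z) points; the function name is illustrative.

```python
import numpy as np

def center_on_axis(points):
    """Define the central axis of the predetermined region by the mean x
    (first average value) and mean y (second average value) of the image
    points, then shift the image so that axis intersects the origin.
    `points` is an (M, 3) array of (x, y, z) image points.
    """
    cx = points[:, 0].mean()  # first average value
    cy = points[:, 1].mean()  # second average value
    # Shift only in the plane orthogonal to the longitudinal (z) axis.
    return points - np.array([cx, cy, 0.0])
```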
In an aspect of the disclosure, the vertical displacement may be determined using
dj=zij−zi0
where dj is an array containing the vertical displacements of all the defined datapoints for the j-th 3D image, where 1≤j≤N, where N is the number of 3D images in the subset, zij is the z-coordinate of the i-th defined datapoint of the j-th 3D image (1≤i≤M), where M is the number of defined datapoints, while zi0 is the z-coordinate of the i-th point of the 3D image acquired while the same individual is stationary. In an aspect of the disclosure, the displacement parameter may be a standard deviation.
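The formula above and the standard-deviation displacement parameter can be sketched as follows, assuming the z-coordinates of the M defined datapoints have been collected into arrays (the names and layout are illustrative):

```python
import numpy as np

def vertical_displacement_parameter(z_moving, z_static):
    """Compute dj = zij - zi0 for every frame at once, then the
    per-datapoint standard deviation used as the displacement parameter.

    z_moving : (N, M) z-coordinates of the M datapoints in each of the
               N moving-scan images; z_static : (M,) z-coordinates of
               the same datapoints in the stationary scan.
    """
    d = z_moving - z_static   # row j is the array dj
    return d, d.std(axis=0)   # displacement parameter per datapoint
```

Datapoints on the moving breast show a large standard deviation, while datapoints on the rigid torso show a small one, which is what makes the parameter usable for boundary detection.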
In an aspect of the disclosure, the threshold may be determined based on a range of the displacement parameters for the datapoints and a preset percentage. For example, the threshold may be determined by subtracting an average of the displacement parameters in a second region from an average of the displacement parameters in a first region and multiplying the difference by the preset percentage.
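One plausible reading of this rule (an assumption, not the authoritative formula) sets the threshold a preset percentage of the way between the average displacement parameter of a low-motion region and that of a high-motion region:

```python
import numpy as np

def range_threshold(sigma_high, sigma_low, percent):
    """Illustrative threshold rule. `sigma_high` holds displacement
    parameters from a high-motion first region (e.g. over the breasts),
    `sigma_low` from a low-motion second region (e.g. the upper torso),
    and `percent` is the preset percentage as a fraction in [0, 1].
    """
    hi = np.mean(sigma_high)
    lo = np.mean(sigma_low)
    # A preset fraction of the range between the two regional averages.
    return lo + percent * (hi - lo)
```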
In an aspect of the disclosure, the vertical slices may be parallel to the frontal or coronal plane of the individual. In this aspect, the datapoints on the surface of the breasts may be defined by partitioning, by the processor, each breast into the vertical slices and partitioning, by the processor, each vertical slice into a plurality of portions on the surface of the respective breast based on a fixed angular interval. Each portion may correspond to an angle value, and each portion may include a set of points. For each portion on each slice, the processor may determine an average distance among distances of the set of points with respect to one of the associated reference points for a corresponding vertical slice and set a point associated with the average distance as a datapoint represented by the angle value corresponding to the portion. The datapoint may be one of the number of datapoints identified. When there is an absence of image points for a particular portion in any of the vertical slices, a set of undefined values may be assigned to the datapoints for the particular portions. In an aspect of the disclosure, determining of the boundary may further comprise identifying, for each angle having a datapoint between a first angle and a second angle, a vertical slice having the displacement parameter closest to the threshold and identifying a median vertical slice among the identified vertical slices. Datapoints in the posterior direction of the median vertical slice may be removed.
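The angular partitioning of one slice can be sketched as below, assuming the slice's points have been projected to in-plane 2D coordinates and a reference point for the slice is given (both assumptions of this sketch); empty portions receive NaN as the "undefined values":

```python
import numpy as np

def angular_datapoints(slice_pts, ref, n_bins=36):
    """Partition one slice into portions at a fixed angular interval.

    slice_pts : (K, 2) in-plane coordinates of the slice's image points.
    ref       : (2,) the slice's associated reference point.
    Each angular bin yields one datapoint whose value is the average
    distance of that bin's points from the reference point; bins with
    no image points are assigned NaN (undefined).
    """
    rel = slice_pts - ref                   # coordinates about the reference point
    ang = np.arctan2(rel[:, 1], rel[:, 0])  # angle value of each point
    dist = np.hypot(rel[:, 0], rel[:, 1])   # distance of each point
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    bin_idx = np.digitize(ang, edges) - 1
    out = np.full(n_bins, np.nan)
    for b in range(n_bins):
        sel = bin_idx == b
        if sel.any():
            out[b] = dist[sel].mean()       # average distance for this portion
    return out
```

Representing each slice by a fixed-length array indexed by angle gives every 3D image the same datapoint layout, which is what allows per-datapoint comparison across frames.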
In other aspects of the disclosure, the vertical slices may be parallel to the sagittal plane of the individual. In this aspect, the datapoints on the surface of the breasts may be defined by partitioning, by the processor, each breast into vertical slices, the vertical slices being parallel to the sagittal plane and partitioning, by the processor, each vertical slice into a plurality of portions on the surface of the respective breast based on a fixed interval with respect to a first direction. Each portion may correspond to a specific value in the first direction, and each portion may include a set of points. The first direction may be orthogonal to the longitudinal axis and parallel to the sagittal plane. For each portion on each slice, the processor may determine an average coordinate among coordinates of the set of points for a corresponding vertical slice, the coordinate being in a direction parallel to the longitudinal axis and set a point associated with the average coordinate as a datapoint represented by the specific value corresponding to the portion. The datapoint may be one of the number of datapoints identified. In an aspect of the disclosure, determining of the boundary may further comprise identifying, for each vertical slice, the specific value having the displacement parameter closest to the threshold and identifying a median specific value among the identified specific values. Datapoints in the posterior direction of the median specific value may be removed.
In an aspect of the disclosure, the mapping may be displayed. In an aspect of the disclosure, the method may further comprise removing image data from the 3D image acquired when the individual is stationary based on the threshold and displaying a 3D image of the breasts.
Also disclosed is a non-contact method for predicting a cup size of breasts of an individual. The method may comprise receiving, by a processor, a plurality of three-dimensional (3D) images. The three-dimensional images may be successive 3D images. The 3D images may include the breasts of the same individual. The 3D images may be acquired while the individual is moving. The method may further comprise receiving, by the processor, a three-dimensional (3D) image acquired while the individual is stationary, defining a number of datapoints on the surface of the breasts in the 3D image acquired while the individual is stationary and a number of datapoints on the surface of an alignment region and selecting a subset of the 3D images acquired while the individual is moving. For each selected 3D image in the subset, the method may comprise pre-processing the selected 3D image to at least remove image data outside a predetermined region and rotate the selected 3D image to have each selected 3D image in the same orientation as the 3D image acquired while the individual is stationary, aligning the selected 3D image with respect to the alignment region of the 3D image acquired when the individual is stationary, defining a number of datapoints on the surface of the breasts in the selected 3D image and comparing the selected 3D image with the 3D image acquired when the individual is stationary by determining for each defined datapoint a vertical displacement. The method may further comprise determining, for each defined datapoint, a displacement parameter based on the determined vertical displacement for each 3D image in the subset of 3D images with respect to the 3D image acquired when the individual is stationary for the same defined datapoint, generating a mapping based on the displacement parameter for each defined datapoint and determining the boundary of the breasts using a threshold based on the mapping. 
The method may further comprise separating the breasts from other parts of the 3D image acquired while the individual is stationary based on the threshold value, defining a number of datapoints on the surface of the breasts in the 3D image acquired while the individual is stationary using horizontal slicing, calculating a shape discrepancy between the breasts in the 3D image acquired while the individual is stationary using the defined datapoints and datapoints in 3D images of breasts associated with known cup sizes, respectively and determining the cup size based on the calculated shape discrepancy for each known cup size. Each 3D image for the known cup sizes may be acquired while a model is stationary.
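A hedged sketch of the cup-size prediction step follows, taking root-mean-square point-to-point distance as an assumed shape-discrepancy metric and a dictionary of per-cup-size template datapoints as an assumed input format:

```python
import numpy as np

def predict_cup_size(datapoints, templates):
    """Pick the known cup size whose template datapoints have the
    smallest shape discrepancy from the individual's datapoints.

    datapoints : (M, 3) datapoints from the stationary scan, defined by
                 horizontal slicing.
    templates  : dict mapping cup-size label -> (M, 3) datapoints,
                 defined and processed the same way.
    """
    best, best_d = None, np.inf
    for size, tmpl in templates.items():
        # Assumed metric: root-mean-square point-to-point distance.
        d = np.sqrt(((datapoints - tmpl) ** 2).sum(axis=1).mean())
        if d < best_d:
            best, best_d = size, d
    return best, best_d
```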
In an aspect of the disclosure, the method may further comprise at least one of displaying the determined cup size or transmitting the determined cup size to a preset device.
Also disclosed is a non-contact method for evaluating a performance of a garment. The method may comprise receiving, by a processor, a plurality of three-dimensional (3D) images. The 3D images may be successive 3D images. The 3D images may include the breasts of the same individual. The 3D images may be acquired while the individual is moving. The method may further comprise receiving, by the processor, a three-dimensional (3D) image acquired while the individual is stationary, defining a number of datapoints on the surface of the breasts in the 3D image acquired while the individual is stationary and a number of datapoints on the surface of an alignment region and selecting a subset of the 3D images acquired while the individual is moving. For each selected 3D image in the subset, the method may comprise pre-processing the selected 3D image to at least remove image data outside a predetermined region and rotate the selected 3D image to have each selected 3D image in the same orientation as the 3D image acquired while the individual is stationary, aligning the selected 3D image with respect to the alignment region of the 3D image acquired when the individual is stationary, defining a number of datapoints on the surface of the breasts in the selected 3D image and comparing the selected 3D image with the 3D image acquired when the individual is stationary by determining for each defined datapoint a displacement. The method may further comprise determining, for each defined datapoint, a displacement parameter based on the determined displacement for each 3D image in the subset of 3D images with respect to the 3D image acquired when the individual is stationary for the same defined datapoint, generating a mapping based on the displacement parameter for each defined datapoint, identifying areas in the mapping with a displacement parameter greater than a threshold; and generating a report based on the identified areas.
In an aspect of the disclosure, the pre-processing may further comprise determining whether another body part is covering a surface of the breast and torso region and in response to determining that another body part is covering a surface of the breast or torso region, removing image data associated with the another body part and filling in a space corresponding to the removed image data with surface image points predicted for the space to maintain a curvature with a surrounding surface of the breast or maintain the curvature of the torso region.
In an aspect of the disclosure, the datapoints on the surface of the breasts may be defined using horizontal slicing. In an aspect of the disclosure, the datapoints may be defined by partitioning, by the processor, the breasts into horizontal slices and partitioning, by the processor, each horizontal slice into a plurality of portions on the surface of the breasts based on a fixed angular interval. Each portion may correspond to an angle value, and each portion may include a set of points. For each portion on each slice, the processor may determine an average distance among distances of the set of points with respect to one of the associated reference points and set a point associated with the average distance as a datapoint represented by the angle value corresponding to the portion. The datapoint may be one of the number of datapoints identified. When there is an absence of image points for a particular portion in any of the horizontal slices, a set of undefined values may be assigned to the datapoints for the particular portions.
In an aspect of the disclosure, the displacement may be a horizontal displacement. The horizontal displacement may be determined using
d′j=√(xij²+yij²)−√(xi0²+yi0²)
where d′j is an array containing the horizontal displacements (differences in distance from the associated reference point for the horizontal slice) in the j-th 3D image, where 1≤j≤N, where N is the number of 3D images in the subset, xij and yij are the x-coordinate and y-coordinate, respectively, of the i-th datapoint in the j-th 3D image, where 1≤i≤P, where P is the number of defined datapoints, while xi0 and yi0 are the x-coordinate and y-coordinate, respectively, of the i-th datapoint of the 3D image acquired when the individual is stationary. The displacement parameter may be a standard deviation of the calculated horizontal displacement for a given datapoint.
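The horizontal-displacement formula and its displacement parameter can be sketched directly, assuming the x/y coordinates are already expressed relative to the associated reference point of each horizontal slice (array names are illustrative):

```python
import numpy as np

def horizontal_displacement(xy_moving, xy_static):
    """Compute d'j = sqrt(xij^2 + yij^2) - sqrt(xi0^2 + yi0^2).

    xy_moving : (N, P, 2) x/y coordinates of the P datapoints in each of
                the N moving-scan images, relative to the reference point.
    xy_static : (P, 2) the same datapoints in the stationary scan.
    Returns the (N, P) displacements and their per-datapoint standard
    deviation (the displacement parameter).
    """
    r_moving = np.hypot(xy_moving[..., 0], xy_moving[..., 1])
    r_static = np.hypot(xy_static[..., 0], xy_static[..., 1])
    d = r_moving - r_static   # radial distance change per frame
    return d, d.std(axis=0)
```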
In an aspect of the disclosure, the base image for comparison may be a three-dimensional image acquired while the individual is stationary or one of the three-dimensional images acquired while the individual is moving for determining the boundary of the breasts. In an aspect of the disclosure, a non-contact method for determining a boundary of the breasts may comprise receiving a plurality of three-dimensional (3D) images. In some aspects, the 3D images may be successive 3D images acquired while the individual is moving. The plurality of 3D images may include the breasts of the same individual. In other aspects, one 3D image may be acquired while the individual is stationary. The method may further comprise selecting one 3D image as a base image. The method may further comprise selecting a subset of the 3D images acquired while the individual is moving. For the base image, the method may comprise pre-processing the base image to at least remove image data outside a predetermined region and rotate to a target orientation and defining a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the remaining selected 3D images in the subset (after the base 3D image may be removed) or the selected 3D images in the subset, the method may comprise pre-processing the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each 3D image selected in the same orientation as the base image, aligning the 3D image with respect to the alignment region of the base image, defining a number of datapoints on the surface of the breasts in the 3D image; and comparing the 3D image with the base image by determining for each defined datapoint a vertical displacement. 
The method may further comprise determining, for each defined datapoint, a displacement parameter based on the determined vertical displacement for each 3D image selected with respect to the base image for the same defined datapoint, generating a mapping based on the displacement parameter for each defined datapoint, and determining the boundary of the breasts using a threshold based on the mapping.
In another aspect of the disclosure, a non-contact method for predicting a cup size of breasts of an individual may comprise receiving a plurality of three-dimensional (3D) images. In some aspects, the 3D images may be successive 3D images acquired while the individual is moving. The plurality of 3D images may include the breasts of the same individual. In other aspects, one 3D image may be acquired while the individual is stationary. The method may further comprise selecting one 3D image as a base image. The method may further comprise selecting a subset of the 3D images acquired while the individual is moving. For the base image, the method may comprise pre-processing the base image to at least remove image data outside a predetermined region and rotate to a target orientation and defining a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the remaining selected 3D images in the subset (after the base image may be removed) or the selected 3D images in the subset, the method may comprise pre-processing the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each 3D image selected in the same orientation as the base image, aligning the 3D image with respect to the alignment region of the base image, defining a number of datapoints on the surface of the breasts in the 3D image; and comparing the 3D image with the base image by determining for each defined datapoint a vertical displacement. The method may further comprise determining, for each defined datapoint, a displacement parameter based on the determined vertical displacement for each 3D image selected with respect to the base image for the same defined datapoint, generating a mapping based on the displacement parameter for each defined datapoint, and determining the boundary of the breasts using a threshold based on the mapping. 
The method may further comprise separating the breasts from other parts of the 3D image acquired while the individual is stationary based on the threshold value, defining a number of datapoints on the surface of the breasts in the 3D image acquired while the individual is stationary using horizontal slicing, calculating a shape discrepancy between the breasts in the 3D image acquired while the individual is stationary using the defined datapoints and datapoints in 3D images of breasts associated with known cup sizes, respectively and determining the cup size based on the calculated shape discrepancy for each known cup size. Each 3D image for the known cup sizes may be acquired while a model is stationary.
In another aspect of the disclosure, a non-contact method for evaluating a performance of a garment may comprise receiving a plurality of three-dimensional (3D) images. In some aspects, the 3D images may be successive 3D images acquired while the individual is moving. The plurality of 3D images may include the breasts of the same individual. In other aspects, one 3D image may be acquired while the individual is stationary. The method may further comprise selecting one 3D image as a base image. The method may further comprise selecting a subset of the 3D images acquired while the individual is moving. For the base image, the method may comprise pre-processing the base image to at least remove image data outside a predetermined region and rotate to a target orientation and defining a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the remaining selected 3D images in the subset (after the base image may be removed) or the selected 3D images in the subset, the method may comprise pre-processing the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each 3D image selected in the same orientation as the base image, aligning the 3D image with respect to the alignment region of the base image, defining a number of datapoints on the surface of the breasts in the 3D image; and comparing the 3D image with the base image by determining for each defined datapoint a displacement. The method may further comprise determining, for each defined datapoint, a displacement parameter based on the determined displacement for each 3D image selected with respect to the base image for the same defined datapoint, generating a mapping based on the displacement parameter for each defined datapoint, and determining the boundary of the breasts using a threshold based on the mapping. 
The method may further comprise identifying areas in the mapping with a displacement parameter greater than a threshold; and generating a report based on the identified areas.
Also disclosed is an apparatus or system which may comprise a three-dimensional (3D) image scanner, a memory, a processor and a display. The 3D image scanner may be configured to obtain images of an individual and generate a plurality of 3D images of the individual. The memory may be configured to store image data for each 3D image. The processor may be configured to select a subset of the 3D images. The subset of 3D images may be 3D images acquired while the individual is moving. The processor may also be configured to select a base image. The base image may be a 3D image acquired while the individual is stationary or one of the selected 3D images in the subset. For the base image, the processor may be configured to pre-process the base image to at least remove image data outside a predetermined region and rotate to a target orientation; and define a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the selected 3D images in the subset or the remaining 3D images in the subset (after the base 3D image may be removed), the processor may be configured to pre-process the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each 3D image selected in the same orientation as the base image, align the 3D image with respect to the alignment region of the base image, define a number of datapoints on the surface of the breasts in the 3D image and compare the 3D image with the base image by determining for each defined datapoint a vertical displacement. The processor may also be configured to determine, for each defined datapoint, a displacement parameter based on the determined vertical displacement for each 3D image selected with respect to the base image for the same defined datapoint, generate a mapping based on the displacement parameter for each defined datapoint and determine the boundary of the breasts using a threshold based on the mapping.
The display may be configured to display at least the mapping.
In an aspect of the disclosure, the datapoints on the surface of the breasts may be defined using vertical slicing.
In an aspect of the disclosure, the 3D images may be successive 3D images acquired while the individual is moving. In other aspects, the 3D image scanner may be further configured to obtain images while the individual is stationary and generate a three-dimensional (3D) image of the individual.
In an aspect of the disclosure, the 3D image scanner may comprise a plurality of cameras positioned at different locations to cover a 360° view.
In an aspect of the disclosure, the apparatus or system may comprise one or more communication interfaces. In an aspect of the disclosure, the 3D image scanner may be configured to transmit the 3D images to the processor via a communication interface. The communication interface may be wireless. In an aspect of the disclosure, the processor may be configured to transmit the mapping to the display via a communication interface.
In an aspect of the disclosure, the processor may be configured to predict a cup size of the breasts of an individual. In this aspect, the processor may be configured to separate the breasts from other parts of the 3D image acquired while the individual is stationary based on the threshold value, define a number of datapoints on the surface of the breasts in the 3D image acquired while the individual is stationary using horizontal slicing, calculate a shape discrepancy between the breasts in the 3D image acquired while the individual is stationary using the defined datapoints and datapoints in images of breasts associated with known cup sizes, respectively, and predict the cup size based on the calculated shape discrepancy for each known cup size. Each 3D image for the known cup sizes may be acquired while a model is stationary and processed in the same manner as the 3D image acquired while the individual is stationary.
In an aspect of the disclosure, the predicted cup size may be displayed and/or transmitted to a user terminal.
In an aspect of the disclosure, the apparatus or system may further comprise a point of sales terminal and the display may be in the point of sales terminal.
Also disclosed is an apparatus or system which may comprise a three-dimensional (3D) image scanner, a memory, and a processor. The 3D image scanner may be configured to obtain images of an individual and generate a plurality of 3D images of the individual. The memory may be configured to store image data for each 3D image. The processor may be configured to select a subset of the 3D images. The subset of 3D images may be 3D images acquired while the individual is moving. The processor may also be configured to select a base image. The base image may be a 3D image acquired while the individual is stationary or one of the selected 3D images in the subset (or another 3D image). For the base image, the processor may be configured to pre-process the base image to at least remove image data outside a predetermined region and rotate to a target orientation; and define a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the selected 3D images in the subset or the remaining 3D images in the subset (after the base 3D image may be removed), the processor may be configured to pre-process the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each 3D image selected in the same orientation as the base image, align the 3D image with respect to the alignment region of the base image, define a number of datapoints on the surface of the breasts in the 3D image and compare the 3D image with the base image by determining for each defined datapoint a displacement. 
The processor may also be configured to determine, for each defined datapoint, a displacement parameter based on the determined displacement for each 3D image selected with respect to the base image for the same defined datapoint, generate a mapping based on the displacement parameter for each defined datapoint, identify areas in the mapping with a displacement parameter greater than a threshold and generate a report based on the identified areas.
In an aspect of the disclosure, the apparatus or system may further comprise a display and the report may be displayed on the display. In other aspects, the processor may transmit the report to a user terminal.
In an aspect of the disclosure, the 3D images may be successive 3D images acquired while the individual is moving. In other aspects, the 3D image scanner may be further configured to obtain images while the individual is stationary and generate a three-dimensional (3D) image of the individual.
In an aspect of the disclosure, the 3D images may be acquired while the individual is wearing a garment. In an aspect of the disclosure, the garment may be a sports bra. In an aspect of the disclosure, the 3D images may be also acquired while the individual is nude and the processor may be configured to compare determined displacement when the individual is wearing the garment and when the individual is nude. In an aspect of the disclosure, the report may comprise a percent difference in the displacement when the individual is wearing the garment and when the individual is nude.
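One assumed formulation of this report metric, expressing how much of the nude-condition displacement the garment removes:

```python
def percent_reduction(disp_nude, disp_garment):
    """Percent difference in displacement between the garment and nude
    conditions for a given datapoint. The exact formulation used in the
    report is an assumption of this sketch.
    """
    return 100.0 * (disp_nude - disp_garment) / disp_nude
```

For example, a datapoint displaced 4 cm in the nude condition and 3 cm in a sports bra gives a 25% reduction, suggesting the garment removes a quarter of the breast motion at that point.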
In an aspect of the disclosure, the displacement may be a horizontal displacement.
Also disclosed is an apparatus which may comprise a processor and a display. The processor may be configured to receive a plurality of three-dimensional (3D) images and store the 3D images in memory, select a subset of the 3D images and select a base image. The subset of 3D images may be 3D images acquired while the individual is moving. The base image may be one of the selected 3D images in the subset. The base image may be another 3D image acquired while the individual is moving, or a 3D image acquired while the individual is stationary. For the base image, the processor may be configured to pre-process the base image to at least remove image data outside a predetermined region and rotate to a target orientation and define a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the selected 3D images in the subset or the remaining 3D images in the subset (after the base 3D image may be removed), the processor may be configured to pre-process the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each 3D image selected in the same orientation as the base image, align the 3D image with respect to the alignment region of the base image, define a number of datapoints on the surface of the breasts in the 3D image and compare the 3D image with the base image by determining for each defined datapoint a vertical displacement. The processor may be further configured to determine, for each defined datapoint, a displacement parameter based on the determined vertical displacement for each 3D image selected with respect to the base image for the same defined datapoint, generate a mapping based on the displacement parameter for each defined datapoint and determine the boundary of the breasts using a threshold based on the mapping. The display may be configured to display at least the mapping.
In an aspect of the disclosure, the processor may be further configured to predict a cup size of breasts of an individual.
Also disclosed is an apparatus which may comprise a processor. The processor may be configured to receive a plurality of three-dimensional (3D) images, store the 3D images in memory, select a subset of the 3D images and select a base image. The subset of 3D images may be 3D images acquired while the individual is moving. The base image may be one of the selected 3D images in the subset. Alternatively, the base image may be another 3D image acquired while the individual is moving, or a 3D image acquired while the individual is stationary. For the base image, the processor may be configured to pre-process the base image to at least remove image data outside a predetermined region and rotate it to a target orientation, and define a number of datapoints on the surface of the breasts and a number of datapoints on the surface of an alignment region. For the selected 3D images in the subset or the remaining 3D images in the subset (after the base 3D image is removed), the processor may be configured to pre-process the 3D image to at least remove image data outside a predetermined region and rotate the 3D image to have each selected 3D image in the same orientation as the base image, align the 3D image with respect to the alignment region of the base image, define a number of datapoints on the surface of the breasts in the 3D image and compare the 3D image with the base image by determining for each defined datapoint a displacement. The processor may be further configured to determine, for each defined datapoint, a displacement parameter based on the determined displacement for each selected 3D image with respect to the base image for the same defined datapoint, generate a mapping based on the displacement parameter for each defined datapoint, identify areas in the mapping with a displacement parameter greater than a threshold and generate a report based on the identified areas.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
In accordance with aspects of the disclosure, a device 10 or system 30 captures a series of three-dimensional images of an individual while the individual is moving to determine displacement of the breasts. In some aspects of the disclosure, the displacement may then be used to determine boundary information between the breasts and chest wall. The boundary information may also be used to predict a cup size for a bra. The 4D scanning described herein (where the fourth dimension is time) also makes it possible to track the whole breast under motion; therefore, the scans can also provide information regarding the shape change of the breasts during physical activity. In some aspects of the disclosure, this information may be used to evaluate the performance of a garment such as a sports bra. In some aspects, the device 10 or system 30 may capture a 3D image of the individual while the individual is stationary. Either the 3D image captured while the individual is stationary (static image) or one of the 3D images captured while the individual is moving (dynamic image) is used as a base image for comparison.
During physical activities, the breasts usually have a time delay in displacement relative to the chest wall, and this relative displacement in the vertical direction causes the bouncing of the breasts. Understanding the vertical displacement of the breasts may be critical, as many studies have shown that the vertical displacement during physical activities is closely related to breast discomfort.
The processor 14 may be, for example, a central processing unit (CPU) or graphic processing unit (GPU) of the device 10, a microprocessor, a system on chip, and/or other types of hardware processing unit. The memory 16 may be configured to store a set of instructions, where the instructions may include code such as source code and/or executable code. The processor 14 may be configured to execute the set of instructions stored in the memory 16 to implement the methods and functions described herein. In some examples, the set of instructions may include code relating to various image processing techniques, encryption and decryption algorithms, slicing (vertical and/or horizontal), displacement calculation, heat map determination and/or other types of techniques and algorithms that can be applied to implement the methods and functions described herein.
The display 18 may be a touchscreen such as on a mobile phone or tablet. The display 18 may also be a screen of a point of sales terminal at a store. In other aspects, the display 18 may be a computer screen or television screen. In accordance with aspects of the disclosure, the display 18 may be configured to display the heat map(s) determined from a displacement parameter, the threshold(s) for the boundary, the separated breasts defined by the boundaries, the predicted cup size and performance evaluation reports as described herein.
In some aspects of the disclosure, the device 10 may perform all the functionality described herein. However, in other aspects, a system 30 comprising multiple devices may collectively perform the functionality. As shown in
In some aspects, the server 40 may also have a display. The display may display the heat map(s) determined from a displacement parameter, the threshold(s) for the boundary, the separated breasts defined by the boundaries, the predicted cup size and performance evaluation reports as described herein.
In other aspects, instead of the client 50 transmitting the image data to the server 40, the client 50 may have a memory card; the images may be stored on the memory card and transferred to the server 40 by removing the memory card from the client 50 and inserting it into the server 40. In some aspects, the client 50 may include multiple DSLRs.
In some aspects, the client 50 may be installed in a fitting room of a store. For example, the client 50 may comprise the multiple cameras which may be installed on a railing system on a wall or door of an individual fitting room. The cameras may be mounted to different portions of the fitting room to have complete coverage, e.g., enable 360 degree acquisition. The person may be able to raise or lower the client 50 (via the railing system) such that the imaging device 12 is aligned with the height of the person (breast height). The server 40 may also be located within the same store, such as a point of sales terminal. In this manner, the individual may be imaged (3D image constructed) in privacy while, at the same time, the cup size prediction may be shown to an employee of the store.
In other aspects, the client 50 may be used at home such that the individual may be imaged in the privacy of the home and the server 40 may be located at a manufacturer of the garment. The individual may position the multiple cameras around a room such that the 3D image may be constructed from the images from each camera.
In other aspects, the device 10/system 30 may be located at the manufacturer and be used to design and test a garment.
In accordance with aspects of the disclosure, the imaging device 12 acquires a plurality of 3D images while the individual is running. The 3D images include the chest/breast region. The individual may be running in place in front of the imaging device 12. Alternatively, the individual may be running on a treadmill (or another piece of exercise equipment). In some aspects, the imaging device 12 may also acquire a 3D image of the individual when the individual is stationary. For the stationary 3D image, it is preferred that the arms do not cover the breasts. The image data is transmitted (transferred) to the processor 14 (and memory 16). The image data for the 3D image(s) is with respect to a coordinate system of the imaging device 12 (e.g., its origin). In some aspects, the individual may be wearing a bra or another garment while the 3D images are captured. In other aspects, the individual may be nude.
At S5, a subset of the 3D images of the individual acquired while the individual was moving may be selected for the displacement determination. In an aspect of the disclosure, the processor 14 may automatically select the subset. In some aspects, the number of 3D images selected may be greater than a preset number of 3D images. In some aspects, the preset number is 9. Additionally, since movement is not consistent when a person starts to run, in an aspect of the disclosure, the selected 3D images are taken after a second preset number of 3D images, such that the pattern of displacement may be uniform through the gait cycle. In other aspects of the disclosure, the selected 3D images should include representative images from at least one complete gait cycle. In some aspects, the selected 3D images may have uniform timing between them.
In other aspects of the disclosure, the processor 14, using machine learning, may identify key image positions in the gait cycle and select 3D images based on the identification. In other aspects, a person may manually select the subset of 3D images. In some aspects, one of the selected 3D images may be used as the base image.
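As an illustration only, the automatic subset selection at S5 (skip an initial warm-up period, then sample frames with uniform timing) could be sketched as below. The function name, the skip count, and the subset size are assumptions for this sketch, not values fixed by the disclosure.

```python
import numpy as np

def select_subset(num_frames, skip=20, count=10):
    """Select `count` evenly spaced frame indices after skipping the
    first `skip` frames, where the runner's movement is not yet
    consistent.  `skip` and `count` are illustrative values."""
    if num_frames - skip < count:
        raise ValueError("not enough frames after the warm-up period")
    # Evenly spaced indices over the remaining frames (uniform timing).
    return np.linspace(skip, num_frames - 1, count).round().astype(int)
```

A run of 100 frames with a 20-frame warm-up would, for example, yield indices from 20 through 99 at roughly equal spacing.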
Each of the selected 3D images (subset) may be subsequently processed to determine displacement (other than the one 3D image in the subset selected as the base image, if any). S9-S19 is repeated for each of the selected 3D images in the subset, where the iterations may be determined using a counter. For example, at S7 the processor 14 may set a counter to 1 (I=1) and obtain the selected 3D image in the subset.
At S9, the processor 14 performs pre-processing of the 3D image, e.g., a series of image pre-processing steps to identify a predetermined section (remove data outside the region), fill any holes in the predetermined section caused by the removal, match orientation, identify an underbust level (and bust), define a central axis and shift the image data as needed.
At S54, the processor 14 may rotate the 3D image to match an orientation of the base image. Where the base image is the stationary image, the rotation is to match the orientation with the stationary image. When one of the 3D images acquired while the person is moving is used as the base image, the rotation would be to match the orientation for the selected 3D image (base image). The rotation may be to match in different views. For example, the 3D images may be rotated such that the images are upright and face frontward.
The curved arrow in
For the base image, the 3D image may be rotated to a preset orientation. In some aspects, the orientation is facing front or back such that the coronal or frontal plane is orthogonal to the y-axis.
At S56, the processor 14 may define the central axis in the selected 3D image and shift the image if needed.
At S74, the processor 14 may shift the predetermined section (as processed above) such that the central axis 900 can intersect a reference point, such as the origin. This effectively causes the average values (in the first direction and the second direction) to move to the reference point. For example, the processor 14 may shift the predetermined section (as processed above) horizontally such as shown in
In some aspects, the processor 14 may shift the predetermined section (as processed above) such that the underbust line 700 aligns with the z-component of the reference point (e.g., Z=0). However, since the 3D images are going to be vertically shifted to align with other 3D images, this vertical shifting may be omitted.
After the central axis is defined, the bust points may be determined as a local apex in the 3D image.
At S60, the processor 14 may identify the underbust level 700. To determine the underbust level 700, in some aspects, the processor 14 may use an immediate crease of a protruded region (defined with respect to the bust point 760/bust slice 705) to identify the underbust level 700 such as shown in
Once selected 3D image is preprocessed, the processor 14 may define datapoints in an alignment region. In an aspect of the disclosure, the alignment region may be the upper back area. The anterior body and the lower posterior body (including the hip) have more fat tissue which may undergo more displacement and shape change during movement, and thus may not be suitable to serve as the reference portion for the alignment. In addition, shoulders may have relative movement to the ribcage due to arm swing, therefore, the shoulder area may not be ideal for alignment either.
The datapoints are points on the surface of the alignment region of the 3D image. The datapoints may be a fixed number of datapoints P. In some aspects of the disclosure, P=1480, which corresponds to 37 datapoints per horizontal slice 1100 in the alignment portion, with 40 slices 1100.
For example, to identify 37 datapoints, the processor 14 may identify the points on a horizontal slice from 0° to 180°, at angle increments of 5°, as shown in a
If a certain image point is missing in the alignment region, its coordinates can be defined or replaced by undefined values, such as not-a-number (NaN) values, to hold the space for the datapoint, and to maintain the sequence and indexing of other datapoints among i=1 to i=P. The missing surface points can be a result of the removal of limbs (e.g., arms) during pre-processing (S59). At S102, the processor 14 may initialize a value of s to 1 to begin a sequence to identify the datapoints from the bottommost horizontal slice (s=1). The processor 14 may include a counter to count the processed slices. In other aspects, the processor may use a pointer or flag to identify the slice.
At S104, the processor 14 may partition or divide the horizontal slice s into a plurality of portions represented by an angle value a (the angle is described above). To improve an accuracy of the x-, y-, z-coordinates of the location of the datapoints, the instructions may define a threshold t corresponding to an angle tolerance of each datapoint within the horizontal slice 1100. In some aspects, the tolerance may be based on the number of datapoints. For example, for an angle value a=40° (and datapoints every 5°), the threshold may be t=2.5.
The processor 14 may partition each horizontal slice s into a plurality of portions based on a fixed angular interval defined by the angle value a and the threshold t. For example, each portion may range from an angle a−t to a+t. For example, the portion represented by the angle a=40° may range from 37.5° to 42.5°, and the portion may include multiple image points. In other aspects, there may be a threshold to average points between horizontal slices 1100 to include in the value for the datapoints. For example, if there is a horizontal slice 1100 every z=5, then the z-tolerance may be ±2.5.
At S106, the processor 14 may initialize the angle value a to a=0°. At S108, the processor 14 may determine distances between image points along the horizontal slice s (at the surface) and a reference point 1110 for the horizontal slice (all image points at the angle and within the tolerance(s)). In an aspect of the disclosure, the reference point 1110 may be (x=0 and y=0, and the z value may change based on the slice). The reference point 1110, per slice, may be the projection of the central axis 900 on the slice. However, since the horizontal slice is two dimensional, the z value does not matter. At S110, the processor 14 may determine an average of all the distances determined at S108. For each portion, the processor 14 may determine the distances of the multiple image points from the reference point 1110 for the slice and determine an average among these determined distances. At S112, the processor 14 may associate the datapoint to have the average distance determined at S110 for the angle value a. The value and associated angle (and slice) may be stored in the memory 16 as the datapoint. For example, for the datapoint associated with a=40°, the processor 14 obtains the distances from image points between 37.5° and 42.5° (and values off-slice within the z-tolerance) and averages the same.
At S114, the processor 14 may determine whether the angle value a is 180° or not. In other aspects, instead of starting at zero and moving up to 180° (counterclockwise), the process may start at 180° and decrement to zero. If the angle value a is not 180° (NO at S114), the processor 14 may increment the value of a by M at S113, where M is the angular difference between the datapoints (e.g., 0°+5°=5°), and the processor 14 may subsequently perform S108, S110, S112 and S114 for a next portion in the same horizontal slice. If the angle value a is 180° (YES at S114), the processor 14 may determine whether the horizontal slice s is the fixed number S (e.g., 40) at S116. If the horizontal slice s is not equal to S (NO at S116), then the processor 14 may increment s by one at S118 and return to S104, where the processor 14 may subsequently perform S106, S108, S110, S112, and S114 for a next horizontal slice (after S104). If the value of s is S (e.g., 40) at S116, that means all horizontal slices are processed and the process may end at S120. In other aspects of the disclosure, the processing may begin with the highest numbered horizontal slice and work downward instead of beginning with horizontal slice s=1 and working upward.
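The angular binning of S104–S114 can be condensed, for a single horizontal slice, into the sketch below. The function name and the use of NumPy are assumptions; missing bins are held with NaN placeholders as described above, and the loop over all 40 slices is omitted.

```python
import numpy as np

def slice_datapoints(points, step=5.0, tol=2.5):
    """Reduce one horizontal slice to datapoints at 0 deg..180 deg.
    `points` is an (N, 2) array of (x, y) surface coordinates relative
    to the slice's reference point 1110.  Returns the average distance
    from the reference point per angular bin; empty bins become NaN."""
    angles = np.degrees(np.arctan2(points[:, 1], points[:, 0])) % 360
    dists = np.hypot(points[:, 0], points[:, 1])
    out = []
    for a in np.arange(0.0, 180.0 + step, step):
        in_bin = np.abs(angles - a) <= tol        # portion a-t .. a+t
        # NaN holds the place of a missing datapoint (e.g., removed arm).
        out.append(dists[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(out)  # 37 datapoints for step=5
```

For points lying on a circle of radius 10, every bin averages to 10, which matches the intent of averaging distances from the central axis.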
Once the datapoints are defined for the selected 3D image, the processor 14 may shift the selected 3D image to align the selected 3D image with the base image. For example, when the base image is the stationary image, the processor 14 may shift the selected 3D image with respect to the stationary image. For the stationary image, the processor 14 may execute S9 and S11 prior to getting to S13. When one of the selected 3D images in the subset is the base image and the processing is for the first 3D image (e.g., the base image), S13 may be omitted in the first iteration as it is the base image. To determine the shift, the processor 14 first determines a shape discrepancy between the alignment region in the selected 3D image and the same portion in the base image using the defined datapoints. The shape discrepancy (also referred to herein as fit-loss) may be determined by a point-wise comparison (same datapoint) in the respective 3D images.
Since the two 3D images for comparison have been pre-processed as described above (base image and the current selected 3D image), each point's distance from the origin (0, 0, 0) can be calculated by the following equation (Eq. 1):
di = √(xi² + yi² + zi²)  (1)
where (xi, yi, zi) is the coordinates of the i-th datapoint among the scan-surface points (i ranges from 1 to P), and di is the distance of the i-th point from the origin (0, 0, 0) or the reference point. If the coordinates of a point include undefined (e.g., NaN) values, the distance of that point from the reference point will be recorded as NaN.
Based on the calculated distances from Eq. 1, a shape discrepancy between the pair of 3D images is given by the following equation (Eq. 2):
fit-loss = √((Σi=1 to n (d1i − d2i)²)/m)  (2)
where d1, d2 represent two different 3D images, d1i refers to the i-th point on the first image (e.g., base image), while d2i refers to the same i-th point on the second image, e.g., the current selected 3D image. The variable n is the total number of points, which in the examples described here, is P. Any value subtracting or being subtracted by a NaN value will result in a NaN value, but all the NaN values can be removed by the processor 14 before the addition. The variable m is the total number of pairs of points where both points do not include undefined values.
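The NaN-aware comparison can be sketched as follows. The RMS aggregation here is an illustrative assumption; what the description above fixes is only the pointwise subtraction, the propagation of NaN through a subtraction, and the removal of NaN pairs before the addition, with m counting the remaining defined pairs.

```python
import numpy as np

def fit_loss(d1, d2):
    """Shape discrepancy (fit-loss) between two distance arrays from
    Eq. 1.  NaN placeholders in either image drop the pair from the
    sum; m counts the remaining defined pairs.  The RMS form is an
    illustrative choice, not mandated by the text."""
    diff = d1 - d2                 # NaN in either operand yields NaN
    valid = ~np.isnan(diff)        # m = number of defined pairs
    return np.sqrt(np.sum(diff[valid] ** 2) / valid.sum())
```

Because the NaN pairs are excluded before averaging, a datapoint hidden in one scan (e.g., by arm removal) does not distort the discrepancy.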
After the shape discrepancy is calculated, the current selected 3D image is shifted. For example, the current selected 3D image may be shifted vertically. The shifting changes the distance from the surface datapoints and the reference point in the current selected 3D image. The shape discrepancy is then calculated again as described above. The two shape discrepancies are compared. If the latter determined shape discrepancy is smaller than the former, the current selected 3D image is further shifted (as the 3D images are becoming more aligned) and a shape discrepancy is calculated. The process is repeated to minimize the calculated shape discrepancy.
However, if the latter determined shape discrepancy is larger than the former, the shifting stops; the image may be returned to an earlier position or shifted in the opposite direction, and the shape discrepancy is calculated again.
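The shift-and-recompute loop of S13 could look like the sketch below. The fixed step size, the internal RMS fit-loss over defined pairs, and the try-up-then-down search order are illustrative assumptions; the disclosure only requires shifting while the discrepancy shrinks and stopping once it grows.

```python
import numpy as np

def align_vertical(base_d, moving_coords, step=1.0):
    """Shift the selected 3D image vertically while the shape
    discrepancy against the base image keeps shrinking; stop once a
    further shift makes it grow, per the description above.
    `base_d` holds the Eq. 1 distances of the base image's datapoints;
    `moving_coords` is the (P, 3) datapoint array of the moving image."""
    def loss(shift):
        c = moving_coords.copy()
        c[:, 2] += shift                      # vertical (z) shift only
        d = np.sqrt((c ** 2).sum(axis=1))     # Eq. 1 distances
        diff = base_d - d
        valid = ~np.isnan(diff)
        return np.sqrt(np.sum(diff[valid] ** 2) / valid.sum())

    shift, best = 0.0, loss(0.0)
    for direction in (1.0, -1.0):             # shift up, then down
        while True:
            trial = loss(shift + direction * step)
            if trial >= best:                 # discrepancy grew: stop
                break
            shift += direction * step
            best = trial
    return shift, best
```

For an image that is an exact vertical translate of the base, the loop recovers the translation and drives the discrepancy to zero.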
Once the current selected 3D image is aligned with the base image, the processor 14 may define the datapoints on the breasts at S15. In an aspect of the disclosure, the datapoints on the breasts (surface) may be defined using vertical slices. In some aspects of the disclosure, each breast may be separately processed to determine the datapoints. In some aspects of the disclosure, the vertical slices may be parallel to the coronal plane (frontal plane), e.g., have the same y value in its coordinate as shown in
At S152, the processor 14 may initialize a value of t to 1 to begin a sequence to identify the datapoints 1305A from the innermost vertical slice (t=1) (vertical slice closest to y=0). The processor 14 may include a counter to count the processed vertical slices. In other aspects, the processor may use a pointer or flag to identify the vertical slice.
At S154, the processor 14 may partition or divide the vertical slice t 1300 into a plurality of portions represented by an angle value a. To improve an accuracy of the x-, y-, z-coordinates of the location of the datapoints 1305A, the instructions may define a threshold tt corresponding to an angle tolerance of each datapoint within the vertical slice. In some aspects, the tolerance may be based on the number of datapoints. For an angle value a=40° and threshold tt=1°, the processor 14 may partition each vertical slice t into a plurality of portions based on a fixed angular interval defined by the angle value a and the threshold tt, e.g., each portion can range from an angle a−tt to a+tt. For example, the portion represented by the angle a=40° can range from 39° to 41°, and the portion can include multiple image points. In other aspects, there may be a threshold to average points between vertical slices 1300 to include in the value for the datapoints 1305A. For example, if there is a vertical slice 1300 every y=5, then the y-tolerance may be ±2.5.
At S156, the processor 14 may initialize the angle value a to a=0°. At S158, the processor 14 may determine distances between all image points along the vertical slice t (at the surface) and a reference point 1310A for the vertical slice (all image points may include image points at the angle and within the angle tolerance, as well as image points off-slice within the y-tolerance).
At S160, the processor 14 may determine an average of all the distances determined at S158. For each portion, the processor 14 may determine the distances of the multiple image points from the reference point 1310A for the vertical slice 1300 and determine an average among these determined distances. At S162, the processor 14 may associate the datapoint 1305A to have the average distance determined at S160 for the angle value a. The value and associated angle (and vertical slice) may be stored in the memory 16 as the datapoint 1305A. For example, the 1st datapoint can be a point i=1 located at the innermost slice t=1, at the angle of 0°. The 90th datapoint is the point i=90 located on the innermost slice t=1, at the angle of 178°. The x-, y-, z-coordinates of the datapoints i can be determined by the processor 14 and recorded in sequence ranging from i=1 to i=Q (Q being the total number of datapoints), and the recorded locations or coordinates can be stored in the memory 16.
If a certain image point is missing on the slice, its coordinates can be defined or replaced by undefined values, such as not-a-number (NaN) values, to hold the space for the datapoint 1305A, and to maintain the sequence and indexing of other datapoints among i=1 to i=Q. The missing points can be a result of the removal of limbs (e.g., arms) during pre-processing (S59) or the optional removal of the protrusion of the upper abdomen between the breasts (e.g., the lower and central area between the two breasts). For example,
At S164, the processor 14 may determine whether the angle value a is 0° again. In other aspects, instead of starting at zero and moving around the quasi-circle back to zero (counterclockwise), the process may start at 360 degrees and move clockwise. If the angle value a is not 0° again (NO at S164), the processor 14 may increment the value of a by N at S165, where N is the angular difference between the datapoints (e.g., 0°+2°=2°), and the processor 14 can perform S158, S160, S162 and S164 for a next portion in the same vertical slice 1300. If the angle value a is 0° again (YES at S164), the processor 14 may determine whether the vertical slice t is the fixed number T (e.g., 40) at S166. If the vertical slice t is not equal to T (NO at S166), then the processor 14 may increment t by one at S168 and return to S154, where the processor 14 may subsequently perform S156, S158, S160, S162, and S164 for a next vertical slice (after S154). If the value of t is T (e.g., 40) at S166, that means all vertical slices 1300 are processed and the process may end at S170. In other aspects of the disclosure, the processing may begin with the outermost vertical slice and work inward instead of beginning with the innermost and working outward.
In
This process may be repeated for the other breast region.
In some aspects of the disclosure, the vertical slices 1300A may be parallel to the sagittal plane, e.g., have the same x value in its coordinate as shown in
At S152, the processor 14 may initialize a value of t to 1 to begin a sequence to identify the datapoints 1305B from the vertical slice (t=1) (vertical slice closest to the left or right). The processor 14 may include a counter to count the processed vertical slices. In other aspects, the processor 14 may use a pointer or flag to identify the vertical slice.
At S154A, the processor 14 may partition or divide the vertical slice t 1300A into a plurality of portions represented by y values. Each portion may be associated with a range of y-values. To improve an accuracy of the x-, y-, z-coordinates of the location of the datapoints (e.g., 1305C for vertical slice A), the instructions may define a threshold t′″ corresponding to a y-value tolerance of each datapoint within the vertical slice 1300A. In some aspects, the tolerance may be based on the number of datapoints for the vertical slice 1300A. The more datapoints 1305B per vertical slice 1300A, the smaller the threshold t′″ may be. For example, if there are datapoints every y=2, the y-value threshold t′″ may be ±1 (or ±0.5). Additionally, the portion may include image data off-slice (image data between the vertical slices 1300A). For example, if there is a vertical slice 1300A every x=5, then the x-tolerance may be ±2.5.
At S156A, the processor 14 may initialize the y value to a minimum value (absolute value). For example, the minimum value may be y=0. At S158, the processor 14 may determine the z-coordinate for all image points associated with the portion identified in S154A (at the surface) and for the y value (datapoint that is currently being processed) (all image points may include image points at the specific y value and within the y tolerance, as well as image points off-slice within the x-tolerance) for the specific vertical slice.
At S160, the processor 14 may determine an average of all the z-coordinates determined at S158. For each portion, the processor 14 may determine the z-coordinates of the multiple image points for the vertical slice 1300A and determine an average among these determined z-coordinates. At S162A, the processor 14 may associate the datapoint 1305B to have the average z-coordinate determined at S160 for the y value for the datapoint (and the x value, which is known based on the vertical slice). The average z-coordinate and associated y-value (and vertical slice, e.g., x-value) may be stored in the memory 16 as the datapoint 1305B (e.g., datapoint 1305C for Vertical Slice A). The x-, y-, z-coordinates of the datapoints may be determined by the processor 14 and recorded in sequence ranging from i=1 to i=R, and the recorded locations or coordinates may be stored in the memory 16 (where R is the maximum number of datapoints).
If a certain image point is missing on the vertical slice 1300A, its coordinates can be defined or replaced by undefined values, such as not-a-number (NaN) values, to hold the space for the datapoint 1305B, and to maintain the sequence and indexing of other points among i=1 to i=R. The missing points can be a result of the removal of limbs (e.g., arms) during pre-processing (S59) or the optional removal of the protrusion of the upper abdomen between the breasts (e.g., the lower and central area between the two breasts).
At S164A, the processor 14 may determine whether the y value being processed is the maximum y value (absolute value). This indicates that all portions of the divided vertical slice have been processed. In other aspects, instead of starting at y=minimum and moving counterclockwise, the process may start at y=maximum and move clockwise (minimum and maximum being absolute values). If the y value is not the maximum absolute y value (NO at S164A), the processor 14 may increment the value of y by O at S165A, where O is the y increment between the datapoints, and the processor 14 can perform S158, S160, S162A and S164A for a next portion in the same vertical slice 1300A. If the y value is the maximum absolute y value (YES at S164A), the processor 14 may determine whether the vertical slice t is the fixed number T′ (e.g., 60) at S166A. If the vertical slice t is not equal to T′ (NO at S166A), then the processor 14 may increment t by one at S168 and return to S154A, where the processor 14 may subsequently perform S156A, S158, S160, S162A, and S164A for a next vertical slice (e.g., vertical slice B). If the value of t is T′ (e.g., 60) at S166A, that means all vertical slices 1300A are processed and the process may end at S170.
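For the sagittal-slice variant just described, the per-bin z-averaging for a single slice could be sketched as follows. The (y, z) input layout, the bin spacing, and the tolerance are assumptions chosen to mirror the y=2 / ±1 example above; NaN placeholders again preserve the datapoint indexing.

```python
import numpy as np

def sagittal_datapoints(slice_pts, y_step=2.0, y_tol=1.0):
    """Reduce one sagittal vertical slice to datapoints: for each y bin,
    average the z-coordinates of the surface points in the bin.
    `slice_pts` is an (N, 2) array of (y, z) pairs; empty bins get NaN
    placeholders to preserve the datapoint sequence i=1..R."""
    ys = np.arange(0.0, slice_pts[:, 0].max() + y_step, y_step)
    zs = []
    for y in ys:
        in_bin = np.abs(slice_pts[:, 0] - y) <= y_tol
        zs.append(slice_pts[in_bin, 1].mean() if in_bin.any() else np.nan)
    return ys, np.array(zs)
```

Since the x value is fixed by the slice, only the averaged z per y bin needs to be stored alongside the bin's y value.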
In other aspects, the vertical slices may be angled with respect to both the coronal plane and the sagittal plane.
S15 is repeated for the base image (whether the base image is the 3D image acquired while the individual is stationary or one of the 3D images acquired while the individual is moving). In an aspect of the disclosure, the same vertical slicing technique (whether parallel to the coronal plane or sagittal plane or angled) may be used for the base image. This will allow for consistency in the datapoints. This way the base image and the current processed 3D image have the same number of defined datapoints.
At S17, the processor 14 may determine the vertical displacement for the 3D image being processed (3D image acquired while the individual is moving) with respect to the base image. The vertical displacement may be determined for each defined datapoint in S15 (point-wise). The vertical displacement compares the z-coordinates in the defined datapoints (same points in the base image and the current processed 3D image), e.g., relative vertical displacement. The displacement may be determined using the following equation.
dj = zij − zi0  (3)
where dj is an array containing the vertical displacements of all the defined datapoints for the jth 3D image (the current processed 3D image), where j is 1≤j≤N, where N is the number of three-dimensional images in the subset, zij is the z-coordinate of the i-th defined datapoint of that jth 3D image (1≤i≤M), where M is the number of defined datapoints, while zi0 is the z-coordinate of the i-th point of the three-dimensional image for the base image.
The displacement array may be stored in the memory 16 (associated with the current processed 3D image).
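Stored as matched arrays, equation (3) is an element-wise subtraction. A minimal NumPy sketch, assuming identical datapoint ordering in the moving-frame image and the base image (the function name and array layout are assumptions, not from the disclosure):

```python
import numpy as np

def vertical_displacement(z_moving, z_base):
    """Per-datapoint vertical displacement (equation 3): d_j = z_ij - z_i0.

    z_moving: (M,) z-coordinates of the M defined datapoints in the j-th
    moving-frame image; z_base: (M,) z-coordinates of the same datapoints
    in the base image."""
    return np.asarray(z_moving, dtype=float) - np.asarray(z_base, dtype=float)

# Three datapoints in one moving-frame image compared against the base.
d_j = vertical_displacement([10.2, 11.0, 9.5], [10.0, 10.5, 9.5])
```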
At S19, the processor 14 may determine if all the 3D images in the subset have been processed. For example, since a counter may be used to track the processed 3D images, the processor may determine whether the counter value equals the number of 3D images in the subset. When there are unprocessed 3D images in the subset (NO at S19), the processor 14 may increment the counter at S20 and the processor 14 executes S9-S19 for the next 3D image acquired when the individual is moving.
When all the 3D images in the subset are processed (YES at S19), the processor 14 may calculate a displacement parameter at S21. In some aspects of the disclosure, the displacement parameter may be a standard deviation.
Once again, the displacement parameter may be calculated for each defined datapoint in S15.
The displacement parameter may be calculated using the following equation:
SD = √( Σ_{j=1}^{n} (d_j − d_avg)² / (n − 1) )   (4)
where SD represents the standard deviation of d_j, d_avg is the mean value of d_j, and n is the number of 3D images in the subset (or, if one of the 3D images acquired while the individual is moving is selected as the base image, n is the remaining number of 3D images in the subset).
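Computed per datapoint over the stack of displacement arrays, the parameter can be sketched as follows; this assumes the sample standard deviation (n−1 denominator), and the function name and array layout are illustrative:

```python
import numpy as np

def displacement_parameter(d):
    """Per-datapoint displacement parameter (equation 4): the standard
    deviation of the vertical displacements across the n images, assumed
    here to be the sample standard deviation (n - 1 denominator).

    d: (n, M) array; row j holds d_j for the j-th moving-frame image.
    Returns an (M,) array, one SD per defined datapoint."""
    return np.std(np.asarray(d, dtype=float), axis=0, ddof=1)

# Datapoint 0 moves between images (SD 0.2); datapoint 1 is stable (SD 0).
sd = displacement_parameter([[0.2, 0.5], [0.4, 0.5], [0.6, 0.5]])
```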
In an aspect of the disclosure, the processor 14 may generate a mapping, such as a heat map at S23 using the displacement parameter calculated at S21. For example, the heat map may be superposed over the base image (such as the 3D image acquired while the individual was stationary or the one of the 3D images acquired while the individual was moving). Since the x, y, and z coordinates for each of the datapoints in the breast region were defined in S15 (for the base image as well), the processor 14 knows the position of the corresponding datapoints associated with the determined displacement parameter. In some aspects of the disclosure, the heat map may be displayed on the display 18. In some aspects, the server 40 may transmit the determined heat map to the client 50 to display the heat map on the display 18 (of the client 50).
In some aspects, the heat map may be presented by gradient colors. The dark blue color may correspond to minimal variability in vertical displacement, whereas the dark red color may correspond to maximal variability in vertical displacement (or vice versa). In other aspects, other colors may be used to differentiate the variability.
In some aspects, the heat map may use grey scaling to represent the variability in the vertical displacement. For example, a dark grey shade may represent maximal variability whereas a light grey shade may represent minimal variability.
In some aspects of the disclosure, the heat map may be displayed separately for each breast (right and left separately). This may be particularly useful where the breasts are separately processed via the vertical slicing (where vertical slices 1300 parallel to the coronal plane are used).
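The gradient coloring described above can be sketched as a simple linear interpolation between the two extreme colors. This is an illustrative rendering choice only; the disclosure permits any color scheme or grey scale, and the function name is hypothetical:

```python
import numpy as np

def heat_map_colors(sd):
    """Map per-datapoint displacement SDs to RGB colors for a heat map
    superposed on the base image: minimal variability -> blue (0, 0, 1),
    maximal -> red (1, 0, 0), linearly interpolated in between."""
    sd = np.asarray(sd, dtype=float)
    lo, hi = sd.min(), sd.max()
    t = (sd - lo) / (hi - lo) if hi > lo else np.zeros_like(sd)
    # Red channel grows with variability while the blue channel shrinks.
    return np.stack([t, np.zeros_like(t), 1.0 - t], axis=1)

colors = heat_map_colors([0.0, 0.1, 0.2])  # blue, purple, red
```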
At S25, the processor 14 may determine a threshold for separation of the breasts from the chest (chest wall). In an aspect of the disclosure, the memory 16 may store a preset percentage. The preset percentage may be a fixed percentage of variability. The preset percentage may be multiplied by the difference between the average standard deviation of the displacement in two different areas to determine the threshold. The processor 14 may define the first area 1800 and the second area 1805 at S25. The first area 1800 may be the bust area. This first area 1800 may be associated with the highest variability in the displacement. The bust slice 705 has already been determined. In an aspect of the disclosure, the height of the first area 1800 may be preset, e.g., ±Z about the bust slice 705. In other aspects, the height of the first area 1800 may be adjusted based on the heat map to keep the highest x percentage of displacement within the first area. The width of the first area (e.g., in the x direction) may be determined by using a set percentage of the points. For example, that percentage may be the 10% to 90% of the x points; if there are 100 points, the width may include the middle 80 points (excluding the first and last 10 points). The second area 1805 may be a low displacement area. The second area 1805 may exclude the armhole area (which may have high variability of displacement). The width of the second area may be defined by the x-coordinates of the left bust point and right bust point as determined above. Thus, the second area 1805 may be narrower than the first area 1800, an example of which is shown in the drawings.
Once the areas are defined, the processor 14 may calculate the average of the standard deviation for all datapoints in the first area to obtain “A” and calculate the average of the standard deviation for all datapoints in the second area to obtain “B”. The processor 14 may calculate the difference D (D=A−B). The processor 14 may then determine the threshold by multiplying the predetermined percentage by the difference D. The threshold may identify the upper boundary of the breasts.
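The threshold computation of S25 can be sketched as follows; the preset percentage of 0.5 is an illustrative value only, not one specified by the disclosure:

```python
import numpy as np

def separation_threshold(sd_first_area, sd_second_area, preset_pct=0.5):
    """Threshold per S25: the preset percentage times the difference D
    between the average SD in the bust area (A) and the average SD in the
    low-displacement area (B)."""
    A = float(np.mean(sd_first_area))
    B = float(np.mean(sd_second_area))
    return preset_pct * (A - B)

# A = 1.0, B = 0.2, so D = 0.8 and the threshold is 0.5 * 0.8 = 0.4.
thr = separation_threshold([0.8, 1.0, 1.2], [0.1, 0.3])
```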
The connecting area between the bra cup and the shoulder strap (as circled in the drawings) may also exhibit high variability in displacement.
At S27, the processor 14 may separate the breasts from the chest using the base image (e.g., the 3D image acquired while the individual was stationary or one of the 3D images acquired while the individual was moving).
In the base image, the image data below the underbust line 700 is already removed, image data in the posterior region (e.g., y>0) (or y<0 depending on the definition or direction of the y-axis) may already be removed. Thus, at S29, the processor may separate the remaining image data using the determined threshold and the standard deviation values for the datapoints (and heat map). For example, in some aspects of the disclosure, datapoints having the standard deviation below the threshold (determined at S25) may be removed and the remaining datapoints identified as breast datapoints. The remaining datapoints may be displayed on the display 18. In some aspects, the server 40 may transmit the 3D image of the breasts to the client 50 to display on the display 18 (of the client 50). In some aspects, the heat map only containing values associated with the breast datapoints may be superposed on the 3D image of the breasts.
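The thresholding of S29 amounts to masking out low-variability datapoints. A minimal sketch, assuming the datapoints and their SDs are stored as matched NumPy arrays (function name illustrative):

```python
import numpy as np

def separate_breasts(points, sd, threshold):
    """S29 separation: datapoints whose displacement SD falls below the
    threshold are removed (chest wall); the remaining datapoints are
    identified as breast datapoints.

    points: (M, 3) xyz coordinates; sd: (M,) displacement SDs."""
    points = np.asarray(points, dtype=float)
    mask = np.asarray(sd, dtype=float) >= threshold
    return points[mask]

# The first point's SD (0.1) is below the 0.4 threshold and is removed.
breast_pts = separate_breasts([[0.0, 0.0, 1.0],
                               [1.0, 0.0, 1.0],
                               [2.0, 0.0, 1.0]], [0.1, 0.5, 0.6], 0.4)
```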
In other aspects of the disclosure, the breasts may be separated based on the vertical slices 1300 or y-value separation. The method of determining the separation vertical slice or y value may be different depending on the direction of the vertical slicing.
In accordance with aspects of the disclosure, for angles within a range, the slice (for the angle) having the standard deviation value closest to the threshold determined at S25 (separation) is identified. The identified slices for all the angles within the range are sorted and the median is determined, and the median slice is selected as the separation vertical slice. In some aspects of the disclosure, the angle may range from 0° to 90°. However, the maximum angle may change depending on whether the individual is wearing a bra with straps or not.
At S200, the processor 14 may set the angle for processing to the first angle. For example, the processor 14 may set the angle to zero.
At S202, the processor 14, for the processing angle (e.g., 0°), identifies all the SD values for each of the vertical slices (e.g., 40) associated with the datapoint. Each of the SD values is compared with the threshold determined in S25. The vertical slice having the SD value closest to the threshold is recorded (for the datapoint for the processing angle) and stored in the memory 16. At S204, the processor 14 may determine if the current processed angle equals the second angle, e.g., 90°. Since this is the first iteration, the determination is NO. The processor 14 may increment the processing angle to the angle associated with the next datapoint at S206. As described above, there may be a datapoint every 2° and thus the processing angle would now be 2°. S202 and S204 may be repeated for each datapoint between the first angle and the second angle. Using the above example, there may be 46 datapoints between the first angle and the second angle.
For example, at 0°, the processor 14 may determine that vertical slice 32 is the closest and at 2°, the processor 14 may determine that vertical slice 36 is the closest to the threshold . . . and at 88°, the processor 14 may determine the vertical slice 40 is the closest to the threshold.
Once all the angles between the first angle and the second angle are processed, the processor 14 may determine that the angle equals the second angle (YES at S204) and the processor 14 sorts all the determined vertical slices per angle in lowest to highest order or vice versa to determine the median vertical slice of all identified slices. The median vertical slice is then identified as the separation slice 2100 at S208. An example of the separation slice 2100 is shown in the drawings.
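The per-angle search and median selection (S202–S208) can be sketched as follows, assuming the SD values are gathered into a per-angle, per-slice array (function name illustrative):

```python
import numpy as np

def separation_slice(sd_by_angle_slice, threshold):
    """For each processed angle, identify the vertical slice whose SD value
    is closest to the threshold (S202); the separation slice is the median
    of the per-angle picks after sorting (S208).

    sd_by_angle_slice: (num_angles, num_slices) array of SD values."""
    sd = np.asarray(sd_by_angle_slice, dtype=float)
    closest = np.argmin(np.abs(sd - threshold), axis=1)  # one slice per angle
    return int(np.median(np.sort(closest)))

# Three angles, four slices each, threshold 0.5 -> picks slices 1, 1, 0.
idx = separation_slice([[0.1, 0.4, 0.9, 1.2],
                        [0.2, 0.6, 0.8, 1.0],
                        [0.45, 0.7, 0.9, 1.3]], 0.5)
```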
At S27, the processor 14 may separate the breasts from the chest using the base image using the separation slice 2100.
In other aspects of the disclosure, the breasts may be separated from the chest using a determined y-value.
At S220, the processor 14 may set the vertical slice 1300A for processing to a first slice, e.g., t=1.
At S222, the processor 14, for the processing vertical slice (e.g., t=1), identifies all the SD values for datapoints, respectively associated with a y-value. Each of the SD values is compared with the threshold determined in S25. The y-value associated with the SD value closest to the threshold is recorded and stored in the memory 16 associated with the processing vertical slice. At S224, the processor 14 may determine if the current processed vertical slice equals the maximum number of slices T′, e.g., 60. Since this is the first iteration, the determination is NO. The processor 14 may increment the processing vertical slice to the next vertical slice at S226 (t=t+1). S222 and S224 may be repeated for each vertical slice 1300A. Using the above example, there may be 60 slices and thus, there may be 60 determined y values as being closest to the threshold (one per vertical slice).
For example, at vertical slice X=25, the processor 14 may determine that y value 54 is the closest to the threshold; at vertical slice X=20, that y value 59 is the closest to the threshold . . . and at vertical slice X=0, the processor 14 may determine that y value 39 is the closest to the threshold.
Once all the vertical slices 1300A are processed, the processor 14 may determine that t=T′ (YES at S224) and the processor 14 may sort all the determined y values in lowest to highest order or vice versa to determine the median y value of all identified y values at S228. The median y value is then identified as the separation y value at S230. In an aspect of the disclosure, when there are an even number of defined y-values, the separation y value may be defined by taking an average of the two y-values closest to a middle number of identified y-values. For example, if the middle two identified y-values are 11 and 12, a separation y-value may be a y-value of 11.5.
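The per-slice search and median selection (S222–S230) can be sketched as follows, with the SD values gathered into a per-slice, per-y array (function name illustrative):

```python
import numpy as np

def separation_y_value(sd_by_slice, y_values, threshold):
    """Per vertical slice, record the y value whose SD is closest to the
    threshold (S222); the separation y value is the median of the recorded
    values (S228-S230).  np.median averages the two middle values when the
    count is even, matching the 11/12 -> 11.5 example above.

    sd_by_slice: (T, K) SD values; y_values: (K,) y coordinate per datapoint."""
    sd = np.asarray(sd_by_slice, dtype=float)
    y = np.asarray(y_values, dtype=float)
    closest_y = y[np.argmin(np.abs(sd - threshold), axis=1)]
    return float(np.median(closest_y))

# Two slices pick y=11.0 and y=12.0; their median (average) is 11.5.
sep_y = separation_y_value([[0.1, 0.5, 0.9],
                            [0.2, 0.7, 0.56]], [10.0, 11.0, 12.0], 0.55)
```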
At S27, the processor 14 may separate the breasts from the chest using the base image using the separation y value.
The separated breasts may be subsequently displayed on the display 18 as described above.
In other aspects of the disclosure, the time series analysis of the vertical displacement and subsequent threshold determination may be used to generate a cup size recommendation for the individual.
At S254, the processor 14 may define the datapoints in the breasts (after separation). In an aspect of the disclosure, the datapoints may be defined using horizontal slicing in a similar manner as described above for defining the datapoints in the alignment region (see the drawings).
In an aspect of the disclosure, the y coordinate for the reference point for each slice may be the y coordinate of the separation slice or the y separation value. In some aspects, the value may be recentered such that it intersects the origin (y=0). In an aspect of the disclosure, the x coordinate for the reference may intersect the central axis 900. The z-coordinate is the z-value of the horizontal slice.
In other aspects of the disclosure, the datapoints defined in S15 (vertical slicing) may be used.
At S256, the processor 14 may determine a shape discrepancy with a model associated with each cup size. In an aspect of the disclosure, each cup size has its own model (prototype). The model is a representative size for the specified cup size. The 3D image of each model acquired while the model is stationary may be processed as described above to define the datapoints for the breasts. Additionally, the boundary of the breasts in each model may be defined as described above.
For each model (cup size), the processor 14 may calculate the shape discrepancy using equations 1 and 2. In equation 2, d1i refers to the i-th point on the 3D image for the model for a specific cup size (acquired while the model is stationary), while d2i refers to the same i-th point on the 3D image acquired while the individual is stationary. Therefore, at S256, the processor may determine CS different shape discrepancies, where CS is the number of different cup size models.
After all the shape discrepancies are calculated, the processor 14 may compare the values. Based on the comparison, the processor 14 may identify the lowest shape discrepancy. The lowest shape discrepancy may indicate that the individual breast size is most similar to the model 3D image that resulted in the lowest shape discrepancy.
At S258, the processor 14 may generate a recommendation for the cup size based on the lowest shape discrepancy. In some aspects, the processor 14 may identify the cup size associated with the lowest shape discrepancy and issue a recommendation. In some aspects, the recommendation may be displayed on display 18. In some aspects, where there is a server 40/client 50 relationship, the server 40 may transmit the recommendation to the client 50 via the communication interfaces 45/55, respectively, for display on the display 18 at the client 50. For example, when an individual would like to have a size recommendation, the individual may use their own personal devices to acquire the 3D images and transmit the same to the manufacturer for the above processing and recommendation. The manufacturer may perform the above-identified functions and transmit the recommendation back to one of the individual's personal devices. In some aspects, the recommendation may be emailed or texted to the individual. In some aspects, the recommendation may be in the form of a report containing one or more of the scanned 3D images.
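The select-the-lowest-discrepancy logic of S256–S258 can be sketched as follows. The disclosure's discrepancy is computed with its equations 1 and 2, which are not reproduced here; as a hypothetical stand-in, this sketch scores each model by the mean point-wise Euclidean distance between matched datapoints:

```python
import numpy as np

def recommend_cup_size(individual_pts, models):
    """Pick the cup size whose model breasts are most similar to the
    individual's (S256-S258).  The mean point-wise Euclidean distance used
    here is a stand-in for the disclosure's equations 1 and 2.

    individual_pts: (M, 3); models: dict mapping cup-size label -> (M, 3)."""
    ind = np.asarray(individual_pts, dtype=float)
    scores = {}
    for size, pts in models.items():
        diff = ind - np.asarray(pts, dtype=float)
        scores[size] = float(np.mean(np.linalg.norm(diff, axis=1)))
    return min(scores, key=scores.get)  # lowest discrepancy wins

# Model "B" sits 0.2 units away on average; model "C" sits 1.0 unit away.
rec = recommend_cup_size(
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
    {"B": [[0.0, 0.0, 0.2], [1.0, 0.0, 0.2]],
     "C": [[0.0, 0.0, 1.0], [1.0, 0.0, 1.0]]})
```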
In other aspects of the disclosure, displacement between acquired 3D images may be used to evaluate the performance of a garment.
At S5, a subset of the 3D images acquired while the individual is moving is selected for further processing. When one of the 3D images acquired while the individual is moving is the base image, the selection also includes setting the base image. A description of selecting the subset of 3D images is provided above. Features S9-S19 are performed for each of the selected 3D images. A counter may be used to track the current 3D image being processed. The counter may be initialized to 1 at S9. When a 3D image acquired while the individual is stationary is the base image, S9, S11 and S254A are performed for the 3D image. Since the base image is used for comparison, S13 may be omitted because the other 3D images are aligned with the base image at S13. When one of the 3D images acquired while the individual is moving is set as the base image, S13 and S17A may be omitted for the first iteration (e.g., for the selected 3D image that is the base image).
At S254A, the datapoints may be defined for the breasts (breast region). This is performed on each of the 3D images in the subset (and, if the 3D image acquired while the individual is stationary is used as the base image, datapoints may be defined on that image as well). Once the 3D images are aligned, the processor 14 may separate the anterior region from the posterior region (at y=0). The breasts are located in the anterior region. Also, in S9 the underbust is identified, and image points below the underbust have already been removed. Therefore, at S254A the processed 3D image is bounded by the underbust in the z direction and y=0 in the y direction. Since the body is divided into posterior and anterior regions, the anterior portion spans 180 degrees. The x-axis may be identified as 0° and 180°; however, other angles may be used.
The breasts may be divided using horizontal slicing in a similar manner as described above.
d′_j = √(x_ij² + y_ij²) − √(x_i0² + y_i0²)   (5)
where d′_j is an array containing the change in distance from the reference point for each of the datapoints for the j-th 3D image (1≤j≤N), where N is the number of 3D images in the subset (non-base images); x_ij and y_ij are the x-coordinate and y-coordinate, respectively, of the i-th point of that 3D image (1≤i≤P), where P is the total number of datapoints, while x_i0 and y_i0 are the x-coordinate and y-coordinate, respectively, of the i-th point of the base image (of the same individual) (1≤i≤P). The defined z-coordinate may be ignored when determining the displacement.
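Like equation (3), equation (5) reduces to vectorized arithmetic over matched datapoint arrays. A minimal NumPy sketch (function name illustrative):

```python
import numpy as np

def horizontal_displacement(xy_moving, xy_base):
    """Per-datapoint horizontal displacement (equation 5): the change in
    radial distance from the reference point, sqrt(x^2 + y^2), between the
    j-th moving-frame image and the base image; z is ignored.

    xy_moving, xy_base: (P, 2) arrays of matched x/y coordinates."""
    r_moving = np.linalg.norm(np.asarray(xy_moving, dtype=float), axis=1)
    r_base = np.linalg.norm(np.asarray(xy_base, dtype=float), axis=1)
    return r_moving - r_base

# (3,4) stays at radius 5 (no displacement); (0,2) moved out from radius 1.
d_prime = horizontal_displacement([[3.0, 4.0], [0.0, 2.0]],
                                  [[3.0, 4.0], [0.0, 1.0]])
```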
Once all the 3D images in the subset are processed (and the 3D image acquired while the individual is stationary, if used) (YES at S19), the processor 14 may calculate the displacement parameter for each datapoint at S21. Similar to above, the displacement parameter may be a standard deviation calculated using equation 4.
In an aspect of the disclosure, the processor 14 may generate a mapping, such as a heat map using the displacement parameter calculated at S21. For example, the heat map may be superposed over the base image (such as the 3D image acquired while the individual was stationary or the one of the 3D images acquired while the individual was moving).
In some aspects of the disclosure, the heat map may be displayed on the display 18. In some aspects, the server 40 may transmit the determined heat map to the client 50 to display the heat map on the display 18 (of the client 50).
In some aspects, the heat map may be presented by gradient colors. The dark blue color may correspond to minimal variability in displacement (low shape change), whereas the dark red color may correspond to maximal variability in displacement (high shape change) (or vice versa). In other aspects, other colors may be used to differentiate the variability.
In some aspects, the heat map may use grey scaling to represent the variability in the displacement (shape change). For example, a dark grey shade may represent maximal variability whereas a light grey shade may represent minimal variability.
In other aspects of the disclosure, the determination may be relative to the individual. For example, the processor 14 may calculate the average of the SD for all datapoints, determine the SD of that average, and identify datapoints based on that second SD. Two example heat maps are shown in the drawings.
At S302, the processor 14 may generate an evaluation report based on the analysis of the variability in the heat map. For example, the evaluation report may include recommendations for designing the garment.
In some aspects, the analysis may also indicate that the garment does not properly fit. For example, in a case where there is a high level of variability throughout the breasts, this may indicate that the bra does not properly fit.
The heat maps may also be used to observe patterns in the variability of the displacement among different sizes and body shapes. For example, it may be expected that upper breasts have the most variability in shape deformation. However, the pattern may be used to develop variability thresholds. The heat maps may be used to determine the effect of certain materials such as restraints and supports including wires and straps in different areas.
In some aspects, the heat maps may be used for recommendations of postures or changing running styles. For example, the heat map may show an asymmetry of the variability in the displacement in the left and right breasts, which may be caused by asymmetry of the breasts, the running posture, the rotation and sway of the torso during running, and/or the interaction between the bra and the breasts.
In some aspects of the disclosure, aspects of the disclosure may be used to track and evaluate the performance of breast implants or breast reconstruction for surgeons. For example, when one breast is reconstructed, a goal is for it to have displacement similar to the other breast (e.g., bounce or shape movement). In accordance with aspects of the disclosure, the displacement of the reconstructed breast may be compared with the other breast (both vertical displacement and horizontal displacement). The displacement pattern(s) may be displayed as a heat map for the reconstructed and other breast, e.g., side-by-side.
Similarly, when both breasts are reconstructed or have implants, a goal is for them to have similar displacement (e.g., bounce or shape movement). In accordance with aspects of the disclosure, the displacement of the reconstructed breasts may be compared (both vertical displacement and horizontal displacement). The displacement pattern(s) may be displayed as a heat map, e.g., side-by-side. The heat maps may be used to confirm that the movement is substantially the same. Additionally, the heat maps may be used to confirm that the movement is similar to the movement of real breasts.
The terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “processor” may include a single core processor, a multi-core processor, multiple processors located in a single device, or multiple processors in wired or wireless communication with each other and distributed over a network of devices, the Internet, or the cloud. Accordingly, as used herein, functions, features or instructions performed or configured to be performed by a “processor”, may include the performance of the functions, features or instructions by a single core processor, may include performance of the functions, features or instructions collectively or collaboratively by multiple cores of a multi-core processor, or may include performance of the functions, features or instructions collectively or collaboratively by multiple processors, where each processor or core is not required to perform every function, feature or instruction individually.
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements, if any, in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. Aspects were chosen and described in order to best explain the principles and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
This application claims the benefit of and priority to U.S. Provisional Application Ser. No. 63/094,985 filed on Oct. 22, 2020, the entirety of which is incorporated by reference.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2021/055179 | 10/15/2021 | WO |
Number | Date | Country | |
---|---|---|---|
63094985 | Oct 2020 | US |