This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2018-126737 filed Jul. 3, 2018.
The present invention relates to a height calculation system, an information processing apparatus, and a non-transitory computer readable medium storing a program.
JP2013-037406A discloses estimating the foot position of a target person for each assumed height from each of the images captured in a time series manner by a capturing unit, using a head part detected by a head part detection unit.
For example, there is a technology in which the height of a person displayed in an image is calculated from the distance between the head and a foot of the human body displayed in the captured image. However, in some cases, the head and the foot of the human body may not both be displayed in the captured image. In such a case, the height of the person displayed in the image may not be easily calculated.
Aspects of non-limiting embodiments of the present disclosure relate to a height calculation system, an information processing apparatus, and a non-transitory computer readable medium storing a program which enable calculation of the height of a person displayed in an image based on an image in which a part of the human body is displayed.
Aspects of certain non-limiting embodiments of the present disclosure overcome the above disadvantages and/or other disadvantages not described above. However, aspects of the non-limiting embodiments are not required to overcome the disadvantages described above, and aspects of the non-limiting embodiments of the present disclosure may not overcome any of the disadvantages described above.
According to an aspect of the present disclosure, there is provided a height calculation system including a capturing section that captures an image, a detection section that detects one specific part of a human body displayed in the image captured by the capturing section, and a calculation section that calculates a height of the human body based on a size of the one part when the one part detected by the detection section overlaps with a specific area present in the image.
Exemplary embodiment(s) of the present invention will be described in detail based on the following figures, wherein:
Hereinafter, an exemplary embodiment of the present invention will be described in detail with reference to the accompanying drawings.
Configuration of Height Calculation System 10
The network 20 is an information communication network for communication between the video camera 100 and the information processing apparatus 200, and between the information processing apparatus 200 and the terminal apparatus 300. The type of network 20 is not particularly limited, provided that data may be transmitted and received through the network 20. The network 20 may be, for example, the Internet, a local area network (LAN), or a wide area network (WAN). A communication line that is used for data communication may be wired or wireless. The network 20 connecting the video camera 100 and the information processing apparatus 200 to each other may be the same as or different from the network 20 connecting the information processing apparatus 200 and the terminal apparatus 300 to each other. While illustration is not particularly provided, a relay device such as a gateway or a hub for connecting to a network or a communication line may be disposed in the network 20.
The height calculation system 10 of the exemplary embodiment analyzes a motion picture in which a person is displayed, and calculates the size of the face of the person displayed in the motion picture. The height of the person having the face is calculated based on the calculated size of the face. A specific method of calculating the height will be described in detail below.
The video camera 100 captures a walking person. The video camera 100 of the exemplary embodiment is disposed in a facility, for example, inside a store or on a floor of an airport. The video camera 100 may also be disposed outside a facility, for example, along a sidewalk.
The video camera 100 of the exemplary embodiment has a function of transmitting the captured motion picture as digital data to the information processing apparatus 200 through the network 20.
The information processing apparatus 200 is a server that analyzes the motion picture captured by the video camera 100 and calculates the height of the person based on the size of the face of the person displayed in the motion picture.
The information processing apparatus 200 may be configured with a single computer or may be configured with plural computers connected to the network 20. In the latter case, the function of the information processing apparatus 200 of the exemplary embodiment described below is implemented by distributed processing by the plural computers.
The terminal apparatus 300 is an information terminal that outputs information related to the height of the person calculated by the information processing apparatus 200. The terminal apparatus 300 is implemented by, for example, a computer, a tablet information terminal, a smartphone, or other information processing apparatuses.
Hardware Configuration of Information Processing Apparatus 200
As illustrated in
The CPU 201 performs various controls and operation processes by executing a program stored in the ROM 203.
The RAM 202 is used as a work memory in the control or the operation process of the CPU 201.
The ROM 203 stores various kinds of data used in the program or the control executed by the CPU 201.
The external storage device 204 is implemented by, for example, a magnetic disk device or a non-volatile semiconductor memory on which data may be read and written. The external storage device 204 stores the program that is loaded into the RAM 202 and executed by the CPU 201, and the result of the operation process of the CPU 201.
The network interface 205 connects to the network 20 and transmits and receives data with the video camera 100 or the terminal apparatus 300.
The configuration example illustrated in
Functional Configuration of Information Processing Apparatus 200
Next, a functional configuration of the information processing apparatus 200 will be described.
As illustrated in
The motion picture data acquiring unit 210 acquires motion picture data from the video camera 100 through the network 20. The acquired motion picture data is stored in, for example, the RAM 202 or the external storage device 204 illustrated in
The area identification unit 220 analyzes the motion picture acquired by the motion picture data acquiring unit 210 and identifies an area in which the face of the person is displayed. Specifically, the area identification unit 220 identifies the area in which the face of the person is displayed by detecting the face of the person displayed in the motion picture based on brightness, saturation, hue, and the like in the motion picture. Hereinafter, the area in which the face of the person is displayed will be referred to as a face area. The area identification unit 220 is regarded as a detection section that detects the face of a human body displayed in an image. The face of the human body is regarded as one specific part of the human body.
The attribute estimation unit 230 as one example of an estimation section estimates the attribute of the person having the face based on the image of the face detected by the area identification unit 220. Specifically, the attribute estimation unit 230 extracts features such as parts, contours, and wrinkles on the face from the face area identified by the area identification unit 220 and estimates the attribute of the person having the face based on the extracted features. The attribute of the person is exemplified by, for example, the sex or the age group of the person.
The sensing unit 240 senses that the face area identified by the area identification unit 220 overlaps with a specific area in the motion picture. The specific area in the motion picture will be described in detail below.
The size calculation unit 250 calculates the size of the face area identified by the area identification unit 220. Specifically, the size calculation unit 250 calculates the size of the face area by calculating the number of pixels (pixel count) constituting the image of the face area when the sensing unit 240 senses that the face area overlaps with the specific area in the motion picture. Hereinafter, the pixel count constituting the image of the face area when the sensing unit 240 senses that the face area overlaps with the specific area in the motion picture will be referred to as a face pixel count. The face pixel count is regarded as part information related to the size of the face when the face detected by the area identification unit 220 overlaps with the specific area.
The storage unit 260 stores the face pixel count calculated by the size calculation unit 250 in association with the attributes estimated by the attribute estimation unit 230 for the person having the face. Accordingly, information related to the face pixel count is accumulated for each person.
The scale creating unit 270 creates a scale as a measure used in calculation of the height. Specifically, the scale creating unit 270 as one example of an acquiring section acquires information related to the face pixel count stored in the storage unit 260 and determines a pixel count to be associated with a predetermined height based on the acquired information. The scale creating unit 270 associates the pixel count with the predetermined height for each attribute estimated by the attribute estimation unit 230. That is, the scale creating unit 270 of the exemplary embodiment determines each pixel count to be associated with the height determined for each attribute. Accordingly, a height scale in which the pixel count associated with the predetermined height is illustrated for each attribute is created. A method of creating the scale will be described in detail below.
The height calculation unit 280 as one example of a calculation section calculates the height of the person displayed in the motion picture. Specifically, the height calculation unit 280 calculates the height of the person having the face detected by the area identification unit 220 using the height scale created by the scale creating unit 270. A method of calculating the height will be described in detail below.
The output unit 290 transmits information related to the height calculated by the height calculation unit 280 to the terminal apparatus 300 through the network 20.
The motion picture data acquiring unit 210, the area identification unit 220, the attribute estimation unit 230, the sensing unit 240, the size calculation unit 250, the storage unit 260, the scale creating unit 270, the height calculation unit 280, and the output unit 290 are implemented by cooperation of software and hardware resources.
Specifically, in the exemplary embodiment, an operating system and application software and the like executed in cooperation with the operating system are stored in the ROM 203 (refer to
The storage unit 260 is implemented by the ROM 203, the external storage device 204, or the like.
Hardware Configuration of Terminal Apparatus 300
Next, a hardware configuration of the terminal apparatus 300 will be described.
As illustrated in
The CPU 301 performs various controls and operation processes by executing a program stored in the ROM 303.
The RAM 302 is used as a work memory in the control or the operation process of the CPU 301.
The ROM 303 stores various kinds of data used in the program or the control executed by the CPU 301.
The display device 304 is configured with, for example, a liquid crystal display and displays an image under control of the CPU 301.
The input device 305 is implemented using an input device such as a keyboard, a mouse, or a touch sensor and receives an input operation performed by an operator. In a case where the terminal apparatus 300 is a tablet terminal, a smartphone, or the like, a touch panel in which a liquid crystal display and a touch sensor are combined functions as the display device 304 and the input device 305.
The network interface 306 connects to the network 20 and transmits and receives data with the information processing apparatus 200.
The configuration example illustrated in
Functional Configuration of Terminal Apparatus 300
Next, a functional configuration of the terminal apparatus 300 will be described.
As illustrated in
The height information acquiring unit 310 acquires information related to the height of the person calculated by the information processing apparatus 200 through the network 20. The received information is stored in, for example, the RAM 302 in
The display image generation unit 320 generates an output image indicating the height of the person based on the information acquired by the height information acquiring unit 310.
The display control unit 330 causes, for example, the display device 304 in the computer illustrated in
The operation reception unit 340 receives an input operation that is performed by the operator using the input device 305. The display control unit 330 controls display of the output image or the like on the display device 304 in accordance with the operation received by the operation reception unit 340.
The height information acquiring unit 310, the display image generation unit 320, the display control unit 330, and the operation reception unit 340 are implemented by cooperation of software and hardware resources.
Specifically, in the exemplary embodiment, an operating system and application software and the like executed in cooperation with the operating system are stored in the ROM 303 (refer to
Method of Calculating Face Pixel Count
Next, a method of calculating the face pixel count will be described.
As illustrated in
As illustrated in
The area identification unit 220 identifies a face area W related to the person from the motion picture.
The sensing unit 240 sets a reference line L in a specific area in the motion picture. In the example illustrated in
The sensing unit 240 senses that the face area W overlaps with the specific area in the motion picture. Specifically, the sensing unit 240 senses that the face area W overlaps with the specific area by sensing that the face area W overlaps with the reference line L in the motion picture. That is, overlapping of the face of the person with the specific area means overlapping of the face area W with the reference line L.
The size calculation unit 250 calculates the pixel count (face pixel count) of the face area W when the sensing unit 240 senses that the face area W overlaps with the reference line L.
The size of the face area W when the face area W related to a person X overlaps with the reference line L, and the size of the face area W when the face area W related to a person Y having a smaller height than the person X overlaps with the reference line L will be described with reference to
As illustrated in
Thus, as illustrated in
Therefore, in the exemplary embodiment, using the fact that the pixel count of the face area W is increased as the size of the face area W is increased, the height of the person is calculated based on the pixel count (face pixel count) of the face area W when the face area W overlaps with the reference line L. That is, in the exemplary embodiment, as the face pixel count is increased, the calculated height is increased.
Description of Face Pixel Count Management Table
Next, information stored in the storage unit 260 will be described.
In the face pixel count management table illustrated in
In the face pixel count management table of the exemplary embodiment, the “age group” and the “sex” estimated for one person by the attribute estimation unit 230 are associated with the “face pixel count” related to the person.
Description of Method of Creating Height Scale
Next, a method of creating the height scale will be described.
First, the scale creating unit 270 creates a histogram of the face pixel count for each attribute. Specifically, in a case where the number of persons for which the face pixel count associated with one attribute (for example, male in 30s) is stored in the storage unit 260 becomes greater than or equal to a predetermined person count (for example, 30 persons), the scale creating unit 270 creates a histogram of the face pixel count associated with the one attribute using information related to the face pixel counts of the persons.
The histogram illustrated in
Next, the scale creating unit 270 determines the pixel count to be associated with the predetermined height. Specifically, the scale creating unit 270 creates a normal distribution F by fitting the created histogram and associates the predetermined height with the face pixel count at a peak V of the created normal distribution F. The face pixel count at the peak V of the normal distribution is the average value of the face pixel counts in the normal distribution F. The predetermined height is exemplified by, for example, the average value of heights for each attribute. For example, the average value of known heights may be used as the average value of heights. In addition, for example, the average value of heights that is acquired as a result of measuring the heights of two or more persons may be used as the average value of heights.
The scale creating unit 270 is regarded as a determination section that determines a size to be associated with a predetermined height based on a face pixel count. In addition, the scale creating unit 270 is regarded as a creating section that creates a normal distribution of the face pixel count for each attribute.
The scale creating unit 270 creates the histogram and determines the pixel count to be associated with the predetermined height for the remaining attributes. By causing the scale creating unit 270 to create the histogram and determine the pixel count to be associated with the predetermined height for each attribute, a height scale in which the predetermined height is associated with the pixel count for each attribute is created as illustrated in
Description of Method of Calculating Height
Next, a method of calculating the height will be described.
Hereinafter, a method of calculating the height from the face pixel count calculated by the size calculation unit 250 will be described with reference to
In calculation of the height from a target face pixel count Px, first, the height calculation unit 280 determines which of the pixel counts shown in the height scale for each attribute are used as references. Specifically, the height calculation unit 280 uses, as references, the next larger pixel count and the next smaller pixel count than the target face pixel count Px among the pixel counts shown in the scale. In a case where no pixel count larger than the target face pixel count Px is shown in the scale, the next smaller pixel count and the second next smaller pixel count than the target face pixel count Px are used as references. In a case where no pixel count smaller than the target face pixel count Px is shown in the scale, the next larger pixel count and the second next larger pixel count than the target face pixel count Px are used as references.
Next, the height calculation unit 280 calculates a height H using Expression (1).
H = a × Ha + (1 − a) × Hb   Expression (1)
In Expression (1), Ha denotes a height associated with the larger pixel count of two pixel counts used as references in calculation of the height, and Hb denotes a height associated with the smaller pixel count. Expression (2) is used for obtaining a.
a = (Px − Pb)/(Pa − Pb)   Expression (2)
In Expression (2), Pa denotes the larger pixel count of the two pixel counts used as references in calculation of the height, and Pb denotes the smaller pixel count.
The height calculation unit 280 calculates a by substituting the target face pixel count Px into Expression (2), and then calculates the height H by substituting the calculated a into Expression (1).
Flow of Process of Calculating Height
Next, a flow of process of calculating the height (height calculation process) will be described.
First, the area identification unit 220 detects the face of the person displayed in the motion picture acquired by the motion picture data acquiring unit 210 (S101) and identifies the face area related to the person.
The attribute estimation unit 230 estimates the attribute of the person from the face of the person detected by the area identification unit 220 (S102).
The size calculation unit 250 calculates the face pixel count related to the person of which the face is detected by the area identification unit 220 (S103).
Next, the height calculation unit 280 determines whether or not the height scale (refer to
In a case where it is determined that the height scale is already created (YES in S104), the height calculation unit 280 calculates the height of the person from the face pixel count calculated by the size calculation unit 250 (S105). Specifically, the height calculation unit 280 calculates the height based on the face pixel count calculated by the size calculation unit 250 and the height and the pixel count shown in the height scale.
The output unit 290 outputs information related to the calculated height to the terminal apparatus 300 (S106).
After step S106 or in a case where a negative result is acquired in step S104, the storage unit 260 stores the face pixel count calculated by the size calculation unit 250 in association with the attribute estimated by the attribute estimation unit 230 (S107). The stored information is managed as the face pixel count management table (refer to
Next, with reference to the face pixel count management table, the scale creating unit 270 determines whether or not the person count for which the face pixel count associated with the attribute estimated in step S102 is stored in the storage unit 260 is greater than or equal to a predetermined person count (S108). In a case where a negative result is acquired, the height calculation process is finished.
In a case where a positive result is acquired in step S108, the scale creating unit 270 creates a histogram of the face pixel count for the attribute estimated in step S102 (S109).
Next, the scale creating unit 270 creates a normal distribution for the histogram and creates a height scale in which the average value of face pixel counts in the created normal distribution is associated with the predetermined height (S110).
In a case where a positive result is acquired in step S108, the process from step S109 may be performed even in a case where a height is already associated with a pixel count in the height scale for the attribute estimated in step S102. That is, the pixel count associated with the predetermined height in the height scale may be updated. In this case, a new histogram that includes the new face pixel count calculated by the size calculation unit 250 is created, and the pixel count to be associated with the predetermined height in the height scale is determined based on the histogram.
In a case where the faces of plural persons are displayed in the motion picture, the height calculation process is performed for each of the plural persons.
In the exemplary embodiment, the height of the person displayed in the image is calculated based on the size of the face when the face of the person displayed in the image captured by the video camera 100 overlaps with the specific area present in the image.
For example, the height of the person having a human body displayed in the image may be calculated from the distance between a head and a foot of the human body displayed in the captured image. However, depending on cases, both of the head and the foot of the human body may not be displayed in the captured image. In this case, the height of the person displayed in the image may not be easily calculated.
Meanwhile, as in the exemplary embodiment, in a case where the height of the person is calculated based on the size of the face of the person displayed in the image, even the height of a person whose face is displayed but whose foot is not displayed in the image can be calculated.
In the exemplary embodiment, while the height is calculated from the size of the face of the person, the present invention is not limited thereto.
For example, the height may be calculated from the size of a hand of the person displayed in the image. In this case, the area identification unit 220 detects the hand of the person displayed in the image and identifies an area in which the hand is displayed. The attribute estimation unit 230 extracts features such as parts, contours, and wrinkles on the identified hand of the person and estimates the attribute of the person having the hand based on the extracted features. The size calculation unit 250 calculates the pixel count constituting the image in the area of the hand when the area of the hand of the person overlaps with the reference line L (refer to
The part of the human body that is used in calculation of the height of the person may be a part of the human body other than the face or the hand.
That is, one specific part of the human body displayed in the image may be detected by the area identification unit 220, and the height of the person having the human body displayed in the image may be calculated based on the size of the one part when the detected one part overlaps with the specific area present in the image.
In the exemplary embodiment, the scale creating unit 270 determines the pixel count to be associated with the predetermined height based on the face pixel count calculated by the size calculation unit 250. The height calculation unit 280 calculates the height of the person based on the relationship between the face pixel count calculated by the size calculation unit 250 and the pixel count associated with the predetermined height.
Thus, even in a case where the face pixel count calculated by the size calculation unit 250 changes depending on the type of the video camera 100 or the position at which the video camera 100 is disposed, the pixel count associated with the predetermined height also changes accordingly.
Content Output to Display Device 304
Next, a content that is output to the display device 304 of the terminal apparatus 300 will be described.
As illustrated in
The image of the person of which the height is calculated is displayed in the person image 351. Specifically, the image that is identified as the face area by the area identification unit 220 for the person of which the height is calculated is displayed in the person image 351.
The height calculated by the height calculation unit 280 is displayed in the height information 352.
The attribute estimated by the attribute estimation unit 230 for the person of which the height is calculated is displayed in the attribute information 353. Specifically, the age group and the sex of the person estimated by the attribute estimation unit 230 are displayed in the attribute information 353.
The time when the person of which the height is calculated is captured by the video camera 100 is displayed in the time information 354. Specifically, the time, date, month, and year at which a record of an image for which the sensing unit 240 senses that the face area related to the person of which the height is calculated overlaps with the reference line L (refer to
In a case where the operator selects the next button 355, the calculation result image 350 related to a person who is sensed by the sensing unit 240 at the next later time than the time displayed in the time information 354 is displayed. In a case where the operator selects the previous button 356, the calculation result image 350 related to a person who is sensed by the sensing unit 240 at the next earlier time than the time displayed in the time information 354 is displayed.
In the exemplary embodiment, while the video camera 100 is disposed above the walking person (refer to
The video camera 100 may be disposed below the face of the walking person, for example, at the height of the foot of the walking person. That is, the video camera 100 may be disposed such that a difference in level is present with respect to the one specific part used in calculation of the height of the walking person.
In the exemplary embodiment, one reference line L (refer to
For example, two or more reference lines may be set in the motion picture, and the pixel count to be associated with the predetermined height may be determined based on the average value of pixel counts of the face area when the face area overlaps with each reference line.
In the exemplary embodiment, while the pixel count to be associated with the predetermined height is determined based on the pixel count constituting the face area identified by the area identification unit 220, the present invention is not limited thereto.
For example, background subtraction may be performed on the image of the face area identified by the area identification unit 220 using an image that is captured in a state where no person is displayed in the capturing area of the video camera 100. The pixel count to be associated with the predetermined height may be determined based on the pixel count constituting the image acquired as a difference.
In the exemplary embodiment, while the pixel count to be associated with the predetermined height is determined based on the pixel count of the face area in the image when the face area overlaps with the reference line L, the present invention is not limited thereto.
For example, the pixel count to be associated with the predetermined height may be determined based on the average value of pixel counts of the face area in each image constituting the motion picture while the face area overlaps with the reference line L.
In addition, for example, the pixel count to be associated with the predetermined height may be determined based on the average value of pixel counts of the face area in each image constituting the motion picture in one cycle of the walking of the person during which the face area overlaps with the reference line L.
Specifically, the area identification unit 220 detects the regularity of oscillation of the face area in the up-down direction of the motion picture and detects the cycle of the walking of the person from the regularity of oscillation. Each image constituting the motion picture in the period of one cycle of the walking in a period during which the face area overlaps with the reference line L in the motion picture may be extracted. The pixel count to be associated with the predetermined height may be determined based on the average value of pixel counts of the face area in the extracted images.
In the exemplary embodiment, while the pixel count to be associated with the predetermined height in the height scale is the face pixel count at the peak V of the normal distribution F (refer to
For example, the average value of face pixel counts for each attribute stored in the storage unit 260 may be associated with the predetermined height in the height scale.
In addition, for example, the pixel count associated with the predetermined height may be corrected based on the face pixel count related to a person having a known height. Specifically, the face pixel count related to the person is calculated from a motion picture that is acquired by capturing the person having a specific attribute with a known height using the video camera 100. The height and the pixel count that are associated with each other in the height scale for the specific attribute may be replaced with the height of the person having a known height and the face pixel count calculated for the person.
In addition, for example, each pixel count associated with the height may be corrected for each attribute based on the pixel count of the face area when the face area of the person having a known height overlaps with the reference line L.
Hereinafter, a method of correcting each pixel count associated with the height for each attribute will be described with reference to
First, a pixel count Ps0 (refer to
In Expression (3), Hw denotes the height of the person having a known height, and Pw denotes the face pixel count calculated for the person. In addition, Hs denotes the height shown in the height scale for the attribute of the person having a known height.
Next, the pixel counts shown in the height scale for the remaining attributes are corrected. Specifically, the value of change in pixel count for the initially corrected attribute is subtracted from each pixel count shown in the height scale for the remaining attributes. More specifically, each pixel count associated in the height scale for the remaining attributes is corrected by subtracting, from each pixel count shown in the height scale for the remaining attributes, a value (Ps0−Ps) acquired by subtracting the pixel count Ps after correction from the pixel count Ps0 before correction for the initially corrected attribute.
In the exemplary embodiment, while the height and the pixel count are associated with each other for each age group and each sex in the height scale, the attribute that is used as a category for associating the height with the pixel count in the height scale is not limited to age group and sex.
For example, the height and the pixel count may be shown in the height scale for each nationality. In this case, the attribute estimation unit 230 estimates the age group, the sex, and the nationality of the person from the features of the face identified by the area identification unit 220. The scale creating unit 270 may create a height scale in which the height determined for each age group, each sex, and each nationality is associated with the pixel count, by determining the pixel count of the person to be associated with the height determined for each age group, each sex, and each nationality.
In the exemplary embodiment, in a case where the person count for which the face pixel count associated with one attribute is stored in the storage unit 260 becomes greater than or equal to the predetermined person count, a histogram of the face pixel count associated with the one attribute is created. However, the present invention is not limited thereto.
For example, a determination as to whether or not to create a histogram may be made based on periods. Specifically, a timing unit (not illustrated) that measures time may be disposed in the information processing apparatus 200. In a case where a predetermined period (for example, 30 days) elapses from when the motion picture data acquiring unit 210 acquires the motion picture data for the first time, a histogram may be created using information related to the face pixel count stored in the storage unit 260.
In the exemplary embodiment, a motion picture is captured using the video camera 100, and the height of the person displayed in the captured motion picture data is calculated. However, the present invention is not limited thereto.
For example, a photograph may be captured using a capturing means that captures an image, and the height of the person may be calculated based on the size of the face area of the person in the captured photograph.
A program that implements the exemplary embodiment of the present invention may be provided in a state where the program is stored in a computer readable recording medium such as a magnetic recording medium (a magnetic tape, a magnetic disk, or the like), an optical storage medium (optical disc or the like), a magneto-optical recording medium, or a semiconductor memory. In addition, the program may be provided using a communication means such as the Internet.
The present disclosure is not limited to the exemplary embodiment and may be embodied in various forms without departing from the nature of the present disclosure.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.