This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2022-025131, filed on Feb. 21, 2022, the entire contents of which are incorporated herein by reference.
The embodiment discussed herein is related to a distribution program and the like.
Marketing that distributes coupons (vouchers) is performed to stimulate the purchase intention of users and promote sales of products. For example, a conventional technology discloses a technique of identifying a product related to a certain product that a user has purchased in the past, and providing a coupon for the identified product when the user performs payment at a cash register.
In addition, in online shops, a coupon is individually issued to each user, and advertising is also provided on the basis of an access history or a search history.
According to an aspect of an embodiment, a non-transitory computer-readable recording medium has stored therein a distribution program that causes a computer to execute a process, the process including extracting a person and a product from a video of an inside of a store; tracking the extracted person; identifying a behavior that is performed by the tracked person with respect to a product in the store; identifying a first behavior type that is led by the behavior that is performed by the tracked person with respect to the product among a plurality of behavior types that define transition of a process of the behavior from entrance into the store until purchase of a product in the store; and distributing information on a product indicating the first behavior type to the tracked person when the identified first behavior type is at a predetermined level or higher and the tracked person has not yet purchased the product.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
However, with the conventional technology described above, it is difficult to provide information on a product that attracts the interest of a user but has not yet been purchased by the user.
For example, if it were possible to issue information related to a product that the user is wondering whether to purchase, the purchase intention of the user could be stimulated more effectively.
In contrast, the conventional technology can only issue a coupon related to a product that the user has already purchased in the past, and it is difficult to issue a coupon for a product that the user has not yet purchased.
Accordingly, it is an object in one aspect of an embodiment of the present invention to provide a distribution program, a distribution method, and an information processing apparatus capable of providing information related to a product that attracts the interest of a user but has not yet been purchased by the user.
Preferred embodiments of the present invention will be explained with reference to the accompanying drawings. The present invention is not limited by the embodiments below.
The camera 10 is a camera that captures a video of a sales floor 3. A shelf 4 for storing products is installed in the sales floor 3. While the shelf 4 is illustrated in
For example, when a user C1 takes a product from the shelf 4, puts the product in a shopping cart or the like, and performs payment for the product, the user C1 moves to a cashier area 5.
The camera 20 is a camera that captures a video of the cashier area 5. A cash register 6 at which a user C1′ performs payment is installed in the cashier area 5. While the cash register 6 is illustrated in
The cash register 6 is connected to the network 15, and transmits purchase history information on the user C1′ to the information processing apparatus 100. The purchase history information includes information on a product that is purchased by the user C1′. Meanwhile, the information processing apparatus 100 may analyze the second video information and identify the information on the product that is purchased by the user C1′.
The information processing apparatus 100 includes a behavior rule database (DB) 50 in which transition of a process of a behavior and a behavior type are defined. Examples of the behavior include “carrying a shopping cart”, “looking at a product for a predetermined time or longer”, and “a foot is located in a place in front of a shelf of a product”. Examples of the behavior type include “pick up a product and put the product into a cart”, “not purchased after consideration of whether to purchase a product”, “stay in front of a product”, “extend a hand to a plurality of products”, and “pass by in front of a product”.
The information processing apparatus 100 tracks the user C1 on the basis of the first video information, and identifies a behavior that is performed by the user C1 with respect to a product in the sales floor 3. The information processing apparatus 100 identifies a first behavior type that is led by each of behaviors of the user among a plurality of behavior types, on the basis of each of the behaviors performed by the user C1 identified from the first video information and the behavior rule DB 50. Furthermore, the information processing apparatus 100 identifies a product as a target for the first behavior type on the basis of the first video information.
The information processing apparatus 100 extracts a personal feature of the user C1 on the basis of the first video information. The information processing apparatus 100 identifies the cash register 6 that is used by the user C1′ who has the same personal feature as the user C1 on the basis of the second video information, and receives the purchase history information on the user C1′ from the cash register 6. Meanwhile, the information processing apparatus 100 may analyze the video information received from the camera 10, the camera 20, and a different camera, track the user C1 who moves from the sales floor 3 to the cashier area 5, and identify the cash register 6 that is used by the user C1 (the user C1′).
If the first behavior type is a behavior type with a predetermined level or higher and the product as the target for the first behavior type is not included in the purchase history information, the information processing apparatus 100 distributes a coupon or information related to advertising, such as an advertisement, to the user C1′. For example, the information processing apparatus 100 may cause the cash register 6 to output a coupon, or display an advertisement on a display screen of the cash register 6. Furthermore, it may be possible to distribute information on a coupon or an advertisement in cooperation with an application that is set in a terminal device carried by the user C1′. Hereinafter, the application may be referred to as an "app". It may also be possible to transmit an e-mail related to a product advertisement to the terminal device of the user C1′ on the basis of registration information on the user C1′. Meanwhile, the coupon may be a discount ticket or a complimentary ticket for a product, and may be usable when the user visits the store next time or later. The advertisement is information that makes a product widely known, generates interest in the product, and urges persons to take an action, such as purchase. In this case, it is possible to change the contents of the advertisement depending on the first behavior type that is led by each of the behaviors of the user.
Here, it is assumed that the behavior type with the predetermined level or higher is a behavior type corresponding to an interest, a desire, or a comparison with respect to a product. For example, examples of the behavior type with the predetermined level or higher include "not purchased after consideration of whether to purchase a product", "stay in front of a product", and "extend a hand to a plurality of products".
As described above, the information processing apparatus 100 according to the present embodiment identifies the first behavior type of the user C1 on the basis of the first video information, and distributes a coupon if the first behavior type is the behavior type with the predetermined level or higher and a product corresponding to the first behavior type has not been purchased. With this configuration, it is possible to provide a coupon for a product that may attract the interest of the user and that has not yet been purchased by the user.
A configuration example of the information processing apparatus 100 according to the present embodiment will be described below.
The communication unit 110 performs data communication with the camera 10 (a plurality of cameras installed in the sales floor 3), the camera 20 (a plurality of cameras installed in the cashier area 5), the cash register 6 (a plurality of cash registers), other apparatuses, and the like via the network 15. Further, the communication unit 110 may receive pieces of video information from a plurality of cameras that are installed at an entrance and an exit of the store. The communication unit 110 is implemented by a network interface card (NIC) or the like.
The storage unit 140 includes the behavior rule DB 50, a first video buffer 141, a second video buffer 142, a camera installation DB 143, a product DB 144 and a person DB 145. The storage unit 140 is implemented by, for example, a semiconductor memory device, such as a flash memory, or a storage device, such as a hard disk or an optical disk.
The behavior rule DB 50 stores therein information that defines a relationship between a recognition rule and a behavior type.
The condition x1 is satisfied if the behavior of the user meets a behavior y1-1 and a behavior y1-2. The behavior y1-1 is “carrying a shopping cart”. The behavior y1-2 is “a foot is located in a place in front of a shelf of a product”.
The condition x2 is satisfied if one of a condition x2-1 and a condition x2-2 is satisfied.
The condition x2-1 is satisfied if the behavior of the user meets a behavior y2-1 and a behavior y2-2. The behavior y2-1 is “looking at a shelf of a product for 5 consecutive seconds or more”. The behavior y2-2 is “not put a hand on a shelf of a product”.
The condition x2-2 is satisfied if the behavior of the user meets a behavior y3-1 and a behavior y3-2. The behavior y3-1 is "take something by putting a hand on a shelf of a product". The behavior y3-2 is "return a product by putting a hand on the same shelf of the product".
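The evaluation of the conditions x1 and x2 described above may be sketched, for example, as follows. Meanwhile, the function names, behavior labels, and data structures below are merely illustrative examples for explanation and are not limited to this implementation.

```python
# Illustrative sketch of evaluating the behavior rules in the behavior
# rule DB 50. Each condition checks whether the set of behaviors
# recognized for a user satisfies the defined combination.

def condition_x1(behaviors):
    # x1 is satisfied if both behavior y1-1 and behavior y1-2 are met.
    return ("carrying a shopping cart" in behaviors and
            "a foot is located in front of a shelf" in behaviors)

def condition_x2(behaviors):
    # x2 is satisfied if either sub-condition x2-1 or x2-2 is satisfied.
    x2_1 = ("looking at a shelf for 5 seconds or more" in behaviors and
            "not put a hand on a shelf" in behaviors)
    x2_2 = ("take something from a shelf" in behaviors and
            "return a product to the same shelf" in behaviors)
    return x2_1 or x2_2
```

For example, a user who looks at a shelf for 5 consecutive seconds or more without putting a hand on the shelf satisfies the condition x2 through the sub-condition x2-1.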
Referring back to explanation of
The second video buffer 142 is a buffer for storing the second video information that is received from the camera 20 (a plurality of cameras) in the cashier area 5.
The camera installation DB 143 stores therein information for identifying a place in which the camera 10 is installed in the sales floor 3. The information stored herein is set in advance by an administrator or the like.
The product DB 144 stores therein information on a product that is provided in each of sales floors. The information stored herein is set in advance by an administrator or the like.
The person DB 145 stores therein various kinds of information on a user who stays in the store.
Meanwhile, the above-described information stored in the storage unit 140 is one example, and it is possible to store various kinds of information other than the above-described information in the storage unit 140.
Referring back to explanation of
The receiving unit 151 receives the first video information from the camera 10 (a plurality of cameras) installed in the sales floor 3. The receiving unit 151 stores the received first video information in the first video buffer 141. The receiving unit 151 stores the first video information for each of cameras in each of the sales floors. The first video information includes chronological frames (still image information).
The receiving unit 151 receives the second video information from the camera 20 (a plurality of cameras) installed in the cashier area 5. The receiving unit 151 stores the received second video information in the second video buffer 142. The receiving unit 151 stores the second video information for each of cameras in the cashier area 5. The second video information includes chronological frames.
The tracking unit 152 extracts frames in which users appear from among a plurality of frames that are stored in the first video buffer 141 and the second video buffer 142, and identifies the same person among the frames.
For example, the tracking unit 152 tracks a certain user from entrance into the store until exit from the store, and acquires each of the frames in which the certain user is captured in the store.
Furthermore, as illustrated in an upper part in
The tracking unit 152 assigns a user ID to the same user, and registers the user ID in association with the personal feature information on the user in the person DB 145. The personal feature information is, for example, 512-dimensional vector information that is obtained by Person Re-Identification or the like. The tracking unit 152 may extract, as the personal feature information, a color of clothes, a height, a way of carrying a bag, a way of walking, or the like of the user included in the BBOX.
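Identification of the same user by using such feature vectors may be sketched, for example, as follows. Meanwhile, the cosine-similarity criterion and the threshold value below are illustrative assumptions; the tracking unit 152 is not limited to this method.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two appearance feature vectors
    # (e.g. 512-dimensional Person Re-Identification embeddings).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def same_person(a, b, threshold=0.8):
    # Treat two detections as the same user when their feature
    # vectors are sufficiently similar. The threshold is illustrative.
    return cosine_similarity(a, b) >= threshold
```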
The skeleton detection unit 153 acquires skeleton information on a user who appears in a frame. Specifically, the skeleton detection unit 153 performs skeleton detection with respect to the frames in which each of the users extracted by the tracking unit 152 appears. The skeleton detection unit 153 adds the ID of the user who appears in the frames to the skeleton information.
For example, the skeleton detection unit 153 acquires the skeleton information by inputting image data of the extracted user, that is, the BBOX image representing the extracted user, to a trained machine learning model that is constructed by using an existing algorithm, such as DeepPose or OpenPose.
Furthermore, the skeleton detection unit 153 may determine a posture of the whole body, such as standing, walking, squatting, sitting, or sleeping, by using a machine learning model that is trained for skeleton patterns in advance. For example, the skeleton detection unit 153 may be able to determine the closest posture of the whole body by using a machine learning model that is trained by using Multilayer Perceptron for an angle between some joints in the skeleton information as illustrated in
Furthermore, the skeleton detection unit 153 is able to detect a motion of each of parts by determining a posture of the part on the basis of a three-dimensional (3D) joint posture of the body. Specifically, the skeleton detection unit 153 is able to convert a two-dimensional (2D) joint coordinate to a 3D joint coordinate by using an existing algorithm, such as a 3D-baseline method.
With respect to a part "arm", the skeleton detection unit 153 is able to detect in which of six directions (forward, backward, leftward, rightward, upward, and downward) each of the left and right arms is oriented, by determining whether an angle between the forearm orientation and each directional vector is equal to or smaller than a threshold. Meanwhile, the skeleton detection unit 153 is able to detect the arm orientation by a vector that is defined such that "a start point is an elbow and an end point is a wrist".
With respect to a part "leg", the skeleton detection unit 153 is able to detect in which of six directions (forward, backward, leftward, rightward, upward, and downward) each of the left and right legs is oriented, by determining whether an angle between the lower leg orientation and each directional vector is equal to or smaller than a threshold. Meanwhile, the skeleton detection unit 153 is able to detect the lower leg orientation by a vector that is defined such that "a start point is a knee and an end point is an ankle".
With respect to a part “elbow”, the skeleton detection unit 153 is able to detect that the elbow is extended if an angle of the elbow is equal to or larger than a threshold and the elbow is flexed if the angle is smaller than the threshold (two types). Meanwhile, the skeleton detection unit 153 is able to detect the angle of the elbow by an angle between a vector A that is defined such that “a start point is an elbow and an end point is a shoulder” and a vector B that is defined such that “a start point is an elbow and an end point is a wrist”.
With respect to a part “knee”, the skeleton detection unit 153 is able to detect that the knee is extended if an angle of the knee is equal to or larger than a threshold and the knee is flexed if the angle is smaller than the threshold (two types). Meanwhile, the skeleton detection unit 153 is able to detect the angle of the knee by an angle between a vector A that is defined such that “a start point is a knee and an end point is an ankle” and a vector B that is defined such that “a start point is a knee and an end point is a hip”.
With respect to a part “hip”, the skeleton detection unit 153 is able to detect left twist and right twist (two types) by determining whether an angle between the hip and the shoulder is equal to or smaller than a threshold, and is able to detect that the hip is oriented forward if the angle is smaller than the threshold. Meanwhile, the skeleton detection unit 153 is able to detect the angle between the hip and the shoulder from a rotation angle about an axial vector C that is defined such that “a start point is a midpoint of both hips and an end point is a midpoint of both shoulders”, with respect to each of a vector A that is defined such that “a start point is a left shoulder and an end point is a right shoulder” and a vector B that is defined such that “a start point is a left hip (hip (L)) and an end point is a right hip (hip (R))”.
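The angle determination for each part described above may be sketched, for example, as follows for the part "elbow". Meanwhile, the function names, the joint coordinates, and the threshold value of 140 degrees are illustrative assumptions and are not limited to this implementation.

```python
import math

def angle_between(v1, v2):
    # Angle in degrees between two 3D joint vectors.
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))

def elbow_state(shoulder, elbow, wrist, threshold=140.0):
    # Vector A: start point is the elbow, end point is the shoulder.
    # Vector B: start point is the elbow, end point is the wrist.
    # The elbow is "extended" if the angle between A and B is equal
    # to or larger than the threshold, and "flexed" otherwise.
    a = [s - e for s, e in zip(shoulder, elbow)]
    b = [w - e for w, e in zip(wrist, elbow)]
    return "extended" if angle_between(a, b) >= threshold else "flexed"
```

The part "knee" can be determined in the same manner by replacing the shoulder, elbow, and wrist with the hip, knee, and ankle, respectively.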
Referring back to explanation of
For example, if a skeleton representing a face looking at the front is continuously detected by determination of each of the parts and a skeleton representing standing is continuously detected by determination on the posture of the whole body among several frames, the motion recognition unit 154 recognizes a motion of “looking at the front for a certain time”. Further, if a skeleton in which a change of the posture of the whole body is smaller than a predetermined value is continuously detected among several frames, the motion recognition unit 154 recognizes a motion of “not moved”.
Furthermore, if a skeleton in which the angle of the elbow is changed by a predetermined threshold or more is detected among several frames, the motion recognition unit 154 recognizes a motion of "moving one hand", and, if a skeleton in which the angle of the elbow is changed by the threshold or more and thereafter falls below the threshold is detected among several frames, the motion recognition unit 154 recognizes a motion of "flexing one hand". Moreover, if a skeleton in which the angle of the elbow is changed by the threshold or more, thereafter falls below the threshold, and is then maintained among several frames is detected, the motion recognition unit 154 recognizes a motion of "looking at one hand".
Furthermore, if a skeleton in which an angle of a wrist is continuously changed is detected among several frames, the motion recognition unit 154 recognizes a motion of "frequently moving the coordinate of the wrist during a certain time period". If a skeleton in which the angle of the wrist is continuously changed and the angle of the elbow is continuously changed is detected among several frames, the motion recognition unit 154 recognizes a motion of "frequently changing the coordinate of the elbow and the coordinate of the wrist during a certain time period". If a skeleton in which the angle of the wrist, the angle of the elbow, and the orientation of the whole body are continuously changed is detected among several frames, the motion recognition unit 154 recognizes a motion of "frequently moving the body orientation and whole body motion during a certain time period".
Moreover, the motion recognition unit 154 identifies a product and a sales floor in image data in which a user, a product, and a sales floor of the product appear, on the basis of the imaging area of each of the cameras 10, the coordinates of each product in the imaging area, and the sales floor of each product.
Furthermore, the motion recognition unit 154 identifies a first behavior type that is led by behaviors of the tracked user among a plurality of behavior types that define transition of a process of a behavior from entrance into the store until purchase of a product in the store. The motion recognition unit 154 outputs a motion recognition result, in which the identified first behavior type, the product ID of the product corresponding to the first behavior type, and the user ID of the tracked user are associated with one another, to the distribution processing unit 155.
The motion recognition unit 154 identifies that feet are located in a place in front of the shelf of the product with the product ID of “item1-1” by comparing positions indicated by the pieces of skeleton information 30A and 30B included in the frame F10 and the area A1-2 in front of the shelf. Further, the motion recognition unit 154 recognizes that users corresponding to the pieces of skeleton information 30A and 30B are carrying shopping carts 31a and 31b on the basis of an object recognition technology. The motion recognition unit 154 may use, as the object recognition technology, a technology disclosed in “S. Wang, K.-H. Yap, J. Yuan and Y. -P. Tan, “Discovering Human Interactions With Novel Objects via Zero-Shot Learning,” 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020, pp. 11649-11658, doi: 10.1109/CVPR42600.2020.01167.” or the like.
Furthermore, it is assumed that the motion recognition unit 154 analyzes transition of the pieces of skeleton information 30A and 30B through the posture determination as described above, and recognizes that "the user looks at the area A1-1 of the shelf for 5 consecutive seconds or more" and "not put a hand in the area A1-1 of the shelf". In this case, the motion recognition unit 154 identifies the first behavior type of "not purchased after consideration of whether to purchase a product" among the plurality of behavior types that define transition of a process of a behavior of the user until purchase of a product in the behavior rule DB 50. The motion recognition unit 154 identifies the product ID of "item1-1" that corresponds to the first behavior type of "not purchased after consideration of whether to purchase a product".
Meanwhile, the process performed by the motion recognition unit 154 to identify the first behavior type that is led by the behaviors of the tracked user is not limited to the process as described above. For example, the motion recognition unit 154 may identify the first behavior type that is led by the behaviors of the tracked user on the basis of a purchase psychological process or the like (to be described later).
In the example illustrated in
Specifically, the purchase behavior process transitions to a first behavior process indicating an attention (Attention), a second behavior process indicating an interest (Interest), a third behavior process indicating a desire (Desire), a fourth behavior process indicating comparison (Compare), and a fifth behavior process indicating an action (Action) in this order.
If the user A is in the first behavior process among the plurality of behavior types that define transition of a process of a behavior, the motion recognition unit 154 determines whether the user A performs a behavior (for example, extension of hand to a product) corresponding to the second behavior process to which the first behavior process transitions. If it is determined that the behavior corresponding to the second behavior process is performed, the motion recognition unit 154 determines that the user A has transitioned to the second behavior process. For example, when the user A stays in the floor, the user A is in the state of “Attention”. At this time, if “extension of hand to a product” associated with “Interest” that is a transition destination of “Attention” is detected as the behavior of the user A, the state transitions to the state of “Interest”.
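The transition of the purchase behavior process described above may be sketched, for example, as a state machine as follows. Meanwhile, the association between behaviors and transition destinations below is an illustrative assumption; only "extension of hand to a product" for "Interest" appears in the description above, and the other trigger behaviors are hypothetical examples.

```python
# Illustrative state machine for the purchase behavior process:
# Attention -> Interest -> Desire -> Compare -> Action.

PROCESS_ORDER = ["Attention", "Interest", "Desire", "Compare", "Action"]

# Hypothetical behaviors that trigger the transition to each next process.
TRIGGER = {
    "Interest": "extension of hand to a product",
    "Desire": "pick up a product",
    "Compare": "extend a hand to a plurality of products",
    "Action": "put a product into a cart",
}

def advance(current, observed_behavior):
    # Transition to the next process only when the behavior associated
    # with that next process is observed; otherwise stay in the current one.
    idx = PROCESS_ORDER.index(current)
    if idx + 1 < len(PROCESS_ORDER):
        nxt = PROCESS_ORDER[idx + 1]
        if observed_behavior == TRIGGER[nxt]:
            return nxt
    return current
```

For example, a user in the state of "Attention" transitions to "Interest" when "extension of hand to a product" is detected, and otherwise remains in "Attention".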
For example, the motion recognition unit 154 performs the process as described above until the users A and B reach the cashier area 5, and identifies the first behavior type corresponding to each of the users.
Referring back to explanation of
Here, it is assumed that the behavior type with the predetermined level or higher is a behavior type corresponding to an interest, a desire, or a comparison. For example, examples of the behavior type with the predetermined level or higher include “not purchased after consideration of whether to purchase a product”, “stay in front of a product”, and “extend a hand to a plurality of products”.
For example, the distribution processing unit 155 performs one of a first purchase history acquisition process and a second purchase history acquisition process, and acquires the purchase history information on the user.
The first purchase history acquisition process will be described below. The distribution processing unit 155 acquires the personal feature information on the target user from the person DB 145 on the basis of the user ID that is set in the motion recognition result. The distribution processing unit 155 identifies a user with personal feature information that is similar to the acquired personal feature information from among users who appear in the second video information in the second video buffer 142. The distribution processing unit 155 identifies a cash register that is located near the identified user in the second video information, and acquires the purchase history information on the user from the identified cash register. Meanwhile, it is assumed that the area of the cash register included in the second video information and identification information on the cash register are set in advance in the storage unit 140.
The second purchase history acquisition process will be described below. The distribution processing unit 155 acquires the personal feature information on the target user from the person DB 145 on the basis of the user ID that is set in the motion recognition result. The distribution processing unit 155 identifies a user with personal feature information that is similar to the acquired personal feature information from among users who appear in the second video information in the second video buffer 142. After identifying the similar user from among the users who appear in the second video information, the distribution processing unit 155 analyzes a video of a shopping cart that is located near the identified user. The distribution processing unit 155 inputs the video information on the shopping cart of the identified user into a trained learning model that recognizes a product, and identifies identification information on a purchase target product (for example, one or more products). The distribution processing unit 155 generates the identified identification information on the product as the purchase history information.
Here, when distributing a coupon, the distribution processing unit 155 transmits coupon information on the product corresponding to the first behavior type to the cash register that is used by the user, and issues a coupon to the user. For example, the storage unit 140 stores therein coupon information for each product ID, and the distribution processing unit 155 acquires the coupon information on the product corresponding to the first behavior type from the storage unit 140. For example, the distribution processing unit 155 may acquire the purchase history information as described above from the cash register at the time when an order confirmation button or the like on the cash register is pressed, and issue a coupon from the cash register at the time when a receipt is issued to the user.
Furthermore, the distribution processing unit 155 may distribute information on the coupon in cooperation with an application that is set in a terminal device of the target user. For example, the user inputs application registration information at the time of payment. The application registration information includes an address of a server as a service provider of the application used by the user, the identification information on the user, and the like. The distribution processing unit 155 transmits the identification information on the user and the coupon information on the product to the address that is set in the application registration information, and requests issuance of the coupon.
One example of the flow of the process performed by the information processing apparatus according to the present embodiment will be described below.
The tracking unit 152 of the information processing apparatus 100 extracts a user from each of frames of the video information, and tracks the user (Step S102). The skeleton detection unit 153 of the information processing apparatus 100 detects the skeleton information on the user (Step S103).
The motion recognition unit 154 of the information processing apparatus 100 identifies each of behaviors of the user on the basis of the skeleton information in each of the frames (Step S104). The motion recognition unit 154 identifies the first behavior type on the basis of each of the behaviors of the user and the behavior rule DB 50 (Step S105).
The distribution processing unit 155 of the information processing apparatus 100 performs the distribution process (Step S106).
One example of the distribution process described at Step S106 in
If the detected user has not started payment at the cash register (Step S202, No), the distribution processing unit 155 goes to Step S201. In contrast, if the detected user has started payment at the cash register (Step S202, Yes), the distribution processing unit 155 goes to Step S203.
The distribution processing unit 155 acquires the purchase history information from the cash register that is used by the user for the payment (Step S203). The distribution processing unit 155 determines whether the first behavior type is a behavior type with a predetermined level or higher (Step S204).
If the first behavior type is not the behavior type with the predetermined level or higher (Step S205, No), the distribution processing unit 155 terminates the distribution process. If the first behavior type is the behavior type with the predetermined level or higher (Step S205, Yes), the distribution processing unit 155 goes to Step S206.
The distribution processing unit 155 determines whether the product corresponding to the first behavior type is included in the purchase history information (Step S206). If the product corresponding to the first behavior type is included in the purchase history information (Step S207, Yes), the distribution processing unit 155 terminates the distribution process.
If the product corresponding to the first behavior type is not included in the purchase history information (Step S207, No), the distribution processing unit 155 distributes a coupon for the product corresponding to the first behavior type (Step S208). Meanwhile, the information to be distributed is not limited to the coupon for the product corresponding to the first behavior type, but may be an advertisement of the product.
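The determinations at Steps S204 to S208 described above may be sketched, for example, as follows. Meanwhile, the function names and the set of behavior types below are illustrative; the behavior types with the predetermined level or higher follow the examples given earlier in the description.

```python
# Illustrative sketch of the distribution decision (Steps S204 to S208).

HIGH_LEVEL_TYPES = {
    "not purchased after consideration of whether to purchase a product",
    "stay in front of a product",
    "extend a hand to a plurality of products",
}

def decide_distribution(first_behavior_type, product_id, purchase_history):
    # Distribute only when the first behavior type is at the
    # predetermined level or higher (Step S204/S205) AND the product
    # is not included in the purchase history (Step S206/S207).
    if first_behavior_type not in HIGH_LEVEL_TYPES:
        return None  # terminate the distribution process
    if product_id in purchase_history:
        return None  # already purchased; terminate
    # Step S208: distribute a coupon (or an advertisement) for the product.
    return {"product_id": product_id, "content": "coupon"}
```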
Effects achieved by the information processing apparatus 100 according to the present embodiment will be described below. The information processing apparatus 100 identifies the first behavior type of the user on the basis of the first video information, and distributes a coupon if the first behavior type is the behavior type with the predetermined level or higher and a product corresponding to the first behavior type has not been purchased. With this configuration, it is possible to provide a coupon for a product that attracts the interest of the user and that has not yet been purchased by the user.
The information processing apparatus 100 analyzes the second video information received from the camera 20 installed in the cashier area 5, identifies a product that is purchased by the user, and generates the purchase history information. With this configuration, the information processing apparatus 100 is able to determine whether the product corresponding to the first behavior type has been purchased without receiving the purchase history information from the cash register. Meanwhile, as described above, the information processing apparatus 100 may directly receive the purchase history information from the cash register and perform processes.
The information processing apparatus 100 identifies the personal feature information on the user and the product indicating the first behavior type of the user from the first video information. The information processing apparatus 100 identifies the user with similar personal feature information from the second video information by using the personal feature information on the user identified from the first video information, and identifies a cash register that is used by the identified user for payment. With this configuration, it is possible to issue a coupon for the product indicating the first behavior type from the cash register to the user.
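The matching of personal feature information between the first video information and the second video information may be illustrated as follows. The embodiment does not specify the similarity metric; cosine similarity between feature vectors, the 0.9 threshold, and all names here are assumptions for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_user(query_feature, cashier_area_features, threshold=0.9):
    """Return the ID of the user in the cashier area whose personal
    feature vector is most similar to that of the tracked user,
    provided the similarity exceeds the threshold."""
    best_id, best_sim = None, threshold
    for user_id, feature in cashier_area_features.items():
        sim = cosine_similarity(query_feature, feature)
        if sim > best_sim:
            best_id, best_sim = user_id, sim
    return best_id
```

Once a user in the cashier area is matched in this way, the cash register used by that user can be identified from the second video information.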
The information processing apparatus 100 assigns the first behavior type to the user on the basis of the behavior rule DB in which a combination of behaviors and a behavior type are associated, so that it is possible to reduce a processing load on the information processing apparatus 100.
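A behavior rule DB of the kind described above may be sketched as a simple lookup in which each rule associates a combination of elementary behaviors with a behavior type, so that assigning the first behavior type is a subset test rather than a heavier inference step. The rule contents below are assumptions for illustration, not rules taken from the embodiment.

```python
# Each entry: (combination of elementary behaviors, behavior type).
# The combinations and type names are illustrative assumptions.
BEHAVIOR_RULE_DB = [
    ({"stop_in_front", "look_at_product"}, "interest"),
    ({"stop_in_front", "look_at_product", "pick_up"}, "consideration"),
    ({"stop_in_front", "look_at_product", "pick_up", "put_in_basket"},
     "purchase_intent"),
]

def assign_behavior_type(observed_behaviors):
    """Return the behavior type of the most specific rule whose
    behavior combination is contained in the observed behaviors."""
    matched = None
    for combination, behavior_type in BEHAVIOR_RULE_DB:
        if combination <= observed_behaviors and (
                matched is None or len(combination) > matched[0]):
            matched = (len(combination), behavior_type)
    return matched[1] if matched else None
```

Because the assignment reduces to set-containment checks against a fixed table, the processing load stays small compared with inferring a behavior type from raw motions each time.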
Meanwhile, if a plurality of products correspond to the first behavior type and none of the products has been purchased, the distribution processing unit 155 described above may preferentially distribute a coupon for the most expensive product among the plurality of products. With this configuration, it is possible to distribute a coupon for a product that efficiently increases sales.
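The preferential rule described above may be sketched as selecting the most expensive unpurchased candidate. The price data and names are assumptions for illustration.

```python
def select_coupon_product(candidate_products, purchased, prices):
    """Among the products corresponding to the first behavior type,
    return the most expensive one not yet purchased, or None if
    every candidate has already been purchased."""
    unpurchased = [p for p in candidate_products if p not in purchased]
    if not unpurchased:
        return None
    # Prefer the most expensive product to efficiently increase sales.
    return max(unpurchased, key=lambda p: prices[p])
```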
One example of a hardware configuration of a computer that implements the same functions as those of the information processing apparatus 100 of the above-described embodiment will be described below.
As illustrated in the corresponding drawing, the hard disk device 307 stores a reception program 307a, a tracking program 307b, a skeleton detection program 307c, a motion recognition program 307d, and a distribution processing program 307e. Further, the CPU 301 reads each of the programs 307a to 307e and loads the programs 307a to 307e onto the RAM 306.
The reception program 307a functions as a reception process 306a. The tracking program 307b functions as a tracking process 306b. The skeleton detection program 307c functions as a skeleton detection process 306c. The motion recognition program 307d functions as a motion recognition process 306d. The distribution processing program 307e functions as a distribution processing process 306e.
A process of the reception process 306a corresponds to the process performed by the receiving unit 151. A process of the tracking process 306b corresponds to the process performed by the tracking unit 152. A process of the skeleton detection process 306c corresponds to the process performed by the skeleton detection unit 153. A process of the motion recognition process 306d corresponds to the process performed by the motion recognition unit 154. A process of the distribution processing process 306e corresponds to the process performed by the distribution processing unit 155.
Meanwhile, each of the programs 307a to 307e need not always be stored in the hard disk device 307 from the beginning. For example, each of the programs may be stored in a "portable physical medium", such as a flexible disk (FD), a compact disk-ROM (CD-ROM), a digital versatile disk (DVD), a magneto optical disk, or an integrated circuit (IC) card, which is inserted into the computer 300. Further, the computer 300 may read and execute each of the programs 307a to 307e from the portable physical medium.
It is possible to provide information on a product that attracts the interest of a user and that has not yet been purchased.
All examples and conditional language recited herein are intended for pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment of the present invention has been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
2022-025131 | Feb 2022 | JP | national |