The present invention relates to an exhibition apparatus that displays a real object such as a product and displays an image related to the real object, a display control apparatus that controls a display image thereof, and an exhibition system that includes the exhibition apparatus and an information processing device.
This application claims priority based on JP 2015-162640 filed in Japan on Aug. 20, 2015, the content of which is incorporated herein by reference.
In recent years, instead of a sales form in which a real object such as a product is displayed on a shelf and presented to a purchaser, a sales form in which an image of a real object such as a product is displayed on a display device and exhibited to a purchaser has been adopted at marketing sites. Such a sales form is called a virtual store. A seller who sells a product in a virtual store displays, for example, a product image and a QR code® associated with the product on a display provided in a vending machine. A user (purchaser) reads the QR code® of a desired product with a smartphone or the like and purchases the product. Using the virtual store, the seller can exhibit a large number of products in a limited space. Users, in turn, have the advantage of being able to purchase products easily.
Various technologies have been developed for the exhibition, display, and advertisement of products. PTL 1 discloses a product selection support device that changes the display form of an image of a product based on the gaze of a user (purchaser) on the image of the product displayed on the display of a vending machine. For example, the product selection support device detects the gaze of the user on the image of the product and calculates the user's attention degree. The product selection support device provides a display image suitable for the user by emphasizing the image of the product to which the user pays a high degree of attention.
PTL 2 discloses a video display device which detects, on a display screen having a main area and a sub-area, a video in which a plurality of viewers are interested and switches the video from the sub-area to the main area. PTL 3 discloses an advertisement providing device that provides the customer with useful information about a product exhibited on a shelf. PTL 4 discloses a product exhibition shelf which can be remodeled into various configurations and can enhance the production effect of the exhibition of the product.
The technology of PTL 1 cannot effectively utilize the display area in which images of products and the like are displayed. For example, in a general virtual store, the images of the products are exhibited in fixed positions on the display, and the exhibition volume of the products cannot be changed according to attribute information about the user (purchaser) or according to the individual user. In PTL 2, it is necessary to calculate a display desire degree representing how strongly a viewer wants a specific video displayed, and to swap videos between the main area and the sub-area so that the video with the highest display desire degree is displayed in the main area. However, it is impossible to estimate the product in which the user (purchaser) is interested and change its display mode.
The present invention has been made in view of the above problems, and it is an object of the present invention to provide an exhibition apparatus, a display control apparatus, and an exhibition system that can dynamically change the display mode of a real object such as a product.
A first aspect of the present invention is an exhibition apparatus that includes: an exhibition area exhibiting a real object; a display area corresponding to the real object; and a control unit changing a display mode of the display area based on at least one of real object information about the real object and moving object information about a moving object.
A second aspect of the present invention is a display control apparatus to be applied to an exhibition apparatus including an exhibition area exhibiting a real object, and a display area corresponding to the real object. The display control apparatus includes a control unit changing a display mode of the display area based on at least one of real object information about the real object and moving object information about a moving object.
A third aspect of the present invention is an exhibition system that includes: an exhibition apparatus including an exhibition area exhibiting a real object, and a display area corresponding to the real object; and a display control apparatus changing a display mode of the display area based on at least one of real object information about the real object and moving object information about a moving object.
A fourth aspect of the present invention is an exhibition system. The exhibition system includes an exhibition apparatus and an information processing device. The exhibition apparatus includes an exhibition area exhibiting a real object, a display area corresponding to the real object, and a control unit changing a display mode of the display area based on at least one of real object information about the real object and moving object information about a moving object. The information processing device includes an interest estimation unit estimating whether the moving object is interested in the real object exhibited in the exhibition area, and the control unit changes a display mode of the display area corresponding to an image of the real object in which the interest estimation unit estimates the moving object is interested.
A fifth aspect of the present invention is a display control method to be applied to an exhibition apparatus including an exhibition area and a display area. The method includes displaying, in the display area, an image corresponding to a real object exhibited in the exhibition area; and changing a display mode of the display area based on at least one of real object information about the real object and moving object information about a moving object.
A sixth aspect of the present invention is a program executed by a computer of an exhibition apparatus including an exhibition area and a display area. The program causes the computer to display, in the display area, an image corresponding to a real object exhibited in the exhibition area, and to change a display mode of the display area based on at least one of real object information about the real object and moving object information about a moving object.
According to the present invention, when a product is exhibited in a store or the like and a product image or product information is displayed on a display, it is possible to draw the attention of the user and attract the user's interest. For example, the display mode of the display area corresponding to the product image can be changed when the user approaches the exhibition apparatus, or when the user picks up the product exhibited in the exhibition apparatus.
An exhibition apparatus, a display control apparatus, and an exhibition system according to the present invention will be explained through embodiments with reference to the attached drawings.
An exhibition apparatus 30 according to the first embodiment of the present invention will be described with reference to
First, the control unit 33 causes a display corresponding to the real object to be presented (step S1). The real object is an article to be exhibited in the exhibition area 31, such as a product or an exhibit. Real objects other than articles may be, for example, posters and signs. Causing a "display corresponding to the real object" means, for example, displaying on a display an image in which one or more products exhibited in the exhibition area 31 are arranged. Alternatively, information about the product (a product description, a product introduction, a commercial, and the like) may be displayed. The control unit 33 obtains the image data corresponding to the real object and outputs the image data to the display or a similar device.
Next, the control unit 33 changes the display mode of the display area 320 based on at least one of the real object information and the moving object information (step S2). The real object information is, for example, an attribute such as the type, size, shape, or smell of a product exhibited in the exhibition area 31. The moving object information is, for example, an attribute such as the age and sex of a person viewing the display area 320, a facial image, the movement of the person, the distance between the person and the exhibition apparatus 30, and the like. The moving object is a person related to a real object, a person detected by a sensor in relation to a real object, or the like. The moving object is not limited to a person; it may also be a robot, an animal, an unmanned aerial vehicle, or the like. The control unit 33 changes the display mode of the display area 320 based on at least one of the real object information and the moving object information. Here, "changing the display mode" means, for example, enlarging the display area 320. Alternatively, the color displayed in the display area 320 may be changed, the brightness may be changed, or an image such as that of the displayed product may be enlarged or reduced. Still alternatively, changes in the color, the brightness, and the size of the image may be combined appropriately.
For example, when changing the display mode of the display area 320 based on the real object information, the control unit 33 enlarges the product image in the display area 320 corresponding to that image on the display, based on the fact that a small product is exhibited in the exhibition area 31. When changing the display mode of the display area 320 based on the moving object information, the control unit 33 enlarges the display area 320 corresponding to the product image when it is estimated that the person viewing the display area 320 is interested in the product exhibited in the exhibition area 31. The control unit 33 includes a function for performing control to change the display mode of the display area 320 based on the real object information or the moving object information, and a function for outputting the image generated based on the real object information or the moving object information. As described later in the fourth embodiment, the control unit 33 may, for example, have a function to determine whether to change the display area 320, such as enlarging the product image in the display area 320 corresponding to that image based on the exhibition of a small product in the exhibition area 31.
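The control performed by the control unit 33 in step S2 can be sketched as a simple rule set. The following is an illustrative sketch only, not the specification's implementation; the class `DisplayArea`, the function `decide_display_mode`, and all threshold values are hypothetical names and values chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class DisplayArea:
    product_id: str
    scale: float = 1.0       # 1.0 = normal size
    brightness: float = 1.0  # 1.0 = normal brightness

def decide_display_mode(area, product_size_cm=None, viewer_distance_m=None,
                        viewer_interested=False):
    """Change the display mode based on real object and moving object info."""
    # Real object information: enlarge the image of a small product.
    if product_size_cm is not None and product_size_cm < 10:
        area.scale = max(area.scale, 1.5)
    # Moving object information: emphasize the area when the viewer is
    # estimated to be interested in the product.
    if viewer_interested:
        area.scale = max(area.scale, 2.0)
    # Moving object information: raise brightness when the viewer is near.
    if viewer_distance_m is not None and viewer_distance_m < 1.5:
        area.brightness = 1.3
    return area
```

As in the embodiment, changes in size, brightness, and the like can be combined appropriately within one rule set.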
In the above description, the display mode change processing has been described based on the assumption that the exhibition apparatus 30 has the control unit 33 in accordance with the minimum configuration shown in
The edge terminal device 20 is an information processing device installed in the store utilizing the exhibition apparatus 30. The edge terminal device 20 generates a product exhibition image to be displayed on the exhibition apparatus 30 based on the image detected by the store video sensor 10 and the information analyzed by the server terminal device 40. Here, the product exhibition image includes the entire area of the image displayed by the output unit 32. The edge terminal device 20 includes a video input unit 21, a meta-data conversion unit 22, a meta-data transmission unit 23, a market data reception unit 24, an interest estimation unit 25, an output instruction unit 26, an input information reception unit 27, a data output unit 28, and a storage unit 29.
The edge terminal device 20 is a personal computer (PC) having a small box-shaped casing, and can be equipped with additional modules (for example, an image processing module, an analysis module, a target specification module, an estimation module, and the like) having various functions. The functions of the meta-data conversion unit 22 and the interest estimation unit 25 are realized by such added modules. The edge terminal device 20 can communicate with other devices using various communication means. Such communication means include, for example, wired communication via a LAN (Local Area Network) cable or optical fiber, wireless communication based on a communication method such as Wi-Fi (Wireless Fidelity), communication using a carrier network with a SIM (Subscriber Identity Module) card equipped therein, and the like. When images obtained by photographing the situation of a store and the like, together with information detected by sensors and the like, are transmitted to a server terminal device installed in a data center or the like, there is a possibility that the network is overloaded or that information is leaked. Therefore, usually, the edge terminal device 20 is installed on the store side provided with cameras and sensors, so that image processing and analysis are performed on the images, whereby the images are converted into meta-data, and the meta-data is transmitted to the server terminal device.
The video input unit 21 inputs the image photographed by the store video sensor 10. The store video sensor 10 is provided with a two dimensional camera 11 and a three dimensional camera 12. The meta-data conversion unit 22 converts the image input by the video input unit 21 into meta-data. For example, the meta-data conversion unit 22 analyzes the image captured by the two dimensional camera 11 and sends the attribute data of the person included in the image to the meta-data transmission unit 23. The attribute data is, for example, the age and sex of the person. In addition, the meta-data conversion unit 22 analyzes the image captured by the two dimensional camera 11 and specifies the person included in the image. In the storage unit 29, facial images and the like of users who frequently visit the store are registered in advance, and the image input by the video input unit 21 is collated with the facial images registered in advance, so that the user appearing in the input image is specified. The meta-data conversion unit 22 sends the individual data (for example, a user ID, and the like) of the specified user to the meta-data transmission unit 23. The meta-data conversion unit 22 also converts the image photographed by the three dimensional camera 12 into purchase behavior data. The three dimensional camera 12 is attached at a position from which it can photograph, from the ceiling side of the store, an action of the user in front of the shelf (hereinafter referred to as a shelf front action). From the three dimensional image captured by the three dimensional camera 12, the distance between the three dimensional camera 12 and the subject can be obtained.
For example, if the three dimensional image contains an image of the user taking a product from a shelf, it is possible to measure the distance between the three dimensional camera 12 and the position where the user reaches out to take the product, and thus to determine from which stage of the product shelf the user took the product. The meta-data conversion unit 22 specifies, from the three dimensional image, the product on the shelf for which the user reaches out and the number of products, and sends the specified data to the meta-data transmission unit 23 as purchase behavior data.
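The determination of the shelf stage from the measured distance can be sketched as a lookup against depth bands. This is an illustrative sketch only; the stage boundaries in `STAGE_DEPTHS_M` are hypothetical values, since the actual camera mounting height and shelf geometry are not given in the specification.

```python
# Distance (in metres) from the ceiling-mounted three dimensional camera to
# the lower boundary of each shelf stage, ordered from the top stage down.
# These boundary values are assumed for illustration.
STAGE_DEPTHS_M = [(1, 0.8), (2, 1.2), (3, 1.6), (4, 2.0)]

def shelf_stage_for_depth(hand_depth_m):
    """Return the shelf stage whose depth band contains the measured
    distance to the user's outstretched hand, or None if out of range."""
    for stage, boundary in STAGE_DEPTHS_M:
        if hand_depth_m <= boundary:
            return stage
    return None  # hand below the lowest stage (e.g. near the floor)
```

Combined with the product layout of the shelf, the returned stage identifies which product the user took.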
The meta-data transmission unit 23 transmits the meta-data sent from the meta-data conversion unit 22 to the server terminal device 40. The market data reception unit 24 receives market data from the server terminal device 40. The market data is information indicating the tendency of purchase behavior corresponding to attribute information about the user, the purchase behavior history of products for an individual user, and the like. Based on the market data received by the market data reception unit 24, the interest estimation unit 25 estimates the products in which the user class indicated by the attribute information is interested. The interest estimation unit 25 also estimates the products in which the individual user specified by the meta-data conversion unit 22 is interested.
The output instruction unit 26 transmits instruction information to the exhibition apparatus 30 so as to perform volume-display of the product estimated by the interest estimation unit 25. Volume-display is enlarging the display area corresponding to a product exhibited on the shelf. For example, when the output unit 32 of the exhibition apparatus 30 is constituted by a plurality of displays, the display area corresponding to one product corresponds to one display in the normal display mode. When the display mode is volume-display, the display area corresponding to the volume-displayed product may be enlarged to span a plurality of displays. The input information reception unit 27 receives information including the user's selection operation for a product exhibited in the exhibition apparatus 30, and accepts the selection operation. The data output unit 28 transmits the information about the product selected by the user, as accepted by the input information reception unit 27, to the store terminal device 50. The storage unit 29 stores various kinds of information such as users' facial images, images of products, and the like.
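The allocation of physical displays during volume-display can be sketched as follows. This is an illustrative sketch, not the specification's method: the function `allocate_displays` and its one-display-per-product baseline policy are assumptions for illustration.

```python
def allocate_displays(products, volume_product, total_displays):
    """Map each exhibited product to a number of displays.

    In the normal mode each product occupies one display; the
    volume-displayed product absorbs the displays left over after
    every other product has received its one display.
    """
    if volume_product not in products:
        raise ValueError("volume-displayed product must be exhibited")
    if total_displays < len(products):
        raise ValueError("need at least one display per product")
    allocation = {p: 1 for p in products}
    allocation[volume_product] += total_displays - len(products)
    return allocation
```

For instance, with four products on six displays, the volume-displayed product spans three displays while the others keep one each.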
The exhibition apparatus 30 includes an exhibition area 31 for exhibiting the product and an output unit 32 for displaying the image of the product. Multiple exhibition areas 31 may be provided for one exhibition apparatus 30. The output unit 32 has a display area 320 corresponding to the image of the product exhibited in the exhibition area 31. For example, the output unit 32 is a display capable of three dimensional display. Alternatively, it may be a projector that projects an image onto the wall surface on which the exhibition apparatus 30 is installed. In the case where multiple exhibition areas 31 are provided in the exhibition apparatus 30, the output unit 32 displays an image including display areas corresponding to each of the products exhibited in the different exhibition areas 31. For example, when four kinds of products are exhibited in the exhibition area 31, four display areas are displayed, one corresponding to each product. The output unit 32 is not limited to an image display device. The output unit 32 may include a device for emitting an odor related to a product exhibited in the exhibition area 31, or an ultrasonic haptic device that provides the user with haptic information such as the hardness or softness of the product and a haptic experience such as the operation feeling that the user would have when operating the product.
In the exhibition apparatus 30, the control unit 33 displays the product exhibition image received from the output instruction unit 26 on the output unit 32. For example, assuming that products A, B, C, and D are exhibited in the exhibition area 31, when a product exhibition image in which product A is volume-displayed is received from the output instruction unit 26, the control unit 33 performs control to display the product exhibition image on the output unit 32. In this case, for example, the product exhibition image is obtained by enlarging the display area corresponding to product A and displaying a larger number of products A than were displayed in the display area before the enlargement. This makes the volume-displayed products A on the output unit 32 easier for the user to see, so that the user's attention is drawn to product A. As will be described later, volume-display can be realized in various modes.
The exhibition apparatus 30 further includes a communication unit 34 and an input accepting unit 35. The communication unit 34 communicates with other devices such as the edge terminal device 20. The input accepting unit 35 receives the selection operation of a product from the user. For example, the output unit 32 is a display in which a touch panel and a liquid crystal display unit are combined in an integrated manner, and a product selection button is displayed on the output unit 32. In this case, the input accepting unit 35 accepts the operation of the button selected by the user. Alternatively, the input accepting unit 35 may acquire information about the product selected by the user operating a smartphone via a network such as a carrier network, and accept the selection information about the product. The input accepting unit 35 transmits the selection information about the product to the edge terminal device 20.
The server terminal device 40 is installed in, for example, a data center. The server terminal device 40 accumulates information such as the purchase histories of products by many consumers. The server terminal device 40 also has a big data analysis unit 41. The big data analysis unit 41 performs marketing analysis such as accumulating information including the images received from the edge terminal device 20, the products purchased by users, the products searched for by users, and the like, and identifying the products that sell well for each age group and sex.
The store terminal device 50 runs, for example, an inventory management system, a product ordering system, a POS system, and the like. The store terminal device 50 is, for example, a personal computer (PC) 51 or a smart device 52 such as a tablet terminal. When the store terminal device 50 receives information about the product selected by the user from the data output unit 28 of the edge terminal device 20, the store terminal device 50 generates information instructing the store clerk to move a predetermined number of products from the inventory to the shelf, and displays that information on the display provided in the store terminal device 50. It should be noted that a plurality of exhibition apparatuses 30 may be connected to one edge terminal device 20.
Meanwhile, regardless of the use of the products and the like, for example, new products, well-selling products, or products of which general consumers' awareness is low but whose sales can be expected to increase in the future are exhibited in the exhibition apparatuses 30A to 30D. On the display 32A, a product exhibition image showing a state in which those products are exhibited is displayed.
Next, a product exhibition image displayed on the display 32A of the exhibition apparatus 30A will be described. The exhibition apparatus 30A can exhibit one or more types of products. The product exhibition image displayed on the display 32A includes, for each of the exhibited products, a display area displaying the image of that product. In each display area, an image showing the product exhibited side by side is displayed. In other words, instead of exhibiting the actual product, the manner in which the product is exhibited can be expressed by displaying the exhibition image of the product. With this method of displaying images in which products are arranged instead of exhibiting actual products, a large number of types of products can be shown with images, so that product exhibition space can be saved and the labor of actual product exhibition work can be reduced. In addition, the exhibition apparatus 30 according to the present embodiment can increase the user's attention to and interest in the products by changing the display mode of the display area of a product. In the present embodiment, the exhibition apparatus 30 is installed in the vicinity of the entrances 140A, 140B of the floor 100 through which the user passes before he or she searches the shelves 130 for a desired product, and the display mode of the display area of the product is controlled, so that it is possible to improve awareness of and purchase motivation for products other than the product which the user is looking for.
The exhibition apparatus 30 controls the display mode of the display area of a product based on the past purchase behavior history for each time zone, which records the age and gender of the users who come to the store and what kinds of products they purchased. For example, the exhibition apparatus 30 enlarges (volume-displays) the display area of the products which are most likely to be purchased by the user class that is likely to visit the store during the current time zone. The exhibition apparatus 30 displays a larger number of images of that product in the enlarged display area than were displayed before enlargement. By doing so, an increase in purchase motivation, interest, and awareness of the product can be expected among the user class most likely to visit the store in that specific time zone. Furthermore, the exhibition apparatus 30 may enlarge the display area of the product that is most likely to be purchased by users who come to the store under a specific environment according to the season, the day of the week, the weather, or the like.
According to the past purchase behavior history of the user who is included in the image photographed by the store video sensor 10 and specified by the meta-data conversion unit 22, the exhibition apparatus 30 enlarges (volume-displays) the display area of the product that the user is most likely to purchase. Furthermore, the exhibition apparatus 30 displays a larger number of images of the product in the enlarged display area than were displayed before enlargement. As a result, an increase in purchase motivation for products frequently purchased by such users (such as regular customers) can be expected.
Next, an outline of the processing of the exhibition system 1 required to control the display area of the exhibition apparatus 30 will be described. First, the two dimensional camera 11A and the three dimensional camera 12A transmit the captured video to the edge terminal device 20. In the edge terminal device 20, the video input unit 21 receives the video and sends it to the meta-data conversion unit 22. The meta-data conversion unit 22 extracts, from the video taken by the two dimensional camera 11A, the attribute data (age, gender, and the like) of the user appearing in the video and individual data (identification information of a regular customer specified by collating the facial image, and the like). In addition, the meta-data conversion unit 22 sends the purchase behavior data of the user appearing in the video (the product the user took from the shelf, and the like) obtained from the three dimensional camera 12A. The meta-data transmission unit 23 transmits the information to the server terminal device 40. In the server terminal device 40, the big data analysis unit 41 accumulates the information received from the meta-data transmission unit 23. In addition, the big data analysis unit 41 transmits to the edge terminal device 20 the purchase behavior history, product search history, and the like corresponding to the attribute data and individual data about the user received from the meta-data transmission unit 23. The purchase behavior history is, for example, information about products purchased by the user class corresponding to the age group and gender indicated by the attribute data, which is sent when the attribute data of the user is received from the meta-data transmission unit 23. The product search history is information about the products that the user class searches for through the Internet or the like.
Likewise, when individual information about the user is received from the meta-data transmission unit 23, the big data analysis unit 41 transmits the corresponding purchase behavior history and product search history to the edge terminal device 20.
In the edge terminal device 20, the market data reception unit 24 receives the purchase behavior history and the product search history of the user and sends them to the interest estimation unit 25. Based on the purchase behavior history and the product search history of the user, the interest estimation unit 25 estimates which of the products exhibited in the exhibition apparatus 30A the user passing in front of the exhibition apparatus 30A is interested in. For example, when a product included in the purchase behavior history or the product search history is exhibited in the exhibition area 31 of the exhibition apparatus 30A, the interest estimation unit 25 estimates that the user is interested in that product. The interest estimation unit 25 sends to the output instruction unit 26 information relating to the product in which the user is considered to be interested. The output instruction unit 26 generates a product exhibition image in which the product estimated by the interest estimation unit 25 is volume-displayed and transmits the product exhibition image to the exhibition apparatus 30. In the exhibition apparatus 30, the control unit 33 receives the product exhibition image and displays it on the output unit 32.
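The matching rule used by the interest estimation unit 25 can be sketched as follows. This is an illustrative sketch under assumptions: the function name `estimate_interest`, the data shapes, and the optional related-products table are hypothetical, since the specification does not fix these details.

```python
def estimate_interest(exhibited, purchase_history, search_history,
                      related=None):
    """Return the exhibited products the user is estimated to be
    interested in: those appearing in the user's purchase or search
    history, or related to a product that does."""
    related = related or {}
    history = set(purchase_history) | set(search_history)
    # Expand the history with products related to those already in it.
    expanded = set(history)
    for item in history:
        expanded.update(related.get(item, ()))
    return [p for p in exhibited if p in expanded]
```

For example, a user who purchased product B and searched for product X (related to exhibited product C) would be estimated to be interested in B and C.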
Next, an example of a display mode of the display area which changes according to the interest estimation of the user estimated by the interest estimation unit 25 will be explained with reference to
The upper side diagram in
Next, it is assumed that the control unit 33 receives from the output instruction unit 26 a new product exhibition image in which product B is volume-displayed. The control unit 33 then displays the product exhibition image on the output unit 32. The lower side diagram in
Next, a specific example of processing for changing the volume-display in the display area 320 of the exhibition apparatus 30 will be described.
When it is determined that it is time to estimate the product to be subjected to volume-display, the interest estimation unit 25 acquires the current time and date information (step S11). Next, the interest estimation unit 25 refers to the storage unit 29 and reads the attribute information about the user class in which the largest number of people visit the store on the day of the week and in the time zone indicated by the time and date information (step S12). For example, the interest estimation unit 25 reads from the storage unit 29 information indicating that many men in their thirties tend to visit on the day of the week and in the time zone indicated by the time and date information. The interest estimation unit 25 sends that attribute information to the market data reception unit 24. The market data reception unit 24 requests from the server terminal device 40 market data corresponding to the attribute information. The big data analysis unit 41 transmits the purchase behavior history and the product search history of users having the attribute information to the market data reception unit 24. The market data reception unit 24 sends the market data to the interest estimation unit 25. The interest estimation unit 25 estimates the interest corresponding to the attribute information about the majority of users visiting on the current day and in the current time zone (step S13). For example, the interest estimation unit 25 extracts the products purchased by men in their thirties from the purchase behavior history of men in their thirties received from the big data analysis unit 41, and compares the extracted products with the products in the exhibition area 31 of the exhibition apparatus 30. If the exhibition apparatus 30 contains the same product as, or a product related to, a product recorded in the purchase behavior history, the interest estimation unit 25 determines that the product is one in which men in their thirties are likely to be interested.
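Steps S11 to S13 can be sketched as two lookups: from the current date and time to the dominant user class, and from that class's purchase history to the exhibited products of likely interest. This is an illustrative sketch only; the function names, the hour bands, and the visit-statistics data shape are assumptions.

```python
from datetime import datetime

def dominant_user_class(visit_stats, now):
    """S11-S12: pick the user class with the most visits for this
    (weekday, hour-band) key, e.g. ('Mon', 'evening')."""
    band = ("morning" if now.hour < 12
            else "afternoon" if now.hour < 18
            else "evening")
    classes = visit_stats.get((now.strftime("%a"), band), {})
    return max(classes, key=classes.get) if classes else None

def estimate_class_interest(user_class, purchase_history, exhibited):
    """S13: exhibited products appearing in the class's purchase history."""
    return [p for p in exhibited if p in purchase_history.get(user_class, ())]
```

The product returned here is then passed to the output instruction unit for volume-display (step S14).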
Next, the interest estimation unit 25 sends to the output instruction unit 26 the information about the product for which the interest of the user class has been estimated. The output instruction unit 26 generates a product exhibition image that volume-displays the product in which the user class expected to visit the store in large numbers during that time zone is interested, and transmits the product exhibition image to the exhibition apparatus 30. In the exhibition apparatus 30, the control unit 33 displays the product exhibition image in which the volume-display has been changed (step S14). More specifically, the control unit 33 acquires the product exhibition image via the communication unit 34 and sends it to the output unit 32. The output unit 32 displays the product exhibition image.
First, the two dimensional camera 11 continues to capture the video of the users visiting the store and sends the video to the video input unit 21. The video input unit 21 inputs the video captured by the two dimensional camera 11 (step S21). Next, the video input unit 21 sends the video to the meta-data conversion unit 22. The meta-data conversion unit 22 extracts the facial image of the user appearing in the video and compares it with the facial images of customers stored in the storage unit 29. When the verification is successful, the meta-data conversion unit 22 specifies that the visiting user is the successfully verified customer (step S22). The meta-data conversion unit 22 transmits the individual data of the specified customer to the server terminal device 40 via the meta-data transmission unit 23. In the server terminal device 40, the big data analysis unit 41 analyzes the products purchased by the customer indicated by the individual data in the past, and transmits the purchase behavior history including the product information to the market data reception unit 24. The interest estimation unit 25 acquires the past purchase behavior history of the specified customer from the market data reception unit 24 (step S23).
The interest estimation unit 25 estimates the interest of the specified customer (step S24). For example, the interest estimation unit 25 extracts the products purchased by the specified customer from the purchase behavior history and compares the purchased products with the products exhibited in the exhibition apparatus 30. When the products exhibited in the exhibition apparatus 30 include the same product as, or a product related to, a product recorded in the purchase behavior history, the interest estimation unit 25 determines that the product is a product in which the customer is interested.
Next, the interest estimation unit 25 sends to the output instruction unit 26 the information about the product in which the customer is estimated to be interested. The output instruction unit 26 generates a product exhibition image which volume-displays the product in which the customer is estimated to be interested, and transmits the product exhibition image to the exhibition apparatus 30. When there are multiple exhibition apparatuses 30, the output instruction unit 26 obtains the identification information of the two dimensional camera 11 and the like from the video input unit 21, and outputs the product exhibition image to the exhibition apparatus 30 corresponding to the two dimensional camera 11. In the exhibition apparatus 30, the communication unit 34 receives the product exhibition image, and the control unit 33 sends the product exhibition image to the output unit 32. The output unit 32 displays the product exhibition image in which the volume-display has been changed (step S25). For a regular customer who passes in front of the exhibition apparatus 30, this can be expected to raise interest in the products the customer purchased in the past and in the products related to them.
When the customer cannot be specified in step S22 of
First, the output instruction unit 26 of the edge terminal device 20 acquires the information about the product which is volume-displayed from the interest estimation unit 25 (step S31). Next, the output instruction unit 26 reads the attribute of the product which is volume-displayed from the storage unit 29. The output instruction unit 26 compares the size information included in the attribute of the product read from the storage unit 29 with a predetermined threshold value to determine whether the product is small (step S32). When the product is not small (determination result “NO” in step S32), the output instruction unit 26 enlarges the display area of the product which is volume-displayed, generates a product exhibition image in which a larger number of products than those in the display area before enlargement are arranged in the enlarged display area, and terminates this processing flow. On the other hand, when the product is small (determination result “YES” in step S32), the output instruction unit 26 enlarges the display area of the product which is volume-displayed and also generates a product exhibition image in which the images of the products arranged in the enlarged display area are displayed in an enlarged manner (step S33). The output instruction unit 26 may alternately generate, at a predetermined time interval, a product exhibition image in which the display area of the product which is volume-displayed is enlarged and a product exhibition image in which the display area of the product is not enlarged.
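The size determination and the resulting image plan in steps S32 and S33 might be sketched as follows; the size threshold and the enlargement factor of two are illustrative assumptions, not values from the specification.

```python
SMALL_SIZE_THRESHOLD_CM = 10.0  # assumed threshold separating "small" products

def build_volume_display(product_size_cm, base_count, scale=2.0):
    """Plan the enlarged display area of steps S32-S33: always arrange
    more products in the enlarged area; additionally enlarge each product
    image when the product itself is small."""
    plan = {"area_scale": scale, "item_count": int(base_count * scale)}
    # Step S33: a small product is also displayed in an enlarged manner.
    if product_size_cm < SMALL_SIZE_THRESHOLD_CM:
        plan["image_scale"] = scale
    else:
        plan["image_scale"] = 1.0
    return plan

print(build_volume_display(5.0, 4))
# prints {'area_scale': 2.0, 'item_count': 8, 'image_scale': 2.0}
```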
If a product is relatively large, the user can see the product even if the product is displayed in full size. However, in the case of a relatively small product, even if the display area is simply enlarged to generate an image in which a large number of products are arranged, there is a possibility that the user cannot see the product. However, in the processing of
It should be noted that the product exhibition image is not limited to the example shown in
The upper side diagram in
In the volume-display in the above product exhibition image, the display area is enlarged, and in the enlarged display area, a larger number of products are displayed than in the display area before enlargement, but the embodiment is not limited thereto. For example, it is possible to change the background color of the enlarged display area or the product color, or to enlarge and display the image of the product in the enlarged display area.
Alternatively, the exhibition apparatus 30 may be provided with a tank filled with the odor particles of the product to be exhibited and a means for releasing the odor particles. In this way, in addition to the volume-display of the product, the odor of the product may be released. Alternatively, a tactile sense such as the hardness, softness, and operability of the product to be volume-displayed may be presented using ultrasonic haptics technology.
According to the exhibition apparatus 30 of the present embodiment, by changing the display mode of the display area 320 according to the user's interest, it is possible to raise the user's awareness of the product and to stimulate purchase motivation. Since the actual product is exhibited in the exhibition area 31 of the exhibition apparatus 30, the volume-display allows the user who is interested in the product to pick up the product to review it.
In the present embodiment, the attribute and individual user are specified according to the image detected by the image sensor, but the embodiment is not limited thereto. The means for detecting the attribute of the user and the like is not limited to the image sensor. For example, instead of the image sensor, a reading device of an IC card possessed by the user may be placed near the exhibition apparatus 30. In that case, when the user holds the IC card over the reading device, the reading device reads the individual information about the user recorded on the IC card, and the interest estimation unit 25 estimates the customer's interest based on the individual information.
Next, an exhibition system 2 according to a second embodiment of the present invention will be explained with reference to
When the distance between the user and the exhibition apparatus 30E, estimated based on the image of the user included in the video captured by the two dimensional camera 11E, is greater than or equal to a predetermined threshold value, the exhibition apparatus 30E performs volume-display for the product E. This makes it possible to impress the product E on a user who is away from the exhibition apparatus 30E.
The upper side diagram of
First, the two dimensional camera 11E continues to take the video of the user in the store, and sends the video to the video input unit 21. The video input unit 21 sends the video taken by the two dimensional camera 11E to the distance estimation unit 251. The distance estimation unit 251 analyzes the video, and estimates the distance between the exhibition apparatus 30E provided with the two dimensional camera 11E taking the video and the user appearing in the video. For example, the distance estimation unit 251 estimates the distance between the user and the exhibition apparatus 30E based on the positional relationship between the user and the shelf 131 around the user in the video. The distance estimation unit 251 sends the estimated distance to the output instruction unit 261. The output instruction unit 261 determines whether or not the user exists within a predetermined distance from the exhibition apparatus 30E (step S41).
When it is determined that the user exists within the predetermined distance from the exhibition apparatus 30E (determination result “YES” in step S41), the output instruction unit 261 generates a product exhibition image including the image of the product E and the product explanation of the product E. The output instruction unit 261 sends the product exhibition image to the exhibition apparatus 30E. In the exhibition apparatus 30E, the control unit 33 causes the output unit 32 to display the product exhibition image including the image of the product E and the product explanation of the product E (step S42).
On the other hand, when it is determined that the user does not exist within the predetermined distance from the exhibition apparatus 30E (determination result “NO” in step S41), the output instruction unit 261 generates a product exhibition image in which the product E is volume-displayed. The output instruction unit 261 transmits the product exhibition image to the exhibition apparatus 30. In the exhibition apparatus 30E, the control unit 33 displays the product exhibition image in which the product E is volume-displayed on the output unit 32 (step S43).
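The branching in steps S41 through S43 can be sketched as follows; the value of the predetermined distance is an assumption for illustration.

```python
NEAR_THRESHOLD_M = 3.0  # assumed value of the "predetermined distance"

def select_exhibition_image(distance_m):
    """Steps S41-S43: a nearby user gets the product image with its
    explanation; a distant user gets the volume-display of product E."""
    if distance_m <= NEAR_THRESHOLD_M:
        return "image_with_explanation"  # step S42
    return "volume_display"              # step S43

print(select_exhibition_image(1.5))  # prints image_with_explanation
print(select_exhibition_image(8.0))  # prints volume_display
```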
For the second embodiment of the present invention, various modifications may be considered. For example, a transmitter of a beacon signal is distributed to the user at the entrances 141A, 141B of the floor 101. In the vicinity of the exhibition apparatus 30E, a receiver of the beacon signal is set up, and when the user approaches, the receiver receives the beacon signal and detects that the user is approaching. At this time, the receiver transmits a signal indicating that the user is approaching to the edge terminal device 201. In the edge terminal device 201, the distance estimation unit 251 receives that signal. When the distance estimation unit 251 receives the signal, the distance estimation unit 251 sends a distance at which the beacon signal can be detected to the output instruction unit 261. The output instruction unit 261 generates a product exhibition image in which the product is volume-displayed according to the distance. The exhibition apparatus 30E displays the product exhibition image.
In the present embodiment, as another method of estimating the distance between the user and the exhibition apparatus 30E, a pressure sensor may be provided on the floor of the passage from the entrances 141A, 141B to the exhibition apparatus 30E in the floor 101. When the pressure sensor detects the weight of a person, the distance between the user and the exhibition apparatus 30E may be estimated based on the installation position of the pressure sensor and the installation position of the exhibition apparatus 30E. This is not limited to the pressure sensor that is installed in the passage inside the floor 101. For example, a person detection sensor may be provided in the passage leading to the exhibition apparatus 30E within the floor 101. In this case, the distance between the user and the exhibition apparatus 30E may be estimated based on the installation position of the person detection sensor that detects a passing person and the installation position of the exhibition apparatus 30E.
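Assuming the installation positions of the floor sensors and of the exhibition apparatus 30E are known as planar coordinates, the distance estimation from a triggered sensor might be sketched as follows; the coordinates and function name are illustrative.

```python
import math

def estimate_distance(sensor_pos, apparatus_pos):
    """Estimate the user-apparatus distance from the installation position
    of the pressure sensor (or person detection sensor) that detected the
    user and the installation position of the exhibition apparatus 30E."""
    return math.dist(sensor_pos, apparatus_pos)

# A sensor 3 m along the aisle and 4 m across from the apparatus:
print(estimate_distance((0.0, 0.0), (3.0, 4.0)))  # prints 5.0
```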
Next, an exhibition system 3 according to the third embodiment of the present invention will be explained with reference to
In conjunction with the processing described above, the following processing may be executed. For example, in the store, for luxuries and the like, empty boxes may be exhibited instead of the actual products. For this reason, the exhibition apparatus 30 exhibits empty boxes of luxuries. When the user picks up an empty box, the selection product detection unit 252 analyzes the video taken by the two dimensional camera 11 and the three dimensional camera 12 and specifies the product of the empty box which the user has picked up. The selection product detection unit 252 sends the information of the specified product to the data output unit 28. The data output unit 28 transmits the information about the luxury selected by the user to the PC 51 installed at the cash register. The employee in charge of the cash register acquires the information about the luxury notified to the PC 51, and prepares the luxury at the cash register in advance. This eliminates the need to search for the luxury after the user presents the empty box at the cash register, thereby improving work efficiency and reducing the user's waiting time.
The method of specifying the product selected by the user is not limited to the above method, and other methods can be adopted. For example, the user may activate an application program (hereinafter referred to as a dedicated application) cooperating with the exhibition system 3 on the portable terminal owned by the user. In this case, the user searches for the product exhibited on the exhibition apparatus 30 within a predetermined range from the exhibition apparatus 30 using the dedicated application. Then, the dedicated application transmits the information about the product searched for by the user and the position information about the portable terminal possessed by the user to the edge terminal device 202. In the edge terminal device 202, the selection product detection unit 252 receives the information. The selection product detection unit 252 specifies, from the position information of the portable terminal, the exhibition apparatus 30 installed at the position where the user exists. The selection product detection unit 252 determines whether or not the product searched for by the user is exhibited in the specified exhibition apparatus 30. When the product searched for by the user is exhibited in the specified exhibition apparatus 30, the selection product detection unit 252 sends the identification information of the specified exhibition apparatus 30 and the information of the product searched for by the user to the output instruction unit 262. The output instruction unit 262 creates a product exhibition image in which the product searched for by the user is volume-displayed. The output instruction unit 262 transmits the product exhibition image to the exhibition apparatus 30 indicated by the identification information.
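The matching between the searched product and the exhibition apparatus near the portable terminal can be sketched as follows; the tuple layout, position coordinates, and function name are assumptions, not the actual interface of the selection product detection unit 252.

```python
import math

def find_target_apparatus(searched_product, terminal_pos, apparatuses):
    """Specify the exhibition apparatus 30 nearest to the portable terminal
    and check whether it exhibits the searched product.

    apparatuses: list of (identification, position, exhibited products).
    Returns the identification of the apparatus, or None when the searched
    product is not exhibited there."""
    app_id, _, products = min(
        apparatuses, key=lambda a: math.dist(a[1], terminal_pos))
    return app_id if searched_product in products else None

apps = [("A", (0.0, 0.0), ["tea"]), ("B", (5.0, 0.0), ["coffee"])]
print(find_target_apparatus("coffee", (4.0, 0.0), apps))  # prints B
```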
A display in which the output unit 32 is integrated with a touch panel may be adopted. In this case, in the display area 320 corresponding to the product exhibited in the exhibition area 31, a selection button is displayed together with the image of the product. When the user touches the selection button, the input accepting unit 35 transmits the information about the product selected by the user to the edge terminal device 202. In the edge terminal device 202, the input information reception unit 27 receives the product information and sends the product information to the selection product detection unit 252. The output instruction unit 262 generates a product exhibition image in which the product selected by the user is volume-displayed. The exhibition apparatus 30 displays the product exhibition image.
The following processing may be added in conjunction with the processing described above. For example, in stores that sell drugs, there are products that users cannot purchase unless they receive an explanation from the pharmacist. The exhibition apparatus 30 exhibits an empty box of such a product. A product purchase button is displayed in the display area 320 of the exhibition apparatus 30 corresponding to the product. When the user operates the product purchase button, the input accepting unit 35 transmits the information of the product corresponding to the operated product purchase button to the edge terminal device 202. In the edge terminal device 202, the input information reception unit 27 receives the product information and sends the product information to the data output unit 28. The data output unit 28 transmits the information about the product to be purchased by the user to the PC 51 installed at the cash register. The pharmacist prepares the product notified to the PC 51 at the cash register. When the user arrives at the cash register, the pharmacist explains the product. As a result, it is possible to improve the ease of shopping by assisting the purchase behavior of the user.
In the present embodiment, an acceleration sensor may be attached to the product in order to detect an operation of the user, or a weight sensor may be provided in the product exhibition surface (or product exhibition shelf) of the exhibition area 31. The acceleration sensor detects the acceleration caused when the user picks up the product, and the weight sensor detects the weight change that occurs when the user picks up the product. In this way, based on the detection results of the acceleration sensor and the weight sensor, volume-display may be performed on the display area 320 corresponding to the product the user has picked up.
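The pick-up detection from the two sensors might be sketched as follows; both threshold values are illustrative assumptions, not values from the specification.

```python
ACCEL_THRESHOLD = 0.5         # assumed acceleration change (m/s^2) for a pick-up
WEIGHT_DROP_THRESHOLD = 0.05  # assumed weight decrease (kg) on the shelf surface

def picked_up(accel_change, weight_change):
    """Infer from either sensor that the user picked up the product;
    weight_change is negative when the product leaves the shelf."""
    return (accel_change >= ACCEL_THRESHOLD
            or -weight_change >= WEIGHT_DROP_THRESHOLD)

print(picked_up(0.6, 0.0))   # prints True (acceleration sensor fired)
print(picked_up(0.0, -0.1))  # prints True (shelf weight dropped)
print(picked_up(0.0, 0.0))   # prints False
```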
Next, an exhibition system 4 according to the fourth embodiment of the present invention will be explained with reference to
The output instruction unit 260 of the edge terminal device 203 transmits instruction information including the identification information of the product to be volume-displayed to the exhibition apparatus 300. The storage unit 36 of the exhibition apparatus 300 stores the image to be displayed in the display area 320 of the output unit 32. For example, the storage unit 36 is a hard disk included in the exhibition apparatus 300, a USB memory connected to the exhibition apparatus 300, or the like. The control unit 331 has a function of reading the image from the storage unit 36 and generating a product exhibition image. It should be noted that the other configurations and functions of the exhibition system 4 are similar to those of the exhibition system 1, but it is also possible to combine the fourth embodiment with the second embodiment and the third embodiment.
As a premise, in the exhibition apparatus 300, the control unit 331 reads the image corresponding to the product exhibited in the exhibition area 31 from the storage unit 36 to generate the product exhibition image, and the output unit 32 displays the product exhibition image.
First, the interest estimation unit 25 determines that it is time to estimate the product to be subjected to volume-display and acquires the current time and date information (step S11). Next, the interest estimation unit 25 reads from the storage unit 29 the attribute information about the user class visiting the store most frequently on the day of week and in the time zone indicated by the time and date information (step S12). In addition, the interest estimation unit 25 estimates the interest indicated by the attribute information about the user class in which the largest number of users visit the store on the current day of week and in the current time zone (step S13). Next, the interest estimation unit 25 sends the information about the product in which the user is estimated to be interested to the output instruction unit 260. The output instruction unit 260 transmits the identification information of the product to the exhibition apparatus 300 (step S135). In the exhibition apparatus 300, the control unit 331 acquires the identification information of the product via the communication unit 34. The control unit 331 generates a product exhibition image in which the product corresponding to the identification information is volume-displayed, and sends the product exhibition image to the output unit 32. The output unit 32 displays the product exhibition image (step S14).
In the present embodiment, when a USB memory is used as the storage unit 36, the image of the product displayed in the product exhibition image can be easily switched. If the user classes visiting the store differ depending on, for example, the time zone, it may be desired to display different images for the same product depending on the target age. According to the present embodiment, for example, multiple USB memories that store images of products may be prepared for different target ages, and the product exhibition image can be easily switched by replacing the USB memory according to the ages of the users who visit the store most in each time zone. Even when the product to be exhibited in the exhibition area 31 is switched according to the time zone, it is possible to easily change the product exhibition image according to the product switching by using the USB memories as shown in the present embodiment.
The control unit 331 may have a function of determining whether or not to change the display area 320 based on at least one of real object information and moving object information. For example, the storage unit 29 stores information indicating the size of the product in association with the image of the product, and the control unit 331 determines to change the display mode of the display area 320 of that product in a case where the size of the product exhibited in the exhibition area 31 (the real object information) is smaller than a predetermined threshold value. Then, the control unit 331 generates a product exhibition image in which the product is volume-displayed at a predetermined time interval, and sends the product exhibition image to the output unit 32. Alternatively, an acceleration sensor is attached to the product, and the control unit 331 is configured to acquire the acceleration detected by the acceleration sensor. Then, in the case where the acceleration (the moving object information) acquired from the acceleration sensor is greater than or equal to a predetermined threshold value, the control unit 331 determines that the user has picked up the product and hence determines to volume-display the product. Thereafter, the control unit 331 generates a product exhibition image in which the product is volume-displayed.
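The decision logic of the control unit 331 described above might be sketched as follows; both threshold values and the parameter names are assumptions for illustration.

```python
def should_volume_display(size_cm=None, accel=None,
                          size_threshold_cm=10.0, accel_threshold=0.5):
    """Decide whether to change the display area 320 from real object
    information (the exhibited product's size) or moving object
    information (acceleration detected on the product)."""
    if size_cm is not None and size_cm < size_threshold_cm:
        return True  # small exhibited product: periodic volume-display
    if accel is not None and accel >= accel_threshold:
        return True  # the user is presumed to have picked up the product
    return False

print(should_volume_display(size_cm=5.0))   # prints True
print(should_volume_display(size_cm=20.0))  # prints False
print(should_volume_display(accel=1.0))     # prints True
```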
Next, the exhibition system 5 according to the fifth embodiment of the present invention will be explained with reference to
When it is determined that it is time to estimate the product to be subjected to volume-display, the interest estimation unit 25 acquires the current time and date information (step S11). Next, the interest estimation unit 25 reads from the storage unit 29 the attribute information about the user class in which the largest number of people visit the store on the day of week and in the time zone indicated by the time and date information (step S12). Thereafter, the interest estimation unit 25 estimates the interest indicated by the attribute information about the user class in which the largest number of users visit the store on the current day of week and in the current time zone (step S13). Next, the interest estimation unit 25 sends the information about the product in which the user is estimated to be interested to the output instruction unit 26. The output instruction unit 26 generates a product exhibition image that volume-displays the product in which the user class expected to visit the store most in the time zone is interested, and transmits the product exhibition image to the output control unit 263. The output control unit 263 transmits the product exhibition image to the exhibition apparatus 301 and displays the product exhibition image on the output unit 32 of the exhibition apparatus 301 (step S141). In the exhibition apparatus 301, the output unit 32 displays the product exhibition image. According to the present embodiment, since the function of the control unit 33 is moved to the edge terminal device 204, the weight of the exhibition apparatus 301 can be reduced and the transportability can be improved.
Next, the network configuration applied to exhibition systems 1 to 5 according to the first embodiment to the fifth embodiment of the present invention will be explained.
It is also possible to add a module having a part or all of the functions of the server terminal device 40 to the edge terminal device 20. For example, the edge terminal device 20 may be equipped with a big data analysis unit 41 that can analyze, for each store, only the user classes of the age groups likely to visit that store, and the server terminal device 40 may be inquired of when a user outside of the target age groups visits the store. Alternatively, it is also possible to add a module having all the functions of the server terminal device 40 to the edge terminal device 20 so as to omit the server terminal device 40. Conversely, a part of the functions of the edge terminal device 20 may also be implemented in the server terminal device 40. For example, the function of the interest estimation unit 25 of the edge terminal device 20 may be implemented in the server terminal device 40.
As explained with reference to
The control unit 250a has a function of determining whether or not to change the display area 320 based on at least one of real object information and moving object information. In addition, the control unit 250a may have a function of changing the display mode of the display area 320 based on at least one of the real object information and the moving object information. It should be noted that the above edge terminal devices 20, 201, 202, and 203 exemplify the control device 20a, and the output instruction units 26, 260, 261, and 262 exemplify the control unit 250a.
In the first embodiment to the fifth embodiment of the present invention, the situation where the product is volume-displayed with the exhibition apparatus installed in the store has been explained, but the exhibition system according to the present invention can be used in other situations as follows.
At the store, a security guard poster is exhibited in the exhibition area of the exhibition apparatus 30. For example, in the case of the first embodiment, a face photograph of a person who is suspected to have shoplifted in the past is registered in advance, and when that person visits the store, the display area corresponding to the security guard poster is volume-displayed. Alternatively, in the case of the third embodiment, when an operation is detected in which a customer picks up a product and puts the product directly into a bag in front of the shelf, the display area corresponding to the security guard poster is volume-displayed. As a result, a shoplifting prevention effect can be expected.
In a situation where, in a store, an AI (Artificial Intelligence) robot automatically collects products specified by customers, operators, and the like, the display area corresponding to the collection target product exhibited in the exhibition area of the exhibition apparatus 30 is volume-displayed. For example, in the case of the second embodiment, the product may be volume-displayed according to the distance between the AI robot and the exhibition apparatus 30. Alternatively, a product designated as a collection target may be volume-displayed. Thus, it can be expected that the recognition accuracy of the collection target product by the AI robot is improved.
In an exhibition site, the display area corresponding to the product (exhibit) exhibited in the exhibition area of the exhibition apparatus 30 may be volume-displayed, as explained in the first embodiment to the fifth embodiment. This makes it possible to showcase the exhibits according to the interests of the people present at the exhibition site.
The exhibition apparatus 30 may be installed on a farm to present a scarecrow in the exhibition area. Then, wild animals such as wild boars are detected with an image sensor, and the display area corresponding to the scarecrow is volume-displayed according to the distance between the wild animal and the exhibition apparatus 30 as in the second embodiment. For example, when the boar comes closer to the exhibition apparatus 30, the image of the scarecrow is enlarged and displayed, or many scarecrows are displayed. This can be expected to prevent wild animals such as wild boars from damaging the farm.
The exhibition apparatus 30 is placed in aisles inside and outside a building, and a sign guiding to the exit, a destination, and the like is exhibited in the exhibition area. Then, when the approach of a person is detected with a person detection sensor or the like, the display area corresponding to the sign is volume-displayed to guide the person. Accordingly, this can be expected to prevent a person passing through a complicated underground passage with few signs from getting lost.
The exhibition apparatus 30 is placed near a road where traffic accidents occur frequently, and traffic signs, posters calling for attention, and the like are exhibited in the exhibition area. When a vehicle approaches within a predetermined distance of the exhibition apparatus 30, the display area corresponding to the traffic sign or the like is volume-displayed. This can be expected to prevent the occurrence of traffic accidents.
AI robots for transporting chemicals and specimens have been introduced in hospitals. In this case, the exhibition apparatus 30 is placed in the hospital and a mark sign is exhibited in the exhibition area. Then, when it is detected that the AI robot exists within a predetermined distance from the exhibition apparatus 30, the exhibition apparatus 30 volume-displays the mark sign. As a result, the recognition accuracy of the AI robot improves, and chemicals and the like can be reliably delivered to the destination. In the application examples described above, the moving object may be any of a person (a user, a store clerk, and the like), an animal, or an object (a robot, an unmanned aerial vehicle, and the like).
In the above embodiments, the edge terminal device 20 is described as a personal computer (PC) or the like, but some or all of the functions of the edge terminal device 20, or some or all of the functions of the store video sensor 10 and the edge terminal device 20, may be provided in a robot. In other words, in the exhibition system according to the present invention, a robot can be provided instead of the edge terminal device 20. Alternatively, both the edge terminal device 20 and the robot may be included in the exhibition system according to the present invention.
The above exhibition apparatus 30 has a computer system provided therein. The above-described processing of the exhibition apparatus 30 is stored in a computer-readable medium in a program format, and the above-described processing is performed by causing the computer to read and execute the program. Here, the computer-readable recording medium is a magnetic disk, a magneto-optical disk, a CD-ROM, a DVD-ROM, a semiconductor memory, or the like. A computer program implementing the function of the present invention may be delivered to the computer via a communication line so that the computer executes the computer program. The program described above may be for realizing a part of the function of the present invention. Further, the above-described program may be a so-called difference program (difference file) that can realize the function of the present invention in combination with a program already recorded in the computer system.
Finally, the present invention is not limited to the above-described embodiments and modifications but the present invention also includes design changes and modifications within the scope of the invention as defined in the appended claims. For example, the edge terminal devices 20, 201, 202 and the server terminal device 70 exemplify an information processing device which cooperates with the exhibition apparatus in the exhibition system.
The present invention is applied to an exhibition apparatus, a display control apparatus, and an exhibition system which are installed in a store and the like to exhibit a product and display an image and product explanation about the product, but the present invention is not limited thereto. The present invention can be widely applied to facilities such as warehouses and hospitals and social life infrastructures such as roads and public facilities.
Number | Date | Country | Kind
---|---|---|---
2015-162640 | Aug 2015 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2016/074172 | 8/19/2016 | WO | 00