This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2022-052530 filed Mar. 28, 2022.
The present disclosure relates to an information processing device, an information processing method, and a non-transitory computer readable medium.
For example, Japanese Unexamined Patent Application Publication No. 2014-211769 describes an information processing device capable of editing a candidate image to be placed on the surface of an object, such as printed material to be affixed to the object. The information processing device includes a detecting means for detecting an object existing in a prescribed space, an acquiring means for acquiring shape information about the object detected by the detecting means, a projection controlling means for projecting a candidate image to be placed on the surface of the object onto the object detected by the detecting means, and a setting means for setting print settings information for causing a prescribed printer to print the candidate image.
Also, Japanese Patent No. 6911870 describes a display control device capable of executing an operation corresponding to a gesture with consideration for the usage situation and intention of a user. The display control device includes a determination unit that determines the content of a gesture from an operation performed on a display surface and an effect specifying unit that designates an effect setting the arrangement direction of display objects displayed on the display surface, the effect corresponding to the gesture determined by the determination unit, and also designates the display object(s) to be targeted by the effect. The effect specifying unit causes the display object(s) to be displayed on the basis of the placement position of a physical object placed on the display surface in the case where the operation has a prescribed positional relationship with the physical object, and causes the display object(s) to be displayed on top of the physical object in the case where the operation is an operation of drawing a line along the physical object.
Incidentally, in cases where it is desirable to print an image to be inserted into or applied onto a desired real object, it may be desirable to print the image at a size matching the size of the real object.
There is a technology for projecting a single piece of image data onto a single real object and adjusting properties such as the size of the image data. However, in the case where there are multiple real objects onto which image data is to be projected, no consideration is given regarding how a desired real object is to be selected.
Aspects of non-limiting embodiments of the present disclosure relate to providing an information processing device, method, and non-transitory computer-readable medium that allow a desired real object to be selected in the case where image data is projected onto a desired real object and printed after confirming properties such as the size of the image data, even if there are multiple real objects onto which image data is to be projected.
Aspects of certain non-limiting embodiments of the present disclosure address the above advantages and/or other advantages not described above. However, aspects of the non-limiting embodiments are not required to address the advantages described above, and aspects of the non-limiting embodiments of the present disclosure may not address advantages described above.
According to an aspect of the present disclosure, there is provided an information processing device including a processor configured to: acquire image data; detect a position on a predetermined stage of each of a plurality of real objects placed on the stage, the plurality of real objects being candidates onto which the image data is to be projected; receive a selection of a real object as a projection target from among the plurality of real objects according to content of an operation performed by a user on the plurality of real objects and the position of each of the plurality of real objects on the stage; and cause the image data to be printed after projecting the image data onto the selected real object according to the position of the selected real object on the stage.
Exemplary embodiments of the present disclosure will be described in detail based on the following figures.
Hereinafter, exemplary embodiments for carrying out the present disclosure will be described in detail with reference to the drawings. Note that components and processes whose operations, actions, and functions serve the same role are denoted with the same signs throughout all drawings, and a duplicate description thereof may be reduced or omitted, as appropriate. Each drawing is merely a schematic illustration used as an aid to sufficiently understand the technology according to the present disclosure. Thus, the technology according to the present disclosure is not limited to the illustrated examples alone. Moreover, in the exemplary embodiments, a description may be reduced or omitted for well-known features and features not directly related to the present disclosure.
As illustrated in
The image capture unit 18 is a camera, preferably a document camera, for example, and continuously captures images of the real object 40 placed on the stage 30. In addition, the image capture unit 18 continuously captures images of a user's hand raised above the real object 40 on the stage 30.
The projection unit 19 is a projector, preferably a short-throw projector, for example, and projects user-desired image data onto the real object 40 placed on the stage 30. In addition, the projection unit 19 projects a user interface (UI) screen for receiving operating input from the user onto the stage 30.
The printing unit 20 uses an inkjet system or an electrophotographic system, for example, and prints, onto a recording medium such as paper, an image based on the image data projected onto the real object 40 and edited by the user.
The information processing device 10 according to the present exemplary embodiment is provided with a function for performing a process (hereinafter referred to as the “image projection and printing process”) in which, in the case of printing a user-desired image to be inserted into or applied onto the real object 40, the information processing device 10 adjusts the size and the like of the image data while projecting the image data onto the real object 40, and then prints an image based on the adjusted image data. Note that the information processing device 10 may also include functions such as a scan function, a print function, and a copy function. The copy function, print function, and scan function are examples of image processing in the information processing device 10.
As illustrated in
As illustrated in
As illustrated in
As illustrated in
The CPU 11, ROM 12, RAM 13, and I/O 14 are interconnected via a bus. Functional units including the storage unit 15, display unit 16, external I/F 17, image capture unit 18, projection unit 19, printing unit 20, and short-range communication unit 21 are connected to the I/O 14. Each of these functional units is capable of bidirectional communication with the CPU 11 via the I/O 14.
The CPU 11, ROM 12, RAM 13, and I/O 14 form a control device. The control device may be configured as a sub-controller that controls a subset of operations of the information processing device 10, or may be configured as a main controller that controls all operations of the information processing device 10. An integrated circuit such as a large-scale integration (LSI) chip or an integrated circuit (IC) chipset is used for some or all of the blocks of the control device, for example. A discrete circuit may be used for each of the above blocks, or a circuit integrating some or all of the above blocks may be used. The above blocks may be provided together as a single body, or some blocks may be provided separately. Also, a part of each of the above blocks may be provided separately. The integration of the control device is not limited to LSI, and a dedicated circuit or a general-purpose processor may also be used.
For the storage unit 15, a hard disk drive (HDD), a solid-state drive (SSD), flash memory, or the like is used, for example. An information processing program 15A for executing the image projection and printing process according to the present exemplary embodiment is stored in the storage unit 15. Note that the information processing program 15A may be stored in the ROM 12.
The information processing program 15A may be preinstalled in the information processing device 10, for example. The information processing program 15A may also be provided by being stored on a non-volatile storage medium or distributed over a network, and installed in the information processing device 10 as appropriate. Note that anticipated examples of the non-volatile storage medium include a Compact Disc-Read-Only Memory (CD-ROM), a magneto-optical disc, an HDD, a Digital Versatile Disc-Read-Only Memory (DVD-ROM), flash memory, a memory card, and the like.
In the present exemplary embodiment, a UI screen enabling the user to perform operations is projected onto the stage 30 by the projection unit 19, but it is also possible to provide a separate display unit 16 and display part of the UI screen on the display unit 16. For the display unit 16, a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like is used, for example. The display unit 16 includes an integrated touch panel. The display unit 16 is provided in a portion of the stage 30, for example. The display unit 16 acts as an operating panel that receives various instructions related to the image projection and printing process from the user of the information processing device 10. Additionally, the display unit 16 displays various information, such as the results of processes executed according to instructions received from the user, notifications about processes, and the like.
The external I/F 17 connects to a portable storage medium such as Universal Serial Bus (USB) memory or a memory card carried by the user, and inputs image data stored in advance on the portable storage medium into the information processing device 10.
As described above, the image capture unit 18 is a camera, preferably a document camera, for example, and continuously captures images of the real object 40 placed on the stage 30. In addition, the image capture unit 18 continuously captures images of a user's hand raised above the real object 40 on the stage 30.
As described above, the projection unit 19 is a projector, preferably a short-throw projector, for example, and projects user-desired image data onto the real object 40 placed on the stage 30. In addition, the projection unit 19 projects a UI screen for receiving operating input from the user onto the stage 30.
As described above, the printing unit 20 uses an inkjet system or an electrophotographic system, for example, and prints, onto a recording medium such as paper, an image based on the image data projected onto the real object 40 and edited by the user.
In the case where the system for forming images is an electrophotographic system, the printing unit 20 includes a photoreceptor drum, a charging device, an exposure device, a developing device, a transfer device, and a fusing device. The charging device applies a voltage to the photoreceptor drum to charge the surface of the photoreceptor drum. The exposure device forms an electrostatic latent image on the photoreceptor drum by exposing the photoreceptor drum charged by the charging device with light corresponding to image data. The developing device forms a toner image on the photoreceptor drum by developing with toner the electrostatic latent image formed on the photoreceptor drum. The transfer device transfers the toner image formed on the photoreceptor drum onto paper. The fusing device fuses the transferred toner image to the paper with heat and pressure.
The short-range communication unit 21 communicates with a mobile terminal (such as a smartphone, for example) carried by the user via short-range wireless communication, and inputs image data stored in advance in the mobile terminal into the information processing device 10. A communication standard such as near field communication (NFC) or Bluetooth® is implemented in the short-range communication unit 21, for example. NFC refers to a short-range wireless communication technology using a frequency of 13.56 MHz over a distance of about 10 cm.
Incidentally, as described above, in the case where there are multiple real objects 40 onto which image data is to be projected, no consideration is given regarding how a desired real object is to be selected.
Accordingly, the information processing device 10 according to the present exemplary embodiment detects the position on the stage 30 of each of the multiple real objects 40 treated as a candidate for the projection of image data, receives the selection of a real object 40 as a projection target from among the multiple real objects 40 according to the content of an operation performed by the user on the multiple real objects 40 and the position on the stage 30 of each of the multiple real objects 40, projects image data onto the selected real object 40 according to the position on the stage 30 of the selected real object 40, and then prints the image data.
Specifically, the CPU 11 of the information processing device 10 according to the present exemplary embodiment functions as each of the units illustrated in
As illustrated in
An image database (DB) 151 and an image association table 152 are stored in the storage unit 15. The image DB 151 stores image data acquired from a portable storage medium or mobile terminal carried by the user and also stores edited image data that has been edited in the past. Also, the image association table 152 is a data table for registering correspondence relationships between image data and real objects onto which image data is projected. For example, identification information expressing an “image of a dog” and identification information expressing a “fan” are registered in association with each other in the image association table 152.
The acquisition unit 11A acquires image data to be projected from among the image data obtained from the portable storage medium or mobile terminal carried by the user.
The object detection unit 11B detects multiple real objects 40 placed on the stage 30. At this time, the object detection unit 11B detects the position on the stage 30 of each of the multiple real objects 40. Specifically, the object detection unit 11B detects the multiple real objects 40 by analyzing an image of the multiple real objects 40 captured by the image capture unit 18. The positions of the multiple real objects 40 are expressed as coordinates in relation to a reference point set to a specific position (center, upper-left corner, lower-left corner, upper-right corner, lower-right corner) in an image of the stage 30 captured by the image capture unit 18, for example.
The object detection unit 11B may also capture and detect the outline of a real object 40. By printing the outline onto paper together with the image based on the image data corresponding to the real object 40, the outline may be used as a cutting line for cutting out the image.
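Although the disclosure does not prescribe any particular implementation, the detection performed by the object detection unit 11B may be illustrated by the following minimal sketch in Python. OpenCV, the Otsu thresholding strategy, and the `min_area` cutoff are assumptions introduced here purely for illustration.

```python
import cv2

def detect_objects(frame_bgr, min_area=5000):
    """Return (centroid, outline) pairs for real objects on the stage.

    Coordinates are relative to the upper-left corner of the captured
    image of the stage, one possible choice of reference point.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Separate objects from the stage surface (assumed brighter than the
    # objects); Otsu's method picks the threshold automatically.
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    detected = []
    for c in contours:
        if cv2.contourArea(c) < min_area:  # ignore specks and noise
            continue
        m = cv2.moments(c)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        detected.append(((cx, cy), c))  # centroid plus outline (cutting line)
    return detected
```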
The operation detection unit 11C detects the content of an operation performed by the user on the multiple real objects 40. The operation performed by the user is any of an operation performed as a gesture by the user, an operation performed by the user on a button projected onto the stage 30, an operation performed as speech spoken by the user, or an operation performed according to the user's line of sight, for example. In the case of a gesture operation, a gesture operation performed by the user's hand is detected by analyzing an image of the user's hand raised above the real objects 40 captured by the image capture unit 18.
The selection unit 11D receives the selection of a real object 40 as a projection target from among the multiple real objects 40 according to the content of the operation performed by the user on the multiple real objects 40 and the position of each of the multiple real objects 40 on the stage 30.
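As one non-limiting illustration of how the selection unit 11D might combine the content of a user operation with the detected object positions, the hypothetical sketch below selects the real object whose detected outline contains, or failing that lies nearest to, a fingertip position obtained from the captured image. The function name and its inputs are assumptions, not part of the disclosure.

```python
import cv2

def select_target(fingertip_xy, detected_objects):
    """Pick the index of the real object indicated by the user's gesture."""
    fx, fy = float(fingertip_xy[0]), float(fingertip_xy[1])
    best, best_dist = None, float("inf")
    for idx, (_, outline) in enumerate(detected_objects):
        # Signed distance: positive inside the outline, negative outside.
        d = cv2.pointPolygonTest(outline, (fx, fy), True)
        if d >= 0:
            return idx  # fingertip is directly over this object
        if -d < best_dist:
            best, best_dist = idx, -d
    return best  # otherwise fall back to the nearest candidate
```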
The projection control unit 11E controls the projection unit 19 to project the image data acquired by the acquisition unit 11A onto the real object 40 whose selection is received by the selection unit 11D, such that the image data is projected according to the position of the selected real object 40 on the stage 30. At this time, the projection control unit 11E receives edit operations performed by the user with respect to the image data. Known technology may be applied to the editing of the projected image data; for example, edits such as enlarging, reducing, moving, rotating, and adjusting the position of a projected image are possible in accordance with user operations. Note that the projected image data may be edited directly, or may be projected and edited separately on an edit screen.
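The enlarging, reducing, moving, and rotating edits described above amount to composing a scale/rotation/translation transform that places the image at the selected object's position in the projector's frame. The following hypothetical sketch (OpenCV assumed, as before) illustrates one way such an edited projection could be rendered. In practice a calibration between the camera and projector coordinate systems would also be required; it is omitted here for brevity.

```python
import cv2

def render_projection(image, canvas_size, center_xy, scale=1.0, angle_deg=0.0):
    """Place `image` on a projector-sized canvas so that it appears at
    `center_xy` (the selected object's position on the stage), applying
    the user's enlarge/reduce/rotate edits."""
    h, w = image.shape[:2]
    # Rotate and scale about the image centre...
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, scale)
    # ...then translate the centre to the target position.
    m[0, 2] += center_xy[0] - w / 2
    m[1, 2] += center_xy[1] - h / 2
    canvas_w, canvas_h = canvas_size
    return cv2.warpAffine(image, m, (canvas_w, canvas_h))
```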
The print control unit 11F controls the printing unit 20 to print the edited image data projected by the projection unit 19 in accordance with a designated paper size (such as A4, B5, or postcard, for example) and paper type (such as plain paper, card stock, label paper, or sticker paper, for example).
The payment unit 11G executes payment (electronic payment), using the user's mobile phone or IC card, for example, for the printed material produced by the printing unit 20.
Next, the image projection and printing process by the information processing device 10 according to the present exemplary embodiment will be described specifically with reference to
In (S1) of
In (S2), multiple real objects 40A, 40B, and 40C (in the example of
In (S3), the image data 50 selected in (S1) above is projected onto the central real object 40B among the multiple real objects 40A, 40B, and 40C placed on the stage 30 in (S2) above.
In (S4), a gesture operation performed by the user's hand 60 (hereinafter also referred to as the “user's finger 60” or the “user's hands 60”) is detected, and the projection target of the image data 50 is moved from the real object 40B to the real object 40C.
In (S5), a gesture operation performed by the user's hand 60 is detected, edits to the projected image data 50 are received, and the edited image data is added to a print job.
In (S6), if the user selects desired image data to print from a list of images registered as print jobs and presses a “Print” button, the selected image data is printed.
As illustrated in
As illustrated in
Next,
First, if the information processing device 10 is instructed to execute the image projection and printing process, the information processing program 15A is launched by the CPU 11 and the following steps are executed.
In step S101 of
In step S102, the CPU 11 projects an image list, that is, a list of the image data acquired in step S101, onto the stage 30, and receives the selection of desired image data to be printed from the user.
In step S103, the CPU 11 detects, for example, the multiple real objects 40A to 40C placed on the stage 30 by the user as illustrated in
In step S104, the CPU 11 projects the image data 50 selected in step S102 onto the stage 30 at a default position as illustrated in
In step S105, the CPU 11 detects a user operation on the multiple real objects 40A to 40C. For example, as illustrated in
In step S106, the CPU 11 receives the selection of a real object 40 to be treated as the projection target from among the multiple real objects 40A to 40C according to the user operation detected in step S105.
In step S107, the CPU 11 projects the image data 50 onto the real object 40 selected in step S106. Here, as illustrated in
Gesture operations performed by the user's hand 60 will be described specifically with reference to
As illustrated in
As illustrated in
As illustrated in
Next, button operations performed by the user's hand 60 will be described specifically with reference to
As illustrated in
As illustrated in
If the projection target has been confirmed as above, the image data to be projected may also be changed using a button operation, as illustrated in
The message display field 71, the “No” button 72, and the “Yes” button 73 are projected as transparent objects onto the stage 30 illustrated in
In the example in
Returning to
In step S109, the CPU 11 causes the image data 50 edited in step S108 to be printed in accordance with a designated paper size (such as A4, B5, or postcard, for example) and paper type (such as plain paper, card stock, label paper, or sticker paper, for example). Thereafter, the series of processes according to the information processing program 15A ends.
Note that the above describes a configuration in which gesture operations and button operations performed by the user are used when selecting a desired real object 40 as the projection target from among multiple real objects 40, but similar selection operations may also be performed as operations according to speech spoken by the user. In this case, the information processing device 10 may be provided with a speech recognition function. Moreover, similar selection operations may also be performed as operations according to the user's line of sight. In this case, the information processing device 10 may be provided with a gaze detection function.
In this way, according to the present exemplary embodiment, a desired real object may be selected even if there are multiple real objects onto which image data is to be projected, in the case where the image data is projected onto the desired real object and printed after properties such as the size of the image data are confirmed.
The second exemplary embodiment describes a configuration in which a projected image is resized automatically to match the size of a real object.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10A”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10A according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
The object detection unit 11B detects the size of each of multiple real objects 40.
If the image data 50 is being projected onto a first real object among the multiple real objects 40 and the selection of a second real object different from the first real object is received according to a user operation, the projection control unit 11E controls the projection onto the second real object such that the size of the image data is adjusted to match the size of the second real object.
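A minimal sketch of this size adjustment, under the assumptions that the second object's size is taken from the bounding box of its detected outline and that the aspect ratio of the image data is preserved:

```python
import cv2

def fit_scale(image, target_outline):
    """Scale factor that makes the projected image fit the bounding box
    of the newly selected (second) real object, preserving aspect ratio."""
    _, _, tw, th = cv2.boundingRect(target_outline)
    ih, iw = image.shape[:2]
    return min(tw / iw, th / ih)

# The result could be passed as `scale` to a renderer such as
# render_projection() sketched in the first exemplary embodiment.
```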
As illustrated in
In this way, according to the present exemplary embodiment, projected image data is resized to match the size of a real object. Consequently, the workload for resizing image data is reduced.
The third exemplary embodiment describes a configuration that automatically selects the real object onto which image data is to be projected.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10B”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10B according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
In a case where previously edited image data similar to the image data 50 exists and a real object similar to the real object previously associated with the edited image data exists among the multiple real objects 40, the projection control unit 11E controls the projection of at least one of the image data 50 or the similar edited image data onto the similar real object among the multiple real objects 40. Note that the similar edited image data is image data obtained by editing the image data 50, for example, and is image data based on the same original image data. For example, edited image data may be assigned an identification number to indicate that the original image data is the same. The edited image data is stored in the image DB 151. Also, for the determination of similarity, a method such as pattern matching or machine learning may be used, for example.
Also, when determining a similar real object, the object detection unit 11B detects the shape of each of the multiple real objects 40. On the basis of the results detected by the object detection unit 11B, the projection control unit 11E determines whether a real object having a shape similar to the real object previously associated with the edited image data exists among the multiple real objects 40. For the determination of similarity, a method such as pattern matching or machine learning is used, for example. Also, a real object may be determined to be similar if information specifying the real object is the same. In other words, if the real object 40 is a “fan”, for example, a real object having information indicating “fan” is determined to be similar. Information specifying a real object previously associated with edited image data is obtained from the image association table 152.
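As a non-limiting illustration of the shape-based similarity determination, the sketch below compares outlines with OpenCV's Hu-moment shape metric; the tolerance value is hypothetical, and the stored outline is assumed to have been saved alongside the corresponding entry in the image association table 152.

```python
import cv2

SHAPE_TOLERANCE = 0.1  # hypothetical threshold; smaller scores = more similar

def find_similar_object(stored_outline, detected_objects):
    """Return the index of a placed real object whose shape resembles the
    object previously associated with the edited image data, or None."""
    for idx, (_, outline) in enumerate(detected_objects):
        # matchShapes returns 0 for identical shapes (Hu-moment comparison).
        score = cv2.matchShapes(stored_outline, outline,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < SHAPE_TOLERANCE:
            return idx
    return None
```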
Next,
First, if the information processing device 10B is instructed to execute the automatic selection process, the information processing program 15A is launched by the CPU 11 and the following steps are executed.
In step S111 of
In step S112, the CPU 11 determines whether the image DB 151 includes edited image data similar to the image data 50 selected in step S111. If the image DB 151 is determined to include similar image data (if the determination is positive), the flow proceeds to step S113, whereas if the image DB 151 is determined not to include similar image data (if the determination is negative), the flow proceeds to step S116.
In step S113, the CPU 11 references the image association table 152 on the basis of the edited image data similar to the image data 50.
In step S114, the CPU 11 determines, from the image association table 152 referenced in step S113, whether a real object having a shape similar to a real object previously associated with the edited image data exists among the multiple real objects 40A to 40C placed on the stage 30. If a similar real object is determined to exist (if the determination is positive), the flow proceeds to step S115, whereas if a similar real object is determined not to exist (if the determination is negative), the flow proceeds to step S116.
In step S115, the CPU 11 projects the image data 50 selected in step S111 onto the similar real object 40A as illustrated in
On the other hand, in step S116, the CPU 11 projects the image data 50 selected in step S111 onto the default position of the stage 30 as illustrated in
At this point, if there are multiple pieces of similar edited image data, the projection control unit 11E may cause the multiple pieces of edited image data to be projected together with the image data 50 or instead of the image data 50 as illustrated in
In this way, according to the present exemplary embodiment, the real object onto which image data is to be projected is selected automatically. Consequently, the user does not have to select a real object manually, and the workload for selecting a real object is reduced.
The fourth exemplary embodiment describes a configuration in which multiple projection target candidates are handled collectively as a single projection target.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10C”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10C according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
The projection control unit 11E groups multiple real objects 40 together as a single projection target according to a user operation, and causes the image data 50 to be projected onto the single grouped projection target.
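One simple way to realize such grouping (a hypothetical sketch, not something the disclosure prescribes) is to merge the detected outlines of the selected real objects into a single bounding box and treat its center and size as those of the combined projection target:

```python
import numpy as np
import cv2

def group_as_single_target(outlines):
    """Merge several object outlines into one projection target,
    returning the center and size of their combined bounding box."""
    all_points = np.vstack([o.reshape(-1, 2) for o in outlines])
    x, y, w, h = cv2.boundingRect(all_points)
    return (x + w / 2, y + h / 2), (w, h)
```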
As illustrated in
As illustrated in
In this way, according to the present exemplary embodiment, multiple real objects are grouped and handled as a single projection target. Consequently, it is possible to project image data onto the single projection target obtained by grouping multiple real objects.
The fifth exemplary embodiment describes a configuration in which a projected image automatically tracks the projection target when the real object set as the projection target is moved or rotated.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10D”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10D according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
If a first real object among multiple real objects 40 is moved or rotated while the image data 50 is being projected onto the first real object, the projection control unit 11E moves or rotates the image data 50 to track the first real object.
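This tracking can be illustrated by estimating the frame-to-frame translation and rotation of the tracked object's outline. The sketch below uses the minimum-area rectangle of the outline as a crude pose estimate; this is an assumption made for illustration, and the 90-degree angle wrap of `minAreaRect` would need additional handling in a real implementation.

```python
import cv2

def track_motion(prev_outline, curr_outline):
    """Estimate how far the tracked real object moved and rotated
    between two captured frames so the projected image can follow it."""
    (px, py), _, pa = cv2.minAreaRect(prev_outline)
    (cx, cy), _, ca = cv2.minAreaRect(curr_outline)
    translation = (cx - px, cy - py)
    rotation_deg = ca - pa  # caution: minAreaRect angles wrap at 90 degrees
    return translation, rotation_deg
```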
In (S11) of
In (S12), the user moves the real object 40A in the direction of the arrow (to the right in the example of
In (S13), the image data 50 moves to follow the real object 40A.
In (S21) of
In (S22), the user rotates the real object 40A in the direction of the arrow (clockwise in the example of
In (S23), the image data 50 rotates to follow the real object 40A.
The projection control unit 11E may also disable tracking if the user performs a predetermined operation. Note that the predetermined operation may be an operation in which the user raises both hands over the stage 30 for a fixed time, for example. In this case, the projection control unit 11E causes the background color projected onto the stage 30 when tracking is enabled to be different from the background color projected onto the stage 30 when tracking is disabled. The background color herein refers to the color of a background portion other than an image portion in the entire projected data.
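A hypothetical sketch of such an enable/disable switch, assuming a separate hand detector that reports whether both hands are raised over the stage; the hold duration and background colors are arbitrary illustrative choices:

```python
import time

TRACKING_ON_BG = (255, 255, 255)   # e.g. white background while tracking
TRACKING_OFF_BG = (200, 200, 200)  # e.g. gray background while disabled

class TrackingSwitch:
    HOLD_SECONDS = 2.0  # how long both hands must stay raised (assumed)

    def __init__(self):
        self.enabled = True
        self._since = None
        self._latched = False

    def update(self, both_hands_raised):
        """Call once per captured frame; returns the background color
        to project, which signals the current tracking state."""
        now = time.monotonic()
        if both_hands_raised:
            if self._since is None:
                self._since = now
            elif not self._latched and now - self._since >= self.HOLD_SECONDS:
                self.enabled = not self.enabled
                self._latched = True  # don't toggle again until hands drop
        else:
            self._since = None
            self._latched = False
        return TRACKING_ON_BG if self.enabled else TRACKING_OFF_BG
```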
In (S31) of
In (S32), the automatic tracking by the image data 50 is disabled and the background color projected onto the stage 30 is changed to clearly indicate that automatic tracking is disabled. In the example of
In (S33), if automatic tracking by the image data 50 is disabled, the image data 50 does not track the real object 40A even if the real object 40A is moved or rotated by the user's hand 60.
In this way, according to the present exemplary embodiment, the projected image automatically tracks the projection target when the real object set as the projection target is moved or rotated. Consequently, it is also possible to accommodate situations in which the user makes an adjustment by moving the real object rather than moving the projected image.
The sixth exemplary embodiment describes a configuration in which the projection of an edited image list screen is controlled.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10E”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10E according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
The projection control unit 11E controls the projection, onto the stage 30, of an edited image list screen, that is, a list of edited image data obtained in the past. In this case, the projection control unit 11E treats edited image data selected from the edited image list screen as the image data to be projected. For example, as illustrated in
Also, if a predetermined user operation is received with respect to edited image data selected from the edited image list screen, the projection control unit 11E may also control the projection of unedited image data corresponding to the edited image data.
As illustrated in
As illustrated in
In this way, according to the present exemplary embodiment, an edited image list screen is projected, thereby enabling the user to select edited image data as the image data to be projected.
The seventh exemplary embodiment describes a configuration in which the projection of a projected image selection screen is controlled.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10F”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10F according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
The projection control unit 11E controls the projection, onto the stage 30, of a projected image selection screen including an image list, that is, a list of image data containing the image data currently being projected. When adding image data to be projected, the projection control unit 11E adds image data selected from the projected image selection screen as the image data to be projected. For example, as illustrated in
Also, if a predetermined user operation is received with respect to projected image data included on the projected image selection screen, the projection control unit 11E may also exclude currently projected image data from projection.
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
In this way, according to the present exemplary embodiment, a projected image selection screen is projected, thereby enabling the user to add or remove image data to be projected.
The eighth exemplary embodiment describes a configuration in which the projection of a projected object list screen is controlled.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10G”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10G according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
The projection control unit 11E projects a projected object list screen including an image list, that is, a list of image data, onto a region of the stage 30 other than the region where multiple real objects 40 are placed, and also controls the projection of the image data included on the image list onto the region of the stage 30 where the multiple real objects 40 are placed. For example, as illustrated in
The projection control unit 11E may also be configured such that when image data is selected from the projected object list screen, the appearance is changed for the image data corresponding to the selected image data from among the image data being projected in the region where the multiple real objects 40 are placed on the stage 30.
The projection control unit 11E may also be configured such that when multiple pieces of image data are selected from the projected object list screen, the multiple pieces of image data are grouped together as a single piece of projected image data.
As illustrated in
As illustrated in
As illustrated in
In the case of grouping multiple pieces of projected image data into one, as illustrated in
As illustrated in
As illustrated in
As illustrated in
In this way, according to the present exemplary embodiment, a projected object list screen is projected, thereby enabling the user to select a real object to set as the projection target while looking at the image data being projected.
The ninth exemplary embodiment describes a configuration in which the projection of a print execution screen and a payment screen is controlled.
The CPU 11 of the information processing device according to the present exemplary embodiment (hereinafter referred to as the “information processing device 10H”) functions as an acquisition unit 11A, an object detection unit 11B, an operation detection unit 11C, a selection unit 11D, a projection control unit 11E, a print control unit 11F, and a payment unit 11G, similarly to the information processing device 10 described in the first exemplary embodiment above. Hereinafter, the points where the information processing device 10H according to the present exemplary embodiment differs from the information processing device 10 according to the first exemplary embodiment above will be described.
The projection control unit 11E controls the projection, onto the stage 30, of a print execution screen including an image list, that is, a list of image data to be printed. For example, as illustrated in
The projection control unit 11E may also control the projection, onto the stage 30, of a payment screen including an image list, that is, a list of image data to be printed.
As illustrated in
As illustrated in
The information processing device 10H according to the present exemplary embodiment automatically selects an optimal paper size when the user prints edited image data. On the other hand, if the user has a desired paper size, the following two cases are conceivable.
(1) If the optimal paper size is equal to or smaller than the user's desired paper size, the image data is printed on a single sheet of paper.
(2) If the optimal paper size is larger than the user's desired paper size, the image data is printed on multiple sheets of paper.
Consequently, for example, a “number of sheets” item may also be displayed, as illustrated on the print execution screen 120 in
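The two cases above reduce to a simple tiling computation: the image is split across as many sheets as are needed in each direction. A hypothetical sketch (dimensions in millimeters; names assumed):

```python
import math

def sheets_needed(image_w, image_h, paper_w, paper_h):
    """Sheets of the desired paper size needed to print the image at full
    size; returns 1 when the image fits on a single sheet (case (1))."""
    return math.ceil(image_w / paper_w) * math.ceil(image_h / paper_h)

# Example: a 300 mm x 420 mm image on A4 paper (210 mm x 297 mm)
# -> 2 columns x 2 rows = 4 sheets (case (2)).
print(sheets_needed(300, 420, 210, 297))  # 4
```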
As illustrated in
As illustrated in
In this way, according to the present exemplary embodiment, a print execution screen and a payment screen are projected, thereby enabling the user to print and pay for image data.
In the embodiments above, the term “processor” refers to hardware in a broad sense. Examples of the processor include general processors (e.g., CPU: Central Processing Unit) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, and programmable logic device).
In the embodiments above, the term “processor” is broad enough to encompass one processor or plural processors in collaboration which are located physically apart from each other but may work cooperatively. The order of operations of the processor is not limited to one described in the embodiments above, and may be changed.
The above describes examples of an information processing device according to the exemplary embodiments. The exemplary embodiments may also be configured as a program that causes a computer to execute the functions of each component provided in the information processing device. The exemplary embodiments may also be configured as a computer-readable storage medium storing the program.
Otherwise, the configuration of the information processing device described in the exemplary embodiments above is an example, and may be modified according to circumstances within a range that does not depart from the gist.
Also, the process flow of the program described in the exemplary embodiments above is an example, and unnecessary steps may be removed, new steps may be added, or the processing sequence may be rearranged within a range that does not depart from the gist.
Also, the exemplary embodiments above describe a case in which the process according to the exemplary embodiments is achieved by a software configuration using a computer by executing a program, but the configuration is not limited thereto. The exemplary embodiments may also be achieved by a hardware configuration or by a combination of a hardware configuration and a software configuration, for example.
The foregoing description of the exemplary embodiments of the present disclosure has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to understand the disclosure for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the following claims and their equivalents.