One of the aspects of the embodiments relates to an information processing apparatus, an information processing system, an information processing method, and a storage medium.
Technologies that utilize virtual space to train users to operate equipment have recently become known. A model (avatar) that represents the user and a model (object) of virtual equipment used for the training are placed in the virtual space. Moving the avatar and the object in the virtual space in accordance with the user's movements enables easy training in an environment that would be difficult to prepare in real life due to constraints such as cost and safety. Japanese Patent Laid-Open No. 2022-47989 discloses a technology for reproducing a sterile room as the virtual space in the medical field.
Recently, in order to provide users with a more immersive virtual world, manufacturers of actual products have begun to sell officially certified three-dimensional data (actual product information), which is displayed as an object in the virtual space.
In a case where an object is displayed in a virtual space such as that disclosed in Japanese Patent Laid-Open No. 2022-47989, the functions that can be executed with the object depend on the settings of the application that displays the virtual space. Thus, in a case where an actual product is displayed, user operations and function executions that are unavailable with the actual product may be available depending on the settings of the application. In this case, the user's sense of immersion in the virtual space deteriorates.
An information processing apparatus according to one aspect of the embodiments for generating a virtual space and a virtual object includes a memory storing instructions, and a processor configured to execute the instructions to display the virtual object in the virtual space, and execute a function of the virtual object in the virtual space based on function information about the virtual object acquired from the memory. An information processing apparatus according to another aspect of the embodiments for generating a virtual space includes a memory storing instructions, and a processor configured to execute the instructions to display a model of a user and a virtual object in the virtual space, change at least one of the model and the virtual object to be displayed, and acquire user operation information. The processor changes at least one of an operation of the model in the virtual space or a display of the virtual object based on the user operation information and function information about the virtual object acquired from the memory. An information processing system including each of the above information processing apparatuses also constitutes another aspect of the embodiments. An information processing method corresponding to each of the above information processing apparatuses also constitutes another aspect of the embodiments. A storage medium storing a program that causes a computer to execute the above information processing method also constitutes another aspect of the embodiments.
Further features of the disclosure will become apparent from the following description of embodiments with reference to the attached drawings.
In the following, the term “unit” may refer to a software context, a hardware context, or a combination of software and hardware contexts. In the software context, the term “unit” refers to a functionality, an application, a software module, a function, a routine, a set of instructions, or a program that can be executed by a programmable processor such as a microprocessor, a central processing unit (CPU), or a specially designed programmable device or controller. A memory contains instructions or programs that, when executed by the CPU, cause the CPU to perform operations corresponding to units or functions. In the hardware context, the term “unit” refers to a hardware element, a circuit, an assembly, a physical structure, a system, a module, or a subsystem. Depending on the specific embodiment, the term “unit” may include mechanical, optical, or electrical components, or any combination of them. The term “unit” may include active (e.g., transistors) or passive (e.g., capacitors) components. The term “unit” may include semiconductor devices having a substrate and other layers of materials having various concentrations of conductivity. It may include a CPU or a programmable processor that can execute a program stored in a memory to perform specified functions. The term “unit” may include logic elements (e.g., AND, OR) implemented by transistor circuits or any other switching circuits. In the combination of software and hardware contexts, the term “unit” or “circuit” refers to any combination of the software and hardware contexts as described above. In addition, the term “element,” “assembly,” “component,” or “device” may also refer to “circuit” with or without integration with packaging materials.
Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the disclosure.
Referring now to the accompanying drawing, a description will be given of an information processing system according to a first embodiment.
In the computer 100, the user information storing unit 102 stores, for each user (user ID, or user identification information), product name information of products whose functions the user has the authority to execute in the virtual space, and product name information of the product actually selected by the user (user information).
In a case where a user selects a product to execute its function in the virtual space, the user authority determining unit 103 determines whether the user has the authority to execute the function of the product. In other words, the user authority determining unit 103 determines whether the user corresponding to the user identification information is authorized (or entitled) to execute the function of the product (virtual object). In a case where it is determined that the user is authorized, the computer 100 transmits the product name information selected by the user to the product information database 200 via the communication interface 101. The product information database 200 receives the product name information via a communication interface 201. An intra-database (DB) search unit 202 searches for actual products based on the received product name information. In a case where actual product information matching the received product name information is found, the intra-DB search unit 202 retrieves the actual product information. That is, the product information database 200 has a search unit for searching for an actual product corresponding to the virtual object, and in a case where the search unit finds the actual product corresponding to the virtual object, the product information database 200 transmits the function information about the actual product corresponding to the virtual object to the computer 100. In this embodiment, the product information group 203 in the product information database 200 includes, for example, a camera information database 204, a lens information database 205, and an accessory information database 206, but is not limited to this example.
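The authority check and the database lookup described above can be sketched as follows. This is a minimal, hypothetical illustration: all identifiers (USER_DB, PRODUCT_DB, and so on) are assumptions made for explanation, not part of the disclosed apparatus.

    # Hypothetical sketch: user authority determining unit 103 and intra-DB search unit 202.
    USER_DB = {
        # user identification information -> product names the user may use in the virtual space
        "user-001": {"Camera-A", "Lens-B"},
    }
    PRODUCT_DB = {
        # product name information -> actual product information
        "Camera-A": {"functions": {"still", "video", "zoom"}},
        "Lens-B": {"functions": {"zoom", "af", "mf"}},
    }

    def is_authorized(user_id, product_name):
        # Corresponds to the user authority determining unit 103.
        return product_name in USER_DB.get(user_id, set())

    def search_product(product_name):
        # Corresponds to the intra-DB search unit 202: returns None if no actual product matches.
        return PRODUCT_DB.get(product_name)

    if is_authorized("user-001", "Camera-A"):
        product_info = search_product("Camera-A")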
The retrieved product information is sent to the object information acquiring unit 104 in the computer 100 via the communication interfaces 201 and 101. The display unit 109 displays an avatar (a user's model in the virtual space) and an object (a product (virtual object) in the virtual space) based on the product information received from the product information database 200.
On the other hand, the user set value acquiring unit 106 receives a menu value set by the user and user operation information read by the user operation reading unit 107. The verifying unit 105 determines whether or not the user operation is available under the actual product specification, using the already acquired product information. That is, the verifying unit 105 uses the user operation information and the function information to determine whether the user operation is executable. In a case where the verifying unit 105 determines that the user operation is available, the verifying unit 105 sends the user operation information to the set value reflecting unit 108. Then, the function executing unit 110 executes the function based on the information from the set value reflecting unit 108. In a case where the product is a camera, the functions include starting to capture still images or videos, zooming the displayed image, and connecting the lens and the camera. In a case where the verifying unit 105 determines that the function specified by the user operation is unavailable in light of the product specification, the verifying unit 105 grays out the function on the application menu or prevents the corresponding switch or ring from operating. In a case where the verifying unit 105 determines that the camera and lens cannot be connected due to their mechanical configuration, it prevents them from being connected.
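The verification described above can be sketched as follows; this is a minimal illustration under the assumption that the function information is a simple set of available function names (the names verify_operation and handle_operation are hypothetical).

    # Hypothetical sketch: verifying unit 105 checking a user operation against
    # the function information of the actual product.
    def verify_operation(operation, function_info):
        # The operation is available only if the actual product specification lists it.
        return operation in function_info.get("functions", set())

    def handle_operation(operation, function_info):
        if verify_operation(operation, function_info):
            # Set value reflecting unit 108 -> function executing unit 110.
            return "execute " + operation
        # Unavailable per the product specification: gray the menu item out.
        return "gray out " + operation

    print(handle_operation("zoom", {"functions": {"zoom", "af"}}))   # execute zoom
    print(handle_operation("video", {"functions": {"zoom", "af"}}))  # gray out video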
Referring now to the accompanying flowchart, a description will be given of processing performed by the computer 100 according to this embodiment.
First, in step S101, the verifying unit 105 determines whether or not the object (product) selected for display in the virtual reality application that is used by the user is included in the actual product information (whether or not the object having the actual product information is to be displayed). This determination is made based on information received from the product information database 200. In a case where it is determined that the object selected by the user is not included in the actual product information, the flow proceeds to step S102. In step S102, the display unit 109 displays a default object appearance in each virtual space application (displays an object in each application).
On the other hand, in a case where it is determined in step S101 that the object selected by the user is included in the actual product information, the flow proceeds to step S103. In step S103, the object information acquiring unit 104 acquires external shape data (product external shape information) such as the position, orientation, and size of the object from the product information database 200. Next, in step S104, the display unit 109 displays the actual product (actual or real object) in the virtual reality.
Next, in step S105, the verifying unit 105 determines whether or not to reproduce the function of the actual object (actual or real product) in the virtual reality (whether or not to ensure functional consistency between the user operation and the object displayed in step S104). In a case where it is determined that the function of the actual object is not to be reproduced (functional consistency between the user operation and the actual object is not ensured), the flow proceeds to step S106. In step S106, the function executing unit 110 executes the function of the object determined within each application.
On the other hand, in a case where it is determined in step S105 that the function of the actual object is to be reproduced (functional consistency between the user operation and the actual object is ensured), the flow proceeds to step S107. In step S107, the object information acquiring unit 104 acquires function information (product function information about the displayed object) from the product information database 200. Next, in step S108, the verifying unit 105 verifies the user operation information separately acquired by the user operation reading unit 107 against the product function information acquired in step S107, and recognizes the content of the user operation information. The user operation information includes, for example, menu value change information, button or ring operations acquired from the user's hand movement, and the like.
Next, in step S109, the verifying unit 105 determines whether the user operation is an operation that is unavailable according to the product specification of the object. In a case where it is determined that the user operation is unavailable according to the product specification of the object, the flow proceeds to step S110. In step S110, the function executing unit 110 does not execute the function of the object based on the user operation. At this time, the function executing unit 110 can execute a predetermined function (a function corresponding to the settings of the application, among the functions executable in the application).
On the other hand, in a case where it is determined in step S109 that the user operation is an operation that is available according to the product specification of the object, the flow proceeds to step S111. In step S111, the function executing unit 110 executes the object function (object operation corresponding to the actual product) based on the user operation. That is, the function executing unit 110 executes a function (function corresponding to function information) based on the function information about the virtual object (product function information) acquired from the product information database 200.
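The flow of steps S101 through S111 can be summarized in the following sketch. The helper structure is an assumption made for illustration; each return value names the step the flow reaches.

    # Hypothetical sketch of the flow of steps S101-S111.
    def display_and_operate(selected_name, product_db, reproduce_function, user_operation):
        info = product_db.get(selected_name)
        if info is None:                      # S101: not in the actual product information
            return "S102: display the default object of the application"
        shape = info["shape"]                 # S103: acquire external shape data
        # S104: display unit 109 displays the actual product (not modeled here).
        if not reproduce_function:            # S105: functional consistency not ensured
            return "S106: execute the function determined within the application"
        functions = info["functions"]         # S107: acquire product function information
        # S108: verify the user operation against the product function information.
        if user_operation not in functions:   # S109: unavailable per the specification
            return "S110: do not execute the operation"
        return "S111: execute the object function corresponding to the actual product"

    db = {"Camera-A": {"shape": "external shape data", "functions": {"zoom"}}}
    print(display_and_operate("Camera-A", db, True, "zoom"))  # S111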
For example, assume that a user operates a zoom ring of a lens in the real world, as illustrated in the accompanying drawing. In this case, the angle of view of the image displayed in the virtual space changes in accordance with the zoom operation, within the zoom range defined by the specification of the actual lens.
For example, in a case where the lens selected by the user has an AF/MF switch, during AF, the camera autofocuses on the user's desired position within the range of the lens performance and the currently selected menu value, and the image blur increases as the position becomes more distant from the focus position. On the other hand, in a case where the MF button is selected or an MF dedicated lens is attached, the focus position is kept constant and autofocusing is not performed on the user's desired position. At that time, information such as a focal length, an F-number, and a pixel pitch is also verified.
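As a hedged illustration of the AF/MF behavior described above, the following sketch uses a made-up blur metric; the actual computation (which would also consider the focal length, F-number, and pixel pitch) is not disclosed here.

    # Hypothetical sketch: focus behavior for an AF/MF switch.
    def focus_blur(af_enabled, subject_position, focus_position):
        if af_enabled:
            # AF: focus follows the desired position within the lens performance,
            # so the blur is treated as zero in this simplified model.
            return 0.0
        # MF (or an MF dedicated lens): the focus position is held constant, and
        # blur grows as the subject moves away from it (placeholder metric).
        return abs(subject_position - focus_position)

    print(focus_blur(True, 3.0, 1.5))   # 0.0
    print(focus_blur(False, 3.0, 1.5))  # 1.5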
Consider blur reproduction. When a still image is captured in the virtual world, blurring does not inherently occur as it does in reality; however, in a case where the user selects to maintain functional consistency in step S105, the image is blurred by the user's camera shake or shutter-pressing action. In a case where the lens or camera has an image stabilizing function and the user turns on this function, the camera shake is suppressed within the performance range.
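A similarly hedged sketch of the camera-shake suppression: the subtraction below is a placeholder for the (undisclosed) image stabilization model, which suppresses shake only within the performance range of the actual lens or camera.

    # Hypothetical sketch: image stabilization within the performance range.
    def residual_shake(shake_amount, stabilizer_on, stabilizer_limit):
        if stabilizer_on:
            # Shake is suppressed, but only up to the actual product's capability.
            return max(0.0, shake_amount - stabilizer_limit)
        return shake_amount

    print(residual_shake(2.0, True, 1.5))   # 0.5
    print(residual_shake(2.0, False, 1.5))  # 2.0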
In a case where the user performs slow-motion or fast-motion recording, a combination of the image resolution and frame rate values supported by the actual camera is acquired and reflected in the recorded image. In a case where the frame rate of the virtual space application being executed is lower than the frame rate of the actual camera, a warning to that effect is issued and the execution is restricted.
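The frame-rate restriction might look like the following sketch (function and variable names are assumptions):

    # Hypothetical sketch: restrict recording when the application cannot keep up
    # with the actual camera's frame rate.
    def check_recording(app_fps, camera_fps):
        if app_fps < camera_fps:
            return ("warning: application frame rate (%s fps) is lower than the "
                    "actual camera's (%s fps); recording is restricted" % (app_fps, camera_fps))
        return "recording allowed"

    print(check_recording(60, 120))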
This embodiment can thus train users in the virtual space in accordance with the actual specification of the product being used.
A description will now be given of a second embodiment.
Referring now to the accompanying flowchart, a description will be given of processing performed by a computer 500 according to the second embodiment.
In a case where it is determined in step S205 that the function of the actual object is not to be reproduced (functional consistency between the user operation and the object displayed in step S204 is not ensured), the flow proceeds to step S206. In step S206, the display unit 508 maintains (continues) displaying the currently displayed avatar motion and object appearance.
On the other hand, in a case where it is determined in step S205 that the function of the actual object is to be reproduced (functional consistency between the user operation and the actual object is ensured), the flow proceeds to step S207. In step S207, the avatar motion change determining unit 506 determines whether the user operation is an operation that is to change the avatar display. In a case where it is determined that the user operation is not an operation that is to change the avatar display, the flow proceeds to step S208. In step S208, the display unit 508 maintains (continues) displaying the currently displayed avatar motion and object appearance. On the other hand, in a case where it is determined in step S207 that the user operation is an operation that is to change the avatar display, the flow proceeds to step S209. In step S209, the avatar motion change determining unit 506 changes the avatar motion.
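Steps S205 through S209 can be sketched as follows; the two boolean inputs stand in for the determinations of the verifying unit and the avatar motion change determining unit 506.

    # Hypothetical sketch of the flow of steps S205-S209.
    def avatar_flow(reproduce_function, operation_changes_avatar):
        if not reproduce_function:           # S205: consistency not ensured
            return "S206: maintain the current avatar motion and object appearance"
        if not operation_changes_avatar:     # S207: avatar display unchanged
            return "S208: maintain the current avatar motion and object appearance"
        return "S209: change the avatar motion"

    print(avatar_flow(True, True))  # S209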
For example, in performing slow-shutter imaging in the real world, the camera is commonly mounted on a tripod to prevent image blur. In the virtual world, if the avatar in the virtual space is holding the camera even though the user is trying to capture an image with a slow shutter, the avatar motion may be changed.
In this example, in step S209, as illustrated in the accompanying drawing, the avatar motion is changed from holding the camera in the avatar's hands to mounting the camera on a tripod.
Next, in step S210, the verifying unit 504 acquires product function information about the displayed object from the product information database 200. Next, in step S211, the verifying unit 504 verifies the user operation information against the acquired product function information. Next, in step S212, the avatar motion change determining unit 506 and the object change determining unit 507 determine whether or not the avatar motion and object display (product display) are to be changed, respectively, based on the verification result of the verifying unit 504. In a case where it is determined that the avatar motion and object display (product display) are not to be changed, the flow proceeds to step S213. In step S213, the display unit 508 maintains (continues) the avatar motion and object display. On the other hand, in a case where it is determined that the avatar motion and object display (product display) are to be changed, the flow proceeds to step S214. In step S214, the display unit 508 changes the avatar motion and object display.
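Steps S210 through S214 can be sketched as follows, under the assumption that a slow-shutter operation with a tripod-mountable camera is the trigger for the change (the predicate is illustrative only):

    # Hypothetical sketch of the flow of steps S210-S214.
    def update_display(user_operation, function_info, avatar, obj):
        # S210/S211: verify the user operation against the product function information.
        change = (user_operation == "slow_shutter"
                  and "tripod_mount" in function_info.get("functions", set()))
        if not change:                       # S212 -> S213: keep the current display
            return avatar, obj
        # S214: change the avatar motion and the object display.
        return ({**avatar, "motion": "mount camera on tripod"},
                {**obj, "tripod_displayed": True})

    avatar, obj = update_display("slow_shutter",
                                 {"functions": {"tripod_mount"}},
                                 {"motion": "hold camera"}, {})
    print(avatar, obj)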
In a case where there are a plurality of product candidates that can be selected in response to a user operation, it is necessary to select which product to use as the replacement. At this time, the computer 500 may automatically select the product, or the user may select the product. In the case of user selection, the priority of replacement products may be determined by the user in advance, instead of determining a replacement product anew for each user operation. In this case, the object change determining unit 507 changes the display of the virtual object (product) based on the priority previously registered by the user. Thereby, a series of user actions and avatar actions can be brought closer to real-world imaging actions, and the user's sense of immersion can be further enhanced.
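The priority-based replacement might look like the following sketch; the candidate names and the priority list are illustrative assumptions.

    # Hypothetical sketch: object change determining unit 507 selecting a replacement
    # product according to the priority previously registered by the user.
    def select_replacement(candidates, user_priority):
        def rank(name):
            return user_priority.index(name) if name in user_priority else len(user_priority)
        return min(candidates, key=rank)

    print(select_replacement(["Tripod-B", "Tripod-A"], ["Tripod-A", "Tripod-B"]))  # Tripod-A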
Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disc (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the disclosure has been described with reference to embodiments, it is to be understood that the disclosure is not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Each embodiment can provide an information processing apparatus that can realize a virtual space environment that enhances the user's sense of immersion.
This application claims the benefit of Japanese Patent Application No. 2022-200868, filed on Dec. 16, 2022, which is hereby incorporated by reference herein in its entirety.