INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM

Information

  • Patent Application
  • Publication Number
    20240201822
  • Date Filed
    November 01, 2023
  • Date Published
    June 20, 2024
Abstract
An information processing apparatus for generating virtual space and a virtual object includes a memory storing instructions, and a processor configured to execute the instructions to display the virtual object in the virtual space, and execute a function of the virtual object in the virtual space based on function information about the virtual object acquired from the memory.
Description
BACKGROUND
Technical Field

One of the aspects of the embodiments relates to an information processing apparatus, an information processing system, an information processing method, and a storage medium.


Description of Related Art

Technologies that utilize virtual space to train users to operate equipment have recently become known. A model (avatar) representing the user and a model (object) of the virtual equipment used for the training are placed in the virtual space. Moving the avatar and the model in the virtual space in accordance with the user's movements enables easy training in environments that would be difficult to prepare in real life due to constraints such as cost and safety. Japanese Patent Laid-Open No. 2022-47989 discloses a technology for reproducing a sterile room as the virtual space in the medical field.


Recently, in order to provide users with a more immersive virtual world, manufacturers of actual products have begun to sell officially certified three-dimensional data (actual product information) so that it can be displayed as an object in the virtual space.


In a case where an object is displayed in a virtual space such as that disclosed in Japanese Patent Laid-Open No. 2022-47989, the functions that can be executed with the object depend on the settings of the application that displays the virtual space. Thus, in a case where an actual product is displayed, user operations and function executions that are unavailable with the actual product may become available depending on the settings of the application. In this case, the user's sense of immersion in the virtual space deteriorates.


SUMMARY

An information processing apparatus according to one aspect of the embodiment for generating virtual space and a virtual object includes a memory storing instructions, and a processor configured to execute the instructions to display the virtual object in the virtual space, and execute a function of the virtual object in the virtual space based on function information about the virtual object acquired from the memory. An information processing apparatus according to another aspect of the embodiment for generating virtual space includes a memory storing instructions, and a processor configured to execute the instructions to display a model of a user and a virtual object in the virtual space, change at least one of the model and the virtual object to be displayed, and acquire user operation information. The processor changes at least one of an operation of the model in the virtual space or a display of the virtual object based on the user operation information and function information about the virtual object acquired from the memory. An information processing system including each of the above information processing apparatuses also constitutes another aspect of the embodiment. An information processing method corresponding to each of the above information processing apparatuses also constitutes another aspect of the embodiment. A storage medium storing a program that causes a computer to execute the above information processing method also constitutes another aspect of the embodiment.


Further features of the disclosure will become apparent from the following description of embodiments with reference to the attached drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an information processing system according to a first embodiment.



FIG. 2 illustrates an example of user information in the first embodiment.



FIG. 3 illustrates an example of an information database in the first embodiment.



FIG. 4 is a flowchart illustrating the processing of an information processing apparatus according to the first embodiment.



FIGS. 5A and 5B illustrate examples of user operations according to the first embodiment.



FIG. 6 explains avatar motion and object display in the first embodiment.



FIG. 7 is a block diagram of an information processing system according to a second embodiment.



FIG. 8 is a flowchart illustrating the processing of an information processing apparatus according to the second embodiment.



FIG. 9 explains avatar motion and object display in the second embodiment.



FIGS. 10A and 10B explain avatar motion and object display in the second embodiment.





DESCRIPTION OF THE EMBODIMENTS

In the following, the term “unit” may refer to a software context, a hardware context, or a combination of software and hardware contexts. In the software context, the term “unit” refers to a functionality, an application, a software module, a function, a routine, a set of instructions, or a program that can be executed by a programmable processor such as a microprocessor, a central processing unit (CPU), or a specially designed programmable device or controller. A memory contains instructions or programs that, when executed by the CPU, cause the CPU to perform operations corresponding to units or functions. In the hardware context, the term “unit” refers to a hardware element, a circuit, an assembly, a physical structure, a system, a module, or a subsystem. Depending on the specific embodiment, the term “unit” may include mechanical, optical, or electrical components, or any combination of them. The term “unit” may include active (e.g., transistors) or passive (e.g., capacitor) components. The term “unit” may include semiconductor devices having a substrate and other layers of materials having various concentrations of conductivity. It may include a CPU or a programmable processor that can execute a program stored in a memory to perform specified functions. The term “unit” may include logic elements (e.g., AND, OR) implemented by transistor circuits or any other switching circuits. In the combination of software and hardware contexts, the term “unit” or “circuit” refers to any combination of the software and hardware contexts as described above. In addition, the term “element,” “assembly,” “component,” or “device” may also refer to “circuit” with or without integration with packaging materials.


Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the disclosure.


First Embodiment

Referring now to FIG. 1, a description will be given of a first embodiment. FIG. 1 is a block diagram of an information processing system 10. The information processing system 10 includes a computer (information processing apparatus) 100 and a product information database (information storing apparatus) 200. The computer 100 is, for example, an apparatus such as a head mount display (HMD) for displaying virtual space to a user. The computer 100 includes a communication interface 101, a user information storing unit (identification (ID) information storing unit (memory) for storing user ID information) 102, a user authority determining unit (second determining unit) 103, an object information acquiring unit 104, a verifying unit (first determining unit) 105, a user set value acquiring unit 106, a user operation reading unit (operation information acquiring unit for acquiring user operation information) 107, a set value reflecting unit 108, a display unit 109, and a function executing unit 110.


In the computer 100, the user information storing unit 102 stores, for each user (user ID, user ID information), product name information for the products whose functions the user has the authority to execute in the virtual space, and the actually selected product name information (user information). FIG. 2 illustrates an example of the user information. As illustrated in FIG. 2, in a case where the product is an image pickup apparatus, the product name information includes information about the camera, lens, and accessory.
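As a rough illustration, the per-user record of FIG. 2 might be held as in the following Python sketch. The class and field names (UserInfo, authorized_products, selected_products) are hypothetical; the embodiment does not prescribe a particular data layout.

```python
from dataclasses import dataclass, field

@dataclass
class UserInfo:
    """Per-user record in the user information storing unit 102
    (hypothetical layout mirroring FIG. 2)."""
    user_id: str
    authorized_products: set[str] = field(default_factory=set)
    selected_products: list[str] = field(default_factory=list)

# Example entry: a user authorized for one camera and two lenses,
# who has currently selected the camera and the prime lens.
user = UserInfo(
    user_id="U0001",
    authorized_products={"Camera-A", "Zoom-Lens-B", "Prime-Lens-C"},
    selected_products=["Camera-A", "Prime-Lens-C"],
)
print(sorted(user.authorized_products))
```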


In a case where a user selects a product to execute its function in the virtual space, the user authority determining unit 103 determines whether the user has the authority to execute the function of the product. In other words, the user authority determining unit 103 determines whether the user corresponding to the user identification information is authorized (or entitled) to execute the function of the product (virtual object). In a case where it is determined that the user is authorized, the computer 100 transmits the product name information selected by the user to the product information database 200 via the communication interface 101. The product information database 200 receives the product name information via a communication interface 201. An intra-database (DB) search unit 202 searches for actual products based on the received product name information. In a case where actual product information matching the received product name information is found, the intra-DB search unit 202 retrieves the actual product information. That is, the product information database 200 has a search unit for searching for an actual product corresponding to the virtual object, and in a case where the search unit finds the actual product corresponding to the virtual object, the product information database 200 transmits the function information about the actual product corresponding to the virtual object to the computer 100. In this embodiment, a product information group 203 in the product information database 200 includes, for example, a camera information database 204, a lens information database 205, and an accessory information database 206, but is not limited to this example.
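The authority check and database lookup described above could be sketched as follows. The unit names in the comments follow the description, but the function signatures and the contents of PRODUCT_DB are assumptions made for illustration.

```python
# Stand-in for the product information group 203 (hypothetical contents).
PRODUCT_DB = {
    "Camera-A": {"kind": "camera", "functions": {"max_fps": 120}},
    "Zoom-Lens-B": {"kind": "lens", "functions": {"zoom": True}},
}

def user_has_authority(user_info: dict, product_name: str) -> bool:
    # User authority determining unit 103: is this user entitled to
    # execute the functions of the selected product?
    return product_name in user_info["authorized_products"]

def search_db(product_name: str):
    # Intra-DB search unit 202: return the actual product information
    # if a product matching the received name exists, else None.
    return PRODUCT_DB.get(product_name)

user_info = {"authorized_products": {"Camera-A"}}
if user_has_authority(user_info, "Camera-A"):
    product = search_db("Camera-A")  # travels over interfaces 201/101
    if product is not None:
        print(product["functions"])  # -> {'max_fps': 120}
```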



FIG. 3 illustrates an example of the product information group 203 stored in the product information database 200. As illustrated in FIG. 3, the product information group 203 includes information about the size and appearance of each actual product and information about the functions of each product. The information about the size and appearance includes information about the length and width of the product, the arrangement of various buttons, the position of a switch, the position of an LCD monitor, the type of a mount, and the like. In the camera information database 204, the information about functions includes, for example, a recordable frame rate value, the number of frames and speed of continuous imaging, an object detecting function, an autofocus (AF) frame display function, and the like. In the lens information database 205, the information about functions includes, for example, information indicating whether the lens is a fixed focal length lens or a lens with a zoom function, information indicating whether the lens has a manual focus (MF) function only or also has an AF function, a focal length, a mount type, an angle of view, the shortest imaging distance, and the like. In the accessory information database 206, the information about functions includes, for example, an adapter function that enables connection between different mounts of the camera and lens, a remote imaging function that allows the avatar to capture an image with a remote camera, and the like. In the accessory information database 206, the information about functions may also include a camera fixing function (tripod) that suppresses the shake of the camera body when the shutter is released.
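The per-category databases of FIG. 3 might be organized as below. All product names and numeric values are invented placeholders; only the split into size/appearance information and function information follows the description.

```python
# Hypothetical entries mirroring FIG. 3: each product carries size and
# appearance data plus function information specific to its category.
camera_info_db = {
    "Camera-A": {
        "size": {"width_mm": 138, "height_mm": 98, "mount": "Mount-X"},
        "functions": {
            "frame_rates": [24, 30, 60, 120],  # recordable frame rates
            "burst_fps": 20,                   # continuous-imaging speed
            "subject_detection": True,
            "af_frame_display": True,
        },
    },
}
lens_info_db = {
    "Prime-Lens-C": {
        "size": {"length_mm": 70, "mount": "Mount-X"},
        "functions": {
            "zoom": False,           # fixed focal length lens
            "af": True,              # has AF in addition to MF
            "focal_length_mm": 50,
            "min_distance_m": 0.45,  # shortest imaging distance
        },
    },
}
accessory_info_db = {
    "Adapter-D": {"functions": {"adapts": ("Mount-X", "Mount-Y")}},
    "Tripod-E": {"functions": {"fixes_camera": True}},  # suppresses shake
}

print(lens_info_db["Prime-Lens-C"]["functions"]["focal_length_mm"])  # 50
```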


The retrieved product information is sent to the object information acquiring unit 104 in the computer 100 via the communication interfaces 201 and 101. The display unit 109 displays an avatar (a user's model in the virtual space) and an object (a product (virtual object) in the virtual space) based on the product information received from the product information database 200.


On the other hand, the user set value acquiring unit 106 receives a menu value set by the user and the user operation information read by the user operation reading unit 107. The verifying unit 105 determines whether or not the user operation is available based on the actual product specification, in conjunction with the already acquired product information. That is, the verifying unit 105 uses the user operation information and the function information to determine whether the user operation is executable. In a case where the verifying unit 105 determines that the user operation is available, the verifying unit 105 sends the user operation information to the set value reflecting unit 108. Then, the function executing unit 110 executes the function based on the information from the set value reflecting unit 108. In a case where the product is a camera, the functions include starting to capture still images or videos, zooming the displayed image, and connecting the lens and the camera. In a case where the verifying unit 105 determines that the function specified by the user operation is unavailable in light of the product specification, the verifying unit 105 grays out the function on the application menu or prevents the corresponding switch or ring from operating. In a case where the verifying unit 105 determines that the combination of the camera and lens cannot be connected due to their mechanical configurations, it prevents them from being connected.
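A minimal sketch of the verifying unit's decision follows, assuming the function information arrives as a flat dictionary of capability flags; an actual product specification would be richer.

```python
def verify_operation(operation: str, product_functions: dict) -> bool:
    """Verifying unit 105 (sketch): decide from the actual product's
    function information whether the requested operation is executable."""
    supported = {
        "zoom": product_functions.get("zoom", False),
        "autofocus": product_functions.get("af", False),
    }
    return supported.get(operation, False)

lens_functions = {"zoom": False, "af": True}  # a fixed focal length lens
for op in ("zoom", "autofocus"):
    if verify_operation(op, lens_functions):
        print(f"{op}: passed to the set value reflecting unit")
    else:
        print(f"{op}: grayed out / the ring does not operate")
```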


Referring now to FIG. 4, a detailed description will be given of the processing of the computer 100. FIG. 4 is a flowchart illustrating the processing of the computer 100. Each step in FIG. 4 is mainly executed by each part of the computer 100 including the verifying unit 105.


First, in step S101, the verifying unit 105 determines whether or not the object (product) selected for display in the virtual reality application that is used by the user is included in the actual product information (whether or not the object having the actual product information is to be displayed). This determination is made based on information received from the product information database 200. In a case where it is determined that the object selected by the user is not included in the actual product information, the flow proceeds to step S102. In step S102, the display unit 109 displays a default object appearance in each virtual space application (displays an object in each application).


On the other hand, in a case where it is determined in step S101 that the object selected by the user is included in the actual product information, the flow proceeds to step S103. In step S103, the object information acquiring unit 104 acquires external shape data (product external shape information) such as the position, orientation, and size of the object from the product information database 200. Next, in step S104, the display unit 109 displays the actual product (actual or real object) in the virtual reality.


Next, in step S105, the verifying unit 105 determines whether or not the function of the actual object (actual or real product) is to be reproduced in the virtual reality, that is, whether or not functional consistency is to be ensured between the user operation and the object displayed in step S104. In a case where it is determined that the function of the actual object is not to be reproduced (the functional consistency between the user operation and the actual object is not ensured), the flow proceeds to step S106. In step S106, the function executing unit 110 executes the function of the object as determined within each application.


On the other hand, in a case where it is determined in step S105 that the function of the actual object is to be reproduced (functional consistency between the user operation and the actual object is ensured), the flow proceeds to step S107. In step S107, the object information acquiring unit 104 acquires function information (product function information about the displayed object) from the product information database 200. Next, in step S108, the verifying unit 105 verifies the user operation information, separately acquired by the user operation reading unit 107, against the product function information acquired in step S107, and recognizes the content of the user operation information. The user operation information includes, for example, menu value change information, button or ring operations acquired from the user's hand movements, and the like.



FIGS. 5A and 5B illustrate an example of user operation. In FIGS. 5A and 5B, reference numeral 300 denotes a user in the real world, reference numeral 301 denotes a head mount display (HMD) in the real world, and reference numeral 302 denotes a motion sensor (marker) in the real world. As illustrated in FIG. 5A, the hand movement of the user 300 can be captured as a see-through image from the HMD 301. The hand movement of the user 300 may be acquired from the movement of the motion sensor 302 worn on the hand as illustrated in FIG. 5B, or from a camera (not illustrated) separately installed in the room.


Next, in step S109, the verifying unit 105 determines whether the user operation is an operation that is unavailable according to the product specification of the object. In a case where it is determined that the user operation is unavailable according to the product specification of the object, the flow proceeds to step S110. In step S110, the function executing unit 110 does not execute the function of the object based on the user operation. At this time, the function executing unit 110 may instead execute a predetermined function (a function corresponding to the settings of the application, out of all functions executable in the application).


On the other hand, in a case where it is determined in step S109 that the user operation is an operation that is available according to the product specification of the object, the flow proceeds to step S111. In step S111, the function executing unit 110 executes the object function (object operation corresponding to the actual product) based on the user operation. That is, the function executing unit 110 executes a function (function corresponding to function information) based on the function information about the virtual object (product function information) acquired from the product information database 200.
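Condensing the flow of FIG. 4 into a single function gives roughly the following sketch; the step numbers are noted inline, and the signature is a hypothetical simplification (in the embodiment this data is exchanged between units over the communication interfaces).

```python
def process_object_selection(selected: str, actual_db: dict,
                             reproduce_functions: bool,
                             user_operation: str) -> str:
    """One pass through the FIG. 4 flow; step numbers are noted inline."""
    info = actual_db.get(selected)                    # S101
    if info is None:
        return "display default object appearance (S102)"
    # S103/S104: acquire external shape data and display the product.
    if not reproduce_functions:                       # S105
        return "execute application-defined function (S106)"
    functions = info["functions"]                     # S107
    available = functions.get(user_operation, False)  # S108/S109
    if not available:
        return "operation unavailable per product specification (S110)"
    return f"execute '{user_operation}' as on the actual product (S111)"

db = {"Zoom-Lens-B": {"functions": {"zoom": True}}}
print(process_object_selection("Zoom-Lens-B", db, True, "zoom"))
```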


For example, assume that a user operates a zoom ring of a lens in the real world, as illustrated in FIG. 5A. FIG. 6 explains the avatar motion and object display in a case where the user operates the zoom ring. In FIG. 6, reference numeral 400 denotes an avatar in the virtual space, reference numeral 401 denotes a camera in the virtual space, and reference numeral 402 denotes a lens with a zoom function in the virtual space. In a case where the user operates the zoom ring in the real world and the lens selected in the virtual space is a zoom lens, the image being captured is enlarged, as illustrated in FIG. 6. However, in a case where the lens configuration does not have the zoom function in light of the product specification, such as a fixed focal length lens (corresponding to step S110 in FIG. 4), the image being recorded, or the image that the user is viewing, is not enlarged even if the user's hand movement is reproduced as the avatar motion.


For example, in a case where the lens selected by the user has an AF/MF switch, during AF the camera autofocuses on the user's desired position within the range of the lens performance and the currently selected menu values, and the image blur increases as the position becomes more distant from the focus position. On the other hand, in a case where the MF button is selected or an MF dedicated lens is attached, the focus position is kept constant and autofocusing on the user's desired position is not performed. At that time, information such as the focal length, the F-number, and the pixel pitch is also verified.


Consider blur reproduction. In capturing a still image in the virtual world, blurring does not occur as it does in reality; however, in a case where the user has selected to maintain functional consistency in step S105, the image is blurred by the user's camera shake or shutter-pressing action. In a case where the lens or camera has an image stabilizing function and the user turns this function on, the camera shake is suppressed within the performance range.
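The embodiment does not give a blur model, but the behavior could be approximated as in this sketch, in which image stabilization attenuates the user's shake within a rated number of stops; both the model and the numbers are assumptions.

```python
def reproduce_blur(shake_px: float, stabilization_stops: float = 0.0) -> float:
    """Blur (in pixels) applied to the virtual still image: the user's
    shake passes through, attenuated by any image stabilization that is
    turned on (one stop is assumed to halve the shake)."""
    return shake_px / (2 ** stabilization_stops)

print(reproduce_blur(8.0))       # IS off: the full shake is reproduced
print(reproduce_blur(8.0, 3.0))  # IS rated about 3 stops: 1.0 px
```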


In a case where the user performs slow-motion or fast-motion recording, a combination of image resolution and frame rate supported by the actual camera is acquired and reflected in the recorded image. In a case where the frame rate of the virtual space application being executed is lower than the frame rate of the actual camera, a warning to that effect is issued and the execution is restricted.
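That restriction might be checked as in the following sketch; the mode list, signature, and messages are illustrative only.

```python
def check_recording(app_fps: int, camera_modes: list[tuple[str, int]],
                    requested: tuple[str, int]) -> str:
    """Restrict slow/fast-motion recording to the resolution and frame
    rate combinations of the actual camera, and warn when the virtual
    space application cannot keep up with the requested frame rate."""
    if requested not in camera_modes:
        return "rejected: mode not installed in the actual camera"
    resolution, fps = requested
    if app_fps < fps:
        return f"warning: app runs at {app_fps} fps < {fps} fps; restricted"
    return f"record {resolution} at {fps} fps"

modes = [("4K", 60), ("FHD", 120)]
print(check_recording(90, modes, ("FHD", 120)))  # app too slow -> warning
print(check_recording(90, modes, ("4K", 60)))    # executes normally
```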


This embodiment thus enables users to train in the virtual space in accordance with the actual specifications of the product being used.


Second Embodiment

A description will now be given of a second embodiment. FIG. 7 is a block diagram of an information processing system 10a. The information processing system 10a includes a computer (information processing apparatus) 500 and a product information database (information storing apparatus) 200. The computer 500 is, for example, an apparatus such as an HMD for displaying virtual space to a user. The computer 500 includes a communication interface 501, a user information storing unit (ID information storing unit (memory)) 502, a user authority determining unit (third determining unit) 503, and a verifying unit 504. The computer 500 also includes a user operation acquiring unit (operation information acquiring unit) 505, an avatar motion change determining unit 506, an object change determining unit 507, and a display unit 508. The user information storing unit 502 stores user ID information. The user authority determining unit 503 determines whether a user corresponding to the user ID information is authorized (entitled) to display an object (virtual object) on the display unit 508. The user operation acquiring unit 505 acquires user operation information. The avatar motion change determining unit 506 and the object change determining unit 507 are changing units that change at least one of the model or the virtual object displayed on the display unit 508. The changing units change at least one of the motion of the avatar (model) and the display of the object (virtual object) in the virtual space based on the user operation information and the function information about the object (virtual object) acquired from the product information database 200. The product information database 200 is similar to that of the first embodiment, and thus a description thereof will be omitted.


Referring now to FIG. 8, a detailed description will be given of the processing of the computer 500. FIG. 8 is a flowchart illustrating the processing of the computer 500. Each step in FIG. 8 is mainly executed by each part of the computer 500, such as the verifying unit 504. Steps S201 to S205 in FIG. 8 are similar to steps S101 to S105 in FIG. 4, respectively, and therefore a description thereof will be omitted.


In a case where it is determined in step S205 that the function of the actual object is not to be reproduced (functional consistency between the user operation (object displayed in step S204) and the actual object is not ensured), the flow proceeds to step S206. In step S206, the display unit 508 maintains (continues) displaying the currently displayed avatar motion and object appearance.


On the other hand, in a case where it is determined in step S205 that the function of the actual object is to be reproduced (functional consistency between the user operation and the actual object is ensured), the flow proceeds to step S207. In step S207, the avatar motion change determining unit 506 determines whether the user operation is an operation that is to change the avatar display. In a case where it is determined that the user operation is not an operation that is to change the avatar display, the flow proceeds to step S208. In step S208, the display unit 508 maintains (continues) displaying the currently displayed avatar motion and object appearance. On the other hand, in a case where it is determined in step S207 that the user operation is an operation that is to change the avatar display, the flow proceeds to step S209. In step S209, the avatar motion change determining unit 506 changes the avatar motion.


For example, in the real world the camera is commonly mounted on a tripod to prevent image blur when performing slow shutter imaging. In the virtual world, if the avatar in the virtual space is holding the camera even though the user is trying to capture an image with a slow shutter, the avatar motion may be changed. FIG. 9 explains the avatar motion and object display. In FIG. 9, reference numeral 400 denotes an avatar in the virtual space, reference numeral 401 denotes a camera in the virtual space, reference numeral 403 denotes a fixed focal length lens in the virtual space, reference numeral 404 denotes a tripod in the virtual space, and reference numeral 406 denotes a remote switch in the virtual space.


In this example, in step S209, as illustrated in FIG. 9, the avatar 400 takes out the tripod 404, fixes the camera 401 onto it, and presses the remote switch 406 instead of the camera switch. On the other hand, in a case where the user operation corresponds to the actual operation, the operation of the avatar 400 is not changed, and the avatar 400 continues its operation in step S208.


Next, in step S210, the verifying unit 504 acquires product function information about the displayed object from the product information database 200. Next, in step S211, the verifying unit 504 verifies the user operation information and the acquired product function information. Next, in step S212, the avatar motion change determining unit 506 and the object change determining unit 507 determine whether or not the avatar motion and object display (product display) are to be changed, respectively, based on the verification result of the verifying unit 504. In a case where it is determined that the avatar motion and object display (product display) are not to be changed, the flow proceeds to step S213. In step S213, the display unit 508 maintains (continues) the avatar motion and object display. On the other hand, in a case where it is determined that the avatar motion and object display (product display) are to be changed, the flow proceeds to step S214. In step S214, the display unit 508 changes the avatar motion and object display.
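Steps S205 through S214 of FIG. 8 can be condensed into the sketch below, with the three decisions (S205, S207, and S212) passed in as booleans for brevity; this is a hypothetical simplification of the units involved.

```python
def second_embodiment_flow(reproduce: bool, changes_avatar: bool,
                           change_after_verification: bool) -> str:
    """Steps S205-S214 of FIG. 8, condensed into one pass."""
    if not reproduce:                      # S205: "no"
        return "keep current avatar motion and object display (S206)"
    if changes_avatar:                     # S207: "yes"
        avatar = "avatar motion changed (S209)"
    else:
        avatar = "avatar motion kept (S208)"
    # S210/S211: acquire product function information and verify it
    # against the user operation information.
    if change_after_verification:          # S212: "yes"
        return avatar + "; avatar motion and object display changed (S214)"
    return avatar + "; avatar motion and object display kept (S213)"

print(second_embodiment_flow(True, True, True))
```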



FIGS. 10A and 10B explain the avatar motion and object display. In FIG. 10B, reference numeral 405 denotes a super-telephoto lens in the virtual space. For example, in a case where the user performs a zoom operation even though the lens of the camera 401 currently held by the avatar 400 is the fixed focal length lens 403, it is determined that the operation is to change the avatar motion. At this time, as illustrated in FIG. 10A, the avatar 400 adds an action of changing the lens from the fixed focal length lens 403 to the lens 402 with the zoom function. Alternatively, in a case where the user has operated a zoom to a super-telephoto position beyond the performance of the lens currently attached to the camera 401, the avatar 400 adds an action of replacing the lens 402 with the zoom function with the super-telephoto lens 405, as illustrated in FIG. 10B.


In a case where there are a plurality of product candidates that can be selected in response to a user operation, it is necessary to select which product to replace the current one with. At this time, the computer 500 may automatically select the product, or the user may select it. In the case of user selection, the priority of replacement products may be determined by the user in advance, instead of determining a replacement product anew for each user operation. In this case, the object change determining unit 507 changes the display of the virtual object (product) based on the priority previously registered by the user. Thereby, a series of user actions and avatar actions can be brought closer to real-world imaging actions, and the user's sense of immersion can be further enhanced.
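The priority-based selection might look like this minimal sketch; the candidate names and the fallback to automatic selection of the first candidate are assumptions for illustration.

```python
def choose_replacement(candidates: list[str], user_priority: list[str]) -> str:
    """Object change determining unit 507 (sketch): honor the user's
    pre-registered priority list first; otherwise fall back to the
    first candidate (automatic selection by the computer)."""
    for name in user_priority:
        if name in candidates:
            return name
    return candidates[0]

candidates = ["Zoom-Lens-B", "Super-Telephoto-F"]
priority = ["Super-Telephoto-F", "Zoom-Lens-B"]  # registered in advance
print(choose_replacement(candidates, priority))  # -> Super-Telephoto-F
```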


Other Embodiments

Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disc (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.


While the disclosure has been described with reference to embodiments, it is to be understood that the disclosure is not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.


Each embodiment can provide an information processing apparatus that can realize a virtual space environment that enhances the user's sense of immersion.


This application claims the benefit of Japanese Patent Application No. 2022-200868, filed on Dec. 16, 2022, which is hereby incorporated by reference herein in its entirety.

Claims
  • 1. An information processing apparatus for generating virtual space and a virtual object, the information processing apparatus comprising: a memory storing instructions; and a processor configured to execute the instructions to: display the virtual object in the virtual space, and execute a function of the virtual object in the virtual space based on function information about the virtual object acquired from the memory.
  • 2. The information processing apparatus according to claim 1, wherein the function information about the virtual object is function information about an actual product corresponding to the virtual object.
  • 3. The information processing apparatus according to claim 1, wherein the processor is configured to: acquire user operation information, determine whether the user operation is executable using the user operation information and the function information, and execute the function in a case where it is determined that the user operation is executable.
  • 4. The information processing apparatus according to claim 1, wherein the memory further stores user identification information, and wherein the processor is configured to: determine whether a user corresponding to the user identification information is authorized to execute the function of the virtual object, and acquire the function information about the virtual object from the memory in a case where it is determined that the user is authorized to execute the function of the virtual object.
  • 5. The information processing apparatus according to claim 1, wherein the processor is configured to: execute the function corresponding to the function information in a case where the function information about the virtual object is acquired from the memory, and execute a function corresponding to a setting of an application in a case where the function information about the virtual object is not acquired from the memory.
  • 6. An information processing apparatus for generating virtual space, the information processing apparatus comprising: a memory storing instructions; and a processor configured to execute the instructions to: display a model of a user and a virtual object in the virtual space, change at least one of the model and the virtual object to be displayed, and acquire user operation information, wherein the processor changes at least one of an operation of the model in the virtual space or a display of the virtual object based on the user operation information and function information about the virtual object acquired from the memory.
  • 7. The information processing apparatus according to claim 6, wherein the function information about the virtual object is function information about an actual product corresponding to the virtual object.
  • 8. The information processing apparatus according to claim 6, wherein the processor is configured to: execute the function of the virtual object in the virtual space, and execute the function based on the function information about the virtual object.
  • 9. The information processing apparatus according to claim 6, wherein the memory further stores user identification information, and wherein the processor is configured to: determine whether a user corresponding to the user identification information can display the virtual object, and acquire the function information about the virtual object from the memory in a case where it is determined that the user can display the virtual object.
  • 10. The information processing apparatus according to claim 6, wherein the processor is configured to change a display of the virtual object based on a priority registered previously by the user.
  • 11. An information processing system comprising: an information storing apparatus configured to store function information about a plurality of actual products; and the information processing apparatus according to claim 1, wherein the information storing apparatus transmits, to the information processing apparatus, function information about one of the plurality of actual products, which corresponds to the virtual object.
  • 12. The information processing system according to claim 11, wherein the information storing apparatus includes a search unit configured to search for the actual product corresponding to the virtual object, and wherein in a case where the search unit searches for the actual product corresponding to the virtual object, the information storing apparatus transmits the function information about the actual product corresponding to the virtual object, to the information processing apparatus.
  • 13. An information processing method for generating virtual space and a virtual object, the information processing method comprising the steps of: displaying the virtual object in the virtual space; and executing a function of the virtual object in the virtual space based on function information about the virtual object acquired from a memory.
  • 14. An information processing method for generating virtual space, the information processing method comprising the steps of: displaying a model of a user and a virtual object in the virtual space; changing at least one of the model and the virtual object to be displayed; and acquiring user operation information, wherein the changing step changes at least one of an operation of the model in the virtual space or a display of the virtual object based on the user operation information and function information about the virtual object acquired from a memory.
  • 15. A non-transitory computer-readable storage medium storing a program that causes a computer to execute the information processing method according to claim 13.
Priority Claims (1)
Number Date Country Kind
2022-200868 Dec 2022 JP national