This disclosure relates to controlling data processing.
Some data processing activities may be controlled by a detection of a trackable device, for example.
An example arrangement involves a games machine in which movements of a device, such as a head mountable display (HMD) and/or a hand-holdable controller such as a Sony® Move® Controller, can be tracked by one or more cameras.
In a so-called mixed reality system (for example, being capable of combining features of real and virtual environments in a display presented to the user), it may be that multiple cameras are in use, for example a camera to obtain images of the user, for example in order to track the user's position and/or activity, and one or more cameras to capture images of the user's real environment. There is a need to be able to spatially align the frames of reference of the captured images obtained from the various cameras.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
Various aspects and features of the present disclosure are defined in the appended claims and within the text of the accompanying description and include at least a head-mountable apparatus such as a display and a method of operating a head-mountable apparatus as well as a computer program.
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Referring to the drawings,
In
A camera 305 is associated with the console 300 to capture images of the user 10 and/or the controller 330.
The Move controller comprises a handle portion 100 and an illuminated end portion 110. The handle portion 100 can carry one or more control buttons and houses a so-called inertial measurement unit (IMU), which will be described in more detail below. The illuminated end portion 110 comprises one or more light emitting diodes (LEDs) inside a translucent spherical shell, which are capable of being illuminated, for example under the control of an apparatus such as the games console 300.
The Move controller 330 provides an example of a control device comprising an elongate handle portion 100 which houses the inertial detector 332, 334 and an illuminated end portion 110 at an end of the handle portion.
In use, the IMU transmits inertial detections to the games console 300 and the games console 300 also tracks, using images captured by the camera 305, the location of the illuminated end portion 110.
A pair of video displays in the HMD 20 are arranged to display images provided via the games console 300, and a pair of earpieces 60 in the HMD 20 are arranged to reproduce audio signals generated by the games console 300. The games console may be in communication with a video server. A USB connection from the games console 300 also provides power to the HMD 20, according to the USB standard.
As mentioned above, the Move controller 330 houses an Inertial Measurement Unit (IMU) 332. Inertial measurements are considered, for the purposes of this description, to encompass one or more of: accelerometer detections; and gyroscopic detections. The games console 300 is connected to the camera 305 which, for the purposes of this explanation, is assumed to be directed towards the Move controller 330.
In operation, the camera 305 captures images including images of the illuminated end portion 110 of the Move controller 330. These are processed by an image processor 312 to identify a location, with respect to the frame of reference of the captured images, of the illuminated end portion 110.
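Purely by way of illustration, such an image-processing step might be sketched as a brightness threshold followed by a contour centroid. The use of OpenCV, and the HSV threshold values, are assumptions made for the purposes of the sketch and do not form part of the arrangement described here.

```python
# A minimal sketch of locating an illuminated spherical end portion in a
# captured frame. OpenCV and the HSV threshold values are illustrative
# assumptions, not part of the arrangement described above.
import cv2
import numpy as np

def locate_illuminated_portion(frame_bgr: np.ndarray):
    """Return the (x, y) pixel centroid of the largest bright blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Hypothetical bounds for a bright, saturated LED sphere.
    lower = np.array([0, 0, 200], dtype=np.uint8)
    upper = np.array([179, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```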
An IMU data receiver 314 receives wireless telemetry 316 from the IMU 332, for example via a Bluetooth® wireless link provided by a wireless interface 334 forming part of the IMU. The telemetry signals 316 indicate inertial measurements performed by the IMU 332, for example accelerometer and gyroscopic measurements indicating an orientation of the Move controller 330 or at least changes in its orientation.
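By way of example only, gyroscopic rate measurements of the kind carried by the telemetry signals 316 might be accumulated into an orientation estimate as sketched below; the quaternion representation and the (rates, dt) sample format are assumptions made for the sketch, the actual IMU packet layout not being specified here.

```python
# A minimal sketch of integrating gyroscope rate samples into an orientation
# quaternion. The (rates, dt) sample format is an assumption; the real IMU
# telemetry layout is not specified in the description above.
import numpy as np

def quat_multiply(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def integrate_gyro(q: np.ndarray, rates: np.ndarray, dt: float) -> np.ndarray:
    """Advance orientation q by body angular rates (rad/s) over dt seconds."""
    angle = np.linalg.norm(rates) * dt
    if angle < 1e-12:
        return q
    axis = rates / np.linalg.norm(rates)
    dq = np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])
    q_new = quat_multiply(q, dq)
    return q_new / np.linalg.norm(q_new)  # renormalise to unit length
```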
Data provided by the image processor 312 indicative of the location of the illuminated end portion 110 in the captured images, and data provided by the IMU data receiver 314, are processed by a location and orientation detector 318 which provides an output signal 322 indicative of a current location and orientation of the Move controller 330, or at least changes in the location and orientation. Note that these changes can be used in the arrangement of
This information provided by the signal 322 can be used in various different ways as a generic control input to a data processing operation. Purely by way of example, in the arrangement shown schematically in
In executing such a mixed reality operation, more than one video camera is generally required because it is necessary to capture the real environment around the user from multiple different angles in order to generate the mixed reality environment for display to the user. In
A Move controller 420 is provided. While it is possible for both cameras to track the illuminated end portion 422 of the Move controller 420, system restrictions prevent the IMU 424 of the Move controller 420 (via its wireless link 426) from communicating with both the games console 405 and the games console 415.
Having both of the games consoles 405, 415 track the Move controller 420 can be particularly useful not only during gameplay situations in a mixed reality environment, for example, but also during a calibration phase in which the relative locations of the cameras 400, 410 are determined by using both cameras 400, 410 to track a common object (the Move controller 420).
However, for the reasons discussed above, this is not possible using a single Move controller.
One possibility to address this would be to register a second Move controller (with that one of the games consoles 405, 415 not tracking the Move controller 420) and keep it out of sight so that it is not tracked by the respective video camera. However, the motion capture results between the two games consoles 405, 415 would then not correlate. This is because the tracking system used to detect the location of the Move controller 420 is driven, as discussed above, by a combination of IMU data and captured images of the illuminated end portion 422.
So, a solution is not necessarily provided by registering a second Move controller but keeping it out of sight.
two or more user control devices 600, 610, each user control device having an inertial detector 424 having a wireless communications module 426 to communicate inertial detection results, and an illuminated portion 605, 615; and
a housing 500 to house the two or more control devices so that the two or more control devices are constrained to move with one another, the housing having an obscured portion so as to obscure all (605) but one (615) of the illuminated portions of the two or more control devices 600, 610.
Note that it is not necessarily a requirement that the whole of the housing 500 is obscured; a portion 505 surrounding the illuminated portion 605 could be obscured, and the remainder not obscured (for example, being transparent or even an open frame structure).
The tubular housing 500 of
In terms of the IMU data, one of the Move controllers such as the Move controller 610 provides IMU data to the games console 405 and the other Move controller 600 provides IMU data to the games console 415.
Using this arrangement, at least in a calibration phase during which the relative location of plural video cameras is determined, the housing 500 carrying the multiple Move controllers 600, 610 but allowing the illuminated end portion 615 to be visible can be tracked by both of the cameras and games consoles of
In principle a housing similar to the housing 500 could be provided which encompasses more than two Move controllers, still with just one illuminated end portion protruding or otherwise visible, so that more than two cameras may be mapped together in this way. However, another option where more than two cameras are in use is to generate mappings between them on a pair-wise basis, in other words two at a time.
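If each pair-wise mapping is expressed as a rigid transform, mappings for more than two cameras can be obtained by composition. A minimal sketch follows, in which the 4×4 homogeneous-matrix representation is an assumption made for illustration:

```python
# A minimal sketch of chaining pair-wise frame mappings. t_ab maps points
# expressed in camera B's frame of reference into camera A's frame; composing
# t_ab with t_bc yields a mapping from camera C's frame to camera A's frame.
# The 4x4 homogeneous-matrix representation is an assumption for this sketch.
import numpy as np

def compose(t_ab: np.ndarray, t_bc: np.ndarray) -> np.ndarray:
    """Return t_ac, mapping C-frame coordinates to A-frame coordinates."""
    return t_ab @ t_bc

def apply_mapping(t: np.ndarray, point: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    p = np.append(point, 1.0)
    return (t @ p)[:3]
```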
a control apparatus 700;
two or more data processing devices 405, 415, a respective data processing device being associated with each of the two or more user control devices 600, 610 so as to receive inertial detection results from that user control device;
two or more video cameras 400, 410, a respective video camera being associated with each data processing device and each being configured to capture images of the non-obscured illuminated portion;
in which each data processing device is configured to detect a location of the control apparatus from images captured by the respective camera of the non-obscured illuminated portion and from inertial detection results received from the respective control device.
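Each data processing device might, for example, emit a timestamped track of detected locations, so that detections made by the two devices can be placed into correspondence. A sketch follows, in which the (timestamp, position) record layout and the matching tolerance are assumptions:

```python
# A minimal sketch of pairing timestamped location samples from two data
# processing devices, so that their detections of the single visible
# illuminated portion can be compared. The (t_seconds, xyz) record layout
# and the max_dt tolerance are assumptions made for this sketch.
import numpy as np

def pair_by_timestamp(track_a, track_b, max_dt: float = 0.02):
    """track_*: lists of (t_seconds, np.ndarray of shape (3,)).
    Returns (position_a, position_b) pairs matched by nearest timestamp."""
    if not track_a or not track_b:
        return []
    times_b = np.array([t for t, _ in track_b])
    pairs = []
    for t_a, p_a in track_a:
        i = int(np.argmin(np.abs(times_b - t_a)))
        if abs(times_b[i] - t_a) <= max_dt:
            pairs.append((p_a, track_b[i][1]))
    return pairs
```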
By comparing the location data at particular physical positions of the control apparatus, the system of
Referring to
To address this,
Note that although different apparatuses have been provided in the examples given above for the data processing apparatus 1020 and the earlier control apparatuses, these functions could be shared by a common apparatus executing the various different tasks, for example.
moving (at a step 1100) a control apparatus according to claim 1 to a plurality of locations within view of two or more video cameras;
deriving (at a step 1110) two or more sets of location data by capturing images of the control apparatus using the two or more video cameras, and in respect of each video camera, detecting inertial detection results received from a respective one of the control devices; and
detecting (at a step 1120) a mapping between respective frames of reference of the two or more video cameras by correlating the two or more sets of location data.
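One way in which such a mapping might be detected, treating each set of location data as a cloud of corresponding 3D points, is a least-squares rigid alignment such as the Kabsch method; this particular choice is an assumption made for the sketch below, the method above requiring only that the two sets of location data be correlated.

```python
# A minimal sketch of estimating the rigid transform (rotation r, translation
# t) that maps locations measured in one camera's frame of reference onto the
# corresponding locations in the other camera's frame, via the Kabsch method.
# Kabsch is an illustrative choice, not mandated by the description above.
import numpy as np

def estimate_mapping(points_a: np.ndarray, points_b: np.ndarray):
    """points_a, points_b: (N, 3) arrays of corresponding locations.
    Returns (r, t) such that points_a is approximately points_b @ r.T + t."""
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    h = (points_b - cb).T @ (points_a - ca)      # 3x3 cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = ca - r @ cb
    return r, t
```

In the arrangement described, points_a and points_b would be the paired location samples accumulated while the control apparatus is moved through the plurality of locations at the step 1100.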
It will be appreciated that in example embodiments the techniques discussed above, including the method just described, may be implemented by computer software which, when executed by a computer, causes the computer to carry out one or more of these techniques. Such computer software, and a non-transitory machine-readable storage medium by which such software is provided, are considered to represent embodiments of the present disclosure.
It will also be apparent that numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practised otherwise than as specifically described herein.