The present invention relates to an apparatus, system, and method for data management, and a recording medium.
When images are captured by using the same image capturing device while the focal length (angle of view) is changed, an image of a wide image capture range and an image of a narrow image capture range are obtained. In a case where the two images have the same number of pixels, the image of the narrow image capture range has a higher resolution than the image of the wide image capture range. In some cases, a high-resolution planar image captured separately from a wide-angle planar image is superimposed on (embedded in) a partial area of the wide-angle planar image to provide a clear image of the partial area together with the overall image.
In such a technique, in order to retain, as metadata, etc., location information of the partial image in the overall image, it is desirable that the overall image, the partial image, and the metadata be associated with each other (see, for example, PTL 1). PTL 1 discloses an image data recording method in which a file storing only metadata and a file in which the metadata is registered for image data are prepared and, in a case where the file storing only the metadata is updated, it is determined, in accordance with data updated in the file, whether the file in which the metadata is registered for the image data is to be updated.
PTL 1: JP-2016-96487-A
However, an existing system has a drawback in that, to use a plurality of pieces of image data in superimposed display, metadata needs to be referenced without exception. Specifically, a user may want to select a plurality of pieces of data (images, moving images, sounds, etc.) that are in their possession or are stored on a server as desired and to use the selected pieces of data in superimposed display. However, since information about the plurality of pieces of data (images, moving images, sounds, etc.) that are used in superimposed display is included in the metadata, it is not possible to identify the plurality of pieces of data (images, moving images, sounds, etc.) that are used in superimposed display without reference to the metadata.
That is, in order to superimpose and display the plurality of images, it is also necessary to reference the metadata even to know, for example, the scene in which the pieces of image data were obtained.
Further, it is not possible to reference, from data (an image, a moving image, a sound, etc.) to be used in superimposed display, the metadata and the other data. Accordingly, it is not possible to identify, from any data (an image, a moving image, a sound, etc.) used in superimposed display, the other data (images, moving images, sounds, etc.). For example, when an image of a wide image capture range is present, it is not easy for a user to identify an image of a narrow image capture range that is superimposed on the image of a wide image capture range.
Example embodiments of the present invention include an information processing system including one or more processors to: obtain a plurality of pieces of data; generate metadata used to combine first data of the plurality of pieces of data, with second data being one or more of the plurality of pieces of data other than the first data; assign a common identifier to the first data, the second data, and the metadata; and store, in a memory, the first data, the second data, and the metadata in association with the common identifier.
Example embodiments of the present invention include a data management system including the information processing system, and a terminal device. The terminal device includes a processor to control a display to display the plurality of pieces of data obtained at the information processing system, and accept selection of the first data and the second data to be combined from among the plurality of pieces of data; and a communication device to transmit information on the first data and the second data that are selected to the information processing system through a network.
Example embodiments of the present invention include a data management method, and a recording medium storing a program code for causing a computer system to perform the data management method.
According to at least one embodiment of the present invention, a plurality of pieces of data to be combined for display can be managed.
The accompanying drawings are intended to depict example embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Hereinafter, as example modes of the present invention, a data management system and a data management method performed by the data management system will be described with reference to the drawings while using example embodiments.
First, superimposed display metadata is described with reference to
The equirectangular projection image information and the planar image information each include an image identifier. When the superimposed display metadata is provided, an equirectangular projection image and a planar image that are superimposed and displayed are identified. The superimposed display information includes a location parameter that indicates the position of the planar image in the equirectangular projection image.
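By way of a non-limiting illustration only, the superimposed display metadata described above may be sketched as follows. All field names, identifiers, and coordinate values here are hypothetical assumptions introduced for explanation, not an actual on-disk format.

```python
# Hypothetical sketch of superimposed display metadata; every field name
# and value is an illustrative assumption, not a claimed format.
superimposed_display_metadata = {
    "equirectangular_projection_image_information": {
        "image_identifier": "equirect_0001.jpg",
    },
    "planar_image_information": {
        "image_identifier": "planar_0001.jpg",
    },
    "superimposed_display_information": {
        # Location parameter: position of the planar image within the
        # equirectangular projection image (illustrative normalized values).
        "location_parameter": {"x": 0.42, "y": 0.18, "width": 0.10, "height": 0.07},
    },
}

def identify_images(metadata):
    """Return the identifiers of the two images that are superimposed and displayed."""
    equirect = metadata["equirectangular_projection_image_information"]["image_identifier"]
    planar = metadata["planar_image_information"]["image_identifier"]
    return equirect, planar
```

Given such metadata, the pair of images to be superimposed and displayed can be identified without examining the image files themselves.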
However, the pieces of superimposed display use data do not include information about superimposed display, and therefore, it is not possible to reference the superimposed display metadata from any piece of superimposed display use data, as illustrated in
In the present embodiments, a management table is generated, and pieces of superimposed display use data are managed in the management table by using a management ID. In the present embodiments, the table is used; however, the table need not be used in the management.
As described above, in a data management system 100 according to the present embodiments illustrated in
A plurality of pieces of data that are combined and reproduced are pieces of data having a relation in which one of the pieces of data (first data) and the other piece of data (second data) are reproduced together or one of the pieces of data is reproduced and the other piece of data is subsequently reproduced. For example, in a case where a plurality of pieces of image data are present, these pieces of image data include a piece of data of an image of a wide image capture range and a piece of data of an image of a narrow image capture range to be superimposed on the image of a wide image capture range, the pieces of data being combined and reproduced. Further, the types of images may include a moving image and a still image. The images may be distinguished from each other on the basis of, for example, a difference in the resolution or a difference in the image capture date and time. A plurality of pieces of data include a piece of data of a moving image of a wide image capture range and a piece of data of a still image of a narrow image capture range to be superimposed on the moving image, the pieces of data being combined and reproduced, or include a piece of data of a still image of a wide image capture range and a piece of data of a moving image of a narrow image capture range to be superimposed on the still image, the pieces of data being combined and reproduced.
Metadata is data that includes location information about an image that is superimposed on another wide-angle image. In the present embodiments, metadata is described as superimposed display metadata.
A common identifier is an identifier for associating a plurality of pieces of image data with each other. In the present embodiments, a common identifier is described as a management ID.
The superimposed display management system 50 is implemented by at least one information processing apparatus and thus may also be referred to as an information processing system. The superimposed display management system 50 assigns a management ID to superimposed display use data. Specifically, in this embodiment, the superimposed display management system 50 includes a database server (hereinafter referred to as a DB server 40) that stores data used in processing and an application server (hereinafter referred to as an AP server 30) that performs processing by using the data retained by the DB server 40. The DB server 40 and the AP server 30 can be implemented as one integrated server; however, when the DB server 40 and the AP server 30 are separated from each other, the processing speed can be increased by distributed processing, the backup frequency of a database 44 (see
The superimposed display management system 50 may be applicable to cloud computing. Cloud computing is a usage model in which resources on a network are used without awareness of specific hardware resources.
The client terminal 10 is an information processing apparatus or a terminal device operated by a user and is a client that requests processing from the superimposed display management system 50 to use a service of the superimposed display management system 50. In
The client terminal 10 includes a communication device for making a connection to the network N, an input device for selecting an item in a service, inputting text information, etc., and a display for displaying predetermined screens, images, and moving images provided via the network N, and has at least a function of accepting a user instruction via the input device and transmitting the user instruction to the superimposed display management system 50, as described below.
The communication device 103 is, for example, a network interface card (NIC) and makes a connection to the network N to control communication with the superimposed display management system 50. The AP server 30 further includes a bus line, and the constituent elements including the CPU 101, the memory 102, the storage device 104, and the communication device 103 are electrically connected to each other via the bus line.
In a first example embodiment, the data management system 100 is described in which superimposed display metadata is generated for superimposed display use data that has already been obtained and in which a management ID is assigned.
The AP server 30 includes a metadata generating unit 31, a Web server unit 32, a first communication unit 33, a download data generating unit 34, a management ID generating unit 35, and a DB interfacing unit 36. These units included in the AP server 30 are functions or means that are implemented by one or more of the constituent elements illustrated in
The first communication unit 33 transmits/receives various types of data to/from the client terminal 10. In the first example embodiment, the first communication unit 33 receives superimposed display use data and information about a specified file from the client terminal 10 and transmits screen information about various Web pages that are displayed by the client terminal 10.
The Web server unit 32 has a function of a Web server that returns a Hypertext Transfer Protocol (HTTP) response including screen information in response to an HTTP request and a function of a Web application. The screen information is described in HyperText Markup Language (HTML), Cascading Style Sheets (CSS), JavaScript (registered trademark), etc. and is provided from the Web server unit 32 as a Web page. The Web application is software that runs by cooperation of a program in a script language (for example, JavaScript (registered trademark)) running on browser software and a program on the Web server, and is used on the browser software, or is the mechanism of such software.
The metadata generating unit 31 generates superimposed display metadata illustrated in
The management ID generating unit 35 chooses and assigns a common (same) management ID to pieces of superimposed display use data that are used in superimposed display and superimposed display metadata generated on the basis of these pieces of superimposed display use data.
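As a non-limiting sketch of the operation of the management ID generating unit 35, the following assumes that the management table is held as a simple list of rows and that the management ID is a randomly generated identifier; the function name and file names are hypothetical.

```python
import uuid

def assign_management_id(use_data_file_names, metadata_file_name):
    """Assign one common (same) management ID to the pieces of superimposed
    display use data and to the superimposed display metadata generated from
    them, returning rows of a hypothetical management table (sketch)."""
    management_id = uuid.uuid4().hex  # one identifier shared by the whole group
    return [
        {"management_id": management_id, "file_name": name}
        for name in [*use_data_file_names, metadata_file_name]
    ]

rows = assign_management_id(["wide.jpg", "narrow.jpg"], "meta.json")
# Every row of the group carries the same management ID.
assert len({row["management_id"] for row in rows}) == 1
```

Because the metadata row shares the identifier of the use-data rows, any one file of the group suffices to locate all the others.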
The DB interfacing unit 36 controls communication with the DB server 40 to interface with the DB server 40. For example, the DB interfacing unit 36 calls an authenticating unit 42 of the DB server 40 at the time of user authentication. The DB interfacing unit 36 transmits superimposed display use data to the DB server 40 at the time of uploading the superimposed display use data, and obtains superimposed display use data specified by a user and superimposed display metadata from the DB server 40.
The download data generating unit 34 generates download data that includes at least a management ID. The download data generating unit 34 may generate download data that includes superimposed display use data and superimposed display metadata in addition to the management ID. For the download data, patterns 1 to 5 are available as the data format, which will be described in detail below. With the download data, superimposed display use data and superimposed display metadata can be distributed, and any client terminal can perform superimposed display using superimposed display use data.
The DB server 40 includes a data management unit 41, the authenticating unit 42, and a second communication unit 43. These units included in the DB server 40 are functions or means that are implemented by one or more of the constituent elements illustrated in
The data management unit 41 registers superimposed display use data and superimposed display metadata in the database 44 and obtains superimposed display use data and superimposed display metadata from the database 44.
The authenticating unit 42 performs processing for user authentication and returns, to the AP server 30, the result indicating whether the authentication succeeds or fails. The authentication method may be a method using a combination of a user ID and a password, an integrated circuit (IC) card, or biometric authentication information.
The second communication unit 43 communicates with the AP server 30. That is, the second communication unit 43 transmits and receives superimposed display use data and superimposed display metadata. The second communication unit 43 may communicate with the client terminal 10.
The client terminal 10 includes a third communication unit 11, an operation accepting unit 12, and a display control unit 13. These units included in the client terminal 10 are functions or means that are implemented by one or more of the constituent elements illustrated in
The third communication unit 11 transmits/receives various types of data to/from the AP server 30. In the first example embodiment, the third communication unit 11 transmits superimposed display use data and information about a specified file to the AP server 30 and receives screen information about a Web page, etc. from the AP server 30.
The operation accepting unit 12 accepts various user operations performed at the client terminal 10. For example, the operation accepting unit 12 accepts selection of superimposed display use data, etc. The display control unit 13 analyzes screen information received by the third communication unit 11 and displays Web pages (various screens) on the display 107 (display device). The client terminal 10 may run a dedicated application instead of browser software to communicate with the superimposed display management system 50.
A method for registering a management ID associated with generation of superimposed display metadata is described with reference to
First, a user operates the client terminal 10 to log in to the superimposed display management system 50. The superimposed display management system 50 performs user authentication (step S401).
When the user authentication is successfully completed (the user successfully logs in to the superimposed display management system 50), the user performs at the client terminal 10 an operation of uploading superimposed display use data to the superimposed display management system 50. Accordingly, the client terminal 10 uploads the superimposed display use data to the superimposed display management system 50 (step S402).
The superimposed display management system 50 generates superimposed display metadata (step S403). After the superimposed display metadata has been generated, a management ID can be generated. The management ID may be generated before the superimposed display metadata is generated.
In a case where superimposed display use data has already been uploaded to the superimposed display management system 50, the data upload step (step S402) can be skipped. The sequence in each step is described in detail below.
The superimposed display management system 50 and the client terminal 10 are connected to the network, and a user can access the superimposed display management system 50 via a Web browser by inputting the uniform resource locator (URL) of the superimposed display management system 50 to the Web browser.
S501: When the client terminal 10 accesses the superimposed display management system 50, the client terminal 10 transmits a login screen request to the AP server 30.
S502: The first communication unit 33 of the AP server 30 receives the login screen request, and the Web server unit 32 transmits screen information about a login screen to the client terminal 10.
S503: The third communication unit 11 of the client terminal 10 (Web browser) receives the screen information about the login screen, and the display control unit 13 displays the login screen on the basis of the received screen information.
S504: The user inputs their user ID and login password on the login screen. The operation accepting unit 12 accepts the input user ID and login password.
S505: When the user presses a “login” button provided on the login screen, the operation accepting unit 12 accepts the operation, and the third communication unit 11 transmits a login authentication request to the AP server 30.
S506: The first communication unit 33 of the AP server 30 receives the login authentication request. The login authentication request includes information about the user ID and login password. When the login authentication request is transmitted, the DB interfacing unit 36 of the AP server 30 requests the DB server 40 to authenticate the login including the user ID and login password.
S507: The second communication unit 43 of the DB server 40 receives the request for login authentication, and the authenticating unit 42 performs login authentication by checking the user ID and login password included in the login authentication request against each record of a user table in the database 44 and transmits the result of login authentication via the second communication unit 43.
S508: The DB interfacing unit 36 of the AP server 30 receives the result of login authentication, and the Web server unit 32 generates and transmits via the first communication unit 33 a post-login-authentication screen. The Web server unit 32 transmits the post-login-authentication screen if the authentication is successful, or transmits a login error screen if the authentication fails.
S601: When the user operates the Web browser installed in the client terminal 10, the third communication unit 11 transmits a menu selection screen request to the AP server 30.
S602: The first communication unit 33 of the AP server 30 receives the menu selection screen request, and the Web server unit 32 transmits screen information about a menu selection screen 301 illustrated in
S603: The third communication unit 11 of the client terminal 10 receives the screen information about the menu selection screen 301, and the display control unit 13 displays the menu selection screen 301.
S604: The first communication unit 33 of the AP server 30 causes the DB interfacing unit 36 to obtain data on the basis of the menu selection information. The DB interfacing unit 36 obtains superimposed display use data from the DB server 40 as needed in accordance with the menu selection information, and the Web server unit 32 generates screen information corresponding to the menu selected by the user. Therefore, the specific screen varies. Display examples of a screen displayed for each menu in screen generation (step S604) will be described below.
S605: The first communication unit 33 of the AP server 30 transmits the generated screen information to the client terminal 10.
S606: The third communication unit 11 of the client terminal 10 receives the screen information, and the display control unit 13 analyzes the screen information and displays a screen.
The user can make requests for various services to the AP server 30 via the client terminal 10 by using the menu selection screen 301. Information about the menu selection screen 301 is predetermined and is updated less frequently, and therefore, the screen information may be retained in advance in the AP server 30.
S801: In the screen generation/display step, a predetermined screen is generated in accordance with the sequence diagram of screen generation/display described with reference to
S802: The user selects on the data upload screen 311 superimposed display use data that the user wants to upload to the superimposed display management system 50. The operation accepting unit 12 accepts selection of the superimposed display use data.
S803: After the user has registered the superimposed display use data to be uploaded in a file list field, the user presses an upload button 315. The operation accepting unit 12 accepts the operation, and the third communication unit 11 transmits a data upload request to the AP server 30.
S804: The first communication unit 33 of the AP server 30 receives the data upload request and transmits a data upload permission notification to the client terminal 10. In a case where data upload is not permitted, the first communication unit 33 transmits a data upload prohibition notification to the client terminal 10. The first communication unit 33 may transmit to the client terminal 10 a screen with which the user can recognize that data upload is prohibited. The case where data upload is prohibited is, for example, a case where the format of the superimposed display use data is not appropriate, a case where the file size is excessively large, or a case where the DB server 40 is in an abnormal state.
S805: When the third communication unit 11 of the client terminal 10 receives the data upload permission notification, the third communication unit 11 transmits the superimposed display use data registered in a file list field 312 to the AP server 30.
S806: When the first communication unit 33 of the AP server 30 receives the superimposed display use data, the DB interfacing unit 36 transmits the superimposed display use data to the DB server 40. Accordingly, the data management unit 41 saves the superimposed display use data in the database 44. At this point in time, a management ID is not assigned.
S807: After data saving is completed, the data management unit 41 transmits a data saving completion notification to the AP server 30 via the second communication unit 43.
S808: The DB interfacing unit 36 of the AP server 30 receives the data saving completion notification, and the first communication unit 33 transmits a data upload completion notification to the client terminal 10. Consequently, the data upload processing is completed.
When the user presses the data selection button 313 on the data upload screen 311, the operation accepting unit 12 accepts the operation, and the display control unit 13 displays a list of pieces of superimposed display use data stored on the client terminal 10 currently in use.
When the user uses the input device 108 of the client terminal 10 to select, from the data list, pieces of data that the user wants to upload, the data names (file names) are registered in the file list field 312, as illustrated in
The upload button 315 is a button for uploading the pieces of superimposed display use data in the file list field 312 to the superimposed display management system 50, and the exit button 316 is a button for closing the data upload screen 311.
S1001: The client terminal 10 and the AP server 30 generate and display a predetermined screen in accordance with the screen generation/display sequence described with reference to
S1002: The user uses the input device 108 of the client terminal 10 to check a checkbox 324 for a piece of superimposed display use data for which superimposed display metadata is to be generated. The operation accepting unit 12 accepts the selection, and the display control unit 13 displays a check mark in the checkbox 324.
S1003: When data selection is completed, the user presses a metadata generation button 325 on the superimposed display metadata generation screen 321. The operation accepting unit 12 accepts the operation, and the third communication unit 11 transmits a superimposed display metadata generation request to the AP server 30.
S1004: The first communication unit 33 of the AP server 30 receives the superimposed display metadata generation request. The superimposed display metadata generation request includes information for identifying files, such as the file names of the selected pieces of superimposed display use data. The DB interfacing unit 36 of the AP server 30 requests the selected pieces of superimposed display use data from the DB server 40 in accordance with the superimposed display metadata generation request.
S1005: The second communication unit 43 of the DB server 40 receives the request for the pieces of superimposed display use data, and the data management unit 41 obtains the requested pieces of superimposed display use data from the database 44. The second communication unit 43 transmits the pieces of superimposed display use data to the AP server 30.
S1006: When the DB interfacing unit 36 of the AP server 30 receives the pieces of superimposed display use data, the metadata generating unit 31 of the AP server 30 uses the pieces of superimposed display use data to generate superimposed display metadata. The generation of superimposed display metadata will be described in detail below.
S1007: When the superimposed display metadata is generated, the DB interfacing unit 36 of the AP server 30 transmits the superimposed display metadata to the DB server 40. The second communication unit 43 of the DB server 40 receives the superimposed display metadata, and the data management unit 41 saves the superimposed display metadata in the database 44.
S1008: When data saving is completed, the second communication unit 43 of the DB server 40 transmits a data saving completion notification to the AP server 30.
S1009: The DB interfacing unit 36 of the AP server 30 receives the data saving completion notification, and the first communication unit 33 transmits a superimposed display metadata generation completion notification to the client terminal 10. Consequently, the superimposed display metadata generation processing is completed.
The metadata generation button 325 on the superimposed display metadata generation screen 321 is a button for requesting the AP server 30 to generate metadata of pieces of superimposed display use data for each of which the checkbox 324 is checked, and an exit button 326 is a button for closing the superimposed display metadata generation screen 321.
Now, the method for registering the management ID for the superimposed display metadata and the pieces of superimposed display use data is described.
In the management ID registration step (step S1207), the management ID generating unit 35 registers the superimposed display metadata generated in the superimposed display metadata generation step in the “file name” column of the management table and registers a common management ID for pieces of superimposed display use data used in generation of the superimposed display metadata and for the generated superimposed display metadata.
For example, in
In
In the first example embodiment, the example case is described where the same management ID is registered for the superimposed display metadata and the pieces of superimposed display use data at a time. However, the present embodiments are not limited to this case, and a case where part of the management ID is the same and a case where completely different management IDs are used but are determined to belong to the same group by referencing another table or by using a specific algorithm are within the scope of the present invention.
For example, a part of the management IDs may be made the same. Assuming that one of the management IDs (a first management ID) is “A”, the management ID related to the first management ID (a second management ID) is assigned “A-0”. Since the second management ID includes a part of the first management ID, it can be determined that these management IDs belong to the same group. In another example, even when the management IDs are completely different, as long as they are associated via the same identifier, it can be determined that these management IDs belong to the same group. For example, assuming that the first management ID “A” and the second management ID “0AB%08” are associated via the same identifier “0001”, it is determined that these management IDs belong to the same group. Specifically, these different identifiers may be managed using a management table, which stores information indicating the association of these identifiers. Further, as long as superimposed display metadata and pieces of superimposed display use data used in generation of the superimposed display metadata can be referenced with the management ID in the management table, the form of the management ID is not limited. Further, a case where information is retained for the same purpose without using the names “management table” and “management ID” is also within the scope of the present invention.
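The three grouping schemes described above (identical IDs, IDs that share a part, and completely different IDs associated via a common identifier) may be illustrated by the following sketch; the function name, the "-" separator convention, and the association table are hypothetical assumptions, not a prescribed implementation.

```python
def same_group(id_a, id_b, association_table=None):
    """Determine whether two management IDs belong to the same group under
    the three illustrative schemes described above (hypothetical rules)."""
    if id_a == id_b:
        return True  # the same management ID is registered for both
    # A part of the management ID is the same, e.g. "A" and "A-0".
    if id_a.startswith(id_b + "-") or id_b.startswith(id_a + "-"):
        return True
    # Completely different IDs associated via the same identifier.
    if association_table is not None:
        group_a = association_table.get(id_a)
        group_b = association_table.get(id_b)
        return group_a is not None and group_a == group_b
    return False

# Management IDs "A" and "0AB%08" are linked via the identifier "0001".
association = {"A": "0001", "0AB%08": "0001"}
assert same_group("A", "A-0")                  # shared part of the ID
assert same_group("A", "0AB%08", association)  # linked via "0001"
assert not same_group("A", "B")
```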
In the superimposed display metadata generation step, a case where metadata generation fails may occur. The case includes a case where pieces of data of different scenes are selected in data selection and the superimposition positions are unable to be determined and a case where pieces of data of the same scene are selected but the pieces of data are difficult to superimpose. In such cases, a management ID is not registered for the individual pieces of data for which metadata generation fails, so that inappropriate superimposed display use data can be excluded.
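The handling of a metadata generation failure described above may be sketched as follows, under the assumption that a failed generation is signaled by a missing metadata file; all names here are illustrative.

```python
def register_group(management_table, use_data_file_names, metadata_file_name, new_id):
    """Register a common management ID only when superimposed display
    metadata was generated successfully; a failed generation (signaled
    here by None) leaves the group unregistered (illustrative sketch)."""
    if metadata_file_name is None:
        # e.g. pieces of data of different scenes were selected and the
        # superimposition positions could not be determined: register no
        # management ID, so the inappropriate use data is excluded.
        return management_table
    for name in [*use_data_file_names, metadata_file_name]:
        management_table.append({"management_id": new_id, "file_name": name})
    return management_table

table = []
register_group(table, ["wide.jpg", "narrow.jpg"], None, "M001")        # failure
register_group(table, ["wide.jpg", "narrow.jpg"], "meta.json", "M002") # success
assert len(table) == 3  # only the successful group is registered
```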
Now, a procedure for displaying pieces of superimposed display use data having the same management ID by using the management ID in the data management system 100 is described.
S1401: In the screen generation/display step, a predetermined screen is generated in accordance with the screen generation/display sequence described with reference to
In order to describe the index display screen 331, a management table is illustrated in
In
In
S1402: Referring back to
S1403: When the piece of data is selected, the third communication unit 11 of the client terminal 10 transmits a selected-data display request to the AP server 30. The selected-data display request includes a file identifier, which is, for example, the file name of the selected piece of superimposed display use data. The description of the first example embodiment hereinafter assumes that the file name is transmitted.
S1404: When the first communication unit 33 of the AP server 30 receives the selected-data display request, the DB interfacing unit 36 of the AP server 30 transmits the file name to the DB server 40 to request pieces of superimposed display use data having a management ID the same as that of the file selected by the user. The data management unit 41 of the DB server 40 obtains from the database 44 the piece of superimposed display use data having the file name selected by the user. Further, the data management unit 41 obtains from the database 44 all pieces of superimposed display use data associated with a management ID the same as that of the piece of superimposed display use data. The data management unit 41 transmits the two or more pieces of superimposed display use data to the AP server 30 via the second communication unit 43.
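The lookup performed by the data management unit 41 in step S1404 can be sketched as below. The in-memory list and field names are illustrative assumptions standing in for the database 44; the point is that the selected file name resolves to a management ID, which then yields every associated piece of superimposed display use data.

```python
# Illustrative stand-in for database 44: each record associates a file
# of superimposed display use data with its management ID.
DATABASE = [
    {"file": "aaa001.jpg", "management_id": "A"},
    {"file": "aaa002.jpg", "management_id": "A"},
    {"file": "bbb003.jpg", "management_id": "B"},
]

def pieces_for_selected_file(file_name: str) -> list:
    """Resolve the selected file to its management ID, then return all
    pieces of superimposed display use data sharing that ID."""
    selected = next(r for r in DATABASE if r["file"] == file_name)
    mid = selected["management_id"]
    return [r for r in DATABASE if r["management_id"] == mid]
```

Selecting "aaa001.jpg" would return both records with management ID "A", matching the behavior described for step S1404.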
The Web server unit 32 of the AP server 30 generates a detail display screen 341 using the pieces of superimposed display use data. An example of the detail display screen 341 is illustrated in
S1405: When generation of the detail display screen 341 is completed, the first communication unit 33 of the AP server 30 transmits the detail display screen 341 to the client terminal 10.
S1406: The third communication unit 11 of the client terminal 10 receives the detail display screen 341, and the display control unit 13 displays the detail display screen 341.
As described above, when the user selects only one piece of superimposed display use data, the user can view all pieces of superimposed display use data associated with the same management ID as that of the selected piece. The user can also select a plurality of pieces of superimposed display use data at a time.
In order to clearly distinguish the piece of superimposed display use data selected by the user from the other pieces of superimposed display use data, the display control unit 13 may increase the size of the thumbnail image of the selected piece of superimposed display use data and make the outline of its thumbnail image thicker as illustrated in
As described above, when the user selects only one piece of superimposed display use data, the client terminal 10 can display the pieces of superimposed display use data having the same management ID, and therefore, the user can easily view the plurality of pieces of superimposed display use data used in generation of the superimposed display metadata.
The example where pieces of superimposed display use data having the same management ID are displayed as thumbnail images has been described with reference to
First, the user logs in to the superimposed display management system 50 (step S1801). The login process is the same as that described with reference to
After user authentication is successfully completed, the client terminal 10 downloads superimposed display use data (step S1802). The sequence in the download step S1802 is described in detail below.
S1901: In a screen generation/display step in step S1901, a predetermined screen is generated in accordance with the screen generation/display sequence described with reference to
S1902: Referring back to
S1903: When the piece of superimposed display use data is selected, the third communication unit 11 of the client terminal 10 transmits a selected-data display request to the AP server 30. The selected-data display request includes the file identifier, which is, for example, the file name of the selected piece of superimposed display use data.
S1904: The first communication unit 33 of the AP server 30 receives the selected-data display request, and the DB interfacing unit 36 of the AP server 30 obtains from the DB server 40 pieces of superimposed display use data identified by using the file identifier, which is, for example, the file name. The method for obtainment is the same as that used in step S1404 in
S1905: The first communication unit 33 of the AP server 30 transmits the download setting screen 361 to the client terminal 10.
S1906: The third communication unit 11 of the client terminal 10 receives the download setting screen 361, and the display control unit 13 displays the download setting screen 361.
The download setting field 365 is a field used to set a data format that is used in downloading the management ID and the pieces of superimposed display use data.
The download setting field 365 is described with reference to
Now, the patterns of the data format of download data are described. The data format of pattern 1 is a pattern in which the selected data, each piece of superimposed display use data, and the superimposed display metadata are stored in respective files, and four files are provided in
The data format of pattern 2 is a pattern in which the selected data, each piece of superimposed display use data, and the superimposed display metadata are combined into one file. The management ID is added to the one file obtained as a result of combining. The file in pattern 2 stores data in which the plurality of pieces of superimposed display use data are combined, and is in a format with which only the selected data can be viewed with general data browser software. Regarding the pieces of superimposed display use data other than the selected data in the combined data, for example, only the client terminal 10 that logs in to the superimposed display management system 50 is allowed to view them, or dedicated data browser software is distributed to users to allow viewing.
The data format of pattern 3 is a pattern in which only the selected data is stored in a file. To the selected data, the management ID is added. The file in pattern 3 is in a format with which viewing with general data browser software is allowed.
The data format of pattern 4 is a pattern in which only the superimposed display metadata is involved. The management ID is added to the superimposed display metadata. The data in the data format of pattern 4 can be viewed with general data browser software.
The data format of pattern 5 is a pattern in which only the management ID is involved. Only the management ID is involved, and therefore, the pattern need not be in a file format and may include address information about the superimposed display management system 50. For example, a use method in which information in pattern 5 can be shared via an email or a social networking service (SNS) may be used.
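The contents of the five download patterns described above can be summarized in a small table. This is a sketch under assumed names (`PATTERN_CONTENTS`, the item labels); it shows only which items the data management unit 41 must obtain for each pattern, as listed in the per-pattern descriptions.

```python
# Items bundled per download pattern (patterns 1-5 as described above).
# Item labels are illustrative assumptions, not identifiers from the system.
PATTERN_CONTENTS = {
    1: {"selected", "use_data", "metadata"},  # separate files, compressed together
    2: {"selected", "use_data", "metadata"},  # combined into one file
    3: {"selected"},                          # only the selected data
    4: {"metadata"},                          # only the superimposed display metadata
    5: set(),                                 # only the management ID, no data files
}

def items_to_obtain(pattern: int) -> set:
    """Return the items to be fetched from the database for a pattern."""
    return PATTERN_CONTENTS[pattern]
```

For pattern 5 the set is empty, which corresponds to the later statement that the AP server 30 need not access the DB server 40 in that case.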
S1907: Referring back to
S1908: The third communication unit 11 of the client terminal 10 transmits a download request to the AP server 30. The download request includes, for example, the pattern number of the data format that is set by the user in download setting and the file name of the selected piece of superimposed display use data.
S1909: The first communication unit 33 of the AP server 30 receives the download request, and the DB interfacing unit 36 requests the selected data from the DB server 40. The second communication unit 43 of the DB server 40 receives the request for the selected data including the file name of the selected piece of superimposed display use data, and the data management unit 41 obtains superimposed display use data from the database 44.
In a case of pattern 1, the data management unit 41 obtains the selected data, one or more pieces of superimposed display use data, and the superimposed display metadata.
In a case of pattern 2, the data management unit 41 obtains the selected data, one or more pieces of superimposed display use data, and the superimposed display metadata.
In a case of pattern 3, the data management unit 41 obtains the selected data.
In a case of pattern 4, the data management unit 41 obtains the superimposed display metadata.
In a case of pattern 5, the data management unit 41 does not obtain the pieces of data described above, and therefore, the AP server 30 need not access the DB server 40.
S1910: The second communication unit 43 of the DB server 40 transmits the obtained data to the AP server 30.
S1911: The DB interfacing unit 36 of the AP server 30 receives the data in the data format of one of patterns 1 to 5, and the download data generating unit 34 generates download data. The data format of the download data is in the pattern set in the download setting step S1907.
In the case of pattern 1, the download data generating unit 34 generates a file for each of the selected data, the one or more pieces of superimposed display use data, and the superimposed display metadata, compresses these files, and generates one compressed file.
In the case of pattern 2, the download data generating unit 34 converts the selected data, the one or more pieces of superimposed display use data, and the superimposed display metadata to one file in a format with which only the selected data can be viewed with general data browser software.
In the case of pattern 3, the download data generating unit 34 converts the selected data to a file in a format with which the selected data can be viewed with general data browser software.
In the case of pattern 4, the download data generating unit 34 converts the superimposed display metadata to a file in a format with which the superimposed display metadata can be viewed with general data browser software.
In the case of pattern 5, the download data generating unit 34 need not generate a data file but may generate a file for the management ID.
S1912: After generating the download data, the download data generating unit 34 adds the management ID to the download data in the data format of the pattern selected by the user from among patterns 1 to 5 illustrated in
In the cases of patterns 1 to 4, the management ID is added to the download data generated in step S1911. In the case of pattern 5, a file that stores the management ID as data may be generated, or the management ID may be converted so as to have a data format with which the management ID can be transmitted via an email or an SNS. The file for the management ID may be generated in step S1911.
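Step S1912 can be sketched as follows. The function name and dictionary structure are illustrative assumptions; the sketch only reflects the rule just stated: for patterns 1 to 4 the management ID is attached to the generated download data, while for pattern 5 the management ID itself is the payload.

```python
def add_management_id(pattern: int, download_data, management_id: str) -> dict:
    """Attach the management ID to download data per the selected pattern."""
    if pattern == 5:
        # No data file is generated: the management ID (or address
        # information) is the download payload itself.
        return {"management_id": management_id}
    # Patterns 1-4: the management ID is added to the generated file data.
    return {"management_id": management_id, "data": download_data}
```

A caller would pass the compressed file (pattern 1), combined file (pattern 2), or single file (patterns 3 and 4) as `download_data`.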
S1913: The DB interfacing unit 36 of the AP server 30 saves on the DB server 40 the download data to which the management ID is added.
S1914: When data saving is completed, the second communication unit 43 of the DB server 40 transmits a data saving completion notification to the AP server 30.
S1915: When the DB interfacing unit 36 of the AP server 30 receives the data saving completion notification, the first communication unit 33 transmits the download data to the client terminal 10.
The client terminal 10 uses the download data to superimpose the planar images on the equirectangular projection image to display the images.
When the download data to which the management ID is added is obtained as described above, the download data can be distributed to the other users (other persons).
Now, distribution of download data is described. There may be a case where the user distributes download data downloaded from the superimposed display management system 50 to the other persons. The data format of the download data downloaded from the superimposed display management system 50 has the plurality of patterns. For example, in the cases of patterns 1 and 2 described with reference to
The cases of patterns 3 to 5 are advantageous in that the data amount is smaller than that in the cases of patterns 1 and 2; however, not all pieces of data that are used in superimposed display are distributed, and therefore, the other persons need to obtain the omitted data from the superimposed display management system 50. In the case of pattern 1, the plurality of pieces of superimposed display use data are stored in respective files and downloaded, and therefore, in a case where some of the pieces of data are missing for some reason, the other persons need to obtain the missing data from the superimposed display management system 50.
In a case where the other persons attempt to obtain the omitted or missing data from the superimposed display management system 50, if the other persons are not allowed to obtain the data without user registration in the superimposed display management system 50, the other persons to whom the data has been distributed are unable to obtain the omitted or missing data from the superimposed display management system 50. Such inconvenience can be avoided by using the management ID.
S2201: The other person inputs the URL of the superimposed display management system 50 to the client terminal 10 to cause the client terminal 10 to access the superimposed display management system 50. Even in a state where the client terminal 10 accesses the superimposed display management system 50 but the other person does not log in to the superimposed display management system 50, when the other person selects a data selection screen request that is displayed, the third communication unit 11 of the client terminal 10 transmits the data selection screen request to the AP server 30.
S2202: The first communication unit 33 of the AP server 30 receives the data selection screen request, and the Web server unit 32 transmits a data selection screen to the client terminal 10 via the first communication unit 33.
S2203: The third communication unit 11 of the client terminal 10 receives the data selection screen, and the display control unit 13 displays the data selection screen.
S2204: The other person selects, on the data selection screen, download data stored on the client terminal 10. Data to be selected is download data downloaded to the client terminal 10 from the superimposed display management system 50 as described with reference to
S2205: The third communication unit 11 of the client terminal 10 transmits a management ID reference request including the management ID to the AP server 30. The management ID reference request includes the management ID information obtained from the download data.
S2206: The first communication unit 33 of the AP server 30 receives the management ID reference request, and the DB interfacing unit 36 transmits the management ID to the DB server 40 to request superimposed display use data and superimposed display metadata corresponding to the management ID. The authenticating unit 42 searches the database 44 for the transmitted management ID. In a case where the management ID is included in the database 44, the user authentication is regarded as successful.
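The check performed by the authenticating unit 42 in step S2206 amounts to a membership test: authentication succeeds when the transmitted management ID exists in the database. A minimal sketch, with an assumed in-memory stand-in for database 44:

```python
# Illustrative stand-in for the management IDs registered in database 44.
REGISTERED_IDS = {"A", "C", "D"}

def authenticate_by_management_id(management_id: str) -> bool:
    """Authentication succeeds if the transmitted ID is registered."""
    return management_id in REGISTERED_IDS
```

An altered or fabricated management ID would fail this lookup, which is what later allows the system to refuse service for tampered download data.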
S2207: In a case where the user authentication is successful, the second communication unit 43 of the DB server 40 sends the received management ID to the data management unit 41, and the data management unit 41 obtains superimposed display use data and superimposed display metadata associated with the management ID. The second communication unit 43 transmits the superimposed display use data and the superimposed display metadata to the AP server 30. The DB interfacing unit 36 of the AP server 30 receives the superimposed display use data and the superimposed display metadata, and the Web server unit 32 uses the obtained data to generate the download setting screen 361.
S2208: The first communication unit 33 of the AP server 30 transmits the download setting screen 361 to the client terminal 10.
S2209: The third communication unit 11 of the client terminal 10 receives the download setting screen 361, and the display control unit 13 displays the download setting screen 361.
In the download setting field 365, only pattern 1 and pattern 2 of the data format, which are patterns with which a complete set of pieces of data that are used in superimposed display is obtained, are displayed so as to be selectable. Accordingly, the other person can select a pattern with which a complete set of pieces of data that are used in superimposed display is obtained.
S2210: Referring back to
S2211: The third communication unit 11 of the client terminal 10 transmits a download request to the AP server 30. The sequence in step S2212 and the subsequent steps is the same as the sequence in step S1909 and the subsequent steps in
As described above, when the client terminal 10 accesses the superimposed display management system 50 while referencing the management ID in the download data, even a user who is not registered in the superimposed display management system 50 can use the superimposed display management system 50 to download the data to be used for superimposed display.
Now, a case where a plurality of management IDs are registered for one piece of superimposed display use data is described. When the AP server 30 registers a management ID in the management ID registration step (step S1207) in the superimposed display metadata generation sequence illustrated in
A management ID is not registered for two pieces of superimposed display use data (sss004.jpg and xxx007.jpg) and a management ID (C) has already been registered for one piece of superimposed display use data (xxx005.jpg). Even in such a case where a management ID has already been registered, the new management ID (D) is additionally registered as illustrated in
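The additional registration described above can be sketched as follows, using the file names from the example (sss004.jpg, xxx005.jpg, xxx007.jpg). The dictionary structure and function name are illustrative assumptions; the point is that a new management ID (D) is appended without removing an already registered ID (C).

```python
# Illustrative registration state: xxx005.jpg already has management ID "C";
# the other two pieces have no management ID yet.
registrations = {
    "sss004.jpg": [],
    "xxx005.jpg": ["C"],
    "xxx007.jpg": [],
}

def register_management_id(file_name: str, new_id: str) -> None:
    """Additionally register a management ID, keeping any existing IDs."""
    ids = registrations.setdefault(file_name, [])
    if new_id not in ids:
        ids.append(new_id)
```

After registering the new ID "D" for all three files, xxx005.jpg carries both "C" and "D", so it can be retrieved via either group.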
In the data download sequence for download from the superimposed display management system 50 illustrated in
As described above, in the data management system 100 according to the first example embodiment, a management ID is registered, so that a piece of superimposed display use data can be used to reference the superimposed display metadata and the other pieces of superimposed display use data without reference to the superimposed display metadata. In an existing system, superimposed display use data needs to be retrieved using superimposed display metadata.
Data downloaded to the client terminal 10 from the superimposed display management system 50 has a management ID added thereto. Therefore, the management ID added to the download data can be used to access the superimposed display management system 50 and obtain superimposed display use data and superimposed display metadata to be used. Further, even another person who is not a registered user of the superimposed display management system 50 can obtain data to be used from the superimposed display management system 50 without a login as long as the other person obtains the download data having the management ID added thereto. In a case where the download data is altered and the management ID information is unable to be correctly obtained, the superimposed display management system 50 fails to correctly reference the management ID. Unauthorized use can thereby be prevented, and superimposed display need not be guaranteed to work for such data, which enables a stable operation of the superimposed display system.
In a second example embodiment, the data management system 100 in which a management ID can be registered immediately after superimposed display use data is obtained is described.
It is assumed that the superimposed display management system 50 and the client terminal 10 respectively have configurations similar to those in the first example embodiment. However, as described below, in the second example embodiment, the controller 60 generates a management ID.
The image capturing system 2 includes a general image capturing device such as a digital camera (hereinafter referred to as a general image capturing device 3) and a special image capturing device 1 that can capture spherical images, and the controller 60 controls image capturing using the plurality of image capturing devices. The controller 60 controls image capturing by the plurality of image capturing devices in order to superimpose a planar image captured by the general image capturing device 3 in the image capturing system 2 on a spherical image captured by the special image capturing device 1. The general image capturing device 3 and the special image capturing device 1 capture images substantially simultaneously or without a time difference, while communicating with each other or in accordance with control signals from the controller 60 (hereinafter referred to as consecutive image capturing).
The controller 60 is, for example, an information processing apparatus or an information terminal, such as a smartphone or a PC. A dedicated application runs on the controller 60 to communicate with the image capturing system 2 and receive superimposed display use data.
S1: The controller 60 controls the image capturing devices to perform consecutive image capturing in response to a consecutive-image-capture request accepted from the user. The details will be described with reference to
S2: After image capturing, the controller 60 generates a management ID for a plurality of images (superimposed display use data) captured in the consecutive image capturing. The controller 60 may generate superimposed display metadata before generating the management ID.
S3: The controller 60 transmits the superimposed display use data, superimposed display metadata, and management ID to the superimposed display management system 50. This transmission is performed by using the data upload procedure in the first example embodiment.
S4: The superimposed display management system 50 receives the superimposed display use data, superimposed display metadata, and management ID, and the AP server 30 registers the superimposed display use data, superimposed display metadata, and management ID in the DB server 40.
S5: After registration is completed, the user operates the client terminal 10 to specify the management ID, so that the user can obtain from the superimposed display management system 50 and view the superimposed display use data or can download the superimposed display use data from the superimposed display management system 50.
As illustrated in
The third communication unit 11, the operation accepting unit 12, and the display control unit 13 are similar to those of the client terminal 10. The third communication unit 11 transmits/receives various types of data to/from the AP server 30. In the second example embodiment, the third communication unit 11 transmits superimposed display use data and information about a specified file to the AP server 30 and receives screen information about a Web page, etc. from the AP server 30.
The operation accepting unit 12 accepts various user operations performed at the controller 60. For example, the operation accepting unit 12 accepts selection of superimposed display use data, etc. The display control unit 13 analyzes screen information received by the third communication unit 11 and displays Web pages (various screens) on the display 107 (display device). The controller 60 may run a dedicated application instead of browser software to communicate with the superimposed display management system 50.
The management ID determination unit 14 generates a unique management ID from the date and time, location information, etc. The short-distance communication unit 15 communicates with the special image capturing device 1 and the general image capturing device 3 using short-distance wireless communication technology, such as Wi-Fi or Bluetooth (registered trademark). The image capturing unit 16 controls image capturing by the special image capturing device 1 and the general image capturing device 3.
The metadata generating unit 17 superimposes a planar image on an equirectangular projection image, the images being captured by the special image capturing device 1 and the general image capturing device 3, and generates superimposed display metadata. The functions of the metadata generating unit 17 are similar to those of the metadata generating unit 31 of the AP server 30.
The position detecting unit 18 communicates with, for example, a global positioning system (GPS) satellite to detect the current location information.
The controller 60 further includes a storage unit 19 configured as the memory 102 illustrated in
The controller 60 accepts an instruction for starting linked image capturing from the user (step S3011). In this case, the controller 60 displays a linked image capturing device setting screen illustrated in
The controller 60 polls the general image capturing device 3 with image capture start checking information to ask whether image capturing is started (step S3012). The general image capturing device 3 receives the image capture start checking information.
Next, the general image capturing device 3 determines whether an image capture start operation is performed by determining whether the general image capturing device 3 accepts a user operation of pressing the shutter button (step S3013).
Next, the general image capturing device 3 transmits, to the controller 60, response information indicating the details of response based on the result of determination in step S3013 (step S3014). In a case where it is determined in step S3013 that image capturing is started, the response information includes image capture start information indicating the start of image capturing. In this case, the response information also includes an image identifier from the general image capturing device 3. On the other hand, in a case where it is not determined in step S3013 that image capturing is started, the response information includes image capture wait information indicating waiting for image capturing. The controller 60 receives the response information.
Now, the case where it is determined in step S3013 that image capturing is started and where the response information received in step S3014 includes the image capture start information is described.
First, the general image capturing device 3 starts image capturing (step S3015). This process for image capturing is a process that starts with pressing of the shutter button and that includes capturing of an image of an object, a scene, etc., thereby obtaining captured image data (here, planar image data), and storing of the captured image data.
The controller 60 transmits image capture start request information for requesting the start of image capturing to the special image capturing device 1 (step S3016). The special image capturing device 1 receives the image capture start request information.
The special image capturing device 1 starts image capturing (step S3017). Accordingly, an equirectangular projection image is generated.
Next, the controller 60 transmits to the general image capturing device 3 captured image request information for requesting the captured image (step S3018). The captured image request information includes the image identifier received in step S3014. The general image capturing device 3 receives the captured image request information.
Next, the general image capturing device 3 transmits the planar image data obtained in step S3015 to the controller 60 (step S3019). At this time, the image identifier for identifying the planar image data that is transmitted and attribute data are also transmitted. The controller 60 receives the planar image data, the image identifier, and the attribute data.
The special image capturing device 1 transmits to the controller 60 the equirectangular projection image data obtained in step S3017 (step S3020). At this time, an image identifier for identifying the equirectangular projection image data that is transmitted and attribute data are also transmitted. The controller 60 receives the equirectangular projection image data, the image identifier, and the attribute data.
Next, the controller 60 stores the electronic file of the planar image data received in step S3019 and the electronic file of the equirectangular projection image data received in step S3020 in the same electronic folder for storage (step S3021).
Next, the controller 60 generates superimposed display metadata that is used when the planar image, which is a high-definition image, is superimposed and displayed on a partial area of the equirectangular projection image, which is a low-definition image (step S3022).
The management ID determination unit 14 generates a management ID using the current date and time, location information, etc. (step S3022-2). The management ID determination unit 14 stores the management ID, the planar image, the equirectangular projection image, and the superimposed display metadata in the storage unit 19.
The controller 60 performs a process for superimposed display (step S3023). The controller 60 transmits to the superimposed display management system 50 the management ID, the planar image, the equirectangular projection image, and the superimposed display metadata.
As described above, in the second example embodiment, the controller 60 can generate superimposed display metadata, and therefore, the management ID can be registered immediately after image capturing.
The management ID may be generated in response to a user operation and need not be generated subsequent to generation of the superimposed display metadata. The superimposed display metadata need not be generated by the controller 60 and may be generated by the AP server 30.
The sequence in
The setting screen 381 includes an information field 382, a location information obtainment button 383, a location information correction button 384, a management ID generation button 385, and a transmit button 386.
When the user presses the location information obtainment button 383, the position detecting unit 18 detects location information, and the display control unit 13 displays the latitude and the longitude in the information field 382 as illustrated in
The user can press the location information correction button 384 to set not only the location information but also an identifier indicating the place, namely, the name or address indicating, for example, a specific floor of a building, in the information field 382. In
The management ID generation button 385 is a button for the management ID determination unit 14 to generate a management ID. As illustrated in
As an example of the location information, the latitude and the longitude are used in the description. Other examples of information that can be detected with GPS include the altitude, the accuracy of the location information, etc., and these pieces of information may be displayed or used to generate a management ID. In addition to GPS, the location information can be obtained by positioning using Wi-Fi, on the basis of the distance to the mobile phone base station, by determination of signal intensity using Bluetooth Low Energy (registered trademark), etc. The display control unit 13 may display location information obtained by using any of these methods in the information field 382. The management ID determination unit 14 may determine a management ID on the basis of location information, etc. obtained by using any of these methods.
The order in which the pieces of information used in generation of the management ID are concatenated is not limited to the example illustrated in
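The management ID determination unit 14 generates a unique ID by concatenating the date and time, location information, and optionally a place identifier. The sketch below is an assumption about one possible format; the separator, field order, and precision are illustrative, and, as stated above, the concatenation order is not limited to any particular example.

```python
from datetime import datetime

def generate_management_id(now: datetime, latitude: float,
                           longitude: float, place: str = "") -> str:
    """Concatenate date/time and location into a management ID.

    The "_"-separated layout is an illustrative assumption only.
    """
    parts = [now.strftime("%Y%m%d%H%M%S"),
             f"{latitude:.6f}", f"{longitude:.6f}"]
    if place:  # e.g. a floor name set via the location information correction
        parts.append(place)
    return "_".join(parts)
```

For example, an image captured at latitude 35.123456 and longitude 139.654321 yields an ID embedding both the capture time and the location, which is what later allows the date, time, and location to be recovered from the management ID alone.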
In step S3 in
As described with reference to step S4 in
Accordingly, from an image having no location information among the pieces of superimposed display use data already registered in the superimposed display management system 50, the date and time when the specific image was captured and location information about the image can be obtained by using the management ID.
The controller 60 can not only transmit pieces of superimposed display use data immediately after image capturing (in the case of consecutive image capturing) but also transmit a piece of superimposed display use data saved in the storage unit 19 to the superimposed display management system 50.
Only the piece of superimposed display use data selected by the user is transmitted to the superimposed display management system 50, and therefore, a situation where an image captured by the user by mistake is transmitted can be prevented.
As described above, according to the second example embodiment, in addition to the effect attained by the first example embodiment in which superimposed display use data and superimposed display metadata assigned a common management ID can be obtained, an effect is obtained in which a management ID associated with superimposed display use data having no location information can be used to obtain the location information.
Further, the controller 60 automatically transmits pieces of superimposed display use data that are assigned a common management ID, so that the user need not make a selection, resulting in increased convenience. The user can select a piece of superimposed display use data to be transmitted, so that an incorrect piece of superimposed display use data is not transmitted.
The functions of the metadata generating units 17 and 31 are described in detail below.
The metadata generating units 17 and 31 each include a spherical image generator 550, an extractor 551, a corresponding area calculator 552, a point-of-gaze determiner 553, a projection converter 554, an area divider 555, a projection reverse converter 556, a shape converter 558, a correction parameter generator 559, and a superimposed display metadata generator 560. The shape converter 558 and the correction parameter generator 559 need not be included in a case where brightness or color need not be corrected. The reference numerals of images and areas described below are found in
The spherical image generator 550 uses Open Graphics Library for Embedded Systems (OpenGL ES) to place an equirectangular projection image EC so as to cover a spherical surface, thereby generating a spherical image CE.
The extractor 551 extracts a plurality of feature points in the equirectangular projection image EC that is a rectangular image obtained by using an equirectangular projection method and a plurality of feature points in a planar image P that is a rectangular image obtained by using a perspective projection method. Each feature point is represented by a pixel on a boundary at which the luminance value changes by a predetermined value or more. Further, the extractor 551 extracts a plurality of feature points in a peripheral area image PI obtained as a result of conversion by the projection converter 554.
The corresponding area calculator 552 performs first homography transformation on the basis of the similarities between the plurality of feature points in the equirectangular projection image EC and the plurality of feature points in the planar image P to calculate a first corresponding area CA1, which is a rectangular area corresponding to the planar image P, in the equirectangular projection image EC. Here, a central point CP1 of a rectangle defined by the four vertices of the planar image P is converted to a point of gaze GP1 in the equirectangular projection image EC by the first homography transformation. Similarly, the corresponding area calculator 552 performs second homography transformation on the basis of the similarities between the plurality of feature points in the planar image P and the plurality of feature points in the peripheral area image PI to calculate a second corresponding area CA2, which is a rectangular area corresponding to the planar image P, in the peripheral area image PI.
At least one of the planar image P and the equirectangular projection image EC may be resized before the first homography transformation to reduce the time taken to calculate the first homography. For example, in a case where the planar image P has 40 million pixels and the equirectangular projection image EC has 30 million pixels, for example, the planar image P may be resized so as to have 30 million pixels, or both the planar image P and the equirectangular projection image EC may be resized so as to have 10 million pixels. Similarly, at least one of the planar image P and the peripheral area image PI may be resized before the second homography calculation.
The homography in the present embodiments is a transformation matrix that represents a projection relation between the equirectangular projection image EC and the planar image P. When the coordinates of a point on the planar image P are multiplied by the homography transformation matrix calculated in the homography calculation process, the coordinates of a corresponding point on the equirectangular projection image EC (spherical image CE) can be calculated.
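As a minimal sketch, applying a homography transformation matrix to a point can be expressed as follows in homogeneous coordinates (using NumPy; the matrix values themselves would come from the homography calculation process, and the translation matrix below is only an illustrative example):

```python
import numpy as np

def apply_homography(H, x, y):
    """Map a point (x, y) on the planar image P to the corresponding
    point on the equirectangular projection image EC, using a 3x3
    homography matrix H in homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return float(p[0] / p[2]), float(p[1] / p[2])

# A pure translation as a trivial homography: shift by (+5, -3).
H = np.array([[1.0, 0.0, 5.0],
              [0.0, 1.0, -3.0],
              [0.0, 0.0, 1.0]])
```

With this example matrix, `apply_homography(H, 10.0, 20.0)` yields (15.0, 17.0); the identity matrix leaves a point unchanged.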
The point-of-gaze determiner 553 determines a point (referred to as “point of gaze” in the present embodiments), on the equirectangular projection image EC, at which the central point CP1 of the planar image P is located after the first homography transformation.
The coordinates of the point of gaze GP1 are the coordinates of a point on the equirectangular projection image EC, and therefore, it is desirable that the coordinates of the point of gaze GP1 be converted so as to be expressed by the latitude and longitude and standardized. Specifically, the vertical direction of the equirectangular projection image EC is expressed by the latitude coordinate extending from −90° (−0.5π) to +90° (+0.5π), and the horizontal direction thereof is expressed by the longitude coordinate extending from −180° (−π) to +180° (+π). Accordingly, the coordinates of a pixel position that correspond to the image size of the equirectangular projection image EC can be calculated from the latitude and longitude coordinates.
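For illustration, the standardized latitude and longitude can be mapped to a pixel position as in the following sketch; the top-left origin and axis orientation are assumptions for this example, not details taken from the present embodiments:

```python
import math

def latlon_to_pixel(lat, lon, width, height):
    """Convert standardized (latitude, longitude) in radians to a pixel
    position on an equirectangular projection image EC of the given size.
    Longitude -pi..+pi maps to x = 0..width; latitude +pi/2..-pi/2 maps
    to y = 0..height (top row of the image = +90 degrees)."""
    x = (lon + math.pi) / (2.0 * math.pi) * width
    y = (math.pi / 2.0 - lat) / math.pi * height
    return x, y
```

For a 3600 × 1800 pixel image, latitude 0 and longitude 0 map to the image center (1800, 900).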
The projection converter 554 converts a peripheral area PA centered on the point of gaze GP1 in the equirectangular projection image EC to an image in perspective projection, which is the projection method for the planar image P, to generate the peripheral area image PI. Here, the peripheral area PA for which projection conversion is performed is determined so that the peripheral area image PI having a square shape can be eventually generated, the peripheral area image PI being defined by a central point CP2, which is a point obtained as a result of conversion of the point of gaze GP1, and the vertical angle of view (or the horizontal angle of view), which is equal to the diagonal angle of view α of the planar image P. This process is further described in detail below.
First, conversion of the projection method is described. The equirectangular projection image EC is placed so as to cover a sphere CS, thereby generating the spherical image CE. Therefore, data of each pixel of the equirectangular projection image EC can be associated with data of a corresponding pixel of the 3D spherical image CE on the surface of the sphere CS. Accordingly, when the coordinates of a point on the equirectangular projection image EC are expressed by (latitude, longitude) =(e, a) and the coordinates of a point on the 3D sphere CS are expressed by rectangular coordinates (x, y, z), conversion performed by the projection converter 554 is expressed by equation 1 below.
(x, y, z)=(cos(e)×cos(a), cos(e)×sin(a), sin(e)) equation 1
Here, the radius of the sphere CS is assumed to be equal to 1.
Meanwhile, the planar image P, which is a perspective projection image, is a 2D image. When a point on the planar image P is expressed by 2D polar coordinates (radius vector, argument)=(r, a), the radius vector r corresponds to the diagonal angle of view α and can have a value within the range 0≤r≤tan(diagonal angle of view/2). When a point on the planar image P is expressed by 2D rectangular coordinates (u, v), the conversion relation with the polar coordinates (radius vector, argument)=(r, a) is expressed by equations 2 below.
u=r×cos(a), v=r×sin(a) equations 2
Then, equations 2 are applied to 3D coordinates (radius vector, polar angle, azimuth). Here, only the surface of the sphere CS is taken into consideration, and therefore, the radius vector in the 3D polar coordinates is equal to 1. When the above-described 2D polar coordinates (radius vector, argument)=(r, a) are used, projection in which the equirectangular projection image EC that is placed on the surface of the sphere CS is converted to a perspective projection image is expressed by equation 3 and equation 4 below under the assumption that a virtual camera IC is present at the center of the sphere CS.
r=tan(polar angle) equation 3
a=azimuth equation 4
Here, when the polar angle is represented by t, t is expressed by t=arctan(r).
Therefore, the 3D polar coordinates (radius vector, polar angle, azimuth) are expressed by (radius vector, polar angle, azimuth)=(1, arctan(r), a).
Further, conversion from the 3D polar coordinates to the rectangular coordinates (x, y, z) is expressed by equation 5 below.
(x, y, z)=(sin(t)×cos(a), sin(t)×sin(a), cos(t)) equation 5
The above equation 5 is used to enable conversion between the equirectangular projection image EC in equirectangular projection and the planar image P in perspective projection. That is, the radius vector r, which corresponds to the diagonal angle of view α of the planar image P to be generated, can be used to calculate transformation map coordinates indicating each pixel of the planar image P and the coordinates of a corresponding point on the equirectangular projection image EC. On the basis of the transformation map coordinates, the peripheral area image PI, which is a perspective projection image, can be generated from the equirectangular projection image EC.
In the projection conversion described above, the position, in the equirectangular projection image EC, expressed by (latitude, longitude)=(90°, 0°) is converted to the central point CP2 of the peripheral area image PI, which is a perspective projection image. Therefore, in a case of performing perspective projection conversion while assuming a certain point of the equirectangular projection image EC to be the point of gaze, the sphere CS on which the equirectangular projection image EC is placed is rotated to perform coordinate rotation so that the point of gaze expressed by the coordinates (latitude, longitude) is located at the position (90°, 0°).
As the transformation formula for this rotation of the sphere CS, a general coordinate rotation formula can be used, and therefore, a description thereof will be omitted.
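Under the assumption that the point of gaze is fixed at (latitude, longitude) = (90°, 0°), that is, before the sphere rotation described above, equations 1 to 5 can be combined into the following sketch that maps a point on the perspective projection image back to latitude and longitude:

```python
import math

def perspective_to_latlon(u, v):
    """Map a point (u, v) on the perspective projection image, normalized
    so that the radius vector r = tan(angle from the optical axis), to
    (latitude, longitude) in radians on the equirectangular projection
    image EC, with the point of gaze fixed at (90 deg, 0 deg)."""
    r = math.hypot(u, v)           # radius vector (equations 2, inverted)
    a = math.atan2(v, u)           # argument
    t = math.atan(r)               # equation 3: polar angle t = arctan(r)
    # equation 5: 3D rectangular coordinates on the unit sphere CS
    x = math.sin(t) * math.cos(a)
    y = math.sin(t) * math.sin(a)
    z = math.cos(t)
    # equation 1, inverted: back to (latitude, longitude)
    return math.asin(z), math.atan2(y, x)
```

The central point (0, 0) maps to latitude 90° and longitude 0°, consistent with the behavior described above; rotating the sphere CS afterwards moves the point of gaze to an arbitrary position.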
Next, a method for determining the area of the peripheral area image PI is described with reference to
For the corresponding area calculator 552 to determine the similarities between the plurality of feature points in the planar image P and the plurality of feature points in the peripheral area image PI, the peripheral area image PI needs to be large enough to include the entire second corresponding area CA2. However, if the peripheral area image PI is set so as to have an excessively large area, the number of pixels for which the similarity is to be calculated increases accordingly, resulting in an increased processing time. Therefore, the peripheral area image PI is determined so as to include the second corresponding area CA2 while having as small an area as possible. Accordingly, in the present embodiments, the peripheral area image PI is determined with a method as described below.
In the present embodiments, the peripheral area image PI is determined by using the 35 mm equivalent focal length for the planar image P. The 35 mm equivalent focal length is obtained from Exif data recorded at the time of image capturing. The 35 mm equivalent focal length is a focal length based on the film size of 24 mm×36 mm, and therefore, the length of the diagonal of such a film and the focal length can be used to calculate the corresponding diagonal angle of view by using equation 6 and equation 7 below.
Diagonal of film=sqrt(24*24+36*36) equation 6
Angle of view of image to be combined/2=arctan((Diagonal of film/2)/35 mm equivalent focal length for image to be combined) equation 7
Here, an image that covers such an angle of view has a circular shape; however, the actual imaging element (film) has a rectangular shape. Therefore, the image captured by the imaging element is a rectangular image that is inscribed in the circle. In the present embodiments, the vertical angle of view α of the peripheral area image PI is set to a value equal to the diagonal angle of view α of the planar image P. Accordingly, the peripheral area image PI illustrated in
Diagonal of square=sqrt(Diagonal of film*Diagonal of film+Diagonal of film*Diagonal of film) equation 8
Vertical angle of view α/2=arctan((Diagonal of square/2)/35 mm equivalent focal length for planar image) equation 9
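Equations 6 to 9 can be sketched as follows; the 35 mm equivalent focal length would be read from the Exif data, and the 50 mm value in the usage note below is only an example:

```python
import math

def angles_of_view(focal_length_35mm):
    """Compute, from a 35 mm equivalent focal length, the diagonal angle
    of view of the planar image P (equations 6 and 7) and the vertical
    angle of view alpha of the square peripheral area image PI
    (equations 8 and 9). Both angles are returned in degrees."""
    diag_film = math.sqrt(24 * 24 + 36 * 36)                            # equation 6
    diag_angle = 2 * math.atan((diag_film / 2) / focal_length_35mm)     # equation 7
    diag_square = math.sqrt(2 * diag_film * diag_film)                  # equation 8
    vert_angle = 2 * math.atan((diag_square / 2) / focal_length_35mm)   # equation 9
    return math.degrees(diag_angle), math.degrees(vert_angle)
```

For example, a 50 mm equivalent focal length gives a diagonal angle of view of roughly 47°, while the vertical angle of view α of the peripheral area image PI computed from it is larger (roughly 63°), since the square's diagonal exceeds the film's diagonal.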
The vertical angle of view α thus calculated is used to perform projection conversion, thereby enabling generation of the peripheral area image PI (perspective projection image) that is large enough to cover the planar image P, which is centered on the point of gaze and has the diagonal angle of view α, while the vertical angle of view α is not excessively large.
Referring back to
The area divider 555 divides a rectangle illustrated in
Now, a specific method for division into the plurality of grid areas LA2 is described.
A calculation equation used to equally divide the second corresponding area CA2 is described. In a case of equally dividing a line segment connecting two points A(X1, Y1) and B(X2, Y2) into n segments, the coordinates of a point Pm, which is the m-th point from the point A, are calculated by using equation 10 below.
Pm=(X1+(X2−X1)×m/n, Y1+(Y2−Y1)×m/n) equation 10
With equation 10 above, the coordinates of each point obtained by equally dividing the line segment can be calculated. Therefore, the coordinates of each point obtained by dividing the upper side and the lower side of the rectangle can be obtained, and thereafter, each line segment indicated by corresponding coordinates obtained as a result of division is further divided. When the upper left, the upper right, the lower right, and the lower left vertices of the rectangle are respectively represented by TL, TR, BR, and BL, the coordinates of each point obtained by equally dividing each of the line segment TL-TR and the line segment BR-BL into 30 segments are calculated. Then, the 0-th to 30-th points indicated by the calculated coordinates are obtained as a result of division. Subsequently, each of the line segments defined by corresponding points at the same positions in the order is equally divided into 20 segments to obtain the coordinates of the resulting points. Accordingly, the coordinates based on which the rectangular area is divided into 30×20 small areas can be calculated.
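The division described above can be sketched as follows, assuming the four vertices are given as 2D coordinates (30 × 20 grid areas, hence 31 × 21 grid points):

```python
def divide_point(p1, p2, m, n):
    """Equation 10: the m-th of the points that equally divide the line
    segment from p1 to p2 into n segments."""
    (x1, y1), (x2, y2) = p1, p2
    return (x1 + (x2 - x1) * m / n, y1 + (y2 - y1) * m / n)

def divide_rectangle(tl, tr, br, bl, cols=30, rows=20):
    """Divide the quadrilateral TL-TR-BR-BL into cols x rows grid areas
    and return the (cols + 1) x (rows + 1) grid point coordinates.
    The bottom side is traversed from BL to BR so that its division
    points pair up with those on the top side TL-TR."""
    top = [divide_point(tl, tr, m, cols) for m in range(cols + 1)]
    bottom = [divide_point(bl, br, m, cols) for m in range(cols + 1)]
    return [[divide_point(t, b, m, rows) for m in range(rows + 1)]
            for t, b in zip(top, bottom)]
```

For an axis-aligned 30 × 20 rectangle with TL at the origin, the grid point at column 15, row 10 comes out at (15.0, 10.0), the center of the rectangle.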
The projection reverse converter 556 reversely converts the projection method for the second corresponding area CA2 to equirectangular projection, which is the projection method for the equirectangular projection image EC, to thereby calculate a third corresponding area CA3, in the equirectangular projection image EC, corresponding to the second corresponding area CA2. Specifically, the projection reverse converter 556 calculates the third corresponding area CA3, in the equirectangular projection image EC, constituted by grid areas LA3 corresponding to the plurality of grid areas LA2 in the second corresponding area CA2. The third corresponding area CA3 is illustrated in
The location parameter thus generated is used to enable calculation of a locational relation between the equirectangular projection image EC and the planar image P.
In a case where the location parameter is calculated and superimposed display is performed without performing any other processing, the resulting superimposed display may be unnatural if the equirectangular projection image EC and the planar image P are significantly different from each other in brightness or color tone. Therefore, the shape converter 558 and the correction parameter generator 559 described below provide a function of preventing unnatural superimposed display in the case where the brightness or color tone significantly differs.
Prior to a color correction described below, the shape converter 558 maps the four vertices of the second corresponding area CA2 to the four vertices of the planar image P to thereby convert the shape of the second corresponding area CA2 to a shape identical to the shape of the planar image P. Specifically, the shape converter 558 converts the shape of the second corresponding area CA2 to a shape identical to the shape of the planar image P so that the grid areas LA2 of the second corresponding area CA2 illustrated in
The correction parameter generator 559 generates, for the color of the grid areas LA2′ of the second corresponding area CA2′ obtained as a result of conversion to the identical shape, a correction parameter for adjusting the brightness and color of the grid areas LA0 of the planar image P, the grid areas LA0 having a shape identical to the shape of the grid areas LA2′. Specifically, the correction parameter generator 559 calculates the average a of the brightness and color values (R, G, B) of all pixels constituting four grid areas LA0 that share one common grid point and further calculates the average a′ of the brightness and color values (R′, G′, B′) of all pixels constituting four grid areas LA2′ that share one common grid point. In a case where the one grid point of the grid areas LA0 and the one grid point of the grid areas LA2′ correspond to one of the four corners of the second corresponding area CA2 and one of the four corners of the third corresponding area CA3, the correction parameter generator 559 calculates the average a of the brightness and color from the corresponding one grid area LA0 and the average a′ of the brightness and color from the corresponding one grid area LA2′. In a case where the one grid point of the grid areas LA0 and the one grid point of the grid areas LA2′ correspond to a point on the boundary of the second corresponding area CA2 and a point on the boundary of the third corresponding area CA3, the correction parameter generator 559 calculates the average a of the brightness and color from the two internal grid areas LA0 and the average a′ of the brightness and color from the two internal grid areas LA2′. In the present embodiments, the correction parameter is gain data for correcting the brightness and color of the planar image P, and therefore, the correction parameter, which is represented by Pa, is calculated by dividing the average a′ by the average a as expressed by equation 11 below.
Pa=a′/a equation 11
Accordingly, the gain value indicated by the correction parameter is used to perform multiplication for each grid area LA2′ in superimposed display described below, and the color tone and the luminance value of the planar image P become closer to those indicated by the pixel values of the equirectangular projection image EC (spherical image CE), thereby enabling superimposed display that feels natural. The correction parameter need not be calculated from the averages and may be calculated by using, for example, the medians and/or the modes instead of or in addition to the averages.
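Equation 11 and the use of the resulting gain can be sketched per color channel as follows (function and variable names are illustrative):

```python
def correction_parameter(avg_ec, avg_p):
    """Equation 11 per color channel: gain Pa = a' / a, where a' is the
    average (R', G', B') over grid areas LA2' (equirectangular side) and
    a is the average (R, G, B) over grid areas LA0 (planar image side)."""
    return tuple(c_ec / c_p for c_ec, c_p in zip(avg_ec, avg_p))

def apply_gain(pixel, gain):
    """Multiply a planar-image pixel by the gain so that its brightness
    and color tone approach those of the spherical image CE."""
    return tuple(c * g for c, g in zip(pixel, gain))
```

For instance, if the equirectangular side averages half the brightness of the planar side, the gain is 0.5 per channel and multiplying a planar-image pixel by it halves its values.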
The superimposed display metadata generator 560 uses the location parameter, the correction parameter, etc. to generate superimposed display metadata that indicates the position at which the planar image P is superimposed on the spherical image CE and the correction values for brightness and color.
Now, a data structure of the superimposed display metadata is described with reference to
As illustrated in
Among these pieces of information, the equirectangular projection image information is information transmitted from the special image capturing device 1 together with captured image data. The equirectangular projection image information includes an image identifier and attribute data. The image identifier included in the equirectangular projection image information is an image identifier for identifying an equirectangular projection image. In
The attribute data included in the equirectangular projection image information is related information added to the equirectangular projection image information. In
The planar image information is information transmitted from the general image capturing device 3 together with captured image data. The planar image information includes an image identifier and attribute data. The image identifier included in the planar image information is an image identifier for identifying the planar image P. In
The attribute data included in the planar image information is related information added to the planar image information. In
The superimposed display information is information generated by the controller 60 and includes area division number information, the coordinates of the grid points of each grid area (location parameter), and correction values for brightness and color (correction parameter). Among these pieces of information, the area division number information includes the number of divisions in the horizontal (longitude) direction and in the vertical (latitude) direction in a case of dividing the first corresponding area CA1 into a plurality of grid areas.
The location parameter is vertex mapping information that indicates a position, in the equirectangular projection image EC (spherical image CE), at which each grid point obtained by dividing the planar image P into a plurality of grid areas is located. The correction parameter is gain data for correcting the color of the planar image P in the present embodiments. The target to be corrected may be a monochrome image, and therefore, the correction parameter is a parameter for adjusting at least the brightness, among the brightness and color.
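The structure described above might be sketched as the following nested dictionary; all field names and file names here are illustrative assumptions, not the actual metadata file format:

```python
# A sketch of the superimposed display metadata as a nested dictionary.
# All field names and file names below are illustrative only.
superimposed_display_metadata = {
    "equirectangular_projection_image_information": {
        "image_identifier": "equirectangular.jpg",  # hypothetical file name
        "attribute_data": {},                       # e.g. Exif-derived data
    },
    "planar_image_information": {
        "image_identifier": "planar.jpg",           # hypothetical file name
        "attribute_data": {},
    },
    "superimposed_display_information": {
        "area_division_number_information": {"horizontal": 30, "vertical": 20},
        "location_parameter": [],    # 31 x 21 grid point positions on EC
        "correction_parameter": [],  # brightness/color gain per grid area
    },
}
```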
In a case where the spherical image CE is captured by using perspective projection, which is the projection method for the planar image P, a 360-degree omnidirectional image is not obtained. Therefore, an image of a wide angle of view, such as a spherical image, is often generated by using equirectangular projection, which is one of the existing projection methods. When equirectangular projection, such as the Mercator projection, is used for an image, a length in the horizontal direction increases as the distance from the standard parallel increases, resulting in an image significantly different from an image generated by using perspective projection, which is employed in general cameras. Accordingly, in a case where the planar image P captured separately from the spherical image CE is superimposed on a partial area of the spherical image CE, the equirectangular projection image EC (spherical image CE) and the planar image P do not match even if their scaling is changed, and the planar image P does not satisfactorily fit in the spherical image CE, because the two images are generated by using different projection methods. In the present embodiments, the location parameter is generated in the process illustrated in
Now, the location parameter and the correction parameter are described in detail with reference to
As illustrated in
As illustrated in
Referring back to
As described above, the location parameter indicates a locational correspondence between the planar image P and the equirectangular projection image EC (spherical image CE). If the location parameter is used to indicate the position of each pixel of the planar image P and the coordinates of a corresponding point on the equirectangular projection image EC (spherical image CE), the location parameter includes information for about 40 million pixels in a case where the general image capturing device 3 is a digital camera having a large number of pixels. Therefore, the amount of data of the location parameter increases, and the processing load due to, for example, data storage increases. In the present embodiments, the planar image P is divided into 600 (30×20) areas, and the location parameter includes data indicating only the coordinates of each grid point on the planar image P and a corresponding position on the equirectangular projection image EC (spherical image CE). In a case of superimposed display, the controller 60 interpolates an image in each area using the coordinates of the grid points to thereby implement superimposed display.
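The interpolation mentioned above can be sketched as bilinear interpolation within one grid area; bilinear interpolation is a common choice for this, though the present embodiments do not specify the exact interpolation method:

```python
def interpolate_in_grid_area(p00, p10, p01, p11, s, t):
    """Bilinearly interpolate the position on the equirectangular
    projection image EC of an interior point of one grid area from its
    four corner grid points, where (s, t) in [0, 1] x [0, 1] is the
    point's relative position within the area (p00 = top left,
    p10 = top right, p01 = bottom left, p11 = bottom right)."""
    x = (p00[0] * (1 - s) + p10[0] * s) * (1 - t) + (p01[0] * (1 - s) + p11[0] * s) * t
    y = (p00[1] * (1 - s) + p10[1] * s) * (1 - t) + (p01[1] * (1 - s) + p11[1] * s) * t
    return x, y
```

The point at the center of a grid area (s = t = 0.5) maps to the average of the four corner positions, so only the grid point coordinates need to be stored in the location parameter.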
Now, the processing or operation according to the second example embodiment is described with reference to
Even if the imaging element of the general image capturing device 3 and the imaging element of the special image capturing device 1 are the same, the definition, per unit area, of an image captured by the special image capturing device 1 is lower. This is because the imaging element of the special image capturing device 1 captures an equirectangular projection image that fully covers a 360-degree scene, from which the spherical image CE is generated.
Hereinafter, the process for generating the superimposed display metadata is described. The superimposed display metadata is used to superimpose the planar image P illustrated in
First, the extractor 551 extracts a plurality of feature points in the equirectangular projection image EC, which is a rectangular image obtained by using equirectangular projection, and a plurality of feature points in the planar image P, which is a rectangular image obtained by using perspective projection (step S101).
Subsequently, the corresponding area calculator 552 performs first homography transformation and calculates the first corresponding area CA1, which is a rectangular area corresponding to the planar image P, in the equirectangular projection image EC as illustrated in
Subsequently, the point-of-gaze determiner 553 determines a point (point of gaze GP1), in the equirectangular projection image EC, at which the central point CP1 of the planar image P is located after the first homography transformation (step S103).
Subsequently, the projection converter 554 converts the projection method for the peripheral area PA centered on the point of gaze GP1 on the equirectangular projection image EC to perspective projection, which is the projection method for the planar image P, to eventually generate the peripheral area image PI in which the vertical angle of view α of the peripheral area image PI is equal to the diagonal angle of view α of the planar image P, as illustrated in
Subsequently, the extractor 551 extracts a plurality of feature points in the peripheral area image PI obtained by the projection converter 554 (step S105).
Subsequently, the corresponding area calculator 552 performs second homography transformation and calculates the second corresponding area CA2, which is a rectangular area corresponding to the planar image P, in the peripheral area image PI on the basis of the similarities between the plurality of feature points in the planar image P and the plurality of feature points in the peripheral area image PI (step S106). The planar image P is a high-definition image having, for example, 40 million pixels, and therefore, is resized in advance to an appropriate size.
Subsequently, the area divider 555 divides the second corresponding area CA2 into the plurality of grid areas LA2, as illustrated in
Subsequently, the projection reverse converter 556 converts (reversely converts) the projection method for the second corresponding area CA2 to equirectangular projection, which is the projection method for the equirectangular projection image EC, as illustrated in
The process for generating the correction parameter is described with reference to
After the process in step S108, the shape converter 558 maps the four vertices of the second corresponding area CA2, as illustrated in
Subsequently, the area divider 555 divides the planar image P into the plurality of grid areas LA0 having a shape identical to the shape of the grid areas LA2′ in the second corresponding area CA2′ obtained as a result of conversion, as illustrated in
Subsequently, the correction parameter generator 559 generates, for the brightness and color of the grid areas LA2′ in the second corresponding area CA2′, a correction parameter for adjusting the brightness and color of the grid areas LA0 in the planar image P, the grid areas LA0 corresponding to the grid areas LA2′ (step S111).
Last, the superimposed display metadata generator 560 generates the superimposed display metadata on the basis of the equirectangular projection image information obtained from the special image capturing device 1, the planar image information obtained from the general image capturing device 3, the predetermined area division number information, the location parameter generated by the projection reverse converter 556, the correction parameter generated by the correction parameter generator 559, and the metadata generation information (step S112).
Now, the state of superimposed display is described in detail with reference to
As illustrated in
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
For example, in one or more of the present embodiments, image data is assumed to be superimposed display use data; however, audio data or moving image data may be embedded in an equirectangular projection image. For example, when a place in which audio data is embedded is clicked with a mouse, etc., the audio data is reproduced by the client terminal.
In any one of the present embodiments, the metadata is generated mainly for associating pieces of image data to be superimposed. As another example use of the metadata, the metadata can be used in, for example, display of a plurality of pieces of image data obtained at predetermined intervals in interval photography (time-lapse photography).
Further, in the above-described second example embodiment, since the metadata generating unit 17 and the management ID determination unit 14 are provided at the controller 60, the metadata generating unit 31 and/or the management ID generating unit 25 do not have to be provided at the AP server 30. In such a case, a part of the functions to be provided by the superimposed display management system 50 may be performed at the controller 60. Specifically, obtaining the first data and the second data, generating the metadata for combining the first data and the second data, and generating the management ID are performed at the controller 60.
Furthermore, in the above-described second example embodiment, the controller 60 may operate as the client terminal 10, which causes a display to display a screen for allowing a user to select the first data and the second data, for example. In such a case, a part of the functions to be provided by the client terminal 10 may be performed at the controller 60. Accordingly, the client terminal 10 does not have to be provided in the second example embodiment.
In one embodiment, the present invention resides in: an information processing apparatus (superimposed display management system 50) including: obtaining means (for example, second communication unit 43) for obtaining a plurality of pieces of data; generating means (for example, metadata generating unit 31) for generating metadata used to combine first data of the plurality of pieces of data with second data being one or more of the plurality of pieces of data other than the first data; identifier assigning means (for example, management ID generating unit 35) for assigning a common identifier to the first data, the second data, and the metadata; and data management means (data management unit 41) for storing, in a storage unit (for example, database 44), the first data, the second data, and the metadata in association with the common identifier.
In one embodiment, in the information processing apparatus, the data management means is configured to obtain, from the storage unit, any one of the first data, the second data, and the metadata, using the common identifier that is obtained. In one example, the data management means obtains, from the storage unit, the second data and the metadata associated with the common identifier that is associated with the first data. In another example, the data management means obtains, from the storage unit, the first data and the metadata associated with the common identifier that is associated with the second data.
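As a concrete illustration, the storage and retrieval of associated data described above can be sketched in Python. The class and field names below are illustrative assumptions, not part of the disclosed apparatus; the sketch merely shows the first data, the second data, and the metadata being stored and looked up under one common identifier:

```python
from dataclasses import dataclass

@dataclass
class Record:
    """Hypothetical record grouping the three associated items."""
    first_data: str
    second_data: list   # one or more pieces of second data
    metadata: dict

class DataStore:
    """Minimal stand-in for the storage unit (e.g. database 44)."""
    def __init__(self):
        self._records = {}  # common identifier -> Record

    def put(self, common_id, first_data, second_data, metadata):
        # Store all three items in association with the common identifier.
        self._records[common_id] = Record(first_data, second_data, metadata)

    def get(self, common_id):
        # Obtain the associated items using an obtained common identifier.
        return self._records.get(common_id)

    def find_by_first(self, first_data):
        # Obtain the second data and metadata via the identifier
        # that is associated with the given first data.
        for cid, rec in self._records.items():
            if rec.first_data == first_data:
                return cid, rec.second_data, rec.metadata
        return None

store = DataStore()
store.put("20190313_1030_tokyo", "sphere.jpg", ["plane.jpg"], {"x": 10, "y": 20})
cid, second, meta = store.find_by_first("sphere.jpg")
# cid is "20190313_1030_tokyo"; second is ["plane.jpg"]
```

The same dictionary lookup serves both retrieval directions (from first data to second data and metadata, or the reverse), which is the point of keying everything to a single shared identifier.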
In one embodiment, a data management system (100) includes the information processing apparatus (50) and a terminal device (10) connected through a network. The terminal device includes: display control means (display control unit 13) for displaying a plurality of pieces of data; accepting means (for example, operation accepting unit 12) for accepting selection of the first data and the second data to be combined; and first communication means (for example, third communication unit 11) for transmitting information on the first data and the second data accepted by the accepting means to the information processing apparatus. The information processing apparatus further includes second communication means (for example, first communication unit 33) for receiving the information on the first data and the second data.
In one embodiment, the plurality of pieces of data are pieces of image data stored in the storage unit (for example, database 44). The second communication means is configured to transmit, to the terminal device, file names and thumbnail images of the plurality of pieces of image data stored in the storage unit. At the terminal device, the display control means (for example, display control unit 13) displays, on a display, the file names and the thumbnail images of the pieces of image data. The accepting means is configured to accept selection of the first image data and the second image data from among the pieces of image data. The first communication means is configured to transmit information on the first image data and the second image data that are selected. The generating means is configured to generate the metadata using the information on the first image data and the second image data transmitted from the terminal device.
In one embodiment, in the data management system, one or more common identifiers are associated with one piece of second image data in the storage unit.
In one embodiment, the data management system further includes an information terminal connected to the network. The information terminal includes: generating means that obtains the first image data from a first image capturing device (for example, special image capturing device 1) that generates the first image data, obtains the second image data from a second image capturing device (for example, general image capturing device 3) that generates the second image data, and generates the metadata; and second identifier assigning means (for example, management ID determination unit 14) for assigning a common identifier to the first image data and the second image data that are obtained and to the metadata. The information terminal registers the first image data, the second image data, the metadata, and the common identifier in the information processing apparatus.
In one embodiment, in the data management system, the second identifier assigning means assigns a common identifier that includes the date and time when the image is captured and location information about the information terminal.
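The identifier described above can be sketched as a short function. The exact format (field order, separators, how the location is encoded) is an assumption for illustration; the text specifies only that the identifier includes a capture date/time and location information:

```python
from datetime import datetime

def make_common_id(captured_at: datetime, location: str) -> str:
    """Build an illustrative common identifier from a capture timestamp
    and the information terminal's location (format is hypothetical)."""
    return f"{captured_at:%Y%m%d%H%M%S}_{location}"

cid = make_common_id(datetime(2019, 3, 13, 10, 30, 0), "35.68N139.77E")
# cid == "20190313103000_35.68N139.77E"
```

Combining a timestamp with a location in this way makes collisions unlikely when a single terminal captures and registers images, which is presumably why those two fields are chosen.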
In one embodiment, in the data management system, the information processing apparatus further includes screen information generating means (for example, Web server unit 32) for generating screen data used to display the first and second image data together with common identifiers, each of the common identifiers being associated with corresponding first and second image data and metadata obtained by the data management means from the storage unit. The second communication means is configured to transmit the screen data to the terminal device, and the display control means of the terminal device is configured to display, on the basis of the screen data, each of the first and second image data together with a corresponding one of the common identifiers.
In one embodiment, in the data management system, the screen information generating means is configured to generate screen data used to display each piece of first or second image data together with the number of pieces of first and second image data associated with the common identifier that is associated with that piece of image data. The display control means is configured to display, on the basis of the screen data, each piece of the first and second image data together with that number.
In one embodiment, in the data management system, in a case where the accepting means accepts selection of the first image data or the second image data, the first communication means is configured to transmit, to the information processing apparatus, information about the selected first image data or the selected second image data. The data management means is configured to obtain, from the storage unit, all of the first and second image data having the common identifier that is associated with the selected first image data or the selected second image data. The screen information generating means is configured to generate screen data used to display file names and thumbnail images of the first and second image data obtained by the data management means. The second communication means is configured to transmit, to the terminal device, the screen data that includes the file names and the thumbnail images of the first and second image data obtained by the data management means. The display control means is configured to display, on the display, the file names and the thumbnail images of the first and second image data on the basis of the screen data.
In one embodiment, in the data management system, when a request for downloading first image data, second image data, and metadata is received from the terminal device, the screen information generating means is configured to generate screen data including a data format identifier for specifying a data format that is used when the first image data, the second image data, and the metadata are downloaded. The second communication means is configured to transmit the screen data to the terminal device. The display control means is configured to display the data format identifier together with file names and thumbnail images of the first image data and the second image data on the basis of the screen data. The accepting means is configured to accept selection of the data format identifier.
In one embodiment, the data format identifier is used to specify one of: a data format in which the first image data, the second image data, and the metadata are stored in separate files and the common identifier is added to each of the files; a data format in which the first image data, the second image data, and the metadata are stored in one file and the common identifier is added to the file; a data format in which the first image data or the second image data for which selection is accepted at the terminal device is stored in one file and the common identifier is added to the file; a data format in which the metadata associated with a common identifier the same as the common identifier of the first image data or the second image data for which selection is accepted at the terminal device is stored in one file and the common identifier is added to the file; and a data format in which the first image data, the second image data, and the metadata are not included but the common identifier is stored in a file.
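The five data formats enumerated above can be modeled as a small dispatch on a format identifier. The enum member names and the dictionary-based "file" representation below are illustrative assumptions; the sketch shows only how one identifier selects which of the stored items are packaged for download, with the common identifier attached in every case:

```python
from enum import Enum

class DataFormat(Enum):
    # Illustrative identifiers for the five formats described in the text.
    SEPARATE_FILES = 1       # first, second, and metadata in separate files
    SINGLE_FILE = 2          # all three stored in one file
    SELECTED_IMAGE_ONLY = 3  # only the image selected at the terminal
    METADATA_ONLY = 4        # only the metadata sharing the common identifier
    IDENTIFIER_ONLY = 5      # no data; only the common identifier in a file

def build_download(fmt, common_id, selected_image, other_image, metadata):
    """Sketch of download data generation keyed by the format identifier.
    Each dict stands in for one file, with the common identifier added."""
    if fmt is DataFormat.SEPARATE_FILES:
        return [{"id": common_id, "body": d}
                for d in (selected_image, other_image, metadata)]
    if fmt is DataFormat.SINGLE_FILE:
        return [{"id": common_id, "body": (selected_image, other_image, metadata)}]
    if fmt is DataFormat.SELECTED_IMAGE_ONLY:
        return [{"id": common_id, "body": selected_image}]
    if fmt is DataFormat.METADATA_ONLY:
        return [{"id": common_id, "body": metadata}]
    return [{"id": common_id}]  # IDENTIFIER_ONLY: identifier alone
```

Note that even the identifier-only format is useful: as described later, another terminal holding just the common identifier can present it to the information processing apparatus to request the associated data.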
In one embodiment, in the data management system, the information processing apparatus further includes download data generating means (for example, download data generating unit 34) for generating download data in a data format that is specified with the data format identifier transmitted from the terminal device. The download data includes at least one of the first image data, the second image data, and the metadata associated with a common identifier the same as the common identifier of the first image data or the second image data for which selection is accepted at the terminal device. The second communication means is configured to transmit the download data to the terminal device.
In one embodiment, in the data management system, in a case where another terminal device that retains the download data specifies the common identifier of the download data to make a request for the first image data or the second image data, the information processing apparatus is configured to perform user authentication on the basis of whether the common identifier is stored in the storage unit. In a case where the authentication is successful, the screen information generating means is configured to generate the screen data including the data format identifier, and the second communication means is configured to transmit the screen data to the other terminal device.
In one embodiment, in the data management system, the screen information generating means is configured to generate the screen data with which selection of the data format identifier for specifying only download data in a data format that includes the first image data, the second image data, and the metadata is allowed.
In this disclosure, a first image is an image on which a second image is superimposed, and a second image is an image to be superimposed on the first image. For example, the first image is an image covering an area larger than that of the second image. In another example, the second image is an image with image quality higher than that of the first image, for example, in terms of image resolution. For instance, the first image may be a low-definition image, and the second image may be a high-definition image. In another example, the first image and the second image are images expressed in different projections (projective spaces). Examples of the first image in a first projection include an equirectangular projection image, such as a spherical image. Examples of the second image in a second projection include a perspective projection image, such as a planar image. In this disclosure, the second image, such as the planar image captured with the general image capturing device, is treated as one example of the second image in the second projection (that is, in the second projective space). The first image, and the second image if desired, can be made up of multiple pieces of image data captured through different lenses, with different image sensors, or at different times. Further, in this disclosure, the spherical image does not have to be a full-view spherical image. For example, the spherical image may be a wide-angle image having an angle of view of about 180 to 360 degrees in the horizontal direction. As described below, it is desirable that the spherical image be image data having at least a part that is not entirely displayed in the predetermined area T. The predetermined area T is an area to be displayed to the user. In this disclosure, superimposing one image on another image is an example of combining one image with another image.
Other examples of combining images include, but are not limited to, placing one image on top of another image entirely or partly, laying one image over another image entirely or partly, mapping one image onto another image entirely or partly, pasting one image onto another image entirely or partly, and integrating one image with another image. That is, as long as the user can perceive a plurality of images (such as the spherical image and the planar image) displayed on a display as if they were one image, the processing to be performed on those images for display is not limited to the above-described examples.
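As a toy illustration of the simplest of these operations, pasting one image onto a partial area of another can be sketched with plain 2-D lists standing in for pixel grids. This is a deliberately minimal sketch, not the superimposed display processing of the embodiments, which operates on projected spherical and planar images:

```python
def superimpose(base, overlay, top, left):
    """Paste `overlay` (a 2-D list of pixel values) onto a copy of
    `base` at position (top, left), leaving `base` unmodified."""
    out = [row[:] for row in base]          # copy so the base is preserved
    for i, row in enumerate(overlay):
        for j, px in enumerate(row):
            out[top + i][left + j] = px     # overwrite the partial area
    return out

base = [[0] * 4 for _ in range(4)]          # 4x4 "wide-angle" image
result = superimpose(base, [[9, 9], [9, 9]], 1, 1)
# The 2x2 overlay now occupies rows 1-2, columns 1-2 of the result.
```

Real superimposed display would additionally handle projection conversion, blending, and resolution differences; the point here is only that the second image replaces (or overlays) a partial area of the first.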
The present invention can be implemented in any convenient form, for example using dedicated hardware, or a mixture of dedicated hardware and software. The present invention may be implemented as computer software implemented by one or more networked processing apparatuses. The processing apparatuses can comprise any suitably programmed apparatuses such as a general-purpose computer, personal digital assistant, mobile telephone (such as a WAP or 3G-compliant phone) and so on. Since the present invention can be implemented as software, each and every aspect of the present invention thus encompasses computer software implementable on a programmable device. The computer software can be provided to the programmable device using any conventional carrier medium (carrier means). The carrier medium can comprise a transient carrier medium such as an electrical, optical, microwave, acoustic or radio frequency signal carrying the computer code. An example of such a transient medium is a TCP/IP signal carrying computer code over an IP network, such as the Internet. The carrier medium can also comprise a storage medium for storing processor readable code such as a floppy disk, hard disk, CD ROM, magnetic tape device or solid state memory device.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.
This patent application is based on and claims priority to Japanese Patent Application No. 2018-048369, filed on Mar. 15, 2018, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
Priority claims:

Number | Date | Country | Kind
---|---|---|---
2018-048369 | Mar 2018 | JP | national
2018-234814 | Dec 2018 | JP | national

PCT filing:

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/010376 | 3/13/2019 | WO | 00