The present application claims priority based on Japanese Patent Application No. 2017-247798 filed on Dec. 25, 2017, the contents of which are incorporated herein by reference in their entirety.
The present invention relates to an image processing system, an image processing method, an image processing device, a recording medium and a portable apparatus.
As described in Japanese Patent Application Laid-Open No. 2014-225831, a photographing system has been proposed which, when automatically photographing a moving object at a predetermined position, accurately captures the appearance of the moving object.
According to one embodiment of the present invention, there is provided an image processing system including a portable apparatus that an object possesses, an imaging device which is installed in a predetermined area, a transmission device which is installed in the predetermined area and transmits a transmission device ID which identifies the transmission device itself, and an image processing device which processes an image which is captured by the imaging device, in which a processor of the imaging device transmits image information pertaining to the image which is captured by the imaging device, a processor of the transmission device transmits the transmission device ID, a processor of the portable apparatus receives the transmitted transmission device ID, generates information which corresponds to the transmission device ID on the basis of a reception status of the transmission device ID and transmits the generated information which corresponds to the transmission device ID or the transmission device ID, and a processor of the image processing device receives the transmitted image information and the transmitted information which corresponds to the transmission device ID or the transmission device ID, and specifies images of parts relating to times that the object enters and leaves the area where the transmission device is installed from images in the image information on the basis of the received image information and the received information which corresponds to the transmission device ID or the transmission device ID.
According to one embodiment of the present invention, there is also provided an image processing method by an image processing system which includes a portable apparatus that an object possesses, an imaging device which is installed in a predetermined area, a transmission device which is installed in the predetermined area and transmits a transmission device ID which identifies the transmission device itself, and an image processing device which processes an image which is captured by the imaging device, including the step of transmitting, by the imaging device, image information pertaining to the image which is captured by the imaging device, the step of transmitting the transmission device ID by the transmission device, the step of receiving the transmission device ID which is transmitted from the transmission device, generating information which corresponds to the transmission device ID on the basis of a reception status of the transmission device ID and transmitting the generated information which corresponds to the transmission device ID or the transmission device ID by the portable apparatus, and the step of receiving the image information which is transmitted from the imaging device and the information which is transmitted from the portable apparatus and corresponds to the transmission device ID or the transmission device ID and specifying images of parts relating to times that the object enters and leaves the area where the transmission device is installed from images in the image information on the basis of the image information and the information which corresponds to the transmission device ID or the transmission device ID.
According to one embodiment of the present invention, there is also provided an image processing device which processes an image which is captured by an imaging device, in which a processor of the image processing device receives image information of the image which is captured by the imaging device which is installed in a predetermined area, receives information which is generated by a portable apparatus that an object possesses on the basis of a reception status of a transmission device ID of a transmission device which is installed in the predetermined area and corresponds to the transmission device ID or the transmission device ID, and specifies images of parts relating to times that the object enters and leaves the area where the transmission device is installed from images in the image information on the basis of the received image information and the information which corresponds to the transmission device ID or the transmission device ID.
According to one embodiment of the present invention, there is also provided a recording medium in which a computer readable program is recorded, making a computer implement a first reception function of receiving image information of an image which is captured by an imaging device which is installed in a predetermined area, a second reception function of receiving information which is generated by a portable apparatus that an object possesses on the basis of a reception status of a transmission device ID of a transmission device which is installed in the predetermined area and corresponds to the transmission device ID or the transmission device ID, and an image specification function of specifying images of parts relating to times that the object enters and leaves the area where the transmission device is installed from images in the image information on the basis of the image information which is received by the first reception function and the information which is received by the second reception function and corresponds to the transmission device ID or the transmission device ID.
According to one embodiment of the present invention, there is also provided a portable apparatus that an object possesses, in which a processor of the portable apparatus receives a transmission device ID which is transmitted from a transmission device which is installed in a predetermined area and transmits the transmission device ID for identifying the transmission device itself, generates information which corresponds to the transmission device ID on the basis of a reception status of the transmission device ID, transmits the generated information or the transmission device ID to an image processing device and acquires images of parts relating to times that the object enters and leaves the predetermined area which is specified by the image processing device on the basis of an image which is captured by an imaging device which is installed in the predetermined area and the information which corresponds to the transmission device ID or the transmission device ID.
In the following, specific aspects of the present invention will be described by using the drawings. However, the scope of the present invention is not limited to an illustrated example.
First, a schematic configuration of a photographing system (an image processing system) 100 will be described with reference to
As illustrated in
The server 1 is a server which provides a user with a photographing service by the photographing system 100 and stores and manages tracking data (which will be described later) which is transmitted from the portable apparatus 2, image information which is transmitted from the network camera 3 and so forth. In addition, the server 1 performs various kinds of data processing (for example, user registration (group registration), image addition, image publication and so forth) by executing various programs.
The portable apparatus 2 is, for example, a smartphone, a tablet PC, a cell phone, a PDA (Personal Digital Assistant) and so forth that the user possesses when moving. The portable apparatus 2 accepts an input operation of the user, transmits information which is based on the input operation concerned to the server 1 and displays information which is transmitted from the server 1 and received by the portable apparatus 2 itself.
The network camera 3 is a network camera adapted to photograph a predetermined area where the photographing service by the photographing system 100 is performed and transmits the image information (including recording time information) of photographed images to the server 1 at any time. The predetermined area where the photographing service is performed may be either one place or a plurality of places.
The beacon 4 is installed in the predetermined area where the photographing service by the photographing system 100 is performed and transmits the beacon ID at any time. That is, the beacons 4 are installed in a number corresponding to the number of predetermined areas where the photographing service is performed.
As illustrated in
The processor 11 controls the respective units of the server 1. The processor 11 reads out a designated program in the system programs and application programs which are stored in the storage unit 13, expands the read-out program into a work area of the RAM 12 and executes various processes in accordance with the program concerned.
The RAM 12 is, for example, a volatile memory and has the work area where the various programs and various kinds of data which are read out by the processor 11 are temporarily stored.
The storage unit 13 is configured by, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive) and so forth and is a data and program writable/readable storage unit. In addition, the storage unit 13 stores a program 13a, a logfile DB 13b, an image information DB 13c and so forth.
The program 13a includes the above-described various system programs and application programs to be executed by the processor 11.
The logfile DB (a storage unit) 13b is a database in which a logfile (which will be described later) which is prepared targeting on a user or a group who completes registration for utilization of the photographing service by the photographing system 100 is registered.
The image information DB 13c is a database in which image information which is transmitted from the network camera 3 via the Internet 5 is registered.
The operation unit 14 has, for example, a key input section such as a keyboard and so forth and a pointing device such as a mouse and so forth. In addition, the operation unit 14 accepts a key input and a position input and outputs operation information on the key input and the position input to the processor 11.
The display unit 15 is configured by, for example, an LCD (Liquid Crystal Display), an organic EL (Electro Luminescence) display and so forth. In addition, various screens are displayed on the display unit 15 in accordance with an instruction of a display signal which is input from the processor 11.
The communication unit 16 is configured by, for example, a network card and so forth. In addition, the communication unit 16 is connected to the Internet 5 in a communicable state and performs communication with equipment (for example, the portable apparatus 2, the network camera 3 and so forth) on the Internet 5.
As illustrated in
The processor 21 controls the respective units of the portable apparatus 2. The processor 21 reads out a designated program in the system programs and application programs which are stored in the storage unit 23, expands the read-out program into a work area of the RAM 22 and executes various processes in accordance with the program concerned. In that case, the processor 21 operates to store results of execution of the various processes in the RAM 22 and to display those results on the display unit 25 as necessary.
The RAM 22 is, for example, a volatile memory and has a work area where various programs and various kinds of data which are read out by the processor 21 are temporarily stored.
The storage unit 23 is configured by, for example, the HDD, the SSD and so forth and is a data and program writable/readable storage unit. In addition, the storage unit 23 stores a program 23a. The program 23a includes the above-described various system programs and application programs to be executed by the processor 21.
The operation unit 24 includes various kinds of function keys, accepts an input by depressing each key by the user and outputs operation information on the input by key depression to the processor 21. In addition, the operation unit 24 has a touch panel and so forth in which transparent electrodes are arranged in a grid so as to cover the surface of the display unit 25, detects a position which is depressed with a finger, a touch pen and so forth and outputs position information on the depressed position to the processor 21 as operation information.
The display unit 25 is configured by, for example, the LCD and so forth. Various screens are displayed on the display unit 25 in accordance with an instruction of a display signal which is input from the processor 21.
The communication unit 26 is wirelessly connected to the Internet 5 via a base station or an access point and performs communication with the server 1 which is connected to the Internet 5. In addition, the communication unit 26 receives the beacon ID which is transmitted from the beacon 4 by a wireless communication system such as, for example, Wi-Fi and so forth.
As illustrated in
The processor 31 controls the respective units of the network camera 3. The processor 31 reads out a designated program in the various system programs and application programs which are stored in the storage unit 33, expands the read-out program into a work area of the RAM 32 and executes various processes in accordance with the program concerned.
The RAM 32 is, for example, a volatile memory and has a work area where the various programs and various kinds of data which are read out by the processor 31 are temporarily stored.
The storage unit 33 is configured by, for example, the HDD, the SSD and so forth and is a data and program writable/readable storage unit. In addition, the storage unit 33 stores a program 33a. The program 33a includes the above-described various system programs and application programs to be executed by the processor 31.
The photographing unit 34 photographs a user who becomes the object and generates a photographed image.
Although illustration is omitted, the photographing unit 34 includes a camera which includes an optical system and an imaging element, and a photographing control section which controls the camera. The imaging element is an image sensor such as, for example, a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) and so forth. Then, the imaging element converts an optical image which passes through the optical system into a two-dimensional image signal.
The operation unit 35 includes various function keys, accepts an input by depressing each key and outputs operation information on the input by key depression to the processor 31.
The communication unit 36 is wirelessly connected to the Internet 5 via the base station or the access point and transmits image information on the image which is captured (photographed) by the photographing unit 34 to the server 1 which is connected to the Internet 5 at any time. Here, it is assumed that when the image information is transmitted to the server 1, the beacon ID of the beacon 4 which is installed in a photographing area of the network camera 3 which captures the image of the image information is transmitted in correspondence with the image information. Thereby, the server 1 is able to identify in which area the image in the received image information was photographed.
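As an illustration of this correspondence, a transmission from the network camera 3 could be modeled as follows; the record fields and the `area_of` helper are assumptions made for the sketch, since the embodiment does not prescribe a data format:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ImageUpload:
    """One transmission from the network camera 3 to the server 1."""
    beacon_id: str     # beacon ID of the beacon 4 installed in the photographing area
    start_time: float  # recording start time (epoch seconds)
    end_time: float    # recording end time (epoch seconds)
    payload: bytes     # encoded image data

def area_of(upload: ImageUpload, beacon_to_area: dict) -> str:
    """Identify in which area the image in the upload was photographed,
    using the beacon ID transmitted in correspondence with the image."""
    return beacon_to_area[upload.beacon_id]
```

Because every upload carries the beacon ID of its area, the server needs only a beacon-to-area table to sort incoming image information by photographing area.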
Next, details of the photographing service utilizing the photographing system 100 will be described with reference to
As illustrated in
In a case of receiving provision of the photographing service using the photographing system 100, the user gains access to a predetermined SNS (Social Networking Service) site 6 from the portable apparatus 2 that the user possesses via the Internet 5. Then, the user starts a chat function in the SNS site 6 and makes registration of a camerabot as a conversation partner (friend registration), whereby the photographing service (a personal photographing service) targeting on the user is started. Here, the camerabot is a robot which is in charge of acceptance of the photographing service and so forth and is one which personifies an exclusive photographer. Then, the photographing system 100 is configured in such a manner that when the photographing service is started, a user ID for identifying the user who made registration of the camerabot is transmitted to the server 1 via the SNS site 6 and a logfile which targets on the user is prepared in the server 1.
As the photographing service, there exists a group photographing service in addition to the personal photographing service. In order to start the group photographing service, the camerabot is registered in a group chat which targets on a group which intends to receive this service and thereby the photographing service (the group photographing service) which targets on respective members of the group is started. Then, the photographing system 100 is configured in such a manner that when the group photographing service is started, user IDs for identifying the respective members are transmitted to the server 1 via the SNS site 6 and a logfile which targets on the group is prepared in the server 1.
Then, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Specifically, the server 1 refers to the logfile for the user A illustrated in
Next, when the image edition processing is terminated, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
Specifically, the server 1 refers to the logfile for the group G illustrated in
Next, when the image edition processing is terminated, as illustrated in
Next, as illustrated in
Next, control procedures of the image edition processing of the server 1 in the photographing system 100 will be described.
As illustrated in
Next, the processor 11 decides whether registration of the camerabot CB is made by the portable apparatus 2 via the SNS site 6 (step S2).
In step S2, in a case where it is decided that registration of the camerabot CB is made by the portable apparatus 2 (step S2: YES), the processor 11 prepares a logfile for a user or a group who makes registration of the camerabot CB, stores the logfile into the logfile DB 13b (step S3) and shifts to a process of step S4.
On the other hand, in step S2, in a case where it is decided that registration of the camerabot CB is not made by the portable apparatus 2 (step S2: NO), the processor 11 skips over step S3 and shifts to a process of step S4.
Next, the processor 11 decides whether tracking data is received from the portable apparatus 2 via the communication unit 16 (step S4).
In step S4, in a case where it is decided that the tracking data is received from the portable apparatus 2 (step S4: YES), the processor 11 stores the received tracking data into the logfile for the target user or group (step S5) and shifts to a process of step S6.
On the other hand, in step S4, in a case where it is decided that the tracking data is not received from the portable apparatus 2 (step S4: NO), the processor 11 skips over step S5 and shifts to a process of step S6.
Next, the processor 11 decides whether termination of photographing is instructed from the portable apparatus 2 via the SNS site 6 (step S6).
In step S6, in a case where it is decided that the termination of photographing is not instructed from the portable apparatus 2 (step S6: NO), the processor 11 returns the process back to step S1 and executes the subsequent processes repetitively.
On the other hand, in step S6, in a case where it is decided that the termination of photographing is instructed from the portable apparatus 2 (step S6: YES), the processor 11 refers to the logfile for the target user or group and edits the images in the respective pieces of image information which are stored in the image information DB 13c (step S7). Specifically, the processor 11 specifies images of parts relating to the target user or group from the images in the respective pieces of image information which are stored in the image information DB 13c. Then, the processor 11 cuts out the specified images, synthesizes the cut-out images in time series and generates one moving image.
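The specification and synthesis in step S7 can be sketched as below. Operating on (beacon ID, start, end) tuples instead of decoded video, and the `Clip` type, are simplifying assumptions; the point is that each tracking record selects the overlapping part of every recording made in the same area, and the cut-out parts are then arranged in time series:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Clip:
    beacon_id: str
    start: float
    end: float

def edit_images(uploads, tracking_records):
    """Step S7 sketch: for every (beacon_id, enter, leave) tracking record,
    cut the overlapping part out of each recording made in the same area,
    then arrange the cut-out clips in time series as one moving image."""
    clips = []
    for beacon_id, enter, leave in tracking_records:
        for up_id, up_start, up_end in uploads:
            if up_id != beacon_id:
                continue  # recording was made in a different area
            start, end = max(up_start, enter), min(up_end, leave)
            if start < end:  # the object appears in this recording
                clips.append(Clip(beacon_id, start, end))
    clips.sort(key=lambda c: c.start)  # synthesize in time series
    return clips
```

Sorting by start time gives the time-series order regardless of the order in which the tracking records arrived.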
Next, the processor 11 decides whether publication of the moving image is instructed from the portable apparatus 2 via the SNS site 6 (step S8).
In step S8, in a case where it is decided that the publication of the moving image is not instructed from the portable apparatus 2 (step S8: NO), the processor 11 returns the process back to step S1 and executes the subsequent processes repetitively.
On the other hand, in step S8, in a case where it is decided that the publication of the moving image is instructed from the portable apparatus 2 (step S8: YES), the processor 11 publishes the moving image which is edited in step S7 on a video sharing site (step S9), returns the process back to step S1 and executes the subsequent processes repetitively. Incidentally, the processor 11 executes the processes of step S1 to step S9 repetitively while the power source of the server 1 is in an ON state.
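Taken together, steps S1 to S9 amount to an event-dispatch loop of roughly the following shape. The event names and the `state` dictionary standing in for the logfile DB 13b and the image information DB 13c are hypothetical; the sketch dispatches a pre-queued list of events instead of polling real I/O:

```python
def server_loop(events, state):
    """Dispatch (kind, data) events following steps S1-S9 of the embodiment.

    state: dict with 'logfiles' (user/group -> tracking records),
    'moving_images' and 'published' lists standing in for the DBs.
    """
    for kind, data in events:
        if kind == "register_camerabot":         # steps S2/S3
            state["logfiles"].setdefault(data, [])
        elif kind == "tracking_data":            # steps S4/S5
            target, record = data
            state["logfiles"][target].append(record)
        elif kind == "terminate_photographing":  # steps S6/S7
            # stand-in for cutting/synthesizing one moving image
            state["moving_images"].append(list(state["logfiles"][data]))
        elif kind == "publish":                  # steps S8/S9
            state["published"].append(state["moving_images"][data])
    return state
```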
In addition, although in the above-described embodiment, the tracking data (the enter and leave time information) relating to the enter and leave times of the object is generated, tracking data (enter and leave date and time information) relating to enter and leave dates and times of the object may be generated.
According to the photographing system 100 of the present embodiment, the network camera 3 transmits the image information pertaining to the image which is captured by the network camera 3 concerned, the beacon 4 transmits the beacon ID and the portable apparatus 2 receives the beacon ID, generates the tracking data (the enter and leave time information) relating to the times that the object enters and leaves the area where the beacon 4 which corresponds to the beacon ID concerned is installed on the basis of the reception status of the beacon ID and transmits the generated tracking data. The server 1 receives the image information which is transmitted from the network camera 3 and the tracking data which is transmitted from the portable apparatus 2 and consequently specifies the images of the parts relating to the enter and leave times of the object from the images in the image information concerned on the basis of the image information concerned and the tracking data concerned.
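One plausible way for the portable apparatus 2 to derive the enter and leave times from the reception status is to treat the first reception of a beacon ID as entry and a reception gap longer than a timeout as departure. The timeout value and the function name below are assumptions for the sketch, not part of the embodiment:

```python
def derive_enter_leave(sightings, timeout=30.0):
    """Convert chronological (beacon_id, timestamp) receptions into
    tracking data: a list of (beacon_id, enter_time, leave_time) records.

    A beacon's area is 'entered' at its first reception and 'left' when
    its ID is no longer heard for `timeout` seconds (or the log ends).
    """
    records = []
    current_id, enter_t, last_t = None, None, None
    for beacon_id, ts in sightings:
        if beacon_id == current_id and ts - last_t <= timeout:
            last_t = ts  # still inside the same area
            continue
        if current_id is not None:
            records.append((current_id, enter_t, last_t))
        current_id, enter_t, last_t = beacon_id, ts, ts
    if current_id is not None:
        records.append((current_id, enter_t, last_t))
    return records
```

With this convention, moving from the area of one beacon to that of another closes the previous record and opens a new one, so the tracking data naturally lists the areas in the order they were visited.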
Therefore, since the server 1 is able to specify the images of the parts relating to the enter and leave times of the object concerned from the images in the image information which is transmitted from the network camera 3 simply by transmitting the tracking data from the portable apparatus 2 that the object possesses to the server 1, it becomes possible to accurately catch the object concerned by efficient processing using the portable apparatus 2 concerned.
In addition, according to the photographing system 100 of the present embodiment, a plurality of the network cameras 3 and a plurality of the beacons 4 are provided respectively, each network camera 3 and each beacon 4 are installed in each of the plurality of areas (the first area R1, the second area R2 and the third area R3) and each network camera 3 transmits the image information pertaining to the image which is captured by the network camera 3 concerned and the beacon ID of the beacon 4 which is installed in the same area as that of the network camera 3 concerned in correspondence with each other. The server 1 receives the image information and the beacon ID which is made in correspondence with the image information which are transmitted from each network camera 3 and the tracking data which is transmitted from the portable apparatus 2 and consequently specifies the images of the parts relating to the enter and leave times of the object from the images in the image information concerned for every piece of the image information on the basis of the image information concerned and the beacon ID which is made in correspondence with the image information concerned, and the tracking data concerned.
Accordingly, since it is possible to specify the images of the parts relating to the enter and leave times of the object from the images which are photographed in the respective areas, it becomes possible to utilize the photographing system 100 diversely.
In addition, according to the photographing system 100 of the present embodiment, a plurality of the portable apparatuses 2 are provided, each of a plurality of objects possesses one of the portable apparatuses 2, each network camera 3 transmits the image information pertaining to the image which is captured by the network camera 3 concerned and the beacon ID of the beacon 4 which is installed in the same area as that of the network camera 3 in correspondence with each other and each portable apparatus 2 transmits the tracking data and the user ID in correspondence with each other. The server 1 receives the image information and the beacon ID which is made in correspondence with the image information which are transmitted from each network camera 3, and the tracking data and the user ID which are transmitted from each portable apparatus 2 and consequently specifies images of parts relating to enter and leave times of an object who corresponds to the user ID concerned from the images in the image information concerned for every piece of the image information on the basis of the image information concerned and the beacon ID which is made in correspondence with the image information, and the tracking data concerned and the user ID.
Accordingly, since it is possible to specify the images of the parts relating to the respective enter and leave times of the plurality of objects, it becomes possible to utilize the photographing system 100 more diversely.
In addition, according to the photographing system 100 of the present embodiment, the server 1 generates the logfile (the related information) indicating the relation between the image information and the beacon ID which is made in correspondence with the image information concerned, and the tracking data and the user ID which are transmitted from the portable apparatus 2 concerned, stores the logfile concerned into the logfile DB 13b and consequently specifies images of parts relating to enter and leave times of a desired object who corresponds to the user ID from the images in the image information concerned for every piece of the image information on the basis of the logfile which is stored in the logfile DB 13b.
Accordingly, since it is possible to specify the images of the parts relating to the enter and leave times of the object of the desired user ID on the basis of the logfile which is stored in advance in the logfile DB 13b, it becomes possible to perform specification of the images concerned smoothly.
In addition, according to the photographing system 100 of the present embodiment, the server 1 cuts out and synthesizes the plurality of specified images. Therefore, since it is possible to see the plurality of specified images altogether at one time, it becomes possible to facilitate confirmation of the image.
In addition, according to the photographing system 100 of the present embodiment, the portable apparatus 2 further transmits the input information which is input on the basis of the user operation of the portable apparatus 2 concerned. The server 1 receives the input information which is transmitted from the portable apparatus 2 and consequently further synthesizes the input information concerned when synthesizing the plurality of cut-out images. Therefore, since it becomes possible to add a message and so forth to the images to be synthesized, it is possible to edit the images into a synthetic image which is peculiar to each user.
In addition, according to the photographing system 100 of the present embodiment, when synthesizing the plurality of cut-out images, the server 1 synthesizes the plurality of cut-out images concerned in order of the times that the tracking data indicates. Therefore, it becomes possible to confirm the state of the object which changes with time.
In addition, according to the photographing system 100 of the present embodiment, when synthesis of the plurality of cut-out images is terminated, the server 1 transmits termination notice information which indicates that the synthesis concerned is terminated to the portable apparatus 2. Therefore, it becomes possible to promptly confirm the synthetic image so synthesized.
In addition, according to the photographing system 100 of the present embodiment, the server 1 is able to upload the synthetic image so synthesized onto the video sharing site. Therefore, it is possible to share the synthetic image with other users.
Incidentally, the present invention is not limited to the above-described embodiment and various improvements and design alterations may be made within a range not deviating from the gist of the present invention.
For example, although in the above-described embodiment, description was made by illustrating a case where the photographing system 100 is used in the skiing ground, a place where the photographing system 100 is used may be a region where movement of the object is assumed and, for example, may be a theme park, a marathon course and so forth. In addition, in regard to the network camera 3 and the beacon 4 used in the photographing system 100, for example, the network camera 3 and the beacon 4 may be combined with each other and installed on a moving body (for example, an attraction) in the theme park, not limited to a case where the network camera 3 and the beacon 4 are installed in the predetermined area.
In addition, although in the above-described embodiment, edition of the image and uploading of the edited image are performed in the server 1, for example, re-edition such as insertion of a title, a telop and so forth into the edited image, change of the title, the telop and so forth in the edited image, cutting of an unnecessary scene and so forth may be performed on the basis of the user operation.
In addition, although in the above-described embodiment, the user A gives the instructions to terminate photographing and to publish the moving image among the respective users (the user A and the user B) who receive provision of the group photographing service, for example, only a user (for example, an administrator) for whom setting is made in advance may give the instructions to terminate photographing and to publish the moving image among the plurality of users who compose the group.
In addition, although in the above-described embodiment, the server 1 cuts the images of the parts relating to the target user or group out of the respective pieces of the image information stored in the image information DB 13c, synthesizes the cut-out images in time series and generates one moving image, for example, the moving image may be generated by synthesizing the cut-out images for each area where photographing is performed.
In addition, although in the above-described embodiment, the network camera 3 and the beacon 4 are installed in each of the respective areas (the first area R1, the second area R2 and the third area R3), for example, the network camera 3 and the beacon 4 may be integrated into one device and the device so integrated may be installed in each area.
Although the embodiment of the present invention was described as above, the scope of the present invention is not limited to the above-described embodiment and embraces the scope of the invention described in the appended claims and a range of equivalency of the claims.
Number | Date | Country | Kind
---|---|---|---
2017-247798 | Dec 2017 | JP | national