The present invention relates to an image shooting system, and more specifically, to a plurality of image shooting modules and a system thereof that can collaboratively shoot images via collaborative groups.
Currently, the shooting angle is dependent on personal decision and control. Such a personal and individual shooting style is isolated and cannot be successfully integrated into the surrounding environment. In addition, many personal shooting angles overlap heavily, so that the resulting images look almost the same. Individual image-takers working alone duplicate one another's work and waste resources.
Also, due to this personal or individual shooting style, there are two ways to take a partially surrounding image or a panoramic image: in the first, an individual stands with a camera, horizontally rotates the camera through a full circle to take the panoramic image, and then processes the result with software; the second uses a multi-lens camera to shoot synchronously.
However, the first case again involves heavily overlapped personal shooting angles that produce nearly identical images, while the second case requires a camera that is costly and heavy and is apt to cause other problems, such as higher power consumption.
To solve the above problems, a primary object of the present invention is to provide an image shooting system that shoots images via collaborative groups rather than by individual shooting.
Another object of the present invention is to provide an image shooting system that includes a plurality of individual lens devices which are able to actively adjust their own shooting angle in real time through social network according to their own position data.
A further object of the present invention is to provide an image shooting system that is capable of adjusting each shooting angle of individual lens devices thereof according to position data along with relative data generated by individual lens devices.
A still further object of the present invention is to provide an image shooting system that shoots and completes a panoramic image with a plurality of image shooting modules exchanging their own position data through social network.
To achieve the above and other objects, the image shooting system provided according to the present invention includes a plurality of image shooting modules, each of which has a lens device. The lens device has a lens unit, a positioning unit, a processing unit, a rotation unit, and a wireless communication unit. The lens unit is used for shooting an image, which is processed into image data by the processing unit. The positioning unit generates position data, and the rotation unit rotates relative to the lens unit. A mobile device has an application unit, which is connected to the lens device via the wireless communication unit to acquire the position data of the lens device, defines a predetermined area according to the position data, and has a collaborative group. The application unit of each image shooting module acquires position data of the lens devices of other image shooting modules via a communication platform and includes other lens devices located in the predetermined area into the collaborative group to shoot collaboratively. The application unit further interprets each lens device's relative location in the collaborative group according to the individual position data, so as to respectively control the rotation unit to rotate relative to the lens unit and adjust the shooting angle of the lens device, such that each lens device shoots at a different shooting angle.
In an embodiment, the positioning unit has a locator, and the position data is geographic data.
In an embodiment, the positioning unit has a locator and a relative position sensor, and the position data includes geographic data and relative position data.
In an embodiment, the relative position sensor includes an ultrasonic sensor, a light sensor, a Bluetooth sensor, or any combination thereof.
In an embodiment, the rotation unit is a servo motor that can rotate by 360 degrees.
In an embodiment, the communication platform is a cloud server.
In an embodiment, the lens device further includes a microphone unit for receiving an environmental sound, which is processed into sound data by the processing unit.
In an embodiment, the positioning unit further includes a weight sensor for sensing the slant angle of the lens unit; the lens unit has a slant angle adjustment unit, which adjusts the slant angle of the lens unit according to the data sensed by the weight sensor.
With these arrangements, the image shooting system has a plurality of image shooting modules, each of which has a lens device connected to a mobile device. The mobile device has an application unit, which interprets the location of the lens device connected thereto according to the position data generated by the lens device. Each image shooting module acquires position data of the lens devices of other image shooting modules via a communication platform to include other lens devices in the same area into a collaborative group. The mobile device adjusts each individual lens device to shoot at a different shooting angle according to the relative position data generated by each lens device of the collaborative group.
The structure and the technical means adopted by the present invention to achieve the above and other objects can be best understood by referring to the following detailed description of the preferred embodiments and the accompanying drawings, wherein
The present invention will now be described with some preferred embodiments thereof and by referring to the accompanying drawings. For ease of understanding, elements that are the same in the preferred embodiments are denoted by the same reference numerals.
Please refer to
Please refer to
The lens unit 111 is used for shooting an image, which is processed into image data by the processing unit 113. In an embodiment, the lens unit 111 has a plurality of lenses for generating an image, and a visual angle that may range, for example but not limited to, from 90 to 160 degrees, and is preferably 120 degrees in the illustrated preferred embodiment. The processing unit 113 is used for receiving and processing the image shot by the lens unit 111. The positioning unit 112 is used for generating position data.
In an embodiment, the positioning unit 112 has a locator 1121 and a relative position sensor 1122. The locator 1121, such as a Global Positioning System (GPS) receiver, generates geographic data according to satellite signals. The relative position sensor 1122, for example but not limited to, transmits a sensing signal to other nearby lens devices 11, and either senses signals transmitted by the other nearby lens devices 11 to generate relative position data, or transmits a sensing signal to the other nearby lens devices 11 and then receives feedback signals transmitted by them to generate the relative position data. The relative position sensor 1122 includes an ultrasonic sensor, a light sensor, a Bluetooth sensor, or any combination thereof. In another embodiment, the positioning unit 112 includes only the locator 1121.
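As a minimal sketch of the feedback-signal variant of the relative position sensor 1122, the distance to a nearby lens device can be derived from the round-trip time of an ultrasonic ping. The function name and the timing value below are illustrative assumptions, not part of the specification.

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance in meters to a neighboring lens device from an echoed ping.

    The ping travels to the neighbor and back, so the one-way
    distance is half the round-trip path.
    """
    return (round_trip_s * SPEED_OF_SOUND_M_S) / 2.0

# A 20 ms round trip corresponds to roughly 3.43 m of separation.
print(round(distance_from_round_trip(0.020), 2))  # 3.43
```

Pairwise distances obtained this way, combined across the devices in a group, are one plausible basis for the relative position data described above.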
The rotation unit 114 rotates relative to the lens unit 111. In an embodiment, the rotation unit 114 can be, for example but not limited to, a servo motor which can rotate by 360 degrees relative to the lens unit 111 in one direction, such as by 360 degrees in the horizontal direction. The wireless communication unit 115 can be, for example but not limited to, a WiFi communication unit or a Bluetooth communication unit for wirelessly connecting to other devices. The lens device 11 further includes a microphone unit 116 for receiving sound, such as an environmental sound or a user's voice, which is processed into sound data by the processing unit 113. Normally, after shooting, the lens device 11 generates an image combined with the sound data.
In another embodiment, the lens unit 111 further has a slant angle adjustment unit 117 connected to the positioning unit 112. The positioning unit 112 further includes a weight sensor 1123 for sensing the slant angle of the lens unit 111. The slant angle adjustment unit 117 adjusts the slant angle of the lens unit 111 to slant upward or downward according to the data sensed and transmitted by the weight sensor 1123. The weight sensor 1123 can be, for example, a G-sensor, and the slant angle adjustment unit 117 can be, for example, a bidirectional motor device able to move the lens unit 111, for example but not limited to, within a range from 90 degrees upward to 30 degrees downward. Furthermore, when the lens unit 111 slants upward or downward, the weight sensor 1123 senses the slant angle of the lens unit 111 and transmits the sensed data to the slant angle adjustment unit 117, such that, according to the sensed data, the slant angle adjustment unit 117 is able to adjust the lens unit 111 back to and keep it at a horizontal position.
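The leveling behavior described above can be sketched as follows, assuming the G-sensor reports two gravity components from which the pitch of the lens unit is computed and a correction is commanded within the 90-degrees-up / 30-degrees-down travel range. The function names and the use of `atan2` are illustrative assumptions, not the specification's method.

```python
import math

UP_LIMIT_DEG = 90.0     # maximum upward travel of the slant angle adjustment unit
DOWN_LIMIT_DEG = -30.0  # maximum downward travel

def slant_angle_deg(g_forward: float, g_down: float) -> float:
    """Pitch of the lens unit, in degrees, from two gravity components
    reported by the weight sensor (G-sensor)."""
    return math.degrees(math.atan2(g_forward, g_down))

def leveling_correction(g_forward: float, g_down: float) -> float:
    """Angle the bidirectional motor should travel to return the lens
    unit to horizontal, clamped to the unit's mechanical range."""
    correction = -slant_angle_deg(g_forward, g_down)
    return max(DOWN_LIMIT_DEG, min(UP_LIMIT_DEG, correction))

# A lens tilted 10 degrees upward needs a -10 degree (downward) correction.
print(round(leveling_correction(math.sin(math.radians(10)),
                                math.cos(math.radians(10))), 1))  # -10.0
```

Note that when the required correction exceeds the 30-degree downward travel limit, the clamp returns the largest correction the mechanism can make in one step.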
The mobile device 12 of the image shooting module 10 has an application unit 121, which is wirelessly connected to the lens device 11 via the wireless communication unit 115 to acquire the position data of the lens device 11 and, according to the position data, defines a predetermined area. The predetermined area is specifically defined as a circular area, in which the connected lens device 11 serves as a center point and a distance is taken as a radius to draw a virtual circle. The radius can be determined according to the number of users or the users' locations, and is, for example but not limited to, 10 to 40 meters. Also, the application unit 121 of each mobile device 12 has a collaborative group 1211 and is connected to the communication platform 21, such as a cloud server, via a wireless network 22 to acquire position data of the lens devices 11 of other image shooting modules 10 via the communication platform 21, and includes other lens devices 11 located in the predetermined area into the collaborative group 1211. Each lens device 11 in the collaborative group 1211 senses relative position data among one another via the relative position sensor 1122 of the positioning unit 112. The application unit 121 of the mobile device 12 further interprets each lens device's relative position in the collaborative group 1211 according to the individual position data, so as to respectively control the rotation unit 114 of each lens device 11 to rotate relative to the lens unit 111 and adjust the shooting angle of the lens device 11, such that each lens device 11 shoots at a different shooting angle.
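A minimal sketch of how the application unit 121 might test which nearby lens devices fall inside the predetermined circular area: great-circle distance between GPS fixes via the haversine formula, against a radius in the 10-40 meter range. The haversine formula, the 25 m radius, and the coordinates are illustrative assumptions, not requirements of the specification.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (in degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def collaborative_group(center, others, radius_m=25.0):
    """Return ids of devices whose position lies inside the virtual circle
    drawn around the connected lens device."""
    lat0, lon0 = center
    return [dev_id for dev_id, (lat, lon) in others.items()
            if haversine_m(lat0, lon0, lat, lon) <= radius_m]

# Hypothetical fixes: device 11b is ~15 m away and joins the group;
# device 11c is well over 100 m away and is excluded.
center = (25.0330, 121.5654)
others = {"11b": (25.03313, 121.5654), "11c": (25.0343, 121.5660)}
print(collaborative_group(center, others))  # ['11b']
```

In practice the application unit would re-evaluate membership as position data is refreshed through the communication platform 21.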
It should be noted here that the image data shot by the lens devices 11 in the collaborative group 1211 collectively contain synchronized geographic, time, and shooting angle information, as well as change records of each lens device 11 during shooting.
Also, the image data carrying this information are uploaded from the application unit 121 of the connected mobile device 12 to a database of the communication platform 21, and are then edited and reproduced by a render engine according to the geographic, time, and shooting angle information of the image data, as well as the change records of each lens device 11 during shooting, to produce the whole image data processed by the collaborative group 1211.
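The metadata accompanying each upload might be bundled as a record like the following. The field names and structure are assumptions for illustration; the specification only states that geographic, time, and shooting angle information plus change records accompany the image data.

```python
import json
import time

def make_shot_record(device_id, lat, lon, shooting_angle_deg,
                     change_records, timestamp=None):
    """Bundle one lens device's shooting metadata for upload to the
    database of the communication platform."""
    return {
        "device": device_id,
        "geo": {"lat": lat, "lon": lon},
        "time": timestamp if timestamp is not None else time.time(),
        "shooting_angle_deg": shooting_angle_deg,
        # angle/position changes of the lens device during shooting,
        # used by the render engine when editing the whole image
        "change_records": change_records,
    }

record = make_shot_record("11a", 25.0330, 121.5654, 300.0,
                          [{"t": 0.0, "angle_deg": 300.0},
                           {"t": 2.5, "angle_deg": 310.0}],
                          timestamp=1700000000.0)
print(json.dumps(record, sort_keys=True))
```

A render engine could then align records from all devices in the group by the shared time base before stitching.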
The application of the present invention will be described as follows. For ease of understanding, elements that are included in the image shooting modules in the preferred embodiments are denoted by the same reference numerals.
The application unit 121 of the mobile device 12a of the first image shooting module 10a acquires the position data of the lens devices 11b-11g of the other image shooting modules 10b-10g and, according to these data, interprets that the second image shooting module 10b is located within the predetermined area C of the first image shooting module 10a. The application unit 121 of the mobile device 12a of the first image shooting module 10a includes the lens devices 11a, 11b in the collaborative group 1211, which respectively generate individual relative position data after sensing each other's positions.
As shown in
As shown in
Please refer to
when the relative positions of the lens devices 11a, 11b, and 11c in the collaborative group 1211 are changed, the application unit 121 interprets that the lens device 11b is in the forefront, the lens device 11a is in the rearmost position and at the left side of the lens device 11c, and the lens device 11c is at the right side of the lens device 11a; that is, the triangle formed by the relative positions of the three lens devices 11a, 11b, and 11c is changed. The shooting angle w2 of the lens device 11b is oriented forward, i.e. in the same direction as the shooting main axis Y1; the shooting angle w1 of the lens device 11a is oriented in the left-rear direction, i.e. 60 degrees to the left side of the shooting main axis Y1; and the shooting angle w3 of the lens device 11c is oriented in the right-rear direction, i.e. 60 degrees to the right side of the shooting main axis Y1. Since the shooting angles w1, w2, and w3 of the three lens devices 11a, 11b, and 11c are each preferably 120 degrees, the union of the fields of view of the three lens devices 11a, 11b, and 11c creates a panoramic image of at least 360 degrees.
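The coverage arithmetic above can be checked directly: with the three shooting directions spread evenly around the group, the union of three 120-degree visual angles spans the full circle. The even 360/N spacing used below is an illustrative reading of this embodiment, not a constraint stated by the specification.

```python
def covers_full_circle(num_devices: int, fov_deg: float) -> bool:
    """Evenly spaced lens devices cover 360 degrees if and only if each
    visual angle is at least as wide as the angular gap between
    neighboring shooting directions."""
    return fov_deg >= 360.0 / num_devices

print(covers_full_circle(3, 120.0))  # True: three 120-degree fields close the circle
print(covers_full_circle(3, 90.0))   # False: 90 < 120-degree spacing leaves gaps
```

The same check shows why, for example, four devices would suffice with only 90-degree lenses.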
Please refer to
The present invention has been described with some preferred embodiments thereof and it is understood that many changes and modifications in the described embodiments can be carried out without departing from the scope and the spirit of the invention that is intended to be limited only by the appended claims.