The present invention relates generally to a method for performing a cooperative function and a device using the same, and more particularly, to a method for allowing a plurality of devices to perform a single cooperative function together, and a device using the same.
With the advent of various digital devices, the number of digital devices that a single user possesses has increased significantly.
These various digital devices have provided increased convenience, and continue to become more sophisticated by incorporating multiple functions.
Users, however, still pursue digital devices having more advanced and sophisticated functions.
An individual digital device is nevertheless limited in the number of functions it can perform on its own. Therefore, a method is required for converging and combining the devices a user owns, so that a plurality of digital devices can together provide a new function.
The present invention has been made to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention provides a method for performing a cooperative function if it is determined that there are other devices in the surrounding area, and a device using the same.
According to an aspect of the present invention, a first electronic device is provided that includes a display, camera circuitry, a memory configured to store instructions, and one or more processors configured to execute the instructions stored in the memory. A second electronic device disposed in an area related to the first electronic device is detected while video is being obtained via the camera circuitry. Based on detecting the second electronic device disposed in the area, the video is automatically transmitted to the second electronic device, without receiving a user input for transmitting the video being obtained via the camera circuitry, such that the transmitted video is provided on a display of the second electronic device. A third electronic device disposed in the area related to the first electronic device is detected while a still image, previously stored by the first electronic device, is provided on the display of the first electronic device. Based on detecting the third electronic device disposed in the area, the still image is automatically transmitted to the third electronic device, without receiving a user input for transmitting the still image being provided on the display of the first electronic device, such that the transmitted still image is provided on a display of the third electronic device.
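As a rough illustration of the behavior summarized above, the following Python sketch shows one way the dispatch could look. It is a minimal sketch under assumed names (NearbyDevice, FirstDeviceState, and the send callable are all hypothetical), not the claimed implementation: when a nearby device is detected, live video is forwarded if the camera is capturing, and the currently displayed still image is forwarded otherwise.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class NearbyDevice:
    """A device detected in the area around the first device (hypothetical model)."""
    device_id: str
    has_display: bool

@dataclass
class FirstDeviceState:
    """What the first device is currently doing with its camera and display."""
    obtaining_video: bool = False                # camera circuitry is capturing video
    displayed_still_image: Optional[str] = None  # previously stored image shown on the display

def on_device_detected(state: FirstDeviceState, other: NearbyDevice,
                       send: Callable[[str, dict], None]) -> None:
    """Automatically transmit content to the detected device, without any user input.

    `send(device_id, payload)` stands in for whatever transport the two devices share.
    """
    if not other.has_display:
        return
    if state.obtaining_video:
        # Second-device case: forward the video being obtained via the camera circuitry.
        send(other.device_id, {"type": "video", "source": "camera"})
    elif state.displayed_still_image is not None:
        # Third-device case: forward the still image currently provided on the display.
        send(other.device_id, {"type": "still_image", "path": state.displayed_still_image})

if __name__ == "__main__":
    log = []
    send = lambda device_id, payload: log.append((device_id, payload))
    on_device_detected(FirstDeviceState(obtaining_video=True),
                       NearbyDevice("frame-01", True), send)
    on_device_detected(FirstDeviceState(displayed_still_image="photo_0001.jpg"),
                       NearbyDevice("frame-02", True), send)
    print(log)  # one video payload and one still-image payload
```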
The above and other aspects, features and advantages of the present invention will be more apparent from the following detailed description when taken in conjunction with the accompanying drawings, in which:
Embodiments of the present invention are described in detail with reference to the accompanying drawings.
In the following description, the same or similar reference numerals may be used for the same or similar elements when they are illustrated in different drawings. Detailed descriptions of constructions or processes known in the art may be omitted to avoid obscuring the subject matter of the present invention.
If the digital camera 110 is adjacent to the electronic frame 120 as illustrated in FIG. 1A, the two devices automatically perform a cooperative function, as illustrated in FIG. 1B.
FIG. 1B illustrates the digital camera 110 transmitting its stored photos to the electronic frame 120, and the electronic frame 120 reproducing the photos received from the digital camera 110 as a slideshow.
In order to perform the above cooperative function, the digital camera 110 is preset to transmit stored photos to the electronic frame 120 if the digital camera 110 is adjacent to the electronic frame 120.
In addition, in order to perform the above cooperative function, the electronic frame 120 is preset to reproduce the photos received from the digital camera 110 as a slideshow if the electronic frame 120 is adjacent to the digital camera 110.
Once the cooperative function starts between the digital camera 110 and the electronic frame 120, the cooperative function continues even if the distance between the two devices increases.
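The "preset" behavior described above can be pictured as a small rule table held by each device: the camera's entry states what to transmit when an electronic frame is detected nearby, and the frame's entry states what to do with what it receives. The Python sketch below is only an assumed representation for illustration; the device-type strings and action names are hypothetical.

```python
# Hypothetical preset tables; each device stores its own side of the cooperative function.
CAMERA_PRESETS = {
    # (own device type, nearby device type) -> action performed automatically
    ("digital_camera", "electronic_frame"): "transmit_stored_photos",
}

FRAME_PRESETS = {
    ("electronic_frame", "digital_camera"): "slideshow_received_photos",
}

def preset_action(presets, own_type, nearby_type):
    """Return the preset cooperative action, or None if no cooperation is configured."""
    return presets.get((own_type, nearby_type))

print(preset_action(CAMERA_PRESETS, "digital_camera", "electronic_frame"))  # transmit_stored_photos
print(preset_action(FRAME_PRESETS, "electronic_frame", "digital_camera"))   # slideshow_received_photos
```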
If the digital camera 220 is adjacent to the PC 210 as illustrated in FIG. 2, the digital camera 220 transmits additionally stored photos to the PC 210, and the PC 210 backs up the received photos.
In order to perform the above cooperative function, the digital camera 220 is preset to transmit additionally stored photos to the PC 210 if the digital camera 220 is adjacent to the PC 210.
In addition, in order to perform the above cooperative function, the PC 210 is preset to back up the photos received from the digital camera 220 in a designated folder of its HDD if the PC 210 is adjacent to the digital camera 220. Once the cooperative function starts between the PC 210 and the digital camera 220, the cooperative function continues even if the distance between the two devices increases.
If the digital camera 310 is adjacent to the printer 320 as illustrated in FIG. 3, the digital camera 310 transmits the photos that are currently being reproduced to the printer 320, and the printer 320 prints the received photos.
In order to perform the above cooperative function, the digital camera 310 is preset to transmit photos that are currently being reproduced to the printer 320 if the digital camera 310 is adjacent to the printer 320.
In addition, in order to perform the above cooperative function, the printer 320 is preset to print the photos received from the digital camera 310 if the printer 320 is adjacent to the digital camera 310.
If an MP3 player that is reproducing music is adjacent to the printer 320, the MP3 player transmits information regarding the currently playing music to the printer 320, and the printer 320 may download the lyrics or sheet music for that music through the Internet and print them.
If the mobile phone-A 410 is adjacent to the mobile phone-B 420 as illustrated in FIG. 4, the mobile phone-A 410 transmits the photos that are currently being reproduced to the mobile phone-B 420, and the mobile phone-B 420 displays the received photos.
In order to perform the above cooperative function, the mobile phone-A 410 is preset to transmit the photos that are currently being reproduced to the mobile phone-B 420 if the mobile phone-A 410 is adjacent to the mobile phone-B 420. In addition, in order to perform the above cooperative function, the mobile phone-B 420 is preset to display the photos received from the mobile phone-A 410 if the mobile phone-B 420 is adjacent to the mobile phone-A 410.
Once the cooperative function starts between the mobile phone-A 410 and the mobile phone-B 420, the cooperative function continues even if the distance between the two devices increases.
In order to perform the above cooperative function, the mobile phone 510 is preset to transmit photos, which are currently being reproduced, to the TV 520 if the mobile phone 510 is adjacent to the TV 520.
In addition, in order to perform the above cooperative function, the TV 520 is preset to reproduce the photos received from the mobile phone 510 if the TV 520 is adjacent to the mobile phone 510.
Once the cooperative function starts between the mobile phone 510 and the TV 520, the cooperative function continues even if the distance between the two devices increases.
If the digital camcorder 610 is adjacent to the TV 620 as illustrated in FIG. 6, the digital camcorder 610 transmits images to the TV 620, and the TV 620 reproduces the received images.
In addition, in order to perform the above cooperative function, the TV 620 is preset to reproduce the images received from the digital camcorder 610 if the TV 620 is adjacent to the digital camcorder 610.
Once the cooperative function starts between the digital camcorder 610 and the TV 620, the cooperative function continues even if the distance between the two devices increases.
If the mobile phone 720 is adjacent to the PC 710 as illustrated in FIG. 7, the mobile phone 720 transmits stored schedule information to the PC 710, and the PC 710 backs up the received schedule information in a designated folder.
In order to perform the above cooperative function, the mobile phone 720 is preset to transmit stored schedule information to the PC 710 if the mobile phone 720 is adjacent to the PC 710.
In addition, in order to perform the above cooperative function, the PC 710 is preset to back up the schedule information received from the mobile phone 720 in a designated folder if the PC 710 is adjacent to the mobile phone 720. Once the cooperative function starts between the mobile phone 720 and the PC 710, the cooperative function continues even if the distance between the two devices increases.
As illustrated in FIG. 8, a device first senses whether another device exists in its surrounding area.
If it is determined that there is another device nearby in step S820, the device determines whether an automatic cooperative function is set between the device and the other device in step S830.
If it is determined that an automatic cooperative function is set in step S830, the device establishes a communicable connection with the other device in step S840.
Subsequently, the device automatically performs the cooperative function with the other device in step S850.
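A minimal sketch of this sensing-and-connection flow is given below, assuming stand-in callables for each step (`sense_nearby`, `is_auto_function_set`, `connect`, and `perform` are hypothetical names, not an actual API).

```python
def cooperate_if_possible(device, sense_nearby, is_auto_function_set, connect, perform):
    """Run one pass of the flow: sense, check the preset, connect, then cooperate.

    `sense_nearby()` returns a nearby device or None, `is_auto_function_set(a, b)`
    checks whether an automatic cooperative function is preset between the two devices,
    `connect(a, b)` establishes a communicable connection, and `perform(a, b, link)`
    carries out the cooperative function over that connection.
    """
    other = sense_nearby()                       # is another device nearby?
    if other is None:
        return False
    if not is_auto_function_set(device, other):  # is an automatic cooperative function set?
        return False
    link = connect(device, other)                # set up a communicable connection
    perform(device, other, link)                 # perform the cooperative function automatically
    return True

# Example usage with trivial stubs:
if __name__ == "__main__":
    ok = cooperate_if_possible(
        device="digital_camera",
        sense_nearby=lambda: "electronic_frame",
        is_auto_function_set=lambda a, b: True,
        connect=lambda a, b: "link-0",
        perform=lambda a, b, link: print(f"{a} cooperates with {b} over {link}"),
    )
    print(ok)  # True
```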
If the digital camera 920 is adjacent to the back of the electronic frame 910 as illustrated in FIG. 9, the digital camera 920 transmits stored photos to the electronic frame 910, and the electronic frame 910 backs up the received photos.
In order to perform the above cooperative function, the digital camera 920 is preset to transmit stored photos to the electronic frame 910 if the digital camera 920 is adjacent to the back of the electronic frame 910.
In addition, in order to perform the above cooperative function, the electronic frame 910 is preset to back up the photos received from the digital camera 920 if the back of the electronic frame 910 is adjacent to the digital camera 920. Once the cooperative function starts between the digital camera 920 and the electronic frame 910, the cooperative function continues even if the distance between the two devices increases.
If the digital camera 920 is adjacent to the front of the electronic frame 910 as illustrated in FIG. 10, the digital camera 920 transmits stored photos to the electronic frame 910, and the electronic frame 910 reproduces the received photos as a slideshow.
In order to perform the above cooperative function, the digital camera 920 is preset to transmit stored photos to the electronic frame 910 if the digital camera 920 is adjacent to the front of the electronic frame 910.
In addition, in order to perform the above cooperative function, the electronic frame 910 is preset to reproduce the photos received from the digital camera 920 as a slideshow if the front of the electronic frame 910 is adjacent to the digital camera 920.
Once the cooperative function starts between the digital camera 920 and the electronic frame 910, the cooperative function continues even if the distance between the two devices increases.
As illustrated in FIG. 11, a device first senses whether another device exists in its surrounding area.
If it is determined that there is another device nearby in step S1120, the device identifies the location of the other device in step S1130.
Subsequently, the device identifies an automatic cooperative function that should be performed together with the other device, based on the location of the other device, in step S1140.
The device establishes a communicable connection with the other device in step S1150. Subsequently, the device automatically performs the cooperative function with the other device in step S1160.
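The location-dependent selection can be sketched as a lookup from the sensed position of the other device to a cooperative function, mirroring the back/front behavior of the electronic frame above. The mapping, the function names, and the stand-in callables below are all assumptions for illustration.

```python
# Hypothetical mapping from the sensed location of the other device to the cooperative
# function to perform (cf. the back/front behavior of the electronic frame above).
LOCATION_FUNCTIONS = {
    "back": "backup_received_photos",
    "front": "slideshow_received_photos",
}

def cooperate_by_location(sense_nearby, locate, connect, perform):
    """Sense a nearby device, pick a function based on where it is, then cooperate."""
    other = sense_nearby()
    if other is None:
        return None
    location = locate(other)                 # e.g. "front", "back", "left", "right"
    function = LOCATION_FUNCTIONS.get(location)
    if function is None:
        return None
    link = connect(other)                    # establish a communicable connection
    perform(other, link, function)           # perform the location-specific function
    return function

print(cooperate_by_location(
    sense_nearby=lambda: "digital_camera",
    locate=lambda other: "front",
    connect=lambda other: "link-1",
    perform=lambda other, link, fn: None,
))  # slideshow_received_photos
```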
In the above embodiments, a cooperative function is automatically performed when two devices become close to each other, but this is only an example. A cooperative function may also be automatically performed when two devices are in contact with each other.
In this case, the type of cooperative function to be performed may be determined depending on which part of a device is contacted by another device. For example, if the device is in contact with the “front” of another device, “a first” cooperative function may be performed, and if the device is in contact with the “back” of the other device, “a second” cooperative function may be performed. To sense which part of a device is contacted by another device, sensors should be formed on the surface of that device.
In addition, the type of cooperative function to be performed may be determined depending on which part of a device contacts which part of another device. For example, if the “front” of the device contacts the “front” of another device, “the first” cooperative function may be performed, and if the “back” of the device contacts the “back” of another device, “the second” cooperative function may be performed. The type of cooperative function performed by devices may be determined by a user. In addition, the type of cooperative function that is already set may be changed by a user.
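As a concrete illustration of this part-to-part selection, the sketch below keys a hypothetical lookup table by which part of the device touches which part of the other device, with user-defined overrides taking priority; the table contents and function names are assumptions.

```python
# Hypothetical lookup: the cooperative function to run, keyed by which part of this
# device touches which part of the other device.
CONTACT_FUNCTIONS = {
    ("front", "front"): "first_cooperative_function",
    ("back", "back"): "second_cooperative_function",
}

def function_for_contact(own_part, other_part, user_overrides=None):
    """Resolve the cooperative function for a sensed contact; user settings take priority."""
    table = dict(CONTACT_FUNCTIONS)
    if user_overrides:                 # the user may change a preset cooperative function
        table.update(user_overrides)
    return table.get((own_part, other_part))

print(function_for_contact("front", "front"))  # first_cooperative_function
print(function_for_contact("back", "back",
                           user_overrides={("back", "back"): "user_chosen_function"}))  # user_chosen_function
```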
A cooperative function may also be set automatically according to the properties of a device. For example, since the properties of a camera include taking pictures and the properties of a printer include printing, a cooperative function may be automatically set in which the camera takes pictures and the printer prints the photographed pictures.
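One simple way to derive such a function automatically is to match what one device produces against what another device consumes. The capability declarations and naming below are hypothetical, intended only to illustrate the idea.

```python
# Hypothetical capability declarations used to derive a cooperative function automatically.
CAPABILITIES = {
    "camera": {"produces": "photo", "action": "take_pictures"},
    "printer": {"consumes": "photo", "action": "print"},
    "electronic_frame": {"consumes": "photo", "action": "slideshow"},
}

def derive_cooperation(device_a, device_b):
    """Pair a producer with a matching consumer, e.g. camera photos -> printer printing."""
    a, b = CAPABILITIES[device_a], CAPABILITIES[device_b]
    candidates = ((a, b, device_a, device_b), (b, a, device_b, device_a))
    for producer, consumer, p_name, c_name in candidates:
        if producer.get("produces") and producer["produces"] == consumer.get("consumes"):
            return f"{p_name}:{producer['action']} -> {c_name}:{consumer['action']}"
    return None

print(derive_cooperation("camera", "printer"))           # camera:take_pictures -> printer:print
print(derive_cooperation("electronic_frame", "camera"))  # camera:take_pictures -> electronic_frame:slideshow
```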
The devices mentioned in the above embodiments are only examples.
The technical features of the present invention may also be applied to other devices.
As illustrated in FIG. 15, a device capable of performing the above cooperative functions includes a function block 1510, a display 1520, a controller 1530, a storage unit 1540, and a communication unit 1550. The function block 1510 performs the original function of the device. If the device is a mobile phone, the function block 1510 performs telephone communication and SMS transmission and reception; if the device is a TV, the function block 1510 performs broadcast reception and reproduction.
The display 1520 displays the results produced by the function block 1510, as well as a Graphical User Interface (GUI).
The storage unit 1540 is a storage medium that stores the programs needed for the function block 1510 to perform its functions and to provide the GUI, as well as contents and other data.
The communication unit 1550 senses whether another device approaches the device in the surrounding area, and establishes a communicable connection between the device and the sensed device.
In addition, the communication unit 1550 senses the location of another device in the surrounding area. For example, the communication unit 1550 senses from which side, among the front, back, left, and right, another device approaches. To do so, the communication unit 1550 may use a plurality of directional antennas and a plurality of directional sensors.
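The specification does not fix how the directional readings are combined; one simple assumed strategy, shown below, is to take the side whose directional antenna reports the strongest signal. The reading format and values are hypothetical.

```python
def approach_direction(readings):
    """Pick the approach side from per-antenna signal strengths.

    `readings` maps a side to a signal strength reported by the directional antenna or
    sensor facing that side, e.g. RSSI in dBm (higher, i.e. less negative, is stronger).
    """
    side, _strength = max(readings.items(), key=lambda item: item[1])
    return side

print(approach_direction({"front": -40, "back": -70, "left": -65, "right": -72}))  # front
```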
Meanwhile, the communication unit 1550 may have a bi-directional wireless communication module to sense the location of other devices in the surrounding area. In this case, the wireless communication method of the bi-directional wireless communication module is not limited. Therefore, the wireless communication may be realized as infrared communication, sound wave communication, RF communication, or wireless network communication. The controller 1530 controls the device to perform a cooperative function with another device through the processes illustrated in FIGS. 8 and 11.
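To tie the components together, the toy Python class below composes the function block, display, storage unit, and communication unit and lets a controller-like method drive the cooperative flow described above. The class and method names are assumptions, not the device's actual software structure.

```python
from types import SimpleNamespace

class CooperativeDevice:
    """Toy composition of the components described above (all names are assumptions)."""

    def __init__(self, function_block, display, storage_unit, communication_unit):
        self.function_block = function_block          # performs the device's original function
        self.display = display                        # shows results and the GUI
        self.storage_unit = storage_unit              # programs, GUI resources, contents
        self.communication_unit = communication_unit  # senses nearby devices and connects to them

    def run_controller_once(self):
        """Controller role: one pass of the sense-connect-cooperate process."""
        other = self.communication_unit.sense_nearby()
        if other is None:
            return False
        link = self.communication_unit.connect(other)
        self.function_block.cooperate(other, link)
        return True

if __name__ == "__main__":
    comm = SimpleNamespace(sense_nearby=lambda: "printer",
                           connect=lambda other: "link-2")
    func = SimpleNamespace(cooperate=lambda other, link: print(f"cooperating with {other} over {link}"))
    device = CooperativeDevice(func, display=None, storage_unit=None, communication_unit=comm)
    print(device.run_controller_once())  # prints the cooperation message, then True
```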
Number | Date | Country | Kind |
---|---|---|---|
10-2009-0078349 | Aug 2009 | KR | national |
This application is a continuation of U.S. patent application Ser. No. 16/425,221, filed on May 29, 2019, which is a continuation of U.S. patent application Ser. No. 16/012,041, filed on Jun. 19, 2018, which is a continuation of U.S. patent application Ser. No. 15/583,421, filed on May 1, 2017, which issued as U.S. Pat. No. 10,027,790 on Jul. 17, 2018, which is a continuation of U.S. patent application Ser. No. 15/130,338, filed on Apr. 15, 2016, which issued as U.S. Pat. No. 9,706,039 on Jul. 11, 2017, which is a continuation of U.S. patent application Ser. No. 14/638,757, filed on Mar. 4, 2015, which issued as U.S. Pat. No. 9,326,095 on Apr. 26, 2016, which is a continuation of U.S. application Ser. No. 12/862,301, filed on Aug. 24, 2010, which issued as U.S. Pat. No. 8,995,913 on Mar. 31, 2015, and claims priority under 35 U.S.C. § 119(a) to Korean Patent Application No. 10-2009-0078349, filed Aug. 24, 2009, in the Korean Intellectual Property Office, the contents of each of which are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
20190387093 A1 | Dec 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16425221 | May 2019 | US |
Child | 16555016 | US | |
Parent | 16012041 | Jun 2018 | US |
Child | 16425221 | US | |
Parent | 15583421 | May 2017 | US |
Child | 16012041 | US | |
Parent | 15130338 | Apr 2016 | US |
Child | 15583421 | US | |
Parent | 14638757 | Mar 2015 | US |
Child | 15130338 | US | |
Parent | 12862301 | Aug 2010 | US |
Child | 14638757 | US |