The present invention relates to a method for selecting an image and to a method for facilitating the generation or capture of a desired image.
A great number of today's graphical or photographic images are generated digitally. Generally, this results in more images being created and, many times, a greater number of undesirable images. One of the problems of today is that even undesired images are stored and thereby occupy storage capacity. One simple solution to this problem is to delete all undesired images. However, the likelihood of undesired images still occupying storage capacity increases as time passes from the moment an image was created.
It is an object of the invention to improve operations on images and to improve the user experience of continuous operations on images.
This object is achieved by means of a method for selecting an image according to claim 1. Further embodiments of the invention are disclosed in the dependent claims.
In particular, according to a first aspect of the invention, a method for selecting an image comprises forming a group of digital image representations, displaying a first digital image representation of the group of digital image representations on a touch sensitive display, generating a position signal in response to a detection of a pointing device on the touch sensitive display, said position signal indicating a touch position, identifying a selected position within the displayed first digital image representation based on the position signal, generating a zoom-in signal in response to a detection of the pointing device sliding away from the touch position on the touch sensitive display, said zoom-in signal indicating a sliding distance from the touch position, displaying an enlarged representation of the first digital image representation in response to the zoom-in signal, generating a shift signal in response to a detection of a second sliding motion of the pointing device on the touch sensitive display, and displaying a second digital image representation and an enlarged representation of the second digital image representation in response to the shift signal, the enlargement of the second digital image representation being based on the zoom-in signal generated during displaying of the first digital image representation.
The advantage of forming a group of images is that the chance of having at least one good image is increased. By implementing the selection method, the selection of the best image is facilitated, and it thereby becomes easier for a user to discard less desirable images. Moreover, the zoom-in facilitates viewing of details of the high resolution image on a smaller preview screen.
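Purely as an illustration of the signal flow defined above, the following TypeScript sketch wires the position signal and the zoom-in signal to browser pointer events; the class, the thresholds and the mapping from sliding distance to scale are invented for this sketch and are not part of the claimed method.

```typescript
// Minimal sketch of the claimed signal flow, assuming a browser Pointer
// Events environment. All names and the distance-to-scale mapping are
// illustrative; shift handling is sketched further below.
type Point = { x: number; y: number };

class ImageSelector {
  private touchPos: Point | null = null; // touch position of the position signal
  private zoomValue = 0;                 // sliding distance, the basis of the zoom-in signal
  private index = 0;                     // which representation of the group is displayed

  constructor(private surface: HTMLElement, private group: string[]) {
    surface.addEventListener("pointerdown", (e) => this.onDown(e));
    surface.addEventListener("pointermove", (e) => this.onMove(e));
  }

  private onDown(e: PointerEvent): void {
    // Position signal: a pointing device is detected; the selected position
    // within the displayed representation is identified from these coordinates.
    this.touchPos = { x: e.clientX, y: e.clientY };
  }

  private onMove(e: PointerEvent): void {
    if (this.touchPos === null) return;
    // Zoom-in signal: the pointing device slides away from the touch position;
    // the sliding distance becomes the zoom value.
    this.zoomValue = Math.hypot(e.clientX - this.touchPos.x, e.clientY - this.touchPos.y);
    this.render();
  }

  private render(): void {
    const scale = 1 + this.zoomValue / 100; // invented mapping
    // The same scale is reused when a shift signal selects another representation,
    // so all images of the group are compared at the same enlargement.
    console.log(`displaying ${this.group[this.index]} at scale ${scale.toFixed(2)}`);
  }
}
```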
According to one embodiment the sliding direction of the second sliding motion is along a trajectory that is substantially circular. This is an advantage in that the sliding motion determines the switching between images, and if the number of images in the group of image representations is large, the touch sensitive display may not be big enough for a linear motion. By making the sliding motion circular, however, this limitation disappears, as it becomes possible to slide the pointing device through many turns.
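As a sketch of why the circular trajectory removes the length limitation, the angle swept around the gesture's centre can simply be accumulated across full turns; the 30-degree step per image below is an assumed value.

```typescript
// Accumulates the angle swept around a fixed centre so that a circular slide
// can step through arbitrarily many images; the step size is an assumption.
const STEP_RAD = Math.PI / 6; // one image per 30 degrees of sliding

class CircularScrubber {
  private prevAngle: number | null = null;
  private accumulated = 0; // grows without bound, turn after turn

  constructor(private centre: { x: number; y: number }) {}

  /** Feed successive pointer positions; returns the number of images to shift by. */
  update(x: number, y: number): number {
    const angle = Math.atan2(y - this.centre.y, x - this.centre.x);
    if (this.prevAngle !== null) {
      let delta = angle - this.prevAngle;
      // Unwrap across the -pi/+pi seam so full turns keep accumulating.
      if (delta > Math.PI) delta -= 2 * Math.PI;
      if (delta < -Math.PI) delta += 2 * Math.PI;
      this.accumulated += delta;
    }
    this.prevAngle = angle;
    return Math.trunc(this.accumulated / STEP_RAD);
  }
}
```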
According to yet another embodiment the touch sensitive display is a multi-touch sensitive display, wherein said generating of a position signal further includes a detection of a second pointing device on the multi-touch sensitive display, said position signal indicating a touch position which is based on the respective positions of the two pointing devices, wherein said zoom-in signal is generated in response to a detection of the two pointing devices sliding on the multi-touch sensitive display away from each other, and wherein said shift signal is generated in response to a second sliding motion of the two pointing devices at a substantially constant distance between the pointing devices.
In another embodiment the touch position is a position between the two detected pointing devices.
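A minimal sketch of these two-pointer signals, assuming browser pointer events; the midpoint rule and the use of the initial separation come from the embodiments above, everything else is illustrative.

```typescript
// Two pointing devices: the touch position is taken as the midpoint between
// them, and the zoom value as the growth of their separation. The initial
// separation may also be used as the size of the subarea to enlarge.
interface TwoPointerState {
  touchPos: { x: number; y: number }; // position between the two detection points
  startDist: number;                  // separation when the position signal was generated
}

function onTwoPointersDown(a: PointerEvent, b: PointerEvent): TwoPointerState {
  return {
    touchPos: { x: (a.clientX + b.clientX) / 2, y: (a.clientY + b.clientY) / 2 },
    startDist: Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY),
  };
}

function zoomValue(state: TwoPointerState, a: PointerEvent, b: PointerEvent): number {
  const dist = Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
  return dist - state.startDist; // positive when the pointers slide apart (zoom-in)
}
```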
In yet another embodiment the act of displaying an enlarged representation of the first image includes displaying an enlarged subarea of the first digital image representation in response to the zoom-in signal, the position of the subarea within the first digital image representation being based on the selected position and the enlargement of the subarea being based on the zoom-in signal.
In one embodiment the size of the subarea to be enlarged is based on the distance between the two pointing devices at the generation of the position signal.
In another embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view at different points in time.
According to another embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view at different exposure settings.
According to yet another embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view having different focus distances.
According to a further embodiment said forming of a group of digital image representations includes capturing a plurality of different digital image representations of essentially the same view being exposed for different transforms.
According to yet another embodiment said forming of a group of digital image representations includes generating a plurality of digital image representations from one single original image by manipulating the original image differently for each digital image representation, wherein the manipulation includes applying a transform or a parameter to the original image.
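As a hedged sketch of this last embodiment, a group can be formed from a single original by rendering it once per parameterised manipulation; the canvas filter strings below are merely example parameters.

```typescript
// Forms a group of digital image representations from one original image by
// applying a different manipulation per copy, here via the standard canvas
// 'filter' property. The list of filter strings is an illustrative assumption.
function makeVariants(original: HTMLImageElement, filters: string[]): string[] {
  return filters.map((filter) => {
    const canvas = document.createElement("canvas");
    canvas.width = original.naturalWidth;
    canvas.height = original.naturalHeight;
    const ctx = canvas.getContext("2d")!;
    ctx.filter = filter;       // e.g. "brightness(1.3)" or "contrast(0.8)"
    ctx.drawImage(original, 0, 0);
    return canvas.toDataURL(); // one digital image representation per parameter
  });
}

// Example: a three-image group differing only in the applied transform.
// const group = makeVariants(img, ["brightness(0.7)", "none", "brightness(1.3)"]);
```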
According to another embodiment a pointing device is a fingertip.
The present invention will now be described more fully hereinafter, by way of example, with reference to the accompanying drawings, in which certain embodiments are shown. Like numbers refer to like elements throughout.
In the figures an image presentation device 10 having a touch sensitive display 14 is shown. Moreover, the image presentation device 10 includes a processor 18, a volatile memory 20 and a non-volatile memory 22, as well as a touch screen driver 26 and touch screen circuitry 28 for operating the touch sensitive display 14.
The processor 18, the volatile memory 20 and the non-volatile memory 22 may be arranged and connected in a way known to the skilled person for operation of the image presentation device and execution of applications stored in the non-volatile memory 22.
The design and implementation of the touch screen circuitry 28 depends on the type of touch sensitive display that is to be used. The implementation of the touch screen driver 26 depends on the type of touch sensitive display and the operating system of the image presentation device 10.
In the present application the term touch sensitive display or touch screen is used for a display that is arranged to detect the presence, location, and/or movement of a “touch” within the display area. The touch screen may be designed to detect presence, location, and/or movement on the display by a finger, a hand, a stylus, a pen, etc.
Depending on the usage of the image presentation device, one of a plurality of types of touch screens may be selected. For example, the touch screen may be a resistive touch screen, a touch screen based on surface acoustic wave technology, a capacitive touch screen, a touch screen using surface capacitance, a touch screen based on projected capacitive touch technology, a system based on infrared LEDs and photo sensors, a system based on a strain gauge configuration, a touch screen based on dispersive signal technology, a touch screen based on acoustic pulse recognition technology, etc.
According to one embodiment a method for selecting images is part of a greater scheme for achieving a desired image having specific characteristics. The embodiment relates to an image selecting method operating on a group of digital image representations in order to achieve this result. The images forming the group of image representations may be images retrieved from a storage device, e.g. a hard drive, the non-volatile memory 22, an image server accessed via a network, etc. The images may alternatively be acquired by means of a camera arranged in the device 10, or by means of transforming one original image retrieved from a storage device or acquired by said camera. The image sequence may also be calculated from one or more source images, and the image itself may be a virtual representation based on one or more mathematical schemes applied to one or more original images.
One example of how to generate the group of image representations is to bracket, i.e. to take photographs at more than one exposure in order to ensure that the desired exposure is obtained in at least one of them. Other examples are to take a plurality of photographs at different points in time, at different depths of field, at different focus distances, or by varying any other setting of the camera. The camera used in these examples may well be a camera implemented in the same device or system as the image presentation device 10. Moreover, the group of image representations may be generated by applying different transforms to the images.
The number of images in a group of image representations may be as few as two or as many as hundreds; much depends on the application in which the method is to be used. The group of images may be separate images/photos or a sequence of frames in a video. In one example, a group of image representations 50 includes three photographs 52, 54, 56. One digital image representation 54 of the group is displayed on the touch sensitive display 14, and when a pointing device 70 is detected on the display 14 a position signal indicating a touch position 72 is generated.
Then the touch sensitive display 14 detects a sliding motion 74, performed by means of the pointing device 70, along the display 14 away from the touch position 72. This detection results in the generation of a zoom signal that is sent to the processor 18. The zoom signal includes an indication of the distance of the sliding movement 74, referred to as the zoom value. In response to the zoom signal, the displayed image representation 54 is enlarged to a degree that is based on the zoom value.
Then, upon detection of a second sliding motion 76 of the pointing device 70 along the touch sensitive display 14, a shift signal is generated and the displayed image representation is shifted to another image representation of the group of image representations 50, the enlargement based on the zoom value being applied to the newly displayed image representation as well.
The number of images shifted may be proportional to the length of the second sliding motion 76. Hence, in the example above, in which the group of image representations 50 includes only three photographs 52, 54, 56, the shift would wrap around and continue with the image representation of the first photograph if the second sliding motion 76 is continued.
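One plausible way to realise this proportional, wrapping shift is a simple modulo mapping from sliding length to image index; the pixels-per-image step is an assumption.

```typescript
// Maps the length of the second sliding motion to an image index, wrapping
// around the group so that a three-image group cycles back to the first
// photograph. The pixels-per-image step is an invented constant.
const PIXELS_PER_IMAGE = 60;

function shiftedIndex(startIndex: number, slideLength: number, groupSize: number): number {
  const steps = Math.round(slideLength / PIXELS_PER_IMAGE);
  // Double modulo keeps the index non-negative for slides in either direction.
  return (((startIndex + steps) % groupSize) + groupSize) % groupSize;
}

// shiftedIndex(0, 180, 3) === 0: after three steps the first photograph returns.
```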
According to another embodiment the enlargement is not applied to the entire image representation but only to a subarea of the displayed image representation, the position of the subarea within the image representation being based on the selected position and the enlargement of the subarea being based on the zoom-in signal.
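A minimal sketch of such a subarea enlargement, assuming the image is drawn on a canvas: the source rectangle shrinks around the selected position as the zoom value grows. The zoom-to-scale mapping is an assumption.

```typescript
// Draws an enlarged subarea of the image, positioned around the selected
// position, using the standard 9-argument drawImage. The zoom-to-scale
// mapping is an illustrative assumption.
function drawZoomedSubarea(
  ctx: CanvasRenderingContext2D,
  img: HTMLImageElement,
  selected: { x: number; y: number }, // selected position, in image coordinates
  zoomValue: number
): void {
  const scale = 1 + zoomValue / 100;
  const sw = img.naturalWidth / scale;  // the source window shrinks as zoom grows
  const sh = img.naturalHeight / scale;
  // Clamp so the source rectangle stays inside the image.
  const sx = Math.min(Math.max(selected.x - sw / 2, 0), img.naturalWidth - sw);
  const sy = Math.min(Math.max(selected.y - sh / 2, 0), img.naturalHeight - sh);
  ctx.drawImage(img, sx, sy, sw, sh, 0, 0, ctx.canvas.width, ctx.canvas.height);
}
```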
According to yet another embodiment two pointing devices are used, e.g. a finger and the thumb of a user's hand. In this embodiment a position signal is generated when the two pointing devices are detected on the touch sensitive display, the touch position being indicated as a position between the detection points of the two pointing devices.
Then, upon detection of the two pointing devices sliding away from each other, a zoom signal is generated, and in response to the zoom signal an enlarged representation of the image representation presently displayed is presented on the display. The degree of enlargement is based on the distance the two pointing devices have slid away from each other. Moreover, according to one embodiment, not the entire image representation is zoomed but only a subarea. The size of this subarea may correspond to an area defined by the initial positions of the pointing devices, i.e. when the touch position is indicated.
Then, in response to detection of a second sliding motion of the two pointing devices, wherein the two pointing devices slide at a substantially constant distance from each other, a shift signal is generated, and in response to the shift signal the image representation displayed is shifted to another image representation from the group of image representations. In one embodiment the two pointing devices are rotated substantially around a position in-between them, at a substantially constant distance from each other, e.g. following a substantially circular trajectory. The length of the sliding motion determines which image representation from the group of image representations to display. The enlargement applied to the initial image is displayed in the shifted images as well.
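To make the distinction between these two two-finger gestures concrete, a handler could classify each movement sample by whether the separation changes; the tolerance below is an assumed value.

```typescript
// Classifies a two-pointer movement sample: a change in separation is treated
// as a zoom, motion at roughly constant separation as a shift. The tolerance
// is an invented constant.
type Pt = { x: number; y: number };
type Gesture = "zoom" | "shift" | "none";

function classify(prev: [Pt, Pt], curr: [Pt, Pt], tolerancePx = 10): Gesture {
  const sep = (p: [Pt, Pt]) => Math.hypot(p[0].x - p[1].x, p[0].y - p[1].y);
  if (Math.abs(sep(curr) - sep(prev)) > tolerancePx) return "zoom";
  // Average displacement of the pair, used to detect a common sliding motion.
  const mx = (curr[0].x - prev[0].x + curr[1].x - prev[1].x) / 2;
  const my = (curr[0].y - prev[0].y + curr[1].y - prev[1].y) / 2;
  return Math.hypot(mx, my) > tolerancePx ? "shift" : "none";
}
```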
According to one specific embodiment only portions of the initially displayed image representation are shifted. The portion to be shifted may for instance be indicated manually by tracing the contours of the area, and the shifting then results in the corresponding area of another image from the group of image representations being displayed. The contours of the area can also be computed automatically, by tracing a contour along which the pixels of the two images, aligned to substantially the same positions within the particular area, are substantially similar. By means of this embodiment, combined with a group of image representations being a bracketed image sequence, it is possible to generate HDR (High Dynamic Range) images.
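As an illustrative sketch of the regional shift (the traced mask and the alignment are assumed to be given), the pixels inside the area can be copied from a differently exposed representation:

```typescript
// Copies the pixels inside a traced region from a second, differently exposed
// representation into the displayed one, yielding a simple HDR-style composite.
// Both ImageData objects are assumed pre-aligned and of equal dimensions.
function compositeRegion(base: ImageData, other: ImageData, mask: Uint8Array): ImageData {
  const out = new ImageData(new Uint8ClampedArray(base.data), base.width, base.height);
  for (let i = 0; i < mask.length; i++) {
    if (mask[i]) {
      // One mask entry per pixel; each pixel occupies four RGBA bytes.
      out.data.set(other.data.subarray(i * 4, i * 4 + 4), i * 4);
    }
  }
  return out;
}
```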
According to one embodiment the method may advantageously be used for browsing images. In such an application the zoom-in step may be skipped and a rotational/circular motion using one or two pointing devices may be used to switch images. In this embodiment the group of image representations typically includes the images of a folder in a file system, of a database, or of a particular category in a database.
According to one particular embodiment the image presentation device 10 is a mobile telephone equipped with a camera.
According to another aspect of the present invention, yet another embodiment is shown in the figures. An image representation 54 of the group of image representations is displayed on the touch sensitive display 14, and a pointing device 70 is detected at a first position 82 on the display 14.
Then the displayed image representation 54 is enlarged to a degree that is based on a predetermined zoom value, the enlarged representation being shown within a predetermined area 81 of the display.
By moving the pointing device 70 on the touch sensitive display 14 from the first position 82 to a second position 84, the part of the image representation shown enlarged within the predetermined area 81 may be changed accordingly.
The size of the predetermined area 81 may be reduced or enlarged upon detection of two pointing devices sliding towards each other or away from each other, respectively. This may e.g. be done by placing the two pointing devices on opposite sections of the border of the predetermined area 81 and then sliding them towards each other or away from each other.
As a next step the image representation shown in the predetermined area 81 may be shifted. Upon detection of a sliding motion 86 of the pointing device 70 on the touch sensitive display 14, a shift signal is generated, and the part of the image representation shown in the predetermined area 81 is shifted to the corresponding part of another image representation from the group of image representations.
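A minimal sketch of this embodiment, assuming a canvas-backed display: the predetermined area acts as a fixed magnifying window whose content follows the pointer position and is redrawn from whichever representation of the group is currently selected. All numeric values and names are illustrative.

```typescript
// Fixed magnifying window ("predetermined area"): shows a part of the current
// representation at a predetermined zoom; the shown part follows the pointer,
// and a sliding motion selects another representation to draw from.
interface Loupe {
  rect: { x: number; y: number; w: number; h: number }; // the predetermined area
  zoom: number;                                          // predetermined zoom value
}

function drawLoupe(
  ctx: CanvasRenderingContext2D,
  current: HTMLImageElement,         // currently selected representation of the group
  loupe: Loupe,
  pointer: { x: number; y: number }  // pointer position, in image coordinates
): void {
  const { rect, zoom } = loupe;
  const sw = rect.w / zoom;
  const sh = rect.h / zoom;
  ctx.save();
  ctx.beginPath();
  ctx.rect(rect.x, rect.y, rect.w, rect.h); // confine drawing to the loupe area
  ctx.clip();
  ctx.drawImage(current, pointer.x - sw / 2, pointer.y - sh / 2, sw, sh,
                rect.x, rect.y, rect.w, rect.h);
  ctx.restore();
}
```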
Furthermore, according to the above embodiment, when a specific image representation has been selected from the group of image representations by performing the sliding motion 86, the non-selected digital image representations of the group of digital image representations may be discarded.
It is recognized that the embodiment described last may be combined with features of the previously described embodiments.