People often talk on the phone and make arrangements to get together. Time and location are generally the two elements that need to coincide for a successful meet-up. Sometimes, however, people need more flexibility in time and location to meet. This can mean that additional phone calls are needed to coordinate the meet-up, which can cause additional interruptions and wasted time. In addition, people may find they could have met up earlier had they known where the other meeting attendee was located.
This Background is provided to introduce a brief context for the Summary and Detailed Description that follow. This Background is not intended to be an aid in determining the scope of the claimed subject matter nor be viewed as limiting the claimed subject matter to implementations that solve any or all of the disadvantages or problems presented above.
A location sharing component operating on a mobile computing device such as a smartphone, tablet, or laptop personal computer (PC) is configured to enable a local party and a remote party to share each other's locations during a phone call to facilitate a physical meet-up. The location sharing component exposes various options to set a length of time for the location sharing, or the location can be shared until the meet-up occurs. User interfaces (UIs) exposed by the location sharing component can provide directions and dynamically updated maps which show the locations of the parties. The location sharing experience can be persisted after the phone call ends by showing updates to the directions and maps and by surfacing notifications when the parties are close so that they can start looking for each other. The location sharing time interval can be extended if it is due to expire before the meet-up occurs.
In various illustrative examples, location sharing can be initiated from within a voice or video calling experience and locations may be shared with all parties on a call in multi-party calling scenarios. Location sharing may also be initiated in asynchronous forms of communication such as messaging and email. In cases in which the remote party's device is not configured with a location sharing component, an external web service can be used to support a location sharing experience on the remote device. The location sharing component can provide an estimated meet-up time based on the parties' locations as well as contextual data such as traffic level and mode of travel (e.g., walking, car, plane, public or mass transportation such as bus, subway, etc.). Maps and directions and other location information can be shown on a device's lock screen or other UI so that a user does not need to unlock the device to keep up with the progress towards the meet-up. The location sharing component can also be configured to interoperate with a digital assistant that executes with the device in some implementations.
Advantageously, the present location sharing can operate on any scale from large cities to small neighborhoods, and can operate in different types of locations such as urban and rural areas, shopping areas, malls, corporate and college campuses, theme parks, and the like. Location sharing can also be applied to a variety of contexts including business, personal, recreation, travel, etc., whether the meet-up is between two people or a group of people. By enabling a participant in an upcoming meet-up to see the location status of the other participants, the meet-up experience is improved as planning can be based on accurate and current information without needing to place additional calls or send messages. In addition, having current location status of the other meet-up participants, and knowing that they know your location status as well, can also reduce the emotional stress and pressure when trying to make the meet-up, for example, when running late due to extra traffic congestion and the like.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure. It may be appreciated that the above-described subject matter may be implemented as a computer-controlled apparatus, a computer process, a computing system, or as an article of manufacture such as one or more computer-readable storage media. These and various other features may be apparent from a reading of the following Detailed Description and a review of the associated drawings.
Like reference numerals indicate like elements in the drawings. Elements are not drawn to scale unless otherwise indicated. It is emphasized that the particular UIs displayed in the drawings can vary from what is shown according to the needs of a particular implementation. While UIs are shown in portrait mode in the drawings, the present arrangement may also be implemented using a landscape mode.
The various devices 110 in the environment 100 can support different features, functionalities, and capabilities (here referred to generally as “features”). Some of the features supported on a given device can be similar to those supported on others, while other features may be unique to a given device. The degree of overlap and/or distinctiveness among features supported on the various devices 110 can vary by implementation. For example, some devices 110 can support touch controls, gesture recognition, and voice commands, while others may enable a more limited UI. Some devices may support video consumption and Internet browsing, while other devices may support more limited media handling and network interface features.
As shown, the devices 110 can access the communications network 115 in order to implement various user experiences. The communications network can include any of a variety of network types and network infrastructure in various combinations or sub-combinations including cellular networks, satellite networks, IP (Internet Protocol) networks such as Wi-Fi and Ethernet networks, a public switched telephone network (PSTN), and/or short range networks such as Bluetooth networks. The network infrastructure can be supported, for example, by mobile operators, enterprises, Internet service providers (ISPs), telephone service providers, data service providers, and the like. The communications network 115 typically includes interfaces that support a connection to the Internet 120 so that the mobile devices 110 can access content provided by one or more content providers 125 and access a service provider 130 in some cases.
The devices 110 and communications network 115 may be configured to enable device-to-device communication. As shown in
The communications 200 can be utilized to support the present location sharing to facilitate a physical meet-up. The location sharing can be implemented between a local sharing party 1051 and a single remote party 105N or between the local sharing party and multiple remote parties in a conference call scenario as shown in
The present location sharing may be implemented using components that are instantiated on a given device. In addition, as discussed below, location sharing can also be implemented, in whole or part, using a web service supported by a remote service provider (e.g., service provider 130 in
The application layer 405 in this illustrative example supports various applications (apps) 430 (e.g., web browser, map application, email application, etc.), as well as a phone app 435, messaging app 440, and video calling app 445, such as Skype™. The applications are often implemented using locally executing code. However, in some cases, these applications may rely on services and/or remote code execution provided by remote servers or other computing platforms such as those supported by the service provider 130 or other cloud-based resources as indicated by line 460. While the apps 430, 435, 440, and 445 are shown here as components that are instantiated in the application layer 405, it may be appreciated that the functionality provided by a given application may be implemented, in whole or part, using components that are supported in either the OS or hardware layers.
The OS layer 410 supports a location sharing component 450 and various other OS components 455. In some cases, the location sharing component 450 can interact with the service provider. That is, the location sharing component 450 in some implementations can partially or fully utilize remote code execution supported at the service provider 130, or use other remote resources. In addition, it may utilize and/or interact with the other OS components 455 (and/or other components that are instantiated in the other layers of the architecture 400) as may be needed to implement the various features and functions described herein. The location sharing component 450 may alternatively be instantiated using elements that are instantiated in both the OS and application layers or be configured as an application, as shown in
A user can typically interact with the location sharing component 450 (
As shown, the functions 800 illustratively include implementing a location sharing experience with one or more remote parties (as indicated by reference numeral 825). A given location sharing experience can be initiated from within a calling application (e.g., voice and video calling). The location sharing can typically go in both directions (as shown in
In some implementations, the location sharing component 450 can be configured to interoperate with a personal digital assistant that is operable on the device 110. As shown in
In a similar manner as with the arrangement shown in
When the user (i.e., the local sharing party) selects a share button 1010 that is exposed on the phone app's UI, here using a touch 1015 on a touch screen or other interaction, a sharing UI 1100 is surfaced as shown in
The UI 1100 provides a number of sharing options 1105 that can be invoked by the user by touch. In this example, the user employs a touch 1115 to select the location sharing option 1120 among various options to share various other types of content. The user's selection action surfaces UI 1200 in
In this example, a period for sharing 1220 is displayed with a default time period of 30 minutes. In some implementations, the location sharing component can expose various controls such as user preferences for controlling the default time period. Here, the user can change the time period for sharing using a touch 1225 which invokes the presentation of UI 1300 in
The user has employed a touch 1310 to select the 1 hour location sharing time period, as shown, which invokes presentation of UI 1400 in
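The time-period selection described above can be sketched as a small session object that tracks when sharing expires. This is an illustrative sketch only; the class and method names are hypothetical and are not taken from the disclosure.

```python
from datetime import datetime, timedelta

# Default sharing period corresponding to the 30 minute default in UI 1200
DEFAULT_SHARING_PERIOD = timedelta(minutes=30)

class SharingSession:
    """Tracks when a location sharing session expires (hypothetical sketch)."""

    def __init__(self, period=DEFAULT_SHARING_PERIOD, now=None):
        self.started = now or datetime.utcnow()
        self.period = period

    def is_active(self, now=None):
        """Sharing continues until the selected period elapses."""
        now = now or datetime.utcnow()
        return now < self.started + self.period

    def extend(self, extra):
        """Lengthen the sharing period, e.g., when the meet-up runs late."""
        self.period += extra

# The user's selection of the 1 hour option maps to a longer period
session = SharingSession(period=timedelta(hours=1))
```

The `extend` method anticipates the expiration-extension behavior described later, where the sharing window can be lengthened if it would otherwise lapse before the meet-up.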
In typical real-time location sharing for facilitating a physical meet-up, when the local party's location is shared with the remote party, the remote party will also share its location back. That way both participants in the planned meet-up can see where the other is located, which can help each of them plan their time and also ensure they can find each other at the meet-up location. Accordingly, the remote party can use a similar location sharing process at the remote party's device, as described above, to share location information with the local party. As shown in UI 1500, when the remote party initiates location sharing with the local party, a notification 1505 is presented on the phone app's UI to let the user know that location information is being shared. The UI exposes controls to accept or reject the sharing.
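The accept-or-reject flow for an incoming share notification can be sketched as follows; the class and state names are hypothetical, chosen only to mirror the notification 1505 and its accept/reject controls.

```python
class IncomingShareRequest:
    """Models notification 1505: the remote party offers its location."""

    def __init__(self, remote_party):
        self.remote_party = remote_party
        self.status = "pending"

    def accept(self):
        # Corresponds to a touch on the accept button; sharing begins
        self.status = "accepted"
        return True

    def reject(self):
        # The remote party's location is not shown
        self.status = "rejected"
        return False

request = IncomingShareRequest("remote party")
accepted = request.accept()
```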
In this example, the user has accepted the sharing by a touch 1510 on the accept button 1515 which invokes presentation of UI 1600 in
Continuing with the example of the call between the parties to discuss the planned meet-up later that day, once the user has reviewed the map and directions, the user can employ a touch 1805 on the remote user's avatar or name, as shown in UI 1800 in
After the call is terminated, the location sharing component 450 can persist the sharing experience so that the parties to the meet-up can continue to get updates as to location status of the others. For example, as shown in UI 2000 in
Similarly, as shown in UI 2100 in
In some implementations, the notification can be surfaced on the device's lock screen (i.e., the screen that is typically displayed on the device to regulate access to the device), and/or on UIs that are being utilized by executing applications. As shown in UI 2200 in
Sometimes during a location sharing experience a planned meet-up may get delayed for some reason (e.g., the parties decide to change the meet-up time and/or location, one of the parties is running late, etc.). In such cases it is possible that the location sharing time period will be exceeded and the sharing will end before the meet-up occurs. By monitoring input, contextual, and/or historical data, the location sharing component can determine that there is some probability, beyond a predetermined threshold, that the estimated meet-up will actually occur after the expiration of the location sharing time period. Accordingly, as shown in the UI 2500 in
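The extension decision described above can be sketched as a simple predicate: an extension is offered when the meet-up is estimated to occur after sharing expires and the confidence in that estimate exceeds a threshold. The function name and default threshold value are illustrative assumptions, not taken from the disclosure.

```python
def should_prompt_extension(estimated_meetup_time, sharing_expires,
                            lateness_probability, threshold=0.5):
    """Decide whether to offer an extension of the sharing period.

    An extension is suggested when the meet-up is estimated to occur
    after the sharing period expires and the probability of that
    estimate exceeds a predetermined threshold (illustrative default).
    """
    return (estimated_meetup_time > sharing_expires
            and lateness_probability >= threshold)
```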
Location sharing experiences can also be persisted in group meet-up scenarios. For example, the location sharing component can expose controls to enable parties on a call to share meet-up invites after the call ends. The controls may be configured so that a party can enable meet-up invites to be further shared by other invitees with others and the extent to which additional invitations are extended can be controlled by the party in some cases (e.g., by imposing a limit on the number of invitations and/or limiting the time period for sharing invitations). Typically in such group meet-up scenarios, the total number of invitations extended and the number of persons accepting can be tracked and reported back to an initiating party. In some cases, the initiating party can use controls exposed by the location sharing component to provide location information to the entire group of meet-up attendees or to just a subset of attendees.
While the illustrative examples of location sharing above are described in the context of a voice call, location sharing can also be implemented in the context of a video call. As shown in
In some implementations, the location sharing window 2615 can be placed in a particular position on the UI 2600 by the user and/or enlarged or reduced in size. For example, the user can touch and drag the location sharing window 2615 into a desired position and enlarge and shrink the window using multi-touch gestures such as pinching and spreading.
In some location sharing scenarios, each of the devices participating in the sharing (whether single instances of sharing or multi-instance sharing among two or more parties) can have a location sharing component installed and executing to support the location sharing user experience. This is shown in
In other location sharing scenarios, one or more of the parties participating in the sharing may not have a location sharing component 450 instantiated. In such cases, location sharing may still be implemented with a full set of features and user experiences by leveraging capabilities provided by the remote service provider 130 as shown in
When the local sharing party initiates a sharing session, the service provider 130 can send a message 2820 to a messaging application 2825 that is available on the remote device. For example, the message 2820 can be a text message that is transported using SMS (Short Message Service) that contains a link to a location sharing experience that is facilitated by the web service 2805.
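Composing the SMS message 2820 can be sketched as below. The base URL is a placeholder standing in for the service provider's real endpoint, which the description does not specify, and the function name is hypothetical.

```python
def compose_sharing_invite(sharer_name, session_id,
                           base_url="https://example.com/share"):
    """Build the SMS body carrying a link to the web-hosted sharing experience.

    base_url is a placeholder for the web service's endpoint; a real
    deployment would substitute the service provider's own URL.
    """
    link = f"{base_url}/{session_id}"
    return (f"{sharer_name} is sharing their location with you for a "
            f"meet-up: {link}")
```

Following the link would open the location sharing experience rendered by the web service 2805 in the remote device's browser or other client.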
When the message 2820 is received by the messaging application 2825 it can typically surface the message in a UI, for example UI 2900 shown in
While the above illustrative examples are described in the context of voice and video calls, the present location sharing may also be implemented with asynchronous communication forms such as messaging and email. For example, a local party could send a text message or email at 4 pm to arrange a physical meet-up in which location sharing begins at 6 pm for a meet-up at 7 pm.
In step 3005, a UI can be exposed for the local sharing party to initiate real-time location sharing with the remote party during a phone call. As noted above, the UI may be incorporated into the UI exposed by a voice calling application or video calling application. Upon initiation of the location sharing, the location sharing component can activate the device's speakerphone so that the user is able to view and interact with the UI in step 3010. In step 3015, the local sharing party may be enabled to select an expiration for the location sharing by interacting with various controls exposed by the UI. In step 3020, a notification can be surfaced when the remote user initiates location sharing from the remote device back to the local device. Typically, the local user is given options to accept or reject the location sharing from the remote user.
In step 3025, a map may be generated and displayed which shows the location of the local party, the remote party, or both the local and remote parties simultaneously. The map is dynamically updated to reflect changes in the parties' locations. In step 3030, directions for travel between the local and remote parties may be generated and displayed, typically using the map along with graphics, text, and the like. In step 3035, the location sharing component can interact with a digital assistant, in some implementations, in order to facilitate and/or enhance the location sharing experiences for one or more of the parties.
In step 3040, a notification may be surfaced when the parties are close to meeting up, where closeness can be defined in terms of either time or distance. In some implementations, notifications can also be automatically surfaced to a party when the other party is running late, even in cases when the late-running party does not explicitly provide a notification. For example, the real-time location sharing component and/or service provider can estimate each party's progress in getting to the meet-up location and then provide the notification when it becomes evident that a party will be late. In some implementations, the personal digital assistant 910 (
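The closeness check in step 3040 can be sketched using a great-circle distance between the parties, with closeness defined either by distance or by an estimated time apart. The thresholds below are illustrative assumptions.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def parties_are_close(loc_a, loc_b, max_km=0.5,
                      eta_minutes=None, max_minutes=5):
    """Closeness may be defined by distance or by estimated time apart."""
    if eta_minutes is not None:
        return eta_minutes <= max_minutes
    return haversine_km(*loc_a, *loc_b) <= max_km
```

When the predicate becomes true, the notification (e.g., on the lock screen) can be surfaced so the parties can start looking for each other.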
In step 3045, an estimated meet-up time can be generated using available input and historical, environmental, contextual, and other data and displayed. In step 3050, enablement may be provided for persisting the location sharing experience after the phone call is ended. This may include providing continuing support for the dynamic mapping and the provision of directions and notifications. In group meet-up scenarios, the persisted location sharing experience can include invitation sharing, as described above. In step 3055, enablement can be provided for the location sharing expiration to be extended by the local party or in response to an extension request by the remote party. If not otherwise extended, the location sharing experience is ended upon the occurrence of the expiration.
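Step 3045's estimate can be sketched by combining distance, mode of travel, and a congestion factor. The average speeds and the traffic model here are illustrative assumptions only; an actual implementation would draw on the input, historical, environmental, and contextual data described above.

```python
from datetime import datetime, timedelta

# Assumed average speeds by travel mode (km/h); illustrative values only
MODE_SPEED_KMH = {"walking": 5, "car": 40, "bus": 20, "subway": 30}

def estimate_meetup_time(distance_km, mode, traffic_factor=1.0, now=None):
    """Estimate when the parties will meet given distance, mode, and traffic.

    traffic_factor > 1.0 models congestion slowing travel below the
    mode's assumed average speed.
    """
    speed = MODE_SPEED_KMH[mode] / traffic_factor
    travel = timedelta(hours=distance_km / speed)
    return (now or datetime.utcnow()) + travel
```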
In step 3115, when the remote party follows the link, a web service can be provided to a client that runs on the remote device. The web service can then render the real-time location sharing experience into the web service client such as a browser or other application. In step 3120, inputs are received for setting the expiration from the remote party for that party's location sharing. The web service can also receive inputs from the remote party for extending the expiration of location sharing (and/or requesting an extension for sharing from the local device) in step 3125. The web service may end the location sharing experience upon the occurrence of the expiration in step 3130.
A number of program modules may be stored on the hard disk, magnetic disk 3233, optical disk 3243, ROM 3217, or RAM 3221, including an operating system 3255, one or more application programs 3257, other program modules 3260, and program data 3263. A user may enter commands and information into the computer system 3200 through input devices such as a keyboard 3266 and pointing device 3268 such as a mouse. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, trackball, touchpad, touch screen, touch-sensitive device, voice-command module or device, user motion or user gesture capture device, or the like. These and other input devices are often connected to the processor 3205 through a serial port interface 3271 that is coupled to the system bus 3214, but may be connected by other interfaces, such as a parallel port, game port, or universal serial bus (USB). A monitor 3273 or other type of display device is also connected to the system bus 3214 via an interface, such as a video adapter 3275. In addition to the monitor 3273, personal computers typically include other peripheral output devices (not shown), such as speakers and printers. The illustrative example shown in
The computer system 3200 is operable in a networked environment using logical connections to one or more remote computers, such as a remote computer 3288. The remote computer 3288 may be another personal computer, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or all of the elements described above relative to the computer system 3200, although only a single representative remote memory/storage device 3290 is shown in
When used in a LAN networking environment, the computer system 3200 is connected to the local area network 3293 through a network interface or adapter 3296. When used in a WAN networking environment, the computer system 3200 typically includes a broadband modem 3298, network gateway, or other means for establishing communications over the wide area network 3295, such as the Internet. The broadband modem 3298, which may be internal or external, is connected to the system bus 3214 via a serial port interface 3271. In a networked environment, program modules related to the computer system 3200, or portions thereof, may be stored in the remote memory storage device 3290. It is noted that the network connections shown in
The architecture 3300 illustrated in
The mass storage device 3312 is connected to the CPU 3302 through a mass storage controller (not shown) connected to the bus 3310. The mass storage device 3312 and its associated computer-readable storage media provide non-volatile storage for the architecture 3300.
Although the description of computer-readable storage media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it may be appreciated by those skilled in the art that computer-readable storage media can be any available storage media that can be accessed by the architecture 3300.
By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. For example, computer-readable media includes, but is not limited to, RAM, ROM, EPROM (erasable programmable read only memory), EEPROM (electrically erasable programmable read only memory), Flash memory or other solid state memory technology, CD-ROM, DVDs, HD-DVD (High Definition DVD), Blu-ray, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the architecture 3300.
According to various embodiments, the architecture 3300 may operate in a networked environment using logical connections to remote computers through a network. The architecture 3300 may connect to the network through a network interface unit 3316 connected to the bus 3310. It may be appreciated that the network interface unit 3316 also may be utilized to connect to other types of networks and remote computer systems. The architecture 3300 also may include an input/output controller 3318 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in
It may be appreciated that the software components described herein may, when loaded into the CPU 3302 and executed, transform the CPU 3302 and the overall architecture 3300 from a general-purpose computing system into a special-purpose computing system customized to facilitate the functionality presented herein. The CPU 3302 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 3302 may operate as a finite-state machine, in response to executable instructions contained within the software modules disclosed herein. These computer-executable instructions may transform the CPU 3302 by specifying how the CPU 3302 transitions between states, thereby transforming the transistors or other discrete hardware elements constituting the CPU 3302.
Encoding the software modules presented herein also may transform the physical structure of the computer-readable storage media presented herein. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the computer-readable storage media, whether the computer-readable storage media is characterized as primary or secondary storage, and the like. For example, if the computer-readable storage media is implemented as semiconductor-based memory, the software disclosed herein may be encoded on the computer-readable storage media by transforming the physical state of the semiconductor memory. For example, the software may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory. The software also may transform the physical state of such components in order to store data thereupon.
As another example, the computer-readable storage media disclosed herein may be implemented using magnetic or optical technology. In such implementations, the software presented herein may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations also may include altering the physical features or characteristics of particular locations within given optical media to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
In light of the above, it may be appreciated that many types of physical transformations take place in the architecture 3300 in order to store and execute the software components presented herein. It may also be appreciated that the architecture 3300 may include other types of computing devices, including handheld computers, embedded computer systems, smartphones, PDAs, and other types of computing devices known to those skilled in the art. It is also contemplated that the architecture 3300 may not include all of the components shown in
The illustrated device 110 can include a controller or processor 3410 (e.g., signal processor, microprocessor, microcontroller, ASIC (Application Specific Integrated Circuit), or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions. An operating system 3412 can control the allocation and usage of the components 3402, including power states, above-lock states, and below-lock states, and provides support for one or more application programs 3414. The application programs can include common mobile computing applications (e.g., image-capture applications, email applications, calendars, contact managers, web browsers, messaging applications), or any other computing application.
The illustrated mobile device 110 can include memory 3420. Memory 3420 can include non-removable memory 3422 and/or removable memory 3424. The non-removable memory 3422 can include RAM, ROM, Flash memory, a hard disk, or other well-known memory storage technologies. The removable memory 3424 can include Flash memory or a Subscriber Identity Module (SIM) card, which is well known in GSM (Global System for Mobile communications) systems, or other well-known memory storage technologies, such as “smart cards.” The memory 3420 can be used for storing data and/or code for running the operating system 3412 and the application programs 3414. Example data can include web pages, text, images, sound files, video data, or other data sets to be sent to and/or received from one or more network servers or other devices via one or more wired or wireless networks.
The memory 3420 may also be arranged as, or include, one or more computer-readable storage media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, Flash memory or other solid state memory technology, CD-ROM (compact-disc ROM), DVD (Digital Versatile Disc), HD-DVD (High Definition DVD), Blu-ray, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the mobile device 110.
The memory 3420 can be used to store a subscriber identifier, such as an International Mobile Subscriber Identity (IMSI), and an equipment identifier, such as an International Mobile Equipment Identifier (IMEI). Such identifiers can be transmitted to a network server to identify users and equipment. The mobile device 110 can support one or more input devices 3430, such as a touch screen 3432; a microphone 3434 for implementation of voice input for voice recognition, voice commands, and the like; a camera 3436; a physical keyboard 3438; a trackball 3440; and/or a proximity sensor 3442; and one or more output devices 3450, such as a speaker 3452 and one or more displays 3454. Other input devices (not shown) using gesture recognition may also be utilized in some cases. Other possible output devices (not shown) can include piezoelectric or haptic output devices. Some devices can serve more than one input/output function. For example, touch screen 3432 and display 3454 can be combined into a single input/output device.
A wireless modem 3460 can be coupled to an antenna (not shown) and can support two-way communications between the processor 3410 and external devices, as is well understood in the art. The modem 3460 is shown generically and can include a cellular modem for communicating with the mobile communication network 3404 and/or other radio-based modems (e.g., Bluetooth 3464 or Wi-Fi 3462). The wireless modem 3460 is typically configured for communication with one or more cellular networks, such as a GSM network for data and voice communications within a single cellular network, between cellular networks, or between the mobile device and a public switched telephone network (PSTN).
The mobile device can further include at least one input/output port 3480, a power supply 3482, a satellite navigation system receiver 3484, such as a GPS receiver, an accelerometer 3486, a gyroscope (not shown), and/or a physical connector 3490, which can be a USB port, IEEE 1394 (FireWire) port, and/or an RS-232 port. The illustrated components 3402 are not required or all-inclusive, as any components can be deleted and other components can be added.
Based on the foregoing, it may be appreciated that technologies for real-time location sharing have been disclosed herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological and transformative acts, specific computing machinery, and computer-readable storage media, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts, and mediums are disclosed as example forms of implementing the claims.
The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims.