The present invention relates generally to, for example, device usability in multi-core or multi-chip embedded solutions.
Embedded devices generally comprise objects that contain an embedded computing system, which may be enclosed by the object. The embedded computer system may be designed with a specific use in mind, or the embedded computer system may be at least in part general-purpose in the sense that a user may be enabled to install software in it. An embedded computer system may be based on a microcontroller or microprocessor CPU, for example.
Embedded devices may comprise one or more processors, user interfaces and displays, such that a user may interact with the device using the user interface. The user interface may comprise buttons, for example. An embedded device may comprise a connectivity function configured to communicate with a communications network, such as, for example, a wireless communications network. The embedded device may be enabled to receive from such a communications network information relating to, for example, a current time and current time zone.
More complex embedded devices, such as cellular telephones, may allow a user to install applications into a memory, such as, for example, a solid-state memory, comprised in the device. Embedded devices are frequently resource-constrained when compared to desktop or laptop computers. For example, memory capacity may be more limited than in desktop or laptop computers, processor computational capacity may be lower and energy may be available from a battery. The battery, which may be small, may be rechargeable.
Conserving battery power is a key task in designing embedded devices. A lower current usage enables longer time intervals in-between battery charging. For example, smartphones benefit greatly when they can survive an entire day before needing recharging, since users are thereby enabled to recharge their phones overnight, and enjoy uninterrupted use during the day.
Battery resources may be conserved by throttling a processor clock frequency between a maximum clock frequency and a lower clock frequency, for example one half of the maximum clock frequency. Another way to conserve battery power is to cause a display of an embedded device to switch itself off when the device is not used, since displaying content on a display consumes energy in order to cause the display to emit light that humans can see.
The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.
According to a first aspect of the present invention, there is provided an apparatus, such as a personal device, comprising at least one processor with at least one processing core, at least one display, at least one sensor, at least one memory including a computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to:
In a second aspect of the invention, a method is provided for presenting information to a user of an apparatus, such as a personal device, said device comprising at least one processor having at least one processing core, at least one display, at least one sensor and at least one memory including computer program code, said method comprising the steps of:
In some embodiments, at least two processing cores are provided, wherein a selected activity is initiated by a first processing core in said first display mode, said first power-save mode is entered by putting said first processing core in a hibernating mode and by switching to said second display mode using a second processing core, and said second power-save mode is entered by putting said second processing core in a hibernating mode.
In some embodiments, the clock unit is a Real Time Clock (RTC) unit which updates said display in said second power-save mode at predefined time intervals with reduced thematic maps stored in said at least one memory of the apparatus, wherein said reduced thematic map to be shown is selected based on the current time as registered by said RTC unit.
Various embodiments of the first aspect may comprise at least one feature from the following bulleted list:
A thematic map database, for example a heat map, may be compiled to cover a geographic area. Users may engage in activity sessions while in the geographic area. Activity types of such activity sessions may include jogging, swimming and cycling, for example. When a user wishes to engage in an activity session of his own, his device may determine a route for this activity session based at least in part on the thematic map database. Determining the route may comprise designing the route, optionally based partly on user settings, based on where other users have engaged in activity sessions of the same type in the past. For example, a jogging route may be determined based, at least partly, on indications where other users have jogged in the past. Route determination may be partly based on further considerations as well, as will be laid out below.
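By way of illustration only, the following minimal sketch shows one way such a route determination could weigh thematic map data; the grid cell structure, the activity type enumeration and the scoring rule are assumptions made for the example and are not part of the description above.

#include <stddef.h>

/* Illustrative activity types; the real set is defined elsewhere. */
typedef enum { ACT_JOGGING, ACT_SWIMMING, ACT_CYCLING, ACT_COUNT } activity_t;

/* One grid cell of a thematic (heat) map: past-activity intensity per type. */
typedef struct {
    double lat, lon;
    unsigned intensity[ACT_COUNT];   /* e.g. number of past sessions */
} map_cell_t;

/* Score a candidate route as the sum of intensities, for the requested
 * activity type, of the cells it crosses; a higher score means more users
 * have engaged in this activity here in the past. */
static unsigned long route_score(const map_cell_t *cells, size_t n_cells,
                                 activity_t type)
{
    unsigned long score = 0;
    for (size_t i = 0; i < n_cells; ++i)
        score += cells[i].intensity[type];
    return score;
}

/* Pick the best of several candidate routes for the given activity type. */
size_t pick_route(const map_cell_t *const *routes, const size_t *lens,
                  size_t n_routes, activity_t type)
{
    size_t best = 0;
    unsigned long best_score = 0;
    for (size_t r = 0; r < n_routes; ++r) {
        unsigned long s = route_score(routes[r], lens[r], type);
        if (s > best_score) { best_score = s; best = r; }
    }
    return best;
}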
Device 110 may be communicatively coupled with a communications network. For example, in
Device 110 may be configured to receive, from satellite constellation 150, satellite positioning information via satellite link 151. The satellite constellation may comprise, for example the global positioning system, GPS, or the Galileo constellation. Satellite constellation 150 may comprise more than one satellite, although only one satellite is illustrated in
Alternatively or additionally to receiving data from a satellite constellation, device 110 may obtain positioning information by interacting with a network in which base station 120 is comprised. For example, cellular networks may employ various ways to position a device, such as trilateration, multilateration or positioning based on an identity of a base station with which attachment is possible or ongoing. Likewise a non-cellular base station, or access point, may know its own location and provide it to device 110, enabling device 110 to position itself within communication range of this access point.
Device 110 may be configured to obtain a current time from satellite constellation 150, base station 120 or by requesting it from a user, for example. Once device 110 has the current time and an estimate of its location, device 110 may consult a look-up table, for example, to determine a time remaining until sunset or sunrise, for example. Device 110 may likewise gain knowledge of the time of year.
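As a hedged illustration of the look-up table approach, the sketch below derives a time remaining until sunset from the current time and a per-month table; the table values and the single latitude band are placeholder assumptions, not astronomical data.

#include <stdio.h>

/* Illustrative look-up table: approximate sunset time, in minutes after
 * local midnight, per month for one latitude band; real tables would also
 * be indexed by location. Values are placeholders. */
static const int sunset_minutes_by_month[12] = {
    16*60, 17*60, 18*60, 19*60+30, 20*60+30, 21*60+30,
    21*60+15, 20*60+15, 19*60, 18*60, 16*60+30, 15*60+45
};

/* Minutes remaining until sunset, or 0 if the sun has already set today. */
int minutes_to_sunset(int month /* 0..11 */, int now_minutes /* since midnight */)
{
    int sunset = sunset_minutes_by_month[month];
    return (now_minutes < sunset) ? sunset - now_minutes : 0;
}

int main(void)
{
    /* Example: June, 18:00 local time. */
    printf("minutes to sunset: %d\n", minutes_to_sunset(5, 18 * 60));
    return 0;
}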
Network 140 may be coupled to a database server 160 via a connection 161, or the further network connection 141. When device 110 has determined its present location, it may transmit a query to database server 160 acting as a thematic map database server. The query may comprise an indication of the current location of the apparatus obtained by, for example, a global positioning system as explained above. Updated thematic map data related to the present location of the device 110 may then be downloaded from the server 160 over network connections 161 or 141 and stored in a memory of the device 110.
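A minimal sketch of such a query, assuming a simple JSON-style body and illustrative field names (the actual protocol between device 110 and server 160 is not specified here), could look as follows.

#include <stdio.h>

/* Compose a hypothetical query body for a thematic map database server.
 * Field names and transport are assumptions for this example only. */
int build_map_query(char *buf, size_t buflen,
                    double lat, double lon, double radius_km)
{
    return snprintf(buf, buflen,
                    "{\"lat\": %.6f, \"lon\": %.6f, \"radius_km\": %.1f}",
                    lat, lon, radius_km);
}

int main(void)
{
    char query[128];
    build_map_query(query, sizeof query, 60.1699, 24.9384, 10.0);
    printf("query body: %s\n", query);  /* sent over connection 161 or 141 */
    return 0;
}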
The device may be configured to present to the user a selection of updated thematic maps as heatmaps created for different sports activities in said location. The updated heatmaps thus stored in the memory of the device 110 may be used offline in activity sessions.
Device 110 may be configured to provide an activity session. An activity session may be associated with an activity type. Examples of activity types include rowing, paddling, cycling, jogging, walking, hunting and paragliding. In a simplest form, an activity session may comprise device 110 displaying a map of the surroundings, and a route on the map that relates to the activity session. Device 110 may be configured to display, on the route, an indication where along the route the user is currently located, enabling the user to see the place along the route where his jog, for example, is at the moment progressing.
According to some embodiments, the device 110 may be configured to present to the user a selection of updated thematic maps as heatmaps created for different sports activities in second locations outside but adjacent to said present location. Some activities preferred by the user, such as cycling for example, may involve moving long distances and/or for a lengthy time. In the planning of such activities it may be beneficial to have heatmaps also covering locations nearby the present one.
In some embodiments the apparatus may be configured to automatically update thematic maps from the server 160 when the apparatus is being charged with a charging device 170 and is connected to a wireless network 112.
According to some embodiments, the device 110 may first determine the present location of the device, transmit a query to a thematic map database server, the query comprising an indication of the current location of the device, and then update itself with thematic maps related to its location by downloading thematic map data from the server and storing the thematic map data in the memory of the device 110. The device may then present to the user on the display, in a first display mode, a selection of downloaded thematic maps as suggested activity types. The selection may be based on at least one of the criteria of a pre-recorded user preference, user activity history, intensity of activities in said location, special activities in said location, time of the day, time of the year, or a second location adjacent to said present location. In response to a user selection input, the device 110 may then initiate an activity session of the selected activity, and display a thematic map of the activity in a second display mode.
Processing heatmaps, sensor information, displays and other hardware required for tracking the whereabouts and physical performance of a person is a power-consuming task. From a battery performance point of view, it is important to minimize the energy consumption of the device 110. Therefore, a first display mode has been envisaged, in which all relevant data and features required for the user to perform searching, browsing and selection of activities, as well as to use any other features offered by the device, are available. This requires much battery power, but once the activity selection is made and initiated, the device 110 may enter a low-power mode focusing only on processing data which is essential for the activity in question. Such a low-power mode may involve the use of a second display mode, where, for example, resolution is reduced, colours are not displayed, map display is simplified and/or sensors are shut off and their corresponding information is suppressed on the display.
According to some embodiments, the apparatus comprises at least two processing cores and at least one display with at least two display modes. A first processing core causes the apparatus to determine the present location of the apparatus, transmit from the apparatus a query to a thematic map database server, and to update the apparatus with thematic maps related to said location from said server by downloading thematic map data and storing the thematic map data in at least one memory of the apparatus. It also presents to the user on the display in a first display mode a selection of downloaded thematic maps as suggested activity types, where the activity types are based on at least one of the following criteria: a pre-recorded user preference, user activity history, intensity of activities in said location, special activities in said location, time of the day, time of the year, or a second location adjacent to said present location. In response to a user selection input, an activity session is initiated and a thematic map of the selected activity is displayed in a second display mode by a second processing core.
In some embodiments, updated thematic heatmaps may be created for different sports activities also in second locations outside, but adjacent to the present location. This may be beneficial if the activity, such as cycling, takes place over considerable distances. The device 110 may be configured to automatically update thematic maps related to its location from a thematic map database server anytime when the apparatus is being charged and is connected to a wireless network with coverage of its present location.
In some embodiments, the updated thematic maps are stored in a memory of the device 110 for offline use. Thus a stored thematic map of an activity session that is to be initiated may be displayed directly in a second display mode.
An activity session in device 110 may enhance a utility a user can obtain from the activity, for example, where the activity involves movement outdoors, the activity session may provide a recording of the activity session. An activity session in device 110 may, in some embodiments, provide the user with contextual information during the activity session. Such contextual information may comprise, for example, locally relevant weather information, received via base station 120, for example. Such contextual information may comprise at least one of the following: a rain warning, a temperature warning, an indication of time remaining before sunset, an indication of a nearby service that is relevant to the activity, a security warning, an indication of nearby users and an indication of a nearby location where several other users have taken photographs. Where the contextual information comprises a security warning, the warning may comprise a security route, determined in a way that enables the user to avoid danger. For example, in case of a chemical leak, the security route may comprise a route that leads indoors or to public transport. The device 110 may determine a security route, or the device 110 may receive the security route, at least in part, from a network. The security route may be determined using existing roads, pathways and other transit routes that are known to the entity determining the security route. Transit routes may be known from a public mapping service, for example.
A recording may comprise information on at least one of the following: a route taken during the activity session, a metabolic rate or metabolic effect of the activity session, a time the activity session lasted, a quantity of energy consumed during the activity session, a sound recording obtained during the activity session and an elevation map along the length of the route taken during the activity session. A route may be determined based on positioning information, for example. Metabolic effect and consumed energy may be determined, at least partly, based on information concerning the user that device 110 has access to. A recording may be stored in device 110, an auxiliary device, or in a server or data cloud storage service. A recording stored in a server or cloud may be encrypted prior to transmission to the server or cloud, to protect privacy of the user.
An activity session may have access to a backhaul communications link to provide indications relating to the ongoing activity. For example, search and rescue services may be given access to information on joggers in a certain area of a forest, to enable their rescue if a chemical leak, for example, makes the forest unsafe for humans. In some embodiments, routes relating to activity sessions are provided to a cloud service for storage when the activity sessions start, to enable searching for missing persons along the route the persons were planning to take.
The user may initiate an activity session by interacting with a user interface of device 110, for example. Where device 110 has a small form factor, the user interface may be implemented over a limited user interaction capability, such as, for example, a small screen, small touchscreen, and/or limited number of push buttons. A limited user interaction capability may make it arduous for the user to perform complicated interactions with device 110, which makes it less likely the user will choose to interact with device 110. Therefore, it is of interest to simplify the interaction between device 110 and the user, to make it easier for the user to complete the interaction, and thus more likely the user will perform the interaction.
Device 110 may provide to the thematic map database 160 an indication relating to the activity session, to enhance the thematic map database further. Such indications may be anonymized prior to sending to the database, to protect the user's privacy and/or to comply with local legislation. Such indications may comprise, for example, information on a determined route and a corresponding activity type.
In general, a thematic map database 160 may associate at least one form of data with a geographic location. For example, the thematic map database may associate past indications of activity sessions with geographic locations, for example to enable mapping areas where activity sessions of a given activity type have been performed. Areas may be mapped as to the intensity, or frequency, of past indications of activity session and type. Thus a first area of a lake may be associated with frequent rowing, and another area of the same lake with less frequent, but still non-zero, rowing. Such a frequency may be referred to as intensity, and the thematic map database may, in general, associate activity type intensities with locations. Alternatively to intensities, the thematic map database may simply indicate whether an activity session of a given activity type has in the past been performed in a geographic location. Further alternatively, the thematic map database may indicate any specialities of activities in the location. Additionally or alternatively, a traffic density may be associated with the geographic locations. Traffic density may comprise pedestrian or vehicular traffic density, for example. Walking or jogging may be less pleasant, or less healthy, in areas with a high vehicular traffic density due to exhaust fumes, wherefore a route relating to an activity session of such a type may be determined in a way that avoids such high-traffic density areas. Likewise, additionally or alternatively, crime density may be mapped, and employed in route determination to avoid high-crime areas. Avalanche risk density, obtained from meteorological services, may similarly be used to route ski activity sessions in safe areas. In some embodiments, places where many users have taken photographs may be used in routing, such that routes are determined to visit frequently photographed locations, since such locations are likely to be beautiful and inspiring.
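The sketch below illustrates, under assumed field names and weights, how several such data layers associated with one location could be combined into a single suitability score for route determination.

/* Illustrative per-cell data layers a thematic map database might associate
 * with a geographic location; the field names and weights are assumptions. */
typedef struct {
    unsigned activity_intensity;  /* past sessions of the queried type */
    unsigned traffic_density;     /* vehicular traffic level           */
    unsigned crime_density;       /* reported incidents                */
    unsigned photo_count;         /* photographs taken nearby          */
} cell_layers_t;

/* A simple suitability score for routing: reward past activity and
 * frequently photographed spots, penalise traffic and crime. */
long cell_suitability(const cell_layers_t *c)
{
    return (long)c->activity_intensity
         + (long)c->photo_count
         - 2L * (long)c->traffic_density
         - 4L * (long)c->crime_density;
}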
In some embodiments, the user may have indicated in user settings that he wishes to engage in a certain type of activity session, wherein such indications may be taken into account when determining the route for the activity session. The settings may be taken into account, for example, by designing the route so that performing the activity session along the route causes an increase in energy consumption in the user that is approximately in line with what the user has requested in the settings. Alternatively or additionally, a cardiovascular effect of the activity session may be tuned to be in line with a user setting by designing the route in a suitable way. Likewise the user may specify a desired effect on oxygen consumption, EPOC effect and/or a recovery time length after the activity session. EPOC refers to excess post-exercise oxygen consumption, sometimes known colloquially as afterburn.
A route may be determined such that it can be interrupted. For example, where the activity comprises cycling, the route may come close to the starting and ending location close to a midpoint of the route, to enable the user to cut the route short. The user may specify in user settings that he wishes to engage in an interruptible route, or interruptibility may be a default setting that the route determination attempts to comply with, where possible.
A level of physical exertion, in terms of energy consumption, oxygen consumption, cardiovascular effect, EPOC or recovery time length, the route causes in the user may be modified by determining elevation changes along the route. Where the user wishes a light activity session, the route may be determined as relatively flat, and where the user wishes for a strenuous activity session, the route may be determined in a way that it has more elevation changes. Using the thematic map database in connection with elevation data in this sense may comprise, for example, determining the route based on elevation changes to match the desired strenuousness, in an area which the thematic map database indicates that activity sessions of a corresponding type have been conducted in the past. In general, the user settings may be employed in determining the route after a suitable area for the route has been identified using the thematic map database.
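One possible way to match elevation changes to a requested strenuousness is sketched below; the mapping from a light or strenuous setting to a target elevation gain in metres is an assumption for the example.

#include <stdlib.h>

/* A candidate route as a series of elevation samples, in metres. */
typedef struct { const int *elev_m; size_t n; } route_t;

/* Total climb along a route: sum of all positive elevation differences. */
static int elevation_gain(const route_t *r)
{
    int gain = 0;
    for (size_t i = 1; i < r->n; ++i)
        if (r->elev_m[i] > r->elev_m[i - 1])
            gain += r->elev_m[i] - r->elev_m[i - 1];
    return gain;
}

/* Choose the candidate route whose total elevation gain is closest to the
 * target derived from the requested strenuousness. */
size_t pick_by_strenuousness(const route_t *routes, size_t n, int target_gain_m)
{
    size_t best = 0;
    int best_diff = abs(elevation_gain(&routes[0]) - target_gain_m);
    for (size_t i = 1; i < n; ++i) {
        int d = abs(elevation_gain(&routes[i]) - target_gain_m);
        if (d < best_diff) { best_diff = d; best = i; }
    }
    return best;
}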
A time of year and/or a time of day may be employed in either the thematic map database or in the determining of the route. For example, the thematic map database 160 may comprise data collected at different times of year, for example a same location may be associated with frequent jogging in summertime and frequent skiing during the winter months. Thus, the database may return a jogging route in the location in case the query is made in the summer, and the database may return a skiing route in the location in case the query is made in the winter. Alternatively or in addition, device 110 may select activity types consistent with the time of year, or time of day, from the set of activity types returned from the database when determining the predicted user activity type. Device 110 may perform this task in embodiments where a thematic map database doesn't collect statistics separately according to time of year or day, for example. As a specific example, local residents may consider a certain location as safe during the day but unsafe after dark. In such a situation, a user requesting a jogging route could be routed to this location if the request is made in the daytime, but routed elsewhere if the request is made after dark.
In general, the thematic map database 160 may be comprised in a server or cloud device, or it may be downloaded, at least in part, to device 110 or an auxiliary device, for offline use. An auxiliary device is described below in connection with
Responsive to the user approving, implicitly or explicitly, a suggested route, an activity session based on the approved suggested route may be initiated.
More than one route may be determined, such that at least one of the determined routes is presented to the user as a suggested route. For example, two routes may be determined that match requirements defined by the user, and these two routes may then be presented as suggested routes, with information concerning each route presented to the user as well. For example, energy consumption, estimated time to completion and/or length of a route may be presented to assist the user in making a selection. Energy consumption, estimated time to completion and/or other suitable information may be determined based, at least partly, on the elevation information.
Information may be presented also, or alternatively, concerning segments of any suggested route, to enable the user to construct his route from interconnected segments.
In some embodiments, the user need not explicitly select a suggested route; rather, the device may deduce, from the way positioning information changes, which route the user is following. In response, any other suggested routes may be removed from the display to reduce clutter. In case the user deviates from the route, the device may notice this from the positioning information, and responsively determine an alternative route for the user, which may again be displayed. Thus movement of the user may cause, via the positioning information, an approval of a suggested route and/or a new determination of a new suggested route in case of deviation from a previously approved route. Such a new suggested route may be determined from the current location of the device to the same end point as the originally approved route. Such an end point may comprise the start point of the route, or, alternatively, another point input by the user. Remaining time, energy consumption and/or other information may be presented concerning the new suggested route.
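A minimal sketch of such route matching from positioning information is given below, assuming simple point lists per route and an illustrative deviation threshold.

#include <stddef.h>

typedef struct { double lat, lon; } fix_t;

/* Squared distance in a flat local approximation; adequate for the short
 * distances involved in matching a position fix to a route point. */
static double dist2(fix_t a, fix_t b)
{
    double dlat = a.lat - b.lat, dlon = a.lon - b.lon;
    return dlat * dlat + dlon * dlon;
}

/* Return the index of the suggested route whose nearest point is closest to
 * the latest positioning fix, or -1 if every route is farther away than
 * deviation_limit (same squared units), i.e. the user has deviated and a
 * new route should be determined. The threshold is an assumption. */
int match_route(const fix_t *const *routes, const size_t *lens, size_t n_routes,
                fix_t fix, double deviation_limit)
{
    int best = -1;
    double best_d = deviation_limit;
    for (size_t r = 0; r < n_routes; ++r)
        for (size_t i = 0; i < lens[r]; ++i) {
            double d = dist2(fix, routes[r][i]);
            if (d < best_d) { best_d = d; best = (int)r; }
        }
    return best;
}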
Device 110 may be communicatively coupled, for example communicatively paired, with an auxiliary device 110x. The communicative coupling, or pairing, is illustrated in
In some embodiments, where an auxiliary device 110x is present, device 110 is configured to use connectivity capability of auxiliary device 110x. For example, device 110 may access a network via auxiliary device 110x. In these embodiments, device 110 need not be furnished with connectivity toward base station 120, for example, since device 110 may access network resources via interface 111 and a connection auxiliary device 110x has with base station 120. Such a connection is illustrated in
In some embodiments, device 110 may have some connectivity and be configured to use both that and connectivity provided by auxiliary device 110x. For example, device 110 may comprise a satellite receiver enabling device 110 to obtain satellite positioning information directly from satellite constellation 150. Device 110 may then obtain network connectivity to base station 120 via auxiliary device 110x. For example, device 110 may transmit a query to the thematic map database via auxiliary device 110x. In some embodiments, device 110 is configured to request, and responsively to receive, sensor information from auxiliary device 110x. Such sensor information may comprise acceleration sensor information, for example. In general, processing, such as route determination and/or communication processing, may be distributed in a suitable way between device 110, auxiliary device 110x and/or a cloud computing service.
Similarly as discussed in connection with
Furnishing an embedded device with two or more processor cores, at least some of which are enabled to control the display of the device, makes possible power savings where a less-capable processor core is configured to toggle a more capable processor core to and from a hibernation state. A hibernation state may comprise that a clock frequency of the more capable processing core is set to zero, for example. In a hibernation state, in addition to, or alternatively to, setting the clock frequency of the more capable processing core to zero, a memory refresh rate of memory used by the more capable core may be set to zero. Alternatively to zero, a low non-zero frequency may be used for the clock frequency and/or the memory refresh frequency. In some embodiments, a more capable processing core may employ a higher-density memory technology, such as double data rate, DDR, memory, and a less capable processing core may employ a lower-density memory technology, such as static random access memory, SRAM, memory. In a hibernation state the hibernated processing core, or more generally processing unit, may be powered off. Alternatively to a processor core, an entire processor may, in some embodiments, be transitioned to a hibernation state. An advantage of hibernating an entire processor is that circuitry in the processor outside the core is also hibernated, further reducing current consumption.
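The following sketch illustrates the idea at register level; the register addresses and bit layouts are hypothetical placeholders, not those of any actual system-on-chip.

#include <stdint.h>

/* Hypothetical memory-mapped registers; addresses and meanings are
 * placeholders chosen only for this example. */
#define BIG_CORE_CLK_DIV   (*(volatile uint32_t *)0x40001000u)
#define DDR_REFRESH_RATE   (*(volatile uint32_t *)0x40002000u)
#define BIG_CORE_PWR_CTRL  (*(volatile uint32_t *)0x40003000u)

/* Hibernate the more capable core: stop its clock, stop refreshing its DDR
 * memory and gate its power domain. The less capable core keeps running. */
void hibernate_big_core(void)
{
    BIG_CORE_CLK_DIV  = 0u;   /* clock frequency set to zero              */
    DDR_REFRESH_RATE  = 0u;   /* memory refresh rate set to zero          */
    BIG_CORE_PWR_CTRL = 0u;   /* power the core, or whole processor, off  */
}

void wake_big_core(void)
{
    BIG_CORE_PWR_CTRL = 1u;   /* restore the power domain   */
    DDR_REFRESH_RATE  = 1u;   /* resume DDR refresh         */
    BIG_CORE_CLK_DIV  = 1u;   /* restart the clock          */
}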
Device 110 may comprise two or more processing units. The two or more processing units may each comprise a processing core. Each processing unit may comprise one or multiple uniform or heterogeneous processor cores and/or different volatile and non-volatile memories. For example, device 110 may comprise a microprocessor with at least one processing core, and a microcontroller with at least one processing core. The processing cores needn't be of the same type, for example, a processing core in a microcontroller may have more limited processing capability and/or a less capable memory technology than a processing core comprised in a microprocessor. In some embodiments, a single integrated circuit comprises two processing cores, a first one of which has lesser processing capability and consumes less power, and a second one of which has greater processing capability and consumes more power. In general a first one of the two processing units may have lesser processing capability and consume less power, and a second one of the two processing units may have greater processing capability and consume more power. Each of the processing units may be enabled to control the display of device 110. The more capable processing unit may be configured to provide a richer visual experience via the display. The less capable processing unit may be configured to provide a reduced visual experience via the display. An example of a reduced visual experience is a reduced colour display mode, as opposed to a rich colour display mode. Another example of a reduced visual experience is one which is black-and-white. An example of a richer visual experience is one which uses colours. Colours may be represented with 16 bits or 24 bits, for example.
Each of the two processing units may comprise a display interface configured to communicate toward the display. For example, where the processing units comprise a microprocessor and a microcontroller, the microprocessor may comprise transceiver circuitry coupled to at least one metallic pin under the microprocessor, the at least one metallic pin being electrically coupled to an input interface of a display control device. The display control device, which may be comprised in the display, is configured to cause the display to display information in dependence of electrical signals received in the display control device. Likewise the microcontroller in this example may comprise transceiver circuitry coupled to at least one metallic pin under the microcontroller, the at least one metallic pin being electrically coupled to an input interface of a display control device. The display control device may comprise two input interfaces, one coupled to each of the two processing units, or alternatively the display control device may comprise a single input interface into which both processing units are enabled to provide inputs via their respective display interfaces. Thus a display interface in a processing unit may comprise transceiver circuitry enabling the processing unit to transmit electrical signals toward the display.
One of the processing units, for example the less capable or the more capable one, may be configured to control, at least in part, the other processing unit. For example, the less capable processing unit, for example a less capable processing core, may be enabled to cause the more capable processing unit, for example a more capable processing core, to transition into and from a hibernating state. These transitions may be caused to occur by signalling via an inter-processing unit interface, such as for example an inter-core interface.
When transitioning into a hibernating state from an active state, the transitioning processing unit may store its context, at least in part, into a memory, such as for example a pseudostatic random access memory, PSRAM, SRAM, FLASH or ferroelectric RAM, FRAM. The context may comprise, for example, content of registers and/or addressing. When transitioning from a hibernated state using a context stored in memory, a processing unit may resume processing faster and/or from a position where the processing unit was when it was hibernated. This way, a delay experienced by a user may be minimised. Alternative terms occasionally used for context include state and image. In a hibernating state, a clock frequency of the processing unit and/or an associated memory may be set to zero, meaning the processing unit is powered off and does not consume energy. Circuitry configured to provide an operating voltage to at least one processing unit may comprise a power management integrated circuit, PMIC, for example. Since device 110 comprises another processing unit, the hibernated processing unit may be powered completely off while maintaining usability of device 110.
When transitioning from a hibernated state to an active state, the transitioning processing unit may have its clock frequency set to a non-zero value. The transitioning processing unit may read a context from a memory, wherein the context may comprise a previously stored context, for example a context stored in connection with transitioning into the hibernated state, or the context may comprise a default state or context of the processing unit stored into the memory in the factory. The memory may comprise pseudostatic random access memory, SRAM, FLASH and/or FRAM, for example. The memory used by the processing unit transitioning to and from the hibernated state may comprise DDR memory, for example.
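A minimal sketch of storing and restoring such a context is shown below; the context fields and the use of a single non-volatile buffer are illustrative assumptions.

#include <stdint.h>
#include <string.h>

/* A hypothetical processing-unit context saved before hibernation; the
 * fields and the storage location are assumptions for this example. */
typedef struct {
    uint32_t regs[16];        /* general-purpose register contents */
    uint32_t program_counter;
    uint32_t stack_pointer;
} cpu_context_t;

/* Stand-in for non-volatile storage, e.g. FRAM or FLASH, which survives the
 * power-off of the hibernated unit. */
static cpu_context_t saved_context;

void save_context_before_hibernation(const cpu_context_t *ctx)
{
    memcpy(&saved_context, ctx, sizeof saved_context);
}

/* On wake-up the unit reads the context back and resumes where it left off,
 * minimising the delay experienced by the user. */
void restore_context_after_wakeup(cpu_context_t *ctx)
{
    memcpy(ctx, &saved_context, sizeof saved_context);
}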
With one processing unit in a hibernation state, the non-hibernated processing unit may control device 110. For example, the non-hibernated processing unit may control the display via the display interface comprised in the non-hibernated processing unit. For example, where a less capable processing unit has caused a more capable processing unit to transition to the hibernated state, the less capable processing unit may provide a reduced user experience, at least in part via the display. An example of a reduced user experience is a mapping experience with a reduced visual experience comprising a black-and-white rendering of the mapping service. The reduced experience may be sufficient for the user to obtain a benefit from it, with the advantage that battery power is conserved by hibernating the more capable processing unit. In some embodiments, a more capable processing unit, such as a microprocessor, may consume a milliampere of current when in a non-hibernated low-power state, while a less capable processing unit, such as a microcontroller, may consume only a microampere when in a non-hibernated low-power state. In non-hibernated states current consumption of processing units may be modified by setting an operating clock frequency to a value between a maximum clock frequency and a minimum non-zero clock frequency. In at least some embodiments, processing units, for example less capable processing units, may be configurable to power down for short periods, such as 10 or 15 microseconds, before being awakened. In the context of this document, this is not referred to as a hibernated state but an active low-power configuration. An average clock frequency calculated over a few such periods and the intervening active periods is a positive non-zero value. A more capable processing unit may be enabled to run the Android operating system, for example.
Triggering events for causing a processing unit to transition to the hibernated state include a user indicating a non-reduced experience is no longer needed, a communication interface of the processing unit no longer being needed and device 110 not having been used for a predetermined length of time. An example indication that a non-reduced experience is no longer needed is where the user deactivates a full version of an application, such as for example a mapping application. Triggering events for causing a processing unit to transition from the hibernated state to an active state may include a user indicating a non-reduced experience is needed, a communication interface of the processing unit being requested and device 110 being interacted with after a period of inactivity. Alternatively or additionally, external events may be configured as triggering events, such as, for example, events based on sensors comprised in device 110. An example of such an external event is a clock-based event which is configured to occur at a preconfigured time of day, such as an alarm clock function, for example. In at least some embodiments, the non-reduced experience comprises use of a graphics mode the non-hibernated processing unit cannot support, but the hibernated processing unit can support. A graphics mode may comprise a combination of a resolution, colour depth and/or refresh rate, for example.
In some embodiments, a user need or user request for the non-reduced experience may be predicted. Such predicting may be based at least in part on a usage pattern of the user, where the user has tended to perform a certain action in the reduced experience before requesting the non-reduced experience. In this case, responsive to a determination the user performs the certain action in the reduced experience, the non-reduced mode may be triggered.
If the processing units reside in separate devices or housings, such as a wrist-top computer and a handheld or fixedly mounted display device for example, a bus may be implemented in a wireless fashion by using a wireless communication protocol. Radio transceiver units functionally connected to their respective processing units may thus perform the function of the bus, forming a personal area network, PAN. The wireless communication protocol may be one used for communication between computers, and/or between any remote sensors, such as Bluetooth LE or the proprietary ANT+ protocol. These use direct-sequence spread spectrum, DSSS, modulation techniques and an adaptive isochronous network configuration, respectively. Enabling descriptions of the necessary hardware for various implementations of wireless links are available, for example, in the Texas Instruments® handbook "Wireless Connectivity", which includes IC circuits and related hardware configurations for protocols working in the sub-1-GHz and 2.4-GHz frequency bands, such as ANT™, Bluetooth®, Bluetooth® low energy, RFID/NFC, PurePath™ Wireless audio, ZigBee®, IEEE 802.15.4, ZigBee RF4CE, 6LoWPAN and Wi-Fi®.
In connection with hibernation, the PAN may be kept in operation by the non-hibernated processing unit, such that when hibernation ends, the processing unit leaving the hibernated mode may have access to the PAN without needing to re-establish it.
In some embodiments, microphone data is used in determining, in a first processor, whether to trigger a second processor from hibernation. The first processor may be less capable and consume less energy than the second processor. The first processor may comprise a microcontroller and the second processor may comprise a microprocessor, for example. The microphone data may be compared to reference data and/or preprocessed to identify in the microphone data features enabling determination whether a spoken instruction has been uttered and recorded into the microphone data. Alternatively or in addition to a spoken instruction, an auditory control signal, such as a fire alarm or beep signal, may be searched for in the microphone data.
Responsive to the spoken instruction and/or auditory control signal being detected, by the first processor, in the microphone data, the first processor may start the second processor. In some embodiments, the first processor starts the second processor into a state that the first processor selects in dependence of which spoken instruction and/or auditory control signal was in the microphone data. Thus, for example, where the spoken instruction identifies a web search engine, the second processor may be started up into a user interface of this particular web search engine. As a further example, where the auditory control signal is a fire alarm, the second processor may be started into a user interface of an application that provides emergency guidance to the user. Selecting the initial state for the second processor already in the first processor saves time compared to the case where the user or second processor itself selects the state.
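The sketch below illustrates, on the first processor, how a detected spoken instruction or auditory control signal could select the initial state of the second processor; the detector functions are empty stand-ins for the comparison against reference data described above.

#include <stdint.h>
#include <stddef.h>

/* Illustrative initial states the first processor may start the second
 * processor into; the set of states is an assumption for this example. */
typedef enum { STATE_NONE, STATE_WEB_SEARCH_UI, STATE_EMERGENCY_GUIDE } wake_state_t;

/* Hypothetical detectors; real implementations would compare the microphone
 * data against reference data or extracted features. */
static int detect_spoken_search_keyword(const int16_t *pcm, size_t n) { (void)pcm; (void)n; return 0; }
static int detect_fire_alarm_tone(const int16_t *pcm, size_t n)       { (void)pcm; (void)n; return 0; }

/* Run on the first (less capable) processor: decide whether, and into which
 * state, to wake the second (more capable) processor. */
wake_state_t decide_wake_state(const int16_t *pcm, size_t n_samples)
{
    if (detect_fire_alarm_tone(pcm, n_samples))
        return STATE_EMERGENCY_GUIDE;
    if (detect_spoken_search_keyword(pcm, n_samples))
        return STATE_WEB_SEARCH_UI;
    return STATE_NONE;   /* keep the second processor hibernated */
}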
In cases where a microphone is comprised in the apparatus, the microphone may in particular be enclosed inside a waterproof casing. While such a casing may prevent high-quality microphone data from being generated, it may allow microphone data to be generated that is of sufficient quality for the first processor to determine whether the spoken instruction and/or auditory control signal is present.
In some embodiments, the first processor is configured to process a notification that arrives in the apparatus, and to decide whether the second processor is needed to handle the notification. The notification may relate to a multimedia message or incoming video call, for example. The notification may relate to a software update presented to the apparatus, wherein the first processor may cause the second processor to leave the hibernating state to handle the notification. The first processor may select, in dependence of the notification, an initial state into which the second processor starts from the hibernated state. For a duration of a software update, the second processor may cause the first processor to transition into a hibernated state.
In general, an instruction from outside the apparatus may be received in the apparatus, and the first processor may responsively cause the second processor to leave the hibernation state. The instruction from outside the apparatus may comprise, for example, the notification, the spoken instruction or the auditory control signal.
Microcontroller 210 is communicatively coupled, in the illustrated example, with a buzzer 270, a universal serial bus, USB, interface 280, a pressure sensor 290, an acceleration sensor 2100, a gyroscope 2110, a magnetometer 2120, satellite positioning circuitry 2130, a Bluetooth interface 2140, user interface buttons 2150 and a touch interface 2160. Pressure sensor 290 may comprise an atmospheric pressure sensor, for example.
Microprocessor 220 is communicatively coupled with an optional cellular interface 240, a non-cellular interface 250 and a USB interface 260. Microprocessor 220 is further communicatively coupled, via microprocessor display interface 222, with display 230. Microcontroller 210 is likewise communicatively coupled, via microcontroller display interface 212, with display 230. Microprocessor display interface 222 may comprise communication circuitry comprised in microprocessor 220. Microcontroller display interface 212 may comprise communication circuitry comprised in microcontroller 210.
Microcontroller 210 may be configured to determine whether triggering events occur, wherein responsive to the triggering events microcontroller 210 may be configured to cause microprocessor 220 to transition into and out of the hibernating state described above. When microprocessor 220 is in the hibernating state, microcontroller 210 may control display 230 via microcontroller display interface 212. Microcontroller 210 may thus provide, when microprocessor 220 is hibernated, for example, a reduced experience to a user via display 230.
Responsive to a triggering event, microcontroller 210 may cause microprocessor 220 to transition from the hibernated state to an active state. For example, where a user indicates, for example via buttons 2150, that he wishes to originate a cellular communication connection, microcontroller 210 may cause microprocessor 220 to transition to an active state since cellular interface 240 is controllable by microprocessor 220, but, in the example of
In various embodiments, at least two elements illustrated in
In
Memory 2170 is used by microprocessor 220, and may be based on a DDR memory technology, such as for example DDR2 or DDR3, for example. Memory 2180 is used by microcontroller 210, and may be based on SRAM technology, for example.
Illustrated is device 300, which may comprise, for example, an embedded device 110 of
Device 300 may comprise memory 320. Memory 320 may comprise random-access memory and/or permanent memory. Memory 320 may comprise volatile and/or non-volatile memory. Memory 320 may comprise at least one RAM chip. Memory 320 may comprise magnetic, optical and/or holographic memory, for example. Memory 320 may be at least in part accessible to processor 310. Memory 320 may be means for storing information. Memory 320 may comprise computer instructions that processor 310 is configured to execute. When computer instructions configured to cause processor 310 to perform certain actions are stored in memory 320, and device 300 overall is configured to run under the direction of processor 310 using computer instructions from memory 320, processor 310 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 320 may be at least in part comprised in processor 310. Memory 320 may be at least in part external to device 300 but accessible to device 300.
Device 300 may comprise a transmitter 330. Device 300 may comprise a receiver 340. Transmitter 330 and receiver 340 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard. Transmitter 330 may comprise more than one transmitter. Receiver 340 may comprise more than one receiver. Transmitter 330 and/or receiver 340 may be configured to operate in accordance with global system for mobile communication, GSM, wideband code division multiple access, WCDMA, long term evolution, LTE, IS-95, wireless local area network, WLAN, Ethernet and/or worldwide interoperability for microwave access, WiMAX, standards, for example. Transmitter 330 and/or receiver 340 may be controllable via cellular interface 240, non-cellular interface 250 and/or USB interface 280 of
Device 300 may comprise a near-field communication, NFC, transceiver 350. NFC transceiver 350 may support at least one NFC technology, such as NFC, Bluetooth, Wibree or similar technologies.
Device 300 may comprise user interface, UI, 360. UI 360 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 300 to vibrate, a speaker and a microphone. User input to UI 360 may be based on patterns, such as, for example, where a user shakes device 300 to initiate actions via UI 360. A user may be able to operate device 300 via UI 360, for example to accept incoming telephone calls, to originate telephone calls or video calls, to browse the Internet, to manage digital files stored in memory 320 or on a cloud accessible via transmitter 330 and receiver 340, or via NFC transceiver 350, and/or to play games. UI 360 may comprise, for example, buttons 2150 and display 230 of
Device 300 may comprise or be arranged to accept a user identity module 370. User identity module 370 may comprise, for example, a subscriber identity module, SIM, card installable in device 300. A user identity module 370 may comprise information identifying a subscription of a user of device 300. A user identity module 370 may comprise cryptographic information usable to verify the identity of a user of device 300 and/or to facilitate encryption of communicated information and billing of the user of device 300 for communication effected via device 300.
Processor 310 may be furnished with a transmitter arranged to output information from processor 310, via electrical leads internal to device 300, to other devices comprised in device 300. Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 320 for storage therein. Alternatively to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise processor 310 may comprise a receiver arranged to receive information in processor 310, via electrical leads internal to device 300, from other devices comprised in device 300. Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 340 for processing in processor 310. Alternatively to a serial bus, the receiver may comprise a parallel bus receiver.
Device 300 may comprise further devices not illustrated in
Processor 310, memory 320, transmitter 330, receiver 340, NFC transceiver 350, UI 360 and/or user identity module 370 may be interconnected by electrical leads internal to device 300 in a multitude of different ways. For example, each of the aforementioned devices may be separately connected to a master bus internal to device 300, to allow for the devices to exchange information. However, as the skilled person will appreciate, this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the present invention.
In phase 410, processing unit 2, which may comprise a processing core, controls the display. For example, processing unit 2 may run an application and provide to the display instructions to display information reflective of the state of the application.
In phase 420, processing unit 1 determines that a triggering event occurs, the triggering event being associated with a transition of processing unit 2 from an active state to a hibernated state. Processing unit 1 may determine an occurrence of a triggering event by receiving from processing unit 2 an indication that a task performed by processing unit 2 has been completed, for example. As discussed above, the hibernating state may comprise that a clock frequency of processing unit 2 is set to zero. Responsive to the determination of phase 420, processing unit 1 assumes control of the display in phase 430, and causes processing unit 2 to transition to the hibernating state in phase 440. Subsequently, in phase 450, processing unit 2 is in the hibernated state. When processing unit 2 is in the hibernated state, battery resources of the device may be depleted at a reduced rate. In some embodiments, phase 430 may start at the same time as phase 440 occurs, or phase 440 may take place before phase 430 starts.
In phase 460, a user interacts with the user interface UI in such a way that processing unit 1 determines a triggering event to transition processing unit 2 from the hibernated state to an active state. For example, the user may trigger a web browser application that requires a connectivity capability that only processing unit 2 can provide. Responsively, in phase 470 processing unit 1 causes processing unit 2 to wake up from the hibernating state. As a response, processing unit 2 may read a state from a memory and wake up to this state, and assume control of the display, which is illustrated as phase 480.
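The phases above may be summarised as a simple event handler running on processing unit 1; the function names standing in for the inter-processing-unit interface and the display interfaces are illustrative.

#include <stdio.h>

typedef enum { EVT_NONE, EVT_PU2_TASK_DONE, EVT_UI_NEEDS_PU2 } trigger_t;

/* Stand-ins for the inter-processing-unit interface and the two display
 * interfaces; the names are assumptions for this example. */
static void pu2_hibernate(void)    { puts("PU2 -> hibernated (clock 0)"); }
static void pu2_wake(void)         { puts("PU2 -> active (context restored)"); }
static void pu1_take_display(void) { puts("display controlled by PU1"); }
static void pu2_take_display(void) { puts("display controlled by PU2"); }

/* Phases 420-480: processing unit 1 reacts to triggering events. */
static void pu1_handle_event(trigger_t evt)
{
    switch (evt) {
    case EVT_PU2_TASK_DONE:   /* phase 420 */
        pu1_take_display();   /* phase 430 */
        pu2_hibernate();      /* phases 440-450 */
        break;
    case EVT_UI_NEEDS_PU2:    /* phase 460 */
        pu2_wake();           /* phase 470 */
        pu2_take_display();   /* phase 480 */
        break;
    default:
        break;
    }
}

int main(void)
{
    pu1_handle_event(EVT_PU2_TASK_DONE);
    pu1_handle_event(EVT_UI_NEEDS_PU2);
    return 0;
}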
Phase 510 comprises generating, by a first processing core, first control signals. Phase 520 comprises controlling a display by providing the first control signals to the display via a first display interface. Phase 530 comprises generating, by a second processing core, second control signals. Phase 540 comprises controlling the display by providing the second control signals to the display via a second display interface. Finally, phase 550 comprises causing the second processing core to enter and leave a hibernation state based at least partly on a determination, by the first processing core, concerning an instruction from outside the apparatus.
PU1 corresponds to processing unit 1, for example, a less capable processing unit. PU2 corresponds to processing unit 2, for example, a more capable processing unit. These units may be similar to those discussed in connection with
Starting from the initial power-off state, first PU1 is powered up, indicated as a “1” in the state of PU1, while PU2 remains in an off state, denoted by zero. Thus the compound state is “10”, corresponding to a case where PU1 is active and PU2 is not. In this state, the device may offer a reduced experience to a user and consume relatively little current from battery reserves.
In addition to, or alternatively to, a power-off state PU1 and/or PU2 may have an intermediate low-power state from which it may be transitioned to an active state faster than from a complete power-off state. For example, a processing unit may be set to such an intermediate low-power state before being set to a power-off state. In case the processing unit is needed soon afterward, it may be caused to transition back to the power-up state. If no need for the processing unit is identified within a preconfigured time, the processing unit may be caused to transition from the intermediate low-power state to a power-off state.
Arrow 610 denotes a transition from state “10” to state “11”, in other words, a transition where PU2 is transitioned from the hibernated state to an active state, for example, a state where its clock frequency is non-zero. PU1 may cause the transition denoted by arrow 610 to occur, for example, responsive to a triggering event. In state “11”, the device may be able to offer a richer experience, at the cost of faster battery power consumption.
Arrow 620 denotes a transition from state “11” to state “10”, in other words, a transition where PU2 is transitioned from an active state to the hibernated state. PU1 may cause the transition denoted by arrow 620 to occur, for example, responsive to a triggering event.
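A minimal sketch of the compound states and the transitions denoted by arrows 610 and 620 is given below.

#include <stdio.h>

/* Compound state of the two processing units as two digits, PU1 then PU2:
 * "10" means PU1 active and PU2 hibernated, "11" means both active. */
typedef struct { int pu1_on; int pu2_on; } pu_state_t;

static void print_state(pu_state_t s)
{
    printf("compound state: %d%d\n", s.pu1_on, s.pu2_on);
}

int main(void)
{
    pu_state_t s = {0, 0};          /* initial power-off state "00"        */
    s.pu1_on = 1;  print_state(s);  /* power up PU1: "10", reduced mode    */
    s.pu2_on = 1;  print_state(s);  /* arrow 610: wake PU2, richer mode    */
    s.pu2_on = 0;  print_state(s);  /* arrow 620: hibernate PU2 again      */
    return 0;
}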
According to some embodiments, the user may be presented with a selection of updated heatmaps created for different sports activities in locations outside, but adjacent to, said present location. Thus the display 700 may show a map 720 with a hill in a neighbouring town, county or borough, for example. The rule of what is within the present location of the device 110 and what is in an adjacent location may be set by the boundaries between such areas, if the positioning system used contains such data, or simply by a radius from the present location, e.g. 10 km.
Some activities preferred by the user, such as cycling for example, may involve moving long distances and/or for a lengthy time. In the planning of such activities it may be beneficial to have heatmaps also covering locations nearby the present one.
A start point 730 is illustrated in the user interface, as is a route 740, which is indicated with a dashed line. In this example, the route may be traversed twice to obtain the physical exercise effect the user wants. The route proceeds along a relatively constant elevation around the hill, and when traversed twice provides an opportunity to interrupt the activity session halfway through, as the user passes start point 730. To interrupt the session, the user can simply stop at start point 730 instead of beginning a second lap along the route. In this example the area of map 720 may be indicated in the thematic map database as being associated with past activity sessions of a corresponding, or indeed same, activity type as the session the user selects. The route may be determined, in part, based on mapping information obtained from a mapping service, such as a proprietary service, HERE maps or Google maps, for example. Elevation information may be obtained from the same, or similar, service.
The user may be presented with information concerning route options, for example, for the first option, an estimated energy consumption associated with an activity session along the route defined by the first option, and likewise for the second option. The user may, explicitly or implicitly, select one of the presented options, and along the route deviate therefrom to use a different set of route segments. For example, a user setting out along the first option may decide to shorten the activity session by taking segments 750e and 750d back to the start point 730. Alternatively, the user may decide to lengthen the session by taking, in the first option, segment 750f instead of segment 750b.
In some embodiments, information is presented separately concerning route segments, to enable the user to design a route with greater precision. For example, an energy consumption associated with segment 750a, when used as a route segment in an activity session of a given type, may be presented. Likewise, other physiological effects, such as EPOC or oxygen consumption, may be presented in addition to, or alternatively to, the energy consumption.
Phase 810 comprises determining a predicted user activity type based at least partly on a thematic map database and a current location of an apparatus. Phase 820 comprises presenting, by the apparatus, the predicted user activity type as a suggested activity type to a first user. Finally, phase 830 comprises, responsive to the first user approving the suggested activity type, initiating an activity session of the suggested activity type.
Phase 900 comprises determining the present location of the apparatus. Phase 910 comprises the action of transmitting from said apparatus a query to a thematic map database server. The query may comprise an indication of the current location of the apparatus. In phase 920 the device 110 is updated with thematic maps related to the location by downloading thematic map data and storing the thematic map data in a memory of the device. In phase 930 the user is presented in a first display mode with a selection of local thematic maps as suggested activity types. The thematic maps may be selected to be downloaded based on at least one of the following criteria: a pre-recorded user preference, user activity history, intensity of activities in said location, special activities in said location, time of the day, time of the year, or a second location adjacent to said present location. Finally, in phase 940 and in response to the user approving a suggested activity type, an activity session is initiated and displayed in a second display mode.
In
During normal operation, when thematic maps of suggested activities, which may be downloaded from the server 1070 through a communication interface 1022 of the first processor 1020, are presented to a user on the display 1010, the device 1000 assumes a first display mode controlled by the first processor 1020. The communication interface 1022 may correspond to any or several of the interfaces 240-260 of
The first processor 1020 initiates the selected activity and displays to the user, in the first display mode, performance-related information relating to the physical performance of the user, including sensor information relating to position, distance, speed, heart rate, etc. This first activity mode is active for a predetermined time, or ends when, for example, acceleration sensor information indicates that the user is in a steady performance mode, based on cadence, rhythmic movements, heart rate, etc.
The first processor 1020 may then produce a reduced version of the thematic map of a selected activity, or the reduced maps may be downloaded from the server 1070 on demand. The demand may be based on the type of the device, the preferences of the user and/or the location of the user, and the server provides the appropriate selection of activities for downloading.
The device 1000 may enter a first power-save mode by determining the last known context of the user and/or the performance. Having determined from the context what to display in a second display mode, the first processor 1020 may enter a hibernating mode and switch from the first display mode to the second display mode. In a two-processor embodiment, the second display mode may be controlled by the second processor 1030. In the second display mode, time and other information relating to said activity may be shown, such as the location of the user provided by a GPS sensor. A “reduced” map here means a reduced version of a thematic map, which may for example involve one or several of the following: fewer or no colours, a lower display resolution, slower display updates, reduced content, etc.
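By way of illustration only, the following C sketch shows one hypothetical way of producing a reduced map image by dropping colour information and halving the resolution; the RGB888-in, 8-bit-grey-out buffer layout is an assumption for the example, not a prescribed display format.

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Produce a reduced version of a thematic map tile: drop colour information
 * and halve the resolution. */
static void reduce_map(const uint8_t *rgb, size_t w, size_t h,
                       uint8_t *grey_out /* (w/2) x (h/2) bytes */)
{
    for (size_t y = 0; y + 1 < h; y += 2) {
        for (size_t x = 0; x + 1 < w; x += 2) {
            /* Average the 2x2 block and convert to luminance. */
            unsigned sum = 0;
            for (size_t dy = 0; dy < 2; dy++) {
                for (size_t dx = 0; dx < 2; dx++) {
                    const uint8_t *p = rgb + 3 * ((y + dy) * w + (x + dx));
                    /* Integer approximation of 0.30 R + 0.59 G + 0.11 B. */
                    sum += (77u * p[0] + 151u * p[1] + 28u * p[2]) >> 8;
                }
            }
            grey_out[(y / 2) * (w / 2) + (x / 2)] = (uint8_t)(sum / 4);
        }
    }
}

int main(void)
{
    uint8_t tile[3 * 4 * 4] = { 0 };   /* 4x4 dummy tile, all black */
    uint8_t reduced[2 * 2];
    reduce_map(tile, 4, 4, reduced);
    printf("reduced[0] = %u\n", (unsigned)reduced[0]);
    return 0;
}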
In some embodiments, where two processors are involved, first and second power-save modes may be used. The preferred sequence from a power-saving point of view would be to first hibernate the first processing core, which consumes more power. This may be controlled by the low-power second processor, for example when there is nothing left for the first processor to execute. In some alternative embodiments, where only one processor is used, only one power-save mode may be used. In both cases, the final power-save mode involves a complete or almost complete shutdown of the processing cores in the device, while a clock unit, such as a Real Time Clock (RTC) unit 1060, is used to keep track of the time. When a motion sensor or a press of a button indicates that the user is looking at the display, the RTC unit provides a time signal to show time-related context on the display, such as the time and a reduced thematic map.
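The power-save sequencing described above could be modelled, for illustration, as a simple state machine; the following C sketch uses hypothetical state names and only reflects the ordering discussed here (first core hibernated first, then the second core, with the RTC keeping time).

#include <stdio.h>

/* Hypothetical power states for a two-core device. */
enum power_state {
    FULL_OPERATION,     /* first display mode, first (high-power) core active */
    FIRST_POWER_SAVE,   /* second display mode, only second (low-power) core active */
    SECOND_POWER_SAVE,  /* both cores hibernating, RTC unit keeps the time */
};

static enum power_state next_power_save(enum power_state s)
{
    switch (s) {
    case FULL_OPERATION:   return FIRST_POWER_SAVE;   /* hibernate first core */
    case FIRST_POWER_SAVE: return SECOND_POWER_SAVE;  /* hibernate second core */
    default:               return SECOND_POWER_SAVE;  /* already at the deepest level */
    }
}

int main(void)
{
    enum power_state s = FULL_OPERATION;
    s = next_power_save(s);   /* nothing left for the first core to execute */
    s = next_power_save(s);   /* user steady and not looking at the display  */
    printf("state = %d (SECOND_POWER_SAVE)\n", (int)s);
    return 0;
}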
Reduced thematic maps may be downloaded from the server 1070, or they may be produced by the first processor 1020 and stored in its memory 1021. In a two-processing core embodiment, the image(s) of the reduced thematic map may be copied (arrow C in
As the performance of the user continues on a steady path and there is no indication of the user looking at the display, the device 1000 may enter a second power-save mode by switching off the second display mode and putting the second processing core 1030 in a hibernating mode.
In a second power-save mode, the only process running in the device may be the real time clock in the RTC unit 1060. The RTC unit is preferably a separate unit, connected for example to a battery of the device. The processing cores may then be completely shut off. An RTC unit may also be integrated in either one of the processors 1020 or 1030, or in both, but would then require at least some hardware around the processor in question to be powered up, with a power consumption of a few microamperes. Which RTC alternative to use is a matter of design choice.
In a one-processor embodiment, the transfer of maps internally in the device is of course not needed; otherwise, a second display mode may be used in the same fashion as with two processors, and the reduced thematic map is then shown from the memory 1021 on the display 1010. The single processor may thus have three levels of operation and power consumption: full operation, reduced operation and hibernation (with or without an internal RTC clock). During the performance, an acceleration sensor 1040 may continuously sense the movement of the device 1000. In some embodiments, the processor may be left in a reduced operation mode if the activity and/or context are deemed to require a fast wakeup of the core; wakeup from a state of hibernation will take longer. Various power-saving modes may also be entered when the device 1000 deems that the user is sleeping, for example. Indeed, various sensor inputs and their combinations may be used for determining the context of the user and for selecting an appropriate time to enter a particular power-save mode. Such input may include the time (e.g. night-time), acceleration sensor input, ambient light, position signals from a GPS sensor, etc.
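Purely as an illustrative sketch, the following C fragment maps hypothetical context inputs onto the three operating levels of the single-processor embodiment; the particular rules, input names and their combination are examples only.

#include <stdbool.h>
#include <stdio.h>

/* Operating levels of a single-processor embodiment. */
enum op_level { FULL_OPERATION, REDUCED_OPERATION, HIBERNATION };

/* Hypothetical context inputs used to select a power-save level. */
struct context {
    bool night_time;          /* derived from the RTC time */
    bool device_moving;       /* from acceleration sensor 1040 */
    bool ambient_dark;        /* from an ambient light sensor */
    bool fast_wakeup_needed;  /* activity/context requires a quick resume */
};

static enum op_level select_level(const struct context *c)
{
    if (c->device_moving)
        return FULL_OPERATION;        /* user is active */
    if (c->fast_wakeup_needed)
        return REDUCED_OPERATION;     /* stay warm for a quick resume */
    if (c->night_time || c->ambient_dark)
        return HIBERNATION;           /* user is probably sleeping */
    return REDUCED_OPERATION;
}

int main(void)
{
    struct context c = { .night_time = true, .device_moving = false,
                         .ambient_dark = true, .fast_wakeup_needed = false };
    printf("selected level = %d (HIBERNATION)\n", (int)select_level(&c));
    return 0;
}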
Reversing the power-saving sequence may be initiated simply by a user pressing a button, or it may be automatic. In some embodiments, for example, when a vertical move is sensed by a smart acceleration sensor 1040, the corresponding sensor signal may have pre-recorded threshold values that, when exceeded, are interpreted as a raising of the arm in an attempt to read the display 1010. A power controller 1050 then powers up the high-power processor 1020, or the low-power processor 1030, depending on the embodiment (one or two processors) and the previous context or display mode of the device 1000. In order to speed up the wakeup of hibernating processing cores, their power supplies (a switched-mode power supply, SMPS, for example) may be left on. Another alternative embodiment is to switch the SMPS off and connect a low-dropout (LDO) regulator, in parallel with the SMPS, as a fast power source for the hibernating processing core.
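The arm-raise detection could, for example, be realised as a simple threshold comparison on the vertical acceleration signal; in the following C sketch the threshold value and the sample data are invented for illustration only.

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical arm-raise detector: a vertical acceleration exceeding a
 * pre-recorded threshold is interpreted as the user raising the arm to read
 * the display, which triggers powering up the hibernating processor. */
#define RAISE_THRESHOLD_MG  600   /* threshold in milli-g, illustrative value */

static bool arm_raise_detected(int vertical_accel_mg)
{
    return vertical_accel_mg > RAISE_THRESHOLD_MG;
}

int main(void)
{
    int samples_mg[] = { 120, 90, 150, 740, 300 };   /* simulated sensor samples */
    for (int i = 0; i < 5; i++) {
        if (arm_raise_detected(samples_mg[i])) {
            /* A power controller would now power up the appropriate core,
             * e.g. via an LDO regulator left connected for a fast wakeup. */
            printf("sample %d: wakeup triggered\n", i);
        }
    }
    return 0;
}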
In some embodiments, the RTC unit may also start a processor. For example, if a relatively long time has passed since the user last made an attempt to look at the display, the context is difficult to predict and may have changed. The user would then no longer be interested in looking at a reduced thematic map that probably no longer shows the correct location and/or activity of the user. Instead of simply fetching for display a stored thematic map relating to a wrong context, the time delay since the last display action may be used as an indicator that the context has probably changed. As the RTC unit reveals this time delay, the information may be used, for example, to activate a GPS sensor in order to check the location, and to start at least a low-power processor to update the context of the user, including fetching a thematic map which matches the current location of the user.
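As an illustrative sketch only, the following C fragment shows a hypothetical staleness check based on the RTC time delay; the 30-minute threshold is an assumed value, not one specified herein.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* If too much time has elapsed since the user last looked at the display, the
 * stored reduced map is likely stale: activate the GPS and a low-power core to
 * refresh the context instead of showing the old map. */
#define CONTEXT_STALE_SECONDS  (30u * 60u)   /* illustrative threshold */

static bool context_probably_changed(uint32_t rtc_now_s, uint32_t last_display_s)
{
    return (rtc_now_s - last_display_s) > CONTEXT_STALE_SECONDS;
}

int main(void)
{
    uint32_t last_display_s = 1000;
    uint32_t rtc_now_s = 1000 + 45 * 60;   /* 45 minutes later */
    if (context_probably_changed(rtc_now_s, last_display_s))
        printf("refresh: activate GPS, wake low-power core, fetch matching map\n");
    else
        printf("show stored reduced thematic map\n");
    return 0;
}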
The context-dependent images may be fetched from a memory by using an LDO regulator as the power source for a hibernating processor, which provides a fast wakeup. After wakeup, transfer of stored images may take place directly from an internal memory of the processor, or from an external memory unit, to the display.
Reference is now made to
In step 1110, the present location of the apparatus is determined and a query is transmitted from the apparatus to a thematic map database server for available activity or thematic maps at the current location of the apparatus.
In step 1120, the apparatus is updated with thematic maps related to the current location from the thematic map database server, by downloading thematic map data and storing the thematic map data in said at least one memory of the apparatus.
In step 1130, at least one downloaded thematic map is presented to a user of the apparatus, as a selection of local heatmaps representing suggested activities, using a first display mode presented by a high-power first processing core. The activity session may be selected based on at least one of the following criteria: user selection input, a pre-recorded user preference, user activity history, intensity of activities in said location, special activities in said location, time of the day, time of the year, or a second location adjacent to said present location.
In step 1140, a selected activity is initiated and displayed to the user in the first display mode, containing performance-related information relating to the physical performance of the user in the activity.
Now, in step 1150, a first power-save mode is entered by putting the first processing core in a hibernating mode and by switching from the first display mode to a second display mode. The second display mode may display to the user, using a second, low-power processing core, the time and static information relating to said activity. In some embodiments, at predefined time intervals, a pre-calculated static thematic map relating to the activity and the current time is shown. As explained in connection with
Finally, in step 1160, a second power-save mode is entered by switching off the second display mode and also putting the low-power second processing core in a hibernating mode. A third display mode is entered, where a Real Time Clock (RTC) unit is used to keep the time. Pre-stored thematic maps may be shown when requested by a user input or a sensor request, showing the predicted location of the user on the map at that time.
The apparatus may now go stepwise back to the second and/or first display modes by activating the second and/or first processing cores from hibernation. This may be triggered based on at least one of the following criteria: user selection input, or acceleration data input from an acceleration sensor in said apparatus indicating a display-reading posture of said user.
It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment.
As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.
Furthermore, described features, structures, or characteristics may be combined in any suitable or technically feasible manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
At least some embodiments of the present invention find industrial application in enhancing device usability and/or personal safety.
This application is a continuation-in-part of U.S. patent application Ser. No. 16/377,267 filed on Apr. 8, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 15/365,972 filed on Dec. 1, 2016, which claims priority from both the Finnish patent application No. 20155906 filed on Dec. 1, 2015 and the British patent application No. 1521192.3 filed on Dec. 1, 2015, and U.S. patent application Ser. No. 15/784,234 filed on Oct. 16, 2017, which claims priority from both the Finnish patent application No. 20165790 filed on Oct. 17, 2016 and the British patent application No. 1617575.4 filed on Oct. 17, 2016. The subject matter of these applications is incorporated herein by reference in its entirety.