1. Field of the Technology
The present disclosure relates generally to electronic devices, such as mobile communication devices operating in wireless communication networks, having media player/recorder modules.
2. Description of the Related Art
When a media file is played on a computer or an electronic device, there is metadata associated with the media file. Traditionally, the metadata associated with a media file is descriptive of the file's data as a whole. For audio media files, such as songs, the metadata may include the artist's name, information about the artist, the album name, the track name, a track number, etc. For video files, the metadata may include actor names, directors, producers, movie trivia on particular scenes/actors/producers/directors, etc. The metadata is typically integrated within the media file.
When the media file is played, the metadata may be accessed by or rendered through a media player. Traditionally, the metadata that is available with the media file is static in nature and provides only a single definition of the contents of the media file, as the metadata is generally provided by the content provider before dissemination to the public. The metadata is also not dynamically presented during the playing of the media file, but can be accessed only before or after play. In some cases, the metadata provided with the media file may include sub-title tracks as well as accompanying video. This metadata is also predefined.
Accordingly, there is a need for methods and apparatus that overcome the limitations and deficiencies of the prior art.
Embodiments of the present invention will now be described by way of example with reference to the attached figures.
The present disclosure relates to the integration of time-based metadata with media files. In general, time-based metadata includes tagged timestamps associated with the contents of a media file. A plurality of tags associated with the time-based metadata may be defined in the media file. When the contents of the media file are rendered by a media player, the associated tags are identified and used to render the corresponding time-based metadata. More particularly, specific metadata is associated with specific content of the media file at a given timestamp of play of the media file. A tag is generated which contains the timestamp and a reference to the identified metadata. The tag is linked to the media file so that the media player identifies the tag during play of the media file and presents the identified metadata upon reaching the timestamp of play, along with the presentation of the content of the media file.
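By way of illustration only, the following Python sketch shows one possible in-memory representation of such time-based metadata tags linked to a media file. The names (TimedTag, MediaFile, tags_at) and the half-second lookup window are assumptions made for this example and are not part of the disclosure.

# Minimal sketch of time-based metadata tags linked to a media file.
# Names and the lookup window are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TimedTag:
    timestamp: float   # playback time (seconds) at which the metadata applies
    metadata_ref: str  # reference (e.g. a key or URL) to the associated metadata

@dataclass
class MediaFile:
    path: str
    tags: List[TimedTag] = field(default_factory=list)

    def tags_at(self, current_time: float, window: float = 0.5) -> List[TimedTag]:
        # Return tags whose timestamp falls within `window` seconds of the
        # current playback position, so a player can render their metadata.
        return [t for t in self.tags if abs(t.timestamp - current_time) <= window]

# Example: tag the 90-second mark with a reference to artist trivia.
clip = MediaFile("concert.mp4", [TimedTag(90.0, "meta://artist/trivia/42")])
print(clip.tags_at(90.2))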
Within this context, techniques for use in a mobile communication device for providing object indicators for video recording and/or playback have been devised. The mobile device includes a media recorder/player module and a sensor device. The media recorder module is used to record video data in a media file. The sensor device detects signals which indicate the presence of an object during a time period of recording. The mobile device receives object data comprising an identification of the object, and stores the object data in association with timestamp data which corresponds to the time period during which the object is present during the recording.
More particularly, the mobile device may identify a detected position of the object in the video, and store position data corresponding to the detected position of the object in association with the timestamp data. For example, the mobile device may receive via the sensor device visual identification data for visually identifying the object. The mobile device produces, based on the visual identification data, an image array of a representative image of the object. The mobile device may then detect from the video data an actual image of the object that matches the representative image of the object, the actual image being positioned at x-y coordinates in the video. The mobile device may then store the x-y coordinates in association with the timestamp, and/or display an object indicator at the x-y coordinates.
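By way of illustration only, the sketch below models one possible per-object metadata block of the kind summarized above, holding the object identification, the timestamp data for the period of presence, and the optional x-y position data. All class and field names are assumptions made for this example.

# Illustrative sketch (not the claimed implementation) of a per-object
# metadata block stored in association with a recorded media file.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectMetadataBlock:
    object_id: str                           # identification of the detected object
    object_name: str                         # human-readable name (alphanumeric text)
    info_url: Optional[str] = None           # optional address/link to a server with details
    start_timestamp: Optional[float] = None  # recording time (s) when presence began
    end_timestamp: Optional[float] = None    # recording time (s) when presence ended
    xy: Optional[Tuple[int, int]] = None     # detected x-y position of the object, if any

    def covers(self, t: float) -> bool:
        # True if time t falls within the object's stored period of presence.
        return (self.start_timestamp is not None
                and self.end_timestamp is not None
                and self.start_timestamp <= t <= self.end_timestamp)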
To illustrate an exemplary environment for practicing the techniques of the present disclosure, FIG. 1 shows a communication system 100 which includes a mobile device 102 that communicates through a wireless communication network 104.
Controller 106 interfaces with device display 112 to display received information, stored information, user inputs, and the like. Keyboard 114, which may be a telephone-type keypad or a full alphanumeric keyboard, is normally provided for entering data for storage in mobile device 102, information for transmission to the network, a telephone number to place a telephone call, commands to be executed on mobile device 102, and possibly other or different user inputs.
Mobile device 102 sends communication signals to and receives communication signals from network 104 over a wireless link via antenna 110. RF transceiver circuitry 108 performs functions similar to those of a base transceiver station (BTS) 134 and a base station controller (BSC) 136 (described later below), including for example modulation/demodulation and possibly encoding/decoding and encryption/decryption. It is contemplated that RF transceiver circuitry 108 may perform certain functions in addition to those performed by BTS 134/BSC 136. It will be apparent to those skilled in the art that RF transceiver circuitry 108 will be adapted to the particular wireless network or networks in which mobile device 102 is intended to operate.
BTS 134 and BSC 136 may be referred to as a base station subsystem (BSS) and part of the radio access network (RAN). BSC 136 is associated with a GSM/EDGE Radio Access Network (GERAN) system, but other network nodes such as a radio network controller (RNC) or an E-UTRAN Node B (eNB) may be employed for UMTS or LTE, respectively. Other networks not necessarily conforming to 3GPP standards, for example networks conforming to IEEE or IETF standards, may alternatively be utilized.
Mobile device 102 includes a battery interface 122 for receiving one or more rechargeable batteries 124. Battery 124 provides electrical power to electrical circuitry in mobile device 102, and battery interface 122 provides for a mechanical and electrical connection for battery 124. Battery interface 122 is coupled to a regulator 126 which regulates power to the device. Mobile device 102 may be a handheld portable communication device, which includes a housing (e.g. a plastic housing) which carries and contains the electrical components of mobile device 102 including battery 124.
Mobile device 102 operates using a Subscriber Identity Module (SIM) or Universal SIM (USIM) 120 which is connected to or inserted in mobile device 102 at a SIM or USIM interface 118. SIM/USIM 120 is one type of a conventional “smart card” used to identify an end user (or subscriber) of mobile device 102 and to personalize the device, among other things. By inserting SIM/USIM 120 into mobile device 102, an end user can have access to any and all of his/her subscribed services. SIM/USIM 120 generally includes a processor and memory for storing information. Since SIM/USIM 120 is coupled to SIM/USIM interface 118, it is coupled to controller 106 through communication lines 144. In order to identify the subscriber, SIM/USIM 120 contains some user parameters such as an International Mobile Subscriber Identity (IMSI). An advantage of using SIM/USIM 120 is that end users are not necessarily bound by any single physical mobile device. SIM/USIM 120 may store additional user information for the mobile device as well, including datebook (or calendar) information and recent call information.
Mobile device 102 may consist of a single unit, such as a data communication device, a cellular telephone, a multiple-function communication device with data and voice communication capabilities, a personal digital assistant (PDA) enabled for wireless communication, or a computer incorporating an internal modem. Preferably, as mentioned earlier, mobile device 102 is a handheld portable communication device which includes a housing (e.g. a plastic housing) which carries and contains the electrical components of mobile device 102. Alternatively, mobile device 102 may be a multiple-module unit comprising a plurality of separate components, including but in no way limited to a computer or other device connected to a wireless modem. In particular, for example, in the mobile device block diagram of FIG. 1, RF transceiver circuitry 108 and antenna 110 may be implemented as a separate radio modem unit connected to such a computer.
Mobile device 102 communicates in and through wireless communication network 104. Wireless communication network 104 may be a cellular telecommunications network. In the embodiment of FIG. 1, wireless communication network 104 is configured in accordance with Global System for Mobile Communications (GSM) and General Packet Radio Service (GPRS) technologies, and includes a base transceiver station (BTS) 134, a base station controller (BSC) 136, a Mobile Switching Center (MSC) 140, a Home Location Register (HLR) 132, a Serving GPRS Support Node (SGSN) 138, and a Gateway GPRS Support Node (GGSN) 128.
BTS 134 is a fixed transceiver station, and BTS 134 and BSC 136 may together be referred to as a base station subsystem (BSS). The BSS provides wireless network coverage for a particular coverage area commonly referred to as a “cell”. The BSS transmits communication signals to and receives communication signals from mobile devices within its cell via station 134. The BSS normally performs such functions as modulation and possibly encoding and/or encryption of signals to be transmitted to the mobile device in accordance with particular, usually predetermined, communication protocols and parameters, under control of its controller. The BSS similarly demodulates and possibly decodes and decrypts, if necessary, any communication signals received from mobile device 102 within its cell. Communication protocols and parameters may vary between different networks. For example, one network may employ a different modulation scheme and operate at different frequencies than other networks.
The wireless link shown in communication system 100 of FIG. 1 represents one or more different channels, typically different radio frequency (RF) channels, and associated protocols used between wireless network 104 and mobile device 102.
For all mobile devices 102 registered with a network operator, permanent data (such as the user profile of mobile device 102) as well as temporary data (such as the current location of mobile device 102) are stored in HLR 132. In case of a voice call to mobile device 102, HLR 132 is queried to determine the current location of mobile device 102. A Visitor Location Register (VLR) of MSC 140 is responsible for a group of location areas and stores the data of those mobile devices that are currently in its area of responsibility. This includes parts of the permanent mobile device data that have been transmitted from HLR 132 to the VLR for faster access. However, the VLR of MSC 140 may also assign and store local data, such as temporary identifications. Optionally, the VLR of MSC 140 can be enhanced for more efficient co-ordination of GPRS and non-GPRS services and functionality (e.g. paging for circuit-switched calls which can be performed more efficiently via SGSN 138, and combined GPRS and non-GPRS location updates).
Serving GPRS Support Node (SGSN) 138 is at the same hierarchical level as MSC 140 and keeps track of the individual locations of mobile devices. SGSN 138 also performs security functions and access control. Gateway GPRS Support Node (GGSN) 128 provides interworking with external packet-switched networks and is connected with SGSNs (such as SGSN 138) via an IP-based GPRS backbone network. SGSN 138 performs authentication and cipher setting procedures based on algorithms, keys, and criteria (e.g. as in existing GSM). In conventional operation, cell selection may be performed autonomously by mobile device 102 or by the transceiver equipment instructing mobile device 102 to select a particular cell. Mobile device 102 informs wireless network 104 when it reselects another cell or group of cells, known as a routing area.
In order to access GPRS services, mobile device 102 first makes its presence known to wireless network 104 by performing what is known as a GPRS “attach”. This operation establishes a logical link between mobile device 102 and SGSN 138 and makes mobile device 102 available to receive, for example, pages via SGSN, notifications of incoming GPRS data, or SMS messages over GPRS. In order to send and receive GPRS data, mobile device 102 assists in activating the packet data address that it wants to use. This operation makes mobile device 102 known to GGSN 128; interworking with external data networks can thereafter commence. User data may be transferred transparently between mobile device 102 and the external data networks using, for example, encapsulation and tunneling. Data packets are equipped with GPRS-specific protocol information and transferred between mobile device 102 and GGSN 128.
Mobile device 202 will normally incorporate a communication subsystem 211, which includes a receiver 212, a transmitter 214, and associated components, such as one or more (preferably embedded or internal) antenna elements 216 and 218, local oscillators (LOs) 213, and a processing module such as a digital signal processor (DSP) 220. Communication subsystem 211 is analogous to RF transceiver circuitry 108 and antenna 110 shown in FIG. 1.
Mobile device 202 may send and receive communication signals over the network after required network registration or activation procedures have been completed. Signals received by antenna 216 through the network are input to receiver 212, which may perform such common receiver functions as signal amplification, frequency down conversion, filtering, channel selection, and the like, and, in the example shown in FIG. 2, analog-to-digital (A/D) conversion.
Network access is associated with a subscriber or user of mobile device 202, and therefore mobile device 202 may utilize a Subscriber Identity Module or “SIM” or Universal SIM “USIM” card 262 to be inserted in a SIM/USIM interface 264 in order to operate in the network. SIM/USIM 262 includes those features described in relation to FIG. 1.
Mobile device 202 includes a microprocessor 238 (which is one implementation of controller 106 of FIG. 1) which controls the overall operation of mobile device 202.
Microprocessor 238, in addition to its operating system functions, preferably enables execution of software applications on mobile device 202. A predetermined set of applications which control basic device operations, including at least data and voice communication applications, will normally be installed on mobile device 202 during its manufacture. A preferred application that may be loaded onto mobile device 202 may be a personal information manager (PIM) application having the ability to organize and manage data items relating to the user such as, but not limited to, e-mail, calendar events, voice mails, appointments, and task items. Naturally, one or more memory stores are available on mobile device 202 and SIM/USIM 262 to facilitate storage of PIM data items and other information. The PIM application preferably has the ability to send and receive data items via the wireless network. In the present disclosure, PIM data items are seamlessly integrated, synchronized, and updated via the wireless network, with the mobile device user's corresponding data items stored and/or associated with a host computer system, thereby creating a mirrored host computer on mobile device 202 with respect to such items. This is especially advantageous where the host computer system is the mobile device user's office computer system. Additional applications may also be loaded onto mobile device 202 through the network, an auxiliary I/O subsystem 228, serial port 230, short-range communications subsystem 240, or any other suitable subsystem 242, and installed by a user in RAM 226 or preferably a non-volatile store (not shown) for execution by microprocessor 238. Such flexibility in application installation increases the functionality of mobile device 202 and may provide enhanced on-device functions, communication-related functions, or both. These applications will be described later herein.
In a data communication mode, a received signal such as a text message, an e-mail message, or web page download will be processed by communication subsystem 211 and input to microprocessor 238. Microprocessor 238 will preferably further process the signal for output to display 222 or alternatively to auxiliary I/O device 228. A user of mobile device 202 may also compose data items, such as e-mail messages, for example, using keyboard 232 in conjunction with display 222 and possibly auxiliary I/O device 228. Keyboard 232 is preferably a complete alphanumeric keyboard and/or telephone-type keypad. These composed items may be transmitted over a communication network through communication subsystem 211. For voice communications, the overall operation of mobile device 202 is substantially similar, except that the received signals would be output to speaker 234 and signals for transmission would be generated by microphone 236. Alternative voice or audio I/O subsystems, such as a voice message recording subsystem, may also be implemented on mobile device 202. Although voice or audio signal output is preferably accomplished primarily through speaker 234, display 222 may also be used to provide an indication of the identity of a calling party, duration of a voice call, or other voice call related information, as some examples.
Serial port 230 in FIG. 2 may be used to connect mobile device 202 to an external device, such as a user's desktop computer, for synchronization of data and for the information or software downloads mentioned above.
Note that, in the technique of FIG. 3 described below, the steps may be performed by one or more processors of the mobile device executing instructions stored in a memory of the mobile device.
Beginning at a start block 302 of FIG. 3, the mobile device operates to record video data of a video recording in a media file using its media recorder module.
The mobile device also enables a sensor device in the mobile device (step 306 of FIG. 3).
The sensor device may be a specifically manufactured sensor device for the type of object identification and description broadcasting described herein. Further, any computational platform capable of wireless communication with a suitable software/firmware client embedded therein may be converted into such a sensor device for facilitating the type of object identification and description broadcasting described herein.
Note that the object itself may be associated with a corresponding object transceiver, transmitter, or component. The sensor device may therefore receive signals directly from such an object component, or indirectly from the object component via a wireless network (e.g. a cellular network, a WLAN or WiFi network, etc.). In one embodiment, the sensor device receives from the wireless network signals which indicate the presence of one or more such objects within view and/or in a nearby geographic location.
These signals which indicate the presence of the object may be detected during a time period of the recording (step 308 of FIG. 3). The processor then identifies whether the detection of the signals is a new detection (step 310 of FIG. 3), i.e. whether it is the first time that signals indicating the presence of the object have been detected for the current recording session. If the detection is new, the processor creates a metadata block for the object.
The processor receives object data associated with the object, and stores the object data in the metadata block for the object (step 314 of FIG. 3).
The object data includes at least object identification which identifies the object. The object identification may identify the object by its name, with alphanumeric characters or text, for example. The object data may also include an address or link to a server, where the server is configured to communicate information regarding the object to the mobile device for display in response to a request from the mobile device (e.g. by “clicking” on the address or the link).
The processor also receives timestamp data corresponding to a current time of recording of the video data. The processor initializes or sets a start timestamp in the metadata block for the object to be the current time of recording (step 316 of FIG. 3).
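A minimal sketch of the new-detection branch described above is given below, under the assumption that metadata blocks are kept in a dictionary keyed by an object identifier; the dictionary layout and function name are illustrative only.

# Sketch of the new-detection branch: on the first detection of an object's
# presence signals, create a metadata block, store the received object data,
# and set the start timestamp to the current time of recording.
def on_new_detection(metadata_blocks: dict, object_data: dict, current_time: float) -> None:
    object_id = object_data["id"]
    metadata_blocks[object_id] = {
        "object_data": object_data,       # e.g. {"id": ..., "name": ..., "info_url": ...}
        "start_timestamp": current_time,  # current time of recording (seconds)
        "end_timestamp": None,            # filled in when the object is no longer detected
    }

# Example usage with hypothetical object data:
blocks = {}
on_new_detection(blocks,
                 {"id": "obj-7", "name": "Alice", "info_url": "http://example.com/alice"},
                 current_time=12.4)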
If the detection of signals in step 308 is identified as not being new at step 310 (i.e. it is not the first time that signals indicating the presence of the object have been detected for the current recording session), then the processor may optionally receive updated object data associated with the object, and store this updated object data in the metadata block for the object (step 318 of FIG. 3).
The processor further identifies whether a signal from the object was detected in N previous iterations (N=1, 2, 3, . . . ) (step 320 of FIG. 3).
The processor then causes an object indicator and/or the object identification to be displayed in the display (step 325 of FIG. 3).
Alternatively, if a signal loss is detected (or little or no signal is detected) in step 308, then the processor identifies whether a signal from the object was detected in N previous iterations (N=1, 2, 3, . . . ) (step 320 of FIG. 3). If so, then the processor sets or updates an end timestamp in the metadata block for the object to be the current time of recording, marking the end of the time period during which the object was present during the recording.
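The following sketch illustrates one way the check over the N previous iterations and the end-timestamp handling might be combined, so that a momentary dropout of the signal ends the object's presence period only when the object had in fact been detected recently. The history length and data layout are assumptions made for this example.

# Illustrative sketch of the signal-loss handling: an end timestamp is written
# only when the signal is absent now but was detected in at least one of the
# N previous iterations.
from collections import deque

def update_presence(block: dict, detected_now: bool, history: deque, current_time: float) -> None:
    # history holds the detection results (True/False) of the N previous iterations.
    if not detected_now and any(history):
        # Signal loss after recent presence: mark the end of the presence period.
        block["end_timestamp"] = current_time
    history.append(detected_now)

# Example with N = 3 previous iterations:
block = {"start_timestamp": 12.4, "end_timestamp": None}
history = deque([True, True, True], maxlen=3)
update_presence(block, detected_now=False, history=history, current_time=18.9)
print(block["end_timestamp"])  # 18.9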
Beginning at a start block 402 of FIG. 4, the mobile device begins playback of the video data of the media file in the display using its media player module.
During video playback, the processor operates to compare a current timestamp of the video playback with timestamp data in the metadata blocks (step 406 of FIG. 4). When the current timestamp of the video playback falls within the time period defined by the start timestamp and the end timestamp of one of the metadata blocks, the processor reads the object data from that metadata block and causes an object indicator and/or the object identification to be displayed in the display along with the playback of the video data.
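By way of example, the sketch below performs the playback-time comparison described above against metadata blocks laid out as in the earlier sketches. The draw_indicator callback stands in for whatever display routine the media player provides and is purely an assumption.

# Sketch of the playback-time lookup: find metadata blocks whose stored time
# period covers the current playback position, then display an object
# indicator (and the object identification) for each match.
from typing import Callable, Dict, Optional, Tuple

def render_indicators(metadata_blocks: Dict[str, dict],
                      current_time: float,
                      draw_indicator: Callable[[str, Optional[Tuple[int, int]]], None]) -> None:
    for block in metadata_blocks.values():
        start = block.get("start_timestamp")
        end = block.get("end_timestamp")
        if start is not None and end is not None and start <= current_time <= end:
            draw_indicator(block["object_data"]["name"], block.get("xy"))

# Example usage with a trivial "draw" callback that just prints:
blocks = {"obj-7": {"object_data": {"name": "Alice"},
                    "start_timestamp": 12.4, "end_timestamp": 18.9, "xy": (320, 180)}}
render_indicators(blocks, current_time=15.0,
                  draw_indicator=lambda name, xy: print(f"indicator for {name} at {xy}"))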
Note that, in the technique of FIG. 5 described below, the steps may similarly be performed by one or more processors of the mobile device executing instructions stored in a memory of the mobile device.
Beginning at a start block 502 of FIG. 5, the mobile device records video data in a media file using its media recorder module.
The mobile device also enables a sensor device (step 506 of FIG. 5). Via the sensor device, the mobile device receives signals which include visual identification data for visually identifying an object (step 508 of FIG. 5).
The sensor device may be a cellular transceiver, a WLAN or WiFi transceiver (e.g. IEEE 802.11 compliant, IEEE 802.16 compliant, etc.), a BLUETOOTH transceiver, a GPS receiver, or an RF ID detector, as examples, of the mobile device. The sensor device may be a specifically manufactured sensor device for the type of object identification and description broadcasting described herein. Further, any computational platform capable of wireless communication with a suitable software/firmware client embedded therein may be converted into such a sensor device for facilitating the type of object identification and description broadcasting described herein. Note again that the object itself may be associated with a corresponding object transceiver, transmitter, or component. The sensor device may therefore receive signals directly from such an object component, or indirectly from the object component via a wireless network (e.g. a cellular network, a WLAN or WiFi network, etc.). In one embodiment, the sensor device receives from the wireless network signals which indicate the presence of one or more such objects within view and/or in a nearby geographic location.
Next, the processor produces or obtains, based on the visual identification data, an image array corresponding to a representative image of the object (step 510 of FIG. 5).
The processor further produces or obtains an image array corresponding to the actual image from the video data being recorded (step 512 of FIG. 5). The processor then compares the image array of the representative image of the object with the image array obtained from the video data, to detect whether an actual image of the object that matches the representative image is present in the video data (step 516 of FIG. 5), the actual image being positioned at x-y coordinates in the video which are identified by the processor.
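One conventional way to detect whether an actual image matching the representative image appears in a frame of the video data, and at which x-y coordinates, is normalized template matching. The sketch below uses OpenCV purely as an example of such matching; the library choice and the match threshold are assumptions, not requirements of the disclosure.

# Illustrative sketch: detect whether the representative image of the object
# appears in a video frame using OpenCV template matching, and return the
# x-y coordinates of the best match (top-left corner) if it is good enough.
import cv2
import numpy as np

MATCH_THRESHOLD = 0.8  # arbitrary example threshold for a "good enough" match

def find_object(frame: np.ndarray, template: np.ndarray):
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= MATCH_THRESHOLD else None

# Example usage (frame and template would come from the video data being
# recorded and from the visual identification data, respectively):
# xy = find_object(frame_array, representative_image_array)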
The processor then identifies whether such detection is a new detection of the actual image (step 519 of FIG. 5), i.e. whether it is the first time that the actual image has been detected for the current recording session. If the detection is new, the processor creates a metadata block for the object and receives object data associated with the object.
The object data includes at least object identification which identifies the object. The object identification may identify the object by its name, with alphanumeric characters or text, for example. The object data may also include an address or link to a server, where the server is configured to communicate information regarding the object to the mobile device for display in response to a request from the mobile device (e.g. by “clicking” on the address or the link).
In step 534 of FIG. 5, the processor stores the identified x-y coordinates in the metadata block for the object, in association with a start timestamp which is set to the current time of recording. Processing then proceeds back to step 508.
If the detection of the actual image in step 516 is identified as not being new at step 519 (i.e. it is not the first time that the actual image has been detected for the current recording session), then the processor identifies whether the newly identified x-y coordinates of the detected image are within a predetermined distance or percentage from the previously stored x-y coordinates for the detected image (step 522 of FIG. 5). Here, a circle having a radius “r” around the previously stored x-y coordinates may be algorithmically determined, and it may be tested whether the Euclidean distance between the previously stored x-y coordinates and the newly identified x-y coordinates is less than or equal to the threshold r. If so, then the newly identified x-y coordinates are within the circle, the coordinates are deemed to be the same or substantially the same, and the processor adds or updates timestamp data in the metadata block for the object (step 524 of FIG. 5).
In one embodiment, step 524 may involve updating the end timestamp in the metadata block for the object to be the current time of recording. In another embodiment, no adding or updating is performed in step 524, as the end timestamp will be updated later in step 536 when the image is lost (as described later below). Processing then proceeds back to step 508.
On the other hand, the processor may identify in step 522 that the newly identified x-y coordinates of the detected image are outside of the predetermined distance or percentage from the previously stored x-y coordinates for the detected image. Here, using the circle having the radius “r” around the previously stored x-y coordinates which was algorithmically determined, it may be tested whether the Euclidean distance between the previously stored x-y coordinates and the newly identified x-y coordinates is greater than the threshold r. If so, then the newly identified x-y coordinates are outside of the circle, and the coordinates are deemed to be different from each other. Therefore, if the coordinates are identified as different in step 522, then the processor sets an end timestamp to be the current time of recording for association with the previously stored x-y coordinates (step 526 of FIG. 5). The newly identified x-y coordinates may then be stored in association with a new start timestamp set to the current time of recording (step 534 of FIG. 5), and processing proceeds back to step 508.
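A small sketch of this distance test follows, assuming the threshold is expressed as a radius r in pixels; the radius value and coordinate format are illustrative assumptions.

# Sketch of the step-522 style test: the newly identified x-y coordinates are
# deemed "different" from the previously stored coordinates when their
# Euclidean distance exceeds the radius r, and "the same" otherwise.
import math

def coordinates_moved(prev_xy: tuple, new_xy: tuple, r: float) -> bool:
    distance = math.dist(prev_xy, new_xy)  # Euclidean distance
    return distance > r

# Example: a 25-pixel move against a 20-pixel radius counts as a new position.
print(coordinates_moved((100, 100), (120, 115), r=20.0))  # True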
Alternatively, if there is no match in step 516, then the processor adds or updates an end timestamp to be the current time of recording in association with the previously-identified x-y coordinates (step 536 of FIG. 5). Processing then proceeds back to step 508.
From the technique of FIG. 5, one or more metadata blocks associated with the media file are produced. Each metadata block includes object data corresponding to an identification of an object which is present at one or more playback positions in the media file, timestamp data corresponding to the one or more playback positions during which the object is present, and x-y coordinate data corresponding to an x-y position of the object at the one or more playback positions.
In one embodiment, the object indicator is displayed over or adjacent to the object, at the x-y position associated with the x-y coordinate data read from the metadata block, along with the object identification.
Thus, as described herein, techniques for use in providing object indicators for video recording and/or playback have been devised. A mobile communication device includes a media recorder/player module and a sensor device. The media recorder module is used to record video data in a media file. The sensor device detects signals which indicate the presence of an object during a time period of recording. The mobile device receives object data comprising an identification of the object, and stores the object data in association with timestamp data which corresponds to the time period during which the object is present during the recording. In the technique, the mobile device may identify a detected position of the object in the video, and store position data corresponding to the detected position of the object in association with the timestamp data. In particular, for example, the mobile device may receive via the sensor device visual identification data for visually identifying the object. The mobile device produces, based on the visual identification data, an image array of a representative image of the object. The mobile device may then detect from the video data an actual image of the object that matches the representative image of the object, the actual image being positioned at x-y coordinates in the video. The mobile device may then store the x-y coordinates in association with the timestamp, and/or display an object indicator at the x-y coordinates.
A related technique for use in providing object indicators during playback of a media file involves one or more metadata blocks associated with the media file. Each metadata block includes object data corresponding to an identification of an object which is present at one or more playback positions in the media file, timestamp data corresponding to the one or more playback positions in the media file during which the object is present, and x-y coordinate data corresponding to an x-y position of the object provided at the one or more playback positions. When a current playback position of the media file matches one of the playback positions in the metadata block, the x-y coordinate data corresponding to the current playback position is read. An object indicator is then displayed, over or adjacent to the object, at the x-y position associated with the x-y coordinate data read from the identified metadata block.
The above-described embodiments of the present application are intended to be examples only. Those of skill in the art may effect alterations, modifications and variations to the particular embodiments without departing from the scope of the application. The invention described herein and in the recited claims is intended to cover and embrace all suitable changes in technology.