Mobile devices with integrated telephones (e.g., smart phones) typically execute a proprietary operating system (OS) on top of which different software applications run, such as electronic mail, short message service, instant messaging, contact management, and electronic games. However, writing applications for a proprietary OS can be difficult and time-consuming, especially for content developers who typically do not have extensive training in software development. Some mobile device OS's allow applications to receive events from the OS that indicate, for instance, when a user has pressed a key on the mobile device or when a new message has arrived. Creating content that is sensitive to such events can require intimate knowledge of a mobile device OS's proprietary event model. Moreover, porting event-driven applications created for one proprietary OS to another can require a significant amount of additional software development for each target OS.
In general, one aspect of the subject matter described in this specification can be embodied in a method that includes accepting content that defines an event handler. The content is provided to a media player configured to render the content. The event handler is configured to receive an event from an operating system on which the media player runs, the operating system executing on a mobile device having an integrated display and an integrated telephone. The content is presented by the media player on the display. The event is received from the operating system and provided to the event handler. The presentation of the content is then modified by the media player based on processing of the event by the event handler. Other implementations of this aspect include corresponding systems, apparatus, and computer program products.
These and other implementations can optionally include one or more of the following features. The media player is capable of presenting vector-based graphics, video, and audio. The media player is capable of presenting the content according to a timeline. The operating system can be Binary Runtime Environment for Wireless (BREW), Symbian OS, or Windows Mobile. The event can be a result of a change in state of the mobile device, user interaction with the mobile device, or a network communication to the mobile device. Configuring the event handler further comprises registering the event handler such that the event handler will receive the event. The event handler is invoked and performs one or more actions that alter the presentation of the content. Presenting the content includes assigning the content to one or more layers of an imaging model and rendering the content according to the layers. The event handler is associated with timeline-based content.
Particular implementations of the subject matter described in this specification can be implemented to realize one or more of the following advantages. Presentation of content can dynamically change based on OS-generated events. Content authors can create event-sensitive content by leveraging high-level content creation tools. Rich content can be presented on a mobile device's display without having to develop a rich content rendering framework. Event-sensitive content is easily portable to different mobile device OS's. New events can be added through a programming model.
The details of one or more implementations of the invention are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the invention will become apparent from the description, the drawings, and the claims.
Like reference numbers and designations in the various drawings indicate like elements.
For example, the display 102a in
Content includes the definition of at least one event handler which is particular to one or more OS-generated events and performs one or more actions based on the event. In various implementations, an event handler is defined in a programming language such as ActionScript or another ECMAScript-based language. However, the use of other imperative or declarative programming languages is possible. ActionScript is an object-oriented programming language for performing computations and manipulating movie-clips, text fields and sounds, for example. In other implementations, event handlers can be defined through descriptive data in the content rather than programming language statements.
An ExtendedEvents object in ActionScript is a global object whose members, methods, and properties can be used without using a constructor. The listener members of this object are used to register event handlers for capturing OS-generated events. In various implementations, Flash movie clips register event handlers with specific event listeners using the ExtendedEvents object. New event types can be added by extending the ExtendedEvents object. Event handlers are objects that define an event handler function in order to handle specific events. The signatures of exemplary event handling functions are provided in TABLE 2; however, other event handlers are possible.
Next, the content is provided to a media player configured to render the content (step 204). For example, a SWF file is provided to a Flash Lite player on a mobile device. The event handler defined in the content is then configured to receive an event from an operating system on which the media player runs; the operating system executes on a mobile device having an integrated display and an integrated telephone (step 206). For example, the code statements 306 configure the event handler function onEvent to receive Incoming Call events by adding the event handler object myCallEventHandler to the listener for the event: ExtendedEvents.IncomingCall (see TABLE 3 below).
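By way of illustration only, the following ActionScript statements sketch one possible form of such a configuration; the addListener method name and the handler function's parameter are assumptions made for illustration, since the exemplary signatures appear in TABLE 2 and TABLE 3 rather than here.

    // Illustrative sketch only. The addListener method name and the handler
    // parameter are assumptions; see TABLE 2 and TABLE 3 for the exemplary
    // signatures.
    var myCallEventHandler:Object = new Object();

    // Event handler function invoked when the OS reports an incoming call.
    myCallEventHandler.onEvent = function(callerId:String):Void {
        // Actions that alter the presentation are sketched after step 212.
        trace("Incoming call from " + callerId);
    };

    // Add the handler object to the listener for the Incoming Call event so
    // that it receives OS-generated Incoming Call events.
    ExtendedEvents.IncomingCall.addListener(myCallEventHandler);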
The content is then presented by the media player on the display (step 208). An event is then received from the operating system and provided to the event handler (step 210). This is illustrated with reference to
Finally, the presentation of the content is modified by the media player based on the processing of the event by the event handler (step 212). With reference to
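For example, the event handler might perform actions such as the following illustrative ones; the text field name and frame label are assumptions made only to show how processing of the event can alter what the media player renders.

    // Illustrative sketch only: the callerText text field and the
    // "incomingCall" frame label are assumed to be defined in the content.
    var myCallEventHandler:Object = new Object();
    myCallEventHandler.onEvent = function(callerId:String):Void {
        // Show the caller identifier in a text field authored in the content.
        _root.callerText.text = "Call from " + callerId;
        // Move the main timeline to a frame authored for incoming calls,
        // changing the presentation rendered on the display.
        _root.gotoAndStop("incomingCall");
    };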
The media player 400 includes a rendering engine 412, a virtual machine 414 for defining and configuring event handlers 414a, and a variety of sources of content 416. In some implementations, the rendering engine 412 includes the ability to render content using an imaging model 412a according to one or more timelines associated with the content and a minimum frames per second rendering speed. For example, a Flash movie-clip is rendered according to its timeline. In some implementations, the rendering engine 412 natively includes the ability to render Scalable Vector Graphics Tiny (SVG-T) for small displays 412b, gradients 412c, vectors 412d, text 412e and images 412f.
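For example, timeline-based content can be started, stopped, or repositioned with standard ActionScript timeline methods, and the rendering engine 412 then renders the affected frames; the instance name and frame label below are assumptions used only for illustration.

    // Illustrative sketch only: intro_mc and the "loop" frame label are
    // assumed to exist in the authored content.
    intro_mc.stop();                // hold the clip on its current frame
    intro_mc.gotoAndPlay("loop");   // resume playback from a labeled frame
    _root.play();                   // let the main timeline advance at the
                                    // player's frames-per-second rate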
The imaging model 412a includes a stack of logical rendering layers which, when composited together, create a presentation of the content on the display 404. Each piece of content 416 to be rendered is assigned to one of the layers. Content residing on lower layers is rendered beneath content on higher layers. Unless content on a higher layer is wholly or partially transparent, it occludes content on lower layers occupying the same display space. Otherwise, the higher-layer content is blended with content on the lower layers according to alpha channel blending rules, for instance. By way of illustration and with reference to
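By way of illustration only, a content author can approximate such layering in ActionScript by placing movie clips at distinct depths, with higher depths composited above lower ones; the instance names, depth values, and alpha value below are assumptions and do not represent the internal layer assignments of the imaging model 412a.

    // Illustrative sketch only: two movie clips at different depths.
    var background_mc:MovieClip = _root.createEmptyMovieClip("background_mc", 1); // lower layer
    var overlay_mc:MovieClip = _root.createEmptyMovieClip("overlay_mc", 2);       // higher layer

    // Where overlay_mc is opaque it occludes background_mc; making it
    // partially transparent causes the two to be blended when composited.
    overlay_mc._alpha = 50;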
The virtual machine 414 is capable of executing programming language statements that define, configure and perform actions for event handlers 414a. The programming language statements can also control which content is provided to the rendering engine 412 and how the content is presented. In various implementations, the virtual machine 414 manages content through a Document Object Model (DOM) which is movie centric and can represent animations, audio, text and event handling. Programming language statements manipulate content sources 416 through programmatic objects that correspond to these sources. Content sources can include, but are not limited to, the following: vector font data 416a, Joint Photographic Experts Group (JPEG) images 416b, audio data 416c, data obtained over one or more network connections 416d, user input text 416e, data accessed through a codec provided by the mobile device OS (e.g., 416f-j), persistent data 416k from the mobile device, and dynamic data 416l.
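By way of example only, the following ActionScript statements sketch how several such content sources might be manipulated through their corresponding programmatic objects; the file names, URL, and field names are assumptions used for illustration.

    // Illustrative sketch only: audio data, a JPEG image, and network data.
    var music:Sound = new Sound();
    music.loadSound("theme.mp3", true);     // stream audio data, if the
                                            // device codec supports it

    var photo_mc:MovieClip = _root.createEmptyMovieClip("photo_mc", 3);
    photo_mc.loadMovie("photo.jpg");        // load a JPEG image into a clip

    var feed:LoadVars = new LoadVars();     // data obtained over a network
    feed.onLoad = function(ok:Boolean):Void {
        if (ok) {
            _root.status.text = feed.headline;  // present dynamic data in a
                                                // text field named status
        }
    };
    feed.load("http://example.com/data.txt");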
The processor 502 is capable of processing instructions for execution within the system device 500. Such executed instructions can implement one or more steps of method 200 or one or more components of system 400, for example. The processor 502 is a single- or multi-threaded processor having one or more processor cores, for instance. The processor 502 is capable of processing instructions stored in the memory 504 or on the storage device 508 to display graphical information for a user interface on the display 404. The memory 504 is a computer-readable medium such as volatile or non-volatile random access memory that stores information within the system device 500. The memory 504 could store data structures representing content, for example. The storage device 508 is capable of providing persistent storage for the device 500. The storage device 508 may be a hard disk device, an optical disk device, a flash memory, or other suitable persistent storage means.
Implementations of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.
The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, implementations of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
Implementations of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular implementations of the invention. Certain features that are described in this specification in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular implementations of the invention have been described. Other implementations are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results.
This application claims priority to U.S. Provisional Application No. 60/893,865, filed on Mar. 8, 2007. The disclosure of the prior application is considered part of, and is incorporated by reference in, the disclosure of this application.
Number | Date | Country
---|---|---
20080222520 A1 | Sep 2008 | US

Number | Date | Country
---|---|---
60893865 | Mar 2007 | US