The present invention relates to a digital device, and more particularly, to a digital device and control method thereof, suitable for maximizing a user's convenience of use through more intuitive and faster content access by configuring a content-oriented menu.
The rapid transition from analog systems to digital systems is in progress. Particularly, as a digital system is more robust against external noise than an analog system, it suffers less data loss, has an advantage in error correction, and is capable of providing interactive services.
A related-art digital TV receives a control signal through a control means (e.g., a remote controller paired by a manufacturer) or key buttons provided on an outer frame/front panel of the digital TV, performs an operation corresponding to the received control signal, and outputs a corresponding result. For example, if a menu is requested through the control means or a key button, the related-art digital TV provides, on a screen, a menu screen containing formats or contents set as defaults by the manufacturer. So to speak, the related-art digital TV outputs default content as the requested menu screen. Hence, in order to search for and use a desired function, content or the like through such a digital TV, a user inconveniently has to go through several menu depths, separate operations, function buttons and the like. In such an environment, it is difficult for a user to access and use a desired function or content.
To solve the above problems, one technical task of the present invention is to provide a digital device by which a user can access a desired function or data (e.g., content, etc.) more easily and quickly than in the related art.
Another technical task of the present invention is to provide a digital device by which desired data can be accessed and used more easily and quickly through a minimum depth or screen change on a paged menu, while minimizing disturbance to a content currently outputted to a main screen, i.e., a currently watched content.
A further technical task of the present invention is to provide a digital device which configures and provides a more intuitive menu screen with greater use convenience than the related art, so as to enable everyone to use the digital device easily and conveniently.
Technical tasks obtainable from the present invention are not limited to the above-mentioned technical task(s). And, other unmentioned technical tasks can be clearly understood from the following description by those having ordinary skill in the technical field to which the present invention pertains.
In one technical aspect of the present invention, provided herein is a digital device, including a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call, a decoder decoding the content and the signaling data, a controller configured to control the decoded content to be outputted to a screen and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen, and wherein, if a pointer within the menu screen is located at or hovers over a prescribed region, the controller is configured to control a GUI to be outputted for switching a configuration of the outputted menu screen and to control the menu screen configuration to be switched in response to a user's selection from the outputted GUI.
In another technical aspect of the present invention, provided herein is a digital device, including a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call, a decoder decoding the content and the signaling data, a controller configured to control the decoded content to be outputted to a screen and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including a list of applications installed on the digital device, and wherein the controller collects history data for the application list configuring the menu screen and controls a preview image to be outputted for one or more applications included in the application list configuring the menu screen based on the collected history data.
In another technical aspect of the present invention, provided herein is a digital device, including a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call and a second user input for a menu selection, a decoder decoding the content and the signaling data, a controller configured to collect history data for one or more contents used on the device, control the decoded content to be outputted to a screen, and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen in response to the first user input, and wherein the controller controls a timeline-based content list to be outputted by referring to the collected history data for the one or more contents in response to the second user input.
In another technical aspect of the present invention, provided herein is a digital device, including a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call and a second user input for a menu selection, a decoder decoding the content and the signaling data, a controller configured to collect history data for one or more contents used on the device, control the decoded content to be outputted to a screen, and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen in response to the first user input, wherein the controller controls a timeline-based content list to be outputted by referring to the collected history data for the one or more contents in response to the second user input, and wherein the controller excludes, from the outputted timeline-based content list, a content whose broadcast hour has expired or a content whose playback has fully ended.
In another technical aspect of the present invention, provided herein is a digital device, including a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input and a second user input related to a menu, a decoder decoding the content and the signaling data, a controller configured to control the decoded content to be outputted to a screen and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen and an icon for entering a submenu screen, wherein, if a pointer within the menu screen is located at or hovers over the icon, the controller is configured to control the submenu screen to be outputted, wherein the submenu screen includes one or more categories, wherein each of the categories includes a content list including one or more contents, and wherein the contents belonging to the content list are controlled to be arranged based on at least one of time data including a season, weather data, emotional data associated with at least one of the time data and the weather data, a retrieval ranking, and a user's content use pattern data.
In another technical aspect of the present invention, provided herein is a method of providing a menu screen in a digital device, including receiving a content and signaling data for the content, decoding the content and the signaling data, outputting the decoded content to a screen, receiving a first user input for a menu call, outputting a menu screen to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, the menu screen including an application list or a content list including at least one content related to the content currently outputted to the screen, detecting whether a pointer within the menu screen is located at or hovers over a prescribed region, outputting a GUI for switching a configuration of the outputted menu screen, and switching the menu screen configuration in response to a user's selection from the outputted GUI.
In another technical aspect of the present invention, provided herein is a method of providing a menu screen in a digital device, including receiving a content and signaling data for the content, decoding the content and the signaling data, outputting the decoded content to a screen, collecting history data for one or more contents used on the device, receiving a first user input for a menu call, outputting a menu screen to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, the menu screen including an application list or a content list including at least one content related to the content currently outputted to the screen, receiving a second user input for a menu selection, and outputting a timeline-based content list by referring to the collected history data for the one or more contents in response to the second user input.
In a further technical aspect of the present invention, provided herein is a method of providing a menu screen in a digital device, including receiving a content and signaling data for the content, decoding the content and the signaling data, outputting the decoded content to a screen, receiving a first user input for a menu call, outputting a menu screen to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, the menu screen including an application list or a content list including at least one content related to the content currently outputted to the screen and an icon for entering a submenu screen, detecting whether a pointer within the menu screen is located at or hovers over the icon, and outputting the submenu screen, wherein the submenu screen includes one or more categories, wherein each of the categories includes a content list including one or more contents, and wherein the contents belonging to the content list are arranged based on at least one of time data including a season, weather data, emotional data associated with at least one of the time data and the weather data, a retrieval ranking, and a user's content use pattern data.
Technical tasks obtainable from the present invention are not limited to the above-mentioned technical task(s). And, other unmentioned technical tasks can be clearly understood from the following description by those having ordinary skill in the technical field to which the present invention pertains.
The present invention provides the following features or effects.
According to one of various embodiments of the present invention, a user can access a desired function or data (e.g., content, etc.) more easily and quickly than in the related art.
According to another one of various embodiments of the present invention, desired data can be accessed and used more easily and quickly through a minimum depth or screen change on a paged menu, while minimizing disturbance to a content currently outputted to a main screen, i.e., a currently watched content.
According to a further one of various embodiments of the present invention, a digital device configures and provides a more intuitive menu screen with greater use convenience than the related art, so as to enable everyone to use the digital device easily and conveniently.
Effects obtainable from the present invention are not limited to the above-mentioned effects. And, other unmentioned effects can be clearly understood from the following description by those having ordinary skill in the technical field to which the present invention pertains.
Description will now be given in detail, according to various embodiments of a digital device and a method of controlling the same disclosed herein, with reference to the accompanying drawings.
Suffixes such as “module”, “unit” and the like in this disclosure may be used to refer to elements or components. Use of such a suffix herein is merely intended to facilitate description of the specification, and both suffixes may be used interchangeably. Descriptions with ordinal numbers such as ‘first˜’, ‘second˜’ and the like are provided to facilitate the description of the corresponding terminologies only, and are not limited by such terminologies or ordinal numbers.
Although terminologies used in the present specification are selected from general terminologies used currently and widely in consideration of functions in the present invention, they may be changed in accordance with the intentions of technicians engaged in the corresponding fields, customs, the advent of new technologies and the like. Occasionally, some terminologies may be arbitrarily selected by the applicant(s). In this case, the meanings of the arbitrarily selected terminologies shall be described in the corresponding part of the detailed description of the invention. Therefore, terminologies used in the present specification need to be construed based on the substantial meanings of the corresponding terminologies and the overall matters disclosed in the present specification, rather than construed as simple names of the terminologies.
Meanwhile, the descriptions disclosed in the present specification and/or drawings correspond to one preferred embodiment of the present invention and are not limited to the preferred embodiment. And, the scope/extent of the rights should be determined by the appended claims.
‘Digital device’ described in the present specification includes any device capable of performing at least one of transmission, reception, processing and output of data, content, service, application and the like for example. The digital device can be paired or connected (hereinafter ‘paired’) with another digital device, an external server and the like through wire/wireless network and transmit/receive prescribed data through the pairing. In doing so, if necessary, the data may be appropriately converted before the transmission/reception. The digital devices may include standing devices (e.g., Network TV, HBBTV (Hybrid Broadcast Broadband TV), Smart TV, IPTV (Internet Protocol TV), PC (Personal Computer), etc.) and mobile devices (e.g., PDA (Personal Digital Assistant), Smart Phone, Tablet PC, Notebook, etc.). In the present specification, to help the understanding of the present invention and the clarity of the applicant's description, a digital TV is shown as an embodiment of a digital device in
Meanwhile, ‘wire/wireless network’ described in the present specification is a common name for a communication network supportive of various communication specifications and/or protocols for the pairing or/and data transceiving between digital devices or between a digital device and an external server. Such wire/wireless networks include all communication networks currently supported, or to be supported in the future, by the specifications, and are capable of supporting one or more communication protocols for the same. Such wire/wireless networks can be established by a network for a wired connection and a communication specification or protocol for the same (e.g., USB (Universal Serial Bus), CVBS (Composite Video Banking Sync), Component, S-video (analog), DVI (Digital Visual Interface), HDMI (High Definition Multimedia Interface), RGB, D-SUB, etc.) and a network for a wireless connection and a communication specification or protocol for the same (e.g., Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, DLNA (Digital Living Network Alliance), WLAN (Wireless LAN)(Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE/LTE-A (Long Term Evolution/LTE-Advanced), Wi-Fi Direct).
If a device is named a digital device in this disclosure, the meaning may indicate a standing device or a mobile device according to a context, or can be used to indicate both unless specifically mentioned otherwise.
Meanwhile, a digital device is an intelligent device supporting a broadcast receiving function, a computer function, at least one external input and the like, and is able to support e-mail, web browsing, banking, games, applications and the like through the aforementioned wire/wireless network. Moreover, the digital device may include an interface (e.g., a manual input device, a touchscreen, a space remote controller, etc.) to support at least one input or control means.
Besides, a digital device may use a standardized OS (operating system). Particularly, a digital device described in the present specification uses webOS in one embodiment. Hence, a digital device can process the addition, deletion, amendment, updating and the like of various services or applications on a universal OS kernel or a Linux kernel, through which a further user-friendly environment can be configured and provided.
Meanwhile, the aforementioned digital device can receive and process an external input. Herein, the external input includes an external input device, i.e., any input means or digital device capable of transmitting/receiving and processing data by being connected to the aforementioned digital device through a wire/wireless network. For instance, the external inputs include a game device (e.g., a PlayStation, an Xbox, etc., connected through HDMI (High-Definition Multimedia Interface)), a printing device (e.g., a smart phone, a tablet PC, a pocket photo printer, etc.), and a digital device (e.g., a smart TV, a Blu-ray device, etc.).
Besides, ‘server’ described in the present specification means a digital device or system that supplies data to the aforementioned digital device (i.e., a client) or receives data from it, and may also be called a processor. For example, the server may include a portal server providing a web page, web content or web service, an advertising server providing advertising data, a content server providing contents, an SNS server providing SNS (Social Network Service), a service server provided by a manufacturer, an MVPD (Multichannel Video Programming Distributor) providing VoD (Video on Demand) or a streaming service, a service server providing a pay service, and the like.
Moreover, where the following description refers to an application only, for clarity, it may mean a service as well as an application on the basis of a corresponding content and the like, and may also include a web application on a webOS platform according to the present invention.
A digital device according to one embodiment of the present invention may include a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call, a decoder decoding the content and the signaling data, a controller configured to control the decoded content to be outputted to a screen and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen, and wherein, if a pointer within the menu screen is located at or hovers over a prescribed region, the controller is configured to control a GUI to be outputted for switching a configuration of the outputted menu screen and to control the menu screen configuration to be switched in response to a user's selection from the outputted GUI.
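The hover-triggered configuration switching described above can be sketched as follows. This is a minimal illustrative sketch only: the class name, the trigger-region coordinates, and the configuration names are assumptions introduced for explanation and are not elements of the disclosure.

```python
class MenuScreen:
    """Sketch of an overlay menu that reveals a switching GUI when the
    pointer hovers over a prescribed region, and switches its
    configuration when a selection is made from that GUI."""

    SWITCH_REGION = (0, 0, 100, 40)  # hypothetical x, y, width, height
    CONFIGURATIONS = ("app_list", "related_content_list")

    def __init__(self):
        self.configuration = "app_list"
        self.gui_visible = False

    def _in_switch_region(self, x, y):
        rx, ry, rw, rh = self.SWITCH_REGION
        return rx <= x < rx + rw and ry <= y < ry + rh

    def on_pointer_move(self, x, y):
        # Hovering over the prescribed region reveals the switching GUI.
        self.gui_visible = self._in_switch_region(x, y)
        return self.gui_visible

    def on_gui_selection(self, configuration):
        # Switch only while the GUI is visible and the choice is valid.
        if self.gui_visible and configuration in self.CONFIGURATIONS:
            self.configuration = configuration
        return self.configuration
```

In this sketch, moving the pointer into the prescribed region shows the GUI, and a selection made while the GUI is visible switches the menu between the application-list and related-content-list configurations.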
The controller may collect history data for the application list configuring the menu screen, control a preview image to be outputted for one or more applications included in the application list configuring the menu screen based on the collected history data, identify one or more contents included in the content list of the menu screen, and control data for the identified one or more contents to be read from a memory or/and collected from an external server.
The data for the identified one or more contents may include history data for the corresponding content. And, in configuring the content list of the menu screen, the controller may control a preview image for the corresponding content to be outputted based on the history data.
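The history-based preview behavior above can be sketched as follows. This is a minimal sketch assuming a simple "most frequently used" policy; the function name, the top-N cutoff, and the count-based history format are all illustrative assumptions.

```python
def attach_previews(entries, history, max_previews=3):
    """Given menu entries and per-entry usage counts (the collected
    history data), decide which entries receive a preview image.
    Returns a dict mapping each entry to True/False."""
    # Rank entries by how often they were used, most-used first.
    ranked = sorted(entries, key=lambda e: history.get(e, 0), reverse=True)
    with_preview = set(ranked[:max_previews])
    # Only entries that were actually used get a preview image.
    return {e: (e in with_preview and history.get(e, 0) > 0)
            for e in entries}
```

Under this assumed policy, an application or content that the user has never opened is listed without a preview, while frequently used entries are shown with one.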
A digital device according to another embodiment of the present invention may include a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call, a decoder decoding the content and the signaling data, a controller configured to control the decoded content to be outputted to a screen and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including a list of applications installed on the digital device, and wherein the controller collects history data for the application list configuring the menu screen and controls a preview image to be outputted for one or more applications included in the application list configuring the menu screen based on the collected history data.
A digital device according to another embodiment of the present invention may include a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call and a second user input for a menu selection, a decoder decoding the content and the signaling data, a controller configured to collect history data for one or more contents used on the device, control the decoded content to be outputted to a screen, and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen in response to the first user input, and wherein the controller controls a timeline-based content list to be outputted by referring to the collected history data for the one or more contents in response to the second user input.
When a corresponding content is a live broadcast program, if the broadcast of the live program ends, the controller may control the live broadcast program to be excluded from the outputted timeline-based content list. When a corresponding content is a streamed or downloaded content, if the corresponding content is completely played, the controller may control the corresponding content to be excluded from the outputted timeline-based content list. When a corresponding content is restricted by a viewable time or rating, if the viewable time for the content expires or the viewable rating is changed, the controller may control the corresponding content to be excluded from the outputted timeline-based content list.
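The three exclusion rules above can be sketched as one filtering pass over the timeline-based content list. The content-record fields used here are hypothetical, since the disclosure does not fix a schema.

```python
def filter_timeline(contents, now):
    """Drop entries that should leave the timeline-based list:
    - live programs whose broadcast has ended,
    - streamed/downloaded contents that were played to completion,
    - contents whose viewable time expired or whose rating changed."""
    kept = []
    for c in contents:
        # Rule 1: live program whose broadcast hour has expired.
        if c.get("type") == "live" and c.get("broadcast_end", now + 1) <= now:
            continue
        # Rule 2: streamed or downloaded content fully played back.
        if c.get("type") in ("stream", "download") and c.get("fully_played"):
            continue
        # Rule 3: viewable time expired, or viewable rating changed.
        if c.get("viewable_until", now + 1) <= now or c.get("rating_changed"):
            continue
        kept.append(c)
    return kept
```

Entries that match none of the rules remain in the list in their original order.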
If a prescribed content belonging to the timeline-based content list is selected, the controller may control one or more contents related to the selected content, among the contents belonging to the content list, to be outputted arranged adjacent to the selected content. If the content selection is released, the controller may control the content list to be outputted arranged as the timeline-based content list as before the selection.
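The selection-driven rearrangement can be sketched as follows; the function and parameter names are illustrative assumptions. Related items are pulled adjacent to the selected content, and calling the function without a selection yields the original timeline order, matching the release behavior described above.

```python
def arrange(timeline, selected=None, related=()):
    """Return the display order for the timeline-based content list.
    With a selection, its related items are placed immediately after
    it; without one, the original timeline order is returned."""
    if selected is None or selected not in timeline:
        return list(timeline)
    # Related items (excluding the selected one), in timeline order.
    rel = [c for c in timeline if c in related and c != selected]
    # Everything else, in timeline order.
    rest = [c for c in timeline if c != selected and c not in rel]
    # Keep the selected item near its original position.
    pos = min(timeline.index(selected), len(rest))
    return rest[:pos] + [selected] + rel + rest[pos:]
```

Releasing the selection simply re-renders `arrange(timeline)`, restoring the pre-selection arrangement.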
A digital device according to another embodiment of the present invention may include a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input for a menu call and a second user input for a menu selection, a decoder decoding the content and the signaling data, a controller configured to collect history data for one or more contents used on the device, control the decoded content to be outputted to a screen, and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen in response to the first user input, wherein the controller controls a timeline-based content list to be outputted by referring to the collected history data for the one or more contents in response to the second user input, and wherein the controller excludes, from the outputted timeline-based content list, a content whose broadcast hour has expired or a content whose playback has fully ended.
A digital device according to a further embodiment of the present invention may include a receiving unit receiving a content and signaling data for the content, a user input receiving unit receiving a first user input and a second user input related to a menu, a decoder decoding the content and the signaling data, a controller configured to control the decoded content to be outputted to a screen and control a menu screen to be outputted to overlay a prescribed region of the screen on which the content is outputted in response to the first user input, and an output unit outputting the content and the menu screen, wherein the controller controls the menu screen to be outputted including an application list or a content list including at least one content related to the content currently outputted to the screen and an icon for entering a submenu screen, wherein, if a pointer within the menu screen is located at or hovers over the icon, the controller is configured to control the submenu screen to be outputted, wherein the submenu screen includes one or more categories, wherein each of the categories includes a content list including one or more contents, and wherein the contents belonging to the content list are controlled to be arranged based on at least one of time data including a season, weather data, emotional data associated with at least one of the time data and the weather data, a retrieval ranking, and a user's content use pattern data.
The controller may control the icon for entering the submenu screen to be changed based on the time data including the season. The controller may control the arranged contents to be outputted to differ from each other in size based on at least one of the time data including the season, the weather data, the emotional data associated with the at least one of the time data and the weather data, the retrieval ranking, the user's content use pattern data, and an attribute of a corresponding content. If a prescribed content is selected within a corresponding category, the controller may control one or more contents associated with the selected content to be provided, accessibly rearranged adjacent to the selected content. If the content selection is released, the controller may control the contents to be rearranged in the arrangement order used before the selection. The controller may control a prescribed category among the categories to include a list including at least one of an application and a content based on collected history data, and may control the application included in the list to include a preview image based on the history data.
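The multi-signal arrangement described above can be sketched as a weighted scoring pass over a category's contents. The weights, field names, and the two size classes below are illustrative assumptions standing in for the differing output sizes; the disclosure does not specify a scoring formula.

```python
# Hypothetical relative weights for the arrangement signals.
WEIGHTS = {"season": 0.3, "weather": 0.2, "emotion": 0.1,
           "ranking": 0.2, "use_pattern": 0.2}

def arrange_category(contents, signals):
    """Score each content against the current signals (season/time,
    weather, emotion, retrieval ranking, use pattern) and return the
    contents best-first, with a size class for the top entry."""
    def score(content):
        total = 0.0
        for key, weight in WEIGHTS.items():
            if key in ("season", "weather", "emotion"):
                # Categorical signals: score only an explicit match.
                if signals.get(key) is not None and \
                        content.get(key) == signals.get(key):
                    total += weight
            else:
                # Numeric signals assumed normalized to [0, 1].
                total += weight * content.get(key, 0.0)
        return total

    ordered = sorted(contents, key=score, reverse=True)
    return [(c["title"], "large" if i == 0 else "small")
            for i, c in enumerate(ordered)]
```

With a "winter" season signal, for example, winter-tagged contents rise to the front and are rendered larger than the rest.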
In the following description, the present invention is explained in detail with reference to attached drawings.
Referring to
The CP 10 produces and provides various contents. Referring to
The SP 20 service-packetizes a content produced by the CP 10 and then provides it to the HNED 40. For instance, the SP 20 packetizes at least one of contents, which are produced by a first terrestrial broadcaster, a second terrestrial broadcaster, a cable MSO, a satellite broadcaster, various internet broadcasters, applications and the like, for a service and then provides it to the HNED 40.
The SP 20 can provide services to the client 100 in a uni-cast or multi-cast manner. Meanwhile, the SP 20 can collectively send data to a multitude of pre-registered clients 100. To this end, the SP 20 is able to use IGMP (Internet Group Management Protocol) and the like.
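The pre-registered multicast delivery mentioned above can be sketched conceptually as a membership registry. In a real system this bookkeeping is performed by IGMP in the network stack rather than by application code; the class and method names here are illustrative only.

```python
class MulticastGroup:
    """Conceptual sketch of pre-registered group delivery: clients
    join a group, and one send reaches every current member."""

    def __init__(self):
        self.members = set()

    def join(self, client_id):
        # Conceptually, an IGMP membership report from the client.
        self.members.add(client_id)

    def leave(self, client_id):
        # Conceptually, an IGMP leave-group message.
        self.members.discard(client_id)

    def send(self, data):
        """Deliver one payload to every registered client at once."""
        return {client: data for client in self.members}
```

A single `send` thus models the SP 20 collectively sending data to all pre-registered clients.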
The CP 10 and the SP 20 can be configured in the form of one entity. For example, the CP 10 can function as the SP 20 by producing a content, service-packetizing the produced content, and then providing it to the HNED 40, and vice versa.
The NP 30 provides a network environment for data exchange between the CP 10 and/or the SP 20 and the client 100.
The client 100 is a consumer belonging to the HNED 40. The client 100 may receive data by establishing a home network through the NP 30 for example and transmit/receive data for various services (e.g., VoD, streaming, etc.), applications and the like.
The CP 10 or/and the SP 20 in the service system may use a conditional access or content protection means for the protection of a transmitted content. Hence, the client 100 can use a processing means such as a cable card (CableCARD) (or POD (point of deployment)) or a downloadable CAS (DCAS), which corresponds to the conditional access or the content protection.
In addition, the client 100 may use an interactive service through a network as well. In this case, the client 100 can directly serve as a content provider. And, the SP 20 may receive the provided content and transmit it to another client or the like.
In
In the following, a digital device mentioned in the present specification may correspond to the client 100 shown in
The digital device 200 may include a network interface 201, a TCP/IP manager 202, a service delivery manager 203, an SI decoder 204, a demux or demultiplexer 205, an audio decoder 206, a video decoder 207, a display A/V and OSD (On Screen Display) module 208, a service control manager 209, a service discovery manager 210, an SI & metadata database (DB) 211, a metadata manager 212, a service manager 213, a UI manager 214, etc.
The network interface 201 may transmit/receive IP (Internet Protocol) packet(s) or IP datagram(s) (hereinafter named IP packet(s)) through an accessed network. For instance, the network interface 201 may receive services, applications, contents, side information and the like from the service provider 20 shown in
The TCP/IP manager 202 may be involved in the delivery of IP packets transmitted to the digital device 200 and IP packets transmitted from the digital device 200, that is, packet delivery between a source and a destination. The TCP/IP manager 202 may classify received packet(s) according to an appropriate protocol and output the classified packet(s) to at least one of the service delivery manager 203, the service discovery manager 210, the service control manager 209, the metadata manager 212, and the like.
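The classify-and-dispatch step of the TCP/IP manager 202 can be sketched as a protocol-to-manager routing table. The mapping below reflects the components named in this description, but the specific protocol keys are an assumption for illustration, not a disclosed data structure.

```python
# Hypothetical routing of classified packets to the managers named in
# the architecture description.
ROUTES = {
    "rtp": "service_delivery_manager",    # real-time streaming data
    "ssdp": "service_discovery_manager",  # service discovery
    "rtsp": "service_control_manager",    # service/session control
    "metadata": "metadata_manager",
}

def classify_packet(packet):
    """Return the destination manager for a received IP packet,
    keyed on its (already parsed) protocol field; unknown protocols
    are dropped in this sketch."""
    return ROUTES.get(packet.get("protocol"), "drop")
```

Each received packet is thus handed to exactly one downstream manager according to its protocol.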
The service delivery manager 203 may be in charge of controlling the received service data. The service delivery manager 203 may control real-time streaming data, for example, using RTP/RTCP. In case of transmitting the real-time streaming data using RTP, the service delivery manager 203 may parse the received data packet according to the RTP and then transmit the parsed data packet to the demultiplexer 205 or save the parsed data packet to the SI & metadata DB 211 under the control of the service manager 213. The service delivery manager 203 may feed back the network reception information to the service providing server side using RTCP.
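The RTP parsing step described above can be sketched as follows. The snippet is not part of the specification; the field layout follows the standard 12-byte fixed RTP header, and the sample packet values are illustrative assumptions:

```python
import struct

def parse_rtp_packet(data: bytes) -> dict:
    """Parse the 12-byte fixed RTP header and return its fields plus
    the payload that would be handed on to the demultiplexer."""
    if len(data) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1 = data[0], data[1]
    seq = struct.unpack(">H", data[2:4])[0]
    timestamp, ssrc = struct.unpack(">II", data[4:12])
    csrc_count = b0 & 0x0F                  # a CSRC list may follow the fixed header
    header_len = 12 + 4 * csrc_count
    return {
        "version": b0 >> 6,
        "payload_type": b1 & 0x7F,
        "sequence": seq,
        "timestamp": timestamp,
        "ssrc": ssrc,
        "payload": data[header_len:],
    }

# Build a minimal sample packet: version 2, payload type 33 (MPEG-2 TS over RTP)
pkt = bytes([0x80, 33]) + struct.pack(">HII", 7, 90000, 0xDEADBEEF) + b"TSDATA"
info = parse_rtp_packet(pkt)
```

A real receiver would also handle the extension bit and RTCP reception reports; the sketch only covers the fixed header and CSRC list.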
The demultiplexer 205 may demultiplex a received packet into audio data, video data, SI (system information) data and the like and then transmit the demultiplexed data to the audio/video decoder 206/207 and the SI decoder 204, respectively.
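The PID-based demultiplexing described above can be sketched as follows. The packet layout follows the standard 188-byte MPEG-2 transport stream format; the PID values and stream names in `pid_map` are illustrative assumptions:

```python
def extract_pid(ts_packet: bytes) -> int:
    """Return the 13-bit PID of an MPEG-2 transport stream packet."""
    if len(ts_packet) != 188 or ts_packet[0] != 0x47:
        raise ValueError("not a valid 188-byte TS packet")
    return ((ts_packet[1] & 0x1F) << 8) | ts_packet[2]

def demultiplex(packets, pid_map):
    """Route each TS packet to the elementary stream named in pid_map,
    e.g. {0x100: 'video', 0x101: 'audio'}; unmapped PIDs are dropped."""
    streams = {name: [] for name in pid_map.values()}
    for pkt in packets:
        pid = extract_pid(pkt)
        if pid in pid_map:
            streams[pid_map[pid]].append(pkt)
    return streams

# Two minimal packets carrying PIDs 0x100 (video) and 0x101 (audio)
video = bytes([0x47, 0x01, 0x00, 0x10]) + b"\x00" * 184
audio = bytes([0x47, 0x01, 0x01, 0x10]) + b"\x00" * 184
streams = demultiplex([video, audio, video], {0x100: "video", 0x101: "audio"})
```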
The SI decoder 204 may decode the demultiplexed SI data, i.e., service information of PSI (Program Specific Information), PSIP (Program and System Information Protocol), DVB-SI (Digital Video Broadcasting-Service Information), DTMB/CMMB (Digital Television Terrestrial Multimedia Broadcasting/Coding Mobile Multimedia Broadcasting), etc. And, the SI decoder 204 may save the decoded service information to the SI & metadata DB 211. The saved service information can be used by being read by a corresponding component in response to a user's request for example.
The audio decoder 206 and the video decoder 207 may decode the demultiplexed audio data and the demultiplexed video data, respectively. The decoded audio and video data may be provided to the user through the display unit 208.
The application manager includes a service manager 213 and a user interface (UI) manager 214 and is able to perform a function of a controller of the digital device 200. So to speak, the application manager can administrate the overall states of the digital device 200, provide a user interface (UI), and manage other managers.
The UI manager 214 provides a graphical user interface/user interface (GUI/UI) using OSD (on screen display) and the like. The UI manager 214 receives a key input from a user and then performs a device operation according to the input. For instance, if receiving a key input about a channel selection from a user, the UI manager 214 transmits the key input signal to the service manager 213.
The service manager 213 may control and manage service-related managers such as the service delivery manager 203, the service discovery manager 210, the service control manager 209, and the metadata manager 212.
The service manager 213 creates a channel map and controls a selection of a channel and the like using the created channel map in response to a key input received from the UI manager 214. The service manager 213 may receive service information from the SI decoder 204 and then set an audio/video PID of a selected channel for the demultiplexer 205. Such a PID can be used for the demultiplexing procedure. Therefore, the demultiplexer 205 performs filtering (PID or section filtering) on audio data, video data and SI data using the PID.
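A minimal sketch of this channel-map flow follows; the class, field names and channel data are invented for illustration, since the specification does not define a concrete data structure:

```python
class ServiceManagerSketch:
    """Illustrative channel map: channel number -> service entry.
    On a channel-selection key input from the UI manager, the selected
    channel's audio/video PIDs are handed to the demultiplexer."""

    def __init__(self, service_info):
        # service_info as it might be decoded from PSI/PSIP/DVB-SI, e.g.
        # {7: {"name": "News", "video_pid": 0x100, "audio_pid": 0x101}}
        self.channel_map = service_info
        self.demux_pids = None              # PIDs currently set for filtering

    def select_channel(self, number):
        ch = self.channel_map[number]
        self.demux_pids = (ch["video_pid"], ch["audio_pid"])
        return ch["name"]

sm = ServiceManagerSketch({7: {"name": "News", "video_pid": 0x100, "audio_pid": 0x101}})
name = sm.select_channel(7)
```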
The service discovery manager 210 may provide information required to select a service provider that provides a service. Upon receipt of a signal for selecting a channel from the service manager 213, the service discovery manager 210 searches for a service using the information.
The service control manager 209 may select and control a service. For example, the service control manager 209 may perform service selection and control using IGMP (Internet Group Management Protocol) when the user selects a live broadcast service and using real time streaming protocol (RTSP) when the user selects a video on demand (VOD) service. The RTSP protocol can provide a trick mode for real-time streaming. And, the service control manager 209 may initialize and manage a session through the IMS gateway 250 using IMS (IP multimedia subsystem) and SIP (session initiation protocol). The protocols are exemplary, and other protocols are usable according to implementations.
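The protocol choice described above reduces to a simple mapping; the function name and the service-type strings below are illustrative assumptions, not identifiers from the specification:

```python
def control_protocol(service_type: str) -> str:
    """Pick the service-control protocol per the description above:
    IGMP join/leave for a live (multicast) broadcast service,
    RTSP for a video on demand (VOD) service."""
    mapping = {"live": "IGMP", "vod": "RTSP"}
    if service_type not in mapping:
        raise ValueError(f"unknown service type: {service_type!r}")
    return mapping[service_type]
```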
The metadata manager 212 may manage metadata associated with services and save the metadata to the SI & metadata DB 211.
The SI & metadata DB 211 may store service information decoded by the SI decoder 204, metadata managed by the metadata manager 212, and information required to select a service provider, which is provided by the service discovery manager 210. In addition, the SI & metadata DB 211 can store system set-up data and the like for the system.
The SI & metadata database 211 may be implemented with non-volatile RAM (NVRAM), flash memory and the like.
Meanwhile, an IMS gateway 250 is a gateway in which functions required for an access to an IMS based IPTV service are collected.
A storage unit (not shown) may store programs for various signal processing and controls, and may also store a processed video, audio or data signal. In addition, the storage unit may execute a function of temporarily storing a video, audio or data signal inputted from an external device interface or the network interface 201. The storage unit may store information on a prescribed broadcast channel through a channel memory function. The storage unit may store an application or an application list inputted from the external device interface or the network interface 201. And, the storage unit may store various platforms which will be described later. For example, the storage unit may include storage media of one or more types, such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g. SD or XD memory), RAM, EEPROM, etc. The digital device 200 may play content files (a video file, a still image file, a music file, a text file, an application file, etc.) stored in the storage unit and provide them to a user.
The above-described digital device 200 may include a digital broadcast receiver capable of processing digital broadcast signals of ATSC or DVB of a stationary or mobile type. Regarding the digital device according to the present invention, some of the illustrated components may be omitted or new components (not shown) may be further added as required. On the other hand, the digital device may not include the tuner and the demodulator, differently from the aforementioned digital device, and may play a content by receiving the content through the network interface or the external device interface.
The former description with reference to
Referring to
The respective components are described in detail as follows.
The wireless communication unit 310 typically includes one or more modules which permit wireless communication between the mobile device 300 and a wireless communication system or network within which the mobile device 300 is located. For instance, the wireless communication unit 310 can include a broadcast receiving module 311, a mobile communication module 312, a wireless Internet module 313, a short-range communication module 314, a location information module 315, etc.
The broadcast receiving module 311 receives a broadcast signal and/or broadcast associated information from an external broadcast managing server via a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast managing server may mean a server that generates and sends a broadcast signal and/or broadcast associated information, or a server that receives a pre-generated broadcast signal and/or broadcast associated information and sends it to a terminal. The broadcast signal may be implemented as a TV broadcast signal, a radio broadcast signal, and/or a data broadcast signal, among other signals. If desired, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information includes information associated with a broadcast channel, a broadcast program, or a broadcast service provider. Furthermore, the broadcast associated information can be provided via a mobile communication network. In this case, the broadcast associated information can be received by the mobile communication module 312. The broadcast associated information can be implemented in various forms, e.g., an electronic program guide (EPG), an electronic service guide (ESG), and the like. The broadcast receiving module 311 may be configured to receive digital broadcast signals using broadcasting systems such as ATSC, DVB-T (Digital Video Broadcasting-Terrestrial), DVB-S (Satellite), MediaFLO (Media Forward Link Only), DVB-H (Handheld), ISDB-T (Integrated Services Digital Broadcast-Terrestrial), and the like. Optionally, the broadcast receiving module 311 can be configured to be suitable for other broadcasting systems as well as the above-noted digital broadcasting systems. The broadcast signal and/or broadcast associated information received by the broadcast receiving module 311 may be saved to the memory 360.
The mobile communication module 312 transmits/receives wireless signals to/from at least one of a base station, an external terminal, and a server via a mobile network. Such wireless signals may carry audio signals, video signals, and data of various types according to transceived text/multimedia messages.
The wireless Internet module 313 includes a module for wireless Internet access and may be internally or externally coupled to the mobile device 300. The wireless Internet technology can include WLAN (Wireless LAN) (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
The short-range communication module 314 is a module for short-range communications. Suitable technologies for implementing this module include Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra-wideband (UWB), ZigBee, RS-232, RS-485 and the like.
The location information module 315 is a module for obtaining location information of the mobile device 300. And, this module may be implemented with a global positioning system (GPS) module for example.
The audio/video (A/V) input unit 320 is configured to provide audio or video signal input. The A/V input unit 320 may include a camera 321, a microphone 322 and the like. The camera 321 receives and processes image frames of still pictures or video, which are obtained by an image sensor in a video call mode or a photographing mode. Furthermore, the processed image frames can be displayed on the display 351.
The image frames processed by the camera 321 can be stored in the memory 360 or transmitted externally via the wireless communication unit 310. Optionally, at least two cameras 321 can be provided according to the environment of usage.
The microphone 322 receives an external audio signal in call mode, recording mode, voice recognition mode, or the like. This audio signal is processed and converted into electrical audio data. The processed audio data is transformed into a format transmittable to a mobile communication base station via the mobile communication module 312 in call mode. The microphone 322 typically includes assorted noise cancelling algorithms to cancel noise generated in the course of receiving the external audio signal.
The user input unit 330 generates input data for a user to control an operation of the terminal. The user input unit 330 may include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel, a jog switch, and/or the like.
The sensing unit 340 generates sensing signals for controlling operations of the mobile device 300 using status measurements of various aspects of the mobile terminal. For instance, the sensing unit 340 may detect an open/closed status of the mobile device 300, a location of the mobile device 300, an orientation of the mobile device 300, a presence or absence of user contact with the mobile device 300, an acceleration/deceleration of the mobile device 300, and the like. For example, if the mobile device 300 is moved or inclined, it is able to sense a location or inclination of the mobile device. Moreover, the sensing unit 340 may sense a presence or absence of power provided by the power supply unit 390, a presence or absence of a coupling or other connection between the interface unit 370 and an external device, and the like. Meanwhile, the sensing unit 340 may include a proximity sensor 341 such as NFC (near field communication) and the like.
The output unit 350 generates output relevant to the senses of vision, hearing and touch, and may include the display 351, an audio output module 352, an alarm unit 353, a haptic module 354, and the like.
The display 351 is typically implemented to visually display (output) information processed by the mobile device 300. For instance, if the mobile terminal is operating in phone call mode, the display will generally provide a user interface (UI) or graphical user interface (GUI) related to a phone call. For another instance, if the mobile device 300 is in video call mode or photographing mode, the display 351 may display photographed or/and received images or UI/GUI.
The display module 351 may be implemented using known display technologies. These technologies include, for example, a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode display (OLED), a flexible display and a three-dimensional display. The mobile device 300 may include one or more of such displays.
Some of the displays can be implemented in a transparent or optical transmittive type, which can be called a transparent display. A representative example of the transparent display is the TOLED (transparent OLED). A rear configuration of the display 351 can be implemented as the optical transmittive type as well. In this configuration, a user may be able to see an object located in rear of a terminal body through a region occupied by the display 351 of the terminal body.
Two or more displays 351 can be provided to the mobile device 300 in accordance with an implementation type of the mobile device 300. For instance, a plurality of displays can be disposed on the mobile device 300 in a manner of being spaced apart from a single face or being integrally formed on a single face. Alternatively, a plurality of displays may be disposed on different faces of the mobile device 300, respectively.
If the display 351 and a sensor (hereinafter called ‘touch sensor’) for detecting a touch action configure a mutual layer structure, the display 351 is usable as an input device as well as an output device. In this case, the touch sensor can be configured with a touch film, a touch sheet, a touchpad, or the like.
The touch sensor can be configured to convert a pressure applied to a specific portion of the display 351 or a variation of capacitance generated from a specific portion of the display 351 into an electrical input signal. Moreover, the touch sensor is configurable to detect pressure of a touch as well as a touched position or size.
If a touch input is applied to the touch sensor, signal(s) corresponding to the touch input is transferred to a touch controller. The touch controller processes the signal(s) and then transfers the processed signal(s) to the controller 380. Therefore, the controller 380 is able to know whether a prescribed portion of the display 351 is touched.
A proximity sensor 341 can be disposed on an inner region of the mobile device enclosed by the touchscreen or near the touchscreen. The proximity sensor is a sensor that detects a presence or non-presence of an object approaching a prescribed detecting surface or an object existing around the proximity sensor using an electromagnetic field strength or infrared ray without mechanical contact. Hence, the proximity sensor is more durable than a contact type sensor and also has utility higher than that of the contact type sensor.
The proximity sensor may include one of a transmittive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a radio frequency oscillation proximity sensor, an electrostatic capacity proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, etc. If the touch screen includes the electrostatic capacity proximity sensor, it is configured to detect the proximity of a pointer using a variation of an electric field according to the proximity of the pointer. In this configuration, the touchscreen (or touch sensor) can be classified as a proximity sensor.
For clarity and convenience of explanation, an action for enabling the pointer approaching the touch screen to be recognized as placed on the touch screen may be named ‘proximity touch’ and an action of enabling the pointer to actually come into contact with the touch screen may be named ‘contact touch’. And, a position, at which the proximity touch is made to the touch screen using the pointer, may mean a position of the pointer vertically corresponding to the touch screen when the pointer makes the proximity touch.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch duration, a proximity touch position, a proximity touch shift state). Information corresponding to the detected proximity touch action and the detected proximity touch pattern can be output to the touch screen.
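The distinction between a proximity touch and a contact touch can be sketched as a simple classifier over the pointer-to-screen distance. The threshold values below are illustrative assumptions, not values from the specification:

```python
def classify_touch(distance_mm: float,
                   contact_threshold: float = 0.0,
                   proximity_range: float = 30.0) -> str:
    """Classify a pointer event per the terms defined above:
    'contact touch' when the pointer actually touches the screen,
    'proximity touch' when it hovers within the sensor's detection
    range (assumed 30 mm here), and 'none' otherwise."""
    if distance_mm <= contact_threshold:
        return "contact touch"
    if distance_mm <= proximity_range:
        return "proximity touch"
    return "none"
```

A real proximity sensor would also report a pattern (duration, shift state and the like) alongside the classification, as the paragraph above notes.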
The audio output module 352 functions in various modes including a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, and a broadcast reception mode to output audio data which is received from the wireless communication unit 310 or stored in the memory 360. During operation, the audio output module 352 may output an audio signal related to a function (e.g., call received, message received) executed in the mobile device 300. The audio output module 352 may include a receiver, a speaker, a buzzer and the like.
The alarm unit 353 outputs a signal for announcing the occurrence of an event of the mobile device 300. Typical events occurring in the mobile device may include a call signal received, a message received, a touch input received, and the like. The alarm unit 353 may output a signal for announcing the event occurrence by way of vibration as well as video or audio signal. The video or audio signal can be outputted via the display 351 or the audio output module 352. Hence, the display 351 or the audio output module 352 can be regarded as a part of the alarm unit 353.
The haptic module 354 generates various tactile effects that can be sensed by a user. Vibration is a representative one of the tactile effects generated by the haptic module 354. The strength and pattern of the vibration generated by the haptic module 354 are controllable. For instance, different vibrations can be output in a manner of being synthesized together or can be output in sequence. The haptic module 354 is able to generate various tactile effects as well as the vibration. For instance, the haptic module 354 may generate an effect attributed to the arrangement of pins vertically moving against a contact skin surface, an effect attributed to the injection/suction power of air through an injection/suction hole, an effect attributed to skimming over a skin surface, an effect attributed to a contact with an electrode, an effect attributed to an electrostatic force, and an effect attributed to the representation of a hot/cold sense using an endothermic or exothermic device. The haptic module 354 can be implemented to enable a user to sense the tactile effect through a muscle sense of a finger or an arm as well as to transfer the tactile effect through direct contact. Optionally, two or more haptic modules 354 can be provided to the mobile device 300 in accordance with a configuration type of the mobile device 300.
The memory 360 may store a program for an operation of the controller 380, or may temporarily store inputted/outputted data (e.g., phonebook, message, still image, video, etc.). And, the memory 360 may store data of vibrations and sounds of various patterns outputted in response to a touch input to the touchscreen.
The memory 360 may be implemented using any type or combination of suitable volatile and non-volatile memory or storage devices, including hard disk, random access memory (RAM), static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk, multimedia card micro type memory, card-type memory (e.g., SD memory or XD memory), or other similar memory or data storage device. Furthermore, the mobile device 300 is able to operate in association with the web storage for performing a storage function of the memory 360 on the Internet.
The interface unit 370 may play a role as a passage to every external device connected to the mobile device 300. The interface unit 370 receives data from the external devices, delivers a supplied power to the respective elements of the mobile device 300, or enables data within the mobile device 300 to be transferred to the external devices. For instance, the interface unit 370 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for coupling to a device having an identity module, audio input/output ports, video input/output ports, an earphone port, and the like.
The identity module is a chip for storing various kinds of information for authenticating a use authority of the mobile device 300 and may include User Identify Module (UIM), Subscriber Identity Module (SIM), Universal Subscriber Identity Module (USIM), and the like. A device having the identity module (hereinafter called ‘identity device’) can be manufactured in form of a smart card. Therefore, the identity device is connectible to the mobile device 300 through a port.
When the mobile device 300 is connected to an external cradle, the interface unit 370 becomes a passage for supplying the mobile device 300 with a power from the cradle or a passage for delivering various command signals input from the cradle by a user to the mobile device 300. Each of the various command signals inputted from the cradle or the power can operate as a signal for recognizing that the mobile device 300 is correctly installed in the cradle.
The controller 380 typically controls the overall operations of the mobile device 300. For example, the controller 380 performs the control and processing associated with voice calls, data communications, video calls, and the like. The controller 380 may include a multimedia module 381 that provides multimedia playback. The multimedia module 381 may be configured as a part of the controller 380, or implemented as a separate component. Moreover, the controller 380 is able to perform a pattern recognition processing for recognizing a writing input and a picture drawing input performed on the touchscreen as a text and an image, respectively.
The power supply unit 390 is supplied with an external or internal power and then supplies a power required for an operation of each component, under the control of the controller 380.
Various embodiments described herein may be implemented in a recording medium readable by a computer or a device similar to the computer using software, hardware, or some combination thereof for example.
For hardware implementation, the embodiments described herein may be implemented within at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, and a selective combination thereof. Such embodiments may also be implemented by the controller 380.
For software implementation, the embodiments described herein may be implemented with separate software modules, such as procedures and functions, each of which performs one or more of the functions and operations described herein. The software codes can be implemented with a software application written in any suitable programming language and may be stored in the memory 360, and executed by the controller 380.
Meanwhile, a mobile terminal may extend to a wearable device that is wearable on a user's body, beyond the dimension in which a user uses a mobile terminal held in a hand. Such wearable devices may include a smart watch, a smart glass, a head mounted display (HMD) and the like. Examples of a mobile terminal extending to a wearable device are described in the following.
A wearable device may be configured to exchange (or link) data with another mobile terminal 300. The short range communication module 314 may sense (or recognize) a communication-available wearable device around the mobile terminal 300. Moreover, if the sensed wearable device is a device authenticated to communicate with the mobile terminal 300, the controller 380 may send at least one portion of data processed by the mobile terminal 300 to the wearable device through the short range communication module 314. Hence, a user may use the data, which is processed by the mobile terminal 300, through the wearable device. For example, if the mobile terminal 300 receives an incoming call, a phone call is performed through the wearable device. If the mobile terminal 300 receives a message, the received message can be checked through the wearable device.
A digital device described in the present specification can be operated by a webOS platform. Hereinafter, such a processing as a webOS based configuration or algorithm may be performed by the controller of the above-described digital device or the like. In this case, the controller is used in a broad sense including the aforementioned controllers. Hence, in the following description, regarding a configuration for processing webOS based or related services, applications, contents and the like in a digital device, a hardware or component including software, firmware and the like may be described in a manner of being named a controller.
Such a webOS based platform may improve development independency and functional extensibility by integrating services, applications and the like based on Luna-service Bus for example and is able to increase application development productivity based on a web application framework. In addition, system resources and the like are efficiently used through a webOS process and resource management, whereby multitasking can be supported.
Meanwhile, a webOS platform described in the present specification may be available not only for stationary devices such as personal computers (PCs), TVs and settop boxes (STBs) but also for mobile devices such as cellular phones, smartphones, tablet PCs, laptops, wearable devices, and the like.
A conventional software structure for a digital device is a monolithic structure that solves problems on a per-market basis, relies on multi-threading based signal processing, and results in a closed product with difficulty in external application. In pursuit of new platform based development, cost innovation through chipset replacement, and efficiency in UI application and external application development, layering and componentization are performed to obtain a 3-layered structure and an add-on structure for an add-on, a single source product and an open application. Recently, modular design of a software structure has been conducted in order to provide a web open application programming interface (API) for an ecosystem and a modular architecture of a functional unit or a native open API for a game engine, and thus a multi-process structure based on a service structure has been produced.
The architecture of a webOS platform is described with reference to
The platform can be mainly classified into a system library based webOS core platform, an application, a service and the like.
The architecture of the webOS platform includes a layered structure. OS, system library(s), and applications exist in a lowest layer, a next layer and a most upper layer, respectively. First of all, regarding the lowest layer, as a Linux kernel is included as an OS layer, Linux may be included as an OS of the digital device. Above the OS layer, a BSP/HAL (Board Support Package/Hardware Abstraction Layer) layer, a webOS core modules layer, a service layer, a Luna-Service Bus layer, an Enyo framework/NDK (Native Developer's Kit)/QT layer, and an application layer (as a most upper layer) exist in order. Meanwhile, some layers can be omitted from the aforementioned webOS layer structure. A plurality of layers can be integrated into a single layer, and vice versa. The webOS core module layer may include LSM (Luna Surface Manager) for managing a surface window and the like, SAM (System & Application Manager) for managing launch, running state and the like of an application, WAM (Web Application Manager) for managing a Web application and the like based on WebKit, etc.
The LSM manages an application window appearing on a screen. The LSM is in charge of a display hardware (HW), provides a buffer capable of rendering substance required for applications, and outputs a composition of rendering results of a plurality of applications to a screen.
The SAM manages a performance policy according to conditions of the system and applications.
Meanwhile, since webOS may regard a web application (Web App) as a basic application, the WAM is based on Enyo Framework.
A service use of an application is performed through the Luna-service Bus. A new service may be registered on the Bus, and an application can find and use a service it requires.
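The register/find pattern described above can be sketched with a toy in-process registry; the class, method names and service URI below are illustrative assumptions and do not reflect the actual Luna-service Bus API:

```python
class ServiceBusSketch:
    """Toy model of the pattern above: services register on the bus
    under a name, and applications look them up and call them."""

    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        """Register a new service handler on the bus."""
        self._services[name] = handler

    def call(self, name, *args):
        """Find a registered service by name and invoke it."""
        if name not in self._services:
            raise LookupError(f"no service registered as {name!r}")
        return self._services[name](*args)

bus = ServiceBusSketch()
bus.register("com.example.tv/volume", lambda level: f"volume set to {level}")
result = bus.call("com.example.tv/volume", 5)
```

The real bus additionally handles inter-process messaging and permissions; the sketch only shows the registration/lookup contract.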
The service layer may include services of various service levels such as a TV service, a webOS service and the like. Meanwhile, the webOS service may include a media server, a Node.js service and the like. Particularly, the Node.js service supports javascript for example.
The webOS service is a Linux process implementing a function logic and can communicate through the Bus. It can be mainly divided into four parts and is constructed with a TV process, services migrating into webOS from an existing TV or services corresponding to manufacturer-differentiated services, a webOS common service, and a Node.js service developed with javascript and used through Node.js.
The application layer may include all applications supportable by the digital device, e.g., TV application, showcase application, native application, Web application, etc.
Applications on webOS may be sorted into Web Application, PDK (Palm Development Kit) application, QML (Qt Meta Language or Qt Modeling Language) application and the like according to implementing methods. The Web Application is based on a WebKit engine and is run on WAM Runtime. Such a web application is based on Enyo Framework or may be run in a manner of being developed based on general HTML5, CSS (cascading style sheets), and javascript. The PDK application includes a native application and the like developed with C/C++ based on a PDK provided for a 3rd party or an external developer. The PDK means a set of development libraries and tools provided to enable a third party (e.g., a game, etc.) to develop a native application (C/C++). The PDK application can be used to develop an application for which performance is significant. The QML application is a Qt based native application and includes basic applications (e.g., card view, home dashboard, virtual keyboard, etc.) provided with the webOS platform. Herein, QML is a mark-up language of a script type instead of C++. Meanwhile, in the above description, the native application means an application that is developed with C/C++, compiled, and run in binary form. Such a native application has an advantage of a fast running speed.
The following description is made with reference to
Node.js services (e-mail, contact, calendar, etc.) based on HTML5, CSS and JavaScript, webOS services such as logging, backup, file notify, database (DB), activity manager, system policy, AudioD (Audio Daemon), update, media server and the like, TV services such as EPG (Electronic Program Guide), PVR (Personal Video Recorder), data broadcasting and the like, CP services such as voice recognition, Now On, notification, search, ACR (Auto Content Recognition), CBOX (Contents List Browser), wfdd, DMR, remote application, download, SPDIF (Sony/Philips Digital Interface Format) and the like, native applications such as PDK applications, browser, QML applications and the like, and Enyo-Framework-based UI-related TV applications and Web applications are processed via the Luna-Service Bus through webOS core modules such as the aforementioned SAM, WAM and LSM. Meanwhile, in the above description, it is not mandatory for the TV applications and the Web applications to be Enyo-Framework-based or UI-related.
CBOX can manage a list and metadata for contents of external devices connected to the TV, such as USB, DLNA, Cloud and the like. Meanwhile, the CBOX can output a content listing of various content containers such as USB, DMS, DVR, Cloud and the like in the form of an integrated view. And, the CBOX shows content listings of various types such as picture, music, video and the like and is able to manage the corresponding metadata. Besides, the CBOX can output contents of an attached storage in real time. For instance, if a storage device such as a USB device is plugged in, the CBOX should be able to output a content list of the corresponding storage device. In doing so, a standardized method for the content list processing may be defined. And, the CBOX may accommodate various connecting protocols.
SAM is provided to reduce module complexity and enhance extensibility. Namely, for instance, since an existing system manager handles various functions (e.g., system UI, window management, web application runtime, constraint condition processing on UX, etc.) in a single process, implementation complexity is very high. Hence, by separating major functions and clarifying inter-function interfaces, implementation complexity can be lowered.
LSM supports system UX implementations (e.g., card view, launcher, etc.) so as to be independently developed and integrated, and also supports the system UX implementations to easily cope with a product requirement change and the like. In case of synthesizing a plurality of application screens like App-on-App, the LSM enables multitasking by utilizing hardware (HW) resources to the maximum and is able to provide a window management mechanism for multi-window, 21:9 and the like. LSM supports implementation of a system UI based on QML and enhances its development productivity. QML UX can easily configure a screen layout and a UI component view and facilitates development of code for processing a user input. Meanwhile, an interface between QML and a webOS component is achieved through a QML extension plug-in, and a graphic operation of an application may be based on the wayland protocol, a luna-service call and the like. LSM is an abbreviation of Luna Surface Manager, as described above, and performs the function of an application window compositor. LSM synthesizes independently developed applications, UI components and the like and then outputs the synthesized result to a screen. With respect to this, if components such as a recents application, a showcase application, a launcher application and the like render contents of their own, respectively, LSM defines an output region, an interoperating method and the like as a compositor. So to speak, the LSM (i.e., compositor) processes graphic synthesis, focus management, input events and the like. In doing so, LSM receives an event, a focus and the like from an input manager. Such an input manager may cover a remote controller, an HID (e.g., mouse & keyboard), a joystick, a game pad, an application remote, a pen touch and the like. Thus, LSM supports a multi-window model and, owing to system UI features, can run simultaneously with respect to all applications.
With respect to this, LSM can support launcher, recents, setting, notification, system keyboard, volume UI, search, finger gesture, voice recognition (STT (Speech to Text), TTS (Text to Speech), NLP (Natural Language Processing), etc.), pattern gesture (camera, MRCU (Mobile Radio Control Unit)), Live menu, ACR (Auto Content Recognition), and the like.
Referring to
If web application based graphic data (or an application) is generated as a UI process by the web application manager 610, the generated graphic data is forwarded to the LSM 630 unless it is for a full-screen application. Meanwhile, the web application manager 610 receives an application generated by the webkit 620, which shares the GPU (graphic processing unit) memory for graphic management between the UI process and the web process, and then forwards it to the LSM 630 if the application is not a full-screen application. If the application is a full-screen application, it can bypass the LSM 630. In this case, it may be directly forwarded to the graphic manager 640.
The LSM 630 sends the received UI application to a wayland compositor via a wayland surface. The wayland compositor appropriately processes it and then forwards it to the graphic manager. Thus, the graphic data forwarded by the LSM 630 is forwarded to the graphic manager compositor via the LSM GM surface of the graphic manager 640 for example.
Meanwhile, as described above, the full-screen application is directly forwarded to the graphic manager 640 without passing through the LSM 630. Such an application is processed by the graphic manager compositor via the WAM GM surface. The graphic manager processes all graphic data within the webOS device. The graphic manager receives all graphic data through GM surfaces, e.g., data of a data broadcasting application, a caption application and the like as well as the data through the LSM GM surface and the data through the WAM GM surface, and then processes them so as to be appropriately outputted to the screen. Herein, a function of the GM compositor is equal or similar to that of the aforementioned compositor.
Referring to
The media server can add robustness to system stability. For instance, by removing an erroneous pipeline per pipeline in the course of a media play and then restarting the media play, other media plays are not affected even if such an error occurs. Such a pipeline is a chain connecting the respective unit functions (e.g., decoding, analysis, output, etc.) in case of a media play request, and necessary unit functions may be changed according to a media type and the like.
The media server may have extensibility. For instance, the media server can add a pipeline of a new type without affecting an existing implementation scheme. For instance, the media server can accommodate a camera pipeline, a video conference (Skype) pipeline, a third-party pipeline and the like.
The media server can handle a general media play and a TV task execution as separate services, respectively. The reason for this is that an interface of a TV service is different from that of a media play case. In the above description, the media server supports operations of ‘setchannel’, ‘channelup’, ‘channeldown’, ‘channeltuning’, ‘recordstart’ and the like in association with the TV service but supports operations of ‘play’, ‘pause’, ‘stop’ and the like in association with the general media play, thereby supporting different operations for the two services, respectively. Thus, the media server is able to handle the services separately.
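For illustration only, the separation of operation sets described above may be sketched as follows. The service objects, handler return values and the dispatch function are hypothetical and do not represent an actual uMS interface; only the operation names come from the description above.

```javascript
// Sketch: a media-server front end routing distinct operation sets to the
// TV service and to the general media play service (illustrative names only).
const tvService = {
  ops: ['setchannel', 'channelup', 'channeldown', 'channeltuning', 'recordstart'],
  handle: (op) => `tv:${op}`,
};
const mediaService = {
  ops: ['play', 'pause', 'stop'],
  handle: (op) => `media:${op}`,
};

function dispatch(op) {
  // Each service is handled separately because its interface differs.
  if (tvService.ops.includes(op)) return tvService.handle(op);
  if (mediaService.ops.includes(op)) return mediaService.handle(op);
  throw new Error(`unsupported operation: ${op}`);
}
```

In this sketch, `dispatch('channelup')` reaches the TV service while `dispatch('play')` reaches the media play service, so the two interfaces stay independent.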
The media server may control or manage resource management functions in an integrated manner. Hardware resource allocation, recovery and the like in a device are performed in the media server in an integrated manner. Particularly, a TV service process delivers a currently running task, a current resource allocation status and the like to the media server. Each time media is executed, the media server secures resources, activates a pipeline, grants execution by a priority (e.g., policy), and performs resource recovery from other pipelines and the like, in response to a media execution request and based on the current resource status occupied by each pipeline. Herein, a predefined execution priority and necessary resource information for a specific request are managed by a policy manager, and a resource manager can handle resource allocation, recovery and the like by communicating with the policy manager.
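The priority-based grant and recovery described above can be sketched as follows. The class names, the numeric priority convention (lower number means higher priority) and the resource counts are assumptions for illustration; they are not the actual policy manager implementation.

```javascript
// Sketch: a policy manager that picks a lower-priority pipeline to recover
// resources from, and a resource manager that consults it on shortage.
class PolicyManager {
  // priorities: map of pipeline type -> number; lower number = higher priority.
  constructor(priorities) { this.priorities = priorities; }
  victim(pipelines, requesterType) {
    // Candidate = running pipeline with the worst (largest) priority number,
    // recoverable only if it is lower-priority than the requester.
    const sorted = [...pipelines].sort(
      (a, b) => this.priorities[b.type] - this.priorities[a.type]);
    const cand = sorted[0];
    if (cand && this.priorities[cand.type] > this.priorities[requesterType]) return cand;
    return null;
  }
}

class ResourceManager {
  constructor(total, policy) { this.free = total; this.policy = policy; this.running = []; }
  acquire(type, need) {
    while (this.free < need) {
      const v = this.policy.victim(this.running, type);
      if (!v) return false;               // request denied: nothing to recover
      this.free += v.need;                // recover the victim's resources
      this.running = this.running.filter(p => p !== v);
    }
    this.free -= need;
    this.running.push({ type, need });
    return true;
  }
}
```

A higher-priority request (e.g., recording) can thus displace a lower-priority pipeline, while the reverse request is declined.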
The media server can retain an ID (identifier) for every operation related to a play. For instance, based on an identifier, the media server can give a command by indicating a specific pipeline. For two or more media plays, the media server may give commands to the pipelines by distinguishing the two from each other. The media server may be in charge of a play of HTML5 standard media.
Besides, the media server may follow a TV reconfiguration range for a separate service processing of a TV pipeline. The media server can be designed irrespective of the TV reconfiguration range. If the TV is not separately service-processed, when a problem arises from a specific task, the TV may be re-executed entirely.
The media server is a so-called uMS, i.e., a micro media server. Herein, a media player corresponds to a media client, which may mean a webkit for an HTML5 video tag, a camera, a TV, Skype, a 2nd screen and the like.
A core function of the media server is the management of micro resources through a resource manager, a policy manager and the like. With respect to this, the media server performs a playback control role for web standard media contents. Regarding this, the media server may manage pipeline controller resources.
Such a media server supports extensibility, reliability, efficient resource usage and the like for example.
So to speak, the uMS, i.e., the media server, manages and controls overall use of resources in a webOS device for an appropriate processing, e.g., resources for a cloud game, an MVPD (pay service, etc.), a camera preview, a 2nd screen, Skype and the like as well as TV resources, thereby functioning to manage and control efficient usage. Meanwhile, when resources are used, each resource uses a pipeline, for example. And, the media server can manage and control generation, deletion, usage and the like of pipelines for resource management overall. Here, a pipeline may be generated if a media related to a task starts a job such as request parsing, decoding streaming, video output or the like. For instance, in association with a TV service or application, watching, recording, channel tuning or the like is individually processed in a manner that a resource usage or the like is controlled through a pipeline generated in response to a corresponding request.
A processing structure of a media server and the like are described in detail with reference to
In
The application or service is provided with various clients according to its property and is able to exchange data with the media server 820 or the pipelines through them.
The clients may include a uMedia client (webkit) for the connection to the media server 820, an RM (resource manager) client (C/C++) and the like for example.
The application including the uMedia client, as described above, is connected to the media server 820. In particular, the uMedia client corresponds to a video object to be described later. Such a client uses the media server 820 for an operation of a video in response to a request or the like. Here, the video operation relates to a video status; loading, unloading, play (or playback/reproduce), pause, stop and the like may cover all status data related to video operations. Each operation or status of a video can be processed through individual pipeline generation. Hence, the uMedia client sends status data related to the video operation to the pipeline manager 822 in the media server.
The pipeline manager 822 obtains information on a current resource of a device through data communication with the resource manager 824 and makes a request for allocation of a resource corresponding to the status data of the uMedia client. In doing so, the pipeline manager 822 or the resource manager 824 controls the resource allocation through data communication with the policy manager 826 if necessary in association with the resource allocation and the like. For instance, if a resource to be allocated by the resource manager in response to the request made by the pipeline manager 822 does not exist or is insufficient, an appropriate resource allocation or the like according to the request can be performed according to a priority comparison of the policy manager 826 and the like. Meanwhile, for the resource allocated according to the resource allocation of the resource manager 824, the pipeline manager 822 makes a request to a media pipeline controller 828 for generation of a pipeline for an operation according to the uMedia client's request.
The media pipeline controller 828 generates a necessary pipeline under the control of the pipeline manager 822. Regarding the generated pipelines, as shown in the drawing, pipelines related to play, pause, stop and the like can be generated as well as a media pipeline and a camera pipeline. Meanwhile, the pipelines may include pipelines for HTML5, Web CP, smartshare play, thumbnail extraction, NDK, cinema, MHEG (Multimedia and Hypermedia Information coding Experts Group) and the like.
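The flow just described, in which the uMedia client's status data travels through the pipeline manager to the resource manager and then to the media pipeline controller, may be sketched as follows. All object and method names are illustrative stand-ins for the modules 822, 824 and 828; only the order of calls follows the description above.

```javascript
// Sketch: request flow uMedia client -> pipeline manager -> resource manager
// -> media pipeline controller, recorded in a trace log for inspection.
const log = [];
const resourceManager = {                       // stands in for 824
  allocate(req) { log.push(`rm:allocate:${req}`); return { id: 1, kind: req }; },
};
const mediaPipelineController = {               // stands in for 828
  generate(op, res) { log.push(`ctrl:pipeline:${op}`); return { op, res, state: 'created' }; },
};
const pipelineManager = {                       // stands in for 822
  request(statusData) {
    // 1) obtain/allocate the needed resource, 2) have the controller
    //    generate a pipeline for the requested operation.
    const res = resourceManager.allocate(statusData.resource);
    return mediaPipelineController.generate(statusData.op, res);
  },
};
const uMediaClient = {
  send(op, resource) { return pipelineManager.request({ op, resource }); },
};
```

Running `uMediaClient.send('play', 'vdec')` produces a created pipeline and a trace showing allocation before pipeline generation.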
Besides, pipelines may include a service based pipeline (self-pipeline) and a URI based pipeline (media pipeline) for example.
Referring to
Hence, by receiving the resource management of the resource manager 824 through the uMS connector, the application or service can cope with the request of the RM client. Such an RM client may process services such as native CP, TV service, 2nd screen, flash player, YouTube MSE (media source extensions), cloud game, Skype and the like. In this case, as described above, the resource manager 824 can manage resources through appropriate data communication with the policy manager 826 if necessary for the resource management.
Meanwhile, the URI based pipeline is processed through the media server 820, unlike the RM client case of directly processing media. The URI based pipelines may include a player factory, Gstreamer, a streaming plug-in, a DRM (Digital Rights Management) plug-in pipeline and the like.
A method of interfacing between an application and media services is described as follows.
There is an interfacing method using a service on a web application. This may be a Luna call method using PSB (palm service bridge) or a method using Cordova, which extends a display with a video tag. Besides, there may be a method of using the HTML5 standard for a video tag or media element. And, there is a method of interfacing using a service in the PDK. Alternatively, there is a method of using a service in an existing CP, which is usable by extending a plug-in of an existing platform on the basis of luna for backward compatibility.
Finally, there is an interfacing method in case of non-webOS. In this case, it is able to interface by directly calling a luna bus.
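The shape of a luna-service call mentioned in the interfacing methods above can be sketched with a mock bus. The `luna://` URI scheme follows the webOS convention, but the `makeBus` helper, the service URI and the `getVolume` method here are hypothetical stand-ins, not the real PSB or an actual webOS service.

```javascript
// Sketch: a stand-in luna bus that routes (uri, method) to a registered
// service and invokes success/failure callbacks, mimicking a luna call.
function makeBus(services) {
  return {
    call(uri, method, params, onSuccess, onFailure) {
      const svc = services[uri];
      if (svc && svc[method]) onSuccess(svc[method](params));
      else onFailure({ errorText: `no such service/method: ${uri}/${method}` });
    },
  };
}

// A toy audio service registered on the mock bus (values illustrative).
const bus = makeBus({
  'luna://com.webos.audio': {
    getVolume: () => ({ returnValue: true, volume: 11 }),
  },
});
```

A caller would then issue `bus.call('luna://com.webos.audio', 'getVolume', {}, onSuccess, onFailure)`, which mirrors the request/response pattern of a service call over the luna-service bus.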
Seamless change is processed by a separate module (e.g., TVWIN), which is a process for preferentially showing a TV on a screen without webOS and then processing a seamless change before or during webOS booting. Since a booting time of webOS is considerably long, this module is used to provide basic functions of a TV service preferentially for a quick response to a user's power-on request. And, the module is a part of the TV service process and supports a seamless change capable of providing fast booting and basic TV functions, a factory mode and the like. And, the module may be in charge of switching from a non-webOS mode to a webOS mode.
Referring to
In
A service, a web application or a PDK application (hereinafter ‘application’) is connected to various service processing configurations through a luna-service bus. Through it, the application operates or an operation of the application is controlled.
A corresponding data processing path is changed according to a type of an application. For instance, if the application relates to image data from a camera sensor, it is processed by being sent to a camera processor 930. Herein, the camera processor 930 includes a gesture module, a face detection module and the like and processes the received image data of the application. Herein, in case of data requiring usage of a pipeline and the like automatically or according to a user's selection, the camera processor 930 may process the corresponding data by generating the pipeline through a media server processor 910.
Alternatively, if an application includes audio data, the corresponding audio can be processed through an audio processor (AudioD) 940 and an audio module (PulseAudio) 950. For instance, the audio processor 940 processes audio data received from the application and then sends it to the audio module 950. In doing so, the audio processor 940 may determine the processing of the audio data by including an audio policy manager. The processed audio data is processed and handled by the audio module 950. Meanwhile, the application may notify data related to the audio data processing to the audio module 950, or such data may be notified to the audio module 950 by a related pipeline. The audio module 950 includes ALSA (Advanced Linux Sound Architecture).
Or, in case that an application includes or processes (hereinafter ‘includes’) a DRM hooked content, the corresponding content data is sent to a DRM service processor 960. The DRM service processor 960 processes the DRM hooked content data by generating a DRM instance. Meanwhile, for the processing of the DRM hooked content data, the DRM service processor 960 can be connected to a DRM pipeline in a media pipeline through the luna-service bus.
A processing for a case that an application includes media data or TV service data (e.g., broadcast data) is described as follows.
The following description is made with reference to
First of all, in case that an application includes TV service data, it is processed by the TV service processor 820/920.
Herein, the TV service processor 820 may include at least one of a DVR/channel manager, a broadcast module, a TV pipeline manager, a TV resource manager, a data broadcast module, an audio setting module, a path manager and the like. Alternatively, the TV service processor 920 in
In the present specification, the TV service processor may be implemented into the configuration shown in
Based on the attribute or type of the TV service data received from the application, the TV service processor 820/920 sends DVR- or channel-associated data to the DVR/channel manager and also sends it to the TV pipeline manager so as to generate and process a TV pipeline. Meanwhile, if the attribute or type of the TV service data is broadcast content data, the TV service processor 820 generates and processes a TV pipeline through the TV pipeline manager in order to process the corresponding data through the broadcast module.
Or, a JSON (JavaScript Object Notation) file or a file composed in C is processed by the TV broadcast handler, sent to the pipeline manager through the TV broadcast interface, and then processed by generating a TV pipeline. In this case, the TV broadcast interface sends the data or file through the TV broadcast handler to the TV pipeline manager on the basis of the TV service policy so that the data or file can be referred to for the pipeline generation.
In the following, a processing process within the TV service processor 920, and more particularly, below the TV broadcast interface is described in detail.
The TV broadcast interface may perform a controller function of the TV service processor 920. The TV broadcast interface makes a request for a pipeline generation to the TV pipeline manager. Then, the TV pipeline manager generates a TV pipeline and makes a request for resources to the TV resource manager. If the TV resource manager makes a resource request to the media server through a uMS connector and then obtains resources, the TV resource manager returns them to the TV pipeline manager.
The TV pipeline manager arranges the returned resources within the generated TV pipeline and registers the pipeline information with the path manager. Thereafter, the path manager returns the result to the TV pipeline manager, and the TV pipeline manager returns the pipeline to the TV broadcast interface.
Thereafter, the TV broadcast interface requests a channel change and the like by communicating with a TV middleware (MW), and the TV middleware returns the result.
Through the aforementioned process, the TV service can be processed.
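The sequence above, from the TV broadcast interface down to the path manager and TV middleware, may be sketched as follows. The module names follow the text, but every method, id and return value is an illustrative assumption; the real control flow includes steps (e.g., the uMS connector round trip) that are collapsed here.

```javascript
// Sketch of the control sequence below the TV broadcast interface,
// recorded into a trace for inspection (behavior illustrative only).
const trace = [];
const tvResourceManager = {
  // In the text this goes through the uMS connector to the media server.
  request(kind) { trace.push('resource:' + kind); return { kind }; },
};
const pathManager = {
  register(info) { trace.push('path:' + info.id); return true; },
};
const tvPipelineManager = {
  nextId: 1,
  generate(kind) {
    const res = tvResourceManager.request(kind);   // obtain resources
    const pipeline = { id: this.nextId++, res };   // arrange them in a pipeline
    pathManager.register(pipeline);                // register pipeline info
    return pipeline;
  },
};
const tvBroadcastInterface = {
  open(kind) { trace.push('open:' + kind); return tvPipelineManager.generate(kind); },
  // Channel change goes to the TV middleware (MW) in the text.
  channelChange(p, ch) { trace.push(`mw:tune:${p.id}:${ch}`); return true; },
};
```

Opening a 'watch' pipeline and then tuning channel 7 yields a trace in the same order as the description: request, resource grant, path registration, then the middleware call.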
The TV pipeline manager may be controlled by the TV resource manager when generating one or more pipelines in response to a TV pipeline generation request from a processing module or manager in the TV service. Meanwhile, in order to request the status and allocation of a resource allocated for the TV service in response to a TV pipeline generation request made by the TV pipeline manager, the TV resource manager may be controlled by the TV policy manager and performs data communication with the media server processor 810/910 through the uMS connector. The resource manager in the media server processor delivers the status and allocation availability of a resource for a current TV service in response to a request made by the TV resource manager. For instance, as a result of confirmation by the resource manager within the media server processor 810/910, if all resources for the TV service are already allocated, it is able to notify the TV resource manager that all current resources are completely allocated. In doing so, the resource manager in the media server processor, together with the notification, may request or assign TV pipeline generation for the requested TV service by removing a prescribed TV pipeline according to a priority or a prescribed reference from the TV pipelines previously assigned for the TV service. Alternatively, according to a status report of the resource manager in the media server processor 810/910, the TV resource manager may control TV pipelines so as to be appropriately removed, added, or established.
Meanwhile, BSP supports backward compatibility with an existing digital device for example.
The above-generated TV pipelines may operate appropriately in the corresponding processing process under the control of the path manager. The path manager may determine or control a processing path or process of pipelines by considering an operation of a pipeline generated by the media server processor 810/910 as well as the TV pipeline in the processing process.
If the application includes media data instead of TV service data, the data is processed by the media server processor 810/910. Herein, the media server processor 810/910 includes a resource manager, a policy manager, a media pipeline manager, a media pipeline controller and the like. Meanwhile, various pipelines generated under the control of the media pipeline manager and the media pipeline controller may include a camera preview pipeline, a cloud game pipeline, a media pipeline and the like. Streaming protocol, auto/static gstreamer, DRM and the like may be included in the media pipeline, of which processing flow may be determined under the control of the path manager. The former description with reference to
In the present specification, the resource manager in the media server processor 810/910 can perform resource management on a counter basis, for example.
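Counter-based resource management as mentioned above can be sketched minimally as follows. The class name, the resource name `vdec` and the limits are illustrative assumptions; the point is only that each hardware resource is tracked as a bounded counter.

```javascript
// Sketch: counter-based resource accounting - each resource has a limit,
// acquire increments the counter, release decrements it.
class CounterResourceManager {
  constructor(limits) { this.limits = { ...limits }; this.used = {}; }
  acquire(name, n = 1) {
    const used = this.used[name] || 0;
    if (used + n > this.limits[name]) return false; // would exceed the counter
    this.used[name] = used + n;
    return true;
  }
  release(name, n = 1) {
    this.used[name] = Math.max(0, (this.used[name] || 0) - n);
  }
}
```

With a limit of two video decoders, a third acquisition fails until one is released.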
The media server design on the aforementioned webOS platform is described in detail as follows.
A media server is a media framework that enables 3rd-party multimedia pipeline(s) to interface with the webOS platform. The media server can control, manage, isolate and deconflict resources so as to enable the 3rd-party multimedia pipeline(s) to be compliant. Such a media server may be regarded as a platform module configured to provide a generalized API enabling an application to perform a media play and to manage hardware resources and policy consistently. Meanwhile, a design of a media server is devised to reduce complexity through media processing generalization and associated module separation.
The core of such a media server is to provide integration of service interface and webOS UI. To this end, a media server controls a resource manager, a policy manager and a pipeline manager and provides an API access according to a resource manager query.
A uMS connector is a main API or SDK that enables client media pipeline processes to interface with a media server. The uMS connector interfaces through events or messages. The client media pipelines implement client media pipeline state events for enabling load, play, pause, seek, stop, unload, release_resource, acquire_resource and the like.
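The pipeline state events listed above suggest a small state machine, sketched below. The state names and the transition table are illustrative assumptions; only the event names (load, play, pause, seek, stop, unload) come from the text, and resource events are omitted for brevity.

```javascript
// Sketch: a client media pipeline as a state machine that records the
// events it would report to the media server (transitions illustrative).
const transitions = {
  unloaded: ['load'],
  loaded: ['play', 'unload'],
  playing: ['pause', 'seek', 'stop', 'unload'],
  paused: ['play', 'seek', 'stop', 'unload'],
  stopped: ['play', 'unload'],
};
const nextState = {
  load: 'loaded', play: 'playing', pause: 'paused',
  seek: null /* seek keeps the current state */, stop: 'stopped', unload: 'unloaded',
};

function makePipeline() {
  let state = 'unloaded';
  const events = [];
  return {
    send(ev) {
      if (!transitions[state].includes(ev)) return false; // illegal in this state
      events.push(ev);                                    // event reported outward
      if (nextState[ev]) state = nextState[ev];
      return true;
    },
    get state() { return state; },
    get events() { return events; },
  };
}
```

For example, play is rejected before load, and a load/play/pause sequence leaves the pipeline paused.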
A uMedia API provides C and C++ APIs to the media server.
The media resource manager provides a method of describing the use of media hardware resources and the use of pipeline client resources using a single simple configuration file. The media resource manager provides all the capability and information required for implementing a default or 3rd-party media policy management.
The media policy manager functions when the resource manager declines a media pipeline due to a resource conflict. The policy manager can provide a consistent API and SDK to enable 3rd-party policy manager implementations. The policy manager supports LRU (least recently used) matching of media pipelines and may be used for one or more conflicted resources.
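LRU matching as mentioned above can be sketched in a few lines. The pipeline record shape (`resources`, `lastUsed`) is a hypothetical representation introduced only for this illustration.

```javascript
// Sketch: among pipelines holding the contended resource, choose the one
// least recently used as the outgoing pipeline (LRU matching).
function selectLru(pipelines, resource) {
  const holders = pipelines.filter(p => p.resources.includes(resource));
  if (holders.length === 0) return null;       // no pipeline holds the resource
  return holders.reduce((a, b) => (a.lastUsed <= b.lastUsed ? a : b));
}
```

Given two holders of a video decoder, the one with the older `lastUsed` timestamp is selected; pipelines holding unrelated resources are ignored.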
The pipeline manager tracks and maintains client media pipelines. The pipeline controller provides the pipeline manager with a consistent API so as to enable the pipeline manager to control and manage the client media pipelines.
The media server communicates with the resource manager through a library call, and the resource manager can communicate with TV services and a media pipeline through Luna-service Bus.
The media resource manager uses an overall configurable configuration file to describe media hardware and media client pipelines, detects resource conflicts, and collects all information necessary to implement media policy management.
A media policy manager reads the policy_select and policy_action fields of a resource configuration file. On a resource contention, it attempts to select an active pipeline described by the policy_select field and issues the action of the policy_action field against the outgoing/selected pipelines. The selection function may include a parameter supported by a pipeline configuration setting entry. Policy actions include ‘unload’ and ‘release’. All pipelines support an unload command for releasing all allocated resources. A pipeline can additionally support a release command to release a specific resource. In the above description, the release command is provided for fast switching between pipelines contending for common resources, since an unload command of all resources may not be required for deconflicting an incoming pipeline.
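A resource configuration entry carrying the policy_select and policy_action fields might look as sketched below. The surrounding structure, the resource ids and the pipeline types are invented for illustration; only the two field names and the ‘unload’/‘release’ actions come from the text.

```javascript
// Sketch: a configuration object with policy_select / policy_action fields
// per pipeline type, plus a helper that picks the action to issue against
// an outgoing pipeline (all values illustrative).
const resourceConfig = {
  resources: [{ id: 'VDEC', qty: 2 }],
  pipelines: [
    { type: 'media',  policy_select: 'LRU', policy_action: 'unload' },
    { type: 'camera', policy_select: 'LRU', policy_action: 'release' },
  ],
};

function actionFor(config, type) {
  const entry = config.pipelines.find(p => p.type === type);
  // Every pipeline supports unload, so it serves as the fallback action.
  return entry ? entry.policy_action : 'unload';
}
```

Here a contended camera pipeline would be asked only to release the specific resource, enabling a fast switch, while a media pipeline would be unloaded entirely.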
A pipeline manager manages pipeline controllers. The pipeline manager maintains a running queue of the pipeline controllers and provides unique indexing for incoming messages from application(s) through the media server.
The pipeline controller maintains a relation with an associated media client pipeline process. The pipeline controller maintains all related states and provides a media client pipeline control interface to the pipeline manager. A pipeline client process is an individual process that uses the uMS connector to provide a control interface to the media server and the like. A pipeline (client) media technology (Gstreamer, Stage Fright) may be individually and completely decoupled from media server management and services.
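The pipeline manager's running queue and unique message indexing described above can be sketched as follows. The controller record shape (`pipelineId`, `inbox`) and the routing rule are illustrative assumptions.

```javascript
// Sketch: a pipeline manager that keeps a queue of controllers and stamps
// every incoming message with a unique, monotonically increasing index.
class PipelineManager {
  constructor() { this.queue = []; this.nextIndex = 1; }
  addController(controller) { this.queue.push(controller); }
  route(message) {
    const indexed = { index: this.nextIndex++, ...message };
    const target = this.queue.find(c => c.pipelineId === message.pipelineId);
    if (target) target.inbox.push(indexed);   // deliver to the matching controller
    return indexed.index;                     // unique index for this message
  }
}
```

Two messages routed to the same controller receive indices 1 and 2 and arrive in order in that controller's inbox.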
Meanwhile, ‘image data’ disclosed in the present specification may be used to inclusively mean moving picture data (e.g., video) as well as still image data (e.g., picture/photograph, thumbnail image, capture image, etc.).
In the present specification, a menu is provided in a manner of overlaying an application running screen outputted through a full or main region of the screen by a web launcher or a menu launcher (hereinafter named ‘menu launcher’) in a webOS loaded digital device.
The menu is configured in a manner of including a first part including history data for a previously watched or run application and a second part listing one or more runnable applications. Here, the second part may provide the application list and also provide a content list related to an application currently provided through a main region. So to speak, the second part may provide at least one of a first mode and a second mode. In the following, the first mode of the second part is described as providing an application list, and the second mode of the second part is described as providing a content list. Whether to provide the first mode or the second mode to the second part in providing a menu launcher may follow settings or be determined according to an application currently provided to a main region. The above determination may be made and provided according to various references, such as an application attribute of the main region and the like, depending on a user as well as the above-mentioned reference. Meanwhile, the first part and the second part can be called a recent part and a list part, respectively.
Each of the recent part and the list part may include at least one menu item. The menu item may be a unit for sorting, identifying and accessing a content, application, data or the like in each part. And, the menu item may be provided through a window.
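The two-part menu model described above, i.e., a recent part of history items and a list part whose mode follows the current application, may be sketched as follows. The function name, the application `type` attribute and the mode rule are illustrative assumptions.

```javascript
// Sketch: building a two-part menu - a recent part of history menu items
// and a list part whose mode (application list vs. content list) depends
// on the application currently in the main region.
function buildMenu(history, apps, contents, currentApp) {
  // Assumed rule: a TV-like application yields the content-list mode,
  // any other application yields the application-list mode.
  const mode = currentApp.type === 'tv' ? 'content' : 'application';
  return {
    recentPart: history.map(h => ({ kind: 'menu-item', target: h })),
    listPart: {
      mode,
      items: (mode === 'content' ? contents : apps)
        .map(x => ({ kind: 'menu-item', target: x })),
    },
  };
}
```

With a TV application in the main region the list part carries channel contents; with any other application it carries runnable applications.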
Meanwhile, in the present specification, a menu, a recent part, a list part, a menu item or the like is illustrated and described in quadrangular or trapezoidal shape, by which the present invention is non-limited.
If an application run request signal is received, a digital device 1000 provides a running screen of the run-requested application through a full screen or a main region of a screen.
Here, the application means to include every application of a webOS loaded digital device, and the webOS loaded digital device may call it a web application. The application may include various applications such as a TV (broadcast) application for a broadcast service (e.g., a broadcast program, etc.), an application for an external input, an application for image data and the like. For clarity,
Meanwhile, the application may be launched through data already stored in a storage medium such as a memory of the digital device 1000 or application data downloaded or streamed from various external servers such as a broadcasting station and the like. In the present specification, ‘application’ and ‘application data’ may be interchangeably usable in some cases. In this case, the corresponding meaning may be determined through a context in a corresponding sentence or paragraph. Besides, in case of a preferred application or a frequently used application, a digital device receives application data using a preload function manually or automatically, thereby enabling a fast access or switching of the corresponding application.
The application running screen shown in
Meanwhile, if an application includes a TV application, it can be processed through the component(s) of the digital device shown in
If a signal for requesting a menu is received through an input means in the course of using an application like
Regarding a digital device of the related art, since a screen is entirely switched to output a menu, or a considerable or main part of a currently used application running screen is blocked or overlapped by the menu screen, it is inconvenient for a user to use a content.
A webOS platform loaded digital device according to the present invention provides the requested menu, as shown in
Therefore, if a menu is configured and provided, as shown in
Besides, the menu can change its size based on a position or movement of a pointer. For example, if the pointer is located on the menu, the menu is provided in the original menu launcher size, the user's intention being determined as a menu control or access. Here, the size of the menu launcher may or may not be a maximum size. On the other hand, if the pointer is located not on the menu but on a different region (e.g., a currently used application output region), convenience in using the application can be enhanced by minimizing the size of the currently used menu launcher or temporarily hiding it, the user's intention being determined as an application control or access. If the menu launcher for providing the menu is temporarily hidden, the user can be notified that the menu launcher is currently outputted in response to a request. This minimizes the user's inconvenience in calling the menu again or using the menu. In brief, a menu launcher including a menu can be provided in various versions according to a user's request, an attribute of the currently used application, a position/movement of the pointer and the like, and can be controlled to change its size and the like. Moreover, as described above, the menu launcher can be controlled automatically or manually after having been outputted.
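The pointer-position behavior described above can be sketched as a small decision function. This is a hypothetical illustration only; the function name, the rectangle fields and the state strings are assumptions for clarity, not part of any actual webOS API.

```javascript
// Hypothetical sketch: deciding the menu launcher state from the pointer
// position. Pointer on the launcher => menu control intended, show at the
// original size; pointer elsewhere => application control intended, minimize.
function launcherStateForPointer(pointer, launcherRect) {
  const inside =
    pointer.x >= launcherRect.x &&
    pointer.x < launcherRect.x + launcherRect.width &&
    pointer.y >= launcherRect.y &&
    pointer.y < launcherRect.y + launcherRect.height;
  return inside ? 'original' : 'minimized';
}
```

A device could also map `'minimized'` to a temporarily hidden launcher with a small on-screen notifier, as the paragraph above describes.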
As described above, a menu provided through a menu launcher can be mainly configured with two parts. A first part 1120 outputs recent data, and a second part 1130 includes or outputs an application list or a content list.
To one side of the first part 1120, an interface 1140 for accessing history data related to the first part 1120 or additional history data is provided. The history data means data for an application previously provided to a screen before using an application currently provided through the screen. Hence, if a user selects the interface 1140, as shown in
Meanwhile, to one side of the second part 1130, an interface 1150 for accessing an application list related to the second part 1130 or an additional application list is provided. If a user selects the interface 1150, the digital device provides a menu launcher shown in
In
In the present specification, ‘access’ or ‘selection’ can be achieved in a manner that a pointer provided to a screen or the like is located in a prescribed region, part or the like during a prescribed time or by at least one of hovering, click, drag, drop and the like. And, the ‘selection’ may be made through separate key button(s) provided to a remote controller or a front panel of a digital device, performed through a voice, gesture or the like, or achieved by the combination thereof. Meanwhile, in the present specification, a remote controller is described by taking a motion remote controller as an example, by which the present invention is non-limited. For example, the remote controller may include another digital device such as a smartphone or the like.
Although
As described above, the second part 1130 of the menu launcher provides an application list. Yet, as shown in
Meanwhile, if a menu item corresponding to a prescribed one of the menu items corresponding to the applications listed on the second part 1130 of
A menu item selected from each part can be differentiated from an unselected menu item by differing in color, by being enlarged to a size larger than that of the unselected menu item, and the like. Besides, as shown in the drawing, a selected menu item is slightly lifted up, an outline of the selected menu item is highlighted, or separate title data of an application matching the corresponding menu item is outputted under the corresponding menu item. Moreover, at least two of these differentiating schemes can be combined into various configurations to differentiate the selected menu item from unselected menu items.
Like the above description and
In response to a user's menu request, a digital device can provide a content/application based menu configured like
Referring to
Unlike
A past mode relates to a menu configuration for previously used history data, and a future mode may include a menu configuration for a recommended content/application or the like to be used in the future. Meanwhile, as shown in
In the present specification, for clarity of the description, in response to a user's menu request, a digital device may be set to provide a menu of a present mode as an initial menu through a menu launcher. In this case, in the digital device, a past mode and a future mode may be entered from the initially provided present mode, by which the present invention is non-limited. For example, if a user fails to request a menu of a specific mode, a digital device provides at least one of a past mode and a future mode through a menu launcher according to user's settings or various associated factors. Yet, for clarity of the following description, an example is taken in which a menu launcher of a present mode is provided as a basic or default menu and a past or future mode is accessed from the present mode. Yet, regarding the present invention, a scenario for a mode switching, a mode entry or the like can be configured in various ways without being limited by the substances illustrated and described in the present specification.
First of all,
Referring to
A menu in a present mode is provided in a manner that the recent part 1220 and the list part 1230 are discriminated from each other with reference to a reference indicator 1240.
The recent part 1220 corresponds to the first part shown in
If receiving a selection signal of the recent part 1220, the digital device plays an application included in the recent part 1220, i.e., a previously run application, in a manner of switching it to the application currently provided through the main region. In doing so, the play may resume from the previously stopped point or start from the beginning again. Meanwhile, if the switched application is a real-time application (e.g., a real-time broadcast program), a user is guided to select between the previous play stop timing and a point to be played in data at the current time, and the corresponding selection may be followed. Or, although not shown, as described above, if the recent part 1220 is selected or a pointer 1260 is located within the recent part 1220, the digital device may provide a play icon (a reproducing icon) for a play of a corresponding application or detailed information on a corresponding content. Or, although not shown, if the recent part 1220 or the like is selected, as described above, a corresponding application can be run within the corresponding recent part 1220 without affecting the application currently run in the main region, instead of being switched and played. In this case, by providing a play bar for the playback, the digital device may further provide data such as a current play position, a play end time, a time left to the play end and the like.
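The resume decision described above can be sketched as follows. This is a hedged illustration: the function name, the `isRealTime`/`stoppedAt` fields and the returned action strings are assumptions introduced for this sketch, not actual device APIs.

```javascript
// Hypothetical sketch: choosing a playback action when a recent-part
// application is switched back to the main region. Stored content resumes
// from the saved position; live content follows the user's choice between
// the previous stop timing and the current broadcast time.
function resumeAction(app, userChoice) {
  if (app.isRealTime) {
    return userChoice === 'previous' ? 'play-from-stop-point' : 'play-live';
  }
  return app.stoppedAt > 0 ? 'resume-at-' + app.stoppedAt : 'play-from-start';
}
```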
Meanwhile, if receiving a signal of selecting title data, server data or the like within the recent part 1220 through the pointer 1260, the digital device may provide an application related to the title or server data, or guide data for a server access related to the application, through a screen or browser. Here, the guide data or the browser is overlaid and can be provided at the uppermost level on the screen.
The digital device may provide additional or related data in the form of a menu icon in the process of selecting or accessing the recent part 1220 described above. Yet, the present invention is non-limited by the icon-form provision scheme only.
Besides, in the present specification, the data for one application is provided by the recent part 1220 for clarity, which is just one example only and by which the present invention is non-limited.
Meanwhile, the aforementioned recents part is related to a past mode that will be described later. Yet, related details are omitted here but shall be described later.
In the above description, the reference indicator 1240 plays a role as a reference for discriminating the recent part 1220 and the list part 1230, i.e., a boundary role. For example, as described above, since the recent part 1220 outputs one recent data only, it is not scrolled to the left/right or the like. Yet, list part data can be scrolled to the left/right according to the number of the list part data. The list part data is configured with reference to the reference indicator 1240, and the recent part 1220 can continue to be provided at the corresponding location without being affected by the scroll. Moreover, if the menu launcher has a multi-step configuration instead of a single-step configuration with reference to a top-to-bottom reference, as shown in the drawing, the reference indicator 1240 may perform a scroll function.
As described, the list part 1230 provides one or more menu items for an application. In doing so, the menu items are arranged for the list part of the present mode according to various references for example. The various references may mean at least one of settings of a user or digital device, a recently launched application order, a running time, a running count, a running frequency, an attribute and the like for example.
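The arrangement references above can be sketched as a comparator selection. The item fields (`lastLaunched`, `runCount`, `totalRunTime`) and the reference keys are illustrative assumptions for this sketch, not part of any actual platform API.

```javascript
// Hypothetical sketch: arranging list-part menu items by one of the
// references named above (recently launched order, running count, running
// time). Unknown references fall back to the incoming (settings-given) order.
function arrangeListPart(items, reference) {
  const arranged = items.slice(); // do not mutate the caller's list
  switch (reference) {
    case 'recent':
      arranged.sort((a, b) => b.lastLaunched - a.lastLaunched);
      break;
    case 'count':
      arranged.sort((a, b) => b.runCount - a.runCount);
      break;
    case 'runningTime':
      arranged.sort((a, b) => b.totalRunTime - a.totalRunTime);
      break;
    default:
      break; // user/device settings, attribute, or other references
  }
  return arranged;
}
```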
Each of the menu items, as shown in
As described above, although not clearly shown, the list part provides a scroll bar. Hence, additional application menu items can be accessed with reference to the reference indicator 1240 without a change of screen configuration.
One embodiment of controlling a list part of a menu launcher is described with reference to
If receiving a signal of selecting a prescribed menu item belonging to a list part, as shown in
With reference to
In
When a digital device provides a menu launcher, it is difficult for a user to directly identify whether the second part is provided in the first or second mode. Hence, after the menu has been provided, it may be required to switch a mode of the second part easily and conveniently. According to
Here, if receiving a selection signal of the mode switching interface 1420, the digital device provides the second part in a manner of switching a mode from the second mode in
Referring to
Referring to
Referring to
The scroll in
Meanwhile, if a pointer is located at a menu item, the digital device may perform an operation for a control of the corresponding menu item [not shown]. For example, if a pointer selects a menu item and is located on an edge of the selected menu item, the digital device can change a size of the corresponding menu item vertically or horizontally. For example, if the pointer is located on a right edge of the selected menu item, as shown in
Referring to
Moreover, as described above, although
If a menu item is selected and the selected menu item is then moved in a top or bottom direction [
Or, while an application (e.g., MP3 application) of a menu item is currently played in
As shown in
If a menu item of a second part is moved in a screen bottom edge direction [
Meanwhile, in
In the following, as described above, a menu launcher configuration of a past mode menu, a control of the past mode menu and the like are described in detail with reference to
With respect to a past mode, a corresponding entry can be achieved in various ways. For example, the past mode entry can be performed in a manner of accessing a recent part or an interface or part (e.g., the first part 1120 or the third part 1140 in
A scenario after a past mode entry is described in detail as follows.
Referring to
Meanwhile, in
Referring to
In present, past and future modes, various functions such as EPG/ESG data calling, instant/reserved recording, reserved viewing and the like may be used directly through menu items [not shown].
Besides, while a past mode is used, if there is an update (e.g., modification, alteration, etc.) of prescribed data for menu items of previously provided history data, a digital device can identifiably provide it to a corresponding menu item or a prescribed region of a screen.
Referring to
In doing so, a position of a menu item 2310 for a previously viewed KBS2 professional baseball broadcast service application among the arranged menu items can be shifted within a past mode. For example, the menu item 2310 can be located between Halcyon Days application menu item and Rain on Me application menu item, which are arranged as viewed ahead of the menu item 2310. If a position of the prescribed menu item 2310 is shifted, as shown in
Referring to
If a selection signal of a menu item 2420 is received through a pointer 2410, a digital device slightly lifts up the corresponding menu item 2420 so as to indicate that the item 2420 is selected and then outputs at least one of a play icon 2430 and a play bar 2440 on the menu item. In doing so, if a selection signal of the play icon 2430 is received, the digital device may play a corresponding application in continuation with a previously played part on a main region or a corresponding menu item window.
Although
Unlike
Unlike
Referring to
Yet, in configuring a menu of a past mode, referring to
In this case, if a user requests past mode data for other applications such as game, web browser, HDMI and the like as well as a TV service application, such applications can be provided as shown in
If a pointer is located on a prescribed bottom region of the currently provided menu launcher [
If the menu item switching interface 2630 of
Although
Meanwhile, in configuring a menu of a past mode, if there are many history data of the past mode, a group icon may be provided only, unlike the former description [not shown in the drawing]. Such a group icon may include 'a week ago', 'yesterday' or the like in
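The grouping described above can be sketched as a labeling function over the last-viewed timestamps. The function name, the thresholds and the intermediate 'this week' label are assumptions introduced for this sketch; the specification only names 'a week ago' and 'yesterday' as example group icons.

```javascript
// Hypothetical sketch: when past-mode history grows large, collapse history
// items behind group icons by age (e.g., 'yesterday', 'a week ago').
// Timestamps are in milliseconds; thresholds are illustrative assumptions.
function historyGroupLabel(viewedAt, now) {
  const DAY = 24 * 60 * 60 * 1000;
  const age = now - viewedAt;
  if (age < DAY) return 'today';
  if (age < 2 * DAY) return 'yesterday';
  if (age < 7 * DAY) return 'this week';
  return 'a week ago';
}
```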
A menu configuration scheme described in the present specification may follow user's settings or be determined by a digital device for example. In the latter case, the determination can be made using various references such as user data, use pattern, time, date, day of week, weather and the like or by combining the references appropriately.
As described above, even in past mode, a menu item can perform a prescribed function by being moved in top, bottom, right and left directions.
If a prescribed menu item 2710 is selected through a pointer and the selected menu item is dragged to move in a bottom edge direction of a screen [
The related data 2722, 2724 and 2726 may include a function icon for performing a prescribed function for an application of the menu item, an application related to the above application, a preview image (previously viewed part included) for the application, and the like. For example, if the selected menu item relates to a baseball broadcast content, the related data 2722, 2724 and 2726 may include a baseball broadcast content of another channel.
Finally, a menu configuration and control of a future mode is described in detail with reference to
As described above, a future mode menu may be entered by accessing an interface 2820 for a future mode entry provided separately from a menu launcher through a pointer 2810 [
The future mode entry interface 2820 may be provided without a present mode menu in case of a menu calling like
If a selection signal of the future mode entry interface 2820/2830 is received [
A future mode means a menu configuration for recommendation and the like for an application/content not used by a user yet or an application/content having high availability.
In future mode, recommended applications/contents can be provided by being sorted by categories. Such categories may include a basic category, a live category, a TV shows category, a movies category, a music category, a recommended applications (Apps) category, a my contents category and the like. Embodiments of menu configurations for the respective categories are shown in
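The category sorting above can be sketched as a simple bucketing step. The category keys mirror the list in the paragraph, while the function name and the `category` field on recommendation items are assumptions for this sketch.

```javascript
// Hypothetical sketch: sorting future-mode recommendations into the
// categories listed above. Items with an unknown category are dropped here;
// a real device might route them to a default category instead.
const FUTURE_MODE_CATEGORIES = [
  'basic', 'live', 'tv-shows', 'movies', 'music', 'apps', 'my-contents',
];

function buildFutureModeMenu(recommendations) {
  const menu = {};
  FUTURE_MODE_CATEGORIES.forEach((c) => { menu[c] = []; });
  recommendations.forEach((item) => {
    if (menu[item.category]) menu[item.category].push(item);
  });
  return menu;
}
```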
Referring to
The basic category of
Meanwhile, one or more applications/contents selected from the recommended applications/contents listed in
If a menu item is selected from the provided guide shown in
As described above,
Referring to
Meanwhile, at least one of the mobile device 3620, the relay 3630 and the second digital device 3640 may receive and output menu data provided by the digital device 3610. For example, if the digital device is outputting a present mode menu, at least one of the mobile device 3620 and the second digital device 3640 receives and outputs the present mode menu and may also receive and output a menu of a past or future mode as well as the present mode. Moreover, each of the mobile device 3620 and the second digital device 3640 may receive and output menu data of different modes from the digital device 3610, respectively.
Meanwhile,
A digital device receives a broadcast signal containing a content and signaling data [S3902], decodes the received content and signaling data, and then outputs them to a screen [S3904].
If receiving a first user input signal for a menu calling [S3906], the digital device configures a menu screen in a prescribed region of the content outputted screen and then outputs it in overlay form [S3908].
Thus, after the menu has been outputted together with the content, if a second user input signal is received [S3910], the digital device outputs a GUI for the outputted menu screen configuration switching according to the user input signal [S3912].
If receiving a third user input signal [S3914], the digital device switches and outputs a configuration of the menu screen according to the third user input signal [S3916].
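The flow S3902 through S3916 above can be sketched as a small event-driven sequence. The function name and event strings are assumptions for illustration; the step names mirror the reference numerals in the flowchart description.

```javascript
// Hypothetical sketch of the described flow: the device first receives and
// decodes/outputs the content (S3902, S3904), then reacts to the three user
// inputs: menu call (S3906 -> overlay menu S3908), second input
// (S3910 -> GUI output S3912), third input (S3914 -> config switch S3916).
function menuFlow(events) {
  const steps = ['S3902', 'S3904'];
  events.forEach((e) => {
    if (e === 'menu-call') steps.push('S3906', 'S3908');
    else if (e === 'second-input') steps.push('S3910', 'S3912');
    else if (e === 'third-input') steps.push('S3914', 'S3916');
  });
  return steps;
}
```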
Meanwhile, a method of providing a menu screen in a digital device according to one embodiment of the present invention may include receiving a content and a signaling data for the content, decoding the content and the signaling data, outputting the decoded content to a screen, receiving a first user input for a menu calling, outputting a menu screen to overlay a prescribed region of the content outputted screen in response to the first user input, the menu screen including an application list or a content list including at least one content related to the content currently outputted to the screen, detecting whether a pointer within the menu screen is located at or hovering on a prescribed region, outputting a GUI for a menu screen configuration switching of the outputted menu screen, and switching to output the menu screen configuration in response to a user's selection from the outputted GUI.
A method of providing a menu screen in a digital device according to another embodiment of the present invention may include receiving a content and a signaling data for the content, decoding the content and the signaling data, outputting the decoded content to a screen, collecting a history data for one or more contents used for the device, receiving a first user input for a menu calling, outputting a menu screen to overlay a prescribed region of the content outputted screen in response to the first user input, the menu screen including an application list or a content list including at least one content related to the content currently outputted to the screen, receiving a second user input for a menu selection, and outputting a timeline based content list by referring to the collected history data for the one or more contents in response to the second user input.
A method of providing a menu screen in a digital device according to further embodiment of the present invention may include receiving a content and a signaling data for the content, decoding the content and the signaling data, outputting the decoded content to a screen, receiving a first user input for a menu calling, outputting a menu screen to overlay a prescribed region of the content outputted screen in response to the first user input, the menu screen including an application list or a content list including at least one content related to the content currently outputted to the screen and an icon for entering a submenu screen, detecting whether a pointer within the menu screen is located at or hovering on the icon, and outputting the submenu screen, wherein the submenu screen includes one or more categories, wherein each of the categories includes a content list including one or more contents, and wherein the contents belonging to the content list is arranged based on at least one of time data including a season, weather data, emotional data associated with at least one of the time data and the weather data, retrieval ranking, and user's content use pattern data.
According to various embodiments of the present invention, a digital device enables a user to access a desired function, data (e.g., content, etc.) more easily and quickly than the related art. And, the digital device enables desired data to be accessed and used more easily and quickly through minimum depth or screen change on a paged menu while minimizing disturbance in watching a content currently outputted to a main screen, i.e., a currently watched content. Moreover, the digital device configures and provides a menu screen that is more intuitive and more convenient to use than the related art, so as to enable everyone to use the digital device easily and conveniently.
A digital device and controlling method thereof according to the present invention can be achieved by combination of structural elements and features of the present invention. Each of the structural elements or features should be considered selectively unless specified separately. Also, some structural elements and/or features may be combined with one another to enable various modifications of the embodiments of the present invention. The description with reference to each drawing in the present specification may be limited to the description of the corresponding drawing, by which the technical idea of the present invention is non-limited. Hence, contents failing to conflict with each other among the contents shown in the corresponding drawing or mentioned in the description part of the corresponding drawing are applicable to a related drawing or a description part of the related drawing intactly or by being appropriately combined together, which is included in the technical idea of the present invention as well.
Meanwhile, a digital device operating method of the present invention can be implemented in a program recorded medium, which can be read by a processor provided to the digital device, as processor-readable codes. The processor-readable media may include all kinds of recording devices in which data readable by a processor are stored. The processor-readable media may include ROM, RAM, CD-ROM, magnetic tapes, floppy discs, optical data storage devices, and the like for example and also include carrier-wave type implementations (e.g., transmission via Internet). Further, the recording medium readable by a processor is distributed to a computer system connected to a network, whereby codes readable by the processor by distribution can be saved and executed.
It will be appreciated by those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the inventions. Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents. And, such modifications and variations should not be individually understood from the technical idea or prospect of the present invention.
The present invention relates to a digital device and is applicable to digital devices of various types. Therefore, the present invention has industrial applicability.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/KR2015/004376 | 4/30/2015 | WO | 00