The disclosed embodiments generally relate to a system and a method for presenting video and additional information, and more specifically to a user interface (UI) with gesture controls and user controls for assembling and presenting the video and information. Further, aspects of the disclosure are also directed to viewing video and related information of events using the UI.
In various team sports, officials of a game and other people collect statistics such as shots, passes, and points of the game. These game statistics can then be associated with a time line and recorded videos of the game. The statistics and recorded videos are viewed by spectators, coaches, and players during and after the game for various purposes, such as game analysis or entertainment. Examples of such recordings include a video of a basketball game used for broadcasting purposes or by coaches to evaluate how well a player works within the team context, for strength and weakness analysis, for reviewing shots, assists, rebounds, steals, and turnovers, or for identifying game improvement leads.
There are known methods for collecting statistics and associated video data; one such system has been described by the applicant in U.S. patent application Ser. No. 13/471,404, filed on 14 May 2012. Typically, users view game videos on a computer, laptop, or television, either by viewing complete videos or by selecting events and points in the game to view a specific video section.
The problems associated with prior systems and methods include the inability of users, especially coaches, to track game progress or study videos and statistics in real time, and the inability to use a portable computing and communication device such as a tablet, mobile phone, or iPad®. Additionally, available systems are inefficient at displaying and selecting the statistics and related videos for a team or player.
Hence, there exists a need for a system, method, and user interface that solve the problems of viewing an event video, associating statistics with appropriate portions of the event, and controlling video playback, and that reduce the time spent identifying relevant sections of the event, selecting relevant videos for observers, and controlling the video and statistics for the observers. It would be advantageous to implement the system, method, and user interface on a portable computing and communication device, such as a tablet, mobile phone, or iPad®, in a manner that overcomes the above-mentioned limitations of existing methods of presenting information.
The disclosed embodiments provide a system and method for presenting video and additional information on various devices including mobile devices.
In one aspect, embodiments of the present disclosure provide a system for presenting video and information on the various devices. Optionally, the system includes a server system that collects and receives information from one or more venues over a communication network such as the Internet, a Local Area Network (LAN), or a Wide Area Network (WAN). The information from the venues may include video and/or audio streams from one or more cameras recording an event, statistics collected from the event, and location data of objects associated with the venue or the event, for example, participants or equipment.
While the disclosed embodiments are described in the context of a sporting event, it should be understood that the disclosed embodiments may be applicable to any suitable venue, for example, a concert hall, banquet hall, theater, sports arena, or any applicable location, and may be applicable to any suitable event, for example, a musical or theatrical presentation, an awards ceremony or other celebration, or even a simple gathering.
In the context of a sporting event, location data may be collected for objects such as players, referees, and the sporting equipment and projectiles used in the sport. Statistics may also include actions such as shots, goals, points, penalties, special events, game breaks, special moves, game events, etc. Optionally, location data can be collected, for example, using indoor and/or outdoor positioning means.
In accordance with at least one embodiment of the present disclosure, the server combines information sources into a common time line and may provide an analysis of the event. Said combining may enable interested parties, such as subsequent viewers, participants, sports spectators, coaches, or broadcasters, to easily select and access event-related information. In some embodiments, the information may be selectable and accessible over one or more communication networks such as the Internet, a LAN, or a WAN.
In accordance with another embodiment of the present disclosure, information from the server system can be accessed with a user terminal. The user terminal can be any suitable device, fixed, portable, or mobile, and may include one or more mobile phones, portable computing devices, or wireless-enabled devices including, but not limited to, smart phones, Mobile Internet Devices (MID), wireless-enabled tablet computers, Ultra-Mobile Personal Computers (UMPC), phablets, tablet computers, Personal Digital Assistants (PDA), web pads, cellular phones, iPhones®, laptops, or desktop computers. The user terminal may have a touch screen to enable easy usage of the statistics services. The user terminal can also share all or some of the data with an external display device, such as a large-screen television, projection, or other display used by other persons (for example, another person's portable computing device). Video, as well as statistics and coordination footage, can be presented in many ways. The server system can be configured to allow different levels of access for different users.
In another aspect of the disclosed embodiments, the system includes a user interface (UI) that is used to access information on the user terminals.
Optionally, the user interface has a configuration that may change depending on different orientations of the display, and may display various items including a time line, statistics, videos, and may include a gesture control for controlling various aspects of the user interface, for example, scrolling speed or the direction and speed of the video.
Optionally, the user interface configuration change may be based on an orientation of the display.
Optionally, the user interface can be used for displaying rich media content, and gestures may be used to control the system user interface (UI). The gestures may include timeline scroll and resize gestures or playback control gestures.
In another aspect, an embodiment of the present disclosure provides a method for implementing gesture control and for preventing wrong recognition of gestures. The gesture control facilitated by the method provides enhanced, user-friendly video playback control. The gesture control may simulate reel scrolling, further enhancing the user experience.
Further, in accordance with an alternative or additional embodiment of the present disclosure, information from the server system can be accessed with a user terminal such as a mobile device with a touch screen. The touch screen can be configured to detect multi-touch gestures. The user gesture may include the user placing two fingers on the touch screen, whereupon a middle point between the two fingers is determined. The two fingers can further define a rectangular area on the screen, i.e., an area for a gesture detecting operation. The gesture detecting operation can have a reference point, such as a corner of the area. The gesture can further include detecting the changes in angle between the middle point and this reference point of the gesture detecting operation. The changes in the angle can define a velocity, namely a rotation velocity of a reel. The reel rotation velocity can correlate with playback parameters.
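The geometry of this gesture can be sketched as follows. This is an illustrative, platform-agnostic sketch of the described angle computation; the function names and coordinate conventions are our own and are not part of the disclosure:

```python
import math

def gesture_angle(finger1, finger2, corner=(0.0, 0.0)):
    """Angle from a reference point of the gesture-detecting area
    (e.g. a corner of the rectangle the two fingers define) to the
    middle point between the two fingers."""
    mid_x = (finger1[0] + finger2[0]) / 2.0
    mid_y = (finger1[1] + finger2[1]) / 2.0
    return math.atan2(mid_y - corner[1], mid_x - corner[0])

def reel_velocity(prev_angle, new_angle, dt):
    """The change in angle over a constant sampling period defines the
    reel's rotation velocity, which can be mapped to playback rate."""
    return (new_angle - prev_angle) / dt
```

A controller sampling finger positions at a fixed period could feed consecutive `gesture_angle` values into `reel_velocity` and scale the result into a seek speed.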
In accordance with yet another embodiment of the present disclosure, a system and method including the user interface (UI) can be used for various on-field team sports such as hockey, football, American football, association football, basketball, volleyball, tennis, water polo, rugby, lacrosse, cricket, handball, ice hockey, and many others.
Additional aspects, advantages, features, and objects of the present disclosure will be made apparent from the drawings and the detailed description of the illustrative embodiments.
It will be appreciated that features of the disclosed embodiments are susceptible to being combined in various combinations or further improvements without departing from the scope of the embodiments as disclosed herein.
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the disclosed embodiments are not limited to specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
Referring now to the drawings, particularly by their reference numbers,
In at least one of the disclosed embodiments, server 102 may include computer readable program code stored on at least one computer readable medium 112 for carrying out and executing the processes and methods described herein. The computer readable medium 112 may also be referred to as a memory of server 102. In alternate aspects, the computer readable program code may be stored in a memory external to, or remote from, server 102. The memory may include magnetic media, semiconductor media, optical media, or any media readable and executable by a computer. Server 102 may also include a processor 114 for executing the computer readable program code stored in the memory 112. In at least one aspect, server 102 may include one or more input or output devices, generally referred to as, for example, the user interface (UI) described herein, which may operate to allow input to the server 102 or to provide output from the server 102, respectively.
The processor 114, with the computer readable program code, is configured to receive one or more audio/video streams 104 from an event, collect statistics 106 related to the event, collect location data 108 of objects related to the event, and store the audio visual streams, collected statistics, and collected location data as data elements in files in memory 112. The processor, with the computer readable program code, is configured to combine the data elements into a common time line for analysis of the event and to provide an improved user interface for presenting one or more synchronized views of the audio/video streams 104, collected statistics 106, and collected location data 108 on the common time line. The combined audio visual streams, collected statistics, and collected location data may be referred to as the combined information.
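The combining of the collected data elements into a common time line can be sketched as a merge of time-stamped items from each source. This is a minimal illustration of the idea only; the class and function names are our own, not the disclosed implementation:

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class TimelineItem:
    t: float                              # seconds on the common time line
    kind: str = field(compare=False)      # "video", "stat", or "location"
    payload: object = field(compare=False, default=None)

def combine(*sources):
    """Merge per-source item lists (each already time-ordered) into a
    single common time line, ordered by timestamp."""
    return list(heapq.merge(*sources))
```

With such a structure, a synchronized view need only walk the merged list to present video segments, statistics, and location samples at matching moments.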
The server 102 combines information sources into a common time line and provides for analysis of the event. Said combining enables interested parties, such as current or subsequent viewers, sports spectators, coaches, or broadcasters, to easily select and access information over a communication network 110 such as the Internet.
In a non-limiting example, videos related to a certain statistic/event, such as a goal and a time of the goal, are identified from the statistics data 106; the server system 102 selects the video data 104 at the identified time. The video stream can be selected such that it extends a certain time before and after the event, for example from 20 sec before until 10 sec after the goal, to make a 30 sec video clip. The duration can also depend on users' preferences. Additionally, where the positions and/or directions of the cameras are known, the server 102 can further combine location data 108 of the players to enable following certain players during the making of the goal.
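The clip-window selection described in this example can be sketched as follows; this is an illustrative helper under assumed names, not the disclosed code:

```python
def clip_window(event_time, before=20.0, after=10.0,
                stream_start=0.0, stream_end=None):
    """Return (start, end) of a clip extending a configurable time
    before and after a detected event, e.g. 20 s before to 10 s after
    a goal yields a 30 s clip, clamped to the recorded stream."""
    start = max(stream_start, event_time - before)
    end = event_time + after
    if stream_end is not None:
        end = min(end, stream_end)
    return start, end
```

The `before` and `after` parameters correspond to the user-preference-dependent durations mentioned above.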
Information from the server system 102 can be accessed with user terminal 120. The user terminal 120 can be one or more fixed, portable, or mobile computing devices or wireless-enabled devices including, but not limited to, smart phones, Mobile Internet Devices (MID), wireless-enabled tablet computers, Ultra-Mobile Personal Computers (UMPC), phablets, tablet computers, Personal Digital Assistants (PDA), web pads, cellular phones, iPhones®, laptops, or desktop computers. In some embodiments, the device may have a touch screen to enable easy usage of the statistics services. The user terminal 120 can also share all or some of the data with an external display device 122, such as a large-screen television, projection, or other display used by other persons (for example, another person's portable computing device). Video, as well as statistics and coordination footage, can be presented in many ways. The server system 102 can be configured to allow different levels of access for different users.
In
According to the disclosed embodiments, the system may include a user interface (UI) that is used to access information using the user terminal 120. In some exemplary embodiments, the user terminal 120 may be a web tablet with a touch screen with multi-touch support and may be used in a basketball game. The multi-touch support refers to a device display's ability to recognize two or more pointing devices or fingers touching the screen at the same time. The computing device may have a communication interface to access information from server system 102 through, for example, a wireless local area network, a cellular channel, or wired means. In an optional embodiment, the user terminal 120 is capable of further relaying and/or communicating some or all of the information to an external device or display. In an optional embodiment, the user terminal 120 has accelerometers to detect the orientation of the user terminal 120, i.e., to determine which side of the user terminal 120 is up and which is down.
In prior art solutions, if a user terminal 120 is turned around or rotated, the UI typically turns or rotates so that the image is turned around, enabling the same view independently of the orientation of the user terminal 120. Based on the disclosed embodiments, the user interface may instead be modified based on the orientation of the display. In at least one embodiment, the user interface orientation and change in orientation may be sensed by an accelerometer or other sensor in the user terminal 120. In the example shown in
In a further example, in the initial position, the portion of the user interface nearest the upper side (Side A) of the user terminal (above the mirror axis element) may include information contents related to certain statistics, such as two point shots and three point shots, related to the home team, and the portion of the user interface nearest the lower side (Side B) of the user terminal (below the mirror axis element) may have information content, such as play time of the players, related to the visitor team. When the tablet is rotated to the second position, the portion of the user interface nearest the upper side may show the information content related to the visitor team, and the portion of the user interface nearest the lower side may show the information content related to the home team. In this example, the information content class may stay the same in the upper portion (such as two and three point shots) and lower portion (such as play time of the players) before and after rotation, but the team represented (home team, visitor team) in the upper and lower portions may change with the rotation. In other words, when the user terminal is rotated, the information content class for the visitor team would be the same as it was for the home team before turning, i.e., statistics related to two point shots and three point shots, and the information content class for the home team would be the same as it was for the visitor team before turning, i.e., statistics related to play time.
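The swap described in this example — content class fixed to screen position, team swapped on a 180° rotation — can be sketched as follows. The names and two-state orientation model are illustrative assumptions:

```python
def layout_for_orientation(upright, content_classes=("shots", "play_time")):
    """Return ((team, class) for the upper panel, (team, class) for the
    lower panel).  The content class stays tied to the screen position;
    the team shown there swaps when the terminal is rotated 180 degrees."""
    upper_team, lower_team = ("home", "visitor") if upright else ("visitor", "home")
    return ((upper_team, content_classes[0]),
            (lower_team, content_classes[1]))
```

An accelerometer callback could simply re-render the panels with the tuple returned for the new orientation.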
The player section may display all the players participating in the game. Along the timeline, the time spent on the court is shown for each player by using player lanes. Icons of events associated with a certain player may be placed over the time-on-court indication. These icons can represent different activities, for example, field goals (distinguished by number of points, obtained or not, or whether the shot was contested by another player), free throws, rebounds, turnovers, etc. On the right-hand panel, player-specific statistics are presented.
The offensive plays section shows the detected type of offensive plays used in the game. These offensive plays can be defined by the coach and detected from the coordinate footage automatically. A coach may also be able to define the name of the offensive play. On the right hand panel, with logic similar to the player section, the offensive play specific statistics are calculated.
In at least one embodiment, the processor, with the computer readable program code, is further configured to provide a gesture control mechanism in at least one of the statistics view and video view. Furthermore, the processor, with the computer readable program code, may also be configured to provide a gesture control mechanism for synchronized playback of the statistics view and video view. Thus, scrolling through one view may cause a synchronized playback of the other view. While the statistics view 608 and video view 610 are shown on different devices, it should be understood that both views may also be shown on the same device as will be explained further below. In other embodiments, the statistics view and video view may be displayed and controlled individually.
The gesture controlled device 600 may be operable for controlling video playback or statistics playback and may include a gesture based control on a touch screen for watching videos related to the time line. In at least one embodiment, the processor, with the computer readable program code, as described above, is configured to control a display of the data elements or combined information using gestures.
In prior art video playback solutions, a user points with a finger or other pointer to a place on the time line where the user wants to show the video. This is not user friendly, particularly in the case of showing fast movements in sports. For example, using a finger or pointer, one may not be able to control the speed of the video precisely and may not be able to go, for example, frame by frame back and/or forth.
According to the disclosed embodiments, the user interface (UI) includes a gesture control for controlling playback. In
The overlay wheel may be displayed at the top of the screen after a certain time out following two fingers being positioned on the touch screen. In at least one embodiment, to avoid an unwanted display of the wheel 604, a time out of approximately 400 msec can be implemented to overcome any accidental interpretation of other gestures made with two or more fingers; the wheel will appear after said time out. Thus, the at least one gesture is recognized after remaining stable for a predetermined period of time. In at least one embodiment, the gesture so recognized is substantially rotational.
The playhead 606 can be dragged along the statistics view and can be set to a position representing a specific moment in the match. Wheel 604 may not be presented when the playhead 606 is moving. Similar to the embodiment described above, the wheel 604 may only be displayed when the playhead 606 has been stable for approximately 400 milliseconds. This solution provides much faster performance considering that, most of the time, the coach will use remote footage, and seeking might take some time.
While the embodiment of
In another embodiment, characteristics of the playhead 606 may change according to a play rate, i.e., a video play rate. As an example, when paused, the playhead 606 may have a solid white color and may continuously change its opacity, creating a simple pulsing effect. However, this is merely an example, and different playhead 606 states can be applied when, for example, the current player time is outside or inside the game time, etc. An ordinary person skilled in the art would appreciate that the time values, such as 400 msec, or color variations are exemplary and that other variations of the time out period or colors may be implemented, for example, for aesthetic reasons or according to user preferences. These examples should not unduly limit the scope of the application herein.
In at least one embodiment, rich content may be used in the user interface (UI), where gestures are the primary means of controlling the system user interface (UI). A gesture on top of the timeline screen in the user interface (UI) can be divided into two main categories, viz., timeline scroll and resize gestures, and playback control gestures.
Timeline scroll and resize gestures may be implemented as vertical synchronized scroll gestures. The above-mentioned views can be referred to as SZTimelineView, SZStatsView, and SZLabelsView, and may also respond to a pinch to resize the timeline and to a horizontal timeline scroll gesture. Playback control gestures may include a playhead 606 move gesture, a tap with two fingers (play/pause), two-finger fine seeking with the wheel 604, or a tap to select a timeline event.
One or more embodiments include a method to prevent wrong recognition of gestures using a delegate protocol class, hereinafter referred to as "UIGestureRecognizer", that prevents or allows simultaneous gesture recognition. For implementing the method, simultaneous recognition may be disabled for all UIGestureRecognizer classes and subclasses, except a scrolling gesture recognizer, so that the timeline can be moved simultaneously in all directions while using the wheel 604, i.e., while using a two-finger fine-seeking gesture. Although there are gestures available to seek to a desired moment in the game, none of them (tapping to select an event, or dragging the playhead 606) provides fine seeking, frame by frame or second by second, in the game. This type of seeking is advantageous when the coach or other observer wants to go through a certain moment in the game step by step. It may be implemented with the wheel 604, which may be displayed as an overlay on top of a UI showing statistics or on top of a UI showing videos. The presently disclosed embodiments suggest a solution implementing this gesture for events on the timeline and for selecting or playing back a particular segment of video. In one or more embodiments, the two-finger rotational gesture may include a fine scrubbing gesture. The system enables gesture control that is responsive to the velocity of the scrolling, i.e., when scrolling is faster, the video is forwarded faster.
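The arbitration rule described above — simultaneous recognition disabled for every recognizer pair except those involving the scrolling recognizer — can be sketched as a simple delegate-style predicate. This is an illustrative sketch, not the actual iOS delegate API:

```python
def allow_simultaneous(recognizer_a, recognizer_b, scroll_name="scroll"):
    """Delegate-style predicate: simultaneous recognition is allowed only
    for pairs involving the scrolling recognizer, so the timeline can
    still be moved in all directions while fine seeking with the wheel."""
    return scroll_name in (recognizer_a, recognizer_b)
```

A gesture framework would call such a predicate whenever two recognizers could both claim the same touches.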
A score graph section 708 may represent score changes within the game time. The graph shows the score difference, so the coach has a clear overview of which team is in a better situation at a certain game time. On the side panels, game-specific data are displayed. In order to display game data for the other team, sections of the side panels are mirrored below the score graph.
The gestures described above may be recognized by a gesture recognizer implemented by subclassing the "UIGestureRecognizer". The gesture recognizer may be a function stored as computer readable program code in memory 112, and the processor 114 of server 102, with the computer readable program code, is configured to implement the gesture recognizer. When recognizing this gesture, the existence of multiple gestures is taken into account. The problem is solved by keeping the gesture recognizer in the "Possible" state for 400 milliseconds, without going forward to the "Begin" state. During this threshold period, the initialization of any other gesture recognizer can cancel this gesture recognizer; the reason is to prevent simultaneous pinch-to-resize and fine seeking. When started, the gesture recognizer takes the middle point between the two fingers and detects the changes in angle between this middle point and the point of a gesture detecting operation, for example a "UIView" property. This angle also defines the velocity, because the gesture recognizer sampling period is constant. According to this angle, the number of frames to seek is calculated. When the gesture changes, an immediate response is provided by popping up the rotational reel at the top of the screen. The reel rotates according to the rotational angle and is visible in both modes when the gesture is recognized. This gesture provides very good control of video playback and is effective and user-friendly. It also simulates real reel scrolling, further enhancing the user experience.
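The recognizer's lifecycle described above — held in "Possible" for 400 ms, cancellable by a competing recognizer during that threshold, then converting angle changes into frames to seek — can be sketched as a small state machine. Names, the frames-per-radian scale, and the millisecond-driven API are illustrative assumptions, not the actual UIGestureRecognizer interface:

```python
class FineSeekRecognizer:
    """Sketch of the described recognizer: it stays in 'possible' for a
    400 ms threshold (so a competing gesture can cancel it) before moving
    to 'began'; thereafter each constant-period angle sample is converted
    into a number of frames to seek."""

    THRESHOLD_MS = 400

    def __init__(self, frames_per_radian=30.0):
        self.state = "possible"
        self.frames_per_radian = frames_per_radian
        self._elapsed_ms = 0

    def touch_held(self, dt_ms):
        """Called periodically while two fingers rest on the screen."""
        if self.state == "possible":
            self._elapsed_ms += dt_ms
            if self._elapsed_ms >= self.THRESHOLD_MS:
                self.state = "began"   # the wheel overlay appears here

    def cancel(self):
        # another recognizer starting during the threshold cancels this one,
        # preventing simultaneous pinch-to-resize and fine seeking
        if self.state == "possible":
            self.state = "cancelled"

    def frames_to_seek(self, delta_angle):
        # the sampling period is constant, so the angle change directly
        # encodes the rotation velocity and hence the seek amount
        return round(delta_angle * self.frames_per_radian)
```

A pinch recognizer beginning within the first 400 ms would call `cancel()`, matching the arbitration described in the text.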
Thus, the processor, with the computer readable program code, is configured to detect a gesture of the user using a mobile device with a touch screen having multi-touch detecting capabilities, to perform a gesture detecting operation calculating the middle point between two fingers and detecting the changes in an angle between the middle point and a point of the gesture detecting operation, and, based upon the change of angle, to calculate the rotation velocity of a reel, wherein the rotation velocity of the reel is adapted to define playback parameters.
This is merely an example of one of the gestures which makes video playback control on this system user interface (UI) unique. An ordinary person skilled in the art would appreciate that the time values such as 400 msec or number of fingers used to activate the gesture are merely examples. There can be other variations of said features for functional reasons or to adjust user preferences. These examples should not unduly limit the scope of the embodiments disclosed herein.
In at least one embodiment, camera position and orientation data may be collected from video camera systems used to record the event. Using one or more of the camera position, camera location, camera orientation, and collected location data of the players 904, 906, player locations can be determined in each frame of the video. This allows a user to interact with the video in real-time or when re-playing the footage.
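Determining where a tracked player appears in a frame from the camera's position and orientation plus the player's court location can be sketched with a deliberately simplified one-axis pinhole model. The names, the 60° field of view, and the 1920-pixel frame width are illustrative assumptions only:

```python
import math

def world_to_frame(player_xy, cam_xy, cam_heading,
                   fov=math.radians(60), frame_width=1920):
    """Map a tracked object's court position to a horizontal pixel
    column in a camera frame, given the camera's position and heading.
    Returns None when the object is outside the field of view."""
    dx = player_xy[0] - cam_xy[0]
    dy = player_xy[1] - cam_xy[1]
    bearing = math.atan2(dy, dx) - cam_heading   # angle off the optical axis
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    if abs(bearing) > fov / 2:
        return None
    return round(frame_width / 2 + (bearing / (fov / 2)) * (frame_width / 2))
```

Repeating this per frame, with the collected location data interpolated to the video timestamps, would let the mark/information field follow a player through the footage.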
A user may select one or more objects to be tracked (e.g., a player, referee, piece of equipment, ball, or puck) on the video by tapping on the user interface in the example view 900, 920. A mark/information field 908 showing the selected tracked object's position on the video may be shown. The mark follows the tracked object's location in the video frame as the playback of the video progresses. Additional supplemental information based on the statistics of the tracked object can be shown in the video overlay/statistics field 910.
A user can zoom in or out at the tracked object at any time. The zoomed display 920 as shown in
One of ordinary skill in the art would recognize many variations, alternatives, and modifications of embodiments herein.
One or more of the disclosed embodiments can be used for various purposes, including, though not limited to, enabling users to view game statistics and game videos, to build game strategies, to browse video catalogues, and for entertainment.
Modifications to the embodiments described in the foregoing are possible without departing from the scope of the presently disclosed embodiments as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “consisting of”, “have”, “is” used to describe and claim the disclosed embodiments are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.
Number | Date | Country
---|---|---
61823097 | May 2013 | US