The present invention relates to a network system including at least first and second communication terminals capable of communication with each other, a communication method, and a communication terminal. Particularly, the present invention relates to a network system in which first and second communication terminals reproduce the same motion picture contents, a communication method, and a communication terminal.
There is known a network system in which a plurality of communication terminals capable of connecting to the Internet exchange hand-drawing images. For example, a server/client system, a P2P (Peer to Peer) system, and the like can be cited. In such a network system, each communication terminal transmits and/or receives a hand-drawing image, text data, and the like. Each communication terminal displays a hand-drawing image and/or text on its display device based on received data.
There is also known a communication terminal that downloads contents including a motion picture from a server that stores such contents, through the Internet or the like, to reproduce the downloaded contents.
For example, Japanese Patent Laying-Open No. 2006-4190 (PTL 1) discloses a chat service system for mobile phones. According to PTL 1, the system includes a distribution server and a chat server. The distribution server causes a plurality of mobile phone terminals and a Web terminal for an operator, connected for communication on the Internet, to form a motion picture display region and a text display region on the browser display screen of each terminal, and distributes the motion picture data that is streaming-displayed at the motion picture display region. The chat server supports a chat between the mobile phone terminals and the operator Web terminal, and causes chat data constituted of text data to be displayed at the text display region. The chat server allows each operator Web terminal to establish, relative to the plurality of mobile phone terminals, a chat channel independently for each mobile phone terminal.
It is difficult for a plurality of users to transmit/receive information related to motion picture contents while looking at the motion picture contents. For example, the progressing state of the contents may differ among the communication terminals. There is a possibility that the intention of a user transmitting (entering) information cannot be conveyed effectively to a user receiving (viewing) the information. Furthermore, even if the user of the first communication terminal wishes to send comments on a first scene, there is a possibility that the relevant comments will be displayed in a second scene at the second communication terminal.
The present invention is directed to solving such problems, and an object is to provide a network system in which the intention of a user transmitting (entering) information can be conveyed effectively to a user receiving (viewing) the information, a communication method, and a communication terminal.
According to an aspect of the present invention, there is provided a network system including first and second communication terminals. The first communication terminal includes a first communication device for communicating with the second communication terminal, a first touch panel for displaying motion picture contents, and a first processor for accepting input of a hand-drawing image via the first touch panel. The first processor transmits the hand-drawing image input during display of the motion picture contents, and start information for identifying a point of time when input of the hand-drawing image at the motion picture contents is started to the second communication terminal via the first communication device. The second communication terminal includes a second touch panel for displaying the motion picture contents, a second communication device for receiving the hand-drawing image and start information from the first communication terminal, and a second processor for displaying the hand-drawing image from the point of time when input of the hand-drawing image at the motion picture contents is started, on the second touch panel, based on the start information.
Preferably, the network system further includes a contents server for distributing motion picture contents. The first processor obtains motion picture contents from the contents server according to a download instruction, and transmits motion picture information for identifying the motion picture contents obtained to the second communication terminal via the first communication device. The second processor obtains the motion picture contents from the contents server based on the motion picture information.
Preferably, the first processor transmits an instruction to eliminate the hand-drawing image to the second communication terminal via the first communication device, when the scene of the motion picture contents changes and/or when an instruction to clear the input hand-drawing image is accepted.
Preferably, the second processor calculates the time from the point of time when input is started up to the point of time when a scene in the motion picture contents changes, and determines a drawing speed of the hand-drawing image on the second touch panel based on the calculated time.
Preferably, the second processor calculates the length of a scene in the motion picture contents including the point of time when input is started, and determines the drawing speed of the hand-drawing image on the second touch panel based on the calculated length.
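The scene-length-based speed adjustment described above can be sketched as follows. This is a minimal illustration only; the function name, the per-point interval model, and the half-scene time budget are assumptions, not taken from the embodiment.

```python
# Hypothetical sketch of determining a drawing speed from a scene length.
# The half-scene budget (fraction=0.5) is an illustrative assumption.

def drawing_interval_ms(num_points: int, scene_length_ms: int,
                        fraction: float = 0.5) -> float:
    """Return the delay (ms) between drawing successive stroke points so
    that the whole stroke finishes within a fraction of the scene."""
    if num_points <= 1:
        return 0.0
    budget_ms = scene_length_ms * fraction  # time allotted to the stroke
    return budget_ms / (num_points - 1)     # equal spacing per segment
```

For example, a stroke of 11 apexes displayed within a 2000 ms scene would be rendered at 100 ms per segment, so the stroke completes before the scene changes.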
According to another aspect of the present invention, there is provided a communication method at a network system including first and second communication terminals capable of communication with each other. The communication method includes the steps of: displaying, by the first communication terminal, motion picture contents; accepting, by the first communication terminal, input of a hand-drawing image; transmitting, by the first communication terminal, to the second communication terminal the hand-drawing image input during display of the motion picture contents and start information for identifying the point of time when input of the hand-drawing image at the motion picture contents is started; displaying, by the second communication terminal, the motion picture contents; receiving, by the second communication terminal, the hand-drawing image and start information from the first communication terminal; and displaying, by the second communication terminal, the hand-drawing image from the point of time when input of the hand-drawing image at the motion picture contents is started, based on the start information.
According to another aspect of the present invention, there is provided a communication terminal capable of communicating with another communication terminal. The communication terminal includes a communication device for communicating with the other communication terminal, a touch panel for displaying motion picture contents, and a processor for accepting input of a first hand-drawing image via the touch panel. The processor transmits the first hand-drawing image input during display of the motion picture contents and first start information for identifying the point of time when input of the first hand-drawing image at the motion picture contents is started to the other communication terminal via the communication device, receives a second hand-drawing image and second start information from the other communication terminal, and causes display of the second hand-drawing image from the point of time when input of the second hand-drawing image at the motion picture contents is started, on the touch panel, based on the second start information.
According to another aspect of the present invention, there is provided a communication method at a communication terminal including a communication device, a touch panel, and a processor. The communication method includes the steps of: causing, by the processor, display of motion picture contents on the touch panel; accepting, by the processor, input of a first hand-drawing image via the touch panel; transmitting, by the processor, the first hand-drawing image input during display of the motion picture contents and first start information for identifying the point of time when input of the first hand-drawing image at the motion picture contents is started to another communication terminal via the communication device; receiving, by the processor, a second hand-drawing image and second start information from the other communication terminal via the communication device; and causing, by the processor, display of the second hand-drawing image from the point of time when input of the second hand-drawing image at the motion picture contents is started on the touch panel, based on the second start information.
By a network system, communication method, and communication terminal of the present invention, the intention of a user transmitting (entering) information can be conveyed more effectively to a user receiving (viewing) the information.
Embodiments will be described hereinafter with reference to the drawings. In the description, the same elements have the same reference characters allotted, and their designation and function are also identical. Therefore, detailed description thereof will not be repeated.
The following description is based on a mobile phone 100 as a typical example of a “communication terminal”. The communication terminal may be any other information communication device that can be connected to a network, such as a personal computer, a car navigation system (satellite navigation system), a PND (Personal Navigation Device), a PDA (Personal Digital Assistant), a game machine, an electronic dictionary, an electronic book, or the like.
First, an entire configuration of a network system 1 according to the present embodiment will be described.
For the sake of simplification, network system 1 of the present embodiment will be described based on the case where first mobile phone 100A, second mobile phone 100B, third mobile phone 100C and fourth mobile phone 100D are incorporated. Mobile phones 100A, 100B, 100C and 100D may be generically referred to as mobile phone 100 when a configuration or function common to each of mobile phones 100A, 100B, 100C and 100D is described. Furthermore, mobile phones 100A, 100B, 100C and 100D, car navigation device 200, and personal computer 300 may also be generically referred to as a communication terminal when a configuration or function common to each thereof is to be described.
Mobile phone 100 is configured to allow connection to carrier network 700. Car navigation device 200 is configured to allow connection to Internet 500. Personal computer 300 is configured to allow connection to Internet 500 via a local area network (LAN) 350 or a wide area network (WAN). Chat server 400 is configured to allow connection to Internet 500. Contents server 600 is configured to allow connection to Internet 500.
In more detail, first mobile phone 100A, second mobile phone 100B, third mobile phone 100C and fourth mobile phone 100D, car navigation device 200 and personal computer 300 can be connected with each other and transmit/receive data mutually via Internet 500 and/or carrier network 700 and/or a mail transmission server (chat server 400 in
In the present embodiment, each of mobile phone 100, car navigation device 200, and personal computer 300 is assigned identification information for identifying itself (for example, a mail address, an Internet protocol (IP) address, or the like). Mobile phone 100, car navigation device 200, and personal computer 300 can store the identification information of another communication terminal in an internal recording medium, and can carry out data transmission/reception with that other communication terminal via carrier network 700 or Internet 500 based on the identification information.
Mobile phone 100, car navigation device 200, and personal computer 300 of the present embodiment can use the IP address assigned to another terminal for data transmission/reception with the relevant other communication terminal without the intervention of servers 400 and 600. In other words, mobile phone 100, car navigation device 200, and personal computer 300 in network system 1 of the present embodiment can establish the so-called P2P (Peer to Peer) type network.
When each communication terminal gains access to chat server 400, i.e., when each communication terminal gains access to the Internet, it is assumed that an IP address is assigned by chat server 400 or a server device not shown. Since the details of this IP address assigning process are well known, description thereof will not be repeated.
Mobile phone 100, car navigation device 200, and personal computer 300 can receive various motion picture contents from contents server 600 via Internet 500. The users of mobile phone 100, car navigation device 200, and personal computer 300 can view the motion picture contents from contents server 600.
<Overall Operation Overview of Network System 1>
The operation overview of network system 1 according to the present embodiment will be described hereinafter.
As shown in
The following description is based on the case where each communication terminal transmits/receives a message and/or an attached file via a chat room generated by chat server 400. Further, the case where first mobile phone 100A generates a new chat room and invites second mobile phone 100B to that chat room will be described. Chat server 400 may be configured to play the role of contents server 600.
First, first mobile phone 100A (terminal A in
Chat server 400 responds to the request to store the mail address of first mobile phone 100A in association with its IP address. Chat server 400 produces a room name, and generates a chat room of the relevant room name, based on the mail address of first mobile phone 100A and the mail address of second mobile phone 100B. At this stage, chat server 400 may notify first mobile phone 100A that generation of a chat room is completed. Chat server 400 stores the room name and the IP address of the participating communication terminal in association.
Alternatively, first mobile phone 100A produces a room name of a new chat room, and transmits that room name to chat server 400, based on the mail address of first mobile phone 100A and the mail address of second mobile phone 100B. Chat server 400 generates a new chat room based on the room name.
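Because both first mobile phone 100A and second mobile phone 100B later produce the room name from the same pair of mail addresses (see step S0008), the derivation must be deterministic and order-insensitive. The following is a hypothetical sketch only; the hashing scheme and the name length are assumptions, not taken from the embodiment.

```python
import hashlib

def make_room_name(addr_a: str, addr_b: str) -> str:
    """Derive a chat-room name deterministically from two mail addresses,
    so either terminal can reproduce it independently of argument order.
    The sorting + SHA-1 scheme here is purely illustrative."""
    key = "|".join(sorted([addr_a, addr_b]))
    return hashlib.sha1(key.encode("utf-8")).hexdigest()[:8]
```

Either terminal calling this with the two addresses, in any order, obtains the same room name.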
First mobile phone 100A transmits to second mobile phone 100B a mail message informing that a new chat room has been generated, i.e. a P2P participation request indicating an invitation to that chat room (step S0004, step S0006). Specifically, first mobile phone 100A transmits P2P participation request mail to second mobile phone 100B via carrier network 700, the mail transmission server (chat server 400), and Internet 500 (step S0004, step S0006).
Upon receiving the P2P participation request mail (step S0006), second mobile phone 100B produces a room name based on the mail address of first mobile phone 100A and the mail address of second mobile phone 100B, and transmits to chat server 400 the mail address and IP address of second mobile phone 100B as well as a message indicating participation in the chat room of that room name (step S0008). Second mobile phone 100B may obtain the IP address at the same time, or first obtain an IP address, and then gain access to chat server 400.
Chat server 400 accepts that message and determines whether the mail address of second mobile phone 100B corresponds to the room name, and then stores the mail address of second mobile phone 100B in association with the IP address. Then, chat server 400 transmits to first mobile phone 100A a mail message informing that second mobile phone 100B is participating in the chat room and the IP address of second mobile phone 100B (step S0010). At the same time, chat server 400 transmits to second mobile phone 100B a mail message informing acceptance of the participation in the chat room and the IP address of first mobile phone 100A.
First mobile phone 100A and second mobile phone 100B obtain the mail address and IP address of the other party to authenticate each other (step S0012). Upon completing authentication, first mobile phone 100A and second mobile phone 100B initiate P2P communication (chat communication) (step S0014). The operation overview during P2P communication will be described afterwards.
In response to first mobile phone 100A transmitting a message informing disconnection of P2P communication to second mobile phone 100B (step S0016), second mobile phone 100B transmits a message informing that the disconnection request has been accepted to first mobile phone 100A (step S0018). First mobile phone 100A transmits a request for eliminating the chat room to chat server 400 (step S0020). Chat server 400 eliminates the chat room.
The operation overview of network system 1 according to the present embodiment will be described hereinafter in further detail with reference to
As shown in
As shown in
First mobile phone 100A and second mobile phone 100B may both receive the contents from contents server 600 upon starting P2P communication, i.e. during P2P communication.
As shown in
As shown in
Thus, as shown in
More specifically, in the present embodiment, first mobile phone 100A accepts input of a hand-drawing image from a user, and displays the hand-drawing image over the contents. First mobile phone 100A transmits the hand-drawing image to second mobile phone 100B. Second mobile phone 100B displays the hand-drawing image on the contents based on the hand-drawing image from first mobile phone 100A.
In an opposite manner, second mobile phone 100B accepts input of a hand-drawing image from a user and displays that hand-drawing image over the contents. Second mobile phone 100B transmits the hand-drawing image to first mobile phone 100A. First mobile phone 100A displays the hand-drawing image over the contents based on the hand-drawing image from second mobile phone 100B.
After first mobile phone 100A disconnects P2P communication (step S0016, step S0018), second mobile phone 100B can carry out mail transmission with first mobile phone 100A and the like, as shown in
<Operation Overview Related to Hand-Drawing Image Transmission/Reception at Network System 1>
The operation overview related to input and drawing of a hand-drawing image during reproduction of motion picture contents will be described in further detail hereinafter.
Referring to
One mobile phone (first mobile phone 100A in
In other words, the length of the period from the start of the motion picture contents to the start of drawing the hand-drawing image is the same for each of mobile phones 100A-100D. Each of mobile phones 100A-100D therefore displays the hand-drawing image input at first mobile phone 100A on the same scene in the same motion picture contents; that is, each begins to draw the hand-drawing image on the relevant motion picture contents at the same elapsed time from the start of the motion picture contents.
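One hedged way to realize this synchronization is for each receiving terminal to compare the sender's recorded playback offset against its own current playback position, as in the following sketch. The function name, the tuple-based return value, and the draw-immediately fallback policy are illustrative assumptions, not taken from the embodiment.

```python
def schedule_stroke(timing_ms: int, position_ms: int):
    """Decide when to render a remote stroke relative to local playback.

    timing_ms   -- playback offset (ms) at which input started at the sender
    position_ms -- this terminal's current playback position (ms)
    Returns an (action, delay_ms) pair."""
    delta = timing_ms - position_ms
    if delta >= 0:
        return ("wait_then_draw", delta)   # scene not yet reached locally
    return ("draw_now", 0)                 # start time already passed
```

For example, a terminal whose playback is at 3000 ms, receiving a stroke recorded at offset 5000 ms, would wait 2000 ms so the stroke appears on the same scene as at the sender.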
Thus, in network system 1 of the present embodiment, the hand-drawing image input at a communication terminal can be displayed for other communication terminals on the same scene or same frame even though respective communication terminals download the motion picture contents individually from contents server 600.
Therefore, when the user of one communication terminal wishes to convey his/her information related to a certain scene, the relevant information will be displayed together with the certain one scene at other communication terminals. In other words, the intention of a user transmitting (entering) information can be conveyed effectively to a user receiving (viewing) the information.
A configuration of network system 1 to realize such function will be described in detail hereinafter.
<Hardware Configuration of Mobile Phone 100>
The hardware configuration of mobile phone 100 according to the present embodiment will be described hereinafter.
As shown in
Display 107 according to the present embodiment is constituted of a liquid crystal panel or a CRT and, together with pen tablet 104, realizes a touch panel 102. In other words, mobile phone 100 of the present embodiment has pen tablet 104 provided at the upper side (front side) of display 107. Accordingly, the user can enter hand-drawn input such as graphical information to CPU 106 via pen tablet 104 by using a stylus pen 120 or the like.
The user can input hand-drawing by other methods, as set forth below. By using a special pen that outputs infrared rays or ultrasonic waves, the movement of the pen is identified by a reception unit receiving the infrared rays or ultrasonic waves emitted from the pen. In this case, by connecting the relevant reception unit to a device that stores the trace, CPU 106 can receive the trace output from that device as hand-drawing input.
Alternatively, the user can write a hand-drawing image on an electrostatic panel using his/her finger or a pen compatible with electrostatic sensing.
Thus, display 107 (touch panel 102) provides the display of an image or text based on the data output from CPU 106. For example, display 107 shows the motion picture contents received via communication device 101. Display 107 can show a hand-drawing image overlapped on the motion picture contents, based on the hand-drawing image accepted via tablet 104 or accepted via communication device 101.
Various-type button 110 accepts information from a user through key input operation or the like. For example, various-type button 110 includes a TEL button 110A for accepting/dispatching conversation, a mail button 110B for accepting/dispatching mail, a P2P button 110C for accepting/dispatching P2P communication, an address book button 110D for invoking address book data, and an end button 110E for ending various processing. In other words, various-type button 110 selectively accepts, from a user, an instruction to participate in a chat room and/or an instruction to display the mail contents when P2P participation request mail is received via communication device 101.
Furthermore, various-type button 110 may include a button to accept an instruction to start hand-drawing input, i.e. a button for accepting a first input. Various-type button 110 may also include a button for accepting an instruction to end a hand-drawing input, i.e. a button for accepting a second input.
First notification unit 111 issues a ringing sound via a speaker 109 or the like. Alternatively, first notification unit 111 has vibration capability. First notification unit 111 issues sound or causes mobile phone 100 to vibrate when called, when receiving mail, or when receiving P2P participation request mail.
Second notification unit 112 includes a telephone LED (Light Emitting Diode) 112A that blinks when receiving a call, a mail LED 112B that blinks when receiving mail, and a P2P LED 112C that blinks when receiving P2P communication.
CPU 106 controls various elements in mobile phone 100. For example, CPU 106 accepts various instructions from the user via various-type button 110, and transmits/receives data to/from an external communication terminal via communication device 101.
Communication device 101 converts communication data from CPU 106 into communication signals for output to an external source. Communication device 101 converts externally applied communication signals into communication data for input to CPU 106.
Memory 103 is realized by a random access memory (RAM) functioning as a work memory, a read only memory (ROM) for storing a control program and the like, a hard disk storing image data, and the like.
As shown in
As shown in
As shown in
As shown in
Each mobile phone 100 according to the present embodiment can transmit/receive data to/from another communication terminal by the method set forth above (refer to
<Hardware Configuration of Chat Server 400 and Contents Server 600>
The hardware configuration of chat server 400 and contents server 600 according to the present embodiment will be described hereinafter. First, the hardware configuration of chat server 400 will be described.
Memory 406 serves to store various information. For example, memory 406 temporarily stores data required for execution of a program at CPU 405. Hard disk 407 stores a program and/or database for execution by CPU 405. CPU 405 is a device controlling each element in chat server 400 for implementing various operations.
Communication device 409 converts the data output from CPU 405 into electrical signals for transmission outwards, and converts externally received electrical signals into data for input to CPU 405. Specifically, communication device 409 transmits the data from CPU 405 to a device that can be connected on the network such as mobile phone 100, car navigation device 200, personal computer 300, a game machine, an electronic dictionary, and an electronic book via Internet 500 and/or carrier network 700. Communication device 409 applies data received from a device that can be connected on the network such as mobile phone 100, car navigation device 200, personal computer 300, a game machine, an electronic dictionary, and an electronic book to CPU 405 via Internet 500 and/or carrier network 700.
The data stored in memory 406 or hard disk 407 will be described hereinafter.
As shown in
As will be described afterwards, room name R is determined by CPU 405 based on the mail address of the communication terminal having an IP address of A and the mail address of the communication terminal having an IP address of B. When a communication terminal having an IP address of E newly enters the chat room of room name S at the state of
Specifically, when first mobile phone 100A requests generation of a new chat room (step S0002 in
When second mobile phone 100B requests participation in the chat room to chat server 400 (step S0008 in
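The bookkeeping performed by chat server 400 in steps S0002-S0010 can be sketched as follows. This is an illustrative model only, assuming in-memory storage; the class and method names are hypothetical, and error handling is reduced to a single check.

```python
class ChatRoomRegistry:
    """Minimal sketch of the room-name / member bookkeeping described
    above. Each room maps member mail addresses to their IP addresses."""

    def __init__(self):
        self.rooms = {}  # room name -> {mail address: IP address}

    def create_room(self, room: str, mail: str, ip: str) -> None:
        # Step S0002: store the creator's mail address with its IP address.
        self.rooms[room] = {mail: ip}

    def join_room(self, room: str, mail: str, ip: str) -> dict:
        # Step S0008: register the new participant, then return the
        # existing members' addresses (step S0010) so the participant
        # can start P2P communication with them.
        if room not in self.rooms:
            raise KeyError("unknown room: " + room)
        self.rooms[room][mail] = ip
        return {m: a for m, a in self.rooms[room].items() if m != mail}
```

In this sketch, the value returned by `join_room` corresponds to the notification of step S0010, by which each party obtains the other's IP address for authentication.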
The hardware configuration of contents server 600 will be described hereinafter. As shown in
Memory 606 stores various types of information. For example, memory 606 temporarily stores data required for execution of a program at CPU 605. Hard disk 607 stores the program and/or database for execution by CPU 605. CPU 605 is a device for controlling various elements in contents server 600 to implement various operations.
Communication device 609 converts the data output from CPU 605 into electrical signals for transmission outwards, and converts externally applied electrical signals into data for input to CPU 605. Specifically, communication device 609 transmits the data from CPU 605 to a device that can be connected on the network, such as mobile phone 100, car navigation device 200, personal computer 300, a game machine, an electronic dictionary, or an electronic book, via Internet 500 and/or carrier network 700. Communication device 609 applies the data received from such a device to CPU 605 via Internet 500 and/or carrier network 700.
Memory 606 or hard disk 607 of contents server 600 stores motion picture contents. CPU 605 of contents server 600 receives a specification of contents (an address or the like indicating the storage destination of the motion picture contents) from first mobile phone 100A and second mobile phone 100B via communication device 609. Based on the specification of the contents, CPU 605 of contents server 600 reads out the motion picture contents corresponding to that specification from memory 606 or hard disk 607, and transmits the relevant contents to first mobile phone 100A and second mobile phone 100B via communication device 609.
<Communication Processing at Mobile Phone 100>
P2P communication processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Hereinafter, transmission of a specification of motion picture contents, a hand-drawing image, or the like from first mobile phone 100A to second mobile phone 100B will be described. In the present embodiment, first mobile phone 100A and second mobile phone 100B transmit/receive data via chat server 400. However, data may be transmitted/received through P2P communication without the intervention of chat server 400. In this case, first mobile phone 100A must store data or transmit data to second mobile phone 100B or third mobile phone 100C on behalf of chat server 400.
Referring to
As used herein “data associated with chat communication” includes the chat room ID, member's terminal information, notification (notice information), the chat contents up to the present time, and the like.
CPU 106 of first mobile phone 100A causes touch panel 102 to display a window for chat communication (step S006). Similarly, CPU 106 of second mobile phone 100B causes touch panel 102 to display a window for chat communication (step S008).
CPU 106 of first mobile phone 100A receives motion picture contents via communication device 101 based on a contents reproduction instruction from a user (step S010). More specifically, CPU 106 receives an instruction specifying motion picture contents from the user via touch panel 102. The user may directly enter a URL (Uniform Resource Locator) at first mobile phone 100A, or select a link corresponding to the desired motion picture contents on the currently displayed Web page.
CPU 106 of first mobile phone 100A uses communication device 101 to transmit motion picture information (a) for identifying selected motion picture contents to another communication terminal participating in the chat via chat server 400 (step S012). Alternatively, CPU 106 of first mobile phone 100A uses communication device 101 to transmit motion picture information (a) for identifying selected motion picture contents directly to another communication terminal participating in the chat by P2P communication. As shown in
As shown in
CPU 106 of second mobile phone 100B receives motion picture information (a) from chat server 400 via communication device 101 (step S016). CPU 106 analyzes the motion picture information (step S018), and downloads the motion picture contents from contents server 600 (step S020). As shown in
The present example is based on, but not limited to, the case where first mobile phone 100A and second mobile phone 100B obtain motion picture information during chat communication. First mobile phone 100A and second mobile phone 100B may obtain common motion picture information prior to chat communication.
It is assumed that third mobile phone 100C participates in the chat subsequently. CPU 106 of third mobile phone 100C obtains the chat data from chat server 400 via communication device 101 (step S024).
At this stage, chat server 400 stores motion picture information (a) from first mobile phone 100A. CPU 405 of chat server 400 transmits motion picture information (a) as a portion of the chat data to third mobile phone 100C via communication device 409.
CPU 106 of third mobile phone 100C analyzes the chat data to obtain motion picture information (step S026). CPU 106 obtains motion picture contents from contents server 600 based on the motion picture information (step S028). As shown in
It is here assumed that CPU 106 accepts hand-drawing input by a user via touch panel 102 during reproduction of the motion picture contents at first mobile phone 100A (step S032).
More specifically, CPU 106 obtains change in the touching position on touch panel 102 (trace) by sequentially accepting touch coordinate data from touch panel 102 at every predetermined time. Then, as shown in
Hand-drawing clear information (b) includes information (true) for clearing the hand-drawing input up to that time, or information (false) for continuing hand-drawing input. Information (c) indicating the trace of the touching position includes the coordinates of each apex constituting a hand-drawing stroke, and the elapsed time, from the point of time when hand-drawing input is started, corresponding to each apex. Timing information (f) indicates the timing at which drawing of the hand-drawing image should be started. More specifically, timing information (f) includes the time (ms) from the start of the motion picture contents, information to identify the scene in the motion picture contents (a scene number or the like), and information to identify the frame in the motion picture contents (a frame number or the like), at which hand-drawing input is accepted at first mobile phone 100A.
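As one hypothetical representation, the transmission data described above could be modeled as follows. Only items (b), (c), and (f) are detailed in the text, so this sketch omits items (d) and (e); all field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class HandDrawingMessage:
    """Illustrative container for the hand-drawing transmission data.
    Items (d) and (e) are not detailed in the text and are omitted."""
    clear: bool        # (b) true = clear hand-drawing input up to this time
    timing_ms: int     # (f) playback offset (ms) when input was accepted
    trace: List[Tuple[int, int, int]] = field(default_factory=list)
                       # (c) (x, y, elapsed-ms-from-stroke-start) per apex
```

For instance, a two-apex stroke begun 5000 ms into the contents might be carried as `HandDrawingMessage(clear=False, timing_ms=5000, trace=[(10, 20, 0), (12, 22, 16)])`.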
At this stage, i.e. at step S032, CPU 106 causes display of the input hand-drawing image on the motion picture contents (overlapping on the motion picture contents) at touch panel 102. As shown in
As shown in
CPU 106 repeats the processing of steps S032-S034 every time input of a hand-drawing image is accepted. Alternatively, CPU 106 repeats the processing of steps S032-S036 every time input of a hand-drawing image is accepted. As shown in
CPU 106 uses communication device 101 to transmit the relevant transmission data to another communication terminal participating in the chat via chat server 400 (step S036). CPU 405 of chat server 400 stores transmission data (b)-(f) in memory 406 for any communication terminal that comes to participate later on. At the current point of time, second mobile phone 100B and third mobile phone 100C are participating in the chat. Alternatively, CPU 106 uses communication device 101 to directly transmit the relevant transmission data to another communication terminal participating in the chat through P2P communication (step S036).
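The relay behavior described in this step can be sketched as follows; the class and method names are assumptions, and the stored list stands in for memory 406 of chat server 400, which retains transmission data for terminals that join later.

```python
# Minimal sketch: the server forwards transmission data to current participants
# and replays stored data to any terminal that joins afterwards.
class Terminal:
    def __init__(self, name):
        self.name = name
        self.received = []

    def receive(self, data):
        self.received.append(data)

class ChatServer:
    def __init__(self):
        self.participants = []
        self.stored = []                # transmission data kept for late joiners

    def join(self, terminal):
        self.participants.append(terminal)
        for data in self.stored:        # replay history to the new participant
            terminal.receive(data)

    def relay(self, sender, data):
        self.stored.append(data)
        for t in self.participants:
            if t is not sender:
                t.receive(data)

server = ChatServer()
a, b = Terminal("100A"), Terminal("100B")
server.join(a)
server.join(b)
server.relay(a, "transmission data (b)-(f)")
c = Terminal("100C")                    # joins after the data was sent
server.join(c)
```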
CPU 106 of second mobile phone 100B receives transmission data (b)-(f) from chat server 400 via communication device 101 (step S038). CPU 106 analyzes the transmission data (step S040). As shown in
As shown in
CPU 106 of third mobile phone 100C receives the transmission data from chat server 400 via communication device 101 (step S044). CPU 106 analyzes the transmission data (step S046). As shown in
As shown in
Then, it is assumed that fourth mobile phone 100D comes to participate in the chat. More specifically, it is assumed that fourth mobile phone 100D participates in the chat after input of a hand-drawing image ends at first mobile phone 100A. It does not matter whether or not reproduction of the motion picture contents has ended at first mobile phone 100A, second mobile phone 100B and third mobile phone 100C.
CPU 106 of fourth mobile phone 100D obtains the chat data from chat server 400 via communication device 101 (step S050). At this stage, chat server 400 stores motion picture information (a) from first mobile phone 100A. CPU 405 of chat server 400 transmits motion picture information (a) and transmission data (b)-(f) stored up to that point of time as a portion of chat data to fourth mobile phone 100D via communication device 409.
CPU 106 of fourth mobile phone 100D analyzes the chat data to obtain the motion picture information and transmission data (step S052). CPU 106 obtains the motion picture contents from contents server 600 based on the motion picture information (step S054). As shown in
As shown in
As shown in
Accordingly, the hand-drawing image is drawn at second mobile phone 100B, third mobile phone 100C and fourth mobile phone 100D at a timing identical to that in the motion picture contents having the hand-drawing image input at first mobile phone 100A. In other words, the desired information is drawn at the scene intended by the user of first mobile phone 100A even at second mobile phone 100B, third mobile phone 100C and fourth mobile phone 100D.
<Modification of Communication Processing at Mobile Phone 100>
A modification of P2P communication processing at mobile phone 100 of the present embodiment will be described hereinafter.
Specifically,
Referring to
As used herein, “data associated with chat communication” includes the chat room ID, member's terminal information, notification (notice information), the chat contents up to the present time, and the like.
CPU 106 of first mobile phone 100A causes touch panel 102 to display a window for chat communication (step S106). Similarly, CPU 106 of second mobile phone 100B causes touch panel 102 to display a window for chat communication (step S108).
CPU 106 of first mobile phone 100A receives motion picture contents via communication device 101 based on a contents reproduction instruction from the user (step S110). More specifically, CPU 106 receives an instruction to specify motion picture contents from the user via touch panel 102. The user may directly enter URL at first mobile phone 100A, or select a link corresponding to the desired motion picture contents on the currently displayed Web page.
As shown in
It is here assumed that CPU 106 accepts hand-drawing input by a user via touch panel 102 during reproduction of the motion picture contents at first mobile phone 100A (step S114).
More specifically, CPU 106 obtains change in the touching position on touch panel 102 (trace) by sequentially accepting touch coordinate data from touch panel 102 at every predetermined time. Then, as shown in
Hand-drawing clear information (b) includes information (true) for clearing the hand-drawing input up to that time or information (false) for continuing hand-drawing input. Timing information (f) indicates the timing when hand-drawing should be effected. More specifically, timing information (f) includes the time (ms) elapsed from the start of the motion picture contents, information to identify a scene in the motion picture contents (a scene number or the like), or information to identify a frame in the motion picture contents (a frame number or the like), at the point when hand-drawing input is accepted at first mobile phone 100A.
At this stage, i.e. at step S114, CPU 106 causes display of the input hand-drawing image on the motion picture contents (overlapping on the motion picture contents) at touch panel 102 based on transmission data. As shown in
As shown in
CPU 106 repeats the processing of steps S114-S116 every time input of a hand-drawing image is accepted. As shown in
CPU 106 uses communication device 101 to transmit motion picture information (a) and the already-created transmission data (b)-(f) to another communication terminal participating in the chat via chat server 400 (step S120). As shown in
Alternatively, CPU 106 uses communication device 101 to directly transmit motion picture information (a) and the already-created transmission data (b)-(f) to another communication terminal participating in the chat by P2P transmission (step S120). In this case, CPU 106 stores motion picture information (a) and all transmission data (b)-(f) already produced in its own memory 103.
CPU 405 of chat server 400 may leave motion picture information (a) and transmission data (b)-(f) in memory 406 for any communication terminal that may participate in the chat later on. At the current point of time, second mobile phone 100B is participating in the chat.
CPU 106 of second mobile phone 100B receives motion picture information (a) and transmission data (b)-(f) from chat server 400 via communication device 101 (step S122). CPU 106 analyzes motion picture information (a) and transmission data (b)-(f) (step S124). CPU 106 downloads the motion picture contents from contents server 600 (step S126). As shown in
As shown in
As shown in
Accordingly, the hand-drawing image is drawn at second mobile phone 100B, at a timing identical to that in the motion picture contents having the hand-drawing image input at first mobile phone 100A. In other words, the desired information is drawn at the scene intended by the user of first mobile phone 100A even at second mobile phone 100B.
<Input Processing at Mobile Phone 100>
The input processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When the pen information setting process (step S300) ends, CPU 106 determines whether data (b) is true or not (step S202). When data (b) is true (YES at step S202), CPU 106 stores data (b) in memory 103 (step S204). CPU 106 ends the input processing.
When data (b) is not true (NO at step S202), CPU 106 determines whether stylus pen 120 has touched touch panel 102 or not (step S206). In other words, CPU 106 determines whether pen-down has been detected or not.
When pen-down is not detected (NO at step S206), CPU 106 determines whether the touching position of stylus pen 120 against touch panel 102 has changed or not (step S208). In other words, CPU 106 determines whether pen-dragging has been detected or not. When pen-dragging has not been detected (NO at step S208), CPU 106 ends the input processing.
When CPU 106 detects pen-down (YES at step S206), or pen-dragging (YES at step S208), CPU 106 sets “false” for data (b) (step S210). CPU 106 executes the hand-drawing processing (step S400). The hand-drawing process (step S400) will be described afterwards.
When the hand-drawing processing (step S400) ends, CPU 106 stores data (b), (c), (d), (e) and (f) in memory 103 (step S212). CPU 106 ends the input processing.
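The branching of the input processing (steps S202-S212) can be sketched as follows, assuming simplified pen events in place of touch panel 102 detection; the hand-drawing processing of step S400 is reduced to a stand-in value.

```python
# Sketch of the input processing flow (steps S202-S212); pen events and the
# stored stroke are simplified stand-ins for touch panel 102 input.
def input_processing(clear_requested, pen_event, memory):
    if clear_requested:                    # steps S202-S204: store clear flag only
        memory["b"] = True
        return "clear stored"
    if pen_event not in ("down", "drag"):  # steps S206-S208: nothing to do
        return "no input"
    memory["b"] = False                    # step S210: continue hand-drawing
    memory["c"] = "0,0,0"                  # step S400 stand-in: sampled stroke
    return "stroke stored"                 # step S212

mem = {}
result = input_processing(False, "down", mem)
```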
(Pen Information Setting Processing at Mobile Phone 100)
The pen information setting processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When an instruction to clear the hand-drawing image has not been accepted from the user (NO at step S302), CPU 106 sets “false” for data (b) (step S306). CPU 106 determines whether an instruction to modify the color of the pen has been accepted or not from the user via touch panel 102 (step S308). When an instruction to modify the color of the pen has not been accepted from the user (NO at step S308), CPU 106 executes the process starting from step S312.
When an instruction to modify the color of the pen has been accepted from the user (YES at step S308), CPU 106 sets the modified color of the pen for data (d) (step S310). CPU 106 determines whether an instruction to modify the width of the pen has been accepted or not from the user via touch panel 102 (step S312). When an instruction to modify the width of the pen has not been accepted from the user (NO at step S312), CPU 106 ends the pen information setting processing.
When an instruction to modify the width of the pen has been accepted from the user (YES at step S312), CPU 106 sets the modified width of the pen for data (e) (step S314). CPU 106 ends the pen information setting processing.
(Hand-Drawing Processing at Mobile Phone 100)
The hand-drawing processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When stylus pen 120 is touching touch panel 102 (YES at step S402), CPU 106 refers to a clock not shown to obtain the elapsed time from starting the motion picture contents (step S404). CPU 106 sets the time (period) from starting motion picture contents up to starting hand-drawing input for data (f) (step S406).
In the following, CPU 106 may set information to identify a scene or information to identify a frame, instead of the time (period) from starting motion picture contents up to starting hand-drawing input. This is because the intention of the person entering the hand-drawing image can be readily conveyed if the scene is identified.
CPU 106 obtains via touch panel 102 the touching coordinates (X, Y) of stylus pen 120 on touch panel 102 and current time (T) (step S408). CPU 106 sets “X, Y, T” for data (c) (step S410).
CPU 106 determines whether a predetermined time has elapsed from the time of obtaining the previous coordinates (step S412). When the predetermined time has not elapsed (NO at step S412), CPU 106 repeats the processing from step S412.
When the predetermined time has elapsed (YES at step S412), CPU 106 determines whether pen-dragging has been detected or not via touch panel 102 (step S414). When pen-dragging has not been detected (NO at step S414), CPU 106 executes the processing from step S420.
When pen-dragging has been detected (YES at step S414), CPU 106 obtains via touch panel 102 the touching position coordinates (X, Y) of stylus pen 120 on touch panel 102 and the current time (T) (step S416). CPU 106 adds “: X, Y, T” to data (c) (step S418). CPU 106 determines whether a predetermined time has elapsed from obtaining the previous touching coordinates (step S420). When the predetermined time has not elapsed (NO at step S420), CPU 106 repeats the processing from step S420.
When the predetermined time has elapsed (YES at step S420), CPU 106 determines whether pen-up has been detected via touch panel 102 (step S422). When pen-up has not been detected (NO at step S422), CPU 106 repeats the processing from step S414.
When pen-up has been detected (YES at step S422), CPU 106 obtains via touch panel 102 the touching position coordinates (X, Y) of the stylus pen on touch panel 102 and the current time (T) (step S424). CPU 106 adds “: X, Y, T” to data (c) (step S426). CPU 106 ends the hand-drawing processing.
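Under the format suggested by steps S410, S418 and S426 ("X, Y, T" entries joined by ":"), the accumulation of data (c) can be sketched as follows; the sample list stands in for touch panel readings, and the exact separator and spacing are assumptions.

```python
# Sketch of how data (c) accumulates apex coordinates (steps S408-S426):
# one "X,Y,T" entry per sampled touching position, joined by ":".
def build_trace(samples):
    # samples: (x, y, t) tuples; the first corresponds to pen-down (step S410),
    # later ones to pen-dragging and the final pen-up (steps S418, S426)
    return ":".join(f"{x},{y},{t}" for (x, y, t) in samples)

trace = build_trace([(10, 20, 0), (14, 25, 100), (18, 30, 200)])
```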
<Modification of Input Processing at Mobile Phone 100>
A modification of input processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Specifically, the input processing set forth above with reference to
Referring to
When the pen information setting processing (step S300) ends, CPU 106 determines whether data (b) is “true” or not (step S252). When data (b) is “true” (YES at step S252), CPU 106 stores data (b) in memory 103 (step S254). CPU 106 ends the input processing.
When data (b) is not true (NO at step S252), CPU 106 determines whether stylus pen 120 has touched touch panel 102 or not (step S256). In other words, CPU 106 determines whether pen-down has been detected or not.
When pen-down has not been detected (NO at step S256), CPU 106 determines whether the touching position of stylus pen 120 on touch panel 102 has changed or not (step S258). In other words, CPU 106 determines whether pen-dragging has been detected or not. When pen-dragging has not been detected (NO at step S258), CPU 106 ends the input processing.
When pen-down has been detected (YES at step S256), or when pen-dragging has been detected (YES at step S258), CPU 106 sets “false” for data (b) (step S260). CPU 106 executes the hand-drawing processing (step S400) set forth above.
When the hand-drawing processing (step S400) ends, CPU 106 determines whether the scene has been changed or not (step S262). More specifically, CPU 106 determines whether the scene when hand-drawing input has been started differs from the current scene or not. Instead of determining whether the scene has changed or not, CPU 106 may determine whether a predetermined time has elapsed from the pen-up.
When the scene has not changed (NO at step S262), CPU 106 adds “:” to data (c) (step S264). CPU 106 determines whether a predetermined time has elapsed from the previous hand-drawing processing (step S266). When the predetermined time has not elapsed (NO at step S266), CPU 106 repeats the processing from step S266. When the predetermined time has elapsed (YES at step S266), CPU 106 repeats the processing from step S400.
When the scene has changed (YES at step S262), CPU 106 stores data (b), (c), (d), (e) and (f) into memory 103 (step S268). CPU 106 ends the input processing.
<Hand-Drawing Image Display Processing at Mobile Phone 100>
The hand-drawing image display processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
CPU 106 determines whether time=t is established or not (step S516). When time=t is not established (NO at step S516), CPU 106 repeats the processing from step S514.
When time=t is established (YES at step S516), CPU 106 obtains the coordinates of the apexes of the hand-drawing stroke (data (c)) (step S518). CPU 106 obtains the count “n” of apex coordinates of the hand-drawing stroke (step S520).
CPU 106 executes the first drawing processing (step S610). The first drawing processing (step S610) will be described afterwards. Then, CPU 106 ends the hand-drawing image display processing.
(First Drawing Processing at Mobile Phone 100)
The first drawing processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When the time Ct (i+1) has elapsed from time t (YES at step S614), CPU 106 uses touch panel 102 to draw a hand-drawing stroke by connecting coordinates (Cxi, Cyi) and coordinates (Cx (i+1), Cy (i+1)) by a line (step S616). CPU 106 increments variable i (step S618).
CPU 106 determines whether variable i is greater than or equal to the count n (step S620). When variable i is less than n (NO at step S620), CPU 106 repeats the processing from step S614. When variable i is greater than or equal to the count n (YES at step S620), CPU 106 ends the first drawing processing.
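The first drawing processing (steps S612-S620) can be sketched as below; the waiting on the recorded inter-apex times Ct (step S614) is omitted so the segment logic can be checked offline, and draw_line is an assumed stand-in for the touch panel drawing call.

```python
# Sketch of the first drawing processing: connect consecutive apexes by lines
# (step S616), advancing variable i until it reaches the apex count n (S618-S620).
def first_drawing(apexes, draw_line):
    # apexes: (Cx, Cy, Ct) with Ct = recorded elapsed time for that apex;
    # the Ct-based waiting of step S614 is omitted in this dry run
    n = len(apexes)
    for i in range(1, n):
        x0, y0, _ = apexes[i - 1]
        x1, y1, _ = apexes[i]
        draw_line((x0, y0), (x1, y1))

segments = []
first_drawing([(0, 0, 0), (5, 5, 50), (9, 2, 100)],
              lambda p, q: segments.append((p, q)))
```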
The relationship between the input and output of a hand-drawing image according to the present embodiment will be described hereinafter.
As mentioned above, CPU 106 of the communication terminal having a hand-drawing image input (first communication terminal) generates transmission data every time a hand-drawing image is input (from pen-down to pen-up), or when a clear instruction is input, or when the scene has changed. For example, when the scene changes during input of a hand-drawing image, transmission data indicating the hand-drawing image up to the point of time when the scene changes is produced.
Referring to
<First Modification of Hand-Drawing Image Display Processing at Mobile Phone 100>
A first modification of the hand-drawing image display processing at mobile phone 100 according to the present embodiment will be described hereinafter.
When the time required for inputting the hand-drawing image is longer than the period of time from starting hand-drawing input up to the next change of scene, the communication terminal according to the present modification can complete the drawing of the hand-drawing image before the scene is changed by shortening the drawing time. In other words, the case where input of a hand-drawing image can be continued independent of scene change (without the hand-drawing image being cleared at the change of a scene) will be described.
Referring to
CPU 106 determines whether time=t is established or not (step S536). When time=t is not established (NO at step S536), CPU 106 repeats the processing from step S534.
When time=t is established (YES at step S536), CPU 106 obtains the coordinates of the apexes of the hand-drawing stroke (data (c)) (step S538). CPU 106 obtains the count “n” of apex coordinates of the hand-drawing stroke (step S540).
CPU 106 refers to the motion picture contents to obtain the time T before the next change of scene from timing information “time” (step S542). CPU 106 determines whether time T is greater than or equal to the time Ct×n between apexes (step S544).
When time T is greater than or equal to the time Ct×n between apexes (YES at step S544), CPU 106 executes the first drawing processing (step S610) set forth above. CPU 106 ends the hand-drawing image display processing. This corresponds to the case where clear information is input prior to a change of scene or when a predetermined time has elapsed from pen-up before a change of scene.
When time T is less than time Ct×n between apexes (NO at step S544), CPU 106 executes the second drawing processing (step S630). The second drawing processing (step S630) will be described afterwards. Then, CPU 106 ends the hand-drawing image display processing. This corresponds to the case where a change of scene has occurred during input of a hand-drawing image.
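The decision at step S544 can be summarized as a small helper; the names are assumptions.

```python
# Sketch of step S544: the stroke needs Ct per segment over n apexes, so it
# replays at its original pace only if the remaining time T can contain it.
def choose_drawing(T, Ct, n):
    if T >= Ct * n:
        return "first"    # step S610: draw at the original input pace
    return "second"       # step S630: draw with a shortened interval
```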
(Second Drawing Processing at Mobile Phone 100)
The second drawing processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
CPU 106 enters 1 to variable i (step S634). CPU 106 determines whether time dt×i has elapsed from time t (step S636). When the time dt×i has not elapsed from time t (NO at step S636), CPU 106 repeats the processing from step S636.
When the time dt×i has elapsed from time t (YES at step S636), CPU 106 uses touch panel 102 to draw a hand-drawing stroke by connecting coordinates (Cxi, Cyi) and coordinates (Cx (i+1), Cy (i+1)) by a line (step S638). CPU 106 increments variable i (step S640).
CPU 106 determines whether variable i is greater than or equal to the count n (step S642). When variable i is less than n (NO at step S642), CPU 106 repeats the processing from step S636. When variable i is greater than or equal to the count n (YES at step S642), CPU 106 ends the second drawing processing.
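A sketch of the second drawing processing follows. The shortened interval dt is defined in a figure not reproduced here; the natural assumption used below is dt = T / n, so that all segments are scheduled within the time T remaining before the scene change. The real-time waiting of step S636 is omitted.

```python
# Sketch of the second drawing processing (steps S632-S642); dt = T / n is
# an assumption, and draw_line stands in for the touch panel drawing call.
def second_drawing(apexes, T, draw_line):
    # apexes: (Cx, Cy) pairs; the i-th segment would be drawn at time t + dt*i
    n = len(apexes)
    dt = T / n                               # assumed definition of dt
    for i in range(1, n):                    # steps S636-S642 without real waits
        draw_line(apexes[i - 1], apexes[i])  # step S638
    return dt

segs = []
dt = second_drawing([(0, 0), (5, 5), (9, 2)], 300,
                    lambda p, q: segs.append((p, q)))
```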
The relationship between the input and output of a hand-drawing image according to the present modification will be described hereinafter.
As mentioned above, CPU 106 of the communication terminal having a hand-drawing image input (first communication terminal) generates transmission data every time a hand-drawing image is input (from pen-down to pen-up), or when a clear instruction is input in the present modification.
Referring to
<Second Modification of Hand-Drawing Image Display Processing at Mobile Phone 100>
A second modification of the hand-drawing image display processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
CPU 106 obtains a time Ti from the start of reproduction of the motion picture contents up to the change of scene immediately previous to the scene corresponding to timing information “time” (step S556). In other words, the scene corresponding to timing information “time” is identified, and the length Ti from the start of reproduction of the motion picture contents until the ending point of the immediately previous scene is obtained. CPU 106 obtains a reproducing time t of the motion picture contents (the period from the point of time when reproduction of the motion picture contents is started up to the current time) (step S558).
CPU 106 determines whether Ti=t is established or not (step S560). When Ti=t is not established (NO at step S560), CPU 106 repeats the processing from step S558.
When Ti=t is established (YES at step S560), CPU 106 obtains the coordinates of the apexes of the hand-drawing stroke (data (c)) (step S562). CPU 106 obtains the count “n” of apex coordinates of the hand-drawing stroke (step S564).
CPU 106 executes the third drawing processing (step S650). The third drawing processing (step S650) will be described afterwards. Then, CPU 106 ends the hand-drawing image display processing.
(Third Drawing Processing at Mobile Phone 100)
The third drawing processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
CPU 106 enters 1 to variable i (step S654). CPU 106 determines whether a time dt×i has elapsed from the reproducing time (time t) (step S656). When time dt×i has not elapsed from time t (NO at step S656), CPU 106 repeats the processing from step S656.
When time dt×i has elapsed from time t (YES at step S656), CPU 106 uses touch panel 102 to draw a hand-drawing stroke by connecting coordinates (Cxi, Cyi) and coordinates (Cx (i+1), Cy (i+1)) by a line (step S658). CPU 106 increments variable i (step S660).
CPU 106 determines whether variable i is greater than or equal to the count n (step S662). When variable i is less than n (NO at step S662), CPU 106 repeats the processing from step S656. When variable i is greater than or equal to the count n (YES at step S662), CPU 106 ends the third drawing processing.
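The timing computed at steps S556-S560 amounts to finding the start of the scene that contains timing information "time", i.e. the end of the immediately previous scene. A sketch, assuming scene boundaries are available as start times in ms:

```python
# Sketch of the Ti computation: drawing begins at the start of the scene that
# contains the hand-drawing timing, rather than at the original input time.
def drawing_start_time(scene_starts, time_ms):
    # scene_starts: ascending start times of each scene; Ti is the start of
    # the scene containing time_ms (= end of the immediately previous scene)
    Ti = 0
    for s in scene_starts:
        if s <= time_ms:
            Ti = s
        else:
            break
    return Ti

start = drawing_start_time([0, 3000, 8000, 15000], 9500)
```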
The relationship between the input and output of a hand-drawing image according to the present embodiment will be described hereinafter.
As mentioned above, CPU 106 of the communication terminal having a hand-drawing image input (first communication terminal) generates transmission data every time a hand-drawing image is input (from pen-down to pen-up), or when a clear instruction is input.
Referring to
Even if the user of the transmission side enters a hand-drawing image spanning a plurality of scenes, the communication terminal of the recipient side can complete drawing the hand-drawing image well within the scene intended by the user of the transmission side. In other words, the communication terminal of the recipient side begins to draw the hand-drawing image at a timing earlier than the point of time when input of the hand-drawing image is started at the communication terminal of the transmission side, i.e. from the starting point of the scene to which the point of time when input of the hand-drawing image is started belongs.
A second embodiment of the present invention will be described hereinafter. Network system 1 according to the first embodiment set forth above reproduces the motion picture contents at different timings at the respective communication terminals (first mobile phone 100A, second mobile phone 100B, third mobile phone 100C, and fourth mobile phone 100D). In contrast, network system 1 of the present embodiment effectively conveys the intention of a user transmitting (entering) information to the user receiving (viewing) the information by having each communication terminal start reproducing the motion picture contents at the same time.
Elements similar to those of network system 1 of the first embodiment have the same reference number allotted. Their functions are also identical. Therefore, description of such constituent elements will not be repeated. For example, the overall configuration of network system 1, the overall operation overview of network system 1, the hardware configuration of mobile phone 100, chat server 400, and contents server 600, and the like are similar to those of the first embodiment. Therefore, description thereof will not be repeated.
<Communication Processing at Mobile Phone 100>
P2P communication processing at mobile phone 100 of the present embodiment will be described hereinafter.
The following description is based on the case where first mobile phone 100A transmits a hand-drawing image to second mobile phone 100B. In the present embodiment, first mobile phone 100A and second mobile phone 100B transmit/receive data via chat server 400. However, data may be transmitted/received through P2P communication without the intervention of chat server 400. In this case, first mobile phone 100A must store data or transmit data to second mobile phone 100B or third mobile phone 100C, on behalf of chat server 400.
Referring to
As used herein “data associated with chat communication” includes the chat room ID, member's terminal information, notification (notice information), the chat contents up to the present time, and the like.
CPU 106 of first mobile phone 100A causes touch panel 102 to display a window for chat communication (step S706). Similarly, CPU 106 of second mobile phone 100B causes touch panel 102 to display a window for chat communication (step S708).
CPU 106 of first mobile phone 100A receives motion picture contents via communication device 101 based on a contents reproduction instruction from the user (step S710). More specifically, CPU 106 receives an instruction to specify motion picture contents from the user via touch panel 102. The user may directly enter URL at first mobile phone 100A, or select a link corresponding to the desired motion picture contents on the currently displayed Web page.
CPU 106 of first mobile phone 100A uses communication device 101 to transmit motion picture information (a) for identifying selected motion picture contents to another communication terminal participating in the chat via chat server 400 (step S712). As shown in
CPU 106 of second mobile phone 100B receives motion picture information (a) from chat server 400 via communication device 101 (step S714). CPU 106 analyzes the motion picture information (step S716), and downloads the motion picture contents from contents server 600 (step S718).
CPU 106 transmits, via communication device 101, a message to first mobile phone 100A indicating that preparation for reproducing the motion picture contents has been completed (step S720). CPU 106 of first mobile phone 100A receives that message from second mobile phone 100B via communication device 101 (step S722).
CPU 106 of first mobile phone 100A begins to reproduce the received motion picture contents via touch panel 102 (step S724). CPU 106 may output the sound of motion picture contents via speaker 109. Similarly, CPU 106 of second mobile phone 100B begins to reproduce the received motion picture contents via touch panel 102 (step S726). At this stage, CPU 106 may have the sound of the motion picture contents output via speaker 109.
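The synchronized start described in steps S712-S726 can be sketched sequentially as follows; the message list stands in for the chat-server message path, and the function names are assumptions.

```python
# Sketch of the synchronized start: second mobile phone 100B reports that
# preparation is complete (step S720), and first mobile phone 100A begins
# reproduction only after receiving that message (steps S722-S724).
messages = []

def second_phone_prepare():
    # steps S714-S718: analyze motion picture information, download contents
    messages.append("preparation complete")   # step S720

def first_phone_start():
    if "preparation complete" in messages:    # step S722
        return "reproducing"                  # step S724
    return "waiting"

before = first_phone_start()   # no message received yet
second_phone_prepare()
after = first_phone_start()
```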
It is here assumed that CPU 106 accepts hand-drawing input by a user via touch panel 102 during reproduction of the motion picture contents at first mobile phone 100A (step S728).
More specifically, CPU 106 obtains change in the touching position on touch panel 102 (trace) by sequentially accepting touch coordinate data from touch panel 102 at every predetermined time. At this stage, i.e. at step S728, CPU 106 causes display of the input hand-drawing image on the motion picture contents (overlapping on the motion picture contents) at touch panel 102. CPU 106 causes display of a hand-drawing image at touch panel 102 according to input of the hand-drawing image.
Then, as shown in
CPU 106 of first mobile phone 100A uses communication device 101 to transmit transmission data to second mobile phone 100B via chat server 400 (step S732). CPU 106 of second mobile phone 100B receives the transmission data from first mobile phone 100A via communication device 101 (step S734).
CPU 106 of second mobile phone 100B analyzes the transmission data (step S736). CPU 106 of second mobile phone 100B causes display of a hand-drawing image at touch panel 102 based on the analyzed result (step S738).
Every time a scene in the motion picture contents is changed, the hand-drawing image input up to that time will be cleared at first mobile phone 100A of the present embodiment. CPU 106 may transmit clear information (true) using communication device 101 at the change of a scene. CPU 106 of second mobile phone 100B may eliminate the hand-drawing image based on clear information from first mobile phone 100A. Alternatively, CPU 106 may determine by itself that the scene has been changed, and eliminate the hand-drawing image.
CPU 106 of first mobile phone 100A repeats the processing from step S728 to step S732 every time input of hand-drawing is accepted. By contrast, CPU 106 of second mobile phone 100B repeats the processing from step S734 to step S738 every time transmission data is received.
CPU 106 of first mobile phone 100A ends the reproduction of the motion picture contents (step S740). CPU 106 of second mobile phone 100B ends the reproduction of the motion picture contents (step S742).
Accordingly, the hand-drawing image is drawn at second mobile phone 100B, at a timing identical to that in the motion picture contents having the hand-drawing image input at first mobile phone 100A. In other words, at second mobile phone 100B, the desired information is drawn at the scene intended by the user of first mobile phone 100A.
<Input Processing at Mobile Phone 100>
The input processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When the pen information setting process (step S300) ends, CPU 106 determines whether data (b) is true or not (step S802). When data (b) is true (YES at step S802), CPU 106 stores data (b) in memory 103 (step S804). CPU 106 ends the input processing.
When data (b) is not true (NO at step S802), CPU 106 determines whether stylus pen 120 has touched touch panel 102 or not (step S806). In other words, CPU 106 determines whether pen-down has been detected or not.
When pen-down is not detected (NO at step S806), CPU 106 determines whether the touching position of stylus pen 120 against touch panel 102 has changed or not (step S808). In other words, CPU 106 determines whether pen-dragging has been detected or not. When pen-dragging has not been detected (NO at step S808), CPU 106 ends the input processing.
When CPU 106 detects pen-down (YES at step S806), or pen-dragging (YES at step S808), CPU 106 sets data (b) at “false” (step S810). CPU 106 executes the hand-drawing processing (step S900). The hand-drawing process (step S900) will be described afterwards.
When the hand-drawing processing (step S900) ends, CPU 106 stores data (b), (c), (d), and (e) in memory 103 (step S812). CPU 106 ends the input processing.
(Hand-Drawing Processing at Mobile Phone 100)
The hand-drawing processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
CPU 106 determines whether a predetermined time has elapsed since the previous coordinates were obtained (step S906). When the predetermined time has not elapsed (NO at step S906), CPU 106 repeats the processing from step S906.
When the predetermined time has elapsed (YES at step S906), CPU 106 determines whether pen-dragging has been detected or not via touch panel 102 (step S908). When pen-dragging has not been detected (NO at step S908), CPU 106 determines whether pen-up has been detected or not via touch panel 102 (step S910). When pen-up has not been detected (NO at step S910), CPU 106 repeats the processing from step S906.
When pen-dragging has been detected (YES at step S908) or when pen-up has been detected (YES at step S910), CPU 106 obtains via touch panel 102 the touching position coordinates (X, Y) of stylus pen 120 on touch panel 102 (step S912). CPU 106 adds “: X, Y” to data (c) (step S914). CPU 106 ends the hand-drawing processing.
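The coordinate-sampling loop of steps S906 through S914 can be sketched as follows. For illustration only, the real-time wait of step S906 is replaced by a list of events, one per elapsed sampling interval; the event tuples are an assumption of this sketch.

```python
def hand_drawing(events, data_c: str) -> str:
    """events: sequence of ('drag'|'up', x, y) tuples, one per
    'predetermined time' interval of step S906."""
    for kind, x, y in events:
        # steps S908 / S910: on pen-drag or pen-up, obtain coordinates (X, Y)
        data_c += f":{x},{y}"      # step S914: append ":X,Y" to data (c)
        if kind == "up":
            break                  # pen-up ends the stroke
    return data_c

print(hand_drawing([("drag", 10, 20), ("drag", 12, 22), ("up", 15, 25)], "120:200"))
```

Sampling only after a fixed interval keeps data (c) compact: intermediate touch positions between intervals are simply not recorded.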
<Display Processing at Mobile Phone 100>
Display processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When the reproduction of the motion picture contents has not ended (NO at step S1002), CPU 106 obtains clear information “clear” (data (b)) (step S1004). CPU 106 determines whether clear information “clear” is “true” or not (step S1006). When clear information “clear” is “true” (YES at step S1006), CPU 106 sets the hand-drawing image at “not-display” (step S1008). CPU 106 ends the display processing.
When clear information “clear” is not “true” (NO at step S1006), CPU 106 obtains the color of the pen (data (d)) (step S1010). CPU 106 resets the color of the pen (step S1012). CPU 106 obtains the width of the pen (data (e)) (step S1014). CPU 106 resets the width of the pen (step S1016). Then, CPU 106 executes the hand-drawing image display processing (step S1100). The hand-drawing image display processing (step S1100) will be described afterwards. CPU 106 ends the display processing.
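The display branch of steps S1004 through S1100 can be sketched as below. The `pen` dictionary and `render` callback are hypothetical stand-ins for the pen settings and the hand-drawing image display processing of step S1100.

```python
def display_processing(data: dict, pen: dict, render) -> bool:
    """Returns True when the hand-drawing image is displayed."""
    if data["b"]:                  # steps S1004-S1006: clear information is "true"
        return False               # step S1008: set image to "not-display"
    pen["color"] = data["d"]       # steps S1010-S1012: reset the pen color
    pen["width"] = data["e"]       # steps S1014-S1016: reset the pen width
    render(pen)                    # step S1100: hand-drawing image display processing
    return True

pen = {}
shown = display_processing({"b": False, "d": "blue", "e": 3}, pen, lambda p: None)
print(shown, pen)
```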
<Exemplary Application of Display Processing at Mobile Phone 100>
An exemplary application of display processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
When reproduction of the motion picture contents has not ended (NO at step S1052), CPU 106 determines whether the scene of motion picture contents has changed or not (step S1054). When the scene of the motion picture contents has not changed (NO at step S1054), CPU 106 executes the processing from step S1058.
When the scene of the motion picture contents has been changed (YES at step S1054), CPU 106 sets the hand-drawing image that has been displayed up to that time at “not-display” (step S1056). CPU 106 obtains clear information “clear” (data (b)) (step S1058). CPU 106 determines whether clear information “clear” is “true” or not (step S1060). When clear information “clear” is “true” (YES at step S1060), CPU 106 sets the hand-drawing image that has been displayed up to that time at “not-display” (step S1062). CPU 106 ends the display processing.
When clear information “clear” is not “true” (NO at step S1060), CPU 106 obtains the color of the pen (data (d)) (step S1064). CPU 106 resets the color of the pen (step S1066). CPU 106 obtains the width of the pen (data (e)) (step S1068). CPU 106 resets the width of the pen (step S1070). Then, CPU 106 executes the hand-drawing image display processing (step S1100). The hand-drawing image display processing (step S1100) will be described afterwards. CPU 106 ends the display processing.
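The scene-change variant (steps S1054 through S1070) differs from the ordinary display processing only in that a scene change first hides the image displayed so far. A minimal sketch, with hypothetical `hide` and `render` callbacks standing in for the display operations:

```python
def display_with_scene_change(data, pen, scene_changed, render, hide) -> bool:
    if scene_changed:              # step S1054: the scene has changed
        hide()                     # step S1056: hide the image shown so far
    if data["b"]:                  # steps S1058-S1060: clear information is "true"
        hide()                     # step S1062: set image to "not-display"
        return False
    pen["color"] = data["d"]       # steps S1064-S1066: reset the pen color
    pen["width"] = data["e"]       # steps S1068-S1070: reset the pen width
    render(pen)                    # step S1100: redraw with the new pen settings
    return True

calls = []
res = display_with_scene_change(
    {"b": False, "d": "green", "e": 1}, {}, True,
    render=lambda p: calls.append("render"),
    hide=lambda: calls.append("hide"),
)
print(res, calls)
```

Hiding at every scene boundary ensures that a drawing intended for one scene never lingers over the next.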
<Hand-Drawing Image Display Processing at Mobile Phone 100>
A hand-drawing image display processing at mobile phone 100 according to the present embodiment will be described hereinafter.
Referring to
<Another Application of Network System>
The present invention can also be achieved by supplying a program to a system or device. The advantage of the present invention can also be obtained by supplying, to a system or device, a storage medium storing the program code of software that implements the functions of the embodiments set forth above, and having a computer (or CPU or MPU) of that system or device read out and execute the program codes stored in the storage medium.
In this case, the program codes per se read out from the storage medium will implement the functions of the embodiments set forth above, and the storage medium storing the program codes will constitute the present invention.
As a storage medium for supplying the program codes, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, magnetic tape, a non-volatile memory card (IC memory card), or a ROM (mask ROM, flash EEPROM, and the like), for example, may be used.
In addition to realizing the functions of the embodiments set forth above by a computer executing the read-out program codes, the functions of the embodiments described above may be realized by an OS (Operating System) running on the computer performing a part of or all of the actual processing based on the instructions of the program codes.
Further, the program codes read out from a storage medium may be written to a memory included in a functionality expansion board inserted into a computer or a functionality expansion unit connected to a computer. Then, the functions of the embodiments described above may be realized by a CPU or the like provided on the functionality expansion board or functionality expansion unit performing a part of or all of the actual processing based on the instructions of the program codes.
It is to be understood that the embodiments disclosed herein are only by way of example, and not to be taken by way of limitation. The scope of the present invention is not limited by the description above, but rather by the terms of the appended claims, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.
1 network system; 100, 100A, 100B, 100C, 100D mobile phone; 101 communication device; 102 touch panel; 103 memory; 103A work memory; 103B address book data; 103C self-terminal data; 103D address data; 103E address data; 104 pen tablet; 106 CPU; 107 display; 108 microphone; 109 speaker; 110 various-type button; 111 first notification unit; 112 second notification unit; 113 TV antenna; 120 stylus pen; 200 car navigation device; 250 vehicle; 300 personal computer; 400 chat server; 406 memory; 406A room management table; 407 hard disk; 408 internal bus; 409 communication device; 500 Internet; 600 contents server; 606 memory; 607 hard disk; 608 internal bus; 609 communication device; 615 hard disk; 700 carrier network.
Number | Date | Country | Kind
---|---|---|---
2010-077782 | Mar 2010 | JP | national
Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/JP2011/055382 | 3/8/2011 | WO | 00 | 9/28/2012