The present invention relates to an information processing system and a server that support a remote desktop, control methods therefor, and storage media storing control programs therefor.
In a remote desktop, a screen displayed on a display unit changes according to a user's operation, but the latency caused by communication delay may be relatively large. In order to suppress this latency, there is a known remote desktop system in which an image of a wider range than the screen size is stored (cached) and a part of the image is extracted from a storage device to display a screen during a scroll operation (for example, see Japanese Patent Laid-Open Publication No. 2016-110335 (JP2016-110335A)).
However, since the remote desktop system disclosed in JP2016-110335A is configured to store an image wider than the screen size, it is difficult for it to cope with a screen having a header container that remains on the display even when a scroll operation is performed. Moreover, when a scroll operation is performed, it is difficult for the user to quickly recognize the moving amount caused by the operation, or to feel that the image has moved smoothly.
The present invention provides an information processing system, a server, control methods therefor, and storage media storing control programs therefor, which enable a user who performs a moving operation for moving an image to quickly grasp the moving amount caused by the moving operation and to feel that the image has moved smoothly.
An information processing system includes an information processing apparatus and a server communicably connected to the information processing apparatus. The information processing apparatus includes a display device that displays image data obtained from the server as an image, an operation device that accepts a moving operation by a user for moving the image displayed on the display device, an apparatus-side memory device that stores a set of instructions, and at least one apparatus-side processor that executes the set of instructions to transmit operation information regarding a moving amount caused by the moving operation to the server after the moving operation is ended, display the image before the moving operation as a first image on the display device, and display a second image in a case where the moving operation is applied to the first image. The second image includes a moved image in a state where the first image is moved by the moving operation and a blank image that becomes blank in accordance with the movement of the first image. The server includes a server-side memory device that stores a set of instructions, and at least one server-side processor that executes the set of instructions to receive the operation information from the information processing apparatus, generate, on the basis of the received operation information, image data of a third image including an image corresponding to the moved image and an image that is continuously connected to the moved image and fills the blank image, and transmit the image data of the third image to the information processing apparatus. The at least one apparatus-side processor executes the set of instructions to display the third image based on the image data received from the server on the display device in place of the second image.
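The image transition described above (the first image, then a second image consisting of a moved portion and a blank portion, then a third image supplied by the server) can be sketched as follows. This is a minimal illustration using rows of characters in place of real image data; the function names are assumptions for illustration, not identifiers from the embodiment.

```python
def make_second_image(first_image, dy, blank=" "):
    """Shift the first image up by dy rows; vacated rows become blank.

    first_image is a list of equal-length row strings. The result is the
    second image: the moved image followed by the blank image.
    """
    height = len(first_image)
    moved = first_image[dy:]                          # rows still visible
    blanks = [blank * len(first_image[0])] * (height - len(moved))
    return moved + blanks

def apply_third_image(second_image, fill_rows):
    """Replace the blank region with server-rendered rows that continue
    the moved image, producing the third image displayed in its place."""
    height = len(second_image)
    moved = [row for row in second_image if row.strip()]
    return (moved + fill_rows)[:height]
```

For example, scrolling a four-row image up by two rows leaves two blank rows, which the third image then fills with the continuation rendered by the server.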
According to the present invention, when a user performs a moving operation for moving an image, it is possible to quickly grasp the moving amount caused by the moving operation and to feel that the image has moved smoothly.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereafter, embodiments according to the present invention will be described in detail with reference to the drawings. However, the configurations described in the following embodiments are merely examples, and the scope of the present invention is not limited by them. For example, each of the sections constituting the present invention can be replaced with any configuration that can exhibit the same function. Further, any component may be added. In addition, any two or more configurations (features) of the embodiments may be combined.
A first embodiment will now be described with reference to
In this embodiment, the cloud browser system 1000 has three image forming apparatuses 101, 102, and 103, but the number of image forming apparatuses arranged is not limited to this and may be, for example, one, two, or four or more. Hereinafter, the image forming apparatus 101 will be described as representative of the image forming apparatuses 101, 102, and 103. The cloud browser system 1000 includes one image generation system 100, but the number of image generation systems 100 to be arranged is not limited to this and may be, for example, two or more. The number of image generation systems 100 to be arranged is preferably smaller than the number of image forming apparatuses to be arranged. In the cloud browser system 1000, the proxy server 104 may be omitted. For example, suppose that a user using the image forming apparatus 101 is viewing a screen displayed on the image forming apparatus 101. In this case, the user operates the image forming apparatus 101 to input a request for browsing the external site 105 to be browsed. In response to this, the image forming apparatus 101 transmits an address (URL) of the external site 105 to the image generation system 100 via the proxy server 104. The external site 105 is an HTTP server that returns an HTTP (Hypertext Transfer Protocol) response to an HTTP request, but its configuration is not particularly limited.
The image generation system 100 is a server having a gateway 106, a virtual machine 107, and a storage device 109. A browser engine, which is a software module, operates on the virtual machine 107. The browser engine receives the URL of the external site 105 transmitted from the image forming apparatus 101 via the gateway 106. Then, the browser engine accesses the external site 105 corresponding to the URL via the gateway 106 and receives Web content, such as HTML, from the external site 105. Thereafter, a rendering result of the Web content is generated by a software module that performs rendering. The rendering result is transmitted to the image forming apparatus 101 via the gateway 106. The image forming apparatus 101 receives the rendering result transmitted from the image generation system 100 and displays the rendering result. This allows a user to check the content of the external site 105. Here, “rendering” generally means displaying a browser screen (an image of a web site supplied from the external site 105) on the basis of Web content. In this embodiment, “rendering” indicates that the Web content, such as HTML, received from the external site 105 is converted into image data of a predetermined screen size.
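The server-side rendering described above (obtain Web content for a URL, then convert it into image data of a predetermined screen size) can be sketched as follows. This is a hedged illustration, not the actual browser engine: `http_get` is injected so the sketch stays self-contained, the "image data" is simulated as a plain dictionary, and both function names are assumptions.

```python
def fetch_web_content(url, http_get):
    """Obtain Web content (e.g. HTML) from the external site at `url`.
    `http_get` stands in for the HTTP client of the browser engine."""
    return http_get(url)

def render_to_image(web_content, screen_size):
    """Convert Web content into image data of a predetermined screen
    size. The returned dict merely simulates the rendering result."""
    width, height = screen_size
    return {"content": web_content, "width": width, "height": height}
```

In the embodiment the screen size would come from the device information of the image forming apparatus, so the rendered image fits its operation device.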
The storage device 109 stores user data, such as a cookie, obtained when the browser engine operating on the virtual machine 107 interprets the Web content received from the external site 105. The cloud browser system 1000 is configured to be communicable with the authentication site 110. Users using the image forming apparatuses 101, 102, and 103 can be identified by the authentication site 110. The image generation system 100 obtains authentication of a user on the basis of user information transmitted from any of the image forming apparatuses 101, 102, and 103. The image generation system 100 manages the user data stored in the storage device 109 for each authenticated user. In this embodiment, the image generation system 100 includes one virtual machine 107, but the number of virtual machines to be arranged is not limited thereto, and a plurality of virtual machines may be included. In this case, it is preferable to arrange a load balancer between the gateway 106 and each virtual machine 107, and to select one virtual machine 107 from among the plurality of virtual machines 107 in accordance with a load state. The storage device 109 is preferably shared among the plurality of virtual machines 107.
The hardware configuration of the virtual machine 107 is not limited to the configuration shown in
The whole control module 251 collectively receives various requests transmitted from the image forming apparatus 101. The whole control module 251 appropriately distributes the requests received from the image forming apparatus 101 to the browser engine 252 and the rendering control module 253. The browser engine 252 accesses the external site 105 via the network control module 254 on the basis of the URL transmitted from the whole control module 251. Thus, the Web content of the external site 105 can be obtained. The browser engine 252 includes the function of an HTTP client required to obtain Web content. The browser engine 252 can analyze the Web content, appropriately store information such as a cookie in the storage device 109, and refer to information such as a cookie stored in the storage device 109 in advance. When the analysis of the Web content is completed, the browser engine 252 requests the rendering control module 253 to perform a rendering process. The rendering control module 253 renders the Web content on the basis of the rendering request from the browser engine 252. The rendering control module 253 transmits a rendering result to the image forming apparatus 101, which is the request source of the URL, via the network control module 254. The network control module 254 is a module that relays communication between the browser processing module 250 and an external device such as the external site 105 or the authentication site 110 by controlling the communication interface 205. The network control module 254 includes a protocol stack of HTTP communication required for access to the external site 105.
As described above, in the cloud browser system 1000, the user can input the browsing request for the external site 105 that the user wants to browse by operating the image forming apparatus 101. As a result, the URL of the external site 105 is transmitted from the image forming apparatus 101 to the image generation system 100 via the proxy server 104. The whole control module 251 receives the URL of the external site 105 and transfers it to the browser engine 252. This allows the browser engine 252 to receive the URL. Then, the browser engine 252 requests the network control module 254 to access the external site 105 indicated by the URL in order to obtain the Web content indicated by the URL. When receiving the access request, the network control module 254 controls the communication interface 205 to access the external site 105 via the gateway 106 and the Internet 111. The network control module 254 obtains the Web content from the external site 105 and transfers it to the browser engine 252.
The browser engine 252 transmits and receives the user data such as a cookie to and from the storage device 109 as necessary on the basis of the Web content transferred from the network control module 254, and analyzes the Web content. The browser engine 252 requests the rendering control module 253 to render the Web content in accordance with the analysis of the Web content. When receiving the request, the rendering control module 253 executes the rendering of the Web content. When the rendering is completed, the rendering control module 253 notifies the whole control module 251 of the completion. The whole control module 251 notifies the image forming apparatus 101, which is the URL request source, of the completion of the rendering via the network control module 254. When receiving the completion of the rendering, the image forming apparatus 101 requests to obtain the rendering result from the image generation system 100. This request is communicated to the whole control module 251 via the network control module 254 and transferred by the whole control module 251 to the rendering control module 253. Then, when receiving the request to obtain the rendering result, the rendering control module 253 transmits the rendering result to the image forming apparatus 101, which is the request source, via the network control module 254. This allows the image forming apparatus 101 to display the rendering result. The user can browse the content of the desired external site 105 on the image forming apparatus 101.
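The round trip above (URL request, rendering, completion notification, and retrieval of the rendering result) can be sketched as follows. The class and function names are illustrative assumptions, and the renderer is simulated by an injected callable; this is a sketch of the exchange, not the actual modules.

```python
class ImageGenerationSystem:
    """Simulated server side: renders on request, stores the result,
    and hands it out when the client asks for it."""

    def __init__(self, renderer):
        self._renderer = renderer
        self._results = {}          # rendering results keyed by URL

    def request_render(self, url):
        """Render the content at `url` and return a completion notice."""
        self._results[url] = self._renderer(url)
        return {"status": "rendering_complete", "url": url}

    def obtain_result(self, url):
        """Return the stored rendering result for `url`."""
        return self._results[url]

def browse(url, system):
    """Client side: request rendering, check the completion notice,
    then request and return the rendering result."""
    notice = system.request_render(url)
    if notice["status"] != "rendering_complete":
        raise RuntimeError("rendering did not complete")
    return system.obtain_result(url)
```

The two-phase exchange (notify completion first, fetch the result second) matches the sequence in the text, where the completion notification and the rendering result travel as separate messages.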
The controller unit 300 controls operations of the operation device 312, the scanner 370, and the printer 395 (a control unit). For example, the controller unit 300 achieves a print function (a copy function) by controlling the scanner 370, which is an image input device, to read image data and controlling the printer 395, which is an image output device, to print out the image data. The CPU 301 starts an operating system (OS) 351 (see
In this embodiment, the operation device 312 is implemented as a touch panel having a touch function. Thus, the operation device 312 can display image data from the image generation system 100 as an image (a first, second, or third image display step), for example. The touch panel may be a pressure-sensitive touch panel or a capacitive touch panel. On the operation device 312, the user can perform a moving operation of moving an image displayed on the touch panel by dragging (scrolling) on the touch panel, that is, by sliding a fingertip on the touch panel (an operation reception step). As described above, the operation device 312 in this embodiment has the function of the display unit that displays an image and the function of the operation device that achieves an image moving operation through the touch function of the touch panel. In the image forming apparatus 101, a part having the function of the display unit and a part having the function of the operation device may be provided separately. The operation-unit I/F 306 is an interface that is in charge of the connection with the operation device 312, and outputs the image data to the operation device 312. Thus, the operation device 312 can display the image of the image data. The operation-unit I/F 306 transmits input information (operation information) input by the user via the operation device 312 to the CPU 301.
The network I/F 310 is an interface for connecting the image forming apparatus 101 to a LAN. Accordingly, the network I/F 310 can transmit, for example, input information to the image generation system 100 (an apparatus-side transmission step). In addition, the network I/F 310 can transmit the image data obtained by the scanner 370 to an external apparatus. When the transmission of the image data is completed, the network I/F 310 can notify the image generation system 100 of transmission completion information indicating the completion. The network I/F 310 can also receive various kinds of information from the image generation system 100. In this way, the network I/F 310 in this embodiment has the function of the reception unit (apparatus-side reception unit) that receives various kinds of information and the function of the transmission unit (apparatus-side transmission unit) that transmits various kinds of information. In the image forming apparatus 101, a part functioning as the reception unit and a part functioning as the transmission unit may be provided separately. The USB host I/F 313 is an interface for communicating with a USB storage device 314. The USB host I/F 313 can transmit data stored in the USB storage device 314 to the CPU 301. The USB host I/F 313 also functions as an output unit that stores the data stored in the storage device 304 in the USB storage device 314. The USB storage device 314 is an external storage device that stores data and is detachable from the USB host I/F 313. A plurality of USB storage devices 314 can be connected to the USB host I/F 313. The RTC 315 keeps the current time. The time information kept by the RTC 315 is used for recording job input times and the like.
The image bus I/F 305 is a bus bridge for connecting the system bus 307 and the image bus 308 that transfers the image data at high speed, and converting the data format. The image bus 308 is configured by a PCI bus or the like. The scanner 370 and the printer 395 are connected to the device I/F 320. The device I/F 320 performs conversion between synchronous and asynchronous systems of the image data. The scanner image processor 380 corrects, processes, or edits the input image data, for example. The printer image processor 390 corrects print output image data or converts its resolution depending on the printer 395. As described above, the image forming apparatus 101 is an apparatus having the print function and the scan function in this embodiment, but this is not limiting, and the image forming apparatus 101 may be an apparatus having at least one of the print function, the scan function, and a facsimile function.
The browser control module 360 is a sub-module contained in the OS 351. When receiving a notification of a user operation from the UI control module 352, the browser operation module 362 notifies the command I/F module 364 or the proxy processing module 365 of the content of the user operation. When receiving the notification from the browser operation module 362, the proxy processing module 365 requests to obtain proxy setting information from the storage control module 355. When it is determined that the proxy setting is valid on the basis of the proxy setting information obtained from the storage control module 355, the proxy processing module 365 transmits a communication request to the proxy server 104 via the network control module 354. The proxy processing module 365 receives a response to the communication request via the network control module 354. The proxy processing module 365 notifies the browser display module 363 or the command I/F module 364 of the result of processing the content of the response.
When receiving the notifications from the browser operation module 362 and the proxy processing module 365, the command I/F module 364 requests communication with the image generation system 100 via the network control module 354. The communication request at this time may include the information notified from the browser operation module 362 and the proxy processing module 365. This information also includes the information about user operations, such as text input, button press, drag, and zoom. The information about the text input includes, for example, a URL. The information about the button press includes the coordinate of the pressed position on the operation device 312. The information about the drag and zoom includes a character string associated with each operation. The command I/F module 364 receives communication from the image generation system 100 via the network control module 354. The command I/F module 364 processes the communication content from the image generation system 100 and notifies the image data obtaining module 361 or the browser display module 363 of the communication content. The image data obtaining module 361 receives a completion notification of the rendering result from the command I/F module 364. When receiving the completion notification, the image data obtaining module 361 requests to obtain an image from the image generation system 100. The image obtaining request is transmitted to the rendering control module 253 operating on the virtual machine 107 of the image generation system 100. The rendering result is then sent to the image data obtaining module 361. When receiving the rendering result, the image data obtaining module 361 transfers the image as the rendering result to the browser display module 363. The browser display module 363 receives the image from the image data obtaining module 361 and instructs the UI control module 352 to draw the image.
In addition, the browser display module 363 receives the notifications from the command I/F module 364 and the proxy processing module 365, and instructs the UI control module 352 to display a message corresponding to the notifications.
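The operation information described above (text input carrying a URL, a button press carrying a coordinate, drag and zoom carrying an operation character string) might be packaged as follows; the dictionary layout and field names are assumptions for illustration only, not a format defined in the embodiment.

```python
def build_operation_info(kind, **details):
    """Package a user operation for transmission to the image
    generation system, per operation kind."""
    if kind == "text_input":
        # text input carries, for example, a URL string
        return {"kind": kind, "url": details["url"]}
    if kind == "button_press":
        # a button press carries the coordinate of the pressed position
        return {"kind": kind, "coordinate": details["coordinate"]}
    if kind in ("drag", "zoom"):
        # drag and zoom carry a character string naming the operation
        return {"kind": kind, "operation": details["operation"]}
    raise ValueError(f"unknown operation kind: {kind}")
```

Keeping the per-kind payloads distinct mirrors the text: each operation type sends only the information the server needs to reproduce it.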
In a step S602, the UI control module 352 of the image forming apparatus 101 transmits an activation request to the browser control module 360. As a result, the browser control module 360 is activated, and the browser screen 500 (see
In a step S603a, the browser control module 360 of the image forming apparatus 101 transmits a rendering request to render a web page to the browser processing module 250 (the virtual machine 107) of the image generation system 100. The rendering request includes, for example, URL information indicating the location of the Web content to be displayed on the browser screen 500 and the device information of the image forming apparatus 101. The “URL information” is information about a URL set in advance as a setting value of the button 403 or information about a URL input in the address bar 503 on the browser screen 500. The “device information” is, for example, screen size information of the operation device 312 of the image forming apparatus 101 and screen type information for specifying the operation device 312, but this is not limiting.
In a step S603b, the browser control module 360 of the image forming apparatus 101 requests the UI control module 352 to display a load screen on the operation device 312, and the UI control module 352 renders the load screen. The load screen is a screen for notifying the user that the process is in progress until the browser screen 500 is displayed. The notification screen is not limited to the load screen, and may be, for example, a pop-up screen.
When receiving the rendering request in the step S603a, the browser processing module 250 checks the presence or absence of user agent information stored in the storage device 109 on the basis of the device information in the rendering request in a step S604.
When the user agent information exists in the storage device 109, the browser processing module 250 obtains the user agent information from the storage device 109 in a step S605.
In a step S606, the browser processing module 250 requests to obtain the Web content from the external site 105 on the basis of the URL information obtained in response to the rendering request in the step S603a and the user agent information obtained in the step S605.
In a step S607, the browser processing module 250 receives the Web content from the external site 105 as a response to the request to obtain the Web content in the step S606.
In a step S608, the browser processing module 250 renders the Web content received in the step S607.
In a step S609a, the browser processing module 250 stores the results of the rendering performed in the step S608 in the storage device 109.
In a step S609b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information indicating the location of the image data of the rendering result stored in the step S609a.
In a step S610, the browser control module 360 of the image forming apparatus 101 requests to obtain the image data from the storage device 109 of the image generation system 100 on the basis of the URL information notified in the step S609b. This request is performed using the URL information notified in the step S609b.
In a step S611, the browser control module 360 receives the image data from the storage device 109 as a result of the request to obtain the image data in the step S610.
In a step S612, the browser control module 360 instructs the UI control module 352 to display the image data received in the step S611 in the content area 505 on the browser screen 500. As a result, the browser screen 500 including the image of the content area 505 suitable for the size of the operation device 312, that is, the screen size of the touch panel is displayed on the operation device 312 of the image forming apparatus 101.
In a step S642, the browser processing module 250 receives the web content from the external site 105 as a response to the request to obtain a Web content in the step S641 as with the process in the step S607 in
In a step S643, the browser processing module 250 renders the web content as with the process in the step S608 in
In a step S644a, the browser processing module 250 stores (saves) the result of the rendering performed in the step S643 in the storage device 109 as with the process in the step S609a in
In a step S644b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information indicating the location of the image data of the rendering result stored in the step S644a as with the process in the step S609b in
In a step S645, the browser control module 360 of the image forming apparatus 101 requests to obtain the image data from the storage device 109 of the image generation system 100 as with the process in the step S610 in
In a step S646, the browser control module 360 receives the image data from the storage device 109 as a result of the image data obtaining request in the step S645 as with the process in the step S611 in
In a step S647, the browser control module 360 instructs the UI control module 352 to render the image data received in the step S646. Thus, the image of the image data is displayed (rendered) on the operation device 312.
In a step S648, the user performs tap-in (a tap-in operation) on the image displayed on the operation device 312 in the step S647. The “tap-in” is an operation in which the user touches or presses the touch panel with a fingertip or the like. Then, the operation device 312 transmits an electric signal corresponding to the position of the fingertip (tap-in) on the touch panel to the UI control module 352 (CPU 301). Thus, the UI control module 352 can specify the coordinate of the position of the fingertip on the basis of the electric signal.
In a step S649, the UI control module 352 of the image forming apparatus 101 transmits the information about the coordinate specified in the step S648 to the browser control module 360 as a tap-in event.
In the step S648, the user can also perform a drag by sliding the fingertip on the touch panel, for example, up, down, right, or left after the tap-in. In this case, the electric signal also varies in accordance with the change in the position of the fingertip on the touch panel. Thus, the UI control module 352 can specify the coordinate that varies in accordance with the movement of the fingertip on the basis of the electric signal at each predetermined time interval. Then, in the step S649, the UI control module 352 transmits the information about the coordinate that varies in accordance with the movement of the fingertip and the information about the moving amount of the fingertip to the browser control module 360 as a drag event.
In a step S650, the browser control module 360 stores the information transmitted from the UI control module 352 in the step S649 in the RAM 302.
In a step S651, the user performs a tap-out (tap-out operation) on the image displayed on the operation device 312 in the step S647. The “tap-out” refers to an operation of releasing the fingertip from the position where the user touches the touch panel by the tap-in or the drag. Then, an electric signal corresponding to a position where the fingertip is separated from the touch panel (tap-out) is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate of the position where the fingertip is separated on the basis of the electric signal.
In a step S652, the UI control module 352 of the image forming apparatus 101 transmits information about the coordinate specified in the step S651 to the browser control module 360 as a tap-out event. Note that an operation of performing a tap-in, then sliding the fingertip on the touch panel more quickly than a normal drag, and then performing a tap-out is referred to as a “flick”. Further, the flick operation may be divided into the drag and the tap-out. In this case, the process at the time of the drag and the process at the time of the tap-out are executed. This enables image movement (scrolling) by the flick. The UI control module 352 can transmit information about the coordinate of the flick to the browser control module 360 as a flick event, as with the tap-out event and the drag event.
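The tap-in/drag/tap-out sequence described above could be classified, for example, as follows. The movement and speed thresholds are illustrative assumptions, not values from the embodiment; a real implementation would tune them to the touch panel.

```python
def classify_touch(tap_in, tap_out, duration_ms,
                   move_threshold=10, flick_speed=1.0):
    """Classify a touch sequence as 'tap', 'drag', or 'flick'.

    tap_in/tap_out are (x, y) coordinates; duration_ms is the time
    between them. A small movement is a tap; a fast movement (pixels
    per millisecond at or above flick_speed) is a flick; otherwise
    the sequence is a drag.
    """
    dx = tap_out[0] - tap_in[0]
    dy = tap_out[1] - tap_in[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance < move_threshold:
        return "tap"                      # fingertip barely moved
    if duration_ms > 0 and distance / duration_ms >= flick_speed:
        return "flick"                    # faster than a normal drag
    return "drag"
```

This matches the text's distinction: a flick is the same gesture as a drag followed by a tap-out, only performed more quickly.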
The browser control module 360 determines whether the operation performed by the user in the steps S648 to S651 is an operation including a drag (flick). This determination will be described with reference to
In the step S692, the browser control module 360 determines that the operation performed by the user in the steps S648 to S651 is a drag operation.
In the step S693, the browser control module 360 determines that the operation performed by the user in the steps S648 to S651 is a tap operation, i.e., a simple press operation. Note that, here, as an example, it is assumed that the operation is determined to be a tap operation. The case where the operation is determined to include a drag will be described later with reference to
Then, in a step S653, the browser control module 360 transmits the start coordinate of the tap operation, that is, the information about the coordinate at the time of the tap-in, to the browser processing module 250 of the image generation system 100 as operation information.
In a step S654, the browser processing module 250 analyzes the presence or absence of a URL of a link destination at the coordinate, the presence or absence of an input field for text, and the like, on the basis of the coordinate information transmitted in the step S653. In this example, it is assumed that the URL of the link destination is present as an analysis result.
In a step S655, the browser processing module 250 requests to obtain, from the external site 105, the Web content at the URL of the link destination obtained as the analysis result in the step S654.
In a step S656, the browser processing module 250 receives the web content from the external site 105 as a response to the request to obtain the web content in the step S655.
In a step S657, the browser processing module 250 renders the web content received in the step S656.
In a step S658a, the browser processing module 250 stores the results of the rendering performed in the step S657 in the storage device 109.
In a step S658b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information indicating the location of the image data of the rendering result stored in the step S658a.
In a step S659, the browser control module 360 of the image forming apparatus 101 requests to obtain image data from the storage device 109 of the image generation system 100 on the basis of the URL information notified in the step S658b.
In a step S660, the browser control module 360 receives the image data from the storage device 109 as a result of the request to obtain the image data in the step S659.
In a step S661, the browser control module 360 instructs the UI control module 352 to render the image data received in the step S660. As a result, the image of the link destination analyzed in the step S654 is displayed on the operation device 312.
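The coordinate analysis in the step S654 (finding whether a link URL or an input field lies at the tapped position) can be sketched as a simple hit test. The page model of rectangular regions, and the function name, are assumptions for illustration; an actual browser engine would consult its layout of the rendered page.

```python
def analyze_coordinate(coordinate, regions):
    """Return the first region whose rectangle contains `coordinate`.

    Each region is (x, y, width, height, kind, value), where kind is
    e.g. 'link' or 'input' and value is e.g. the link-destination URL.
    Returns None when nothing lies at the coordinate.
    """
    cx, cy = coordinate
    for (x, y, w, h, kind, value) in regions:
        if x <= cx < x + w and y <= cy < y + h:
            return {"kind": kind, "value": value}
    return None
```

When the hit is a link, the returned URL would drive the content request of the step S655; when nothing is hit, no new rendering is needed.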
In a step S702, the browser processing module 250 receives the web content from the external site 105 as a response to the web content obtaining request in the step S701 as with the process in the step S607 in
In a step S703, the browser processing module 250 renders the web content as with the process in the step S608 in
In a step S704a, the browser processing module 250 stores the result of the rendering performed in the step S703 in the storage device 109 as with the process in the step S609a in
In a step S704b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information indicating the location of the image data of the rendering result stored in the step S704a as with the process in the step S609b in
In a step S705, the browser control module 360 of the image forming apparatus 101 requests to obtain the image data from the storage device 109 of the image generation system 100 as with the process in the step S610 in
In a step S706, the browser control module 360 receives the image data from the storage device 109 as a result of the image data obtaining request in the step S705 as with the process in the step S611 in
In a step S707, the browser control module 360 instructs the UI control module 352 to render the image data received in the step S706. As a result, the image of the image data is displayed on the operation device 312. For example, an image shown in
In a step S708, the user performs the tap-in operation on the image displayed on the operation device 312 in the step S707. Then, an electric signal corresponding to the tap-in is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate of the position of the fingertip on the basis of the electric signal.
In a step S709, the UI control module 352 of the image forming apparatus 101 transmits the information about the coordinate specified in the step S708 to the browser control module 360 as a tap-in event.
In a step S710, the browser control module 360 stores the information (coordinate of the tap-in position) transmitted from the UI control module 352 in the step S709 in the RAM 302.
In a step S711, the user performs a drag by sliding, on the touch panel, the fingertip that tapped in on the image in the step S708. Then, an electric signal that varies in accordance with the drag is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate that varies in accordance with the movement of the fingertip on the basis of the electric signal.
In a step S712, the UI control module 352 transmits the information about the coordinate that is specified in the step S711 and is varying with the movement of the fingertip to the browser control module 360 as a drag event.
In a step S713, the browser control module 360 calculates the coordinate of the start point and the coordinate of the end point in the drag operation on the basis of the information transmitted from the UI control module 352 in the step S712. The coordinate of the start point is the same as the coordinate of the tap-in position in the step S708. The calculation result by the browser control module 360 is stored in the RAM 302.
In a step S714, the browser control module 360 instructs the UI control module 352 to render the image data on the basis of the coordinate of the start point and the coordinate of the end point calculated in the step S713. Thus, a partial image of the image data is displayed on the operation device 312. For example, a partial image shown in
In a step S715, the user performs a tap-out of releasing the fingertip from the image on which the drag is performed in the step S711. Then, an electric signal corresponding to the position of the tap-out is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate of the position where the fingertip is separated on the basis of the electric signal. The coordinate of the position where the fingertip is separated is the same as the coordinate of the end point calculated in the step S713.
In a step S716, the UI control module 352 of the image forming apparatus 101 transmits the information about the coordinate specified in the step S715 to the browser control module 360 as a tap-out event.
Next, the browser control module 360 determines whether the operation performed by the user in the steps S708 to S715 is an operation including a drag. Note that, here, as an example, it is assumed that the operation is determined to include a drag. Then, the browser control module 360 calculates the moving amount of the fingertip (the amount of scroll of the image) at the time of the drag in a step S717 on the basis of the coordinate of the start point and the coordinate of the end point calculated in the step S713, for example. Specifically, the browser control module 360 calculates the distance between the coordinate of the start point and the coordinate of the end point as the moving amount of the fingertip. In the step S717, the browser control module 360 may calculate the difference between the coordinate of the tap-in position stored in the step S710 and the coordinate of the tap-out or flick position, and determine whether the difference is more than a predetermined threshold. Thus, when the moving amount due to the drag is small, scrolling of the image can be suppressed.
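The moving-amount calculation and the small-drag suppression described above can be sketched as follows. This is an illustrative sketch, not the actual implementation of the browser control module 360; the function names and the threshold value are assumptions.

```python
import math

# Hypothetical threshold (in pixels) below which a drag is treated as
# too small to scroll the image; the source does not give a value.
DRAG_THRESHOLD_PX = 10

def moving_amount(start, end):
    """Distance between the tap-in (start) and tap-out (end) coordinates."""
    return math.hypot(end[0] - start[0], end[1] - start[1])

def should_scroll(start, end, threshold=DRAG_THRESHOLD_PX):
    """Suppress scrolling when the difference between the tap-in and
    tap-out positions is not more than the predetermined threshold."""
    return moving_amount(start, end) > threshold
```

A drag from (0, 0) to (3, 4), for example, yields a moving amount of 5.0, and scrolling occurs only when that exceeds the threshold.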
In a step S718, the browser control module 360 transmits the information about the moving amount calculated in the step S717 to the browser processing module 250 of the image generation system 100.
In a step S719, the browser processing module 250 performs a scroll process. The scroll process is achieved by linking an operation amount (a rotation amount) of a mouse wheel to the moving amount of the image. The browser processing module 250 executes a process to generate the image data based on the moving amount transmitted in the step S718.
In a step S720a, the browser processing module 250 stores the image data generated in the step S719 in the storage device 109.
In a step S720b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information indicating the location of the image data stored in the step S720a.
In a step S721, the browser control module 360 of the image forming apparatus 101 requests to obtain the image data from the storage device 109 of the image generation system 100 on the basis of the URL information notified in the step S720b.
In a step S722, the browser control module 360 receives the image data from the storage device 109 as a result of the request to obtain the image data in the step S721. The image data is the data stored in the step S720a.
In a step S723, the browser control module 360 instructs the UI control module 352 to render the image data received in the step S722. As a result, the image of the image data is displayed on the operation device 312. For example, an image shown in
In a state shown in
In a state shown in
In a state shown in
As shown in the left side in
In the step S1002, the browser control module 360 transfers the image data received in the reception memory to the image memory, and causes the UI control module 352 to render the data in the image memory. The reception memory is assigned to the RAM 302 (see
In the step S1003, the browser control module 360 stores the coordinate tapped in on the operation device 312 (a start coordinate) in the RAM 302. After the execution of the step S1003, the process returns to the step S1001, and the subsequent steps are sequentially executed.
In the step S1004, the browser control module 360 obtains data of a transmission target image (the rectangular area 900) on the basis of the start coordinate stored in the step S1003, the moving amount M, and the start point and the end point of the image data in the reception memory. The method of obtaining the data is not particularly limited, and for example, the method described with reference to
In the step S1005, the browser control module 360 transfers data of the transmission target image (the rectangular area 900) on the reception memory obtained in the step S1004 to the image memory, and causes the UI control module 352 to render the data in the image memory. After the execution of the step S1005, the process proceeds to a step S1006.
In the step S1006, the browser control module 360 transfers data of a background of a different color from the moved image (image data) to an area to be a blank image in the image memory, and causes the UI control module 352 to render the blank image. The different color of the background is not particularly limited, and for example, the different color of the background may be a color obtained by reversing the original color of the area to be the blank image, or may be predetermined hatching. After the execution of the step S1006, the process returns to the step S1001, and the subsequent steps are sequentially executed.
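The composition performed in the steps S1004 to S1006 can be sketched as follows, treating the image as a list of rows. This is a minimal sketch under the assumption of a one-dimensional upward scroll; the row representation and the names are illustrative, and `blank_row` stands in for the inverted-color or hatched background described above.

```python
def redraw_after_drag(reception, moving_amount, blank_row):
    """Compose the image shown during the drag (the second image).

    `reception` is the currently displayed image as a list of rows (the
    reception memory), `moving_amount` is the upward scroll in rows, and
    `blank_row` is the background used for the vacated area.
    """
    moved = reception[moving_amount:]       # rows still visible (the moved image)
    blank = [blank_row] * moving_amount     # area vacated by the movement (the blank image)
    return moved + blank                    # contents transferred to the image memory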
In the step S1007, the browser control module 360 calculates the moving amount M on the basis of the coordinate of the tap-in and the coordinate of the tap-out. The moving amount M includes an up-down component that is a moving direction of the image in this embodiment, but this is not limiting. For example, when the object is moved in a left-right direction, the moving amount M may include a component in the left-right direction, or may include only the larger of the component in the up-down direction and the component in the left-right direction. After the execution of the step S1007, the process proceeds to a step S1008.
In the step S1008, the browser control module 360 notifies the browser processing module 250 included in the image generation system 100 of drag information. The drag information includes the start coordinate stored in the step S1003 and the moving amount M obtained in the step S1007. After the execution of the step S1008, the process returns to the step S1001, and the subsequent steps are sequentially executed.
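The computation of the moving amount M and the drag information notified in the step S1008 can be sketched as follows. The dictionary keys and the "dominant" option name are assumptions made for illustration; the source only states which components M may include.

```python
def moving_amount_component(dx, dy, direction="vertical"):
    """Select the component of the moving amount M to report.

    In this embodiment M carries the up-down component; "dominant"
    picks whichever component is larger, as the text also allows.
    """
    if direction == "vertical":
        return dy
    if direction == "horizontal":
        return dx
    return dx if abs(dx) > abs(dy) else dy  # "dominant"

def drag_info(start, end, direction="vertical"):
    """Bundle the start coordinate and the moving amount M for notification."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return {"start": start,
            "moving_amount": moving_amount_component(dx, dy, direction)}
```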
As shown in
Hereinafter, a second embodiment will be described with reference to
In a step S1102, the browser processing module 250 receives the web content from the external site 105 as a response to the request to obtain the web content in the step S1101.
In a step S1103a, the browser processing module 250 renders the web content received in the step S1102.
In a step S1103b, the browser processing module 250 detects a frame image included in a draggable image as a result of the rendering in the step S1103a (calculates a frame area).
In a step S1104a, the browser processing module 250 stores the result of the rendering performed in the step S1103a in the storage device 109.
In a step S1104b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of URL information indicating the location of the image data of the rendering result stored in the step S1104a. The browser processing module 250 also communicates frame information about the frame image detected in the step S1103b. The frame information is not particularly limited, and includes, for example, location information of the frame image and size information of the frame image (see
In a step S1105, the browser control module 360 of the image forming apparatus 101 requests to obtain the image data from the storage device 109 of the image generation system 100 on the basis of the URL information notified in the step S1104b.
In a step S1106, the browser control module 360 receives the image data from the storage device 109 as a result of the request to obtain the image data in the step S1105.
In a step S1107, the browser control module 360 instructs the UI control module 352 to render the image data received in the step S1106. As a result, the draggable image of the image data is displayed on the operation device 312. For example, an image shown in
In a step S1108, the user taps in on the draggable image displayed on the operation device 312 in the step S1107. Then, an electric signal corresponding to the tap-in is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate of the position of the fingertip on the basis of the electric signal. Note that the tap-in to the draggable image is divided into a case where the user taps in a part outside the frame image in the draggable image and a case where the user taps in a part inside the frame image in the draggable image. In any case, the drag is performed after the tap-in. The drag in the former case corresponds to steps S1111a to S1114a in
In a step S1109, the UI control module 352 of the image forming apparatus 101 transmits information about the coordinate specified in the step S1108 to the browser control module 360 as a tap-in event.
In a step S1110, the browser control module 360 stores the information about the coordinate of the tapped position (the start coordinate) transmitted from the UI control module 352 in the step S1109 in the RAM 302.
In a step S1111a, the user performs a drag by sliding, on the touch panel, the fingertip that tapped in on the part outside the frame image in the draggable image in the step S1108. Then, an electric signal that varies in accordance with the drag is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate that varies in accordance with the movement of the fingertip on the basis of the electric signal.
In a step S1112a, the UI control module 352 transmits the information about the coordinate that is specified in the step S1111a and is varying with the movement of the fingertip to the browser control module 360 as a drag event.
In a step S1113a, the browser control module 360 calculates the coordinate of the start point and the coordinate of the end point of the drag operation in the area outside the frame image in the draggable image on the basis of the information transmitted from the UI control module 352 in the step S1112a. The calculation result by the browser control module 360 is stored in the RAM 302.
In a step S1114a, the browser control module 360 instructs the UI control module 352 to render the image data on the basis of the coordinate of the start point and the coordinate of the end point calculated in the step S1113a. Thus, a partial image of the image data is displayed on the operation device 312. For example, an image shown in
The above steps S1111a to S1114a are related to the drag after the tap-in to the part outside the frame image in the draggable image.
In a step S1111b, the user performs a drag by sliding, on the touch panel, the fingertip that tapped in on the part inside the frame image in the draggable image in the step S1108. Then, an electric signal that varies in accordance with the drag is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate that varies in accordance with the movement of the fingertip on the basis of the electric signal.
In a step S1112b, the UI control module 352 transmits the information about the coordinate that is specified in the step S1111b and is varying with the movement of the fingertip to the browser control module 360 as a drag event.
In a step S1113b, the browser control module 360 calculates the coordinate of the start point and the coordinate of the end point of the drag operation in the frame image on the basis of the information transmitted from the UI control module 352 in the step S1112b. The calculation result by the browser control module 360 is stored in the RAM 302.
In a step S1114b, the browser control module 360 instructs the UI control module 352 to render the image data on the basis of the coordinate of the start point and the coordinate of the end point calculated in the step S1113b. Thus, a partial image of the image data is displayed on the operation device 312. For example, a partial image shown in
The above steps S1111b to S1114b are related to the drag after the tap-in to the frame image in the draggable image.
In a step S1115, the user taps out by releasing the fingertip from the image dragged in the step S1111a or S1111b. Then, an electric signal corresponding to the tap-out position is transmitted from the operation device 312 to the UI control module 352. Thus, the UI control module 352 can specify the coordinate of the position where the fingertip is separated on the basis of the electric signal.
In a step S1116, the UI control module 352 of the image forming apparatus 101 transmits the information about the coordinate specified in the step S1115 to the browser control module 360 as a tap-out event.
In a step S1117, the browser control module 360 determines whether the operation performed by the user in the steps S1108 to S1115 is an operation including a drag. As described above, since the drag is performed in the step S1111a or S1111b, it is determined that the operation includes the drag in the step S1117. Then, the browser control module 360 calculates the moving amount of the fingertip in the drag (the scroll amount of the image).
In a step S1118, the browser control module 360 transmits the information about the moving amount (scroll amount) calculated in the step S1117 to the browser processing module 250 of the image generation system 100.
In a step S1119a, the browser processing module 250 performs a scroll process to generate the image data based on the moving amount transmitted in the step S1118.
In a step S1119b, the browser processing module 250 detects the frame image included in the draggable image. The frame image detection process will be described later with reference to
In a step S1120a, the browser processing module 250 stores the image data generated in the step S1119a in the storage device 109.
In a step S1120b, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information indicating the location of the image data stored in the step S1120a. The browser processing module 250 also communicates the frame information about the frame image detected in the step S1119b.
In a step S1121, the browser control module 360 of the image forming apparatus 101 requests to obtain the image data from the storage device 109 of the image generation system 100 on the basis of the URL information notified in the step S1120b.
In a step S1122, the browser control module 360 receives the image data from the storage device 109 as a result of the request to obtain the image data in the step S1121. The image data is the image data stored in the step S1120a.
In a step S1123, the browser control module 360 instructs the UI control module 352 to render the image data received in the step S1122. As a result, the image of the image data is displayed on the operation device 312. For example, an image shown in
In a state shown in
In a state shown in
In a state shown in
In a state shown in
In a step S1402, the browser processing module 250 detects “scroll” and “auto” included in the style “overflow” of CSS (Cascading Style Sheets) from the web content. The value “scroll” indicates a draggable (scrollable) area. The value “auto” indicates that “scroll” or other operations are possible. When “scroll” or “auto” is detected, a drag on a frame image is enabled. Note that it is not necessary to detect “scroll” and “auto” as long as a drag on the screen is possible.
In a step S1403, the browser processing module 250 detects the upper left corner coordinate (xi, yi), the width (width_i), and the height (height_i) of the iframe tag part (frame image) extracted in the step S1401 as the frame information. This frame information is encoded in the JSON format (see FIGS. 13A and 13B).
As shown in
In a step S1505, the browser control module 360 obtains data of the transmission target image in the frame image on the basis of the start coordinate stored in the step S1503, the moving amount M, and the start point and the end point of the image data in the reception memory, as with the step S1004. The width of the transmission target image is the same as the width of the frame image. After the execution of the step S1505, the process proceeds to a step S1506.
In the step S1506, the browser control module 360 transfers the data of the transmission target image in the frame image in the reception memory obtained in the step S1505 to the image memory and causes the UI control module 352 to render the data in the image memory. After the execution of the step S1506, the process proceeds to a step S1507.
In the step S1507, the browser control module 360 transfers the data of the background different from the moved image of the frame image to the area to be the blank image of the frame image on the image memory and causes the UI control module 352 to render the blank image of the frame image. After the execution of the step S1507, the process returns to the step S1501, and the subsequent steps are sequentially executed.
In the step S1508, the browser control module 360 obtains data of the draggable image on the basis of the start coordinate stored in the step S1503, the moving amount M, and the start point and the end point of the image data in the reception memory.
In a step S1509, the browser control module 360 transfers the draggable image data in the reception memory obtained in the step S1508 to the image memory and causes the UI control module 352 to render the data in the image memory. After the execution of the step S1509, the process proceeds to a step S1510.
In the step S1510, the browser control module 360 transfers data of a background of a different color from the moved image (image data) to an area to be a blank image in the image memory and causes the UI control module 352 to render the blank image. After the execution of the step S1510, the process returns to the step S1501, and the subsequent steps are sequentially executed.
Hereinafter, a third embodiment will be described with reference to
In a state shown in
Therefore, the image generation system 100 is configured to suppress the phenomenon in which the intermediate image 801a is displayed. The configuration and operation will be described below.
In the step S3001, the browser processing module 250 receives the operation information from the image forming apparatus 101 and determines whether the operation information is information related to an image moving operation (a scroll operation) for moving an image displayed on the operation device 312 of the image forming apparatus 101.
In the step S3002, the browser processing module 250 executes a rendering period determination process to determine whether the generation of the third image 803 in the step S1719 is completed (a determination step). Thus, in this embodiment, the browser processing module 250 also has a function as a determination unit that determines whether the generation of the third image 803 is completed. Note that step S3002 is executed whenever the rendering is performed in the step S1719.
In the step S2002, the browser processing module 250 turns ON a scroll mode. This makes it possible to start the generation of the image displayed after the scroll operation, that is, the image data of the third image 803 (the step S1719).
In the step S2003, the browser processing module 250 turns OFF the scroll mode. As a result, the generation of the image data of the third image 803 (the step S1719) is omitted.
In a step S2004, the browser processing module 250 executes the process of generating the image data of the third image 803 (the step S1719) as the process based on the operation information.
In the step S2102, the browser processing module 250 determines whether a predetermined time period has elapsed from the start of the generation (rendering) of the image data of the third image 803. As a result of the determination in the step S2102, when it is determined that the predetermined time period has elapsed, the process proceeds to a step S2103. On the other hand, as a result of the determination in the step S2102, when it is determined that the predetermined time period has not elapsed, the process remains in the step S2102, that is, the process waits until the predetermined time period elapses. The predetermined time period is appropriately set according to, for example, the processing speed of the browser processing module 250, but is not particularly limited. The predetermined time period is preferably changeable.
In the step S2103, the browser processing module 250 turns OFF the scroll mode. Thus, it is determined that the generation of the image data of the third image 803 is completed.
In the step S2104, the browser processing module 250 stores the image data of the third image 803 as the rendering result in the storage device 109 (the step S1720a).
In a step S2105, the browser processing module 250 notifies the browser control module 360 included in the image forming apparatus 101 of the URL information (the step S1720b) indicating the location of the image data stored in the step S2104. After the step S2105 is executed, the process is terminated.
As described above, in this embodiment, the image generation system 100 waits for the transmission of the image data of the third image 803 to the image forming apparatus 101 until it is determined that the generation of the image data of the third image 803 is completed. Then, when it is determined that the generation of the image data of the third image 803 is completed, the image data of the third image 803 is transmitted to the image forming apparatus 101 (the notification of the URL information in this embodiment). Accordingly, the phenomenon in which the intermediate image 801a is displayed is suppressed, and the first image 801 (see
Hereinafter, a fourth embodiment will be described with reference to
In the step S2201, the browser processing module 250 of the image generation system 100 stores the rendering results, for example, in the storage device 109. The rendering result stored here is the result of the rendering executed in the step S1719, and in this embodiment, the image data of the two third images 803 generated at different timings. The image data of the two third images 803 may be, for example, the image data obtained within the predetermined time period.
In the step S2202, the browser processing module 250 compares the image data of the two third images 803 (the rendering results) stored in the step S2201 and determines whether the number of differences between the image data of the two images is equal to or less than a threshold. As a result of the determination in the step S2202, when it is determined that the number of differences is equal to or less than the threshold, the process proceeds to the step S2103. As described above, in the step S2103, the scroll mode is turned OFF and it is determined that the generation of the image data of the third image 803 is completed. On the other hand, as a result of the determination in the step S2202, when it is determined that the number of differences is more than the threshold, the process is terminated. Although the image data of the two third images 803 are compared to find the differences in the step S2202 in this embodiment, this is not limiting. For example, the image data of three or more third images 803 may be compared. The threshold is stored in advance in the storage device 109, for example. The threshold is preferably changeable as appropriate.
As with the third embodiment, in this embodiment, the image generation system 100 waits to transmit the image data of the third image 803 to the image forming apparatus 101 until it is determined that the generation of the image data of the third image 803 is completed. Then, when it is determined that the generation of the image data of the third image 803 is completed, the image data of the third image 803 is transmitted to the image forming apparatus 101. Accordingly, the phenomenon in which the intermediate image 801a is displayed is suppressed, and the first image 801 to the third image 803 are sequentially displayed on the operation device 312 of the image forming apparatus 101. Such an image change allows the user to feel that the image on the operation device 312 has moved smoothly.
Although the preferred embodiments of the present invention have been described above, the present invention is not limited to the above-described embodiments, and various modifications and changes can be made within the scope of the gist of the present invention. Further, the information processing apparatus is the image forming apparatus 101 in the respective embodiments, but is not limited thereto, and may be, for example, a desktop or notebook personal computer, a smartphone, or the like.
In the cloud browser system 1000, for example, the image generation system 100 as a server may be installed outside Japan, and the image forming apparatus 101 as a terminal device may be installed in Japan. Even in such a case, the server is able to transmit a file and data to the terminal device, and the terminal device is able to receive the file and data, so that the cloud browser system 1000 can transmit and receive the file and data as one unit. Moreover, even when the server is located outside Japan and the terminal device is located in Japan, the terminal device can perform the main function of the cloud browser system 1000, and the effect of the function can be exhibited in Japan. That is, as long as the terminal device constituting the cloud browser system 1000 is located in Japan, the system can be used in Japan by using the terminal device even if the server is located outside Japan.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Applications No. 2023-204679, filed Dec. 4, 2023 and No. 2024-064559, filed Apr. 12, 2024, which are hereby incorporated by reference herein in their entireties.