The present invention relates to an information processing system, an information processing apparatus, a method of controlling the same, and a storage medium.
As product usage and troubleshooting become more complicated, customers (users) frequently call a manufacturer's call center to ask questions directly and get answers relating to usage, troubleshooting, and the like. In order to deal with such troubles appropriately and quickly, a system has been considered that provides support by logging into a terminal of a user directly from a remote location and changing settings of the environment of the user.
Also, in recent years, it has become possible to use VNC (Virtual Network Computing) to cause another computer to display the desktop screen of a particular computer via a network and to operate it. By using such a technique, it becomes possible for an operator of a call center to remotely control a customer's apparatus and to support, for example, maintenance of the apparatus or operations by the customer. Also, the operator of the call center can remotely control the customer's apparatus and can explain an operation procedure of the apparatus to the customer in an easy-to-understand manner by showing a track of operations made by a cursor or the like.
Also, when support such as explaining a setting method or an operation method is performed for an image processing apparatus that does not normally display a cursor, causing the image processing apparatus to display a cursor in order to perform support smoothly from a remote location has been considered. Accordingly, a system has been considered that explains, for example, a setting method or an operation method of an apparatus by causing the cursor displayed on the image processing apparatus to move following a track of cursor operations made at a call center or the like (see, for example, Japanese Patent Laid-Open No. 2014-153776).
Cursor display on an image processing apparatus is generally controlled by the operating system of the image processing apparatus. When a touch operation is performed on an operation panel, the cursor is generally moved to the touched position.
Consider a state in which a screen displayed on a display unit of an image processing apparatus is published, by using the foregoing VNC or the like, to an external device at a remote location, and a virtual keyboard (also called a screen keyboard or a software keyboard) is used. At that time, the cursor moves to and is displayed on the keys selected by touch operations on the image processing apparatus, and this is also displayed on the external device. Accordingly, the user of the external device will know what keys were selected on the image processing apparatus, and particularly in a case of inputting a password or the like by using a virtual keyboard screen, a problem arises in that security is not maintained.
An aspect of the present invention is to eliminate the above-mentioned problem with conventional technology.
A feature of the present invention is to provide a technique in which, when image information of a displayed screen is transmitted to a client to cause the client to display the image information, a cursor is caused not to be displayed on the screen in a case where a screen having confidentiality is displayed.
According to a first aspect of the present invention, there is provided an information processing system having a first information processing apparatus for distributing image information for display and a second information processing apparatus for receiving and displaying the image information, wherein the first information processing apparatus comprises: a first memory device that stores a set of instructions; and at least one processor that executes the instructions to: store the image information into a memory; receive a connection request from the second information processing apparatus; determine whether or not the image information is information of a screen for inputting information having confidentiality; combine an image of a cursor with the image information and cause the memory to store a result of the combining; and in a case that it is determined that the image information is the information of the screen for inputting information having confidentiality, restrict viewing of the image of the cursor in the image information and transmit the image information in which the viewing of the image of the cursor is restricted to the second information processing apparatus, and wherein the second information processing apparatus comprises: a second memory device that stores a set of instructions; and at least one processor that executes the instructions to: transmit to the first information processing apparatus an input event resulting from an operation by a user; receive the image information transmitted from the first information processing apparatus; and cause the display unit to display the image information.
According to a second aspect of the present invention, there is provided an information processing apparatus that transmits image information being displayed on a display unit to a client to cause the client to display the image information, comprising: a memory device that stores a set of instructions; and at least one processor that executes the instructions to: store the image information displayed on the display unit into a memory; determine whether or not the image information is information of a screen for inputting information having confidentiality; combine an image of a cursor with the image information and cause the memory to store a result of the combining; and control to, in a case that it is determined that the image information is information of the screen for inputting information having confidentiality, restrict viewing of the image of the cursor in the image information and transmit the image information in which the viewing of the image of the cursor is restricted to the client.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Embodiments of the present invention will be described hereinafter in detail, with reference to the accompanying drawings. It is to be understood that the following embodiments are not intended to limit the claims of the present invention, and that not all of the combinations of the aspects that are described according to the following embodiments are necessarily required with respect to the means to solve the problems according to the present invention.
A control unit 200, which includes a CPU 201, controls operations of the image processing apparatus 101 overall. The CPU 201, by a boot program stored in a ROM 202, reads an OS or a control program installed in an HDD 205 and deploys it into a RAM 203, and performs various control such as reading control or transmission control in accordance with the deployed program. The RAM 203 is used as a temporary storage area such as a main memory, a work area, or the like, of the CPU 201. Also, the RAM 203 is used as a virtual VRAM and also as a storage region for temporarily holding information for a screen display. A VRAM 204 is a storage unit for holding information for a screen display written from a virtual VRAM region of the RAM 203. Note, a region of a portion of the RAM 203 may be used as the VRAM 204. The HDD (hard disk drive) 205 stores image data, various programs, or various information tables.
An operation unit I/F (interface) 206 connects an operation unit 210 to the control unit 200. The operation unit 210 is equipped with a keyboard, a display unit having a touch panel function, and the like. Also, the operation unit 210 displays an image to the display unit based on image information held in the VRAM 204. A printer I/F 207 connects a printer 211 to the control unit 200. Image data to be printed by the printer 211 is transferred from the control unit 200 to the printer 211 via the printer I/F 207, and is printed on a printing medium (sheet) by the printer 211. A scanner I/F 208 connects a scanner 212 to the control unit 200. The scanner 212 generates image data by reading an image on an original, and inputs the image data to the control unit 200 via the scanner I/F 208. A network I/F 209 connects the control unit 200 to the LAN 102. The network I/F 209 transmits image data or information to an external device over the LAN 102 and receives various pieces of information from the external apparatus over the LAN 102.
The image processing apparatus 101 is equipped with an input event reception module 301, an input event generation module 302, an image data generation module 303, and an image data transmission module 304. The input event reception module 301 holds an input event signal, transmitted from the client terminal 103 and received via the LAN 102, in the RAM 203. The input event generation module 302, based on the input event signal received by the input event reception module 301, generates an input event signal with respect to the image processing apparatus 101 and sends it to the image data generation module 303. The image data generation module 303, upon receiving the input event signal, generates image information reflecting the contents and holds the image information in a virtual VRAM region of the RAM 203. The image data transmission module 304 transmits the image information held in the virtual VRAM region of the RAM 203 to the client terminal 103 via the LAN 102. Note, the image information held in the VRAM 204 written from the virtual VRAM region of the RAM 203 may be transmitted to the client terminal 103.
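The chain of modules above (input event reception, input event generation, image data generation, and image data transmission) can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not taken from the patent, and the "virtual VRAM" is modeled as a simple dictionary.

```python
class RemoteScreenShareServer:
    """Sketch of the four cooperating modules: an event received from the
    client is replayed locally, the resulting frame is regenerated into the
    virtual VRAM, and the frame is then transmitted back to the client."""

    def __init__(self):
        self.virtual_vram = {"frame": "home_screen", "cursor": None}
        self.outbox = []  # frames queued for transmission to the client

    def receive_input_event(self, event):
        # Input event reception module: hold the client's event signal.
        self.pending_event = event
        self.generate_input_event()

    def generate_input_event(self):
        # Input event generation module: replay the event on the apparatus.
        self.generate_image_data(self.pending_event)

    def generate_image_data(self, event):
        # Image data generation module: update the virtual VRAM contents.
        if event["type"] == "move":
            self.virtual_vram["cursor"] = event["pos"]
        elif event["type"] == "click":
            self.virtual_vram["frame"] = f"screen_after_click_{event['pos']}"
        self.transmit_image_data()

    def transmit_image_data(self):
        # Image data transmission module: send the current frame contents.
        self.outbox.append(dict(self.virtual_vram))

server = RemoteScreenShareServer()
server.receive_input_event({"type": "move", "pos": (10, 20)})
server.receive_input_event({"type": "click", "pos": (10, 20)})
```

Each client event thus produces one regenerated frame in the outbox, mirroring the path from the input event reception module 301 through to the image data transmission module 304.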
This processing is started when there is an input event from the client terminal 103. First, in step S401, the CPU 201 determines whether or not a connection request has been received from the client terminal 103. If the CPU 201 determines that the connection request was received from the client terminal 103, it advances the processing to step S402. Otherwise, the CPU 201 advances the processing to step S412, displays the image information held in the virtual VRAM region of the RAM 203 on the display unit of the operation unit 210, and ends this processing.
In step S402, the CPU 201 reads an image of a cursor stored in the ROM 202 or the HDD 205 and stores it in the RAM 203. Then, the CPU 201 combines the image of the cursor with the image information that the virtual VRAM region of the RAM 203 holds, and stores it to the virtual VRAM region.
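The combining in step S402 amounts to overlaying a small cursor bitmap onto the frame held in the virtual VRAM region while keeping the un-combined frame available so the cursor can later be removed. A minimal sketch of such compositing, with frames modeled as lists of pixel rows and `None` standing in for transparent cursor pixels (a representation assumed here for illustration):

```python
def composite_cursor(frame, cursor, pos):
    """Overlay a cursor bitmap onto a frame. Opaque cursor pixels (non-None)
    overwrite frame pixels; the original frame is left untouched so it can
    be restored when the cursor must not be displayed."""
    out = [row[:] for row in frame]  # copy, preserving the un-combined frame
    x0, y0 = pos
    for dy, cursor_row in enumerate(cursor):
        for dx, px in enumerate(cursor_row):
            y, x = y0 + dy, x0 + dx
            if px is not None and 0 <= y < len(out) and 0 <= x < len(out[0]):
                out[y][x] = px
    return out

frame = [[0] * 4 for _ in range(4)]   # 4x4 blank screen
cursor = [[9, None], [None, 9]]       # tiny 2x2 cursor with transparency
combined = composite_cursor(frame, cursor, (1, 1))
```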
In step S403, the CPU 201 determines whether or not the image information to be displayed, held in the virtual VRAM region of the RAM 203, is a screen for password input. Specifically, the CPU 201 determines whether a password input screen is displayed on the display unit of the operation unit 210. When displaying a virtual keyboard screen, the CPU 201 adds information indicating whether or not the virtual keyboard screen is for inputting a password, and can therefore make this determination based on that information. In a case where a virtual keyboard screen to which the information indicating a keyboard screen for password input has been added is opened, it is determined that a password input screen is displayed on the display unit. When it is determined that a screen for password input is displayed on the display unit, the processing is advanced to step S404; otherwise, the processing is advanced to step S405. In step S404, the CPU 201 makes it so that the image of the cursor combined with the image information is not displayed, and advances the processing to step S408. Specifically, the CPU 201 deletes the image of the cursor loaded into the RAM 203 in step S402, and advances the processing to step S408.
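The determination in step S403 relies on a flag attached to the virtual keyboard screen when it is opened. A minimal sketch of such a check, with screens modeled as dictionaries and the field names (`kind`, `for_password`) being illustrative assumptions:

```python
def should_hide_cursor(screen):
    """Step S403 in spirit: a virtual keyboard screen is tagged when opened
    for password entry, and that tag decides whether the cursor image is
    combined into the outgoing frame."""
    return (screen.get("kind") == "virtual_keyboard"
            and screen.get("for_password", False))

login_keyboard = {"kind": "virtual_keyboard", "for_password": True}
search_keyboard = {"kind": "virtual_keyboard", "for_password": False}
home = {"kind": "home"}
```

A virtual keyboard opened for an ordinary text field keeps the cursor; only the password-tagged keyboard suppresses it.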
Accordingly, when a password is inputted, the cursor is not displayed on the screen displayed on the display unit and also the cursor is not included in the image information to be transmitted to the client.
Meanwhile, in step S405, the CPU 201 combines the image of the cursor with the image information, and advances the processing to step S406. In other words, in a case where a screen for password input is not being displayed on the display unit, the cursor is displayed on the screen. In step S406, the CPU 201 determines whether or not an input event transmitted from the client terminal 103 has been received. Specifically, it is determined whether or not an input event signal transmitted from the client terminal 103 via the LAN 102 has been received by the input event reception module 301. For example, when a click or a movement of a pointing device is performed on the client terminal 103, the client terminal 103 transmits the designated coordinates together with information of the click or movement to the image processing apparatus 101 as an input event, and the image processing apparatus 101 receives the input event. In a case where the CPU 201 determines in step S406 that an input event has been received, it advances the processing to step S407; otherwise, it advances the processing to step S408. In step S407, the CPU 201 processes the received input event, updates the image information, and advances the processing to step S408.
Specifically, the input event generation module 302, based on the received input event signal, generates an input event signal with respect to the image processing apparatus 101 and sends it to the image data generation module 303. By this, the image data generation module 303 generates image information reflecting the contents after the input event is executed, and holds the image information in a virtual VRAM region of the RAM 203. For example, in a case where the received input event is a movement of the pointing device, the image data generation module 303 changes the position of the cursor (moves the cursor) based on the received coordinates along with the movement information, and generates and displays the image information.
In step S408, the CPU 201 transmits the image information held in the virtual VRAM region of the RAM 203 to the client terminal 103. By this, the client terminal 103 holds the received image information and displays it to the display unit of the client terminal 103. Next, the processing is advanced to step S409, and the CPU 201 writes the image information that the virtual VRAM region of the RAM 203 holds to the VRAM 204, and displays it to the display unit of the operation unit 210 via the operation unit I/F 206.
Next, the processing advances to step S410, and the CPU 201 determines whether or not communication with the client terminal 103 is disconnected. In a case where it is determined that communication with the client terminal 103 is disconnected, the processing advances to step S411; otherwise, the processing returns to step S403. In step S411, the CPU 201 deletes the image of the cursor combined with the image information, and clears the cursor. Then, the processing advances to step S412, and the CPU 201 writes the image information that the virtual VRAM region of the RAM 203 holds to the VRAM 204, and displays it on the display unit of the operation unit 210 via the operation unit I/F 206. Because the image information is stored in the virtual VRAM region in step S412 without the cursor being combined therewith, the CPU 201 displays the image on the display unit of the operation unit 210 in a state in which the cursor is not displayed. In place of deleting the cursor, a restriction may be made such as superimposing a mask image on the virtual keyboard so that a cursor operation cannot be viewed. The superimposition of the mask image may be performed by the image processing apparatus 101 or by the client terminal 103.
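Condensing steps S401 through S412, one pass of the flow can be sketched as below. This is an illustrative model under assumed data shapes (frames as lists of layer tuples, screens as dictionaries), not the patented implementation itself:

```python
def serve_frame(connected, screen, frame, cursor_img, cursor_pos):
    """One condensed pass of the S401-S412 flow: the cursor layer is
    combined only when the current screen is not a password-input screen,
    and the same combined image goes to both the client and the local
    operation panel."""
    if not connected:                      # S401 "no" -> S412: local only
        return {"client": None, "panel": frame}
    password_screen = screen.get("for_password", False)   # S403
    if password_screen:                    # S404: cursor hidden everywhere
        shown = frame
    else:                                  # S405: combine the cursor layer
        shown = frame + [("cursor", cursor_img, cursor_pos)]
    # S408/S409: transmit to the client and display on the panel
    return {"client": shown, "panel": shown}

frame = [("background", "keyboard")]
normal = serve_frame(True, {"for_password": False}, frame, "arrow", (3, 7))
secret = serve_frame(True, {"for_password": True}, frame, "arrow", (3, 7))
local = serve_frame(False, {"for_password": False}, frame, "arrow", (3, 7))
```

Note that on the password screen the cursor layer is absent from both outputs, matching the point that the cursor is neither shown locally nor included in the transmitted image information.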
By executing such processing, it is possible to transmit image information including a cursor to the client terminal 103 and cause the display unit of the client terminal 103 to display the image information. Also, in a case where the image information transmitted to the client terminal 103 is a screen relating to security, such as a password input screen for example, image information not including the cursor can be transmitted. By this, on the screen relating to security, it is possible to prevent the user of the client terminal from knowing what keys are selected on the image processing apparatus 101.
By this, it becomes impossible for an operator operating the client terminal 103 to confirm, from the movement of the cursor, the details of the destination input in a case where the user inputs a destination via the keyboard screen 503. Thus, it is possible to maintain security with respect to the input of a destination.
By virtue of the embodiment described above, it becomes possible to maintain confidentiality in a case where a user operating the image processing apparatus 101 inputs security information such as a password on the display unit of the operation unit 210. Also, in a case where an operator operating the client terminal 103 at the same time uses VNC and inputs security information by remotely controlling the image processing apparatus 101, it is possible to maintain confidentiality.
Note, in the embodiment, although whether the cursor is displayed or hidden is switched in accordance with whether or not the image processing apparatus 101 displays a password input screen, the determination may be made by another condition. For example, display/hiding of the cursor may be switched according to an instruction of a user using the image processing apparatus 101. Specifically, configuration may be taken such that a selection button for indicating whether or not to display the cursor is displayed on the display unit of the operation unit 210 of the image processing apparatus 101, and the cursor is not displayed in a case where the user selects hiding by the button. This has the effect that display/hiding of the cursor can be switched at an arbitrary timing, improving user convenience.
Also, display/hiding of the cursor may be switched in accordance with a type of event from the client terminal. For example, configuration may be taken such that a hide cursor event can be received from the client terminal. In this case, processing for clearing a display of the cursor is immediately performed when a hide cursor event is received from the client terminal, after the cursor was displayed in step S405, for example. By this, there is the effect that it becomes possible to switch display/hiding of the cursor from the client terminal, and the convenience of operation on the client terminal is improved.
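The event-driven variant above can be sketched as a small state update, where the event names (`hide_cursor`, `show_cursor`) are assumptions introduced for illustration:

```python
def handle_event(state, event):
    """Illustrative handling of a client-side cursor toggle: on a
    'hide_cursor' event the server immediately stops combining the cursor,
    independent of which screen is currently displayed."""
    if event == "hide_cursor":
        state["cursor_visible"] = False
    elif event == "show_cursor":
        state["cursor_visible"] = True
    return state

state = {"cursor_visible": True}
handle_event(state, "hide_cursor")
hidden_after_event = state["cursor_visible"]
handle_event(state, "show_cursor")
```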
Note, although an input of a password or of a destination such as an electronic mail address is described as an example of information having confidentiality in the above-described embodiment, information such as a serial number, a path name of a folder, and other confidential information may also be the information having confidentiality. Also, information including personal information (privacy) such as an address or a telephone number may be treated as information having confidentiality. In this way, it is possible to restrict display of the cursor in a case where information having confidentiality is input.
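Generalizing the password-only check, the kinds of fields listed above could be collected into a single confidentiality test; the set membership sketch below uses field names that are assumptions for illustration:

```python
# Field kinds the embodiment names as candidates for cursor restriction.
CONFIDENTIAL_FIELDS = {
    "password", "email_destination", "serial_number",
    "folder_path", "address", "telephone_number",
}

def is_confidential(field_kind):
    """Return True when input to this field should suppress the cursor,
    covering secret information and personal (privacy) information alike."""
    return field_kind in CONFIDENTIAL_FIELDS
```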
By virtue of this embodiment as described above, when a cursor is displayed on the operation unit of an image processing apparatus so that a track of operations from a terminal at a remote location can be confirmed, the cursor can be hidden on the screen of the remote terminal while a security screen is displayed on the image processing apparatus. Because operations on the image processing apparatus then cannot be followed on the remote terminal, confidentiality of those operations is maintained. This makes it possible to improve both the efficiency of work support on the image processing apparatus and its security.
Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiments and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiments, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiments and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiments. The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2016-243680, filed Dec. 15, 2016, which is hereby incorporated by reference herein in its entirety.
Foreign patent document cited:

Number | Date | Country
---|---|---
2014-153776 | Aug 2014 | JP
Publication data:

Number | Date | Country
---|---|---
20180173902 A1 | Jun 2018 | US