Embodiments relate generally to managing access to images and workflows using roles.
The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.
An increasing number of mobile devices, such as smartphones and tablet computers, are equipped with cameras. This makes them increasingly valuable to individuals and businesses. One of the issues with mobile devices that include cameras is that when multiple images of the same object are captured over time, it can be difficult to analyze changes in the objects because the images may not have been captured at the same distance or angle. Thus, changes in the objects that may appear to have occurred based upon the images may not have actually occurred.
Another issue is that there is often no access controls applied to images acquired with mobile devices, or to workflows for processing images acquired with mobile devices, allowing third party access to sensitive information.
According to an embodiment, a network device includes one or more processors, one or more memories and an image management application. The image management application is configured to receive, over one or more communications links from a first client device that is external to the network device, image data and metadata for an image acquired by the first client device, wherein the metadata for the image specifies one or more logical entities for the image acquired by the first client device. In response to receiving the image data and the metadata for the image acquired by the first client device, the image management application stores the image data and the metadata. The image management application receives, over the one or more communications links from a second client device that is external to the network device and different from the first client device, a request for a user to access the image data for the image acquired by the first client device. In response to receiving the request, the image management application determines one or more logical entities assigned to the user and determines, based upon the one or more logical entities assigned to the user and the one or more logical entities for the image acquired by the first client device, whether the user is permitted to access the image acquired by the first client device. In response to determining that the user is permitted to access the image acquired by the first client device, the image management application causes the image data and the metadata for the image acquired by the first client device to be transmitted to the second client device. In response to determining that the user is not permitted to access the image acquired by the first client device, the image management application does not cause the image data and the metadata for the image acquired by the first client device to be transmitted to the second client device.
In the figures of the accompanying drawings, like reference numerals refer to similar elements.
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one skilled in the art that the embodiments may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments.
I. OVERVIEW
II. SYSTEM ARCHITECTURE
III. ACQUIRING IMAGES USING A REFERENCE IMAGE AND DISTANCE
IV. MEMO AND AUDIO DATA
V. IMAGE DATA MANAGEMENT
VI. HISTORICAL VIEWS
VII. MANAGING ACCESS TO IMAGES USING ROLES
VIII. MANAGING ACCESS TO WORKFLOWS USING ROLES
IX. IMPLEMENTATION MECHANISMS
An approach is provided for acquiring and managing images. According to the approach, a reference image of one or more objects is displayed on the display of a mobile device in a manner that allows a user of the mobile device to simultaneously view the reference image and a preview image of the one or more objects currently in a field of view of a camera of the mobile device. For example, the reference image may be displayed on the display of the mobile device at a different brightness level, color, or with special effects, relative to the preview image. An indication is provided to the user of the mobile device whether the camera of the mobile device is currently located within a specified amount of a distance at which the reference image was acquired. For example, a visual or audible indication may indicate whether the camera of the mobile device is too close, too far away, or within a specified amount of a distance at which the reference image was acquired. In response to a user request to acquire an image, the camera acquires a second image of the one or more objects and a distance between the camera and the one or more objects at the time the second image was acquired is recorded. The second image and metadata are transmitted to an image management application that is external to the mobile device. For example, the second image and metadata may be transmitted over one or more networks to the image management application executing on an application server. The image management application provides various functionalities for managing images. For example, the image management application may allow a user to review and accept images, reject images and update metadata for images. As another example, the image management application provides a historical view that allows a user to view a sequence of images of one or more objects that were acquired at approximately the same distance and angle, which allows a user to better discern changes over time in the one or more objects.
According to one embodiment, access to images, workflows and workflow levels is managed using roles. Users are assigned roles and users are permitted to access images, workflows and workflow levels for which they have been assigned the required roles.
A. Mobile Device
Mobile device 102 may be any type of mobile device and examples of mobile device 102 include, without limitation, a smart phone, a camera, a tablet computing device, a personal digital assistant or a laptop computer. In the example depicted in
Display 120 may be implemented by any type of display that displays images and information to a user and that may also be able to receive user input; embodiments are not limited to any particular implementation of display 120. Mobile device 102 may have any number of displays 120, of similar or varying types, located anywhere on mobile device 102. Camera 122 may be any type of camera and the type of camera may vary depending upon a particular implementation. As with display 120, mobile device 102 may be configured with any number of cameras 122 of similar or varying types, for example, on a front and rear surface of mobile device 102, but embodiments are not limited to any number or type of camera 122.
Distance detection mechanism 124 is configured to detect a distance between the camera 122 on mobile device 102 and one or more objects within the field of view of the camera 122. Example implementations of distance detection mechanism 124 may be based upon, without limitation, infra-red, laser, radar, or other technologies that use electromagnetic radiation. Distance may be determined directly using the distance detection mechanism 124, or distance may be determined from image data. For example, the distance from the camera 122 to one or more objects on the ground and in the field of view of the camera 122 may be calculated based upon a height of the camera 122 and a current angle of the camera 122 with respect to the ground. For example, given a height (h) of the camera 122 and an acute angle (a) between the vertical and a line of sight to the one or more objects, the distance (d) may be calculated as d=h*tan(a). As another example, if one or more dimensions of the one or more objects are known, the distance between the camera 122 and the one or more objects may be determined based upon a pixel analysis of the one or more objects for which the one or more dimensions are known.
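By way of illustration only, the following sketch shows the trigonometric distance estimate described above. All names, units and values are assumptions made for purposes of explanation and are not part of any embodiment.

import math

def distance_from_angle(height_m: float, angle_from_vertical_deg: float) -> float:
    # Return d = h * tan(a), where h is the height of the camera above
    # the ground and a is the acute angle between the vertical and the
    # line of sight to the one or more objects.
    return height_m * math.tan(math.radians(angle_from_vertical_deg))

# Example: a camera held 1.5 m above the ground and tilted 60 degrees
# from the vertical is about 2.6 m from the object along the ground.
print(round(distance_from_angle(1.5, 60.0), 2))  # 2.6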
Data acquisition component 125 may comprise hardware subcomponents, programmable subcomponents, or both. For example, data acquisition component 125 may include one or more cameras, scanners, memory units or other data storage units, buffers and code instructions for acquiring, storing and transmitting data, or any combination thereof. Data acquisition component 125 may be configured with a Wi-Fi interface and a barcode reader. The Wi-Fi interface may be used to transmit information to and from the data acquisition component 125. The barcode reader may be used to scan or otherwise acquire a code, such as a point of sale (POS) code displayed on an item.
Microphone 130 is configured to detect audio and in combination with other elements, may store audio data that represents audio detected by microphone 130. Communications interface 132 may include computer hardware, software, or any combination of computer hardware and software to provide wired and/or wireless communications links between mobile device 102 and other devices and/or networks. The particular components for communications interface 132 may vary depending upon a particular implementation and embodiments are not limited to any particular implementation of communications interface 132. Power/power management component 134 may include any number of components that provide and manage power for mobile device 102. For example, power/power management component 134 may include one or more batteries and supporting computer hardware and/or software to provide and manage power for mobile device 102.
Computing architecture 138 may include various elements that may vary depending upon a particular implementation and mobile device 102 is not limited to any particular computing architecture 138. In the example depicted in
Operating system 136 executes on computing architecture 138 and may be any type of operating system that may vary depending upon a particular implementation and embodiments are not limited to any particular implementation of operating system 136. Operating system 136 may include multiple operating systems of varying types, depending upon a particular implementation. Applications 126 may be any number and types of applications that execute on computing architecture 138 and operating system 136. Applications 126 may access components in mobile device 102, such as display 120, camera 122, distance detection mechanism 124, computing architecture 138, microphone 130, communications interface 132, power/power management component 134 and other components not depicted in
Applications 126 may provide various functionalities that may vary depending upon a particular application and embodiments are not limited to applications 126 providing any particular functionality. Common non-limiting examples of applications 126 include social media applications, navigation applications, telephony, email and messaging applications, and Web service applications. In the example depicted in
B. Application Server
In the example depicted in
Data interface 160 is configured to receive data from mobile device 102 and may do so using various communication protocols and from various media. Example protocols include, without limitation, the File Transfer Protocol (FTP), the Telnet Protocol, the Transmission Control Protocol (TCP), the TCP/Internet Protocol (TCP/IP), the Hypertext Transfer Protocol (HTTP), the Simple Mail Transfer Protocol (SMTP), or any other data communications protocol. Data interface 160 may be configured to read data from an FTP folder, an email folder, a Web server, a remote media such as a memory stick, or any other media. Data interface 160 may include corresponding elements to support these transport methods. For example, data interface 160 may include, or interact with, an FTP server that processes requests from an FTP client on mobile device 102. As another example, data interface 160 may include, or interact with, an email client for retrieving emails from an email server on mobile device 102 or external to mobile device 102. As yet another example, data interface 160 may include, or interact with, a Web server that responds to requests from an HTTP client on mobile device 102. Data interface 160 is further configured to support the transmission of data from application server 104 to other devices and processes, for example, EMR system 106, other services 108 and client device 110.
User interface 160 provides a mechanism for a user, such as an administrator, to access application server 104 and data stored on storage 168, as described in more detail hereinafter. User interface 160 may be implemented as an API for application server 104. Alternatively, user interface 160 may be implemented by other mechanisms. For example, user interface 160 may be implemented as a Web server that serves Web pages to provide a user interface for application server 104.
Image management application 164 provides functionality for managing images received from mobile device 102 and stored in storage 168. Example functionality includes reviewing images, accepting images, rejecting images, processing images, for example, to reduce blurriness or otherwise enhance the quality of images, to crop or rotate images, etc., as well as updating metadata for images. Example functionality also includes providing a historical view of a sequence of images of one or more objects, where the images in the sequence were acquired using a reference image as a background image and at approximately the same distance from the one or more objects. According to one embodiment, image management application 164 provides a graphical user interface to allow user access to the aforementioned functionality. The graphical user interface may be provided by application software on client device 110, application software on application server 104, or any combination of application software on client device 110 and application server 104. As one example, the graphical user interface may be implemented by one or more Web pages generated on application server 104 and provided to client device 110. Image management application 164 may be implemented in computer hardware, computer software, or any combination of computer hardware and software. For example, image management application 164 may be implemented as an application, e.g., a Web application, executing on application server 104.
Transcription application 166 processes audio data acquired by mobile device 102 and generates a textual transcription. The textual transcription may be represented by data in any format that may vary depending upon a particular implementation. Storage 168 may include any type of storage, such as volatile memory and/or non-volatile memory. Application server 104 is configured to provide image and/or video data and identification data to EMR system 106, other services 108 and client device 110. Application server 104 may transmit the data to EMR system 106, other services 108 and client device 110 using standard techniques or, alternatively, in accordance with Application Program Interfaces (APIs) supported by EMR system 106, other services 108 and client device 110. Application server 104 may be implemented as a stand-alone network element, such as a server or intermediary device. Application server 104 may also be implemented on a client device, including mobile device 102.
According to one embodiment, mobile device 102 is configured to acquire image data using a reference image as a background image and a distance at which the reference image was acquired.
In step 204, the reference image is displayed on the mobile device as a background image. For example, image acquisition application 128 may cause the reference image to be displayed on display 120 of mobile device 102.
According to one embodiment, a distance at which the reference image was acquired is indicated on the display of the mobile device. For example, as depicted in
In step 206, one or more preview images are displayed of one or more objects currently in the field of view of the camera. For example, image acquisition application 128 may cause one or more preview images to be acquired and displayed on display 120. In
In step 208, a determination is made of a current distance between the mobile device and the one or more objects currently in the field of view of the camera. For example, image acquisition application 128 may cause the distance detection mechanism to measure a current distance between the mobile device 102 and the one or more objects in the field of view of the camera 122. As another example, a current distance between the mobile device 102 and the one or more objects in the field of view of the camera 122 may be determined using a GPS component in mobile device 102 and a known location of the one or more objects. In this example, the GPS coordinates of the mobile device 102 may be compared to the GPS coordinates of the one or more objects to determine the current distance between the mobile device 102 and the one or more objects in the field of view of the camera 122.
In step 210, an indication is provided to a user of the mobile device whether the current distance is within a specified amount of the distance at which the reference image was acquired. For example, the image acquisition application 128 may compare the current distance between the mobile device 102 and the one or more objects, as determined in step 208, to the distance at which the reference image was acquired. The result of this comparison may be indicated to a user of the mobile device 102 in a wide variety of ways that may vary depending upon a particular implementation and embodiments are not limited to any particular manner of notification. For example, the image acquisition application 128 may visually indicate on the display 120 whether the current distance is within a specified amount of the distance at which the reference image was acquired. This may include, for example, displaying one or more icons on display 120 and/or changing one or more visual attributes of icons displayed on display 120. As one example, icon 306 may be displayed in red when the current distance is not within the specified amount of the distance at which the reference image was acquired, displayed in yellow when the current distance is close to being within the specified amount of the distance at which the reference image was acquired and displayed in green when the current distance is within the specified amount of the distance at which the reference image was acquired. As another example, an icon, such as a circle may be displayed and the diameter reduced as the current distance approaches the specified amount of the distance at which the reference image was acquired. The diameter of the circle may increase as the difference between the current distance and distance at which the reference image was acquired increases, indicating that the mobile device 102 is getting farther away from the distance at which the reference image was acquired. As another example, different icons or symbols may be displayed to indicate whether the current distance is within the specified amount of the distance at which the reference image was acquired. As one example, a rectangle may be displayed when the mobile device 102 is beyond a specified distance from the distance at which the reference image was acquired and then changed to a circle as the mobile device 102 approaches the distance at which the reference image was acquired.
Image acquisition application 128 may audibly indicate whether the current distance is within a specified amount of the distance at which the reference image was acquired, for example, by generating different sounds. As one example, the mobile device 102 may generate a sequence of sounds, and the amount of time between each sound is decreased as the mobile device approaches the distance at which the reference image was acquired. The current distance between the mobile device 102 and the one or more objects in the field of view of the camera 122 may also be displayed on the display, for example, as depicted in
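By way of illustration only, the following sketch consolidates the visual indication logic described above into a single three-state check. The threshold values, color names and function names are assumptions, not part of any embodiment.

def distance_indicator(current_m: float, reference_m: float,
                       tolerance_m: float = 0.1, near_m: float = 0.3) -> str:
    # Map the difference between the current distance and the distance at
    # which the reference image was acquired to an icon color.
    delta = abs(current_m - reference_m)
    if delta <= tolerance_m:
        return "green"   # within the specified amount
    if delta <= near_m:
        return "yellow"  # close to being within the specified amount
    return "red"         # too close or too far away

print(distance_indicator(2.55, 2.6))  # green
print(distance_indicator(3.10, 2.6))  # red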
In step 212, a second image of the one or more objects is acquired in response to a user request. For example, in response to a user selection of a button 308, the second image of the one or more objects that are currently in the field of view is acquired. Metadata is also generated for the second image and may specify, for example, camera parameter values used to acquire the second image, and a timestamp or other data, such as a sequence identifier, that indicates a sequence in which images were acquired. According to one embodiment, the metadata for the second image includes a reference to the reference image so that the reference image and the second image can be displayed together, as described in more detail hereinafter. The reference may be in any form and may vary depending upon a particular implementation. For example, the reference may include the name or identifier of the reference image. The metadata for the reference image may also be updated to include a reference to the second image.
According to one embodiment, camera settings values used to acquire the reference image are also used to acquire the second image. This ensures, for example, that the same camera settings, such as focus, aperture, exposure time, etc., are used to acquire both the reference image and the second image. This reduces the likelihood that differences in the one or more objects in the sequence of images are attributable to different camera settings used to acquire the images, rather than actual changes in the one or more objects. Camera settings used to acquire an image may be stored in the metadata for the acquired image, for example, in metadata 148, 174.
The current distance may optionally be reacquired and recorded in association with the second image, for example, in the metadata for the second image. Alternatively, the distance at which the reference image was acquired may be used for the second image, since the current distance is within the specified amount of the distance at which the reference image was acquired.
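By way of illustration only, one possible shape for the metadata of the second image is sketched below, assuming a simple key/value representation. The field names and values are hypothetical, not part of any embodiment.

import json

second_image_metadata = {
    "image_id": "img-0002",
    "reference_image_id": "img-0001",  # reference back to the reference image
    "timestamp": "2014-11-17T10:15:00Z",
    "sequence_id": 2,
    "distance_m": 2.6,                 # reacquired, or copied from the reference image
    "camera_settings": {               # reused from the reference image
        "focus": "auto",
        "aperture": "f/2.8",
        "exposure_time_s": 0.008,
    },
}
print(json.dumps(second_image_metadata, indent=2))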
Image data, representing the second image, and optionally the current distance, may be stored locally on mobile device 102, for example, in memory 142, and/or may be transmitted by mobile device 102 for storage and/or processing on one or more of application server 104, EMR system 106, other services 108 or client device 110. Image data may be transmitted to application server 104, EMR system 106, other services 108 or client device 110 using a wide variety of techniques, for example, via FTP, via email, via HTTP POST commands, or other approaches. The transmission of image data, and the corresponding metadata, may involve the verification of credentials. For example, a user may be queried for credential information that is verified before image data may be transmitted to application server 104, EMR system 106, other services 108 or client device 110. Although the foregoing example is depicted in
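By way of illustration only, the following sketch shows one of the transmission options mentioned above (an HTTP POST accompanied by credentials). The endpoint URL, credential scheme and field names are assumptions, and the third-party "requests" library is used for brevity.

import json
import requests

def upload_image(image_path: str, metadata: dict,
                 url: str = "https://appserver.example.com/images",
                 credentials: tuple = ("user", "password")) -> int:
    # Transmit the image data and its corresponding metadata; the server
    # verifies the supplied credentials before accepting the upload.
    with open(image_path, "rb") as f:
        response = requests.post(
            url,
            files={"image": f},
            data={"metadata": json.dumps(metadata)},
            auth=credentials,
            timeout=30,
        )
    return response.status_code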
According to one embodiment, memorandum (memo) and/or audio data may be acquired to supplement image data. Memorandum data may be automatically acquired by data acquisition component 125, for example, by scanning encoded data associated with the one or more objects in the acquired image. For example, a user of mobile device 102 may scan a bar code or QR code attached to or otherwise associated with the one or more objects, or by scanning a bar code or QR code associated with a patient, e.g., via a patient bracelet or a patient identification card. Memorandum data may be manually specified by a user of mobile device 102, for example, by selecting from one or more specified options, e.g., via pull-down menus or lists, or by entering alphanumeric characters and/or character strings.
Audio data may be acquired, for example, by image acquisition application 128 invoking functionality provided by operating system 136 and/or other applications 126 and microphone 130. The acquisition of audio data may be initiated by user selection of a graphical user interface control or other control on mobile device 102. For example, a user may initiate the acquisition of audio data at or around the time of acquiring one or more images to supplement the one or more images. As described in more detail hereinafter, audio data may be processed by transcription application 166 to provide an alphanumeric representation of the audio data.
Memorandum data and/or audio data may be stored locally on mobile device 102, for example, in memory 142, and/or may be transmitted by mobile device 102 for storage and/or processing on one or more of application server 104, EMR system 106, other services 108 or client device 110. Memorandum data may be stored as part of metadata 148, 174. Audio data may be stored locally on mobile device 102 as audio data 146 and on application server 104 as audio data 172. In addition, memorandum data and/or audio data may be transmitted separate from or with image data, e.g., as an attachment, embedded, etc.
Various approaches are provided for managing image data. According to one embodiment, image management application 164 provides a user interface for managing image data. The user interface may be implemented, for example, as a Web-based user interface. In this example, a client device, such as client device 110, accesses image management application 164 and the user interface is implemented by one or more Web pages provided by image management application 164 to client device 110.
The unknown images queue accessed via control 618 includes images for which there is incomplete information or other problems, which may occur for a variety of reasons. For example, a particular image may have insufficient metadata to associate the particular image with other images. As another example, a particular image may be determined to not satisfy specified quality criteria, such as sharpness, brightness, etc. Users may perform processing on images in the unknown images queue to complete missing information and/or address problems with the images. For example, a user may edit the metadata for a particular image in the unknown images queue to supply missing data for the particular image. As another example, a user may process images in the unknown images queue to address quality issues, such as poor focus, insufficient brightness or color contrast, etc. The images may then be approved and moved to the approval queue or rejected and moved to the rejected queue.
According to one embodiment, images are displayed to a user using a historical view. In general, a historical view displays a sequence of images that includes a reference image and one or more other images acquired using the reference image as a background image as described herein.
In the example depicted in
One or more graphical user interface controls may be provided to arrange the image sequences by a type of information selected, e.g., user identification, organization, event, subject, date/time, etc. The graphical user interface controls may also allow a user to enter particular criteria and have the image sequences that correspond to the particular criteria be displayed. In the example depicted in
The images 802-808 include a reference image 802 and three subsequent images acquired using the reference image 802, namely, Image 1 (804), Image 2 (806) and Image 3 (808). In this example, Image 1 (804), Image 2 (806) and Image 3 (808) were acquired using the reference image 802 displayed on the mobile device 102 as a background image, as previously described herein. In addition, the images 802-808 are arranged on historical view screen 800 in chronological order, based upon the timestamp or other associated metadata, starting with the reference image 802, followed by Image 1 (804), Image 2 (806) and Image 3 (808).
Historical view screen 800 also includes controls 810 for managing displayed images 802-808 and information about a user that corresponds to the images 802-808, which in the present example is represented by patient information 812. Image history information 814 displays metadata for images 802-808. In the example depicted in
Controls 816 allow a user to play an audio recording that corresponds to the displayed image and a control 818 allows a user to view an alphanumeric transcription of the audio recording that corresponds to the displayed image.
The historical view approach for displaying a sequence of images that includes a reference image and one or more other images that were acquired using the reference image as a background image and at approximately the same distance is very beneficial for discerning changes over time in the one or more objects captured in the images. For example, the approach allows medical personnel to view changes over time of a wound or surgical site. As another example, the approach allows construction personnel to monitor progress of a project, or identify potential problems, such as cracks, improper curing of concrete, etc. As yet another example, the approach allows a user to monitor changes in natural settings, for example, to detect beach or ground erosion.
According to one embodiment, access to images acquired using mobile devices is managed using roles. Images acquired by a mobile device are assigned one or more logical entities. Users are also assigned one or more roles. The term “role” is used herein to refer to a logical entity and users may have any number of roles. As described in more detail hereinafter, a role for a user may specify one or more logical entities assigned to the user, as well as additional information for the user, such as one or more workflows assigned to the user. Users are allowed to access image data for which they have been assigned the required logical entities. The approach provides a flexible and extensible system for managing access to image data and is particularly beneficial in situations when images contain sensitive information. The approach may be used to satisfy business organization policies/procedures and legal and regulatory requirements. The approaches described herein are applicable to any type of logical entity. Examples of logical entities include, without limitation, a business organization, or a division, department, group or team of a business organization.
In step 902, an image is acquired by a client device. For example, a user of mobile device 102 may acquire an image using image acquisition application 128 and metadata for the acquired image is generated. As previously described herein, the metadata for the acquired image may specify the camera settings used to acquire the image, as well as memorandum data for the image. According to one embodiment, metadata for the acquired image specifies one or more logical entities assigned to the acquired image. The one or more logical entities may be specified in a wide variety of ways that may vary depending upon a particular implementation. For example, mobile device 102 may be configured to automatically assign one or more particular logical entities to images captured by mobile device 102. This may be useful, for example, when mobile device 102 is associated with a particular logical entity, such as a department of a business organization, so that images captured with the mobile device 102 are automatically assigned to the department of the business organization. Alternatively, logical entities may be specified by a user of the mobile device. For example, a user of mobile device 102 may manually specify one or more logical entities to be assigned to a captured image. This may be accomplished by the user selecting particular logical entities from a list of available logical entities. For example, image acquisition application 128 may provide graphical user interface (GUI) controls for selecting logical entities. As another example, mobile device 102 may include manual controls that can be used to select logical entities. Alternatively, a user may manually enter data, such as the names, IDs, etc., of one or more logical entities to be assigned to an acquired image. As another example, a user of a mobile device may use the mobile device to scan encoded data to assign one or more logical entities to an acquired image. For example, a user may use data acquisition component 125 of mobile device 102 to scan encoded data that corresponds to one or more logical entities. Logical entities may be assigned to images in a similar manner for other types of image acquisition devices. For example, images acquired by a scanning device, MFP or camera may be assigned logical entities by a user of the scanning device, MFP or camera, e.g., via a graphical user interface or controls provided by the scanning device, MFP or camera.
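By way of illustration only, the following sketch shows a device-level default merged with a user selection, as described above. The entity names and data shapes are hypothetical, not part of any embodiment.

DEVICE_DEFAULT_ENTITIES = ["Emergency Room"]  # e.g., configured per device

def assign_logical_entities(metadata: dict, user_selected=None) -> dict:
    # Merge the device's automatically assigned logical entities with any
    # entities selected manually (or scanned) by the user.
    entities = set(DEVICE_DEFAULT_ENTITIES) | set(user_selected or [])
    metadata["logical_entities"] = sorted(entities)
    return metadata

meta = assign_logical_entities({"image_id": "img-0002"}, ["Pediatrics"])
print(meta["logical_entities"])  # ['Emergency Room', 'Pediatrics']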
In step 904, the acquired image and metadata for the acquired image are transmitted to application server 104. For example, image acquisition application 128 on mobile device 102 may cause the acquired image and corresponding metadata to be transmitted to application server 104 and stored in storage 168. The location where the image data and metadata are stored may be automatically configured in mobile device 102 or the location may be specified by a user, for example, by selecting one or more locations via a GUI displayed by image acquisition application 128. Image data and metadata may be immediately transmitted to application server 104 as soon as the image data and metadata are acquired. Alternatively, image data and metadata may be stored locally on mobile device 102 and transmitted to application server 104 when requested by a user. This may allow a user an opportunity to select particular images, and their corresponding metadata, that are to be transmitted to application server 104.
In step 906, a user wishing to view images acquired by mobile device 102 accesses image management application 164. For example, a user of client device 110 accesses image management application 164 on application server 104. The user of client device 110 may be the same user that acquired the images using mobile device 102, or a different user. As previously described herein, users may be required to be authenticated before being allowed to access image management application 164. For example, as depicted later herein with respect to
In step 908, the user requests to access image data. As previously described herein, users may access images in a wide variety of ways, e.g., via dashboard 610 to access logical collections of images, such as Approval Queue, Rejected Queue, Unknown Queue, etc.
In step 910, a determination is made whether the user is authorized to access the requested image data using logical entities. According to one embodiment, this includes determining one or more roles, i.e., logical entities, assigned to the user and determining one or more logical entities assigned to the image data that the user requested to access. The determination whether the user is authorized to access the requested image data is then made based upon the one or more roles, i.e., logical entities, assigned to the user and the one or more logical entities assigned to the image data that the user requested to access. Consider an example in which a particular image has been acquired via mobile device 102 and stored on application server 104, and a particular user wishes to access the particular image. After being authenticated to access image management application 164 and requesting access to the particular image, one or more roles, i.e., logical entities, assigned to the user and one or more logical entities assigned to the particular image are determined. According to one embodiment, if any of the one or more roles, i.e., logical entities, assigned to the user match the one or more logical entities assigned to the particular image, then the user is granted access to the particular image. For example, suppose that the particular image has been assigned the logical entities “Emergency Room” and “Pediatrics.” In this example, if the particular user has been assigned either the role, i.e., logical entity, “Emergency Room” or “Pediatrics,” then in step 912, the user is granted access to the particular image. Otherwise, in step 912, the user is not granted access to the particular image.
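By way of illustration only, the role check of step 910 may be sketched as follows, assuming roles and image logical entities are represented as plain sets of strings. The names are illustrative, not part of any embodiment.

def user_may_access(user_roles: set, image_entities: set) -> bool:
    # Grant access when any role (logical entity) assigned to the user
    # matches any logical entity assigned to the image.
    return bool(user_roles & image_entities)

image_entities = {"Emergency Room", "Pediatrics"}
print(user_may_access({"Pediatrics"}, image_entities))  # True: access granted
print(user_may_access({"Radiology"}, image_entities))   # False: access denied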
User data may be stored on application server 104, for example, in user data 176 on storage 168. Alternatively, user data may be stored remotely with respect to application server 104 and accessed by image management application 164, for example, via network 112. User data 176 may be managed by image management application 164 and according to one embodiment, image management application 164 provides a user interface that allows users, such as an administrator, to define and update user data.
According to one embodiment, access to workflows to process images acquired using mobile devices is managed using roles. The term “workflow” is used herein to refer to a process for processing images acquired by mobile devices and the processes may be provided, for example, by image management application 164. Example processes include, without limitation, processes for approving, rejecting and updating images, and viewing historical views of images, as described herein. Users are authorized to access particular workflows, as specified by user data. When a particular user requests access to a particular process for processing images acquired by mobile devices, a determination is made, based upon the user data for the user, whether the user is authorized to access the particular process to process images acquired by mobile devices. The user is granted or not granted access based upon the determination.
Further access control may be provided using roles. More specifically, user data and roles may be used to limit access by a user to a particular workflow and particular images. For example, as described in more detail hereinafter, a request for a user to process a particular image using a particular workflow (or a request to access the particular workflow to process the particular image) may be verified based upon both whether the user is authorized to access the particular workflow and whether the user is authorized to access the particular image. In addition, workflow levels may be used to manage access to particular functionality within a workflow. Thus, different levels of access granularity may be provided, depending upon a particular implementation.
A. Access Levels
In Level 2, a user is granted access to a particular workflow and images that are particular to the workflow. Level 2 differs from Level 1 in that a user is not granted access to all images using the workflow, but only images that are particular to the workflow. For example, a user may be granted access to the process for viewing and approving or rejecting images, but only with respect to images that are particular to the particular workflow. For Level 2, the user's role and image metadata, pertaining to associated workflows, are used as access criteria. More specifically, the user's data must specify that the user is authorized to access the particular workflow and also the metadata for the images must specify that the images are associated with the particular workflow. In this example, the user data 176 for the user must specify that the user is authorized to access the process for viewing and approving or rejecting images and the metadata for the images must specify that the images are associated with the process for viewing and approving or rejecting images. Access is not allowed for images that are not associated with the particular workflow.
In Level 3, a user is granted access to a particular workflow and images that are particular to logical entities that the user is allowed to access. For example, a user may be granted access to a process for viewing and approving or rejecting images, but only with respect to images that are particular to a particular logical entity, such as a department within a business organization, that the user is authorized to access. For Level 3, the user's role and image metadata, pertaining to logical entities, are used as access criteria. More specifically, the user's data must specify that the user is authorized to access the particular workflow and a particular logical entity, e.g., a particular department of a business organization. Also, the metadata for the images must specify that the images are associated with the specified logical entity. In this example, the user data 176 for the user must specify that the user is authorized to access the process for viewing and approving or rejecting images and is authorized to access images for the particular department of the business organization. The metadata for the images must specify that the images are associated with the department within the business organization. Unlike Level 2, the images are not required to be associated with the workflow, i.e., the process for viewing and approving or rejecting images. Access is not allowed, however, for images that are not associated with the particular logical entity, i.e., the department within the business organization, that the user is authorized to access.
In Level 4, a user is granted access to a particular workflow and images that are particular to both the particular workflow and logical entities that the user is allowed to access. For example, a user may be granted access to the process for viewing and approving or rejecting images, but only with respect to images that are particular to both the process for viewing and approving or rejecting images and a logical entity, such as a department within a business organization, that is assigned to the user. For Level 4, the user's role and image metadata pertaining to associated workflows and logical entities are used as access criteria. More specifically, the user's data must specify that the user is authorized to access the particular workflow and one or more logical entities. The metadata for the images must specify that the images are associated with both the particular workflow and the one or more logical entities assigned to the user. Access is not allowed for images that are not associated with both the particular workflow and the one or more logical entities assigned to the user.
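By way of illustration only, the following sketch consolidates the access levels described above into a single check. Level 1 is inferred from the Level 2 description (workflow access plus access to all images); the data shapes and names are assumptions, not part of any embodiment.

def image_accessible(level: int, user: dict, image_meta: dict, workflow: str) -> bool:
    if workflow not in user["workflows"]:
        return False  # every level requires access to the workflow itself
    in_workflow = workflow in image_meta.get("workflows", [])
    in_entities = bool(set(user["entities"]) & set(image_meta.get("entities", [])))
    if level == 1:
        return True                       # all images
    if level == 2:
        return in_workflow                # images particular to the workflow
    if level == 3:
        return in_entities                # images particular to the user's logical entities
    if level == 4:
        return in_workflow and in_entities
    raise ValueError("unknown access level")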
The foregoing examples are depicted and described in the context of accessing a particular workflow, i.e., a process for processing images acquired by mobile device 102, but embodiments are not limited to these example processes and are applicable to any types of processes. In addition, the approach is applicable to workflows implemented by other processes executing on application server 104 and also to workflows implemented remote to application server 104. In this context, image management application 164 may act as a gatekeeper to processes executing remote to image management application 164.
B. Workflow Levels
According to one embodiment, a workflow may have any number of workflow levels, where each workflow level represents a part of the workflow process. Workflow levels provide additional granularity for managing access to workflows because users may be given selective access to some workflow levels within a workflow, but not other workflow levels in the same workflow. For example, as previously described herein with respect to
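By way of illustration only, selective access to workflow levels within a single workflow may be sketched as follows, assuming user data records the workflow levels a user may access. The workflow and level names are hypothetical, not part of any embodiment.

user_workflow_levels = {
    "image-approval": {"review", "annotate"},  # but not, e.g., "approve"
}

def may_use_level(user_levels: dict, workflow: str, level: str) -> bool:
    # A user may access a workflow level only if that level appears in
    # the user's data for the workflow.
    return level in user_levels.get(workflow, set())

print(may_use_level(user_workflow_levels, "image-approval", "review"))   # True
print(may_use_level(user_workflow_levels, "image-approval", "approve"))  # False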
Although the flow diagrams of the present application depict a particular set of steps in a particular order, other implementations may use fewer or more steps, in the same or different order, than those depicted in the figures.
According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
Computer system 2100 may be coupled via bus 2102 to a display 2112, such as a cathode ray tube (CRT), for displaying information to a computer user. Although bus 2102 is illustrated as a single bus, bus 2102 may comprise one or more buses. For example, bus 2102 may include without limitation a control bus by which processor 2104 controls other devices within computer system 2100, an address bus by which processor 2104 specifies memory locations of instructions for execution, or any other type of bus for transferring data or signals between components of computer system 2100.
An input device 2114, including alphanumeric and other keys, is coupled to bus 2102 for communicating information and command selections to processor 2104. Another type of user input device is cursor control 2116, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 2104 and for controlling cursor movement on display 2112. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
Computer system 2100 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic or computer software which, in combination with the computer system, causes or programs computer system 2100 to be a special-purpose machine. According to one embodiment, those techniques are performed by computer system 2100 in response to processor 2104 processing instructions stored in main memory 2106. Such instructions may be read into main memory 2106 from another computer-readable medium, such as storage device 2110. Processing of the instructions contained in main memory 2106 by processor 2104 causes performance of the functionality described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the embodiments. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
The term “computer-readable medium” as used herein refers to any medium that participates in providing data that causes a computer to operate in a specific manner. In an embodiment implemented using computer system 2100, various computer-readable media are involved, for example, in providing instructions to processor 2104 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 2110. Volatile media includes dynamic memory, such as main memory 2106. Common forms of computer-readable media include, without limitation, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip, memory cartridge or memory stick, or any other medium from which a computer can read.
Various forms of computer-readable media may be involved in storing instructions for processing by processor 2104. For example, the instructions may initially be stored on a storage medium of a remote computer and transmitted to computer system 2100 via one or more communications links. Bus 2102 carries the data to main memory 2106, from which processor 2104 retrieves and processes the instructions. The instructions received by main memory 2106 may optionally be stored on storage device 2110 either before or after processing by processor 2104.
Computer system 2100 also includes a communication interface 2118 coupled to bus 2102. Communication interface 2118 provides a communications coupling to a network link 2120 that is connected to a local network 2122. For example, communication interface 2118 may be a modem to provide a data communication connection to a telephone line. As another example, communication interface 2118 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 2118 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Network link 2120 typically provides data communication through one or more networks to other data devices. For example, network link 2120 may provide a connection through local network 2122 to a host computer 2124 or to data equipment operated by an Internet Service Provider (ISP) 2126. ISP 2126 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 2128. Local network 2122 and Internet 2128 both use electrical, electromagnetic or optical signals that carry digital data streams.
Computer system 2100 can send messages and receive data, including program code, through the network(s), network link 2120 and communication interface 2118. In the Internet example, a server 2130 might transmit a requested code for an application program through Internet 2128, ISP 2126, local network 2122 and communication interface 2118. The received code may be processed by processor 2104 as it is received, and/or stored in storage device 2110, or other non-volatile storage for later execution.
In the foregoing specification, embodiments have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is, and is intended by the applicants to be, the invention is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Hence, no limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
This application is related to U.S. patent application Ser. No. 14/543,712 (Attorney Docket No. 49986-0811) entitled IMAGE ACQUISITION AND MANAGEMENT, filed Nov. 17, 2014, and U.S. patent application Ser. No. 14/543,725 (Attorney Docket No. 49986-0817) entitled IMAGE ACQUISITION AND MANAGEMENT, filed Nov. 17, 2014, the contents of all of which are incorporated by reference in their entirety for all purposes as if fully set forth herein.