US8700685B2 - Allocation of assessments - Google Patents
Allocation of assessments
- Publication number
- US8700685B2 (application US13/153,982; US201113153982A)
- Authority
- US
- United States
- Prior art keywords
- financial institution
- workload
- assessor
- assessment
- assessors
- Prior art date
- 2011-06-06
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires 2032-03-27
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
Definitions
- IS: Information security
- before granting a third party access to its information, an organization may perform an IS assessment of the third party's information security procedures.
- an IS team may be composed of a group of human assessors, where one or more assessors are assigned to review the IS procedures of a particular third party.
- the assessor may work with the third party to remedy any potential security gaps in IS procedures prior to granting access to the information.
- an assessor is working on a predetermined number of assessments at the same time.
- the assessor with the fewest number of pending assessments is typically assigned the new assessment.
- the amount of time and effort required to complete an assessment, however, may vary significantly from assessment to assessment.
- the disclosure provides, inter alia, an improved manner of assigning assessments to assessors.
- apparatuses, computer readable media, methods, and systems are described for processing a workload record for each of a plurality of assessors, each of the workload records identifying an assessment previously assigned to a particular one of the assessors, calculating a complexity score for each of the assessments, calculating a workload index for each of the assessors based on the complexity score of the assessment previously assigned to that assessor, and assigning a new assessment to a particular one of the assessors based on the workload indexes.
- apparatuses, computer readable media, methods, and systems are described for processing a workload record for each of a plurality of assessors, at least one of the workload records identifying an assessment previously assigned to a particular one of the assessors, calculating a complexity score for the assessment, calculating a workload index for each of the assessors, wherein one of the workload indexes is based on the complexity score, and assigning a new assessment to a particular one of the assessors based on the workload indexes.
- aspects of the embodiments may be provided in at least one computer-readable medium and/or memory storing computer-executable instructions that, when executed by at least one processor, cause a computer or other apparatus to perform one or more of the process steps described herein.
- FIG. 1 shows an illustrative operating environment in which various aspects of the disclosures may be implemented in accordance with example embodiments.
- FIG. 2 is an illustrative block diagram of workstations and servers that may be used to implement the processes and functions of certain aspects of the present disclosure in accordance with example embodiments.
- FIGS. 3-4 illustrate an example assessment complexity score table identifying complexity factors for determining a complexity score for a new assessment to be assigned in accordance with example embodiments.
- FIGS. 5-6 illustrate example variables for determining a workload index of an assessor in accordance with example embodiments.
- FIG. 7 illustrates a formulas table in accordance with example embodiments.
- FIG. 8 illustrates an example flow diagram of a method for allocation of assessments in accordance with example embodiments.
- FIG. 1 illustrates an example of a suitable computing system environment 100 that may be used according to one or more illustrative embodiments.
- the computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality contained in the disclosure.
- the computing system environment 100 should not be interpreted as having any dependency or requirement relating to any one or combination of components shown in the illustrative computing system environment 100 .
- the disclosure is operational with numerous other general purpose or special purpose computing system environments or configurations.
- Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the disclosed embodiments include, but are not limited to, personal computers (PCs), server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
- the computing system environment 100 may include a computing device 101 wherein the processes discussed herein may be implemented.
- the computing device 101 may have a processor 103 for controlling overall operation of the computing device 101 and its associated components, including random-access memory (RAM) 105 , read-only memory (ROM) 107 , communications module 109 , and memory 115 .
- Computing device 101 typically includes a variety of computer readable media.
- Computer readable media may be any available media that may be accessed by computing device 101 and include both volatile and nonvolatile media, removable and non-removable media.
- computer readable media may comprise a combination of computer storage media and communication media.
- Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media include, but are not limited to, random access memory (RAM), read only memory (ROM), electronically erasable programmable read only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by computing device 101 .
- Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- Modulated data signal includes a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
- Computing system environment 100 may also include optical scanners (not shown).
- Exemplary usages include scanning and converting paper documents, e.g., correspondence, receipts, and the like to digital files.
- RAM 105 may include one or more applications representing the application data stored in RAM 105 while the computing device is on and corresponding software applications (e.g., software tasks) are running on the computing device 101 .
- Communications module 109 may include a microphone, keypad, touch screen, and/or stylus through which a user of computing device 101 may provide input, and may also include one or more of a speaker for providing audio output and a video display device for providing textual, audiovisual and/or graphical output.
- Software may be stored within memory 115 and/or storage to provide instructions to processor 103 for enabling computing device 101 to perform various functions.
- memory 115 may store software used by the computing device 101 , such as an operating system 117 , application programs 119 , and an associated database 121 .
- some or all of the computer executable instructions for computing device 101 may be embodied in hardware or firmware.
- Computing device 101 may operate in a networked environment supporting connections to one or more remote computing devices, such as computing devices 141 , 151 , and 161 .
- the computing devices 141 , 151 , and 161 may be personal computing devices or servers that include many or all of the elements described above relative to the computing device 101 .
- Computing device 161 may be a mobile device communicating over wireless carrier channel 171 .
- the network connections depicted in FIG. 1 include a local area network (LAN) 125 and a wide area network (WAN) 129 , but may also include other networks.
- computing device 101 may be connected to the LAN 125 through a network interface or adapter in the communications module 109 .
- the computing device 101 may include a modem in the communications module 109 or other means for establishing communications over the WAN 129 , such as the Internet 131 or other type of computer network. It will be appreciated that the network connections shown are illustrative and other means of establishing a communications link between the computing devices may be used.
- one or more application programs 119 used by the computing device 101 may include computer executable instructions for invoking user functionality related to communication including, for example, email, short message service (SMS), and voice input and speech recognition applications.
- Embodiments of the disclosure may include forms of computer-readable media.
- Computer-readable media include any available media that can be accessed by a computing device 101 .
- Computer-readable media may comprise storage media and communication media and in some examples may be non-transitory.
- Storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, object code, data structures, program modules, or other data.
- Communication media include any information delivery media and typically embody data in a modulated data signal such as a carrier wave or other transport mechanism.
- aspects described herein may be embodied as a method, a data processing system, or as a computer-readable medium storing computer-executable instructions.
- a computer-readable medium storing instructions to cause a processor to perform steps of a method in accordance with aspects of the disclosed embodiments is contemplated.
- aspects of the method steps disclosed herein may be executed on a processor on a computing device 101 .
- Such a processor may execute computer-executable instructions stored on a computer-readable medium.
- system 200 may include one or more workstation computers 201 .
- Workstations 201 may be local or remote, and may be connected by one of communications links 202 to computer network 203 that is linked via communications links 205 to server 204 .
- server 204 may be any suitable server, processor, computer, or data processing device, or combination of the same. Server 204 may be used to process the instructions received from, and the transactions entered into by, one or more participants.
- Computer network 203 may be any suitable computer network including the Internet, an intranet, a wide-area network (WAN), a local-area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), or any combination of any of the same.
- Communications links 202 and 205 may be any communications links suitable for communicating between workstations 201 and server 204 , such as network links, dial-up links, wireless links, hard-wired links, and the like.
- a new assessment may be to review the IS procedures of a particular third party supplier that has access to or will be granted access to an organization's data.
- An assessment may be any type of work project that can be assigned to one or more members of a team.
- the methodology may consider a complexity of the new assessment, as well as a workload index of each assessor that is based on a total number and the complexity of assessments previously assigned to an assessor.
- a computing device 101 may be associated with an organization, such as a financial institution, a bank, a credit union, a company, or other entity. Such an organization may be interested, for example, in maintaining information security.
- the computing device 101 may receive data on parameters of a new assessment to be assigned to a particular one of a group of assessors.
- the computing device 101 may process the data to determine a complexity score for the assessment.
- the computing device 101 may also retrieve workload records for each of the assessors to determine a workload index for each.
- the workload index may be a numerical score indicating a workload of an assessor, for comparison to a workload index of each of the other assessors. Based on the workload index and on the complexity score for the new assessment, the computing device 101 may determine to which of the assessors to assign the new assessment, as described below in further detail.
- FIGS. 3-4 illustrate an example assessment complexity score table identifying complexity factors for determining a complexity score for a new assessment to be assigned or for a previously assigned assessment.
- the following discusses a new assessment, but the concepts may also be used for determining a complexity score for a previously assigned assessment.
- FIG. 3 provides a table 300 including definitions of example complexity factors
- FIG. 4 illustrates a table 400 having example values for the complexity factors.
- Example complexity factors may be a tier factor, an information security and business continuity supplier tiering and risk tool score factor, and a historical performance factor.
- the historical performance factor may consider a supplier's historical responsiveness when addressing IS issues over a time period.
- FIG. 3 indicates that the historical performance factor is for a previous year (e.g., 2010).
- the historical performance factor may consider any desired length of time.
- other complexity factors may be used, some of the complexity factors listed in table 300 may be omitted, and additional complexity factors may also be used.
- a tier complexity factor may define a level of supplier risk and performance management necessary for a supplier relationship.
- the tier complexity factor may be assigned one of multiple values by supply chain management and/or may be determined by the computing device 101 . Categorizing a supplier into a tier may be based on the service provided to the organization by the supplier, and the extent to which the supplier has access to the organization's confidential information. For example, a supplier having access to important information, such as customer lists and billing information, as well as rights to modify such information, may be placed in a higher tier (e.g., tier 1 ).
- a supplier that merely has access to read important information, but not to modify such information may be placed in a lower tier (e.g., tier 2 ).
- a supplier that does not have access to important information, but does have access to other types of information, may be placed in a lowest tier (e.g., tier 3 ). Any number of tiers may be used, depending on a desired level of granularity between the tiers.
- the tier complexity factor may be used to classify a supplier into one of tier 1 , tier 2 , tier 3 , and untiered. Each tier may have a rating and a weightage. Computing device 101 may assign one of these tier values to the new assessment. Based on the tier value assigned, computing device 101 may identify the associated rating and weightage. For example, computing device 101 may assign the new assessment to tier 1 , which has a rating of 4 and a weightage of 4. As described in further detail below, computing device 101 may determine a complexity score for the new assessment that is a function of the rating and weightage associated with each complexity factor.
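As a rough Python sketch of the tiering just described: the classification logic follows the criteria above, while the function name, signature, and the tier 2, tier 3, and untiered rating/weightage values are assumed placeholders (only tier 1's rating of 4 and weightage of 4 is stated in the text).

```python
# A minimal sketch of the tier classification criteria described above.
def supplier_tier(can_modify_important_info: bool,
                  can_read_important_info: bool) -> int:
    """Place a supplier in tier 1, 2, or 3 based on its access rights."""
    if can_modify_important_info:
        return 1  # access to important information plus rights to modify it
    if can_read_important_info:
        return 2  # read-only access to important information
    return 3      # access only to other types of information

# (rating, weightage) per tier. Only tier 1's (4, 4) is stated in the text;
# the tier 2, tier 3, and untiered values are placeholders for illustration.
TIER_FACTORS = {1: (4, 4), 2: (3, 3), 3: (2, 2), "untiered": (1, 1)}
```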
- computing device 101 may determine a rating and weightage for an information security and business continuity supplier tiering and risk tool score factor for the new assessment.
- Computing device 101 may apply a supplier tiering and risk tool computer application to identify and measure supplier information security risk.
- the supplier tiering and risk tool may determine what level of due diligence is needed for a particular supplier.
- the supplier tiering and risk tool may return a determination of information security for a supplier as well as a business continuity score of the supplier.
- each of information security and business continuity may be assigned a value of high, medium, or low. Other values may also be used.
- computing device 101 may determine a rating and weightage for the information security and business continuity supplier tiering and risk tool score complexity factor.
- for example, if both the information security and business continuity scores for the supplier are high, computing device 101 may assign a rating of 4 and a weightage of 3 to the information security and business continuity supplier tiering and risk tool score complexity factor. If information security for the supplier is high and the business continuity score is medium or low, computing device 101 may assign a rating of 3 and a weightage of 3 to that factor, and so forth through the table 400.
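A minimal sketch of this lookup, assuming a simple function form; only the two combinations spelled out above come from table 400, and the remaining branches (and the names) are illustrative assumptions:

```python
RISK_TOOL_WEIGHTAGE = 3  # weightage stated for this factor in the text

def risk_tool_rating(info_security: str, business_continuity: str) -> int:
    """Rating for the IS/BC supplier tiering and risk tool score factor."""
    if info_security == "high" and business_continuity == "high":
        return 4  # stated above
    if info_security == "high":
        return 3  # stated above: IS high, BC medium or low
    # The remaining combinations are assumptions, not values from table 400.
    if info_security == "medium":
        return 2
    return 1
```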
- computing device 101 may determine a rating and weightage for the historical performance factor for the new assessment.
- the historical performance factor may be based on a historical performance of how efficiently a supplier has remedied IS issues, as well as how well a supplier complied with IS procedures specified by the organization.
- the historical performance factor may be composed of a number of subfactors. Subfactors may be findings, submission days, remediation accepted days, number of days for closures, and risk acceptance.
- the findings subfactor may indicate a total number of IS findings (i.e., IS issues identified) for the supplier during a time period of interest for the historical performance factor.
- a finding may indicate how well a supplier has historically complied with the IS procedures of the organization.
- a finding may also indicate that the supplier did not provide a particular IS control or violated an IS control during the historical time period of interest. Examples of findings may include a lack of a firewall for a computer network, and failing to periodically review an activity log of a firewall.
- FIGS. 3-4 list the previous year as the time period of interest.
- the findings subfactor may include a threshold of 10 findings, where the computing device 101 may assign a rating of 1 and a weightage of 2 if a supplier has 10 or fewer findings in the previous year, and may assign a rating of 2 and a weightage of 2 if greater than 10 findings.
- the submission days subfactor may be the number of days taken during the previous year to submit an IS questionnaire.
- the organization, upon deciding to use a supplier, may require that the supplier periodically fill out an IS questionnaire available online.
- the IS questionnaire may be used to determine what IS controls the supplier has implemented and/or any changes in IS controls over time.
- the submission days subfactor may include a threshold of 28 days, where computing device 101 may assign a rating of 1 and a weightage of 2 if a supplier submitted the IS questionnaire in 28 or fewer days in the previous year, and may assign a rating of 2 and a weightage of 2 if greater than 28 days.
- the remediation accepted days subfactor may indicate the number of days during the time period of interest the supplier required to accept a proposed remediation.
- an assessor may have proposed a certain change in an IS procedure of the supplier, and the remediation accepted days subfactor may indicate how long the supplier took to implement the change.
- the remediation accepted days subfactor may include a threshold of 60 days, where the computing device 101 may assign a rating of 1 and a weightage of 2 if a supplier required 60 or fewer days to accept the remediation during the previous year, and may assign a rating of 2 and a weightage of 2 if greater than 60 days.
- the number of days for assessment closure subfactor may indicate how long an assessment took to close after opening. This subfactor may, for example, indicate the total number of days from a first day when an assessment is initiated to a last day when the assessment is completed.
- the number of days for the closure subfactor may include a threshold of 208 days, where the computing device 101 may assign a rating of 1 and a weightage of 2 if the assessment was closed in 208 or fewer days during the previous year, and may assign a rating of 2 and a weightage of 2 if greater than 208 days.
- the risk acceptance subfactor may indicate whether a risk acceptance process was initiated during the time period of interest.
- the computing device 101 may initiate a risk acceptance process if the supplier fails to remediate a finding within a predetermined amount of time.
- the risk acceptance process may be a determination by the organization of whether to accept that the supplier has not remediated a finding, to discontinue using the supplier, and/or to limit the information to which the supplier has access.
- the computing device 101 may assign a rating of 2 and a weightage of 2 if a risk acceptance process was initiated due to the supplier failing to remediate one or more findings during the time period of interest, and may assign a rating of 1 and a weightage of 2 if no risk acceptance processes were initiated for the supplier.
- the computing device 101 may determine a rating for the historical performance factor that may be an average of the ratings determined for the subfactors.
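A minimal Python sketch of the subfactor scoring and averaging: the thresholds, ratings, and weightage of 2 come from the text above, while the function names and signatures are hypothetical.

```python
SUBFACTOR_WEIGHTAGE = 2  # every historical performance subfactor weighs 2

def threshold_rating(value: float, threshold: float) -> int:
    """Rate 1 when the value is at or below the threshold, otherwise 2."""
    return 1 if value <= threshold else 2

def historical_performance_rating(findings: int,
                                  submission_days: int,
                                  remediation_accepted_days: int,
                                  closure_days: int,
                                  risk_acceptance_initiated: bool) -> float:
    """Average of the five subfactor ratings, as described above."""
    ratings = [
        threshold_rating(findings, 10),                   # findings
        threshold_rating(submission_days, 28),            # submission days
        threshold_rating(remediation_accepted_days, 60),  # remediation accepted
        threshold_rating(closure_days, 208),              # days for closure
        2 if risk_acceptance_initiated else 1,            # risk acceptance
    ]
    return sum(ratings) / len(ratings)

# Example: a supplier over every threshold rates 2 on each subfactor,
# giving an average rating of 2 for the historical performance factor.
print(historical_performance_rating(12, 30, 65, 210, True))  # 2.0
```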
- the computing device 101 may determine a complexity score for the new assessment as a function of the ratings assigned to each of the factors and the weightages.
- An example equation for determining the complexity score is shown in the first row of the formulas table 700 in FIG. 7 .
- a tier factor may have a rating of 4 and a weightage of 4
- an information security business continuity supplier tiering and risk tool score factor may have a rating of 3 and a weightage of 3
- a historical performance factor may have a rating of 2 (e.g., if all subfactors have a rating of 2, then they will have an average of 2) and a weightage of 2.
- the computing device 101 may multiply each rating by the corresponding weightage to determine a number of complexity points for each factor, and then sum the complexity points to determine the complexity score.
- the complexity score is 29 (i.e., 4×4+3×3+2×2).
- the complexity score may be based on each subfactor of the historical performance factor, rather than using an average of the ratings assigned to the subfactors. In that case, the complexity score from the previous example is 45 (i.e., 4×4+3×3+2×2+2×2+2×2+2×2+2×2).
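Putting the factors together, a short sketch of the complexity score formula (first row of the formulas table 700) that reproduces both worked totals; the helper name is illustrative:

```python
def complexity_score(factors):
    """factors: iterable of (rating, weightage) pairs for one assessment."""
    return sum(rating * weightage for rating, weightage in factors)

# Worked example from the text: tier (4, 4), risk tool score (3, 3), and
# historical performance (2, 2) using the averaged subfactor rating.
print(complexity_score([(4, 4), (3, 3), (2, 2)]))  # 29

# Alternative from the text: score each of the five historical performance
# subfactors separately instead of averaging them.
print(complexity_score([(4, 4), (3, 3)] + [(2, 2)] * 5))  # 45
```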
- the computing device 101 may process a workload record of each assessor to determine a workload index for each assessor.
- the workload record may be stored in a database, a memory, or other storage device accessible by the computing device 101 .
- the workload record may include data on assessments that have been previously assigned to an assessor.
- a workload record may indicate, for the assessments that have been assigned to an assessor, in which phase each of the assessments is, a number of findings for all of the assessments, and a total assigned assessment complexity score of the already assigned assessments.
- the workload index may be a numerical determination of how busy a particular assessor is, based on previously assigned assessments. Based on the workload indexes, the computing device 101 may determine to which of the assessors to assign the new assessment.
- FIGS. 5-6 illustrate example variables for determining a workload index of an assessor.
- FIG. 5 provides a table 500 including definitions of example variables
- FIG. 6 illustrates a table 600 having example values for the variables.
- Example variables may include a total assessment complexity score variable, a phase of assigned assessments variable, and a number of findings variable. Other variables may be used, some of the variables in table 500 may be omitted, and additional variables may also be used.
- the total assessment complexity score variable may be a sum of individual complexity scores of the assessments previously assigned to a particular assessor.
- the computing device 101 , using the methodology described above with reference to FIGS. 3-4 , may compute a complexity score for each of the assessments previously assigned to an assessor, and may sum the complexity scores to determine a value for the total assessment complexity score variable.
- the total assessment complexity score variable may have a weightage of 1.
- a phase of the assigned assessments variable may indicate the phase in which each of the previously assigned assessments is for an assessor.
- each assessment may be in one of three phases.
- An initiation phase may be where a supplier is reviewing and completing the IS questionnaire.
- a remediation planning phase may be where a supplier and assessor discuss any IS findings, and the supplier and the assessor agree on a remediation plan, if needed, to address the IS findings.
- a remediation phase may be where a supplier implements any missing IS controls.
- an assessor reviews the IS controls and, if satisfactory, closes the assessment. Otherwise, the assessor may continue working with the supplier until a satisfactory IS control is implemented, or the supplier fails to implement a requested IS control.
- the phase of assigned assessments variable may have a weightage for each phase; consistent with the worked example below, the initiation phase may have a weightage of 3, the remediation planning phase a weightage of 2, and the remediation phase a weightage of 1.
- the number of findings variable may indicate a number of findings at each risk level.
- the risk level for a finding may be high, medium, or low.
- a high finding, for example, may indicate that information could be compromised and that the finding must be mitigated.
- the assessor may also discuss the findings with the supplier, update comments in each finding, and track each finding until remediation has been completed. The assessor may also provide an opinion on the remediation of the finding.
- high risk findings are associated with a weightage of 3
- medium risk findings are associated with a weightage of 2
- low risk findings are associated with a weightage of 1.
- the computing device 101 may apply the formula listed in the second row of FIG. 7 .
- the computing device 101 may process a workload record of an assessor to determine that the assessor has 5 assessments in an initiation stage, 7 assessments in a remediation planning stage, 8 assessments in a remediation stage, 5 high risk findings, 10 medium risk findings, and 15 low risk findings.
- the assessor may also have a total complexity score of 105 for the assessments previously assigned to the assessor.
- the computing device 101 may determine a workload index of 192 for the assessor (i.e., 5×3+7×2+8×1+5×3+10×2+15×1+105).
- the computing device 101 may determine a workload index for each of multiple assessors, and may assign the new assessment to the assessor having the lowest value for their workload index. If an assessor has not been previously assigned any assessments, then the computing device 101 would determine a value of zero for the workload index of that assessor.
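A sketch of the workload index formula (second row of the formulas table 700), reproducing the worked example above; the names are illustrative, and the phase weightages of 3/2/1 are inferred from the arithmetic of the example:

```python
# Finding weightages (high 3, medium 2, low 1) are stated in the text;
# phase weightages (initiation 3, planning 2, remediation 1) are inferred.
PHASE_WEIGHTS = {"initiation": 3, "remediation_planning": 2, "remediation": 1}
FINDING_WEIGHTS = {"high": 3, "medium": 2, "low": 1}

def workload_index(phase_counts, finding_counts, total_complexity_score):
    """Weighted phase counts + weighted finding counts + total complexity."""
    phases = sum(PHASE_WEIGHTS[p] * n for p, n in phase_counts.items())
    findings = sum(FINDING_WEIGHTS[r] * n for r, n in finding_counts.items())
    return phases + findings + total_complexity_score

# Worked example from the text: 5/7/8 assessments by phase, 5/10/15 findings
# by risk level, and a total assigned complexity score of 105.
print(workload_index(
    {"initiation": 5, "remediation_planning": 7, "remediation": 8},
    {"high": 5, "medium": 10, "low": 15},
    105,
))  # 192
```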
- FIG. 8 illustrates an example flow diagram of a method for allocating a new assessment to an assessor, in accordance with an example embodiment.
- the method may be implemented by the computing device 101 or other device.
- the order of the blocks depicted in FIG. 8 may be rearranged, one or more blocks may be repeated in sequential and/or non-sequential order, and/or one or more blocks may be omitted. Further, other blocks may be added to the flow diagram.
- the method may begin at block 802 .
- the method may include processing a workload record for each of a plurality of assessors, each of the workload records identifying an assessment previously assigned to a particular one of the assessors.
- the computing device 101 may receive a listing of assessors, and retrieve a workload record for each of the assessors.
- Each of the workload records may identify how many assessments have been previously assigned to each assessor.
- a workload record may also indicate that no assessments have been assigned to an assessor.
- the method may include calculating, by a processor, a complexity score for each of the assessments.
- the computing device 101 may calculate a complexity score for each of the previously assigned assessments as described above with reference to FIGS. 3-4 and 7 .
- the method may include calculating a workload index for each of the assessors based on the complexity score of the assessment previously assigned to that assessor.
- the computing device 101 may calculate a workload index for each assessor in the manner discussed above with reference to FIGS. 5-7 .
- the method may include assigning a new assessment to a particular one of the assessors based on the workload indexes. For example, the computing device 101 may rank the workload indexes to identify the one having a lowest value, and assign the new assessment to the assessor associated with that workload index.
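Finally, a sketch of the assignment step that closes the flow of FIG. 8; the dictionary shape and assessor names are illustrative assumptions:

```python
def assign_new_assessment(workload_indexes):
    """workload_indexes: assessor name -> workload index, where an assessor
    with no previously assigned assessments has an index of zero. Returns
    the assessor who should receive the new assessment."""
    return min(workload_indexes, key=workload_indexes.get)

# Example with illustrative names and index values:
print(assign_new_assessment({"assessor_a": 192, "assessor_b": 140,
                             "assessor_c": 0}))  # assessor_c
```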
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/153,982 (US8700685B2) | 2011-06-06 | 2011-06-06 | Allocation of assessments |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/153,982 (US8700685B2) | 2011-06-06 | 2011-06-06 | Allocation of assessments |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20120310945A1 (en) | 2012-12-06 |
| US8700685B2 (en) | 2014-04-15 |
Family ID: 47262471
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/153,982 (US8700685B2, active, expires 2032-03-27) | Allocation of assessments | 2011-06-06 | 2011-06-06 |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US8700685B2 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6163607A (en) * | 1998-04-09 | 2000-12-19 | Avaya Technology Corp. | Optimizing call-center performance by using predictive data to distribute agents among calls |
| US7203655B2 (en) * | 2000-02-16 | 2007-04-10 | Iex Corporation | Method and system for providing performance statistics to agents |
| US20080162327A1 (en) | 2006-12-29 | 2008-07-03 | Cujak Mark D | Methods and systems for supplier quality management |
| US20100131341A1 (en) | 2000-12-27 | 2010-05-27 | International Business Machines Corporation | Gathering and disseminating quality performance and audit activity data in an extended enterprise environment |
| US20100198630A1 (en) | 2009-01-30 | 2010-08-05 | Bank Of America Corporation | Supplier risk evaluation |
Also Published As
| Publication number | Publication date |
|---|---|
| US20120310945A1 (en) | 2012-12-06 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: BANK OF AMERICA CORPORATION, NORTH CAROLINA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MITTA, KARTHIK REDDY;WALIA, SUSHEEL;REEL/FRAME:026415/0501. Effective date: 20110518 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551). Year of fee payment: 4 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 12 |