WO2016109608A1 - System for cyber insurance policy including cyber risk assessment/management service - Google Patents

System for cyber insurance policy including cyber risk assessment/management service

Info

Publication number
WO2016109608A1
Authority
WO
WIPO (PCT)
Prior art keywords
entity
risk
cyber
motivation
sophistication
Prior art date
Application number
PCT/US2015/067968
Other languages
French (fr)
Other versions
WO2016109608A9 (en)
Inventor
Arvind Parthasarathi
George Y. Ng
Original Assignee
Cyence Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cyence Inc. filed Critical Cyence Inc.
Priority to US15/099,297 priority Critical patent/US10341376B2/en
Priority to US15/141,779 priority patent/US9521160B2/en
Priority to US15/142,997 priority patent/US9699209B2/en
Publication of WO2016109608A1 publication Critical patent/WO2016109608A1/en
Publication of WO2016109608A9 publication Critical patent/WO2016109608A9/en
Priority to US15/371,047 priority patent/US10230764B2/en
Priority to US15/373,298 priority patent/US10050989B2/en
Priority to US15/374,212 priority patent/US10050990B2/en
Priority to US15/457,921 priority patent/US10218736B2/en
Priority to US15/972,027 priority patent/US10498759B2/en
Priority to US15/971,946 priority patent/US10511635B2/en
Priority to US15/971,909 priority patent/US10491624B2/en
Priority to US16/582,977 priority patent/US11146585B2/en
Priority to US16/662,936 priority patent/US11153349B2/en
Priority to US17/465,739 priority patent/US11855768B2/en
Priority to US17/477,294 priority patent/US11863590B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08Insurance

Definitions

  • the present technology is generally directed to cyber security analysis, and more specifically, but not by way of limitation, to systems and methods that provide cyber risk assessment and management services.
  • Various embodiments of the present technology also allow for selective change to an insurance policy based on the cyber risk analysis, as well as the generation of recommendations based on the cyber risk analysis.
  • the present technology is directed to a method for assessing and reducing the risk of a cyber security failure of an entity.
  • the exemplary method includes assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements.
  • the exemplary method further includes automatically determining, based on the assessed risk, at least one of a change to at least one of a term and a condition of an insurance policy, and a setting of the at least one of a term and a condition of the insurance policy.
  • the method may also include automatically recommending computer network changes to reduce the assessed risk and automatically reassessing the cyber risk of the computer network based on the recommended computer network changes. Based on the reassessed cyber risk, the method may include dynamically re-determining the at least one of the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
  • FIG. 1 is a block diagram illustrating a device according to an example
  • FIG. 2 is an example graphical user interface (GUI) that comprises a graphical representation that plots an entity's motivation and sophistication relative to cyber risk.
  • FIG. 3 is an example graphical user interface (GUI) that comprises a scatter plot illustrating an entity's motivation and sophistication relative to cyber risk.
  • FIG. 4 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their combination scores.
  • FIG. 5 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their sophistication scores.
  • FIG. 6 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their motivation scores.
  • FIG. 7 is an example graphical user interface (GUI) that comprises a scatter plot that represents a plurality of entities plotted according to their combination score.
  • FIG. 8 is an example graphical user interface (GUI) that comprises a scatter plot that represents a plurality of entities plotted according to their combination score, as well as additional graphical representations for an entity and a list of recommendations based on the plotting.
  • FIG. 9 is a flowchart of an example method of the present technology.
  • FIG. 10 is a flowchart of another example method of the present technology.
  • FIG. 11 is a flowchart of yet another example method of the present technology.
  • FIG. 12 illustrates an example computer system that can be used to implement embodiments of the disclosed technology.
  • a method comprises assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements.
  • the cyber security failure may include, for example, a cyber attack and/or a privacy incident (including but not limited to an incident involving sensitive information).
  • the computer agent may be further configured to collect and/or analyze information from the computer network of the entity.
  • the exemplary method includes automatically determining, based on the assessed risk, at least one of a change to at least one of a term and a condition of an insurance policy, and a setting of the at least one of a term and a condition of the insurance policy.
  • the insurance policy may be a policy from an insurance company, or a product warranty for first and/or third party costs that an entity purchases from a networking, security product, or services provider, to name a few.
  • the method includes automatically recommending computer network changes to reduce the assessed risk; and automatically reassessing the cyber risk of the computer network based on the recommended computer network changes.
  • the exemplary method further includes dynamically re-determining, based on the reassessed cyber risk, the at least one of the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
  • a term and a condition may include a retention amount, a deductible, a premium, a coverage limit, a future valuation, a term length, and so forth.
  • the method includes, based on the assessing, plotting one or more features of the entity and other members of a peer group of the entity.
  • the plotting may be configured to visually illustrate the cyber risk of the entity.
  • the exemplary method may determine a change to, or a setting of, terms and conditions of the insurance policy.
  • the method comprises determining a sophistication score of the entity with respect to cyber risk, which may be considered the quality of the defense with respect to repelling, defeating, or preventing a security failure.
  • the method also comprises determining a motivation score of a hacker or other actor with respect to initiating a cyber security failure.
  • a composite score may be created from the motivation score and the sophistication score.
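  • A composite score of this kind could be formed in many ways. The sketch below is a minimal, hypothetical illustration that assumes motivation and sophistication are each normalized to a 0-100 scale and simply averages motivation with the defensive shortfall (100 minus sophistication); it is not the scoring method required by the present technology.

```python
def composite_score(motivation: float, sophistication: float) -> float:
    """Hypothetical composite cyber risk score on a 0-100 scale.

    Assumes motivation and sophistication are already normalized to 0-100.
    Higher motivation raises risk; higher sophistication lowers it.
    """
    defensive_shortfall = 100.0 - sophistication
    return (motivation + defensive_shortfall) / 2.0

# Example: a highly targeted entity with a modest defense.
print(composite_score(motivation=80.0, sophistication=40.0))  # 70.0
```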
  • the exemplary method and system may be used in a cyber insurance market, and/or by a cyber insurance provider providing insurance policies.
  • the cyber insurance policy may include a cyber risk assessment/management service, which may provide feedback to one or both of the insurance company and the insured entity, enabling the entity to determine how to reduce their cyber risk, and/or how they are positioned within their peer group and/or within a universe of companies with respect to their cyber risk.
  • the insurance policy including but not limited to a cyber insurance policy, may be a policy from an insurance company or it could be a product warranty for first and/or third party costs that an entity purchases from a networking or security product or services provider.
  • the recommendations may enable the insurance company to update and/or change terms and conditions of an insurance policy.
  • the composite score of several or many entities may be aggregated and used by insurance companies, reinsurance companies, brokers and/or ratings agencies to understand and/or evaluate an aggregate risk and assess insurance premiums and/or reinsurance treaties and/or change or evaluate a credit rating. This is described in further detail in U.S. patent application no. 14/585,051 filed December 29, 2014 and entitled "Diversity Analysis with Actionable Feedback Methodologies," which is hereby incorporated by reference herein in its entirety, including all references cited therein.
  • Cyber insurance insures entities against damage and/or loss due to security failures (e.g., a cyber attack, a privacy incident). Assessing cyber risk can be a difficult task due to the volatility of the cyber environment. For example, the risk of a security failure such as a cyber attack lacks actuarial data since there is an active adversary behind cyber attacks, and past cyber attacks do not predict future cyber attacks. Better analysis of cyber risk, including the risk of security failures, and greater service to insurance companies and insured entities, are therefore desirable.
  • the technology disclosed herein provides a cyber risk assessment, and provides methods and systems for improving a cyber risk assessment, by, for instance, reducing a risk of a cyber attack, predicting the probability of a cyber attack, and/or determining the extent to which a cyber attack might cause damage.
  • Exemplary methods plot the cyber risk within a peer group, which may be defined by industry, revenue, and/or any other appropriate metric.
  • Various exemplary methods plot the cyber risk within the universe of companies, e.g., universe of companies for which such cyber risk has been assessed.
  • Exemplary methods assess risk in a plot using one feature.
  • multiple features may be plotted into a matrix.
  • the assessment of risk is plotted with a two (or more) dimensional analysis, which may be plotted into a two by two matrix or graph, or in any appropriate alternative visualization method, particularly for greater than two dimensions.
  • the two dimensions may be characterized as 1) motivation (which may be synonymous or similar to offense, e.g., the motivation of a bad actor to attack an entity) and 2) sophistication (which may be synonymous or similar to defense, e.g., the sophistication of an entity to prevent and/or repel a cyber attack, or compel more responsible behavior from employees and associates to prevent a privacy event with respect to sensitive information).
  • Alternative axes for the two dimensional analysis are also possible, for example, measurements other than motivation and sophistication.
  • the system may output an estimated (or expected) financial impact, which may encompass both the risk of a cyber attack, and the potential amount of damage caused by a cyber attack.
  • the present technology may provide enhanced value by quantifying a cyber risk, thereby creating a market for it. Additionally, the present technology may provide a cyber risk management service tied to a cyber insurance policy.
  • a cyber insurance policy as used herein includes any insurance policy covering any loss arising out of a security failure, including tangible and intangible property. The policy may cover both first party and third party losses arising out of any perils including a security failure. The policy may cover business interruption, loss of income, Director and Officer liability, information asset coverage, and extra expense coverage, or any other insured loss arising out of a security failure.
  • a cyber insurance policy as used herein includes security and privacy coverage, including regulatory coverage (e.g., FTC, Health Insurance Portability and Accountability Act (HIPAA)) covering fines and penalties, and defense costs and damages.
  • the coverage provided by a cyber insurance policy as used herein may provide for privacy breach coaches, forensic experts, a public relations campaign, cyber extortion, information asset recovery, business interruption (including for example, lost income, extra expenses, and/or all costs incurred but for the cyber security failure), or any other covered costs or losses.
  • aspects of a cyber insurance policy may be altered based on use of, and
  • the cyber risk management service may include any terms and conditions of the policy.
  • the terms and conditions include, for example, a retention amount, a deductible, a premium, coverage limits, future valuation, term length, or any other term or condition of the insurance policy.
  • the analysis may be a position on a graph, and may include a scatterplot of the peer group members, and/or a simple ranking amongst the peers.
  • the analysis may be two (or more) dimensional. Additionally or alternatively, the analysis may be resolved into a single composite score embodying the analysis.
  • the plot may be changed to include more or fewer members of the peer group based on further variables of the peer group members, for instance, revenue, etc.
  • the plot may include points for a universe of companies along with the points for the particular entity. For a two dimensional analysis example, each axis may be a function of many sub-variables, discussed herein as examples of motivation and sophistication.
  • the sub-variables may be weighted equally, or differently, and the weighting may be static, dynamic, or customizable based on different analysis goals.
  • the exemplary assessment system may provide recommendations to an entity to improve their cyber risk assessment, by, for instance, reducing their cyber risk. This may be accomplished by various methods, including increasing the sophistication of the organization or entity, or decreasing the motivation of the attacker to go after this organization or entity.
  • the recommendations may be specific and may impact one or both of the axes of the two dimensional risk analysis. Implementing the recommendations, which may be accomplished in some embodiments automatically, may reduce the risk of a cyber security failure.
  • Implementing the recommendations may impact an entity's relative position in their peer group, in a universe of companies, as well as any expected financial impact of a security failure (e.g., a cyber attack, a privacy incident). Additionally, factors beyond the control of the company or entity, for instance, actions by the other peer group members, activity in the hacker community, or vulnerabilities in software and/or hardware, may also impact both a relative risk analysis (e.g., impacting the company or entity's position in their peer group) and an absolute expected financial loss. This change over time may be accessible and/or charted for trending information, which may be useful for planning and/or changing terms and conditions (including the premium) for the insurance policy. An entity may make a judgment of which recommendations to prioritize in implementation based on the different recommendations provided by the system to other members of their peer group. Examples of recommendations are illustrated in FIG. 8.
  • the recommendations generated for an entity can be changed in comparison with other entities in a group.
  • the system 105 can provide a first set of recommendations based solely on the motivation and/or sophistication (e.g., cyber risk) analysis for the entity.
  • system 105 can generate a second set of recommendations based on a comparison of the cyber risk for the entity to the aggregate risk score for many entities.
  • This second set of recommendations includes additional recommendations for the entity which are determined to improve the cyber risk of the entity.
  • the system 105 can determine risk factors that are discrepant between the entity and another entity (or an aggregate group of entities) and highlight these recommendations as being unique for the entity. For example, if the entity is the only one out of a group of their peer entities that does not use a CDN (content delivery network), the system 105 can highlight this difference. These unique discrepancies can illustrate areas where the entity is particularly or uniquely vulnerable.
  • the system 105 identifies clusters of sophistication elements or motivation elements that are shared between two or more of the portfolio of entities.
  • the clusters of sophistication elements or motivation elements being associated with an increase in cyber risk.
  • the recommendations generated by the system 105 for an entity of the portfolio of entities will cause a decrease in the cyber risk if implemented.
  • In some embodiments, where scores are tracked over time, the system 105 can also be configured to periodically reassess the cyber risk of an entity.
  • the reassessment occurs after the entity has implemented one or more of the recommendations.
  • the system 105 is configured to provide attribution for a score change, including verifiable data such as time and attribution information. This attribution identifies/represents the underlying data set which affected the score change, and shows why, how much, and how the score changes.
  • the entity, unbeknownst to them, has a dramatic increase in pageviews on their website.
  • This increase in pageviews causes an increase in the motivation score for the entity. That is, the increase in pageviews indicates that a hacker might be more motivated to hack the entity's webpage because of its high traffic profile.
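  • As an illustration only, a score-change attribution record of the kind described above might pair the underlying data with the resulting delta. The structure below is a hypothetical sketch, not a format required by the present technology.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ScoreChangeAttribution:
    """Hypothetical record explaining why, how much, and how a score changed."""
    entity: str
    element: str          # underlying data set, e.g., "pageviews"
    old_value: float
    new_value: float
    score_delta: float    # signed change to the motivation score
    observed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def summary(self) -> str:
        return (f"{self.entity}: {self.element} moved from {self.old_value:g} to "
                f"{self.new_value:g}, changing the motivation score by {self.score_delta:+.1f}")

record = ScoreChangeAttribution("ExampleCo", "pageviews", 90_000, 240_000, +5.0)
print(record.summary())
```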
  • the system 105 can be used to automatically institute changes on behalf of the entity that will decrease the likelihood that the entity will experience or be adversely affected by a security failure such as a cyber attack. These automatic changes occur based on the recommendations generated for the entity.
  • the system 105 can establish new content hosts for the content of the entity.
  • the system 105 can inform the entity that diversity in content hosting can decrease the likelihood that all of the entity's content or user information will be exposed, as compared to if the content is stored in one centralized location.
  • the system 105 can be used to automatically change technical aspects of the entity, such as computing diversity, content distribution and delivery, and other technical attributes.
  • the system 105 comprises an economic estimator module 150 that is configured to estimate a financial impact to the entity for a simulated security failure (e.g., a cyber attack, a privacy incident).
  • the system 105 can execute theoretical or simulated security failures against a cyber profile of an entity.
  • the cyber profile for an entity is determined from the various sophistication and motivation elements determined for the entity.
  • the economic estimator module 150 then calculates the effect of, for example, a distributed denial of service (DDoS) attack on the entity.
  • the economic impact can include an economic impact to the entity itself, other entities that depend upon the entity, or combinations thereof.
  • a cyber security failure for a financial institution can cause direct economic impact on the institution from website downtime.
  • the cyber security failure can also cause a financial impact to the customers of the financial institution if social security numbers, account numbers, or other sensitive consumer and/or personal information is stolen.
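  • By way of a hedged illustration, such an estimate could combine downtime losses with per-record breach costs. The sketch below uses entirely hypothetical inputs (hourly revenue, expected outage duration, records exposed, cost per record) and is not the estimation model of the economic estimator module 150.

```python
def estimate_financial_impact(hourly_revenue: float,
                              expected_downtime_hours: float,
                              records_exposed: int = 0,
                              cost_per_record: float = 150.0) -> float:
    """Hypothetical financial impact of a simulated security failure.

    Combines lost revenue from downtime (e.g., during a DDoS attack) with a
    per-record cost for any sensitive data exposed. All figures are
    illustrative assumptions.
    """
    downtime_loss = hourly_revenue * expected_downtime_hours
    breach_loss = records_exposed * cost_per_record
    return downtime_loss + breach_loss

# A 6-hour outage at $50,000/hour plus 10,000 exposed records.
print(estimate_financial_impact(50_000, 6, records_exposed=10_000))  # 1800000.0
```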
  • implementing the recommendations may be paired with changes to the terms and conditions of an insurance policy.
  • implementation of certain recommendations may be paired with automatic renewal, a consequent lower (or higher or otherwise changed) cyber risk insurance premium, better coverage limits, better term length, future valuation and the like.
  • the change to the terms and/or conditions of the policy may be implemented after the end of the term (e.g., 1, 3, 6 or 12 months, or any other appropriate term) of the current insurance policy, or may trigger a renewal option at the lower premium rate immediately or on an accelerated basis.
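  • One hypothetical way such a pairing might be computed is sketched below: the premium is discounted (or surcharged) in proportion to the change in the reassessed composite risk score, capped at a maximum adjustment. The rule, coefficients, and caps are assumptions for illustration, not terms prescribed by the present technology.

```python
def adjusted_premium(current_premium: float,
                     prior_risk: float,
                     reassessed_risk: float,
                     discount_per_point: float = 0.005,
                     max_discount: float = 0.20) -> float:
    """Hypothetical premium adjustment tied to a reassessed cyber risk score.

    Each point of risk reduction earns a small discount (or a surcharge if the
    risk increased), bounded by max_discount in either direction.
    """
    delta = prior_risk - reassessed_risk          # positive when risk improved
    adjustment = max(-max_discount, min(max_discount, delta * discount_per_point))
    return current_premium * (1.0 - adjustment)

# Risk improved from 70 to 55 after implementing recommendations.
print(adjusted_premium(100_000, prior_risk=70, reassessed_risk=55))  # 92500.0
```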
  • a cooperative and constructive relationship may be achieved between insurers and insured-entities, thereby creating a positive feedback loop of improved cyber preparedness and
  • recommendations provided by the cyber risk management service may cause a change in any of the terms and conditions of a cyber insurance policy. For example, if the
  • the type of coverage, a pricing or re-pricing, the amount of limits, an automatic renewal, and/or a renewal commitment may change based on an increase or decrease in a sophistication of the entity, and/or an increase or decrease in a motivation of an attacker of the entity. Additionally, as recommendations are
  • the terms and conditions of the policy itself may determine and/or change the weighting used in the cyber risk assessment/management system 105.
  • an insurance policy may affect the cyber risk assessment/management system 105 in other ways.
  • the terms and conditions of an insurance policy may impact an assessment of a cyber risk, and/or an assessment service. For example, if an insurance policy has a high deductible, the assessment service may not assess a motivation to initiate a security event.
  • Various other options for having the terms and conditions of an insurance policy drive the type of assessment conducted are also possible.
  • the cyber risk management service as provided herein may include subjective evaluations, and may include vulnerability assessments, penetration testing, tabletop exercises, people services, risk engineering, and/or training exercises. Changes or renewed evaluations of any of these assessments may cause an increase or decrease in a
  • FIG. 1 is a high level schematic diagram of a computing architecture (hereinafter architecture 100) of the present technology.
  • the architecture 100 comprises a cyber risk assessment/management system 105 (hereinafter also referred to as system 105), which in some embodiments comprises a server or cloud- based computing device configured specifically to perform the diversity analyses described herein. That is, the system 105 is a particular purpose computing device that is specifically designed and programmed (e.g., configured or adapted) to perform any of the methods described herein.
  • the system 105 can be coupled with entity device 130 using a network 120.
  • the system 105 comprises a processor 110 and memory 115 for storing instructions.
  • the memory 115 can include a recommendation module 140.
  • the term “module” may also refer to any of an application-specific integrated circuit ("ASIC"), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • the system 105 may gather variables for an entity by querying the entity for information, scraping available online sources such as websites, corporate filings, news sources, other public record databases, and other resources. Additionally, data may be gathered from the entity's network using devices already present there or by placing a new device on the entity's network to gather more data.
  • the data collecting device may be a server, router, firewall, switch, or repeater, or may be a software agent or routine that monitors traffic and/or performs packet inspection.
  • the data collecting device may be on the company's network and/or its periphery, and may collect and/or analyze the data, while also transmitting it to system 105. In this manner, additional, proprietary data may be gleaned from a particular entity's network.
  • the variables or a subset of the variables can be compared. The comparison can be for all or only a subset of all entities. The subset of variables can be selected by the end user, as well as the entities analyzed.
  • the system 105 provides interfaces or adapters 105A-N that allow various resources to communicatively couple with the system 105.
  • the system 105 can use an application program interface (API) or other communication interface.
  • FIG. 1 illustrates example resources that can couple with the system 105.
  • the system 105 can interrogate, for example, various databases such as corporate filings, news sources, and other public record databases.
  • the system 105 can also couple with cloud services such as cloud storage and cloud computing environments.
  • a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors and/or that combines the storage capacity of a large grouping of computer memories or storage devices.
  • systems that provide a cloud resource may be utilized exclusively by their owners; or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
  • the cloud may be formed, for example, by a network of servers with each server (or at least a plurality thereof) providing processor and/or storage resources. These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user may place workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depend on the type of business associated with the user.
  • the system 105 may also couple with the Internet as well as data feeds such as RSS feeds or social networks. Email behaviors can also be identified by interrogating email servers or email repositories.
  • the system 105 can use vulnerability assessments generated by the entity or a third party, such as a cyber-security firm.
  • nontechnical elements include, but are not limited to, company size, revenue, company location, company industry sector, as well as other elements which are described herein.
  • the present technology provides benefits above and beyond a typical vulnerability assessment, providing users with a robust and comprehensive view of a company's (or multiple companies') overall cyber security risk.
  • the system 105 can obtain sophistication information about entities from the following non-limiting list of sources or resources: (a) Framework; (b) Hosting/infrastructure; (c) Account management; (d) Authentication; (e) Authorization; (f) Scanning; (g) System vulnerability; (h) Ad/Partner integration; (i) Files/Directories/Links; and (j) Patching.
  • the system 105 can obtain motivation information about entities from the following non-limiting list of sources or resources: (a) Customer Reviews; (b) Employee reviews; (c) Traffic statistics; (d) Business events/news; (e) Corporate connections; (f) Business type; (g) Customer data; (h) Brand/Revenue; (i) Employee profiles; (j) Social Media/Blogs; (k) Industry/Products; (l) Data Types; and (m) Company/Subsidiary connections.
  • Facets or features relating to the motivation regarding a security failure (e.g., the motivation of some actor, such as a hacker, to attack an entity or to expose sensitive information), or to the sophistication of the entity in preventing or dealing with a cyber security event, will each be referred to herein as an element.
  • the actor may be a hacker, employee, another entity, to name a few.
  • Examples of motivation elements include: visibility; value; hacker sentiment; employee sentiment; company sentiment; and customer sentiment.
  • Visibility may include information and/or derived measures related to the traffic, usage, and activity of an entity, including but not limited to in-links; pageviews; duration; traffic; links; page rank; market value; daily (stock) trade volume; etc.
  • Hacker sentiment includes: emails; credit cards; foreign languages; etc., which can be gathered from hacker forums and/or discussion groups, chat rooms, dark web, or dark net forums, such as the Tor Network, Internet Relay Chat (IRC), and combinations thereof - just to name a few.
  • Employee sentiment includes: career opportunities; work/life balance; compensation; and combinations thereof - just to name a few.
  • Company sentiment includes: senior leadership ratings; overall company ratings; recommendations; etc.
  • Customer sentiment includes: product ratings; service ratings, and combinations thereof - just to name a few.
  • the present technology determines a level of sophistication of the entity.
  • Sophistication may be considered a measure of People, Process, and Technology. People indicates how security-aware the entity's employees, principals and/or members are; in particular, whether the people associated with the entity understand the risks, whether they are competent in security, and combinations thereof. Process indicates whether procedures and/or policies have clear and enforceable terms, and clearly indicate what to do in case of various events, including attacks. Process also indicates whether training is provided to employees, third party contractors and/or service providers, indicates their level of expertise, and combinations thereof.
  • Examples of sophistication elements include: hosting infrastructure; topology; vulnerability scanning; people; etc.
  • Vulnerability scanning includes: CVEs (common vulnerabilities and exposures); patching; updating; default passwords; etc.
  • People includes: chief information security officer (CISO); security team; skills; job postings; etc. In this manner, sophistication encompasses more than just vulnerability, and additionally includes people and processes that may impact a defensive posture of an entity.
  • Determining these variables may be a data gathering operation, which may be based on public information or a company's own data networks, as discussed herein.
  • a cyber risk assessment, for instance a two by two (or higher order) graph, may be output, along with a composite score, a peer rank, an estimated financial impact, and recommendations to decrease the cyber risk. These may all be output for each company assessed. All of these elements may be updated over time and in response to
  • the system 105 is configured to evaluate each data point with respect to history, lineage, provenance (e.g., origin), source, time, entities and other details. The system 105 can then cleanse and standardize the data points. Examples of cleansing and standardizing using data normalization are described in greater detail below.
  • the system 105 can use a canonical representation of the data points. As mentioned above, the system 105 can track entities and their attributes/elements over time. The system 105 is also configured to process rollups (e.g., summarizing the data along a dimension), aggregations, transforms, reductions, normalizations, deltas, as well as other types of data transformation or conversion processes that can also be used to convert the motivation/sophistication/combination elements into scores.
  • the system 105 then generates module-ready data for use with matrices of elements (motivation/sophistication) for one or more entities.
  • the system 105 then executes one or more models to generate scores, results, recommendations, delta values (changes in scores over time), as well as historical tracking of scores.
  • the system 105 comprises a scoring and plotting module 135 that is generally configured to calculate sophistication scores, motivation scores, and combination scores; apply weighting to sophistication and/or motivation elements in various calculations; compare scores to threshold values; benchmark various scores over time; as well as other features described herein.
  • the scoring and plotting module 135 can create visual representations such as the graphs illustrated in FIGs. 2-8.
  • the scoring and plotting module 135 is configured to calculate various scores for an entity.
  • the scoring and plotting module 135 can calculate various scores for a plurality of entities. Again, these various scores can be calculated over time and utilized for benchmarking cyber security performance for an entity, or a group of entities that possess a particular attribute in common.
  • the scoring and plotting module 135 can calculate scores for groups of entities in an industry group, a geographical location, a company size, a technology sector, and so forth.
  • the scoring and plotting module 135 is configured to calculate a motivation score for one or more entities.
  • the scoring and plotting module 135 obtains motivation elements collected from the various resources and converts this information into a mathematical representation.
  • a motivation element of pageviews can be mathematically represented by comparing the pageviews of the entity to a set of thresholds.
  • the pageviews could be a pageview of a particular webpage or set of webpages. To be sure, the higher-profile and more visited a website is, the more likely it is to be attractive to a hacker, especially if other motivation factors are present, such as the entity being involved in commercial or financial activities, just for example.
  • the scoring and plotting module 135 may normalize various elements to obtain mathematical values that are usable in an algorithm for scoring motivation or sophistication.
  • each of the set of thresholds is associated with a mathematical value. If the entity has pageviews in excess of 10,000 unique users in one day, the entity is given a score of five. If the entity has pageviews in excess of 100,000 unique users in one day, the entity is given a score of ten. If the entity has pageviews in excess of 200,000 unique users in one day, the entity is given a score of fifteen.
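  • Implemented in code, the threshold mapping just described might look like the sketch below. The thresholds and scores (5, 10, 15) mirror the example in the text; the function itself, including the value of 0 for lower traffic, is an illustrative assumption.

```python
def pageview_score(daily_unique_users: int) -> int:
    """Map daily unique pageviews to a motivation element value.

    Uses the example thresholds from the text: >200,000 -> 15,
    >100,000 -> 10, >10,000 -> 5; below 10,000 contributes 0 (assumed).
    """
    thresholds = [(200_000, 15), (100_000, 10), (10_000, 5)]
    for threshold, score in thresholds:
        if daily_unique_users > threshold:
            return score
    return 0

print(pageview_score(150_000))  # 10
```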
  • an employee sentiment can be represented mathematically as a percentage of positive versus negative comments from employees.
  • negative employee behaviors, actions, or statements can be counted over time and compared to thresholds (in a method similar to that above with respect to pageviews).
  • Each of the motivation elements (if necessary) is converted into a mathematical representation.
  • the ultimate motivation score can be calculated by taking a sum of each mathematical representation of motivation elements.
  • motivation score can be a representation of one or a combination of many motivation elements.
  • the system 105 can be configured to weight one or more of the elements in a score calculation. For example, if it is determined that certain elements are more likely to increase the likelihood of a security failure (e.g., a cyber attack, a privacy incident), these elements can be assigned a weight. In an example, the weight is applied by multiplying a mathematical representation of an element by a coefficient or factor. If an element value for pageviews is five, a weighting could include multiplying this number by a coefficient of .5, which reduces the impact of that value on the overall score. Increases in element values can also be achieved.
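  • A weighted sum of element values, as described above, might be sketched as follows. The 0.5 coefficient for pageviews mirrors the example in the text; the element names and remaining values are hypothetical.

```python
def motivation_score(elements: dict[str, float],
                     weights: dict[str, float] | None = None) -> float:
    """Hypothetical weighted sum of motivation element values.

    Elements without an explicit weight default to a weight of 1.0.
    """
    weights = weights or {}
    return sum(value * weights.get(name, 1.0) for name, value in elements.items())

elements = {"pageviews": 5.0, "employee_sentiment": 3.0, "hacker_sentiment": 7.0}
weights = {"pageviews": 0.5}   # de-emphasize pageviews, as in the text's example
print(motivation_score(elements, weights))  # 12.5
```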
  • the scoring and plotting module 135 is also configured to process sophistication elements to obtain sophistication scores. The exact details for converting sophistication/motivation elements into
  • the scoring and plotting module 135 can determine various facets of an entity or group of entities by comparing the motivation, sophistication, and/or combined scores of these entities. Answers to pertinent questions can be deduced or inferred from the comparison.
  • the scoring and plotting module 135 is
  • the scoring and plotting module 135 has been used to calculate an aggregate risk score (motivation, sophistication, and/or combined) for numerous entities.
  • the scoring and plotting module 135 selects a plurality of motivation elements and analyzes these elements for each of a portfolio (plurality) of entities using the above examples as a guide for calculating motivation scores.
  • the same motivation elements are used for each entity.
  • the scoring and plotting module 135 can then determine where the entity lies within the group of scores. For example, out of 30 entities, a subject entity places 25th out of 30.
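  • Determining such a placement might be sketched as below; whether a high rank is favorable depends on which score is ranked, so the sketch only orders the entities. The entity names and scores are hypothetical.

```python
def rank_in_group(entity: str, scores: dict[str, float]) -> tuple[int, int]:
    """Return the entity's rank within a group of scores (1 = highest score)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(entity) + 1, len(ordered)

scores = {"A": 72.0, "B": 65.0, "SubjectCo": 48.0, "C": 81.0}
rank, total = rank_in_group("SubjectCo", scores)
print(f"{rank} out of {total}")  # 4 out of 4
```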
  • the scoring and plotting module 135 can also be utilized to generate graphs and GUIs that display various scores in graphical format(s). For example, in FIG. 2, a graph with two axes is illustrated. The graph 200 comprises a vertical axis that is representative of motivation elements, and the horizontal axis is representative of sophistication elements. Indeed, this graph can be used to display information about a single entity or a plurality of entities.
  • the motivation axis is delineated or stratified based on the type of content. Less important types of secure information are located towards the bottom of the axis, whereas more important types of information are located at the top part of the axis.
  • the lower part of the motivation axis references payment cards (e.g., credit cards) and other types of general consumer information. Above that is online crime such as phishing, malware, and other malicious behavior. Above online crime are IP theft and industrial espionage. At the top of the motivation axis are state secrets. To be sure, other categories of information types will lie somewhere along this axis, if not specifically mentioned.
  • the axis can be defined by other types of information points. For example, an entity can structure their motivation axis to include information that they deal with, structured from least important to most important.
  • On the sophistication axis, which is the horizontal axis, hacker profiles are listed from left to right from a lowest impact actor type to a highest impact actor type.
  • actor types can include casual hackers, professional hackers, organized crime, and state actors. Each of these actor types has a different threat level associated therewith.
  • the sophistication axis represents the strength or threat level that it takes to successfully hack the subject entity/entities.
  • FIG. 3 is an example graphical user interface (GUI) that comprises a scatter plot illustrating an entity's motivation and sophistication relative to cyber risk.
  • the scatter plot 300 comprises a vertical motivation axis and a horizontal sophistication axis.
  • Each of the points plotted on the scatter plot 300 represents an entity. Again, these entities can be analyzed together because they are a part of an entity group (e.g., industry group, same geographical location, same company size, etc.).
  • FIG. 4 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their combination scores.
  • the bar graph 400 comprises a vertical axis that represents a number of companies and a horizontal axis that represents combination scores for a set of entities. For example, most entities in the group have combination scores (sophistication and motivation) that fall within a score range of 51-60. Other groups of entities fall within other score ranges.
  • the system 105 can cause an elemental analysis of these similar scoring groups to identify what elements are shared between the entities, what elements are different, and so forth.
  • the graphing of entities based on scores aids the system 105 in identifying groups of entities that require attention. For example, the entities in the score range of 31-40 are severely underperforming.
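  • Bucketing entities into score ranges for a bar graph such as FIG. 4, and then inspecting the ranges that need attention, could be sketched as follows. The bucket width and sample scores are hypothetical.

```python
from collections import Counter

def score_buckets(scores: dict[str, float], width: int = 10) -> Counter:
    """Count entities per score range (e.g., 31-40, 41-50) for a bar graph."""
    buckets = Counter()
    for value in scores.values():
        low = int((value - 1) // width) * width + 1   # 31-40, 41-50, ...
        buckets[f"{low}-{low + width - 1}"] += 1
    return buckets

scores = {"A": 55, "B": 58, "C": 34, "D": 52, "E": 76}
print(score_buckets(scores))
# Counter({'51-60': 3, '31-40': 1, '71-80': 1})
```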
  • FIG. 5 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their sophistication scores.
  • the bar graph 500 comprises a vertical axis that represents a number of companies and a horizontal axis that represents sophistication scores for a set of entities.
  • FIG. 6 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their motivation scores.
  • the bar graph 600 comprises a vertical axis that represents a number of companies and a horizontal axis that represents motivation scores for a set of entities.
  • sophistication and/or motivation can be quickly and easily determined, at least on a high level. Again, a more granular element analysis can be conducted when groups with underperforming sophistication/motivation scores are identified.
  • FIG. 7 is an example graphical user interface (GUI) that comprises a scatter plot that represents a plurality of entities plotted according to their combination score.
  • the scatter plot 700 includes a plurality of data points that each represents an entity.
  • the plot 700 comprises a vertical axis that represents motivation and a horizontal axis that represents sophistication scores for a set of entities. The higher risk area on the plot is where the motivation to hack is high and the sophistication of the entity is low.
  • the system 105 can create a line 705 of acceptable motivation/sophistication scores. Companies falling below this line 705 have a suitable cyber risk profile, whereas companies above the line have an unsuitable cyber risk profile. These companies can be identified and analyzed in order to suggest recommendations for improving their cyber risk.
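  • The acceptable-risk line could, for instance, be a simple linear boundary in motivation/sophistication space. The sketch below assumes a hypothetical line motivation = slope * sophistication + intercept standing in for line 705, and flags entities whose motivation exceeds it; the coefficients and entity data are assumptions.

```python
def exceeds_risk_line(motivation: float, sophistication: float,
                      slope: float = 0.8, intercept: float = 20.0) -> bool:
    """Return True if an entity sits above a hypothetical acceptable-risk line."""
    return motivation > slope * sophistication + intercept

entities = {"A": (75, 40), "B": (30, 70), "C": (90, 85)}   # (motivation, sophistication)
flagged = [name for name, (m, s) in entities.items() if exceeds_risk_line(m, s)]
print(flagged)  # ['A', 'C'] -- candidates for recommendations
```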
  • FIG. 8 is an example graphical user interface (GUI) 800 that comprises a scatter plot 805 that represents a plurality of entities plotted according to their combination score, as well as additional graphical representations for an entity and a list of recommendations based on the plotting.
  • a linear slide 820 displays the position of an entity within a peer group of entities. This same relationship position is illustrated in a gauge graph 810.
  • the recommendation module 140 can be executed to provide the end user (or entity) with some type of actionable feedback.
  • the recommendation module 140 can provide one or more actions to the end user based on the diversity score and the clusters of similar variables. This is described in further detail in U.S. patent application no. 14/585,051 filed December 29, 2014 and entitled "Diversity Analysis with Actionable Feedback Methodologies," which is hereby incorporated by reference herein in its entirety, including all references cited therein. These one or more actions potentially decrease the cyber risk of the entity.
  • the recommendation module 140 can automatically identify variables, which if changed, would affect the cyber risk assessment.
  • entities may agree to automatic implementation of recommendations in exchange for lower policy premiums.
  • a set of recommendations 815 is provided along with the graphical analysis generated for the entity. Again, these recommendations are based on the system 105 having knowledge of the motivation elements, sophistication elements, as well as the scores calculated not only for the entity, but other entities (in some embodiments).
  • Exemplary methods and systems according to the present technology may also provide benchmarking over time.
  • the system 105 can track, for a company or group of entities, cyber risk over a selectable time period, for example days, weeks, months, and/or years.
  • This benchmarking may be against a dynamic or static evaluation of the peer group, for instance, an entity's past and present cyber risk tracked against a static past peer group, static present peer group, and/or dynamic peer group.
  • the present technology provides information related to the updated information (the new motivation score, the new sophistication score, the new composite score, etc.), including differences (the amount of the change made in one or more updates, namely the delta), and trends (patterns over many time steps).
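  • Computing the differences between periodic score updates might look like the sketch below; the dates and scores are hypothetical, and a trend could be read from the sequence of deltas.

```python
def score_deltas(history: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """Compute the change (delta) between consecutive score observations.

    `history` is a time-ordered list of (date, score) pairs.
    """
    return [(later[0], later[1] - earlier[1])
            for earlier, later in zip(history, history[1:])]

history = [("2015-01", 62.0), ("2015-02", 58.5), ("2015-03", 55.0)]
print(score_deltas(history))  # [('2015-02', -3.5), ('2015-03', -3.5)]
```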
  • FIG. 9 is a flowchart of an example method 900 of the present technology.
  • the method includes the system 105 assessing 905 the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements.
  • the cyber risk includes the risk of a security failure (e.g., a cyber attack, a privacy incident) of the entity.
  • the system 105 may query the entity for information, scrape available online sources such as websites, corporate filings, news sources, other public record databases, and other resources. Additionally, data may be gathered from the entity's network using devices already present there or by placing a new data collecting device on the entity's network to gather more data.
  • the data collecting device may be on the company's network and/or its periphery, and may collect and/or analyze the data, while also transmitting it to system 105. In this example, additional, proprietary data may be gleaned from a particular entity's network.
  • the exemplary method also includes the system 105 automatically determining 910, based on the assessed risk, at least one of: a change to at least one of a term and a condition of an insurance policy, and a setting of the at least one of a term and a condition of the insurance policy.
  • the method includes the system 105 automatically recommending 915 computer network changes to reduce the assessed risk.
  • the exemplary method includes the system 105, based on the recommended computer network changes, automatically reassessing 920 the cyber risk of the computer network.
  • the method may also include the system 105 dynamically re-determining 930, based on the reassessed cyber risk, the at least one of: the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
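  • Taken together, steps 905-930 might be sketched as the control flow below. Every function here is a stubbed, hypothetical placeholder for the operations described above, not an API of the present technology.

```python
# Hypothetical placeholders for the operations of FIG. 9; a real implementation
# would collect data and apply the scoring described earlier in this document.
def assess_cyber_risk(entity: str) -> float:
    return 70.0                                               # step 905 (stubbed score)

def determine_policy_terms(risk: float) -> dict:
    return {"premium": 100_000.0 + 1_000.0 * (risk - 50.0)}   # steps 910 and 930

def recommend_network_changes(entity: str, risk: float) -> list[str]:
    return ["enable multi-factor authentication"]             # step 915

def reassess_cyber_risk(entity: str, changes: list[str]) -> float:
    return 60.0                                               # step 920 (stubbed score)

entity = "ExampleCo"
risk = assess_cyber_risk(entity)
terms = determine_policy_terms(risk)
changes = recommend_network_changes(entity, risk)
new_risk = reassess_cyber_risk(entity, changes)
new_terms = determine_policy_terms(new_risk)                  # dynamically re-determined
print(terms, new_terms)  # {'premium': 120000.0} {'premium': 110000.0}
```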
  • FIG. 10 is a flowchart of an example method 1000.
  • the method includes the system 105 assessing 1005 a sophistication for the entity with respect to preventing a cyber security failure using a plurality of sophistication elements for the entity.
  • the sophistication relates to people, processes, and technology.
  • the sophistication analysis as a whole attempts to quantify how strong a threat actor would need to be to execute a successful security failure of the entity.
  • the method includes the system 105 assessing 1010 a motivation of an actor (e.g., a hacker) to initiate a cyber security failure, the assessment using a plurality of motivation elements regarding the entity.
  • the method includes the system 105 plotting 1015 the sophistication against the motivation for the entity and other members of a peer group of the entity. Again, the plotting is performed, in this example, in a matrix that visually illustrates the cyber risk of the entity.
  • the plotting can also include a two dimensional graph.
  • the method also includes the system 105 providing 1020 recommendations to the entity to improve the cyber risk based on the plotting of the sophistication against the motivation.
  • FIG. 11 is a flowchart of yet another example method 1100 for modifying an insurance policy based on a cyber risk analysis.
  • the method includes the system 105 assessing 1105 a sophistication for the entity with respect to preventing a security failure (e.g., a cyber attack, a privacy incident, to name a few) of the entity using a plurality of sophistication elements for the entity.
  • the sophistication relates to people, processes, and technology.
  • the sophistication analysis as a whole attempts to quantify how strong a threat actor would need to be to cause a successful cyber security failure.
  • the method includes the system 105 assessing 1110 a motivation of an actor (for example, a hacker) to initiate a security failure of the entity, using a plurality of motivation elements regarding the entity.
  • steps 1105 and 1110 include the collection of motivation and sophistication elements.
  • the method includes the system 105 plotting 1115 the sophistication against the motivation for the entity and other members of a peer group of the entity.
  • the plotting for this example, is performed in a matrix that visually illustrates the cyber risk of the entity.
  • the plotting for this example, can also include a two dimensional graph.
  • the method also includes the system 105, at step 1120, based on the plotting, determining at least one of a change to, and a setting of, a term and/or a condition of the insurance policy.
  • the system 105 performs an analysis of the motivation and sophistication elements without plotting as in step 1115.
  • the change to, and/or setting of, the insurance policy does not need to be based on plotting.
  • the system 105 can be programmed with insurance policy parameters.
  • the system 105 can generate recommendations for the insurer based on the motivation and sophistication analysis of the entity.
  • the recommendation could be to deny a policy or terminate a policy if the entity has motivation or sophistication elements that are defined by the insurance policy as being unacceptable or uninsurable.
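  • A rule of that kind might be expressed as a simple check against programmed policy parameters. The sketch below uses hypothetical element names and limits; it is only an illustration of the comparison, not the policy logic of the present technology.

```python
def policy_recommendation(entity_elements: dict[str, float],
                          uninsurable_limits: dict[str, float]) -> str:
    """Recommend denying or terminating a policy if any element crosses a limit.

    `uninsurable_limits` stands in for insurance policy parameters programmed
    into the system; both dictionaries use hypothetical element names.
    """
    for element, limit in uninsurable_limits.items():
        if entity_elements.get(element, 0.0) > limit:
            return f"deny or terminate policy: {element} exceeds {limit}"
    return "policy acceptable"

print(policy_recommendation({"motivation_score": 92.0}, {"motivation_score": 85.0}))
```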
  • FIG. 12 illustrates an exemplary computer system 1200 that may be used to implement some embodiments of the present disclosure.
  • the computer system 1200 of FIG. 12 may be implemented in the contexts of the likes of computing systems, networks, servers, or combinations thereof.
  • the computer system 1200 of FIG. 12 includes one or more processor units 1210 and main memory 1220.
  • Main memory 1220 stores, in part, instructions and data for execution by processor units 1210.
  • Main memory 1220 stores the executable code when in operation, in this example.
  • the computer system 1200 of FIG. 12 further includes a mass data storage 1230, portable storage device 1240, output devices 1250, user input devices 1260, a graphics display system 1270, and peripheral devices 1280.
  • The components shown in FIG. 12 are depicted as being connected via a single bus 1290.
  • the components may be connected through one or more data transport means.
  • Processor unit 1210 and main memory 1220 are connected via a local microprocessor bus, and the mass data storage 1230, peripheral device(s) 1280, portable storage device 1240, and graphics display system 1270 are connected via one or more input/output (I/O) buses.
  • Mass data storage 1230 which can be implemented with a magnetic disk drive, solid state drive, or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1210. Mass data storage 1230 stores the system software for implementing embodiments of the present disclosure for purposes of loading that software into main memory 1220.
  • Portable storage device 1240 operates in conjunction with a portable non-volatile storage medium, such as a flash drive, floppy disk, compact disk, digital video disc, or Universal Serial Bus (USB) storage device, to input and output data and code to and from the computer system 1200 of FIG. 12.
  • User input devices 1260 can provide a portion of a user interface.
  • User input devices 1260 may include one or more microphones, an alphanumeric keypad, such as a keyboard, for inputting alphanumeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys.
  • User input devices 1260 can also include a touchscreen.
  • the computer system 1200 as shown in FIG. 12 includes output devices 1250. Suitable output devices 1250 include speakers, printers, network interfaces, and monitors.
  • Graphics display system 1270 includes a liquid crystal display (LCD) or other suitable display device. Graphics display system 1270 is configurable to receive textual and graphical information and to process the information for output to the display device.
  • Peripheral devices 1280 may include any type of computer support device that adds additional functionality to the computer system.
  • the components provided in the computer system 1200 of FIG. 12 are those typically found in computer systems that may be suitable for use with embodiments of the present disclosure and are intended to represent a broad category of such computer components that are well known in the art.
  • the computer system 1200 of FIG. 12 can be a personal computer (PC), hand held computer system, telephone, mobile computer system, workstation, tablet, phablet, mobile phone, server, minicomputer, mainframe computer, wearable, or any other computer system.
  • the computer may also include different bus configurations, networked platforms, multi-processor platforms, and the like.
  • Various operating systems may be used including UNIX, LINUX, WINDOWS, MAC OS, PALM OS, QNX, ANDROID, IOS, CHROME, TIZEN, and other suitable operating systems.
  • the processing for various embodiments may be implemented in software that is cloud-based.
  • the computer system 1200 is implemented as a cloud- based computing environment, such as a virtual machine operating within a computing cloud.
  • the computer system 1200 may itself include a cloud-based computing environment, where the functionalities of the computer system 1200 are executed in a distributed fashion.
  • the computer system 1200 when configured as a computing cloud, may include pluralities of computing devices in various forms, as will be described in greater detail below.
  • a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or that combines the storage capacity of a large grouping of computer memories or storage devices.
  • Systems that provide cloud-based resources may be utilized exclusively by their owners or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large
  • the cloud may be formed, for example, by a network of web servers that comprise a plurality of computing devices, such as the computer system 1200, with each server (or at least a plurality thereof) providing processor and/or storage resources.
  • These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users).
  • each user places workload demands upon the cloud that vary in realtime, sometimes dramatically. The nature and extent of these variations typically depends on the type of business associated with the user.

Abstract

Provided are methods for reducing the risk of cyber security failure for an entity. The method includes assessing the risk of a cyber security failure regarding a computer network of an entity. The cyber security failure may include a cyber attack or privacy incident. The method may use a computer agent configured to collect information at least from publicly accessible Internet elements. The exemplary method further includes automatically determining, based on the assessed risk, a change to, or setting of, criteria of a policy. The method can include automatically recommending changes to reduce the assessed risk. Based on the recommended changes, the method may automatically reassess the cyber risk and based thereon, dynamically redetermine the change to, or setting of, the criteria. Some embodiments include plotting to visually illustrate the entity's cyber risk. A corresponding exemplary cyber risk assessment and management services system is provided.

Description

SYSTEM FOR CYBER INSURANCE POLICY INCLUDING CYBER RISK
ASSESSMENT/MANAGEMENT SERVICE
CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Application No. 62/098,238, filed December 30, 2014, which is hereby incorporated by reference herein in its entirety, including all references cited therein.
FIELD
[0002] The present technology is generally directed to cyber security analysis, and more specifically, but not by way of limitation, to systems and methods that provide cyber risk assessment and management services. Various embodiments of the present technology also allow for selective change to an insurance policy based on the cyber risk analysis, as well as the generation of recommendations based on the cyber risk analysis.
SUMMARY
[0003] According to some exemplary embodiments, the present technology is directed to a method for assessing and reducing the risk of a cyber security failure of an entity. The exemplary method includes assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements. The exemplary method further includes
automatically determining, based on the assessed risk, at least one of a change to at least one of a term and a condition of an insurance policy, and a setting of the at least one of a term and a condition of the insurance policy. The method may also include automatically recommending computer network changes to reduce the assessed risk and automatically reassessing the cyber risk of the computer network based on the recommended computer network changes. Based on the reassessed cyber risk, the method may include dynamically re-determining the at least one of the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] FIG. 1 is a block diagram illustrating a device according to an example
embodiment.
[0005] FIG. 2 is an example graphical user interface (GUI) that comprises a graphical representation that plots an entity's motivation and sophistication relative to cyber risk.
[0006] FIG. 3 is an example graphical user interface (GUI) that comprises a scatter plot illustrating an entity's motivation and sophistication relative to cyber risk.
[0007] FIG. 4 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their combination scores.
[0008] FIG. 5 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their sophistication scores.
[0009] FIG. 6 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their motivation scores.
[0010] FIG. 7 is an example graphical user interface (GUI) that comprises a scatter plot that represents a plurality of entities plotted according to their combination score.
[0011] FIG. 8 is an example graphical user interface (GUI) that comprises a scatter plot that represents a plurality of entities plotted according to their combination score, as well as additional graphical representations for an entity and a list of recommendations based on the plotting.
[0012] FIG. 9 is a flowchart of an example method of the present technology.
[0013] FIG. 10 is a flowchart of another example method of the present technology.
[0014] FIG. 11 is a flowchart of yet another example method of the present technology.
[0015] FIG. 12 illustrates an example computer system that can be used to implement embodiments of the disclosed technology.
DETAILED DESCRIPTION
[0016] Various embodiments of systems and methods are provided for assessing and reducing cyber risks associated with companies or other entities. In various embodiments, a method comprises assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements. The cyber security failure may include a cyber attack and/or a privacy incident (including but not limited to an incident involving sensitive information), to name just a few. The computer agent may be further configured to collect and/or analyze information from the computer network of the entity. The exemplary method includes automatically determining, based on the assessed risk, at least one of a change to at least one of a term and a condition of an insurance policy, and a setting of the at least one of a term and a condition of the insurance policy. The insurance policy may be a policy from an insurance company, a product warranty for first and/or third party costs that an entity purchases from one of a networking, security product, or services provider, to name a few. In various embodiments, the method includes automatically recommending computer network changes to reduce the assessed risk; and automatically reassessing the cyber risk of the computer network based on the recommended computer network changes. The exemplary method further includes dynamically re-determining, based on the reassessed cyber risk, the at least one of the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy. A term and a condition may include a retention amount, a deductible, a premium, a coverage limit, a future valuation, a term length, and so forth.
[0017] In some embodiments, the method includes, based on the assessing, plotting one or more features of the entity and other members of a peer group of the entity. The plotting may be configured to visually illustrate the cyber risk of the entity. Based on the plotting, the exemplary method may determine a change to, or a setting of, terms and conditions of the insurance policy.
[0018] In some embodiments, the method comprises determining a sophistication score of the entity with respect to cyber risk, which may be considered the quality of the defense with respect to repelling, defeating, or preventing a security failure.
[0019] In some embodiments, the method also comprises determining a motivation score of a hacker or other actor with respect to initiating a cyber security failure. A composite score may be created from the motivation score and the sophistication score.
[0020] The exemplary method and system may be used in a cyber insurance market, and/or by a cyber insurance provider providing insurance policies. The cyber insurance policy may include a cyber risk assessment/management service, which may provide feedback to one or both of the insurance company and the insured entity, enabling the entity to determine how to reduce their cyber risk, and/or how they are positioned within their peer group and/or within a universe of companies with respect to their cyber risk. As used herein, the insurance policy, including but not limited to a cyber insurance policy, may be a policy from an insurance company or it could be a product warranty for first and/or third party costs that an entity purchases from a networking or security product or services provider.
[0021] Additionally, following the recommendations may enable the insurance company to update and/or change terms and conditions of an insurance policy. In still further alternatives, the composite score of several or many entities may be aggregated and used by insurance companies, reinsurance companies, brokers and/or ratings agencies to understand and/or evaluate an aggregate risk and assess insurance premiums and/or reinsurance treaties and/or change or evaluate a credit rating. This is described in further detail in U.S. patent application no. 14/585,051 filed December 29, 2014 and entitled "Diversity Analysis with Actionable Feedback Methodologies," which is hereby
incorporated by reference herein in its entirety, including all references cited therein. [0022] Cyber insurance insures entities against damage and/or loss due to security failures (e.g., a cyber attack, a privacy incident). Assessing cyber risk can be a difficult task due to the volatility of the cyber environment. For example, a risk of a security failure such as a cyber attack lacks actuarial data since there is an active adversary behind cyber attacks, and past cyber attacks do not predict future cyber attacks. Better analysis of cyber risk, including the risk of security failures, and providing greater service to insurance companies and insured entities, is desirable.
[0023] The technology disclosed herein provides a cyber risk assessment, and provides methods and systems for improving a cyber risk assessment, by, for instance, reducing a risk of a cyber attack, predicting the probability of a cyber attack, and/or determining the extent to which a cyber attack might cause damage. Exemplary methods plot the cyber risk within a peer group, which may be defined by industry, revenue, and/or any other appropriate metric. Various exemplary methods plot the cyber risk within the universe of companies, e.g., universe of companies for which such cyber risk has been assessed.
Exemplary methods assess risk in a plot using one feature. In other examples, multiple features may be plotted into a matrix.
[0024] For those exemplary matrix embodiments, the assessment of risk is plotted with a two (or more) dimensional analysis, which may be plotted into a two by two matrix or graph, or in any appropriate alternative visualization method, particularly for greater than two dimensions. For example, the two dimensions may be characterized as 1) motivation (which may be synonymous or similar to offense, e.g., the motivation of a bad actor to attack an entity) and 2) sophistication (which may be synonymous or similar to defense, e.g., the sophistication of an entity to prevent and/or repel a cyber attack, or compel more responsible behavior from employees and associates to prevent a privacy event with respect to sensitive information). Alternative axes for the two dimensional analysis are also possible, for example, measurements other than motivation and sophistication. The system may output an estimated (or expected) financial impact, which may encompass both the risk of a cyber attack, and the potential amount of damage caused by a cyber attack.
[0025] In addition to analyzing the cyber risk, the present technology may provide enhanced value by quantifying a cyber risk, thereby creating a market for it. Additionally, the present technology may provide a cyber risk management service tied to a cyber insurance policy. A cyber insurance policy as used herein includes any insurance policy covering any loss arising out of a security failure, including tangible and intangible property. The policy may cover both first party and third party losses arising out of any perils including a security failure. The policy may cover business interruption, loss of income, Director and Officer liability, information asset coverage, and extra expense coverage, or any other insured loss arising out of a security failure. A cyber insurance policy as used herein includes security and privacy coverage, including regulatory coverage (e.g., FTC, Health Insurance Portability and Accountability Act (HIPAA)) covering fines and penalties, and defense costs and damages. The coverage provided by a cyber insurance policy as used herein may provide for privacy breach coaches, forensic experts, a public relations campaign, cyber extortion, information asset recovery, business interruption (including, for example, lost income, extra expenses, and/or all costs incurred but for the cyber security failure), or any other covered costs or losses.
[0026] Aspects of a cyber insurance policy may be altered based on use of, and
implementation of recommendations provided by, the cyber risk management service. These aspects may include any terms and conditions of the policy. The terms and conditions include, for example, a retention amount, a deductible, a premium, coverage limits, future valuation, term length, or any other term or condition of the insurance policy.
[0027] The analysis may be a position on a graph, and may include a scatterplot of the peer group members, and/or a simple ranking amongst the peers. The analysis may be two (or more) dimensional. Additionally or alternatively, the analysis may be resolved into a single composite score embodying the analysis. The plot may be changed to include more or fewer members of the peer group based on further variables of the peer group members, for instance, revenue, etc. The plot may include points for a universe of companies along with the points for the particular entity. For a two dimensional analysis example, each axis may be a function of many sub-variables, discussed herein as examples of motivation and sophistication. The sub-variables may be weighted equally, or differently, and the weighting may be static, dynamic, or customizable based on different analysis goals.
Examples of motivation and sophistication elements will be described in greater detail below.
[0028] The exemplary assessment system may provide recommendations to an entity to improve their cyber risk assessment, by, for instance, reducing their cyber risk. This may be accomplished by various methods, including increasing the sophistication of the organization or entity, or decreasing the motivation of the attacker to go after this organization or entity. The recommendations may be specific and may impact one or both of the axes of the two dimensional risk analysis. Implementing the recommendations, which may be accomplished in some embodiments automatically, may reduce the risk of a cyber security failure.
[0029] Implementing the recommendations may impact an entity's relative position in their peer group, in a universe of companies, as well as any expected financial impact of a security failure (e.g., a cyber attack, a privacy incident). Additionally, factors beyond the control of the company or entity, for instance, actions by the other peer group members, activity in the hacker community or vulnerabilities in software and/or hardware, may also impact a relative risk analysis (e.g., impacting the company or entity's position in their peer group) and/or an absolute expected financial loss. This change over time may be accessible and/or charted for trending information, which may be useful for planning and/or changing terms and conditions (including the premium) for the insurance policy. An entity may make a judgment of which recommendations to prioritize in implementation based on the different recommendations provided by the system to other members of their peer group. Examples of recommendations are illustrated in FIG. 8.
[0030] In some embodiments, the recommendations generated for an entity can be changed in comparison with other entities in a group. Thus, the system 105 can provide a first set of recommendations based solely on the motivation and/or sophistication (e.g., cyber risk) analysis for the entity.
[0031] In another example, the system 105 can generate a second set of recommendations based on a comparison of the cyber risk for the entity to the aggregate risk score for many entities. This second set of recommendations includes additional recommendations for the entity which are determined to improve the cyber risk of the entity.
[0032] In some embodiments, the system 105 can determine risk factors that are discrepant between the entity and another entity (or an aggregate group of entities) and highlight these recommendations as being unique for the entity. For example, if the entity is the only one out of a group of their peer entities that does not use a CDN (content delivery network), the system 105 can highlight this difference. These unique discrepancies can illustrate areas where the entity is particularly or uniquely vulnerable.
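By way of a non-limiting illustration, the following sketch (in Python, with hypothetical control names and peer data not drawn from the disclosure) shows one way such discrepant risk factors could be surfaced: elements present in every peer but absent for the subject entity are flagged as unique gaps.

```python
# Illustrative sketch only: control names and peer data are hypothetical.
# Flags sophistication elements that every peer possesses but the subject
# entity lacks (e.g., being the only entity in a peer group without a CDN).

peer_controls = {
    "Peer 1": {"cdn", "waf", "patching"},
    "Peer 2": {"cdn", "patching", "2fa"},
    "Peer 3": {"cdn", "waf", "2fa"},
}
entity_controls = {"waf", "patching"}

# Elements common to all peers that the entity does not have
common_to_peers = set.intersection(*peer_controls.values())
unique_gaps = common_to_peers - entity_controls

print(unique_gaps)  # -> {'cdn'}: a discrepancy worth highlighting to the entity
```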
[0033] Stated otherwise, the system 105 identifies clusters of sophistication elements or motivation elements that are shared between two or more entities of the portfolio of entities, the clusters being associated with an increase in cyber risk. The recommendations generated by the system 105 for an entity of the portfolio of entities will cause a decrease in the cyber risk if implemented.
[0034] In various embodiments, where scores are tracked over time, the system 105 can also be configured to periodically reassess the cyber risk of an entity. In some
embodiments, the reassessment occurs after the entity has implemented one or more of the recommendations.
[0035] It may be advantageous for the entity to understand not only that a particular score was changed, but also what affected the change in score. Thus, the system 105 is configured to provide attribution for a score change, including verifiable data such as time and attribution information. This attribution identifies/represents the underlying data set which affected the score change, and shows why, how much, and how the score changed.
[0036] By way of example, suppose the entity, unbeknownst to it, has a dramatic increase in pageviews on its website. This increase in pageviews causes an increase in the motivation score for the entity. That is, the increase in pageviews indicates that a hacker might be more motivated to hack the entity's webpage because of its high traffic profile.
[0037] In some embodiments, the system 105 can be used to automatically institute changes on behalf of the entity that will decrease the likelihood that the entity will experience or be adversely affected by a security failure such as a cyber attack. These automatic changes occur based on the recommendations generated for the entity.
[0038] In one example, the system 105 can establish new content hosts for the content of the entity. The system 105 can inform the entity that diversity in content hosting can decrease the likelihood that all of the entity's content or user information will be exposed, as compared to if the content is stored in one centralized location. To be sure, the system 105 can be used to automatically change technical aspects of the entity, such as computing diversity, content distribution and delivery, and other technical attributes.
[0039] In some embodiments, the system 105 comprises an economic estimator module 150 that is configured to estimate a financial impact to the entity for a simulated security failure (e.g., a cyber attack, a privacy incident). Thus, the system 105 can execute theoretical or simulated security failures against a cyber profile of an entity. In one example, the cyber profile for an entity is determined from the various sophistication and motivation elements determined for the entity. The economic estimator module 150 then calculates the effect of, for example, a distributed denial of service attack (DDoS) on the entity. To be sure, the simulated cyber attack in this example tests the sophistication of the entity and is affected by the motivation regarding the entity. The economic impact can include an economic impact to the entity itself, other entities that depend upon the entity, or combinations thereof. For example, a cyber security failure for a financial institution, such as a DDoS attack, can cause direct economic impact on the institution from website downtime. The cyber security failure can also cause a financial impact to the customers of the financial institution if social security numbers, account numbers, or other sensitive consumer and/or personal information is stolen.
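As a purely illustrative sketch of the kind of calculation an economic estimator could perform, the Python fragment below combines direct downtime losses and third-party exposure, scaled by an assumed probability of a successful attack. The loss model, figures, and probability heuristic are assumptions for illustration and are not the actual method of the economic estimator module 150.

```python
# Illustrative sketch only: the figures and the loss model are hypothetical.
# Estimates an expected financial impact for a simulated DDoS as downtime
# losses plus third-party exposure, scaled by an assumed probability of
# success that decreases with the entity's sophistication.

def estimated_impact(hourly_revenue: float, downtime_hours: float,
                     exposed_records: int, cost_per_record: float,
                     sophistication: float, motivation: float) -> float:
    """Expected loss = assumed probability of a successful attack * total damage."""
    direct = hourly_revenue * downtime_hours
    third_party = exposed_records * cost_per_record
    probability = max(0.0, min(1.0, (motivation - sophistication) / 100 + 0.5))
    return probability * (direct + third_party)

# Hypothetical financial institution facing a simulated DDoS
print(round(estimated_impact(hourly_revenue=50_000, downtime_hours=6,
                             exposed_records=10_000, cost_per_record=150,
                             sophistication=60, motivation=80), 2))  # -> 1260000.0
```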
[0040] Additionally, implementing the recommendations, provided by the cyber risk management service for example, may be paired with changes to the terms and conditions of an insurance policy. For example, implementation of certain recommendations may be paired with automatic renewal, a consequent lower (or higher or otherwise changed) cyber risk insurance premium, better coverage limits, better term length, future valuation and the like. For example, the change to the terms and/or conditions of the policy may be implemented after the end of the term (e.g., 1, 3, 6 or 12 months, or any other appropriate term) of the current insurance policy, or may trigger a renewal option at the lower premium rate immediately or on an accelerated basis. In this manner, a cooperative and constructive relationship may be achieved between insurers and insured-entities, thereby creating a positive feedback loop of improved cyber preparedness and
lower/higher/changed premiums. As discussed previously, implementation of
recommendations provided by the cyber risk management service may cause a change in any of the terms and conditions of a cyber insurance policy. For example, if the
sophistication of the entity is low, a higher deductible may be required, and vice versa. Additionally or alternatively, the type of coverage, a pricing or re-pricing, the amount of limits, an automatic renewal, and/or a renewal commitment may change based on an increase or decrease in a sophistication of the entity, and/or an increase or decrease in a motivation of an attacker of the entity. Additionally, as recommendations are implemented, or as other changes occur in the entity or the entity's situation, the motivation and sophistication, or other metrics, may change, and consequently a new analysis may be provided, including new and/or changed recommendations for the entity.
[0041] Additionally or alternatively, the terms and conditions of the policy itself may determine and/or change the weighting used in the cyber risk assessment/management system 105. In still further embodiments, an insurance policy may affect the cyber risk assessment/management system 105 in other ways. In other words, the terms and conditions of an insurance policy may impact an assessment of a cyber risk, and/or an assessment service. For example, if an insurance policy has a high deductible, the assessment service may not assess a motivation to initiate a security event. Various other options for having the terms and conditions of an insurance policy drive the type of assessment conducted are also possible.
[0042] The cyber risk management service as provided herein may include subjective evaluations, and may include vulnerability assessments, penetration testing, tabletop exercises, people services, risk engineering, and/or training exercises. Changes or renewed evaluations of any of these assessments may cause an increase or decrease in a
sophistication of the entity, an increase or decrease in a motivation of an attacker of the entity, and/or a change in any other metric used to evaluate an entity. Any of these changes based on a new or revised assessment may cause a remediation service and/or a new or additional assessment service, to be implemented. Trends, averages and/or changes to an assessment or evaluation may impact terms and conditions of a cyber insurance policy, as discussed herein.
[0043] Various embodiments of the present technology can be practiced with a local computer system, and/or a cloud-based system. FIG. 1 is a high level schematic diagram of a computing architecture (hereinafter architecture 100) of the present technology. The architecture 100 comprises a cyber risk assessment/management system 105 (hereinafter also referred to as system 105), which in some embodiments comprises a server or cloud-based computing device configured specifically to perform the diversity analyses described herein. That is, the system 105 is a particular purpose computing device that is specifically designed and programmed (e.g., configured or adapted) to perform any of the methods described herein. The system 105 can be coupled with entity device 130 using a network 120.
[0044] In one embodiment, the system 105 comprises a processor 110 and memory 115 for storing instructions. The memory 115 can include a recommendation module 140. As used herein, the term "module" may also refer to any of an application-specific integrated circuit ("ASIC"), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
[0045] The system 105 may gather variables for an entity by querying the entity for information, scraping available online sources such as websites, corporate filings, news sources, other public record databases, and other resources. Additionally, data may be gathered from the entity's network using devices already present there or by placing a new device on the entity's network to gather more data. The data collecting device may be a server, router, firewall, switch, or repeater, or may be a software agent or routine that monitors traffic and/or performs packet inspection. The data collecting device may be on the company's network and/or its periphery, and may collect and/or analyze the data, while also transmitting it to system 105. In this manner, additional, proprietary data may be gleaned from a particular entity's network. The variables or a subset of the variables can be compared. The comparison can be for all or only a subset of all entities. The subset of variables can be selected by the end user, as well as the entities analyzed.
[0046] In some embodiments, the system 105 provides interfaces or adapters 105A-N that allow various resources to communicatively couple with the system 105. As an example, the system 105 can use an application program interface (API) or other communication interface. FIG. 1 illustrates example resources that can couple with the system 105. The system 105 can interrogate, for example, various databases such as corporate filings, news sources, and other public record databases. In another example, the system 105 can couple with cloud services such as cloud storage and cloud computing environments. In general, a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors and/or that combines the storage capacity of a large grouping of computer memories or storage devices. For example, systems that provide a cloud resource may be utilized exclusively by their owners; or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources. The cloud may be formed, for example, by a network of servers with each server (or at least a plurality thereof) providing processor and/or storage resources. These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user may place workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depend on the type of business associated with the user.
[0047] The system 105 may also couple with the Internet as well as data feeds such as RSS feeds or social networks. Email behaviors can also be identified by interrogating email servers or email repositories.
[0048] In some embodiments, the system 105 can use vulnerability assessments generated by the entity or a third party, such as a cyber-security firm.
[0049] In contrast with a vulnerability assessment, which is more technical in nature, the present technology can also consider non-technical or semi-technical aspects of an entity and how these elements impact the cyber vulnerability of the entity. For example, nontechnical elements include, but are not limited to, company size, revenue, company location, company industry sector, as well as other elements which are described herein. The present technology provides benefits above and beyond a typical vulnerability assessment, providing users with a robust and comprehensive view of a company's (or multiple companies') overall cyber security risk. [0050] In some embodiments, the system 105 can obtain sophistication information about entities from the following non-limiting list of sources or resources: (a) Framework; (b) Hosting/infrastructure; (c) Account management; (d) Authentication; (e) Authorization; (f) Scanning; (g) System vulnerability; (h) Ad/Partner integration; (i) Files/Directories/Links; and (j) Patching.
[0051] In some embodiments, the system 105 can obtain motivation information about entities from the following non-limiting list of sources or resources: (a) Customer Reviews; (b) Employee reviews; (c) Traffic statistics; (d) Business events/news; (e) Corporate connections; (f) Business type; (g) Customer data; (h) Brand/Revenue; (i) Employee profiles; (j) Social Media/Blogs; (k) Industry/Products; (l) Data Types; and (m) Company/Subsidiary connections.
[0052] For purposes of context, facets or features relating to the motivation regarding a security failure (e.g., the motivation of some actor, such as a hacker, to attack an entity or to expose sensitive information, to name a few) as well as the sophistication of the entity in preventing or dealing with a cyber security event will be referred to herein as an element. Thus, there can be a plurality of types of sophistication elements and a plurality of types of motivation elements. The actor may be a hacker, an employee, or another entity, to name a few.
[0053] Examples of motivation elements include: visibility; value; hacker sentiment;
employee sentiment; company sentiment; customer sentiment, and combinations thereof - just to name a few. Each of these motivation elements may be further subcategorized as follows. Visibility may include information and/or derived measures related to the traffic, usage, and activity related to an entity, including but not limited to the in-links; pageviews; duration; traffic; links; page rank; market value; daily (stock) trade volume;
exporting/importing; and combinations thereof - just to name a few. Value includes:
revenue; net income; total assets; employees; and combinations thereof - just to name a few. Hacker sentiment includes: emails; credit cards; foreign languages; etc., which can be gathered from hacker forums and/or discussion groups, chat rooms, dark web, or dark net forums, such as the Tor Network, Internet Relay Chat (IRC), and combinations thereof - just to name a few. Employee sentiment includes: career opportunities; work/life balance; compensation; and combinations thereof - just to name a few. Company sentiment includes: senior leadership ratings; overall company ratings; recommendations; etc.
Customer sentiment includes: product ratings; service ratings, and combinations thereof - just to name a few.
[0054] The present technology determines a level of sophistication of the entity.
Sophistication may be considered a measure of People, Process, and Technology. People indicates how security-aware the entity's employees, principals, and/or members are: in particular, whether the people associated with the entity understand the risks, whether they are competent in security, and combinations thereof. Process indicates whether procedures and/or policies have clear and enforceable terms and clearly indicate what to do in case of various events, including attacks. Process also indicates whether training is provided to employees, third party contractors, and/or service providers, indicates their level of expertise, and combinations thereof.
[0055] Examples of sophistication elements include: hosting infrastructure; topology;
vulnerability scanning; people; and combinations thereof - just to name a few. Hosting infrastructure includes: content distribution networks; shared hosting; cloud providers; etc. Topology includes: accessibility points; page layout; content on site; etc. Vulnerability scanning includes: CVEs (common vulnerabilities and exposures); patching; updating; default passwords; etc. People includes: chief information security officer (CISO); security team; skills; job postings; etc. In this manner, sophistication encompasses more than just vulnerability, and additionally includes people and processes that may impact a defensive posture of an entity.
[0056] Determining these variables may be a data gathering operation, which may be based on public information or a company's own data networks, as discussed herein. A cyber risk assessment, for instance a two by two (or higher order) graph, may be output, along with a composite score, a peer rank, an estimated financial impact, and recommendations to decrease the cyber risk. These may all be output for each company assessed. All of these elements may be updated over time and in response to
implementation of recommendations, thus transforming the original data via the use of a particular computer.
[0057] In some embodiments, the system 105 is configured to evaluate each data point with respect to history, lineage, provenance (e.g., origin), source, time, entities and other details. The system 105 can then cleanse and standardize the data points. Examples of cleansing and standardizing using data normalization are described in greater detail below.
[0058] In some embodiments, the system 105 can use a canonical representation of the data points. As mentioned above, the system 105 can track entities and their attributes/elements over time. The system 105 is also configured to process rollups (e.g., summarizing the data along a dimension), aggregations, transforms, reductions, normalizations, deltas, as well as other types of data transformation or conversion processes that can also be used to convert the motivation/sophistication/combination elements into scores.
[0059] The system 105 then generates module-ready data for use with matrices of elements (motivation/sophistication) for one or more entities. In some embodiments, the system 105 then executes one or more models to generate scores, results, recommendations, delta values (changes in scores over time), as well as historical tracking of scores.
[0060] In some embodiments, the system 105 comprises a scoring and plotting module 135 that is generally configured to calculate sophistication scores, motivation scores, and combination scores; apply weighting to sophistication and/or motivation elements in various calculations; compare scores to threshold values; benchmark various scores over time; as well as other features described herein.
[0061] In a second set of functions, the scoring and plotting module 135 can create visual representations such as the graphs illustrated in FIGs. 2-8. [0062] In one embodiment, the scoring and plotting module 135 is configured to calculate various scores for an entity. In another embodiment the scoring and plotting module 135 can calculate various scores for a plurality of entities. Again, these various scores can be calculated over time and utilized for benchmarking cyber security performance for an entity, or a group of entities that possess a particular attribute in common. For example, the scoring and plotting module 135 can calculate scores for groups of entities in an industry group, a geographical location, a company size, a technology sector, and so forth.
[0063] In an example calculation, the scoring and plotting module 135 is configured to calculate a motivation score for one or more entities. The scoring and plotting module 135 obtains motivation elements collected from the various resources and converts this information into a mathematical representation. In one embodiment, a motivation element of pageviews can be mathematically represented by comparing the pageviews of the entity to a set of thresholds. For context, the pageviews could be a pageview of a particular webpage or set of webpages. To be sure, the higher profile and more visited a website is, the more likely that it will be attractive to a hacker, especially if other motivation factors are present such as the entity being involved in commercial or financial activities, just for example.
[0064] For purposes of obtaining a coherent scoring scheme, the scoring and plotting module 135 may normalize various elements to obtain mathematical values that are usable in an algorithm for scoring motivation or sophistication. By way of example, each of the set of thresholds is associated with a mathematical value. If the entity has pageviews in excess of 10,000 unique users in one day, the entity is given a score of five. If the entity has pageviews in excess of 100,000 unique users in one day, the entity is given a score of ten. If the entity has pageviews in excess of 200,000 unique users in one day, the entity is given a score of fifteen. Again, these are merely examples of possible ways to convert pageviews into a mathematical representation that can be combined with other mathematical representations of other motivation elements in order to create an overall motivation score. [0065] In other examples, an employee sentiment can be represented mathematically as a percentage of positive versus negative comments from employees. In another example, negative employee behaviors, actions, or statements can be counted over time and compared to thresholds (in a method similar to that above with respect to pageviews).
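A minimal sketch of this kind of threshold-based normalization is shown below in Python; the threshold values and the sentiment scaling are hypothetical and merely mirror the 10,000 / 100,000 / 200,000 pageview example and the sentiment-percentage example above.

```python
# Illustrative sketch only: the thresholds and scale values are hypothetical
# and simply mirror the pageview example described above.

def pageviews_to_score(daily_unique_pageviews: int) -> int:
    """Map daily unique pageviews onto a stepwise motivation value."""
    thresholds = [(200_000, 15), (100_000, 10), (10_000, 5)]
    for limit, value in thresholds:
        if daily_unique_pageviews > limit:
            return value
    return 0  # below the lowest threshold contributes nothing

def employee_sentiment_to_score(positive_comments: int, negative_comments: int) -> float:
    """Represent employee sentiment as the share of negative comments, scaled to 0-15."""
    total = positive_comments + negative_comments
    return 0.0 if total == 0 else 15.0 * negative_comments / total

print(pageviews_to_score(120_000))                    # -> 10
print(round(employee_sentiment_to_score(40, 10), 1))  # -> 3.0
```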
[0066] Each of the motivation elements (if necessary) is converted into a mathematical representation. The ultimate motivation score can be calculated by taking a sum of each mathematical representation of motivation elements. In some embodiments, the
motivation score can be a representation of one or a combination of many motivation elements.
[0067] In some embodiments, the system 105 can be configured to weight one or more of the elements in a score calculation. For example, if it is determined that certain elements are more likely to increase the likelihood of a security failure (e.g., a cyber attack, a privacy incident), these elements can be assigned a weight. In an example, the weight is applied by multiplying a mathematical representation of an element by a coefficient or factor. If an element value for pageviews is five, a weighting could include multiplying this number by a coefficient of .5, which reduces the impact of that value on the overall score. Increases in element values can also be achieved.
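The following sketch illustrates how normalized element values and weighting coefficients could be combined into an overall motivation score as a weighted sum; the element names, values, and weights are hypothetical and are not taken from the disclosure.

```python
# Illustrative sketch only: element names, normalized values, and weights are
# hypothetical. Combines mathematical representations of motivation elements
# into an overall motivation score as a weighted sum; a coefficient below 1.0
# reduces an element's impact on the overall score (e.g., the 0.5 coefficient
# applied to a pageviews value of five in the example above).

normalized_elements = {
    "pageviews": 5.0,
    "hacker_sentiment": 8.0,
    "revenue": 6.0,
}

weights = {
    "pageviews": 0.5,         # de-emphasize raw traffic
    "hacker_sentiment": 1.5,  # emphasize chatter about the entity
    "revenue": 1.0,
}

motivation_score = sum(
    value * weights.get(name, 1.0)  # unweighted elements default to 1.0
    for name, value in normalized_elements.items()
)
print(motivation_score)  # 5*0.5 + 8*1.5 + 6*1.0 = 20.5
```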
[0068] While the above examples reference motivation elements, the scoring and plotting module 135 is also configured to process sophistication elements to obtain sophistication scores. The exact details for converting sophistication/motivation elements into
mathematical representations will vary according to the type of information included in the elements. To be sure, some types of elements such as pageviews and revenue are inherently more mathematical in their quantities, while other elements are more non-mathematical in nature such as employee or customer sentiment. For non-mathematical elements, users can develop suitable schemes or algorithms for converting or quantifying these elements into mathematical form. [0069] According to some embodiments, the scoring and plotting module 135 can determine various facets of an entity or group of entities by comparing the motivation, sophistication, and/or combined scores of these entities. Answers to pertinent questions can be deduced or inferred from the comparison.
[0070] For example, in one embodiment, the scoring and plotting module 135 is
configured to determine a position of an entity within an aggregate risk score of a portfolio of entities. Thus, the scoring and plotting module 135 has been used to calculate an aggregate risk score (motivation, sophistication, and/or combined) for numerous entities. In one embodiment, the scoring and plotting module 135 selects a plurality of motivation elements and analyzes these elements for each of a portfolio (plurality) of entities using the above examples as a guide for calculating motivation scores. In some embodiments, the same motivation elements are used for each entity.
[0071] The scoring and plotting module 135 can then determine where the entity lies within the group of scores. For example, out of 30 entities, a subject entity might place 25th.
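A simple sketch of placing an entity within an aggregate group of scores might look like the following; the scores are hypothetical, and a higher combined score is assumed here to indicate higher cyber risk.

```python
# Illustrative sketch only: the scores are hypothetical. Places a subject
# entity within an aggregate group of scores (e.g., "25th out of 30").

def rank_in_portfolio(entity_score: float, portfolio_scores: list[float]) -> tuple[int, int]:
    """Return (rank, total), where rank 1 is the highest-risk entity."""
    higher = sum(1 for s in portfolio_scores if s > entity_score)
    return higher + 1, len(portfolio_scores)

scores = [72, 65, 90, 55, 61, 48, 80]   # hypothetical portfolio; entity's score is 61
rank, total = rank_in_portfolio(61, scores)
print(f"{rank} out of {total}")          # -> 5 out of 7
```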
[0072] The scoring and plotting module 135 can also be utilized to generate graphs and GUIs that display various scores in graphical format(s). For example, in FIG. 2, a graph with two axes is illustrated. The graph 200 comprises a vertical axis that is representative of motivation elements, and the horizontal axis is representative of sophistication elements. Indeed, this graph can be used to display information about a single entity or a plurality of entities.
[0073] In one embodiment, the motivation axis is delineated or stratified based on the type of content. Less important types of secure information are located towards the bottom of the axis, whereas more important types of information are located at the top part of the axis. In this embodiment, the lower part of the motivation axis references payment cards (e.g., credit cards) and other types of general consumer information. Above that is online crime such as phishing, malware, and other malicious behavior. Above online crime is IP theft and industrial espionage. At the top of the motivation axis are state secrets. To be sure, other categories of information types will lie somewhere along this axis, if not specifically mentioned. Furthermore, the axis can be defined by other types of information points. For example, an entity can structure their motivation axis to include information that they deal with, structured from least important to most important.
[0074] In the sophistication axis, which is the horizontal axis, hacker profiles are listed from left to right on the axis from a lowest impact actor type to a highest impact actor type. For example, actor types can include casual hackers, professional hackers, organized crime, and state actors. Each of these actor types has a different threat level associated therewith. The sophistication axis represents the strength or threat level that it takes to successfully hack the subject entity/entities.
[0075] FIG. 3 is an example graphical user interface (GUI) that comprises a scatter plot illustrating an entity's motivation and sophistication relative to cyber risk. The scatter plot 300 comprises a vertical motivation axis and a horizontal sophistication axis. Each of the points plotted on the scatter plot 300 represents an entity. Again, these entities can be analyzed together because they are a part of an entity group (e.g., industry group, same geographical location, same company size, etc.).
[0076] FIG. 4 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their combination scores. The bar graph 400 comprises a vertical axis that represents a number of companies and a horizontal axis that represents combination scores for a set of entities. For example, most entities in the group have combination scores (sophistication and motivation) that fall within a score range of 51-60. Other groups of entities fall within other score ranges.
[0077] To be sure, the system 105 can conduct an elemental analysis of these similar-scoring groups to identify what elements are shared between the entities, what elements are different, and so forth. Thus, the graphing of entities based on scores aids the system 105 in identifying groups of entities that require attention. For example, the entities in the score range of 31-40 are severely underperforming.
[0078] FIG. 5 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their sophistication scores. The bar graph 500 comprises a vertical axis that represents a number of companies and a horizontal axis that represents sophistication scores for a set of entities.
[0079] FIG. 6 is an example graphical user interface (GUI) that comprises a bar graph illustrating the plotting of a plurality of entities based on their motivation scores. The bar graph 600 comprises a vertical axis that represents a number of companies and a horizontal axis that represents motivation scores for a set of entities.
[0080] By comparing these graphs illustrated in FIGs. 4-6, underperformance in
sophistication and/or motivation can be quickly and easily determined, at least on a high level. Again, a more granular element analysis can be conducted when groups with underperforming sophistication/motivation scores are identified.
[0081] FIG. 7 is an example graphical user interface (GUI) that comprises a scatter plot that represents a plurality of entities plotted according to their combination score. The scatter plot 700 includes a plurality of data points that each represents an entity. The plot 700 comprises a vertical axis that represents motivation and a horizontal axis that represents sophistication scores for a set of entities. The higher risk area on the plot is where the motivation to hack is high and the sophistication of the entity is low.
[0082] The system 105 can create a line 705 of acceptable motivation/sophistication scores. Companies falling below this line 705 have a suitable cyber risk profile, whereas companies above the line have an unsuitable cyber risk profile. These companies can be identified and analyzed in order to suggest recommendations for improving their cyber risk.
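One possible way to express such an acceptability line programmatically is sketched below; the slope and intercept are assumptions for illustration, not values from the disclosure, and the classification simply checks whether an entity falls below the line (suitable) or above it (unsuitable), analogous to line 705 in FIG. 7.

```python
# Illustrative sketch only: the line parameters are hypothetical. Classifies
# entities as falling below (suitable cyber risk profile) or above
# (unsuitable) a line in the motivation-versus-sophistication plane.

def is_suitable(motivation: float, sophistication: float,
                slope: float = 1.0, intercept: float = 10.0) -> bool:
    """Entity is suitable when its motivation is at or below the line."""
    return motivation <= slope * sophistication + intercept

entities = {
    "Entity A": (35.0, 40.0),   # (motivation, sophistication)
    "Entity B": (70.0, 30.0),
}
for name, (m, s) in entities.items():
    label = "suitable" if is_suitable(m, s) else "unsuitable"
    print(f"{name}: {label}")    # Entity A: suitable, Entity B: unsuitable
```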
[0083] FIG. 8 is an example graphical user interface (GUI) 800 that comprises a scatter plot 805 that represents a plurality of entities plotted according to their combination score, as well as additional graphical representations for an entity and a list of recommendations based on the plotting.
[0084] The plot is similar to that of FIG. 7, with the addition of two graphical
representations. For example, a linear slide 820 displays the position of an entity within a peer group of entities. This same relative position is illustrated in a gauge graph 810.
[0085] In response to making a cyber risk assessment, the recommendation module 140 can be executed to provide the end user (or entity) with some type of actionable feedback. For example, the recommendation module 140 can provide one or more actions to the end user based on the diversity score and the clusters of similar variables. This is described in further detail in U.S. patent application no. 14/585,051 filed December 29, 2014 and entitled "Diversity Analysis with Actionable Feedback Methodologies," which is hereby incorporated by reference herein in its entirety, including all references cited therein. These one or more actions potentially decrease the cyber risk of the entity. In one example, the recommendation module 140 can automatically identify variables which, if changed, would affect the cyber risk assessment. In further exemplary embodiments, entities may agree to automatic implementation of recommendations in exchange for lower policy premiums.
[0086] As best illustrated in FIG. 8, a set of recommendations 815 is provided along with the graphical analysis generated for the entity. Again, these recommendations are based on the system 105 having knowledge of the motivation elements, sophistication elements, as well as the scores calculated not only for the entity, but other entities (in some embodiments).
[0087] Exemplary methods and systems according to the present technology may also provide benchmarking over time. In this manner, the system 105 can track, for a company or group of entities, cyber risk over a selectable time period, for example days, weeks, months, and/or years. This benchmarking may be against a dynamic or static evaluation of the peer group, for instance, an entity's past and present cyber risk tracked against a static past peer group, static present peer group, and/or dynamic peer group. The present technology provides information related to the updated information (the new motivation score, the new sophistication score, the new composite score, etc.), including differences (the amount of the change made in one or more updates, namely the delta), and trends (patterns over many time steps).
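A minimal sketch of this benchmarking over time, assuming a hypothetical composite-score history and treating a lower composite score as lower cyber risk (an assumption made only for this illustration), could look like the following.

```python
# Illustrative sketch only: the score history is hypothetical. Tracks an
# entity's composite score over time and reports per-period deltas and an
# overall trend.

history = [("2015-01", 62.0), ("2015-02", 64.5), ("2015-03", 61.0)]

# Delta between each pair of consecutive periods
deltas = [
    (later[0], round(later[1] - earlier[1], 1))
    for earlier, later in zip(history, history[1:])
]

# Assumption: a lower composite score is treated as lower cyber risk
trend = "improving" if history[-1][1] < history[0][1] else "worsening or flat"

print(deltas)  # [('2015-02', 2.5), ('2015-03', -3.5)]
print(trend)   # -> improving
```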
[0088] FIG. 9 is a flowchart of an example method 900 of the present technology. The method includes the system 105 assessing 905 the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements. The cyber risk includes the risk of a security failure (e.g., a cyber attack, a privacy incident) of the entity.
[0089] The system 105 may query the entity for information, scrape available online sources such as websites, corporate filings, news sources, other public record databases, and other resources. Additionally, data may be gathered from the entity's network using devices already present there or by placing a new data collecting device on the entity's network to gather more data. The data collecting device may be on the company's network and/or its periphery, and may collect and/or analyze the data, while also transmitting it to system 105. In this example, additional, proprietary data may be gleaned from a particular entity's network.
[0090] The exemplary method also includes the system 105 automatically determining 910, based on the assessed risk, at least one of: a change to at least one of a term and a condition of an insurance policy, and a setting of the at least one of a term and a condition of the insurance policy. Next, in this example, the method includes the system 105 automatically recommending 915 computer network changes to reduce the assessed risk. Next, the exemplary method includes the system 105, based on the recommended computer network changes, automatically reassessing 920 the cyber risk of the computer network. The method may also include the system 105 dynamically re-determining 930, based on the reassessed cyber risk, the at least one of: the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
[0091] FIG. 10 is a flowchart of an example method 1000. The method includes the system 105 assessing 1005 a sophistication for the entity with respect to preventing a cyber security failure using a plurality of sophistication elements for the entity. Again, the sophistication relates to people, processes, and technology. The sophistication analysis as a whole attempts to quantify how strong a threat actor would need to be to execute a successful security failure of the entity.
[0092] Next, the method includes the system 105 assessing 1010 a motivation of an actor (e.g., a hacker) to initiate a cyber security failure, the assessment using a plurality of motivation elements regarding the entity.
[0093] In some embodiments, the method includes the system 105 plotting 1015 the sophistication against the motivation for the entity and other members of a peer group of the entity. Again, the plotting is performed, in this example, in a matrix that visually illustrates the cyber risk of the entity. The plotting can also include a two dimensional graph.
[0094] The method also includes the system 105 providing 1020 recommendations to the entity to improve the cyber risk based on the plotting of the sophistication against the motivation.
[0095] FIG. 11 is a flowchart of yet another example method 1100 for modifying an insurance policy based on a cyber risk analysis. The method includes the system 105 assessing 1105 a sophistication for the entity with respect to preventing a security failure (e.g., a cyber attack, a privacy incident, to name a few) of the entity using a plurality of sophistication elements for the entity. Again, the sophistication relates to people, processes, and technology. The sophistication analysis as a whole attempts to quantify how strong a threat actor would need to be to cause a successful cyber failure. [0096] Next, the method includes the system 105 assessing 1110 a motivation of an actor (for example, a hacker) to initiate a security failure of the entity, using a plurality of motivation elements regarding the entity.
[0097] To be sure, steps 1105 and 1110 include the collection of motivation and
sophistication elements, converting these elements into mathematical representations (if needed), and processing these elements into scores using relevant algorithms.
[0098] In some embodiments, the method includes the system 105 plotting 1115 the sophistication against the motivation for the entity and other members of a peer group of the entity. Again, the plotting, for this example, is performed in a matrix that visually illustrates the cyber risk of the entity. The plotting, for this example, can also include a two dimensional graph.
[0099] The method also includes the system 105, at step 1120, automatically recommending changes (e.g., computer network changes) based on the plotting.
[00100] In some embodiments, the system 105 performs an analysis of the motivation and sophistication elements without plotting as in step 1115. Thus, the change to, and/or setting of, the insurance policy does not need to be based on plotting.
[00101] According to some embodiments, the system 105 can be programmed with insurance policy parameters. The system 105 can generate recommendations for the insurer based on the motivation and sophistication analysis of the entity. In some instances, the recommendation could be to deny a policy or terminate a policy if the entity has motivation or sophistication elements that are defined by the insurance policy as being unacceptable or uninsurable.
[00102] FIG. 12 illustrates an exemplary computer system 1200 that may be used to implement some embodiments of the present disclosure. The computer system 1200 of FIG. 12 may be implemented in the contexts of the likes of computing systems, networks, servers, or combinations thereof. The computer system 1200 of FIG. 12 includes one or more processor units 1210 and main memory 1220. Main memory 1220 stores, in part, instructions and data for execution by processor units 1210. Main memory 1220 stores the executable code when in operation, in this example. The computer system 1200 of FIG. 12 further includes a mass data storage 1230, portable storage device 1240, output devices 1250, user input devices 1260, a graphics display system 1270, and peripheral devices 1280.
[00103] The components shown in FIG. 12 are depicted as being connected via a single bus 1290. The components may be connected through one or more data transport means. Processor unit 1210 and main memory 1220 are connected via a local microprocessor bus, and the mass data storage 1230, peripheral device(s) 1280, portable storage device 1240, and graphics display system 1270 are connected via one or more input/output (I/O) buses.
[00104] Mass data storage 1230, which can be implemented with a magnetic disk drive, solid state drive, or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1210. Mass data storage 1230 stores the system software for implementing embodiments of the present disclosure for purposes of loading that software into main memory 1220.
[00105] Portable storage device 1240 operates in conjunction with a portable non-volatile storage medium, such as a flash drive, floppy disk, compact disk, digital video disc, or Universal Serial Bus (USB) storage device, to input and output data and code to and from the computer system 1200 of FIG. 12. The system software for implementing embodiments of the present disclosure is stored on such a portable medium and input to the computer system 1200 via the portable storage device 1240.
[00106] User input devices 1260 can provide a portion of a user interface. User input devices 1260 may include one or more microphones; an alphanumeric keypad, such as a keyboard, for inputting alphanumeric and other information; or a pointing device, such as a mouse, a trackball, a stylus, or cursor direction keys. User input devices 1260 can also include a touchscreen. Additionally, the computer system 1200 as shown in FIG. 12 includes output devices 1250. Suitable output devices 1250 include speakers, printers, network interfaces, and monitors.
[00107] Graphics display system 1270 includes a liquid crystal display (LCD) or other suitable display device. Graphics display system 1270 is configurable to receive textual and graphical information and to process the information for output to the display device.
[00108] Peripheral devices 1280 may include any type of computer support device that adds additional functionality to the computer system.
[00109] The components provided in the computer system 1200 of FIG. 12 are those typically found in computer systems that may be suitable for use with embodiments of the present disclosure and are intended to represent a broad category of such computer components that are well known in the art. Thus, the computer system 1200 of FIG. 12 can be a personal computer (PC), handheld computer system, telephone, mobile computer system, workstation, tablet, phablet, mobile phone, server, minicomputer, mainframe computer, wearable, or any other computer system. The computer system 1200 may also include different bus configurations, networked platforms, multi-processor platforms, and the like. Various operating systems may be used, including UNIX, LINUX, WINDOWS, MAC OS, PALM OS, QNX, ANDROID, IOS, CHROME, TIZEN, and other suitable operating systems.
[00110] The processing for various embodiments may be implemented in software that is cloud-based. In some embodiments, the computer system 1200 is implemented as a cloud-based computing environment, such as a virtual machine operating within a computing cloud. In other embodiments, the computer system 1200 may itself include a cloud-based computing environment, where the functionalities of the computer system 1200 are executed in a distributed fashion. Thus, the computer system 1200, when configured as a computing cloud, may include pluralities of computing devices in various forms, as will be described in greater detail below.
[00111] In general, a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or that combines the storage capacity of a large grouping of computer memories or storage devices. Systems that provide cloud-based resources may be utilized exclusively by their owners, or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
[00112] The cloud may be formed, for example, by a network of web servers that comprise a plurality of computing devices, such as the computer system 1200, with each server (or at least a plurality thereof) providing processor and/or storage resources. These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user places workload demands upon the cloud that vary in real time, sometimes dramatically. The nature and extent of these variations typically depends on the type of business associated with the user.
[00113] The present technology is described above with reference to example embodiments. Therefore, other variations upon the example embodiments are intended to be covered by the present disclosure.

Claims

What is claimed is:
1. A method for reducing the risk of cyber security failures in a computer network of an entity, the method comprising:
assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements;
automatically determining, based on the assessed risk, at least one of:
a change to at least one of a term and a condition of an insurance policy, and
a setting of the at least one of a term and a condition of the insurance policy;
automatically recommending computer network changes to reduce the assessed risk;
automatically reassessing the cyber risk of the computer network based on the recommended computer network changes; and
dynamically re-determining, based on the reassessed cyber risk, the at least one of: the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
2. The method of claim 1, wherein the cyber security failure comprises a cyber attack.
3. The method of claim 1, wherein the cyber security failure comprises a privacy incident involving sensitive information.
4. The method of claim 1, wherein the computer agent is further configured to perform at least one of collecting information from the computer network of the entity, and analyzing information from the computer network of the entity.
5. The method of claim 1, further comprising:
based on the assessing, plotting one or more features of the entity and other members of a peer group of the entity, the plotting being configured to visually illustrate the cyber risk of the entity; the automatically recommending computer network changes being based on the plotting.
6. The method of claim 5, wherein the plotting is performed in a matrix that visually illustrates the cyber risk of the entity.
7. The method of claim 2, further comprising:
in response to the entity implementing at least a portion of the recommended computer network changes, causing the change to the at least one of a term and a condition of an insurance policy of the entity.
8. The method of claim 1, wherein the assessing comprises assessing, using a plurality of sophistication elements for the entity, a sophistication for the entity with respect to preventing the cyber security failure, the sophistication being one of the features of the entity.
9. The method of claim 1, wherein the assessing comprises assessing, using a plurality of motivation elements regarding the entity, a motivation of an actor to initiate the cyber security failure, the motivation being one of the features of the entity.
10. The method of claim 9, wherein the actor is a hacker.
11. The method of claim 1, wherein the assessing comprises:
assessing, using a plurality of sophistication elements for the entity, a sophistication for the entity with respect to preventing the cyber security failure, the sophistication being one of the features of the entity; and
assessing, using a plurality of motivation elements regarding the entity, a motivation of an actor to initiate the security failure, the motivation being another one of the features of the entity.
12. The method of claim 11, further comprising calculating a composite score from a motivation score and a sophistication score, the motivation score representing the plurality of motivation elements, the sophistication score representing the plurality of sophistication elements.
13. The method according to claim 12, further comprising:
creating an aggregate risk score of a portfolio of entities based on a plurality of motivation scores including the motivation score and a plurality of sophistication scores including the sophistication score; and
benchmarking over time at least one of the sophistication score, the motivation score, the composite score, and the aggregate risk score.
14. The method according to claim 13, further comprising determining a position of the entity relative to the aggregate risk score of the portfolio of entities, the portfolio of entities belonging to at least one of an industry group, a geographic location, a company size, a technology sector, or any combinations thereof.
15. The method of claim 13, wherein the assessed cyber risk of the entity is a function of the aggregate risk score of the portfolio of entities.
16. The method of claim 13, further comprising:
comparing the assessed cyber risk for the entity to the aggregate risk score; and automatically generating and recommending additional changes to the entity to reduce the assessed cyber risk.
17. The method of claim 13, further comprising identifying clusters of sophistication elements or motivation elements shared between two or more of the portfolios of entities, the clusters of sophistication elements or motivation elements being associated with an increase in cyber risk.
18. The method of claim 13, further comprising:
identifying additional sophistication elements or motivation elements for at least one of the two or more of the portfolios of entities that are not shared with the portfolio of entities, the additional sophistication elements or motivation elements being associated with another increase in cyber risk; and
generating recommendations for the at least one of the two or more of the portfolio of entities that will cause a decrease in the cyber risk.
19. The method of claim 11, further comprising estimating a financial impact to the entity for a simulated security failure, the simulated security failure testing the sophistication of the entity and being affected by the motivation regarding the entity.
20. The method of claim 19, wherein the estimated financial impact to the entity for the simulated security failure is dynamically calculated based on an implementation of the recommended changes by the entity.
21. The method of claim 12, further comprising providing attribution for a change in at least one of the composite, motivation, or sophistication scores, the attribution indicating a change in an underlying data set to effect the change in the at least one of the composite, motivation, or sophistication scores.
22. The method of claim 11, wherein the recommended computer network changes affect at least one of the motivation and the sophistication of the entity in such a way that the assessed cyber risk is reduced.
23. The method of claim 1, wherein at least one of the recommended computer network changes is implemented automatically without intervention of the entity.
24. The method of claim 1, further comprising:
in response to the entity implementing at least a portion of the recommended computer network changes, further automatically reassessing the cyber risk; and
generating subsequent recommended changes if the cyber risk has not decreased sufficiently to meet a threshold.
25. The method of claim 1, wherein the insurance policy is at least one of:
a policy from an insurance company; and
a product warranty for first and/or third party costs that the entity purchases from one of a networking, security product, or services provider.
26. The method of claim 1, wherein the at least one of a term and a condition includes at least one of a retention amount, a deductible, a premium, a coverage limit, a future valuation, and a term length.
27. A system for reducing the risk of cyber security failures in a computer network of an entity, the system comprising:
a processor; and
a memory communicatively coupled with the processor, the memory storing instructions which, when executed by the processor, perform a method comprising:
assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements;
automatically determining, based on the assessed risk, at least one of:
a change to at least one of a term and a condition of an insurance policy, and
a setting of the at least one of a term and a condition of the insurance policy;
automatically recommending computer network changes to reduce the assessed risk;
automatically reassessing the cyber risk of the computer network based on the recommended computer network changes; and
dynamically re-determining, based on the reassessed cyber risk, the at least one of: the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
28. A non-transitory computer readable medium having recorded thereon a program, the program when executed causing a computer to perform a method, the method comprising:
assessing the risk of a cyber security failure in a computer network of an entity, using a computer agent configured to collect information from at least publicly accessible Internet elements;
automatically determining, based on the assessed risk, at least one of:
a change to at least one of a term and a condition of an insurance policy, and
a setting of the at least one of a term and a condition of the insurance policy;
automatically recommending computer network changes to reduce the assessed risk;
automatically reassessing the cyber risk of the computer network based on the recommended computer network changes; and
dynamically re-determining, based on the reassessed cyber risk, the at least one of: the change to at least one of a term and condition of the insurance policy, and the setting of the at least one of a term and a condition of the insurance policy.
PCT/US2015/067968 2014-12-29 2015-12-29 System for cyber insurance policy including cyber risk assessment/management service WO2016109608A1 (en)

Priority Applications (14)

Application Number Priority Date Filing Date Title
US15/099,297 US10341376B2 (en) 2014-12-29 2016-04-14 Diversity analysis with actionable feedback methodologies
US15/141,779 US9521160B2 (en) 2014-12-29 2016-04-28 Inferential analysis using feedback for extracting and combining cyber risk information
US15/142,997 US9699209B2 (en) 2014-12-29 2016-04-29 Cyber vulnerability scan analyses with actionable feedback
US15/371,047 US10230764B2 (en) 2014-12-29 2016-12-06 Inferential analysis using feedback for extracting and combining cyber risk information
US15/373,298 US10050989B2 (en) 2014-12-29 2016-12-08 Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US15/374,212 US10050990B2 (en) 2014-12-29 2016-12-09 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US15/457,921 US10218736B2 (en) 2014-12-29 2017-03-13 Cyber vulnerability scan analyses with actionable feedback
US15/972,027 US10498759B2 (en) 2014-12-29 2018-05-04 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US15/971,909 US10491624B2 (en) 2014-12-29 2018-05-04 Cyber vulnerability scan analyses with actionable feedback
US15/971,946 US10511635B2 (en) 2014-12-29 2018-05-04 Inferential analysis using feedback for extracting and combining cyber risk information
US16/582,977 US11146585B2 (en) 2014-12-29 2019-09-25 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US16/662,936 US11153349B2 (en) 2014-12-29 2019-10-24 Inferential analysis using feedback for extracting and combining cyber risk information
US17/465,739 US11855768B2 (en) 2014-12-29 2021-09-02 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US17/477,294 US11863590B2 (en) 2014-12-29 2021-09-16 Inferential analysis using feedback for extracting and combining cyber risk information

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462098238P 2014-12-30 2014-12-30
US62/098,238 2014-12-30
US201514614897A 2015-02-05 2015-02-05
US14/614,897 2015-02-05

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
US14614897 Continuation 2012-02-05
US201514614897A Continuation 2014-12-29 2015-02-05
US14/931,510 Continuation-In-Part US9373144B1 (en) 2014-12-29 2015-11-03 Diversity analysis with actionable feedback methodologies

Related Child Applications (4)

Application Number Title Priority Date Filing Date
US14/585,051 Continuation-In-Part US9253203B1 (en) 2014-12-29 2014-12-29 Diversity analysis with actionable feedback methodologies
US15/099,297 Continuation-In-Part US10341376B2 (en) 2014-12-29 2016-04-14 Diversity analysis with actionable feedback methodologies
US15/141,779 Continuation-In-Part US9521160B2 (en) 2014-12-29 2016-04-28 Inferential analysis using feedback for extracting and combining cyber risk information
US15/142,997 Continuation-In-Part US9699209B2 (en) 2014-12-29 2016-04-29 Cyber vulnerability scan analyses with actionable feedback

Publications (2)

Publication Number Publication Date
WO2016109608A1 true WO2016109608A1 (en) 2016-07-07
WO2016109608A9 WO2016109608A9 (en) 2016-10-20

Family

ID=56285019

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/067968 WO2016109608A1 (en) 2014-12-29 2015-12-29 System for cyber insurance policy including cyber risk assessment/management service

Country Status (1)

Country Link
WO (1) WO2016109608A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11855768B2 (en) 2014-12-29 2023-12-26 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US11277483B2 (en) 2017-03-31 2022-03-15 Microsoft Technology Licensing, Llc Assessing user activity using dynamic windowed forecasting on historical usage
US11893121B1 (en) 2022-10-11 2024-02-06 Second Sight Data Discovery, Inc. Apparatus and method for providing cyber security defense in digital environments

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020091551A1 (en) * 2000-09-19 2002-07-11 Robert Parisi Internet insurance product
US20100114634A1 (en) * 2007-04-30 2010-05-06 James Christiansen Method and system for assessing, managing, and monitoring information technology risk
US20120011077A1 (en) * 2010-07-12 2012-01-12 Bhagat Bhavesh C Cloud Computing Governance, Cyber Security, Risk, and Compliance Business Rules System and Method
US20140142988A1 (en) * 2012-11-21 2014-05-22 Hartford Fire Insurance Company System and method for analyzing privacy breach risk data

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10498759B2 (en) 2014-12-29 2019-12-03 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10050989B2 (en) 2014-12-29 2018-08-14 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US10511635B2 (en) 2014-12-29 2019-12-17 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10050990B2 (en) 2014-12-29 2018-08-14 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10218736B2 (en) 2014-12-29 2019-02-26 Guidewire Software, Inc. Cyber vulnerability scan analyses with actionable feedback
US10230764B2 (en) 2014-12-29 2019-03-12 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10341376B2 (en) 2014-12-29 2019-07-02 Guidewire Software, Inc. Diversity analysis with actionable feedback methodologies
US11153349B2 (en) 2014-12-29 2021-10-19 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US20160234247A1 (en) 2014-12-29 2016-08-11 Cyence Inc. Diversity Analysis with Actionable Feedback Methodologies
US10491624B2 (en) 2014-12-29 2019-11-26 Guidewire Software, Inc. Cyber vulnerability scan analyses with actionable feedback
US11146585B2 (en) 2014-12-29 2021-10-12 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US9699209B2 (en) 2014-12-29 2017-07-04 Cyence Inc. Cyber vulnerability scan analyses with actionable feedback
US11863590B2 (en) 2014-12-29 2024-01-02 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10404748B2 (en) 2015-03-31 2019-09-03 Guidewire Software, Inc. Cyber risk analysis and remediation using network monitored sensors and methods of use
US11265350B2 (en) 2015-03-31 2022-03-01 Guidewire Software, Inc. Cyber risk analysis and remediation using network monitored sensors and methods of use
US10970787B2 (en) 2015-10-28 2021-04-06 Qomplx, Inc. Platform for live issuance and management of cyber insurance policies
US11475528B2 (en) 2015-10-28 2022-10-18 Qomplx, Inc. Platform for live issuance and management of cyber insurance policies
US11514531B2 (en) 2015-10-28 2022-11-29 Qomplx, Inc. Platform for autonomous risk assessment and quantification for cyber insurance policies
US11533296B2 (en) 2017-09-01 2022-12-20 Kyndryl, Inc. Testing and remediating compliance controls
US11502995B2 (en) 2017-09-01 2022-11-15 Kyndryl, Inc. Testing and remediating compliance controls
WO2019173241A1 (en) * 2018-03-04 2019-09-12 Fractal Industries, Inc. Platform for live issuance and management of cyber insurance policies
EP3582468B1 (en) 2018-06-12 2022-02-16 IT-Seal GmbH Method for determining a degree of deception for a single phishing attack against a subject
WO2020056314A1 (en) * 2018-09-13 2020-03-19 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for a simulation program of a percolation model for the loss distribution caused by a cyber attack
US11354752B2 (en) 2018-09-13 2022-06-07 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for a simulation program of a percolation model for the loss distribution caused by a cyber attack
US11257393B2 (en) 2018-10-26 2022-02-22 Circadence Corporation Method and system for evaluating individual and group cyber threat awareness
WO2020086579A1 (en) * 2018-10-26 2020-04-30 Circadence Corporation Method and system for evaluating individual and group cyber threat awareness
US11972695B2 (en) 2018-10-26 2024-04-30 Circadence Corporation Method and system for evaluating individual and group cyber threat awareness

Also Published As

Publication number Publication date
WO2016109608A9 (en) 2016-10-20

Similar Documents

Publication Publication Date Title
US11153349B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information
US10491624B2 (en) Cyber vulnerability scan analyses with actionable feedback
US10050989B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US9521160B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information
US11146585B2 (en) Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10341376B2 (en) Diversity analysis with actionable feedback methodologies
US20220255965A1 (en) Cyber risk analysis and remediation using network monitored sensors and methods of use
WO2016109608A1 (en) System for cyber insurance policy including cyber risk assessment/management service
US20190035027A1 (en) Synthetic Diversity Analysis with Actionable Feedback Methodologies
US11568455B2 (en) System and methods for vulnerability assessment and provisioning of related services and products for efficient risk suppression
US11657352B2 (en) Reducing cybersecurity risk level of a portfolio of companies using a cybersecurity risk multiplier
de Gusmão et al. Cybersecurity risk analysis model using fault tree analysis and fuzzy decision theory
US9373144B1 (en) Diversity analysis with actionable feedback methodologies
Nagurney et al. Multifirm models of cybersecurity investment competition vs. cooperation and network vulnerability
US11855768B2 (en) Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
Iannacone et al. Quantifiable & comparable evaluations of cyber defensive capabilities: A survey & novel, unified approach
US11863590B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information
Ali et al. Framework for evaluating economic impact of IT based disasters on the interdependent sectors of the US economy
de Oca et al. Statistical Analysis and Economic Models for Enhancing Cyber-security in SAINT

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 15876217; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 15876217; Country of ref document: EP; Kind code of ref document: A1