Bayes' Theorem:

\[ P(H|E) = \frac{P(E|H) \cdot P(H)}{P(E)} \]

- \( P(H|E) \): Posterior probability — the probability of the hypothesis given the evidence
- \( P(E|H) \): Likelihood — the probability of the evidence given the hypothesis
- \( P(H) \): Prior probability of the hypothesis
- \( P(E) \): Marginal likelihood — the overall probability of the evidence

Prior Probability (Prior):
- Disease prevalence: 1% (\( P(D) = 0.01 \))

Likelihood:
- Test sensitivity: 99% (\( P(T|D) = 0.99 \))
- False positive rate: 5% (\( P(T|\neg D) = 0.05 \))

New Evidence (Test Result):
- Positive test result

Calculate the Marginal Likelihood:

\[ P(T) = P(T|D) \cdot P(T) + P(T|\neg D) \cdot P(\neg D) = 0.99 \cdot 0.01 + 0.05 \cdot 0.99 = 0.0594 \]

Apply Bayes' Theorem:

\[ P(D|T) = \frac{0.99 \cdot 0.01}{0.0594} \approx 0.167 \]

Even with a positive result from a highly sensitive test, the probability of disease is only about 16.7%: the disease is so rare that false positives outnumber true positives.
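The update above can be sketched in a few lines of Python. `bayes_posterior` is an illustrative helper (not a library function) that expands the marginal by the law of total probability, exactly as in the calculation above:

```python
def bayes_posterior(prior, likelihood, likelihood_given_not):
    """Posterior P(H|E), with the marginal P(E) expanded by the
    law of total probability over H and not-H."""
    marginal = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / marginal

# Disease-test example: 1% prevalence, 99% sensitivity, 5% false-positive rate
posterior = bayes_posterior(prior=0.01, likelihood=0.99, likelihood_given_not=0.05)
print(f"{posterior:.3f}")  # 0.167
```

The same helper applies unchanged to each of the security scenarios below; only the three input probabilities differ.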
Network Threat Detection:
- Prior: Probability of a network anomaly being a threat.
- Evidence: New logs or alerts.
- Likelihood: Probability of the observed data if there is a threat.
- Posterior: Updated threat probability.

Spam Filtering:
- Prior: Probability an email is spam.
- Evidence: Email features.
- Likelihood: Probability of those features if the email is spam.
- Posterior: Updated spam probability.

Prior: Initial risk level from past incidents.
- Example: Based on historical data, there might be a 5% risk of an incident occurring during an event (\( P(R) = 0.05 \)).

Evidence: New surveillance data.
- Example: Recent surveillance footage shows suspicious activity near the event venue.

Likelihood: Probability of observing the evidence if there is a threat.
- Example: If there were a threat, the likelihood of observing such suspicious activity might be 70% (\( P(E|R) = 0.7 \)). If there is no threat, the likelihood of such activity might be only 10% (\( P(E|\neg R) = 0.1 \)).

Posterior: Updated risk level guiding security measures.
- Example: Using Bayes' Theorem to update the risk level based on the new surveillance data yields an updated assessment that helps in deciding whether to increase security measures or alert the executive.
![Risk-assessment](https://private-user-images.githubusercontent.com/100837335/331768785-94b01774-a7d1-4223-b603-423cfd35357c.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjE4NDI4NjEsIm5iZiI6MTcyMTg0MjU2MSwicGF0aCI6Ii8xMDA4MzczMzUvMzMxNzY4Nzg1LTk0YjAxNzc0LWE3ZDEtNDIyMy1iNjAzLTQyM2NmZDM1MzU3Yy5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwNzI0JTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDcyNFQxNzM2MDFaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT1hZjQ4NDNjNmZjN2UxOTY5YzQ2ZTdlNWZiYmFlNGY4NmM2MmE1ZGViMWQyMmU4NTgxODYyODk3Nzc5MjYzYjBhJlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.PViWBmIxwmXkl4nXly3Yh88LvijwLkFCWEB3MMTovCs)
Assume:
- Prior Probability (\( P(R) \)): There is a 5% chance of a security incident (\( P(R) = 0.05 \)).
- Prior Probability of No Threat (\( P(\neg R) \)): \( P(\neg R) = 0.95 \)
- Likelihood (\( P(E|R) \)): 70% chance of observing the evidence if there is a threat (\( P(E|R) = 0.7 \)).
- Likelihood of Evidence if No Threat (\( P(E|\neg R) \)): 10% chance of suspicious activity if there is no threat (\( P(E|\neg R) = 0.1 \)).
- Evidence (\( P(E) \)): Overall probability of observing the evidence, computed from the components by the law of total probability: \( P(E) = 0.7 \cdot 0.05 + 0.1 \cdot 0.95 = 0.13 \).

Calculate the Posterior (\( P(R|E) \)) using Bayes' Theorem:

\[ P(R|E) = \frac{P(E|R) \cdot P(R)}{P(E)} = \frac{0.7 \cdot 0.05}{0.13} \approx 0.269 \approx 26.9\% \]

Thus, after considering the new surveillance data, the probability of a security incident increases from 5% to approximately 26.9%.
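As a quick check, this scenario can be reproduced in Python; the variable names are illustrative, and the marginal \( P(E) \) is derived from the stated component probabilities via the law of total probability:

```python
# Event-risk update from the stated component probabilities
p_r = 0.05           # prior: incident risk at the event
p_e_given_r = 0.7    # chance of the suspicious activity if there is a threat
p_e_given_not = 0.1  # chance of the activity if there is no threat

# Law of total probability for the marginal P(E)
p_e = p_e_given_r * p_r + p_e_given_not * (1 - p_r)  # 0.13
posterior = p_e_given_r * p_r / p_e
print(f"{posterior:.1%}")  # 26.9%
```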

Prior: Initial threat level based on historical data.
- Example: Based on past intelligence reports, there might be a 10% chance of a threat to the executive (\( P(T) = 0.1 \)).

Evidence: New intelligence reports.
- Example: Recent reports indicate heightened chatter about a possible attack.

Likelihood: Probability of the evidence if the threat is real.
- Example: If the threat is real, the likelihood of observing increased chatter might be 80% (\( P(E|T) = 0.8 \)). If there is no threat, the likelihood of such chatter might be 20% (\( P(E|\neg T) = 0.2 \)).

Posterior: Updated threat level for planning protective measures.
- Example: Using Bayes' Theorem to update the threat level based on the new intelligence yields an updated assessment that helps in deciding whether to implement additional protective measures or change the executive's schedule.
![Protective-int](https://private-user-images.githubusercontent.com/100837335/331768789-1b51947c-acc2-4548-9327-ea03d47c54c6.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjE4NDI4NjEsIm5iZiI6MTcyMTg0MjU2MSwicGF0aCI6Ii8xMDA4MzczMzUvMzMxNzY4Nzg5LTFiNTE5NDdjLWFjYzItNDU0OC05MzI3LWVhMDNkNDdjNTRjNi5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwNzI0JTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDcyNFQxNzM2MDFaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT1iMjAxYTVlNTg4YWJjYTI1OTAwNDBkOGYwOWUxNWQ0N2VhMGZkMDQ3NDYxYTE1Yjc3YmI0MmRmMjM0Njc0YjI2JlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.XBxP5zGqzQ4v-b7JlkCHg5V-bpOYPWV3FdXaITx5iD0)
Assume:
- Prior Probability (\( P(T) \)): There is a 10% chance of a threat (\( P(T) = 0.1 \)).
- Prior Probability of No Threat (\( P(\neg T) \)): \( P(\neg T) = 0.9 \)
- Likelihood (\( P(E|T) \)): 80% chance of observing the evidence if the threat is real (\( P(E|T) = 0.8 \)).
- Likelihood of Evidence if No Threat (\( P(E|\neg T) \)): 20% chance of increased chatter if there is no threat (\( P(E|\neg T) = 0.2 \)).
- Evidence (\( P(E) \)): Overall probability of observing the evidence, computed from the components by the law of total probability: \( P(E) = 0.8 \cdot 0.1 + 0.2 \cdot 0.9 = 0.26 \).

Calculate the Posterior (\( P(T|E) \)) using Bayes' Theorem:

\[ P(T|E) = \frac{P(E|T) \cdot P(T)}{P(E)} = \frac{0.8 \cdot 0.1}{0.26} \approx 0.308 \approx 30.8\% \]

Thus, after considering the new intelligence reports, the probability of a threat increases from 10% to approximately 30.8%.

Prior: Initial probability based on geopolitical analysis.
- Example: Historical stability, existing political tensions, and regional conflicts. For instance, if there has been long-standing political tension but no prior invasions, the prior might be low.

Evidence: New information from intelligence sources.
- Example: Recent military movements, intercepted communications, or changes in political rhetoric. For instance, if new intelligence reports suggest unusual military activity near Puerto Rico, this would be the new evidence.

Likelihood: Probability of observing the new data if an invasion or insurgency is imminent.
- Example: If an invasion were imminent, there would likely be specific patterns of military mobilization or political action. Assess how probable the observed evidence is under the assumption that an invasion is actually being planned.
- Calculation: Compare the observed data (e.g., frequency of military exercises) against typical behavior in the absence of an invasion threat.

Posterior: Updated probability guiding strategic decisions.
- Example: Using Bayes' Theorem to update the initial probability with the new evidence leads to an updated assessment of the risk. This updated probability helps in deciding whether to increase military readiness, engage in diplomatic talks, or issue public warnings.
![Inva-pr](https://private-user-images.githubusercontent.com/100837335/331768787-96d49c00-1626-4456-9763-6693ffb2a96a.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjE4NDI4NjEsIm5iZiI6MTcyMTg0MjU2MSwicGF0aCI6Ii8xMDA4MzczMzUvMzMxNzY4Nzg3LTk2ZDQ5YzAwLTE2MjYtNDQ1Ni05NzYzLTY2OTNmZmIyYTk2YS5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwNzI0JTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDcyNFQxNzM2MDFaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT0yMzgxZTEyMjkyZWJmODBmZTNjZGUwYmJhODA1NmEyZWRhYWQzNTA0OGRjNzk3NWQ0MjhmNzdhZjk4ZjgyYTFlJlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.Z8hoTXoXskfT_j7S62PKiN5K-hglRr2qLchv7DYTSJI)
Assume:
- Prior Probability (\( P(I) \)): There is a 2% chance of an invasion based on historical data (\( P(I) = 0.02 \)).
- Prior Probability of No Invasion (\( P(\neg I) \)): \( P(\neg I) = 0.98 \)
- Likelihood (\( P(E|I) \)): If an invasion were imminent, there is an 80% chance of observing the new evidence (\( P(E|I) = 0.8 \)).
- Likelihood of Evidence if No Invasion (\( P(E|\neg I) \)): If there is no invasion, the probability of seeing the evidence might be 5% (\( P(E|\neg I) = 0.05 \)).
- Evidence (\( P(E) \)): Overall probability of observing the evidence, computed from the components by the law of total probability: \( P(E) = 0.8 \cdot 0.02 + 0.05 \cdot 0.98 = 0.065 \).

Calculate the Posterior (\( P(I|E) \)) using Bayes' Theorem:

\[ P(I|E) = \frac{P(E|I) \cdot P(I)}{P(E)} = \frac{0.8 \cdot 0.02}{0.065} \approx 0.246 \approx 24.6\% \]

Thus, after considering the new evidence, the probability of an invasion increases from 2% to approximately 24.6%.

Prior: Initial assessment of infrastructure resilience.
- Example: Based on past assessments, suppose there is a 70% probability that Puerto Rico's infrastructure can withstand a cyber attack (\( P(R) = 0.7 \)).

Evidence: Recent assessments or simulated attack data.
- Example: New penetration tests reveal vulnerabilities, or recent incidents show a pattern of attacks targeting similar infrastructures.

Likelihood: Probability of the observed data if the infrastructure can withstand attacks.
- Example: If the infrastructure is resilient, there is a 90% chance of it performing well under simulated attacks (\( P(E|R) = 0.9 \)). If it is not resilient, the probability of it performing well might be only 20% (\( P(E|\neg R) = 0.2 \)).

Posterior: Updated resilience assessment guiding cybersecurity investments.
- Example: Using the new data to update the resilience probability informs decisions on whether to invest more in cybersecurity measures.
![cyber](https://private-user-images.githubusercontent.com/100837335/331768786-56a31563-4636-42eb-9aa8-354c6322addd.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjE4NDI4NjEsIm5iZiI6MTcyMTg0MjU2MSwicGF0aCI6Ii8xMDA4MzczMzUvMzMxNzY4Nzg2LTU2YTMxNTYzLTQ2MzYtNDJlYi05YWE4LTM1NGM2MzIyYWRkZC5wbmc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjQwNzI0JTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI0MDcyNFQxNzM2MDFaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT1jY2M4OTdlN2JmNjQzNGNiMjZlYzlhZWVjMzMwMmEyYjljZDk4YTk0OGEzYWVkZDhhMzQyYTFkMTVlY2U0YjEyJlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCZhY3Rvcl9pZD0wJmtleV9pZD0wJnJlcG9faWQ9MCJ9.6wxjKQU3VBkM_unAR-25-FG5run5M921GE5lsouS_y0)
Assume:
- Prior Probability (\( P(R) \)): 70% probability of resilience (\( P(R) = 0.7 \)).
- Prior Probability of No Resilience (\( P(\neg R) \)): \( P(\neg R) = 0.3 \)
- Likelihood (\( P(E|R) \)): 90% chance of good performance under attack if resilient (\( P(E|R) = 0.9 \)).
- Likelihood of Evidence if Not Resilient (\( P(E|\neg R) \)): 20% chance of good performance if not resilient (\( P(E|\neg R) = 0.2 \)).
- Evidence (\( P(E) \)): Overall probability of observing good performance, computed from the components by the law of total probability: \( P(E) = 0.9 \cdot 0.7 + 0.2 \cdot 0.3 = 0.69 \). Note that \( P(E) \) can never be smaller than \( P(E|R) \cdot P(R) \); otherwise the posterior would exceed 1.

Calculate the Posterior (\( P(R|E) \)) using Bayes' Theorem:

\[ P(R|E) = \frac{P(E|R) \cdot P(R)}{P(E)} = \frac{0.9 \cdot 0.7}{0.69} \approx 0.913 \approx 91.3\% \]

Thus, after the infrastructure performs well under the new tests, the updated probability of resilience increases from 70% to approximately 91.3%.
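This last scenario can also be checked in Python; the variable names are illustrative. Deriving the marginal \( P(E) \) from the component probabilities, rather than assuming it directly, guarantees the posterior lands in \([0, 1]\):

```python
# Resilience update; P(E) is computed from the components so the
# posterior is a valid probability.
p_r = 0.7            # prior: infrastructure is resilient
p_e_given_r = 0.9    # good test performance if resilient
p_e_given_not = 0.2  # good test performance if not resilient

# Law of total probability for the marginal P(E)
p_e = p_e_given_r * p_r + p_e_given_not * (1 - p_r)  # 0.69
posterior = p_e_given_r * p_r / p_e
print(f"{posterior:.1%}")  # 91.3%
```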