The following content is generated using a preview release of Swimlane's pyattck.
This snippet of data is scoped to the following actor groups:
- APT33
- APT34
- APT39
- Charming Kitten
\subsection{APT Threats Sponsored by the Iranian Government}
Behind the most dangerous APT threats maliciously targeting the general public in democratic societies stand 24 Iranian threat actors worth attention.
\textbf{Cutting Kitten} Aliases: Cutting Kitten (CrowdStrike), TG-2889 (SecureWorks).
{\footnotesize Tools: CsExt, Jasus, KAgent, Net Crawler, PvcOut, SynFlooder, TinyZBot, WndTest and ZhMimikatz.}
\textbf{DNSpionage} focuses on reconnaissance.
{\footnotesize Tools: DNSpionage and Karkoff.}
\textbf{Obfuscation.} To hide weaponized exploit signatures and evade anomaly detection. {\footnotesize Tools: Invoke-Obfuscation, demiguise, Veil-evasion, Invoke-DOSfuscation, morphHTA, Unicorn, Ruler.}
\textbf{Encryption.} To encrypt victims' files. {\footnotesize Tools: zip, gzip, rar, winzip32, 7z.}
\textbf{Exploitation for Privilege Escalation.} To exploit Microsoft CVEs and escalate privileges. {\footnotesize Tools: getsystem, bitsadmin, msbuild, privesc.}
\textbf{File Deletion.} To wipe victims' machines. {\footnotesize Tools: rm, shred, del, rmdir, Remove-Item, vssadmin, wmic, bcdedit, wbadmin.}
\textbf{Rundll32 Tricks.} To bypass defenses such as application whitelisting. {\footnotesize Tools: rundll32 invoking javascript:, vbscript:, http: or https: handlers and .dll payloads.
}
{\textbf{Common Anomaly Detectors}} can be classified into several main categories, including:
\begin{enumerate}
\item Centroid-based: find the mean $c$ of all training data points, then compute the distance of a new sample point $x$ from the centroid $c$. The natural defense is a threshold on the distance value.
\item Kernel PCA-based (Principal Component Analysis): analyze correlations among the variables and find the combinations that best capture differences in outcomes. These combined feature values form a more compact feature space called the principal components.
\item N-gram: a contiguous sequence of $n$ items from a given sample.
\end{enumerate}
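As an illustration of the centroid-based category above, here is a minimal sketch in plain Python. The toy two-dimensional data and the threshold value of 1.0 are assumptions for the example only:

```python
import math

def centroid(points):
    """Mean vector c of the training points."""
    dim = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(dim)]

def distance(x, c):
    """Euclidean distance of a sample x from the centroid c."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, c)))

def is_anomalous(x, c, threshold):
    """Flag x when its distance from the centroid exceeds the threshold."""
    return distance(x, c) > threshold

train = [[1.0, 1.0], [1.2, 0.8], [0.8, 1.2]]
c = centroid(train)                                # approximately [1.0, 1.0]
print(is_anomalous([1.1, 0.9], c, threshold=1.0))  # close to centroid: False
print(is_anomalous([5.0, 5.0], c, threshold=1.0))  # far from centroid: True
```

In practice the threshold is tuned on held-out normal data; a fixed constant is used here only to keep the sketch self-contained.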
{\bfseries Common attacks} fall into two categories:
{\bfseries Evasion Attacks:}
Evasion attacks target models at inference time (attacks on supervised learning). In the signature-based context, polymorphism attacks mutate shellcode so that it no longer matches known signatures.
For example, as described by Fogla et al., the polymorphic blending attack crafts a malicious feature vector with a high similarity score to the normal feature vector. This kind of adversarial example attacks ``reliability'' as a whole.
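To make the blending idea concrete, here is a toy sketch (illustrative only, not Fogla et al.'s actual algorithm). It uses byte-frequency feature vectors with cosine similarity: appending filler bytes sampled from normal traffic moves the payload's feature vector toward the normal profile. The sample traffic, payload, and filler ratio are all assumptions for the example:

```python
from collections import Counter
import math

def freq_vector(data: bytes):
    """Normalized byte-frequency feature vector (256 dimensions)."""
    counts = Counter(data)
    n = len(data)
    return [counts.get(b, 0) / n for b in range(256)]

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical "normal" traffic profile and a stand-in for shellcode bytes.
normal = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 20
payload = bytes(range(0x41, 0x5B)) * 4

# Blend: append filler bytes drawn from the normal traffic so the payload's
# byte-frequency vector moves toward the normal one.
filler = normal[: len(payload) * 10]
blended = payload + filler

before = cosine(freq_vector(payload), freq_vector(normal))
after = cosine(freq_vector(blended), freq_vector(normal))
# Blending raises the similarity to the normal profile: after > before.
```

A detector that thresholds on similarity to a normal profile would score the blended payload as far less anomalous than the raw one, which is exactly the evasion the attack exploits.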
{\bfseries Poison Attacks:}
Poison attacks corrupt the training data so that the model learned from it misbehaves.
Highly productive APTs synthesize phishing and social-engineering practices for persistent access\cite{falliere2011w32, cardenas2011attacks, maiorca2019digital, urbina2016limiting}. They usually share similar strategic goals, and cleaning or faking history can increase the impact of those goals.
An example of common history cleaning:
{\footnotesize
\begin{verbatim}
ln -sf /dev/null ~/.bash_history   # redirect shell history to null
kill -9 $$                         # kill the current session
history -c                         # clear session history
echo "" > /var/log/auth.log        # truncate the auth log
\end{verbatim}
}