Tony AntoniosBarotsis

@AntoniosBarotsis
AntoniosBarotsis / rustfmt.toml
Created December 20, 2022 15:41
rustfmt.toml
tab_spaces = 2
edition = "2021"
{"schemaVersion":1,"label":"MSRV","message":"1.71.1","color":"32ca55","labelColor":"353d46"}
@AntoniosBarotsis
AntoniosBarotsis / config.toml
Last active December 8, 2023 18:10
Rust lints
# add the below section to `.cargo/config.toml`
[target.'cfg(all())']
rustflags = [
# BEGIN - Embark standard lints v6 for Rust 1.55+
# do not change or add/remove here, but one can add exceptions after this section
# for more info see: <https://github.com/EmbarkStudios/rust-ecosystem/issues/59>
"-Dunsafe_code",
"-Wclippy::all",
"-Wclippy::await_holding_lock",
@AntoniosBarotsis
AntoniosBarotsis / findWifiPass.ps1
Created October 10, 2021 18:15
Retrieve the password of a known network
netsh wlan show profile name="<NETWORK NAME>" key=clear | findstr "Key Content"
@AntoniosBarotsis
AntoniosBarotsis / kmeans_elbow.R
Created August 3, 2020 15:48
Using the elbow method to find the optimal number of clusters in a given data set
set.seed(1234)
# Compute the within-cluster sum of squares (WCSS) for k = 1..10;
# X is the feature matrix to cluster
wcss <- vector()
for (i in 1:10) {
  wcss[i] <- sum(kmeans(X, i)$withinss)
}
plot(1:10, wcss, type = 'b',
     main = paste('Clusters of <data>'),
     xlab = 'Number of Clusters',
     ylab = 'WCSS')
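The same elbow computation can be sketched end-to-end in plain Python with a naive Lloyd's k-means on a toy, clearly two-cluster data set (the data, names, and parameters here are illustrative, not from the gist):

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((u - v) ** 2 for u, v in zip(a, b))

def kmeans(points, k, iters=20, seed=1234):
    """Naive Lloyd's k-means; returns (centroids, wcss)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda c: dist2(p, centroids[c]))].append(p)
        # Recompute each centroid as the mean of its cluster (skip empty clusters)
        for j, cl in enumerate(clusters):
            if cl:
                centroids[j] = tuple(sum(dim) / len(cl) for dim in zip(*cl))
    # Within-cluster sum of squares for the final centroids
    wcss = sum(min(dist2(p, c) for c in centroids) for p in points)
    return centroids, wcss

# Two well-separated blobs: the elbow should appear at k = 2
pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
       (5.0, 5.1), (5.2, 5.0), (5.1, 4.9)]
wcss = [kmeans(pts, k)[1] for k in range(1, 5)]
```

Plotting `wcss` against `k` reproduces the curve the R snippet draws; the sharp drop from k = 1 to k = 2 marks the elbow.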
@AntoniosBarotsis
AntoniosBarotsis / backward_elimination.R
Last active July 30, 2020 15:59
Backward Elimination algorithm implementation in R for Multiple Linear Regression (MLR)
# Automatic backward elimination
backwardElimination <- function(x, sl) {
  # Number of features
  numVars = length(x)
  for (i in c(1:numVars)) {
    # Train regressor on all (remaining) features
    regressor = lm(formula = Profit ~ ., data = x)
    # Query all the p-values and extract the largest one
    maxVar = max(coef(summary(regressor))[c(2:numVars), "Pr(>|t|)"])
    # If that value is greater than the significance level (SL), drop the feature
    if (maxVar > sl) {
      j = which(coef(summary(regressor))[c(2:numVars), "Pr(>|t|)"] == maxVar)
      x = x[, -j]
    }
    numVars = numVars - 1
  }
  return(summary(regressor))
}
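For comparison, a rough NumPy sketch of the same backward-elimination loop. The names and toy data are mine, and it drops the feature with the smallest |t| statistic rather than the largest p-value — within a single fit the two rank features identically, and |t| needs no t-distribution CDF; `min_abs_t = 2.0` roughly corresponds to sl = 0.05:

```python
import numpy as np

def backward_elimination(X, y, names, min_abs_t=2.0):
    """Drop the feature with the smallest |t| statistic until every
    remaining feature's |t| clears the threshold; return kept names."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    names = list(names)
    while X.shape[1] > 0:
        A = np.column_stack([np.ones(len(y)), X])   # prepend an intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        dof = len(y) - A.shape[1]
        s2 = resid @ resid / dof                    # residual variance
        cov = s2 * np.linalg.inv(A.T @ A)           # coefficient covariance
        t = beta / np.sqrt(np.diag(cov))
        t_feat = np.abs(t[1:])                      # ignore the intercept
        worst = int(np.argmin(t_feat))
        if t_feat[worst] >= min_abs_t:
            break                                   # everything left is significant
        X = np.delete(X, worst, axis=1)             # drop the weakest feature
        del names[worst]
    return names

# Toy data: y depends on x0 only; x1 is pure noise and should be dropped
rng = np.random.default_rng(0)
x0 = rng.normal(size=100)
x1 = rng.normal(size=100)
y = 3.0 * x0 + rng.normal(scale=0.1, size=100)
kept = backward_elimination(np.column_stack([x0, x1]), y, ["x0", "x1"])
```

`kept` retains `x0` (its |t| is enormous), while the noise feature `x1` is almost always eliminated.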