@Harshit1694
Last active June 27, 2019 12:18
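The function below calls a Euclidean-distance helper ED() that is not defined in this gist. A minimal sketch of such a helper, assuming it receives one test row and one train row of the iris data and compares their four numeric measurement columns, could be:

#Assumed helper (not in the original gist): Euclidean distance between the
#four numeric feature columns (Sepal/Petal length and width) of two iris rows
ED <- function(p, q){
  sqrt(sum((unlist(p[1:4]) - unlist(q[1:4]))^2))
}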
#Writing the function to predict kNN
knn_predict <- function(test, train, k_value){
  pred <- c()
  #LOOP-1: looping over every row of the test data
  for(i in 1:nrow(test)){
    dist <- c()
    char <- c()
    #LOOP-2: looping over train data to get the distance to each training row
    for(j in 1:nrow(train)){
      dist <- c(dist, ED(test[i, ], train[j, ]))
      char <- c(char, as.character(train[j, 5])) #column 5 holds the Species label
    }
    df <- data.frame(char, dist)
    df <- df[order(df$dist), ]  #sorting dataframe by distance
    df <- df[1:k_value, ]       #keeping the k nearest neighbours
    #counting the classes of the neighbours and taking the majority vote
    #(table() replaces the hard-coded setosa/versicolor/virginica counters)
    n <- table(df$char)
    pred <- c(pred, names(n)[which.max(n)])
  }
  return(pred) #return prediction vector
}
#Predicting the values for K=1
K <- 1
predictions <- knn_predict(test, train, K)
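The train and test objects are not created in this gist; they are presumably a train/test split of the iris data prepared earlier. Assuming such a split exists, a quick way to inspect the predictions would be:

#Confusion table of predicted vs. actual Species (assumes test comes from iris)
table(predicted = predictions, actual = test$Species)
#Overall accuracy: share of test rows whose predicted Species matches the true one
mean(predictions == as.character(test$Species))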