Dante J. dantejauregui
version: "2.2"
services:
  # setup:
  #   image: docker.elastic.co/elasticsearch/elasticsearch:${STACK_VERSION}
  #   volumes:
  #     - certs:/usr/share/elasticsearch/config/certs
  #   user: "0"
  #   command: >
  #     bash -c '
@dantejauregui
dantejauregui / gist:acf73f2b874cf977bce7828f465ddae7
Created January 15, 2024 07:00
Docker Volumes & Networking
2 Types of Mounts:
1.-Volume Mounting:
-When you use the command:
docker volume create "data_volume"
docker volume ls
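To actually use that named volume, you typically mount it into a container when you run it; this is a minimal sketch (the mysql image and the /var/lib/mysql target path are illustrative assumptions, not from the original notes):
docker run -d --name db -v data_volume:/var/lib/mysql mysql
docker run -d --name db --mount source=data_volume,target=/var/lib/mysql mysql (equivalent, using the newer --mount syntax)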
FluxCD & Killercoda K3S
*Killercoda automatically installs Git, Docker, and K3s Kubernetes
Creating SSH key "gitlabKey1" and storing it in GitLab User Settings:
ssh-keygen -t ed25519 -C "killercoda1"
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519
cat ~/.ssh/id_ed25519.pub (the generated key is copied and pasted into GitLab User Settings "SSH Keys")
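To confirm GitLab accepted the key, a standard check (not part of the original notes, assuming gitlab.com as the host) is to test the SSH connection; it should greet you with your username:
ssh -T git@gitlab.com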
@dantejauregui
dantejauregui / gist:93a1eba8a9c4a232b15a55c8ad88f855
Last active January 15, 2024 07:03
SQL & NoSQL Databases Config for DevOps
SQL Databases Config for DevOps:
1.-Creating the DB and Tables:
mysql -h 0.0.0.0 -u root -p (first you connect to mysql as "root" and enter the password configured in the "docker compose" file that created this mysql image)
CREATE DATABASE dantito; (creates the DB named: dantito)
USE dantito;
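Continuing in the same mysql session, you could then create and query a table; this is only a sketch, and the table name and columns are my own illustrative assumptions:
CREATE TABLE users (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(50));
INSERT INTO users (name) VALUES ('dante');
SELECT * FROM users;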
@dantejauregui
dantejauregui / gist:7a05546945a531fc444fbc0a1bddbc08
Last active December 3, 2023 22:13
Changing different Contexts (Kubernetes distributions)
1.To find the current Context, you can simply run the following command:
kubectl config current-context
2.To list all of the available Kubernetes Contexts in our kubeconfig file:
kubectl config get-contexts
3.To switch to a different Kubernetes Context that exists in our kubeconfig file:
kubectl config use-context <context-name>
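A related tip (my own addition, not from the original gist): you can also pin a default namespace on the current context so you do not have to pass -n on every command:
kubectl config set-context --current --namespace=<namespace-name>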
@dantejauregui
dantejauregui / gist:7629f088100c729420941596339d4114
Last active February 5, 2024 12:22
Starting with HELM Charts
1.You add the repo:
helm repo add bitnami https://charts.bitnami.com/bitnami
2.After adding it, verify it is available on your local machine:
helm search repo
3.If you want to make changes to the YAML files of NGINX (as an example), you pull the chart files to edit them:
helm pull bitnami/nginx --untar=true
Before INSTALLING the chart, we have to make sure the kubeconfig is correctly configured; for example, with EKS it would look like the sketch below:
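A minimal sketch of that step (the region, cluster name, and release name are placeholder assumptions, not from the original gist):
aws eks update-kubeconfig --region <region> --name <cluster-name>
helm install my-nginx ./nginx (installs the chart pulled and edited in the previous step)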
@dantejauregui
dantejauregui / gist:dcd23f5abb0371cb89634eab2b591a1d
Last active January 1, 2019 17:36
Virtualenv use for Python v2.7 (it should be used for ML: Keras, etc.)
1.-After installing virtualenv with: pip install virtualenv
2.-Select the project folder where you want to create the virtualenv.
3.-In the terminal run this:
virtualenv pyEnv2.7 -p /usr/bin/python2.7
"pyEnv27" is the name of the virtualEnv Folder
and
"-p /usr/bin/python2.7" is the specific route where python2.7 is located natively in MAC
@dantejauregui
dantejauregui / gist:5af30df6ea33bb876fbfbd947731b5f7
Last active January 25, 2018 10:12
Multiple conditionals: "&&" and "||"
Following the theory in this URL: https://stackoverflow.com/questions/4490274/returning-with
1st Case '&&':
- return a && b
-Explanation:
Will be equivalent to:
if (a) return b; else return a; (b is returned when a is truthy; otherwise a itself, e.g. false, 0, or null, is returned)
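A quick illustration of those semantics (my own example, using plain JavaScript-style evaluation as in the linked answer):
0 && 5 // evaluates to 0 (the first falsy operand)
3 && 5 // evaluates to 5 (the last operand, since 3 is truthy)
0 || 5 // evaluates to 5 (the first truthy operand)
3 || 5 // evaluates to 3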
'''This script goes along the blog post
"Building powerful image classification models using very little data"
from blog.keras.io.
It uses data that can be downloaded at:
https://www.kaggle.com/c/dogs-vs-cats/data
In our setup, we:
- created a data/ folder
- created train/ and validation/ subfolders inside data/
- created cats/ and dogs/ subfolders inside train/ and validation/
- put the cat pictures index 0-999 in data/train/cats
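That folder layout could be prepared with something like this (a sketch; where you move the Kaggle images from depends on where the archive was unzipped):
mkdir -p data/train/cats data/train/dogs data/validation/cats data/validation/dogs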
@dantejauregui
dantejauregui / gist:559968381ec85fc9040409473465b8da
Last active January 1, 2019 18:16
Keras with Tensorflow Backend
Followed this: https://www.pyimagesearch.com/2016/11/14/installing-keras-with-tensorflow-backend/
(I did this for Python 2.7)
1.Create a virtualenv and activate it
2.Once inside, in the console, run this command: export TF_BINARY_URL= (look on the TensorFlow website at https://www.tensorflow.org/install/pip and find the "Package Location" section and the "macOS (CPU-only)" entry)
I used for my Mac: "export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/mac/cpu/tensorflow-1.12.0-py2-none-any.whl"
3. pip install --upgrade $TF_BINARY_URL
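After that install, a quick sanity check (my own verification step, assuming the TensorFlow 1.12 wheel above) is to print the backend version and then install Keras on top of it:
python -c "import tensorflow as tf; print(tf.__version__)"
pip install keras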