Kamolphan Lewprasert (fonylew)

🐝
Go Jackets!
View GitHub Profile
@fonylew
fonylew / docker_run_colab.sh
Created March 20, 2024 17:49
Run Google Colab runtime locally
docker run --gpus=all -p 127.0.0.1:9000:8080 us-docker.pkg.dev/colab-images/public/runtime
@fonylew
fonylew / docker_run_jupyter.sh
Created March 20, 2024 16:32
Run Jupyter Notebook on Docker with GPU
docker run -d --gpus all -v $(realpath ~/notebooks):/tf/notebooks -p 8888:8888 tensorflow/tensorflow:latest-gpu-jupyter
# Create a new bucket
gsutil mb gs://[BUCKET]
# List files in the bucket
gsutil ls gs://[BUCKET]
# Check bucket usage (du: disk usage)
gsutil du -sh gs://[BUCKET]
# Copy (upload) a file to the bucket
gsutil cp [FILE] gs://[BUCKET]
@fonylew
fonylew / kafka_direct_consumer.py
Created June 29, 2017 09:10
Simple Kafka direct consumer with Spark Streaming (Python) -- just consume it!
import sys
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

if __name__ == "__main__":
    sc = SparkContext(appName="PythonStreamingDirectKafkaWordCount")
    sc.setLogLevel("ERROR")
    ssc = StreamingContext(sc, 2)
    brokers, topic = sys.argv[1:]
    kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": brokers})
    # Print each consumed value, then run the streaming context until stopped.
    kvs.map(lambda record: record[1]).pprint()
    ssc.start()
    ssc.awaitTermination()
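Each record from the direct stream arrives as a (key, value) tuple. A minimal pure-Python sketch of the classic word-count transform such a job applies to those values (no Spark or Kafka required; the sample data and function name are illustrative):

```python
from collections import Counter

def count_words(records):
    """records: iterable of (key, value) tuples, as yielded by the stream."""
    counts = Counter()
    for _key, value in records:
        counts.update(value.split())
    return dict(counts)

sample = [(None, "hello world"), (None, "hello kafka")]
print(count_words(sample))  # → {'hello': 2, 'world': 1, 'kafka': 1}
```

In the real job this logic would be expressed as `flatMap`/`map`/`reduceByKey` transformations on the DStream rather than a plain function.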
@fonylew
fonylew / install-local-server.sh
Last active August 31, 2023 12:50
A collection of scripts to install the NVIDIA driver with CUDA 10, cuDNN 7.4, and related tooling.
#!/bin/bash
# First sudo command
sudo whoami
# Update and upgrade
sudo apt update
sudo apt upgrade -y
# Utility
# Installing WineHQ
sudo dpkg --add-architecture i386
wget https://dl.winehq.org/wine-builds/Release.key
sudo apt-key add Release.key
sudo apt-add-repository https://dl.winehq.org/wine-builds/ubuntu/
sudo apt-get update
sudo apt-get install --install-recommends winehq-stable
@fonylew
fonylew / ingest_to_bq_airflow.py
Last active June 28, 2023 16:50
Simple code to ingest data from a REST API (the Bitcoin price from Coindesk) into BigQuery with Airflow.
import json
import requests
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from airflow.utils.dates import days_ago
URL = "https://api.coindesk.com/v1/bpi/currentprice/SGD.json"
DATASET_ID = "demo"
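The DAG fetches the Coindesk BPI endpoint and loads the price into BigQuery. A minimal sketch of the extraction step, using a trimmed payload in the shape that endpoint returns (the payload literal and `extract_rate` helper are illustrative, not part of the gist):

```python
# Trimmed sample of Coindesk's currentprice response; the real DAG would
# obtain this with requests.get(URL).json().
payload = {"bpi": {"SGD": {"code": "SGD", "rate_float": 91234.56}}}

def extract_rate(data, currency="SGD"):
    """Pull the float exchange rate for one currency out of the BPI payload."""
    return data["bpi"][currency]["rate_float"]

print(extract_rate(payload))  # → 91234.56
```

A PythonOperator would typically run this extraction and hand the value to a load step targeting the `demo` dataset.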
@fonylew
fonylew / message_generator_pubsub.py
Last active August 11, 2022 17:50 — forked from antoniocachuan/message_generator_pubsub.py
Simple Message generator for Google Cloud Pub/Sub
#!/usr/bin/env python
# Code modified from https://cloud.google.com/dataflow/docs/samples/join-streaming-data-with-sql#expandable-3
import datetime, json, os, random, time
# Set the `project` variable to a Google Cloud project ID.
project = 'GCP-PROJECT-ID'
BRANCH = ['LIM', 'BOG', 'SFO', 'LAX', 'PEK', 'ATL', 'CDG', 'AMS',
'HKG', 'ICN', 'FRA', 'MAD', 'SEA', 'LAS', 'SIN', 'BKK', 'DFW',
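The generator's preview is cut off above; the pattern is to build a small JSON event with a random branch code and a timestamp, then publish it to a Pub/Sub topic. A minimal sketch of the message-building step (field names and the trimmed `BRANCH` list are assumptions, not the gist's exact schema):

```python
import datetime
import json
import random

# Trimmed; the gist's full BRANCH list contains many more airport codes.
BRANCH = ['LIM', 'BOG', 'SFO', 'LAX', 'PEK', 'ATL']

def make_message():
    """Build one synthetic sales event as a JSON string."""
    return json.dumps({
        "branch": random.choice(BRANCH),
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

print(make_message())
```

In the full script this string would be passed to a Pub/Sub publisher client inside a loop, with a short sleep between messages.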
@fonylew
fonylew / performance.py
Created March 21, 2021 13:07 — forked from greenstick/performance.py
Python Class for Performance Assessment of Classification Tasks
#! /usr/bin/env python3
"""
Development Version: Python 3.5.1
Author: Benjamin Cordier
Description: Module For Performance
Assessment of Classification Task
License: BSD 3 Clause
--