Bootstrap < 3.4.1 || < 4.3.1
✔️ CSP strict-dynamic bypass
➖ Requires user interaction
➖ Requires $('[data-toggle="tooltip"]').tooltip();
#### Rets Rabbit http://www.retsrabbit.com
Rets Rabbit removes the nightmare of importing thousands of real estate listings and photos from RETS or ListHub, and gives you an easy-to-use import and Web API server, so you can focus on building your listing-search-powered website or app.
WITH gen_series AS (
  SELECT
    i,
    count(*) over () as rows
  FROM generate_series(1,100000) tbl(i)
), gen_year AS (
  SELECT
    i,
    CASE WHEN i <= rows * 0.25 THEN 2022
         WHEN i <= rows * 0.5 THEN 2023
         -- assumption: the remaining quarters continue the same pattern;
         -- the original snippet is truncated after the 2023 branch
         WHEN i <= rows * 0.75 THEN 2024
         ELSE 2025
    END as year
  FROM gen_series
)
SELECT year, count(*) FROM gen_year GROUP BY year ORDER BY year;
This is my recommended Python setup, as of Fall 2022. The Python landscape can be a confusing mess of overlapping tools that sometimes don't work well together. This is an effort to standardize our approach and environments.
#
# Reload localhost tabs. Sample usage:
#
#   $ brew install watchexec
#   $ watchexec --watch <DIRNAME> reload.osa
#
tell application "Google Chrome"
    set window_list to every window
    repeat with the_window in window_list
        -- assumption: the truncated body reloads every localhost tab
        set tab_list to every tab of the_window
        repeat with the_tab in tab_list
            if URL of the_tab starts with "http://localhost" then
                tell the_tab to reload
            end if
        end repeat
    end repeat
end tell
#!/bin/bash
#
# How to run this thingy:
#   1. create a file on your Mac called setup.sh
#   2. run it from Terminal with: sh setup.sh
#
# Heavily inspired by https://twitter.com/damcclean
# https://github.com/damcclean/dotfiles/blob/master/install.sh
set -euo pipefail
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Denoising Autoencoders (dA)

References:
- P. Vincent, H. Larochelle, Y. Bengio, P.A. Manzagol: Extracting and
  Composing Robust Features with Denoising Autoencoders, ICML'08, 1096-1103,
  2008

Usage:
1. Download this gist.
2. Get the MNIST data:
   wget http://deeplearning.net/data/mnist/mnist.pkl.gz
3. Run this code:
   python autoencoder.py 100 -e 1 -b 20 -v
Wait about a minute ... and you get a visualization of the weights.
"""
# Author: Jean-Remi King <jeanremi.king@gmail.com>
"""
Illustrate how the hinge loss and the log loss, typically
used in SVMs and Logistic Regression respectively, focus
on a variable number of samples. For simplicity, we do not
consider the regularization or penalty (C) factors.
"""
import numpy as np
import matplotlib.animation as animation
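As a static companion to the animation, here is a sketch of the two losses as a function of the signed margin m = y * f(x), with labels y in {-1, +1}; this is illustrative, not the gist's own code. It shows why the sample counts differ: the hinge loss is exactly zero once m >= 1, so only samples inside the margin contribute, while the log loss is nonzero for every sample.

import numpy as np
import matplotlib.pyplot as plt

m = np.linspace(-3, 3, 200)           # signed margin y * f(x)
hinge = np.maximum(0.0, 1.0 - m)      # SVM: zero for confidently correct samples
log_loss = np.log(1.0 + np.exp(-m))   # logistic regression: never exactly zero

plt.plot(m, hinge, label="hinge loss (SVM)")
plt.plot(m, log_loss, label="log loss (Logistic Regression)")
plt.xlabel("margin y * f(x)")
plt.ylabel("loss")
plt.legend()
plt.show()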
# http://stackoverflow.com/questions/6645895/calculating-the-percentage-of-variance-measure-for-k-means
import numpy as np
from scipy.spatial.distance import cdist

cents = [km.kmeans.cluster_centers_ for km in kms]
D_k = [cdist(rid_brand_pca, cent, 'euclidean') for cent in cents]
# distance from each point to its nearest cluster center
dist = [np.min(D, axis=1) for D in D_k]
avgWithinSS = [sum(d) / rid_brand_pca.shape[0] for d in dist]
# elbow curve
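The snippet stops at the elbow-curve comment. Continuing from its variables, a sketch of how that curve is typically plotted from avgWithinSS, assuming kms holds one fit per K = 1..len(kms) (the K range, like everything beyond the snippet's own names, is an assumption):

import matplotlib.pyplot as plt

# assumption: kms[0] was fit with K=1, kms[1] with K=2, and so on
K = range(1, len(kms) + 1)
plt.plot(K, avgWithinSS, 'b*-')
plt.xlabel('Number of clusters K')
plt.ylabel('Average within-cluster distance to centroid')
plt.title('Elbow curve for k-means')
plt.show()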