# Install pipenv
pip install pipenv
# Create a virtual environment and activate its shell
pipenv shell
Collecting scipy
  Using cached scipy-1.4.1.tar.gz (24.6 MB)
  Installing build dependencies ... error
  ERROR: Command errored out with exit status 1:
   command: /home/moose/.pyenv/versions/pypy3.6-7.3.0/bin/python /home/moose/.pyenv/versions/pypy3.6-7.3.0/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-1x4wi_xk/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel setuptools 'Cython>=0.29.13' 'numpy==1.13.3; python_version=='"'"'3.5'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.13.3; python_version=='"'"'3.6'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.14.5; python_version=='"'"'3.7'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.17.3; python_version>='"'"'3.8'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.5'"'"' and platform_system=='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.6'"'"' and platform_system=='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.7'"
import pandas as pd
import requests, re, datetime, urllib.parse
from bs4 import BeautifulSoup as bs
from multiprocessing.pool import ThreadPool

def get_datelist():
    # Midnight today, then each of the next 14 days
    base_time = datetime.datetime.today()
    base_time = base_time.replace(hour=0, minute=0, second=0, microsecond=0)
    date_list = [base_time + datetime.timedelta(days=x) for x in range(1, 15)]
    return date_list
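Given the `urllib.parse` import, a likely next step is formatting each date into a query string for the scraper. A self-contained sketch of that step; the endpoint URL and the `date` parameter name here are made up for illustration:

```python
import datetime
import urllib.parse

# Midnight today, then each of the next 14 days
base_time = datetime.datetime.today().replace(hour=0, minute=0, second=0, microsecond=0)
date_list = [base_time + datetime.timedelta(days=x) for x in range(1, 15)]

# One URL per upcoming date; the base URL and "date" parameter are hypothetical
urls = [
    'https://example.com/search?' + urllib.parse.urlencode({'date': d.strftime('%Y-%m-%d')})
    for d in date_list
]
print(urls[0])
```

`urlencode` handles the percent-escaping, so the same pattern works unchanged if the parameter values ever contain spaces or other reserved characters.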
# initiate alembic in the project
alembic init alembic
#####################################
# need to edit alembic/env.py so that it gets the connection string from the
# settings/local_config python module
# add the lines below at around line 17

# add the parent project directory to sys.path
import sys, os
sys.path.insert(0, os.path.dirname(os.path.dirname(__file__)))
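The `sys.path` tweak is easy to verify in isolation. This sketch fakes the project layout in a temp directory and shows that, once the grandparent of `alembic/env.py` is on `sys.path`, the `settings/local_config` module mentioned in the comments becomes importable. The `SQLALCHEMY_URL` name and the connection string are assumptions for the demo:

```python
import os
import sys
import tempfile

# Fake a project layout: <project>/alembic/env.py and <project>/settings/local_config.py
project = tempfile.mkdtemp()
os.makedirs(os.path.join(project, "alembic"))
os.makedirs(os.path.join(project, "settings"))
open(os.path.join(project, "settings", "__init__.py"), "w").close()
with open(os.path.join(project, "settings", "local_config.py"), "w") as f:
    f.write("SQLALCHEMY_URL = 'postgresql://localhost/example'\n")

# The same trick env.py uses: insert the project root (env.py's grandparent dir)
env_py_file = os.path.join(project, "alembic", "env.py")
sys.path.insert(0, os.path.dirname(os.path.dirname(env_py_file)))

from settings.local_config import SQLALCHEMY_URL
print(SQLALCHEMY_URL)  # postgresql://localhost/example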
TL;DR: Use this to easily deploy a FastAI Python model using NodeJS.
You've processed your data and trained your model, and now it's time to move it to the cloud.
If you've used a Python-based framework like fastai to build your model, there are several excellent deployment options, such as Django or Starlette. But many web devs prefer to work in NodeJS, especially when the model is only one part of a broader application.
My friend Navjot pointed out that NodeJS and Python could run together if we could send remote procedure calls from NodeJS to Python.
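One way to sketch that idea with nothing but the Python standard library is an XML-RPC bridge: Python exposes the model's inference function over HTTP, and NodeJS (or any RPC-capable client) calls it remotely. Here `predict` is a stand-in for a real fastai model, and the client call is shown from Python only to demonstrate the round trip:

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def predict(x):
    # Stand-in for a real fastai model's inference call
    return x * 2

# Bind to an OS-assigned port and serve in a background thread
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(predict, "predict")
host, port = server.server_address
threading.Thread(target=server.serve_forever, daemon=True).start()

# A NodeJS client (e.g. an xmlrpc npm package) could make this same call over HTTP
client = xmlrpc.client.ServerProxy("http://{}:{}/".format(host, port))
print(client.predict(21))  # 42
```

XML-RPC is only one possible transport; the same shape works with JSON over a plain HTTP endpoint if the NodeJS side prefers that.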
# Adapted from example in Ch.3 of "Web Scraping With Python, Second Edition" by Ryan Mitchell
import re
import requests
from bs4 import BeautifulSoup

pages = set()

def get_links(page_url):
    global pages
    html = requests.get('http://en.wikipedia.org{}'.format(page_url)).text
    soup = BeautifulSoup(html, 'html.parser')
    # recurse into internal article links we have not seen before
    for link in soup.find_all('a', href=re.compile('^(/wiki/)')):
        if link.attrs['href'] not in pages:
            pages.add(link.attrs['href'])
            get_links(link.attrs['href'])
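The heart of such a crawler is the link filter: keep only hrefs that look like internal article links (`/wiki/...` in Mitchell's Wikipedia example). A network-free sketch of that same filtering using only the standard library's `html.parser`, so it can run without hitting any site:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect hrefs that look like internal article links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("/wiki/"):
                self.links.append(href)

sample = '<a href="/wiki/Python">internal</a> <a href="https://example.com">external</a>'
collector = LinkCollector()
collector.feed(sample)
print(collector.links)  # ['/wiki/Python']
```

BeautifulSoup is more convenient for real scraping, but separating the filter like this makes it easy to unit-test without network access.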
postgres:
  image: postgres:9.4
  volumes:
    - ./init.sql:/docker-entrypoint-initdb.d/init.sql
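The official postgres image executes any `*.sql` file found in `/docker-entrypoint-initdb.d/` the first time the container's data directory is initialized, so `init.sql` is where schema bootstrapping goes. A hypothetical example; the database, table, and column names are made up:

```sql
-- Runs once, on the container's first initialization
CREATE DATABASE app_db;
\c app_db
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    email TEXT NOT NULL UNIQUE
);
```

Note the script only runs when the data volume is empty; re-running `docker-compose up` against an existing volume will not re-apply it.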
Just migrated it from Codepen.io to markdown. Credit goes to David Conner.
Working with DOM | Working with JS | Working with Functions
---|---|---
Accessing DOM Elements | Add/Remove Array Item | Add Default Arguments to a Function
Grab Children/Parent Node(s) | Add/Remove Object Properties | Throttle/Debounce Functions
Create DOM Elements | Conditionals | 