Francesco Bartoli (francbartoli), public gists

@francbartoli
francbartoli / pygeoapi-config-ogc-dev-track-step1.yml
Created March 24, 2021 08:55
pygeoapi practical session - OGC developer track
server:
  bind:
    host: 0.0.0.0
    port: 5000
  url: http://localhost:5000
  mimetype: application/json; charset=UTF-8
  encoding: utf-8
  language: en-US
  # cors: true
  pretty_print: true
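
To sanity-check this configuration before handing it to pygeoapi, it can simply be parsed with PyYAML; the snippet below is a minimal sketch, not part of the gist, and assumes the YAML above is saved as pygeoapi-config-ogc-dev-track-step1.yml.

# Minimal sketch (not from the gist): parse the config above with PyYAML and
# echo a couple of server settings as a quick sanity check.
import yaml

with open("pygeoapi-config-ogc-dev-track-step1.yml") as fh:
    cfg = yaml.safe_load(fh)

server = cfg["server"]
print(server["bind"]["host"], server["bind"]["port"])  # 0.0.0.0 5000
print(server["url"])                                   # http://localhost:5000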
@francbartoli
francbartoli / cql.py
Created March 22, 2021 14:58
CQL JSON models
# generated by datamodel-codegen:
# filename: cql-schema.json
# timestamp: 2021-03-13T21:05:20+00:00
from __future__ import annotations
from datetime import date, datetime
from enum import Enum
from typing import Any, Dict, List, Optional, Union
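
The gist preview stops after the imports; purely as a hypothetical illustration of what datamodel-codegen emits from cql-schema.json, the generated classes are Pydantic models along the lines below (the real class names and fields come from the schema itself, and the typing imports above are reused).

# Hypothetical illustration, not part of the gist: datamodel-codegen produces
# Pydantic models of roughly this shape; the actual names and fields are
# defined by cql-schema.json.
from pydantic import BaseModel

class ScalarExpression(BaseModel):  # hypothetical name, for illustration only
    property: Optional[str] = None
    function: Optional[Dict[str, Any]] = None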
@francbartoli
francbartoli / test_csv.py
Created April 14, 2020 16:40
COVID-19 data loaded by GDAL
from osgeo import gdal

# Open the remote CSV over HTTP (/vsicurl/) as a vector datasource; the
# X_POSSIBLE_NAMES/Y_POSSIBLE_NAMES open options tell the CSV driver which
# columns hold the point coordinates.
ds = gdal.OpenEx('CSV:/vsicurl/https://raw.githubusercontent.com/pcm-dpc/COVID-19/master/dati-regioni/dpc-covid19-ita-regioni.csv', gdal.OF_VECTOR, open_options=['X_POSSIBLE_NAMES=long', 'Y_POSSIBLE_NAMES=lat'])
lyr = ds.GetLayer(0)
f = lyr.GetNextFeature()
f.ExportToJson()
# returns:
# '{"type": "Feature", "geometry": {"type": "Point", "coordinates": [13.39843823, 42.35122196]}, "properties": {"data": "2020-02-24T18:00:00", "stato": "ITA", "codice_regione": "13", "denominazione_regione": "Abruzzo", "lat": 42.35122196, "long": 13.39843823, "ricoverati_con_sintomi": "0", "terapia_intensiva": "0", "totale_ospedalizzati": "0", "isolamento_domiciliare": "0", "totale_positivi": "0", "variazione_totale_positivi": "0", "nuovi_positivi": "0", "dimessi_guariti": "0", "deceduti": "0", "totale_casi": "0", "tamponi": "5", "note_it": "", "note_en": ""}, "id": 1}'

I had a large dataset in PostGIS and wanted to avoid the hassle of first exporting it to a multi-gigabyte GeoJSON file before tiling it with Tippecanoe.

ogr2ogr -f GeoJSON /dev/stdout \
  PG:"host=localhost dbname=postgres user=postgres password=thepassword" \
  -sql "select * from a, roi where a.geom && roi.geom" \
| docker run -i -v ${PWD}:/data tippecanoe:latest tippecanoe \
    --output=/data/yourtiles.mbtiles
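
The same streaming idea can be driven from Python with subprocess; the sketch below is not part of the gist and reuses the connection string, SQL and image tag from the one-liner purely as assumptions.

# Sketch (not from the gist): pipe ogr2ogr's GeoJSON stdout straight into the
# dockerized tippecanoe, mirroring the shell pipeline above.
import os
import subprocess

ogr = subprocess.Popen(
    [
        "ogr2ogr", "-f", "GeoJSON", "/dev/stdout",
        "PG:host=localhost dbname=postgres user=postgres password=thepassword",
        "-sql", "select * from a, roi where a.geom && roi.geom",
    ],
    stdout=subprocess.PIPE,
)
subprocess.run(
    [
        "docker", "run", "-i", "-v", f"{os.getcwd()}:/data",
        "tippecanoe:latest", "tippecanoe", "--output=/data/yourtiles.mbtiles",
    ],
    stdin=ogr.stdout,
    check=True,
)
ogr.stdout.close()
ogr.wait()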
@francbartoli
francbartoli / Dockerfile
Created November 19, 2019 22:00
A multi-stage Dockerfile to build an image for a minimal python pipenv-managed app
FROM python:3.7 as build
# The python:3.7 image is HUGE but already comes with all the essentials
# for compiling (most) python modules with native dependencies
ENV LC_ALL C.UTF-8
ENV LANG C.UTF-8
RUN pip install pipenv
WORKDIR /build
COPY Pipfile Pipfile.lock /build/
@francbartoli
francbartoli / Dockerfile-geonode
Last active November 20, 2019 12:32
Reproducible and deterministic GeoNode image 2.10.1
FROM python:2.7.17-stretch AS base
MAINTAINER GeoNode development team
# This section is borrowed from the official Django image but adds GDAL and others
RUN apt-get update && apt-get install -y \
gcc \
zip \
gettext \
postgresql-client libpq-dev \
sqlite3 \

GeoNode 2.4 -> 2.10.1 migration

PostgreSQL

Create a role and a database for Django:

create role "user" with superuser login password '***';
create database gn_24 with owner "user";
\c gn_24
# routes
from fastapi import APIRouter, Depends, HTTPException
from starlette.status import HTTP_201_CREATED
from starlette.responses import JSONResponse
from sqlalchemy.orm import Session
from app.core.config import WPS_PROCESS_LINK
from app import crud
from app.api.utils.db import get_db
from app.models.job import (
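
The imports point at a job-creation endpoint; the snippet below is a minimal, self-contained sketch of that pattern, not the gist's actual code, with JOBS, JobIn and the /jobs path standing in for the app.crud and app.models.job helpers that the preview cuts off.

# Minimal sketch (not the gist's code): a 201-on-create route in the style the
# imports above suggest; JOBS and JobIn are stand-ins for the real CRUD layer.
from typing import Dict, List

from fastapi import APIRouter
from pydantic import BaseModel
from starlette.responses import JSONResponse
from starlette.status import HTTP_201_CREATED

router = APIRouter()
JOBS: List[Dict] = []  # stand-in for the database session from get_db


class JobIn(BaseModel):
    name: str


@router.post("/jobs", status_code=HTTP_201_CREATED)
def create_job(job: JobIn) -> JSONResponse:
    record = {"id": len(JOBS) + 1, "name": job.name}
    JOBS.append(record)
    return JSONResponse(record, status_code=HTTP_201_CREATED)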
@francbartoli
francbartoli / timescale--db--backend--base.py
Created August 9, 2019 09:58 — forked from dedsm/timescale--db--backend--base.py
WeRiot Django Timescale integration
import logging
from django.contrib.gis.db.backends.postgis.base import \
DatabaseWrapper as PostgisDBWrapper
from django.db import ProgrammingError
from .schema import TimescaleSchemaEditor
logger = logging.getLogger(__name__)
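
The preview stops after the imports; in this pattern the wrapper usually just points Django at the Timescale-aware schema editor, roughly as sketched below (an assumption, not the forked gist's verbatim code).

# Assumed continuation, not the gist's verbatim code: reuse the PostGIS backend
# wholesale but swap in the Timescale schema editor.
class DatabaseWrapper(PostgisDBWrapper):
    SchemaEditorClass = TimescaleSchemaEditor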