View postgis-gis.md

PostGIS && GIS

cdb-manager

A simple browser-based terminal for running SQL against Carto using the SQL API

  • http://github.com/cartodb/cdb-manager
  • git clone git@github.com:CartoDB/cdb-manager.git
  • When you're done cloning, enter the directory and run ./httpserv.py
  • Point your browser at http://localhost:8000
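
cdb-manager is a thin front-end over the Carto SQL API, which accepts a query as a `q` parameter on a GET request. The same call can be sketched in Python; the account name and query below are placeholders, and the endpoint shape assumes the standard `https://{account}.carto.com/api/v2/sql` form:

```python
from urllib.parse import urlencode

SQL_API_TEMPLATE = "https://{account}.carto.com/api/v2/sql"

def sql_api_url(account, query, fmt="json"):
    """Build a Carto SQL API GET URL for the given account and query."""
    base = SQL_API_TEMPLATE.format(account=account)
    return base + "?" + urlencode({"q": query, "format": fmt})

# Hypothetical account and table, for illustration only:
url = sql_api_url("myaccount", "SELECT count(*) FROM mytable")
```

Fetching that URL (e.g. with `urllib.request.urlopen`) returns the result set as JSON, which is exactly what the cdb-manager terminal renders.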
View small-it-bibliography.md
  • Google Presentation Link [goo.gl]
  • PDF Presentation Link [s3.cleverelephant.ca]
  • "Ottawa turns to U.S. tech giants too often: internal memo" [cbc.ca]
  • "Government as a Platform: the next phase of digital transformation" [gds.blog.gov.uk]
  • DevOps Real Talk [www.theregister.co.uk] "incremental change, tight feedback loops, shared knowledge, and mutual respect" "If you're a developer releasing large changesets, you're part of the problem."
  • The Government IT Self-Harm Playbook [medium.com]
  • Better For Less (UK IT) [drive.google.com]
View performance.md

Carto Patched PostGIS/PostgreSQL Performance

The REL_10_CARTO PostgreSQL branch and svn-2.4-cartodb branch of PostGIS carry patches to improve performance around our core use cases (generating resolution-appropriate data for rendering in Mapnik and creating vector tiles). They also include patches that make the PostgreSQL planner more likely to pick parallel plans, in particular when using PostGIS functions.

  • The first round of improvements went into PostGIS and focused on the functions commonly used to feed Mapnik and MVT output.
  • The second round went into both PostGIS and PostgreSQL and focused on improving parallel query for our use cases.
2.4 Aug 2.4cdb Oct 2.4cdb Dec
# features ms μs ms μs
View index.html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Geonames Heatmap</title>
<!-- Include Leaflet 1.2.0 Library -->
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.2.0/dist/leaflet.css" />
<script src="https://unpkg.com/leaflet@1.2.0/dist/leaflet.js"></script>
View parallel_paths_include_tlist_cost_v5.patch
diff --git a/src/backend/optimizer/geqo/geqo_eval.c b/src/backend/optimizer/geqo/geqo_eval.c
index b5cab0c..faa5bb7 100644
--- a/src/backend/optimizer/geqo/geqo_eval.c
+++ b/src/backend/optimizer/geqo/geqo_eval.c
@@ -40,7 +40,7 @@ typedef struct
} Clump;
static List *merge_clump(PlannerInfo *root, List *clumps, Clump *new_clump,
- bool force);
+ int num_gene, bool force);
View random-test-data.sql
--
-- Polygons contained in a 10000x10000
-- square, with just enough size/density to mostly
-- cover the whole area.
--
DROP TABLE IF EXISTS polygon_table_10000;
CREATE TABLE polygon_table_10000 AS
SELECT ST_Buffer(
ST_SetSRID(
ST_MakePoint(random() * 10000, random() * 10000),
View curl-options.csv
CURL_OPT VERSION LIBCURL_VERSION_NUM URL
CURLOPT_ABSTRACT_UNIX_SOCKET 7.53.0 0x073500 http://curl.haxx.se/libcurl/c/CURLOPT_ABSTRACT_UNIX_SOCKET.html
CURLOPT_ACCEPTTIMEOUT_MS 7.24.0 0x071800 http://curl.haxx.se/libcurl/c/CURLOPT_ACCEPTTIMEOUT_MS.html
CURLOPT_ACCEPT_ENCODING 7.21.6 0x071506 http://curl.haxx.se/libcurl/c/CURLOPT_ACCEPT_ENCODING.html
CURLOPT_ADDRESS_SCOPE 7.19.0 0x071300 http://curl.haxx.se/libcurl/c/CURLOPT_ADDRESS_SCOPE.html
CURLOPT_APPEND 7.16.4 0x071004 http://curl.haxx.se/libcurl/c/CURLOPT_APPEND.html
CURLOPT_BUFFERSIZE 7.53.0 0x073500 http://curl.haxx.se/libcurl/c/CURLOPT_BUFFERSIZE.html
CURLOPT_CHUNK_BGN_FUNCTION 7.21.0 0x071500 http://curl.haxx.se/libcurl/c/CURLOPT_CHUNK_BGN_FUNCTION.html
CURLOPT_CHUNK_DATA 7.21.0 0x071500 http://curl.haxx.se/libcurl/c/CURLOPT_CHUNK_DATA.html
CURLOPT_CHUNK_END_FUNCTION 7.21.0 0x071500 http://curl.haxx.se/libcurl/c/CURLOPT_CHUNK_END_FUNCTION.html
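
The LIBCURL_VERSION_NUM column is the hex encoding of the release in the VERSION column: one byte each for major, minor, and patch (0xXXYYZZ). A small decoder makes the correspondence checkable; this helper is illustrative, not part of libcurl:

```python
def decode_version_num(version_num):
    """Decode a LIBCURL_VERSION_NUM hex string (0xXXYYZZ) to 'major.minor.patch'."""
    n = int(version_num, 16)
    major = (n >> 16) & 0xFF
    minor = (n >> 8) & 0xFF
    patch = n & 0xFF
    return f"{major}.{minor}.{patch}"

# Cross-check against rows from the table above:
# 0x073500 -> 7.53.0, 0x071506 -> 7.21.6
version = decode_version_num("0x073500")
```

This is the same packing libcurl exposes at compile time, so the decoder can be used to filter the CSV for options available in a given installed version.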
View encrypt.sh
#
# Simple commandline encryption aliases for OSX.
# Put this in your .bash_profile
#
# encrypt myfile
# takes your password, encrypts myfile to myfile.enc,
# writes some random data into myfile and then deletes it
#
# decrypt myfile.enc
# takes your password, decrypts to myfile,
View address_match_pgsql.py
#
# This script expects that
#
# - as many tables as possible have been loaded, so that the
# maximum amount of longitudinal data is available for
# identifying "identical" addresses
# - a table of locality congruencies, 'locality_corpus', has been
# created that maps an id key to the different localities it
# appears in. So, for example:
#
View pgsql.txt
blocking...
creating blocking_map database
creating inverted indexes
writing blocking map
Traceback (most recent call last):
  File "1_identify_address_entities.py", line 208, in <module>
    csv_writer.writerows(b_data)
  File "/Library/Python/2.7/site-packages/dedupe/blocking.py", line 42, in __call__
    block_keys = predicate(instance)
  File "/Library/Python/2.7/site-packages/dedupe/predicates.py", line 224, in __call__