RPI - oldest engineering university in the West.
Most powerful private-university supercomputer.
Graph learning.
node01.local
    description: Computer
    width: 64 bits
    capabilities: smp vsyscall32
  *-core
       description: Motherboard
       physical id: 0
     *-memory
          description: System memory
          physical id: 0
In this paper, Lam et al. propose GraphCast, an ML-based method trained directly on reanalysis data. It predicts weather variables for the next 10 days at 0.25°
resolution globally in under 1 minute (on a Google Cloud TPU v4). GraphCast outperforms the most accurate operational deterministic systems on 90% of 1380 verification targets, and supports better severe-event prediction, including tropical cyclone tracking, atmospheric rivers, and extreme temperatures.
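GraphCast is a graph neural network that repeatedly passes messages over a mesh of the globe. As a toy illustration only (this is not GraphCast's actual architecture, which uses learned MLPs for messages and updates), one message-passing step over scalar node features might look like:

```javascript
// Toy message-passing step (illustrative sketch, not GraphCast itself):
// each node aggregates the mean of its neighbors' features and blends it
// with its own state. Edges are [source, destination] index pairs.
function messagePassingStep(nodeFeatures, edges) {
  const n = nodeFeatures.length;
  const sums = new Array(n).fill(0);
  const counts = new Array(n).fill(0);
  for (const [src, dst] of edges) {
    sums[dst] += nodeFeatures[src];
    counts[dst] += 1;
  }
  return nodeFeatures.map((x, i) => {
    const mean = counts[i] > 0 ? sums[i] / counts[i] : 0;
    return 0.5 * x + 0.5 * mean; // fixed blend here; real models learn this
  });
}
```

Stacking many such steps lets information propagate across the mesh, which is the core mechanism behind learned simulators like GraphCast.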
ECMWF's IFS runs in less than an hour, every 6 hours of every day, producing weather forecasts worldwide. This is done using numerical weather prediction (NWP), which involves solving the governing equations of weather on supercomputers. The success of NWP rests on rigorous and ongoing research, and NWP scales to greater accuracy with greater computational resources. The top deterministic operational system in the world is ECMWF's HRES, a configuration of IFS that produces global 10-day forecasts at 0.1° latitude-longitude resolution in around an hour.
This is a nice paper, where Tsuchiya and Takefuji discuss how they use N × N
hysteresis McCulloch-Pitts neurons as processing elements for the no-three-in-line problem. They discover solutions for up to N = 25.
For N > 20, many of the solutions to the no-three-in-line problem have been found by computer search. Rotation and reflection symmetry are used to reduce the search space.
Hopfield and Tank proposed the first neural-network approach to optimization problems, applying a sigmoid neural network to the travelling salesman problem. Szu used a McCulloch-Pitts neural network for the same problem. To suppress the oscillatory behavior, the hysteresis neuron model was introduced.
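The oscillation-suppressing idea can be sketched as a binary neuron with two trip points: it fires above an upper threshold, turns off below a lower threshold, and otherwise keeps its previous output. This is only a minimal sketch; the trip-point values and function name here are my own, not taken from the paper.

```javascript
// Hysteresis McCulloch-Pitts neuron (sketch): fires above the upper trip
// point (utp), turns off below the lower trip point (ltp), and keeps its
// previous output in the band between them, which damps oscillation.
// The default utp/ltp values are illustrative assumptions.
function hysteresisNeuron(input, prevOutput, utp = 0, ltp = -1) {
  if (input > utp) return 1;
  if (input < ltp) return 0;
  return prevOutput; // inside the hysteresis band: hold state
}
```

A plain McCulloch-Pitts neuron with a single threshold can flip back and forth as inputs hover near it; the dead band between ltp and utp is what prevents that.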
Below are the important points I note from the 2020 paper by Martin Grohe:
ABSTRACT:
Vector representations of graphs and relational structures, whether handcrafted feature vectors or learned representations, enable us to apply standard data analysis and machine learning techniques to the structures. A wide range of methods for generating such embeddings have been studied in the machine learning and knowledge representation literature. However, vector embeddings have received relatively little attention from a theoretical point of view.
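As a concrete (toy) illustration of the "handcrafted feature vector" side, not anything from Grohe's paper: a graph can be mapped to a fixed-length vector of simple statistics, which standard ML tools can then consume.

```javascript
// Toy handcrafted graph embedding (my own illustration): map a graph to
// the vector [node count, edge count, max degree]. Edges are undirected
// [u, v] index pairs.
function graphFeatureVector(numNodes, edges) {
  const degree = new Array(numNodes).fill(0);
  for (const [u, v] of edges) {
    degree[u] += 1;
    degree[v] += 1;
  }
  return [numNodes, edges.length, Math.max(0, ...degree)];
}
```

Grohe's theoretical point is about what such embeddings (handcrafted or learned) can and cannot distinguish; even this tiny one already collapses many non-isomorphic graphs to the same vector.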
I think I wrote a summary of this thesis, but I cannot find it anymore.
const fs = require('fs');
const readline = require('readline');
// Read the header (first line) of a graph file.
async function readGraphHeader(pth) {
  const fstream = fs.createReadStream(pth);
  const rl = readline.createInterface({ input: fstream, crlfDelay: Infinity });
  for await (const line of rl) {
    rl.close();
    return line; // stop after the first line
  }
  return null; // empty file
}
const cp = require('child_process');
const fs = require('fs');
const path = require('path');
const _ = require('lodash');
// Main function: walk the files in the current directory.
function main() {
  for (const file of fs.readdirSync('./')) {
    console.log(path.resolve('./', file)); // original per-file logic was truncated here
  }
}
# Beware! This file is rewritten by htop when settings are changed in the interface.
# The parser is also very primitive, and not human-friendly.
fields=0 48 17 18 38 39 40 2 46 47 49 1
sort_key=46
sort_direction=-1
tree_sort_key=0
tree_sort_direction=1
hide_kernel_threads=1
hide_userland_threads=0
shadow_other_users=0