
Jagatheesan Jag jagandecapri

  • Kuala Lumpur, Malaysia
@jagandecapri
jagandecapri / table_check_create_insert.php
Last active March 16, 2017 15:51
Class written in PHP to check for the existence of a table in a database, create a table, or insert data into a table (in bulk or one record at a time). Records are sent in as associative arrays containing the column names and values.
<?php
require_once 'database_connector.php';
class TableCheckCreateInsert{
/*
* @var $conn
* @var $database
* @var $config
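The preview above cuts off inside the docblock. The same check-create-insert pattern can be sketched in Python with the stdlib sqlite3 module; the class and method names below are illustrative, not taken from the gist:

```python
import sqlite3

class TableCheckCreateInsert:
    """Check for a table, create it if missing, and bulk-insert dict records."""

    def __init__(self, conn):
        self.conn = conn

    def table_exists(self, table):
        # sqlite_master lists every table in the database
        row = self.conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' AND name=?",
            (table,),
        ).fetchone()
        return row is not None

    def create_table(self, table, columns):
        # columns: list of "name TYPE" definition strings
        self.conn.execute(f"CREATE TABLE {table} ({', '.join(columns)})")

    def insert(self, table, records):
        # records: list of associative arrays (dicts) of column -> value,
        # as in the gist's description; works for one record or many
        cols = list(records[0].keys())
        placeholders = ', '.join('?' for _ in cols)
        self.conn.executemany(
            f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})",
            [tuple(r[c] for c in cols) for r in records],
        )

conn = sqlite3.connect(':memory:')
helper = TableCheckCreateInsert(conn)
if not helper.table_exists('users'):
    helper.create_table('users', ['id INTEGER', 'name TEXT'])
helper.insert('users', [{'id': 1, 'name': 'Ada'}, {'id': 2, 'name': 'Alan'}])
```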
@jagandecapri
jagandecapri / Android Toast Creation Helper Function
Created August 6, 2014 15:29
Function to create a Toast only if there is no Toast currently displayed. Avoids repeated toast creation.
/**
* Function to create Toast only if there is no Toast currently displayed
* Avoids repeated toast creation
*
* Sample usage/calling:
* showToast("lorem ipsum", Toast.LENGTH_SHORT);
*
* If needed call toast.cancel() in methods such as onPause() etc
* to cancel toast
**/
@jagandecapri
jagandecapri / gist:d6f3e77fd8820d687597399d237f90af
Last active September 16, 2018 14:47 — forked from bryhal/gist:4129042
MYSQL: Generate Calendar Table
DROP TABLE IF EXISTS time_dimension;
CREATE TABLE time_dimension (
id INTEGER PRIMARY KEY, -- year*10000+month*100+day
db_date DATE NOT NULL,
year INTEGER NOT NULL,
month INTEGER NOT NULL, -- 1 to 12
day INTEGER NOT NULL, -- 1 to 31
quarter INTEGER NOT NULL, -- 1 to 4
week INTEGER NOT NULL, -- 1 to 52/53
day_name VARCHAR(9) NOT NULL, -- 'Monday', 'Tuesday'...
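The surrogate-key convention in the column comment (year*10000 + month*100 + day) and the derived columns can be sketched in plain Python; this is an illustration of the table's layout, not code from the forked gist:

```python
from datetime import date, timedelta

def time_dimension_id(d):
    # Matches the id column comment: year*10000 + month*100 + day
    return d.year * 10000 + d.month * 100 + d.day

def calendar_rows(start, end):
    """Yield (id, db_date, year, month, day, quarter) rows like the SQL table."""
    d = start
    while d <= end:
        yield (time_dimension_id(d), d.isoformat(), d.year, d.month, d.day,
               (d.month - 1) // 3 + 1)  # quarter: 1 to 4
        d += timedelta(days=1)

rows = list(calendar_rows(date(2018, 1, 1), date(2018, 1, 3)))
```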
@jagandecapri
jagandecapri / pad_packed_demo.py
Created September 3, 2021 12:20 — forked from HarshTrivedi/pad_packed_demo.py
Minimal tutorial on packing (pack_padded_sequence) and unpacking (pad_packed_sequence) sequences in pytorch.
import torch
from torch import LongTensor
from torch.nn import Embedding, LSTM
from torch.autograd import Variable
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
## We want to run LSTM on a batch of 3 character sequences ['long_str', 'tiny', 'medium']
#
# Step 1: Construct Vocabulary
# Step 2: Load indexed data (list of instances, where each instance is list of character indices)
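The preprocessing that pack_padded_sequence expects (index the characters, sort by length longest-first, pad to the longest sequence) can be sketched in plain Python; this mirrors the tutorial's setup for the three example strings but is not code from the gist:

```python
seqs = ['long_str', 'tiny', 'medium']

# Step 1: build a vocabulary from all characters; index 0 is reserved
# for the padding symbol
vocab = ['<pad>'] + sorted({ch for s in seqs for ch in s})
char2idx = {ch: i for i, ch in enumerate(vocab)}

# Step 2: index each sequence (list of instances, each a list of indices)
indexed = [[char2idx[ch] for ch in s] for s in seqs]

# pack_padded_sequence wants sequences sorted by length, longest first,
# together with the original lengths
lengths = sorted((len(s) for s in indexed), reverse=True)
indexed.sort(key=len, reverse=True)

# pad every sequence with 0 up to the longest length
max_len = lengths[0]
padded = [s + [0] * (max_len - len(s)) for s in indexed]
```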
package vw;
import java.io.*;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.stream.Collectors;
torch.manual_seed(42)
x_tensor = torch.from_numpy(x).float()
y_tensor = torch.from_numpy(y).float()
# Builds dataset with ALL data
dataset = TensorDataset(x_tensor, y_tensor)
# Splits randomly into train and validation datasets
train_dataset, val_dataset = random_split(dataset, [80, 20])
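random_split draws the 80/20 partition at random over the dataset's indices; the same idea can be sketched without torch (illustrative only, not the library's implementation):

```python
import random

def random_split_indices(n, train_size, seed=42):
    """Shuffle indices 0..n-1 and split into train/validation index lists."""
    rng = random.Random(seed)  # seeded, like torch.manual_seed(42) above
    indices = list(range(n))
    rng.shuffle(indices)
    return indices[:train_size], indices[train_size:]

train_idx, val_idx = random_split_indices(100, 80)
```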
@jagandecapri
jagandecapri / force.reqanimframe.js
Created April 30, 2024 21:17 — forked from iros/force.reqanimframe.js
force layout with d3.timer instead of tick loop
var force = d3.layout.force()
.charge(-150)
.linkDistance(30)
.size([width, height]);
d3.json("assets/500nodes.json", function(error, graph) {
if (error) throw error;
// Task 2:
// Connect the force layout to the nodes and links in our dataset