Rajitha Hathurusinghe (ratmcu)
🏠 Working from home
@ratmcu
ratmcu / referencecollapse.cpp
Created April 22, 2022 04:00
Example of a function template whose forwarding reference (T&&) accepts both an lvalue and an rvalue, via reference collapsing
#include <iostream>
template<typename T>
void fnc(T&& x) // forwarding reference: binds to lvalues and rvalues
{
  // std::cout << x;
  x++;
}
int main()
{
  int a = 0;
  fnc(a); // lvalue: T deduced as int&, int& && collapses to int&
  fnc(1); // rvalue: T deduced as int, x is an int&&
  return 0;
}

find ../build/bin/ -type f \( -name \*.so \) -exec nm -A {} + | grep symBoL
@ratmcu
ratmcu / README.md
Last active August 29, 2020 14:18
Virtual memory manager with paging in C++

compiling

$ g++ -std=c++11 main.cpp -lpthread -o sim

running (make sure the simulation input file "input.txt" is placed in the current directory under that name)

./sim

@ratmcu
ratmcu / winsock2_UPD_server.cpp
Last active January 23, 2020 19:15
Native Windows Winsock UDP server in C++ with a Python client
#undef UNICODE
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <winsock2.h>
#include <ws2tcpip.h>
#include <stdlib.h>
#include <stdio.h>
#include <string>
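The Python client side this gist's description mentions can be sketched as a minimal UDP exchange over the loopback interface; the address, port, and payload below are assumptions for illustration, not values from the gist, and a throwaway server socket is included so the sketch is self-contained.

```python
import socket

HOST, PORT = "127.0.0.1", 50007  # assumed address/port, not from the gist

# A throwaway UDP "server" socket so the sketch runs without the C++ server.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind((HOST, PORT))

# The client: send one datagram and wait for the echoed reply.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"hello", (HOST, PORT))

data, addr = server.recvfrom(1024)  # server receives the datagram
server.sendto(data.upper(), addr)   # and echoes it back, uppercased

reply, _ = client.recvfrom(1024)
print(reply.decode())  # HELLO
client.close()
server.close()
```

Against the real Winsock server, only the `client` half is needed, pointed at the server's address and port.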
@ratmcu
ratmcu / main.c
Last active December 25, 2019 04:31
Splitting an array in Open MPI and combining the results [C/C++]
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <mpi.h>
int main(int argc, char *argv[])
{
  int process_Rank, size_Of_Comm;
  double distro_Array[] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14}; // data to be distributed
int N = sizeof(distro_Array)/sizeof(distro_Array[0]);
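The split-and-recombine bookkeeping the MPI snippet relies on can be sketched language-agnostically in Python: compute per-rank counts and displacements (as `MPI_Scatterv` expects), slice, then concatenate back in rank order. The 14-element array matches the gist's `distro_Array`; the 4-process count is an assumption.

```python
# Counts/displacements dividing n items as evenly as possible over p ranks.
def split_counts(n, p):
    base, extra = divmod(n, p)
    counts = [base + (1 if r < extra else 0) for r in range(p)]
    displs = [sum(counts[:r]) for r in range(p)]
    return counts, displs

data = list(range(1, 15))                    # the gist's 14-element array
counts, displs = split_counts(len(data), 4)  # 4 ranks is an assumption
chunks = [data[d:d + c] for c, d in zip(counts, displs)]
combined = [x for chunk in chunks for x in chunk]  # "gather" in rank order
print(counts)            # [4, 4, 3, 3]
print(combined == data)  # True
```

The first `extra` ranks get one surplus element each, which is exactly the layout a `counts`/`displs` pair describes to `MPI_Scatterv`/`MPI_Gatherv`.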
@ratmcu
ratmcu / file_seeker.py
Last active September 18, 2019 19:24
List the paths of all files with a given extension, using os.walk
import os
extension = '.csv'
tree = os.walk('./dataset')
paths = sorted(os.path.join(dirpath, name)
               for dirpath, _, filenames in tree
               for name in filenames
               if os.path.splitext(name)[-1] == extension)
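The gist's description mentions a generator; the same listing can be written as a true generator function that yields paths lazily instead of building the whole list up front. `'./dataset'` and `'.csv'` are the gist's own placeholders.

```python
import os

def iter_paths(root, extension):
    """Lazily yield paths of files under root whose extension matches."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[-1] == extension:
                yield os.path.join(dirpath, name)

# usage: paths = sorted(iter_paths('./dataset', '.csv'))
```

Sorting forces full evaluation anyway, but the generator form lets callers stop early or stream paths through a pipeline without materializing them.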
@ratmcu
ratmcu / multiprocessing_with_joining_back_results.py
Last active May 16, 2022 15:52 — forked from baojie/hello_multiprocessing.py
Python multiprocessing hello world. Split a list and process sublists in different jobs
import multiprocessing
import os
# split a list into evenly sized chunks
def chunks(l, n):
return [l[i:i+n] for i in range(0, len(l), n)]
def do_job(job_id, data_slice, queue):
for item in data_slice:
print("job", job_id, item)
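The "joining back results" step the preview truncates can be sketched end to end with a `multiprocessing.Queue`: each job pushes `(job_id, result)` and the parent reassembles results in job order regardless of completion order. The squaring workload and chunk size here are assumptions, consistent with the preview above.

```python
import multiprocessing

def chunks(l, n):
    """Split a list into chunks of (up to) n items."""
    return [l[i:i + n] for i in range(0, len(l), n)]

def do_job(job_id, data_slice, queue):
    # Process one sublist and push (job_id, result) back to the parent.
    queue.put((job_id, [x * x for x in data_slice]))  # squaring is an assumed workload

if __name__ == "__main__":
    data = list(range(8))
    queue = multiprocessing.Queue()
    jobs = [multiprocessing.Process(target=do_job, args=(i, s, queue))
            for i, s in enumerate(chunks(data, 3))]
    for j in jobs:
        j.start()
    # Drain one result per job, then join the processes.
    results = dict(queue.get() for _ in jobs)
    for j in jobs:
        j.join()
    combined = [x for i in sorted(results) for x in results[i]]
    print(combined)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Keying results by `job_id` is what makes the recombination deterministic: queue arrival order depends on the scheduler, but the final concatenation does not.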