Michael J Mathew (mjm522)
💭 robotics
mjm522 / pycharm in virtual environment.md
Last active October 8, 2019 14:47
Running PyCharm from within a virtual environment
mjm522 / A set of instructions to setup ROS Melodic with Conda.md
Last active January 22, 2024 08:22
Setting up NVIDIA Docker with ROS Melodic and Conda to use hardware acceleration

Steps

  1. Install Docker using the instructions here. These installation instructions were tested on Ubuntu 18.04 with Docker version 19.03.3, build a872fc2f86.

  2. To run Docker as a non-root user, follow the post-installation instructions here.

  3. Install the NVIDIA container toolkit (reference):

$ sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
mjm522 / Combine pdf Ubuntu
Last active November 18, 2019 11:15
Combining PDFs in Ubuntu
Add this function to the `~/.bashrc` file or keep it in a shell script file. (The original took the input files as one quoted second argument; accepting all remaining arguments with `shift`/`"$@"` is more robust.)
```
function combine_pdfs()
{
    # Usage: combine_pdfs output.pdf input1.pdf input2.pdf ...
    output_file=$1
    shift
    gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -dAutoRotatePages=/None -sOutputFile="$output_file" "$@"
}
```
mjm522 / Isolate CPU Ubuntu.md
Last active February 16, 2024 05:37
Isolate a CPU core to run Python code on Ubuntu

Note: To run a process on dedicated CPUs you can use the taskset command. For example:

taskset -c 5,11 python -m timeit 'sum(range(10**7))'

However, this does not guarantee that CPUs 5 and 11 will be used by that process alone: the scheduler can still place other tasks on them, so they are not properly isolated. To truly isolate a CPU, take the following steps. Interrupt (IRQ) handling could also be moved off these CPUs, but that is not covered here.

  1. List all available cores and hyperthreads in your processor.
lscpu --all --extended
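Once a core has been set aside (e.g. via the isolcpus kernel parameter), a Python process can also pin itself to it from inside the code rather than through taskset. A minimal standard-library sketch — core 0 below is only a placeholder for whichever core you isolated:

```python
import os

# Pin the current process to a chosen CPU core (Linux only).
# Core 0 is only a placeholder; on a real setup you would pass the
# core you isolated with the isolcpus kernel parameter.
target_cores = {0}
os.sched_setaffinity(0, target_cores)   # pid 0 means the calling process

# Verify that the affinity mask took effect.
pinned = os.sched_getaffinity(0)
print(pinned)
```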
mjm522 / instructions.md
Last active November 13, 2020 15:59
GDB C++ Debugging

If your program crashes with a segmentation fault as soon as it runs, put an infinite while loop or a cin read at the start of the main loop and run it again, so the process stays alive (spinning in the loop or waiting for input) long enough to attach a debugger. While it is running, open a terminal and type:

ps ax | grep <name of your executable>

This lists the process ID and the complete path to the executable along with any arguments present.

mjm522 / import_module.py
Created September 24, 2020 13:06
Import a function from a Python file specified in the arguments.
import os
import sys
import argparse
import importlib

def import_function_handle(config_file):
    file_path = '[ROOT DIRECTORY WHERE YOUR PYTHON FILE EXIST]/{}.py'
    file_path = file_path.format(config_file)
    if not os.path.isfile(file_path):
        print("\033[31;40mSuch a config file does not exist, check name ...\033[m")
        print("\033[31;40mCurrent specified path: {}\033[m".format(file_path))
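The preview above is cut off. A complete, self-contained sketch of the same idea — loading a named function from a Python file by path with importlib — could look like this (the file and function names in the demo are hypothetical):

```python
import importlib.util
import os
import tempfile

def import_function_handle(file_path, function_name):
    # Load `function_name` from the Python file at `file_path`.
    if not os.path.isfile(file_path):
        raise FileNotFoundError("No such config file: {}".format(file_path))
    spec = importlib.util.spec_from_file_location("config_module", file_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)          # actually executes the file
    return getattr(module, function_name)

# Demo with a throwaway config file (names are hypothetical).
config_path = os.path.join(tempfile.mkdtemp(), "myconfig.py")
with open(config_path, "w") as f:
    f.write("def answer():\n    return 42\n")
answer = import_function_handle(config_path, "answer")
print(answer())   # 42
```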
mjm522 / inter_process_communication_using_shared_memory.py
Last active October 6, 2020 10:30
Inter-process communication using multiprocessing.shared_memory
import numpy as np
from multiprocessing import shared_memory, Process

def worker(shared_variable):
    existing_shm = shared_memory.SharedMemory(name=shared_variable.name)
    myarray = np.ndarray((6,), dtype=np.int64, buffer=existing_shm.buf)
    count = 0
    while True:
        count += 1
        myarray[-1] = count
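The preview cuts off mid-loop. A complete, runnable sketch of the same pattern — a child process attaching to a shared block by name and writing a value the parent can see — might be (the value 42 is arbitrary):

```python
import numpy as np
from multiprocessing import Process, shared_memory

def worker(shm_name):
    # Attach to the existing block by name; no data is copied.
    shm = shared_memory.SharedMemory(name=shm_name)
    arr = np.ndarray((6,), dtype=np.int64, buffer=shm.buf)
    arr[-1] = 42            # visible to the parent immediately
    shm.close()

shm = shared_memory.SharedMemory(create=True, size=6 * 8)
arr = np.ndarray((6,), dtype=np.int64, buffer=shm.buf)
arr[:] = 0
p = Process(target=worker, args=(shm.name,))
p.start()
p.join()
result = int(arr[-1])
print(result)   # 42
shm.close()
shm.unlink()
```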
mjm522 / inter_process_communication_using_nmap.py
Last active October 6, 2020 10:32
Inter-process communication using numpy.memmap
import numpy as np
from multiprocessing import Process

def worker(array_file):
    count = 0
    temp = np.zeros(6)
    while True:
        count += 1
        temp[-1] = count
        array_file[:] = temp[:]
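A complete sketch of the memmap variant: both processes map the same file, so a write through one mapping is visible through the other (the file name and the value 7.0 are arbitrary):

```python
import os
import tempfile
import numpy as np
from multiprocessing import Process

path = os.path.join(tempfile.mkdtemp(), "shared.dat")

def worker(array_file):
    # Open the existing file read-write and update the last element.
    arr = np.memmap(array_file, dtype=np.float64, mode="r+", shape=(6,))
    arr[-1] = 7.0
    arr.flush()

# Parent creates the file-backed array, child writes, parent reads.
parent = np.memmap(path, dtype=np.float64, mode="w+", shape=(6,))
parent[:] = 0.0
parent.flush()

proc = Process(target=worker, args=(path,))
proc.start()
proc.join()
result = float(parent[-1])
print(result)   # 7.0
```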
mjm522 / inter_process_communication_using_pipes.py
Last active October 7, 2020 07:07
Inter-process communication using named pipes.
import time
import os, stat
import numpy as np
from multiprocessing import Process

pipename = 'myfifo'
# os.stat raises FileNotFoundError if the path does not exist yet,
# so check for existence before creating the FIFO.
if not os.path.exists(pipename):
    os.mkfifo(pipename)

def worker(pipe_name):
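The preview stops at the worker definition. A complete minimal sketch of the named-pipe pattern — a writer process and a reading parent, each blocking on open() until the other side connects (path and payload are arbitrary):

```python
import os
import tempfile
from multiprocessing import Process

pipename = os.path.join(tempfile.mkdtemp(), "myfifo")
if not os.path.exists(pipename):
    os.mkfifo(pipename)

def writer(path):
    # open() on a FIFO blocks until the other end is opened too.
    with open(path, "w") as fifo:
        fifo.write("1,2,3,4,5,6\n")

p = Process(target=writer, args=(pipename,))
p.start()
with open(pipename, "r") as fifo:
    line = fifo.readline().strip()
p.join()
values = [int(v) for v in line.split(",")]
print(values)   # [1, 2, 3, 4, 5, 6]
```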
mjm522 / inter_process_communication_using_queue.py
Created October 7, 2020 07:26
Inter-process communication using multiprocessing queues.
import numpy as np
from multiprocessing import Queue, Process

def worker(shared_queue):
    count = 0
    temp = np.zeros(6)
    while True:
        count += 1
        temp[-1] = count
        shared_queue.put(temp)
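A bounded, runnable sketch of the queue pattern (the preview's loop runs forever): the worker puts a few arrays and a None sentinel, and the parent drains the queue. Each put() pickles a copy of the array, so unlike shared memory the processes never share a buffer. The count of 3 is arbitrary:

```python
import numpy as np
from multiprocessing import Process, Queue

def worker(shared_queue, n):
    for count in range(1, n + 1):
        sample = np.zeros(6)
        sample[-1] = count
        shared_queue.put(sample)   # pickles a copy of the array
    shared_queue.put(None)         # sentinel: no more data

q = Queue()
p = Process(target=worker, args=(q, 3))
p.start()
received = []
while True:
    item = q.get()
    if item is None:
        break
    received.append(item[-1])
p.join()
print(received)   # [1.0, 2.0, 3.0]
```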