- Install JetBrains Toolbox
- Turn on "Generate shell scripts" in the settings of the Toolbox app
- Set the path to `~/.local/share/JetBrains/Toolbox/bin`
- Quit and re-launch Toolbox
- Install PyCharm Community Edition
- Install Docker using the instructions here. These installation instructions were tested on Ubuntu 18.04 with Docker version 19.03.3, build a872fc2f86.
- To run Docker as a non-root user, follow the post-installation instructions here.
- Install the NVIDIA container toolkit (Reference):
$ sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
Add this function to the `~/.bashrc` file or keep it as a shell script file.
```
function combine_pdfs()
{
    output_file=$1
    files=($2)
    gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -dAutoRotatePages=/None -sOutputFile="$output_file" "${files[@]}"
}
```
Usage: `combine_pdfs merged.pdf "first.pdf second.pdf"` — the second argument is a single space-separated string that the function expands into an array of input files.
Note: To run a process on dedicated CPUs you can use the `taskset` command. For example:
taskset -c 5,11 python -m timeit 'sum(range(10**7))'
However, this does not guarantee that CPUs 5 and 11 will be used by that process alone: the scheduler can still place other tasks on the same CPUs, so they are not properly isolated. To truly isolate the CPUs, take the following steps. Another feature that could be turned off for these CPUs is interrupt handling (IRQs); however, that is not covered here.
- List all available cores and hyperthreads in your processor.
lscpu --all --extended
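The same pinning that `taskset` does from the command line can be done from inside a running Python process; a minimal sketch, assuming a Linux host where `os.sched_setaffinity` is available (the function name `pin_to_cpus` is illustrative):

```python
import os

def pin_to_cpus(cpu_ids):
    # Restrict the calling process (pid 0 = this process) to the given
    # CPU set, equivalent to launching it under `taskset -c`
    os.sched_setaffinity(0, cpu_ids)
    # Return the affinity mask actually in effect
    return os.sched_getaffinity(0)

# Pin this process to CPU 0 and confirm the new affinity mask
mask = pin_to_cpus({0})
```

As with `taskset`, this only restricts where the scheduler may place the process; it does not reserve the CPUs for it, so kernel-level isolation (e.g. `isolcpus` or cgroup cpusets) is still needed for exclusive use.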
While your program is running, run the command below to get the name of your executable. If the program causes a segmentation fault immediately, put a `while(true)` loop or a `cin` read at the start of the main loop and execute it again; the process will then be caught in the loop or waiting for input instead of crashing. While it is stuck, open a terminal and type:
ps ax | grep <name of your executable>
This lists the complete path to the executable along with any arguments present.
```
import os
import sys
import argparse
import importlib

def import_function_handle(config_file):
    # Build the path to the config module; replace the placeholder
    # with the root directory holding your Python files
    file_path = '[ROOT DIRECTORY WHERE YOUR PYTHON FILE EXIST]/{}.py'
    file_path = file_path.format(config_file)
    if not os.path.isfile(file_path):
        print("\033[31;40mSuch a config file does not exist, check name ...\033[m")
        print("\033[31;40mCurrent specified path: {}\033[m".format(file_path))
        return None
    # Load the module by its string name and return a handle to it
    return importlib.import_module(config_file)
```
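As a concrete illustration of this `importlib` pattern, the sketch below writes a throwaway config module to a temporary directory and imports it by its string name; the module name `my_config` and its contents are hypothetical:

```python
import os
import sys
import tempfile
import importlib

# Create a throwaway config module (hypothetical name and content)
config_dir = tempfile.mkdtemp()
with open(os.path.join(config_dir, 'my_config.py'), 'w') as f:
    f.write('learning_rate = 0.01\n')

# Make the directory importable, then load the module by its string name
sys.path.insert(0, config_dir)
config = importlib.import_module('my_config')
print(config.learning_rate)  # → 0.01
```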
```
import numpy as np
from multiprocessing import Process

def worker(array_file):
    # Continuously write an increasing counter into the last slot
    # of the shared array
    count = 0
    temp = np.zeros(6)
    while True:
        count += 1
        temp[-1] = count
        array_file[:] = temp[:]
```
```
import time
import os, stat
import numpy as np
from multiprocessing import Process

pipename = 'myfifo'
# os.stat raises FileNotFoundError if the path does not exist,
# so check for existence before testing whether it is a FIFO
if not os.path.exists(pipename) or not stat.S_ISFIFO(os.stat(pipename).st_mode):
    os.mkfifo(pipename)

def worker(pipe_name):
    # Open the FIFO for writing (blocks until a reader connects)
    # and stream the array's raw bytes through it
    with open(pipe_name, 'wb') as fifo:
        fifo.write(np.zeros(6).tobytes())
```
```
import numpy as np
from multiprocessing import Queue, Process

def worker(shared_queue):
    # Push successive counter values to the consumer through the queue
    count = 0
    temp = np.zeros(6)
    while True:
        count += 1
        temp[-1] = count
        shared_queue.put(temp)
```