
# The whole sequence below needs a root shell (the redirection in the
# printf step will not work through plain sudo), so become root first.
sudo su
cd /
# Remount the EFI variable filesystem read-write.
umount /sys/firmware/efi/efivars/
mount -t efivarfs rw /sys/firmware/efi/efivars/
cd /sys/firmware/efi/efivars/
# Remove any existing gpu-power-prefs-… variable (same GUID as used below).
rm gpu-power-prefs-…
# Write the new value for the variable, then mark it immutable so it
# cannot be modified again.
printf "\x07\x00\x00\x00\x01\x00\x00\x00" > gpu-power-prefs-fa4ce28d-b62f-4c99-9cc3-6815686e30f9
chattr +i "gpu-power-prefs-fa4ce28d-b62f-4c99-9cc3-6815686e30f9"
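# Optional check, not part of the original gist: lsattr should now list the
# immutable (i) flag on the variable, confirming the chattr step took effect.
lsattr gpu-power-prefs-fa4ce28d-b62f-4c99-9cc3-6815686e30f9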
cd /
umount /sys/firmware/efi/efivars/
okakaino / single_producer_multiple_consumers.py
Last active August 24, 2021 12:34
Single-producer, multiple-consumers example in Python
import random
import time
from queue import Empty, Queue
from threading import Thread
max_product = 10
cur_product = 0
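The lines above are only the truncated gist preview (the imports and the two counters). Below is a minimal sketch, not the gist's actual code, of how a single producer and several consumers could share one Queue together with those counters; the produce/consume bodies, the timed get, and the thread wiring are my assumptions.

import random
import time
from queue import Empty, Queue
from threading import Thread

max_product = 10
cur_product = 0

def produce(queue):
    global cur_product
    while cur_product < max_product:
        queue.put(cur_product)           # hand one item to whichever consumer takes it first
        cur_product += 1
        time.sleep(random.random())

def consume(queue, name):
    while True:
        try:
            item = queue.get(timeout=2)  # give up once the producer has been quiet for a while
        except Empty:
            break
        print(f'consumer {name} got {item}')
        queue.task_done()

queue = Queue()
producer = Thread(target=produce, args=(queue,))
consumers = [Thread(target=consume, args=(queue, i)) for i in range(3)]
producer.start()
for c in consumers:
    c.start()
producer.join()
for c in consumers:
    c.join()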
okakaino / producer_consumer.py
Created April 15, 2018 02:49
An example of using the producer-consumer model in Python
import random
import time
from queue import Queue
from threading import Thread
def produce(queue):
    nums = range(5)
    while True:
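The preview cuts off inside produce. One plausible completion of the pair, assuming the gist matches this producer with a consumer that blocks on queue.get(); the bodies below are illustrative, not the original code.

import random
import time
from queue import Queue
from threading import Thread

def produce(queue):
    nums = range(5)
    while True:
        num = random.choice(nums)
        queue.put(num)                   # blocks when the queue is full
        print(f'produced {num}')
        time.sleep(random.random())

def consume(queue):
    while True:
        num = queue.get()                # blocks until the producer puts something
        print(f'consumed {num}')
        queue.task_done()

queue = Queue(maxsize=5)
Thread(target=produce, args=(queue,), daemon=True).start()
Thread(target=consume, args=(queue,), daemon=True).start()
time.sleep(5)                            # let the two daemon threads run briefly, then exit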
okakaino / aiohttp_crawler.py
Last active January 22, 2018 04:38
My little idea of how to run multiple web crawler instances simultaneously with a big input file.
# -*- coding: utf-8 -*-
"""To run multiple aiohttp instances concurrently.
code snippets I found online all create a list tasks to run,
but my example file abc.csv is too big, and contains 10 million rows,
which cannot fit into limited memory on a AWS ec2.micro instance,
so I use a global generator instead, to feed task dynamically.
requires python version 3.6.4 for f string to work.
"""