@stephanie-wang
stephanie-wang / test_dask.py
Last active January 28, 2021 17:48
Dask-on-Ray sort
import ray
import dask
import dask.dataframe as dd
import json
import pandas as pd
import numpy as np
from ray.util.dask import ray_dask_get
import os.path
import csv
import fastparquet
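A minimal sketch of how these imports are typically wired together for a Dask-on-Ray sort, continuing from the imports above; the data, column name, and partition count are illustrative assumptions, not taken from the gist:

ray.init()
dask.config.set(scheduler=ray_dask_get)  # route all Dask tasks through Ray

df = dd.from_pandas(
    pd.DataFrame({"key": np.random.randint(0, 100, size=1000)}),
    npartitions=10)
# set_index triggers a full shuffle, i.e. a distributed sort on "key".
print(df.set_index("key").compute().head())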
@stephanie-wang
stephanie-wang / dask_sort.py
Created December 10, 2020 23:19
Dask sort
import dask
import dask.dataframe as dd
import json
import pandas as pd
import numpy as np
import os.path
import csv
import fastparquet
from dask.distributed import Client
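A hedged sketch of the equivalent sort on Dask's own distributed scheduler, continuing from the imports above; the local Client, data, and column name are assumptions for illustration:

client = Client()  # spins up a local scheduler and worker processes

df = dd.from_pandas(
    pd.DataFrame({"key": np.random.randint(0, 100, size=1000)}),
    npartitions=10)
# As with the Dask-on-Ray version, set_index performs the distributed sort.
print(df.set_index("key").compute().head())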
@stephanie-wang
stephanie-wang / test.py
Created September 2, 2020 00:29
Test zombie raylet processes
import ray
import numpy as np
import time
ray.init(address="auto")
@ray.remote
def f(x):
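    # The preview cuts off here; a plausible body (pure assumption) that
    # allocates some memory and idles, so worker and raylet processes can
    # be observed after the driver exits:
    time.sleep(1)
    return np.zeros(10 * 1024 * 1024, dtype=np.uint8)

ray.get([f.remote(i) for i in range(10)])  # hypothetical driver loop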
@stephanie-wang
stephanie-wang / detached_driver.py
Last active July 14, 2020 23:50
Detached job
import ray
from ray.test_utils import SignalActor
@ray.remote
class Driver:
def __init__(self):
pass
def start(self, signal):
signal.send.remote()
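A hedged sketch of how this actor might be launched as a detached job; the actor name and the lifetime option are assumptions based on the gist title, not shown in the preview:

signal = SignalActor.remote()
# A detached, named actor outlives the script that created it.
driver = Driver.options(name="driver", lifetime="detached").remote()
ray.get(driver.start.remote(signal))
ray.get(signal.wait.remote())  # returns once the driver has signaled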
import ray
import time
@ray.remote
def foo(arg):
return
@ray.remote
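# The preview cuts off at this decorator; a minimal hypothetical
# completion so the second snippet parses:
def bar(arg):
    return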
@stephanie-wang
stephanie-wang / Notes for Gabriele
Created June 9, 2020 23:41
Dependency resolution bug
@ray.remote
def f():
return 1
@ray.remote
def g(x):
return x + 1
x_id = f.remote() # ({'CPU': 1}, args=[])
g.remote(x_id) # ({'CPU': 1}, args=[x]) --> ({'CPU': 1}, args=[])
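To see the resolution in action, a hedged, runnable version of the note above; the init call and the assertion are additions for illustration:

import ray

ray.init()

@ray.remote
def f():
    return 1

@ray.remote
def g(x):
    return x + 1

x_id = f.remote()
# While f is still running, g's task spec lists x_id as a dependency;
# once f finishes, the raylet resolves the argument and the dependency
# list becomes empty, as the arrow in the note above indicates.
assert ray.get(g.remote(x_id)) == 2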
import subprocess
import cv2
import os.path
import numpy as np
import time
import json
import threading
import tempfile
import ray
@stephanie-wang
stephanie-wang / keybase.md
Created March 25, 2020 00:31
Keybase confirmation

Keybase proof

I hereby claim:

  • I am stephanie-wang on github.
  • I am swangster (https://keybase.io/swangster) on keybase.
  • I have a public key whose fingerprint is CCCD 57CE 7B21 D165 B132 632B D26B 31FD FDFF A3DB

To claim this, I am signing this object:

@stephanie-wang
stephanie-wang / dynamic_resource_example.py
Created July 19, 2019 19:17
Dynamic resource example in Ray
import time
import ray
from ray.tests.cluster_utils import Cluster
# Create a cluster with some worker nodes with 1 CPU each, to force colocated
# tasks to run one at a time.
cluster = Cluster(initialize_head=True, connect=True, head_node_args={'num_cpus': 0})
num_nodes = 3
for _ in range(num_nodes):
cluster.add_node(num_cpus=1)
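A hedged continuation of the example. In the Ray 0.7-era API this gist dates from, ray.experimental.set_resource could create or update a custom resource on a live node; treat the exact call and the task below as assumptions:

@ray.remote(resources={"custom": 1})
def colocated_task():
    time.sleep(1)
    return "done"

# Dynamically grant 2 units of "custom" at runtime so the tasks below
# can be scheduled; with 1 CPU per node, they still run one at a time.
ray.experimental.set_resource("custom", 2)
print(ray.get([colocated_task.remote() for _ in range(2)]))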
@stephanie-wang
stephanie-wang / logs
Created January 15, 2019 05:17
`ray.wait` node failure example
WARNING: 3 workers have been started. This could be a result of using a large number of actors, or it could be a consequence of using nested tasks (see https://github.com/ray-project/ray/issues/3644 for some discussion of workarounds).
WARNING: 3 workers have been started. This could be a result of using a large number of actors, or it could be a consequence of using nested tasks (see https://github.com/ray-project/ray/issues/3644 for some discussion of workarounds).
WARNING: 3 workers have been started. This could be a result of using a large number of actors, or it could be a consequence of using nested tasks (see https://github.com/ray-project/ray/issues/3644 for some discussion of workarounds).
WARNING: 3 workers have been started. This could be a result of using a large number of actors, or it could be a consequence of using nested tasks (see https://github.com/ray-project/ray/issues/3644 for some discussion of workarounds).
WARNING: 3 workers have been started. This could be a result of using a large number of actors, or it could be a consequence of using nested tasks (see https://github.com/ray-project/ray/issues/3644 for some discussion of workarounds).
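The logs above come from a run exercising ray.wait across a node failure. A hedged sketch of the kind of driver that produces such a run; every name below is an assumption, since the gist preview only shows the warnings:

import ray
import time

ray.init(address="auto")

@ray.remote
def slow_task(i):
    time.sleep(10)
    return i

# Collect results one at a time with ray.wait; if a node dies mid-run,
# only tasks that completed (or were re-executed elsewhere) show up in
# the ready list.
pending = [slow_task.remote(i) for i in range(12)]
while pending:
    ready, pending = ray.wait(pending, num_returns=1)
    print(ray.get(ready[0]))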