Sashank Thupukari (helloworld)

output.txt
============================= test session starts ==============================
platform linux -- Python 3.10.8, pytest-7.2.0, pluggy-1.0.0 -- /usr/local/bin/python
cachedir: .pytest_cache
rootdir: /tmp/tmp4p82sops
plugins: ddtrace-1.5.2, anyio-3.6.2
collecting ... collected 4 items
test.py::test_hello_world PASSED [ 25%]
test.py::test_hello_world_with_name PASSED [ 50%]
test.py::test_hello_world_with_name_and_age PASSED [ 75%]
output.txt
~/pr/gpt-coder/test_runner master !52 ?169 ❯ python main.py  gpt-coder 3.10 16:39:36
️️⚡️ Serving... hit Ctrl-C to stop!
└── Watching /Users/sashank/projects/gpt-coder/test_runner/main.py.
✓ Initialized. View app at https://modal.com/apps/ap-qfqInd1MLwOOrMzQdptsAQ
╭──────────────────────────── Traceback (most recent call last) ────────────────────────────╮
│ /Users/sashank/projects/gpt-coder/test_runner/main.py:91 in <module> │
│ │
│ 90 if __name__ == "__main__": │
│ ❱ 91 │ stub.serve() │
│ 92 │
modal_pytest.py
import modal

pytest_image = modal.Image.debian_slim().pip_install(["pytest"])
stub = modal.Stub()

code = """
def hello_world(name=None, age=None, city=None):
    return 'Hello World!'
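The preview cuts off inside the code string. As a rough sketch of how these pieces might fit together (an assumption, not the gist's actual code), a function decorated with stub.function(image=pytest_image) could write the generated code and a set of tests into one module and run pytest on it inside the container. The run_tests name, the file path, and the inline tests string below are hypothetical.

@stub.function(image=pytest_image)
def run_tests(code: str, tests: str) -> int:
    # Inside the container: write the generated code and its tests into a
    # single test module, then let pytest collect and run it. pytest.main
    # returns an exit code (0 means every test passed).
    import pytest

    test_path = "/tmp/test_generated.py"
    with open(test_path, "w") as f:
        f.write(code + "\n\n" + tests)
    return int(pytest.main(["-v", test_path]))

if __name__ == "__main__":
    tests = "def test_hello_world():\n    assert hello_world() == 'Hello World!'\n"
    with stub.run():
        print(run_tests.call(code, tests))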
tailwind.config.js
const colors = require('tailwindcss/colors')

module.exports = {
  purge: [],
  presets: [],
  darkMode: false, // or 'media' or 'class'
  theme: {
    screens: {
      sm: '640px',
      md: '768px',
mapper.py
from dagster import pipeline, solid, RepositoryDefinition, InputDefinition, execute_pipeline, DagsterInstance


@solid(input_defs=[InputDefinition("number", int)])
def process(context, number):
    context.log.info("Number: {}".format(number))


@solid
def root(context):
    numbers = [1, 2, 3, 4, 5]
    for number in numbers:
recursive_pipeline.py
from dagster import pipeline, solid, RepositoryDefinition, InputDefinition, execute_pipeline_iterator, DagsterInstance


@solid(input_defs=[InputDefinition("number", int)])
def my_solid(context, number):
    context.log.info("Number: {}".format(number))
    # Stop condition to prevent infinite recursion
    if number == 5:
        return
generate.py
#!/usr/bin/python
from datetime import datetime, timedelta
from string import Template
going_text = \
"""JFK, Seville, 284, 1 stop, Mon 8/13 11:25 PM, 7h in Lisbon, Tue 8/14 3:35 PM, TAP
JFK, Madrid, 284, 1 stop, Mon 8/13 11:25 PM, 7h in Lisbon, Tue 8/14 9:20 PM, TAP
JFK, Madrid, 298, 2 stop, Wed 8/15 11:25 PM, 1h 45m in Lisbon; 6h 15m in Porto, Thu 8/16 10:25 PM, TAP
JFK, Madrid, 263, 2 Stop, Sun 8/19 11:25 PM, 1h 45m in Lisbon; 6h 15m in Porto, Mon 8/20 10:25 PM, TAP
trips
Length: 9 days :: 08/13 - 08/22
Going: JFK to Seville [$284] on TAP
    departure: Mon 8/13 11:25 PM
    stops: 1 stop 7h in Lisbon
    arrival: Tue 8/14 3:35 PM
Return: Madrid to JFK [$344] on Norwegian
    departure: Wed 8/22 3:00 PM
    stops: 1 Stop 1h in Copenhagen
    arrival: Wed 8/22 9:30 PM
Total: $628
output.txt
ubuntu@ip-172-31-71-84:~/lapstyle$ th lap_style.lua -style_image images/flowers.png -content_image images/megan.png -output_image output/megan_flowers20_100.png -content_weight 20 -lap_layers 2 -lap_weights 100
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
questions.js
var surveys = {
  "mental_wellness": {
    intro: "Hey. I'm going to ask you a couple of questions about your mental health.",
    questions: ["On a scale of 1 - 10, how are you feeling today?", "Has anyone hurt you?", "Answer yes or no to this question?"],
    exit: "Thanks so much!"
  }
}