updating: listenbrainz_spark/sql/__init__.py (deflated 46%)
updating: listenbrainz_spark/sql/create_dataframes_queries.py (deflated 76%)
updating: listenbrainz_spark/sql/recommend_queries.py (deflated 77%)
updating: listenbrainz_spark/sql/candidate_sets_queries.py (deflated 78%)
updating: listenbrainz_spark/sql/__pycache__/ (stored 0%)
updating: listenbrainz_spark/sql/__pycache__/create_dataframes_queries.cpython-34.pyc (deflated 71%)
updating: listenbrainz_spark/sql/__pycache__/__init__.cpython-34.pyc (deflated 39%)
updating: listenbrainz_spark/sql/__pycache__/candidate_sets_queries.cpython-34.pyc (deflated 71%)
updating: listenbrainz_spark/sql/__pycache__/recommend_queries.cpython-34.pyc (deflated 70%)
updating: listenbrainz_spark/exceptions.py (deflated 22%)
#!/bin/bash
source config.sh
zip -r listenbrainz_spark.zip listenbrainz_spark/
time ./run.sh /usr/local/spark/bin/spark-submit \
    --packages org.apache.spark:spark-avro_2.11:2.4.1 \
    --master $SPARK_URI \
    --conf "spark.scheduler.listenerbus.eventqueue.capacity"=$LISTENERBUS_CAPACITY \
    --conf "spark.cores.max"=$MAX_CORES \
curr_date = datetime.utcnow()                      # 2019-08-02
begin_date = curr_date + relativedelta(days=-50)   # 2019-06-13, less than curr_date, so 6.parquet fetched
begin_date = begin_date + relativedelta(months=1)  # 2019-07-13, less than curr_date, so 7.parquet fetched
begin_date = begin_date + relativedelta(months=1)  # 2019-08-13, greater than curr_date, so 8.parquet not fetched
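The comments above walk the fetch window forward one month at a time. A minimal sketch of that walk, assuming a hypothetical months_to_fetch helper and a window_days parameter (neither name is in the original code):

from datetime import datetime
from dateutil.relativedelta import relativedelta

def months_to_fetch(window_days=50):
    """Yield the (year, month) pairs whose parquet files would be fetched,
    walking month by month from (now - window_days) up to now."""
    curr_date = datetime.utcnow()
    begin_date = curr_date + relativedelta(days=-window_days)
    while begin_date < curr_date:                # 2019-06-13 and 2019-07-13 pass the check
        yield begin_date.year, begin_date.month  # -> 6.parquet, 7.parquet
        begin_date = begin_date + relativedelta(months=1)  # 2019-08-13 fails, so 8.parquet is skipped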
#include <stdio.h>

/* Count set bits with Brian Kernighan's trick:
   num & (num - 1) clears the lowest set bit,
   so the loop runs once per set bit. */
int start(int num)
{
    int count = 0;
    while (num > 0) {
        ++count;
        num = (num - 1) & num;
    }
    return count;
}

int main(void)
{
    printf("%d\n", start(2017)); /* 2017 = 0b11111100001 -> prints 7 */
    return 0;
}
You have a 32-bit integer a. Given two integers n and m (m >= n), find the integer formed by the bits of a between positions n and m (both inclusive).
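A minimal Python sketch of the extraction, assuming bit positions are counted from 0 at the least significant bit (the statement does not fix a convention); bits_between is a hypothetical name:

def bits_between(a, n, m):
    """Return the integer formed by bits n..m (inclusive) of a,
    counting positions from the least significant bit."""
    width = m - n + 1
    mask = (1 << width) - 1   # 'width' ones
    return (a >> n) & mask    # shift the range down, mask everything else off

# Example: a = 0b1101101, n = 2, m = 5 -> bits 1011 -> 11
print(bits_between(0b1101101, 2, 5))  # 11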
#!/usr/bin/env python3
from datetime import datetime
from dateutil.relativedelta import relativedelta

# in days
RECOMMENDATION_GENERATION_WINDOW = 60
STEPS_TO_REACH_NEXT_MONTH = 32

def adjust_days(date, days, shift_backwards=True):
    if shift_backwards:
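The embed cuts off inside adjust_days. A plausible completion, under the assumption that the helper simply shifts the date by the given number of days in the requested direction (a sketch, not the project's verified code):

from dateutil.relativedelta import relativedelta

def adjust_days(date, days, shift_backwards=True):
    # Assumed behaviour: move 'days' backwards by default, forwards otherwise.
    if shift_backwards:
        return date + relativedelta(days=-days)
    return date + relativedelta(days=days)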
#!/usr/bin/env python3
# All paths from source to destination
# source = 0
# destination = n - 1
# Input: [[1,2], [3], [3], []]
# Output: [[0,1,3],[0,2,3]]
from typing import List

class Solution:
    def allPathsSourceTarget(self, graph: List[List[int]]) -> List[List[int]]:
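The embed stops at the method signature. A self-contained DFS completion of this problem as a sketch; the body below is mine, not necessarily the original:

from typing import List

class Solution:
    def allPathsSourceTarget(self, graph: List[List[int]]) -> List[List[int]]:
        target = len(graph) - 1
        paths = []

        def dfs(node, path):
            if node == target:           # reached destination n - 1
                paths.append(path[:])
                return
            for nxt in graph[node]:      # the graph is a DAG, so no visited set is needed
                path.append(nxt)
                dfs(nxt, path)
                path.pop()

        dfs(0, [0])                      # every path starts at source 0
        return paths

# Example from the comments above:
print(Solution().allPathsSourceTarget([[1, 2], [3], [3], []]))  # [[0, 1, 3], [0, 2, 3]]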
listenbrainz-jobs-vansika
listenbrainz-jobs-vansika
latest: Pulling from metabrainz/listenbrainz-spark
Digest: sha256:3ea74d55c434b9e198352946ba3d5cf6161dc9b95bed273a3e2d8b2affffa67a
Status: Image is up to date for metabrainz/listenbrainz-spark:latest
docker: Error response from daemon: Could not attach to network spark-network: rpc error: code = PermissionDenied desc = network spark-network not manually attachable.
ERRO[0000] error waiting for container: context canceled
real    0m2.050s
user    0m0.287s
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/spark-2.4.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-avro_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-2b99af2e-217e-45cf-b06f-204a61bccb89;1.0
    confs: [default]
You probably access the destination server through a proxy server that is not well configured.
You probably access the destination server through a proxy server that is not well configured.
You probably access the destination server through a proxy server that is not well configured.
Connecting to namenode via http://hadoop-master:9870/fsck?ugi=root&path=%2F
FSCK started by root (auth:SIMPLE) from /10.0.0.39 for path / at Wed Aug 14 10:47:30 GMT 2019
Status: HEALTHY
Number of data-nodes: 5
Number of racks: 1
Total dirs: 818
Total symlinks: 0
Replicated Blocks: