Created February 8, 2024 04:51
fix runtime
#Saving snapshot
#Thu Feb 08 04:34:52 UTC 2024
python=/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python
model_snapshot={\n "name"\: "20240208043452822-shutdown.cfg",\n "modelCount"\: 1,\n "created"\: 1707366892822,\n "models"\: {\n "mnist"\: {\n "1.0"\: {\n "defaultVersion"\: true,\n "marName"\: "mnist.mar",\n "minWorkers"\: 16,\n "maxWorkers"\: 16,\n "batchSize"\: 1,\n "maxBatchDelay"\: 100,\n "responseTimeout"\: 120\n }\n }\n }\n}
tsConfigFile=logs/config/20240208043104864-shutdown.cfg
version=0.9.0
workflow_store=model_store
load_models=mnist\=mnist.mar
model_store=model_store
number_of_gpu=0
#Saving snapshot
#Thu Feb 08 04:47:24 UTC 2024
python=/home/ubuntu/miniconda3/envs/serve230/bin/python
model_snapshot={\n "name"\: "20240208044724715-shutdown.cfg",\n "modelCount"\: 1,\n "created"\: 1707367644716,\n "models"\: {\n "mnist"\: {\n "1.0"\: {\n "defaultVersion"\: true,\n "marName"\: "mnist.mar",\n "minWorkers"\: 16,\n "maxWorkers"\: 16,\n "batchSize"\: 1,\n "maxBatchDelay"\: 100,\n "responseTimeout"\: 120,\n "runtimeType"\: "python"\n }\n }\n }\n}
tsConfigFile=logs/config/20240208043452822-shutdown.cfg
version=0.9.0
workflow_store=model_store
load_models=mnist\=mnist.mar
model_store=model_store
number_of_gpu=0
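The "fix runtime" this gist demonstrates shows up in the second snapshot: the `mnist` 1.0 entry now carries `"runtimeType": "python"`, which the first snapshot lacks. As a minimal sketch (the `parse_model_snapshot` helper and the abbreviated sample lines are illustrative, not part of TorchServe; it assumes the Java-properties-style escaping shown above, where `\:` and `\n` encode a colon and a newline):

```python
import json

def parse_model_snapshot(line: str) -> dict:
    """Decode a `model_snapshot=` entry from a TorchServe snapshot file.

    Snapshot files use Java-properties escaping, so colons and newlines
    inside the embedded JSON appear as `\\:` and `\\n`.
    """
    _, _, raw = line.partition("model_snapshot=")
    decoded = raw.replace("\\n", "\n").replace("\\:", ":")
    return json.loads(decoded)

# Abbreviated samples in the same escaped form as the files above (hypothetical).
old = r'model_snapshot={\n "models"\: {"mnist"\: {"1.0"\: {"marName"\: "mnist.mar"}}}\n}'
new = r'model_snapshot={\n "models"\: {"mnist"\: {"1.0"\: {"marName"\: "mnist.mar", "runtimeType"\: "python"}}}\n}'

old_cfg = parse_model_snapshot(old)["models"]["mnist"]["1.0"]
new_cfg = parse_model_snapshot(new)["models"]["mnist"]["1.0"]
print(set(new_cfg) - set(old_cfg))  # → {'runtimeType'}
```

Diffing the decoded JSON this way makes the added field obvious without eyeballing the escaped one-liners.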
(serve230) ubuntu@ip-172-31-55-226:~/serve$ torchserve --version
TorchServe Version is 0.9.0
(serve230) ubuntu@ip-172-31-55-226:~/serve$ torchserve --model-store model_store
(serve230) ubuntu@ip-172-31-55-226:~/serve$ WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2024-02-08T04:47:13,959 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Loading snapshot serializer plugin...
2024-02-08T04:47:14,010 [WARN ] main org.pytorch.serve.util.ConfigManager - Your torchserve instance can access any URL to load models. When deploying to production, make sure to limit the set of allowed_urls in config.properties
2024-02-08T04:47:14,011 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2024-02-08T04:47:14,054 [INFO ] main org.pytorch.serve.metrics.configuration.MetricConfiguration - Successfully loaded metrics configuration from /home/ubuntu/serve/ts/configs/metrics.yaml
2024-02-08T04:47:14,168 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.9.0
TS Home: /home/ubuntu/serve
Current directory: /home/ubuntu/serve
Temp directory: /tmp
Metrics config path: /home/ubuntu/serve/ts/configs/metrics.yaml
Number of GPUs: 0
Number of CPUs: 16
Max heap size: 15820 M
Python executable: /home/ubuntu/miniconda3/envs/serve230/bin/python
Config file: logs/config/20240208043452822-shutdown.cfg
Inference address: http://127.0.0.1:8080
Management address: http://127.0.0.1:8081
Metrics address: http://127.0.0.1:8082
Model Store: /home/ubuntu/serve/model_store
Initial Models: mnist=mnist.mar
Log dir: /home/ubuntu/serve/logs
Metrics dir: /home/ubuntu/serve/logs
Netty threads: 0
Netty client threads: 0
Default workers per model: 16
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Limit Maximum Image Pixels: true
Prefer direct buffer: false
Allowed Urls: [file://.*|http(s)?://.*]
Custom python dependency for model allowed: false
Enable metrics API: true
Metrics mode: LOG
Disable system metrics: false
Workflow Store: /home/ubuntu/serve/model_store
CPP log config: N/A
Model config: N/A
2024-02-08T04:47:14,173 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Started restoring models from snapshot {
  "name": "20240208043452822-shutdown.cfg",
  "modelCount": 1,
  "created": 1707366892822,
  "models": {
    "mnist": {
      "1.0": {
        "defaultVersion": true,
        "marName": "mnist.mar",
        "minWorkers": 16,
        "maxWorkers": 16,
        "batchSize": 1,
        "maxBatchDelay": 100,
        "responseTimeout": 120
      }
    }
  }
}
2024-02-08T04:47:14,179 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Validating snapshot 20240208043452822-shutdown.cfg
2024-02-08T04:47:14,180 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Snapshot 20240208043452822-shutdown.cfg validated successfully
2024-02-08T04:47:14,259 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model mnist
2024-02-08T04:47:14,260 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model mnist
2024-02-08T04:47:14,260 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model mnist
2024-02-08T04:47:14,260 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model mnist loaded.
2024-02-08T04:47:14,260 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: mnist, count: 16
2024-02-08T04:47:14,268 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9004, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,270 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9006, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,270 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9000, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,267 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9001, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,272 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9008, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,268 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9003, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,270 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9005, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,273 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9009, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,268 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9002, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,273 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9007, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,281 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9010, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,282 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9013, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,284 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9011, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,282 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9014, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,284 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9015, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,285 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve230/bin/python, /home/ubuntu/serve/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9012, --metrics-config, /home/ubuntu/serve/ts/configs/metrics.yaml]
2024-02-08T04:47:14,287 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2024-02-08T04:47:14,989 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8080
2024-02-08T04:47:14,992 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2024-02-08T04:47:15,010 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081
2024-02-08T04:47:15,011 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2024-02-08T04:47:15,011 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
Model server started.
2024-02-08T04:47:15,825 [WARN ] pool-3-thread-1 org.pytorch.serve.metrics.MetricCollector - worker pid is not available yet.
2024-02-08T04:47:15,880 [INFO ] pool-3-thread-1 TS_METRICS - CPUUtilization.Percent:100.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
2024-02-08T04:47:15,881 [INFO ] pool-3-thread-1 TS_METRICS - DiskAvailable.Gigabytes:382.98189544677734|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
2024-02-08T04:47:15,882 [INFO ] pool-3-thread-1 TS_METRICS - DiskUsage.Gigabytes:101.63203048706055|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
2024-02-08T04:47:15,882 [INFO ] pool-3-thread-1 TS_METRICS - DiskUtilization.Percent:21.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
2024-02-08T04:47:15,882 [INFO ] pool-3-thread-1 TS_METRICS - MemoryAvailable.Megabytes:58821.6328125|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
2024-02-08T04:47:15,882 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUsed.Megabytes:3736.89453125|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
2024-02-08T04:47:15,883 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUtilization.Percent:7.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635
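The TS_METRICS payloads above follow a StatsD-like shape: `Name.Unit:value`, followed by one or more `|#key:value` dimension segments, then `,timestamp:<epoch>`. A small parser sketch under that assumption (the `parse_ts_metric` helper is illustrative, not part of TorchServe):

```python
def parse_ts_metric(line: str) -> dict:
    """Parse a TorchServe log-mode metric payload such as
    'CPUUtilization.Percent:100.0|#Level:Host|#hostname:ip-...,timestamp:1707367635'.
    """
    head, _, ts = line.rpartition(",timestamp:")
    metric, *dim_parts = head.split("|#")          # value, then dimension segments
    name_unit, _, value = metric.rpartition(":")
    name, _, unit = name_unit.rpartition(".")
    dims = dict(p.split(":", 1) for p in dim_parts)
    return {"name": name, "unit": unit, "value": float(value),
            "dims": dims, "timestamp": int(ts)}

sample = "CPUUtilization.Percent:100.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367635"
m = parse_ts_metric(sample)
print(m["name"], m["value"], m["dims"]["hostname"])  # → CPUUtilization 100.0 ip-172-31-55-226
```

This is handy for turning the `Metrics mode: LOG` output into structured records for plotting or alerting.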
2024-02-08T04:47:16,026 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9000, pid=30719
2024-02-08T04:47:16,027 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2024-02-08T04:47:16,040 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,041 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - [PID]30719
2024-02-08T04:47:16,041 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,042 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,043 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9011, pid=30734
2024-02-08T04:47:16,044 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9011
2024-02-08T04:47:16,047 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9015, pid=30732
2024-02-08T04:47:16,048 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9015
2024-02-08T04:47:16,052 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2024-02-08T04:47:16,059 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,059 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - [PID]30734
2024-02-08T04:47:16,060 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,060 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,060 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9011
2024-02-08T04:47:16,061 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,061 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,061 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - [PID]30732
2024-02-08T04:47:16,061 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,061 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,061 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9015-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,062 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9015
2024-02-08T04:47:16,064 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,074 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9004, pid=30716
2024-02-08T04:47:16,077 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9014, pid=30731
2024-02-08T04:47:16,086 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9004
2024-02-08T04:47:16,087 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9014
2024-02-08T04:47:16,088 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,091 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,091 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - [PID]30716
2024-02-08T04:47:16,088 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9010, pid=30737
2024-02-08T04:47:16,091 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,090 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9009, pid=30733
2024-02-08T04:47:16,091 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9010
2024-02-08T04:47:16,091 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,092 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9009
2024-02-08T04:47:16,091 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - [PID]30731
2024-02-08T04:47:16,092 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9008, pid=30729
2024-02-08T04:47:16,091 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,094 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9004
2024-02-08T04:47:16,094 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,094 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9008
2024-02-08T04:47:16,094 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - [PID]30733
2024-02-08T04:47:16,095 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,095 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,095 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,095 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9009
2024-02-08T04:47:16,098 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,098 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,099 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9014-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,099 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9014
2024-02-08T04:47:16,102 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,103 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - [PID]30737
2024-02-08T04:47:16,103 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,103 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,103 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9010
2024-02-08T04:47:16,103 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,105 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,105 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9015.
2024-02-08T04:47:16,105 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9009.
2024-02-08T04:47:16,106 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - [PID]30729
2024-02-08T04:47:16,106 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,106 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,106 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,106 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9008
2024-02-08T04:47:16,105 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9000.
2024-02-08T04:47:16,107 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9004.
2024-02-08T04:47:16,108 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9011.
2024-02-08T04:47:16,112 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9010.
2024-02-08T04:47:16,114 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636114
2024-02-08T04:47:16,114 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636114
2024-02-08T04:47:16,115 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9014.
2024-02-08T04:47:16,114 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636114
2024-02-08T04:47:16,114 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636114
2024-02-08T04:47:16,114 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636114
2024-02-08T04:47:16,115 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636115
2024-02-08T04:47:16,114 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636114
2024-02-08T04:47:16,116 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9008.
2024-02-08T04:47:16,116 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636116
2024-02-08T04:47:16,117 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9006, pid=30717
2024-02-08T04:47:16,117 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9006
2024-02-08T04:47:16,119 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636119
2024-02-08T04:47:16,119 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636119
2024-02-08T04:47:16,120 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636119
2024-02-08T04:47:16,120 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636119
2024-02-08T04:47:16,120 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636119
2024-02-08T04:47:16,120 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636120
2024-02-08T04:47:16,120 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636119
2024-02-08T04:47:16,124 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636124
2024-02-08T04:47:16,125 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9003, pid=30721
2024-02-08T04:47:16,126 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9003
2024-02-08T04:47:16,129 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,129 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - [PID]30717
2024-02-08T04:47:16,129 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,129 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,129 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,129 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9006
2024-02-08T04:47:16,132 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9006.
2024-02-08T04:47:16,132 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636132
2024-02-08T04:47:16,133 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636133
2024-02-08T04:47:16,133 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,133 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - [PID]30721
2024-02-08T04:47:16,133 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,134 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,134 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,134 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9003
2024-02-08T04:47:16,135 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9003.
2024-02-08T04:47:16,135 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636135
2024-02-08T04:47:16,135 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636135
2024-02-08T04:47:16,168 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9013, pid=30740
2024-02-08T04:47:16,169 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9013
2024-02-08T04:47:16,180 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,180 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - [PID]30740
2024-02-08T04:47:16,180 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,181 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,181 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,181 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9013-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,181 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9013
2024-02-08T04:47:16,181 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,181 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,181 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,181 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,181 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,182 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,182 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,183 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,207 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9001, pid=30739
2024-02-08T04:47:16,208 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636208
2024-02-08T04:47:16,208 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636208
2024-02-08T04:47:16,209 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9001
2024-02-08T04:47:16,214 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,208 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9013.
2024-02-08T04:47:16,222 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,223 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - [PID]30739
2024-02-08T04:47:16,224 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,224 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,224 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2024-02-08T04:47:16,224 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,237 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,238 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9001.
2024-02-08T04:47:16,238 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636238
2024-02-08T04:47:16,238 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636238
2024-02-08T04:47:16,247 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9005, pid=30736
2024-02-08T04:47:16,248 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9005
2024-02-08T04:47:16,261 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,262 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - [PID]30736
2024-02-08T04:47:16,262 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,262 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,263 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,263 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9005
2024-02-08T04:47:16,264 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,275 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9005.
2024-02-08T04:47:16,275 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636275
2024-02-08T04:47:16,276 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636276
2024-02-08T04:47:16,276 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9012, pid=30730
2024-02-08T04:47:16,277 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9012
2024-02-08T04:47:16,286 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9007, pid=30735
2024-02-08T04:47:16,287 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9007
2024-02-08T04:47:16,291 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,292 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - [PID]30730
2024-02-08T04:47:16,292 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,292 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9012-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,292 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,292 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9012
2024-02-08T04:47:16,300 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml.
2024-02-08T04:47:16,300 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - [PID]30735
2024-02-08T04:47:16,301 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:47:16,301 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:47:16,301 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:47:16,301 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9007
2024-02-08T04:47:16,301 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:47:16,311 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636311
2024-02-08T04:47:16,311 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636311
2024-02-08T04:47:16,311 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9012.
2024-02-08T04:47:16,342 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9007.
2024-02-08T04:47:16,342 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636342
2024-02-08T04:47:16,342 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636342
2024-02-08T04:47:16,343 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:47:16,344 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9002, pid=30738 | |
2024-02-08T04:47:16,345 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9002 | |
2024-02-08T04:47:16,359 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/serve/ts/configs/metrics.yaml. | |
2024-02-08T04:47:16,359 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - [PID]30738 | |
2024-02-08T04:47:16,360 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:47:16,360 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:47:16,360 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9002 | |
2024-02-08T04:47:16,361 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:47:16,363 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:47:16,383 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9002. | |
2024-02-08T04:47:16,383 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1707367636383 | |
2024-02-08T04:47:16,384 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1707367636384 | |
2024-02-08T04:47:16,426 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:47:17,924 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,925 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,938 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,939 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,941 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,941 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,944 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,944 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,967 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,968 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,976 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,977 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,984 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:17,986 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:17,988 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:17,990 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:17,993 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1873 | |
2024-02-08T04:47:17,993 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1874 | |
2024-02-08T04:47:17,994 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:17,994 [INFO ] W-9008-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3724.0|#WorkerName:W-9008-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367637 | |
2024-02-08T04:47:17,997 [INFO ] W-9008-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:8.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367637 | |
2024-02-08T04:47:17,995 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:17,997 [INFO ] W-9009-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3726.0|#WorkerName:W-9009-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367637 | |
2024-02-08T04:47:17,998 [INFO ] W-9009-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:10.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367637 | |
2024-02-08T04:47:18,002 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,003 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1884 | |
2024-02-08T04:47:18,003 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,003 [INFO ] W-9011-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3732.0|#WorkerName:W-9011-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,003 [INFO ] W-9011-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:5.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,026 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1906 | |
2024-02-08T04:47:18,026 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,026 [INFO ] W-9010-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3755.0|#WorkerName:W-9010-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,027 [INFO ] W-9010-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:7.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,027 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,037 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,037 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1902 | |
2024-02-08T04:47:18,037 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,038 [INFO ] W-9003-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3771.0|#WorkerName:W-9003-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,038 [INFO ] W-9003-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:1.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,043 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,043 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1834 | |
2024-02-08T04:47:18,043 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9013-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,043 [INFO ] W-9013-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3766.0|#WorkerName:W-9013-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,044 [INFO ] W-9013-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:2.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,046 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,046 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1926 | |
2024-02-08T04:47:18,046 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9015-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,047 [INFO ] W-9015-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3769.0|#WorkerName:W-9015-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,047 [INFO ] W-9015-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:7.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,064 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,064 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,064 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,065 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,065 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,066 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,092 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,093 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1855 | |
2024-02-08T04:47:18,093 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,094 [INFO ] W-9001-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3829.0|#WorkerName:W-9001-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,094 [INFO ] W-9001-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:1.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,101 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,101 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,102 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1978 | |
2024-02-08T04:47:18,102 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,102 [INFO ] W-9004-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3836.0|#WorkerName:W-9004-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,102 [INFO ] W-9004-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:10.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,106 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1973 | |
2024-02-08T04:47:18,106 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,106 [INFO ] W-9006-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3839.0|#WorkerName:W-9006-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,106 [INFO ] W-9006-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:1.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,148 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,149 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,154 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,154 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,160 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,161 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 2041 | |
2024-02-08T04:47:18,161 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,161 [INFO ] W-9000-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3897.0|#WorkerName:W-9000-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,161 [INFO ] W-9000-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:6.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,164 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,164 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,166 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,166 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 2046 | |
2024-02-08T04:47:18,167 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9014-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,167 [INFO ] W-9014-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3889.0|#WorkerName:W-9014-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,167 [INFO ] W-9014-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:6.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,168 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,169 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,179 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,179 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1903 | |
2024-02-08T04:47:18,180 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,180 [INFO ] W-9005-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3914.0|#WorkerName:W-9005-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,180 [INFO ] W-9005-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:2.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,184 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,186 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1874 | |
2024-02-08T04:47:18,186 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9012-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,186 [INFO ] W-9012-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3914.0|#WorkerName:W-9012-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,187 [INFO ] W-9012-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:2.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,217 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,217 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,228 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,229 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1845 | |
2024-02-08T04:47:18,229 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,229 [INFO ] W-9002-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3964.0|#WorkerName:W-9002-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,229 [INFO ] W-9002-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:1.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,377 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - ONNX enabled | |
2024-02-08T04:47:18,377 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:47:18,388 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - '/tmp/models/6ba8c08204b84ca28ec6c7f27d1b9379/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:47:18,389 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 2047 | |
2024-02-08T04:47:18,389 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:47:18,389 [INFO ] W-9007-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:4122.0|#WorkerName:W-9007-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 | |
2024-02-08T04:47:18,389 [INFO ] W-9007-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:0.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707367638 |
torchserve --model-store model_store --models mnist=mnist.mar | |
(serve0.9.0) ubuntu@ip-172-31-55-226:~/serve$ WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance. | |
2024-02-08T04:30:45,182 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Loading snapshot serializer plugin... | |
2024-02-08T04:30:45,251 [WARN ] main org.pytorch.serve.util.ConfigManager - Your torchserve instance can access any URL to load models. When deploying to production, make sure to limit the set of allowed_urls in config.properties | |
2024-02-08T04:30:45,251 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager... | |
2024-02-08T04:30:45,314 [INFO ] main org.pytorch.serve.metrics.configuration.MetricConfiguration - Successfully loaded metrics configuration from /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml | |
2024-02-08T04:30:45,428 [INFO ] main org.pytorch.serve.ModelServer - | |
Torchserve version: 0.9.0 | |
TS Home: /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages | |
Current directory: /home/ubuntu/serve | |
Temp directory: /tmp | |
Metrics config path: /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml | |
Number of GPUs: 0 | |
Number of CPUs: 16 | |
Max heap size: 15820 M | |
Python executable: /home/ubuntu/miniconda3/envs/serve0.9.0/bin/python | |
Config file: N/A | |
Inference address: http://127.0.0.1:8080 | |
Management address: http://127.0.0.1:8081 | |
Metrics address: http://127.0.0.1:8082 | |
Model Store: /home/ubuntu/serve/model_store | |
Initial Models: mnist=mnist.mar | |
Log dir: /home/ubuntu/serve/logs | |
Metrics dir: /home/ubuntu/serve/logs | |
Netty threads: 0 | |
Netty client threads: 0 | |
Default workers per model: 16 | |
Blacklist Regex: N/A | |
Maximum Response Size: 6553500 | |
Maximum Request Size: 6553500 | |
Limit Maximum Image Pixels: true | |
Prefer direct buffer: false | |
Allowed Urls: [file://.*|http(s)?://.*] | |
Custom python dependency for model allowed: false | |
Enable metrics API: true | |
Metrics mode: log | |
Disable system metrics: false | |
Workflow Store: /home/ubuntu/serve/model_store | |
Model config: N/A | |
2024-02-08T04:30:45,435 [INFO ] main org.pytorch.serve.ModelServer - Loading initial models: mnist.mar | |
2024-02-08T04:30:45,520 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model mnist | |
2024-02-08T04:30:45,520 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model mnist | |
2024-02-08T04:30:45,520 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model mnist loaded. | |
2024-02-08T04:30:45,520 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: mnist, count: 16 | |
2024-02-08T04:30:45,532 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9000, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,532 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9006, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,532 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9010, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,532 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9007, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,532 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9009, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,532 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9003, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,532 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9004, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,531 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9012, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,531 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9011, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,534 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9005, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,534 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9001, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,534 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9014, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,534 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9013, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,531 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9002, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,535 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel. | |
2024-02-08T04:30:45,533 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9015, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:45,533 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9008, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml] | |
2024-02-08T04:30:46,051 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8080 | |
2024-02-08T04:30:46,051 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel. | |
2024-02-08T04:30:46,062 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081 | |
2024-02-08T04:30:46,062 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2024-02-08T04:30:46,063 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
Model server started.
2024-02-08T04:30:46,852 [WARN ] pool-3-thread-1 org.pytorch.serve.metrics.MetricCollector - worker pid is not available yet.
2024-02-08T04:30:46,948 [INFO ] pool-3-thread-1 TS_METRICS - CPUUtilization.Percent:100.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:46,992 [INFO ] pool-3-thread-1 TS_METRICS - DiskAvailable.Gigabytes:382.11880111694336|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:46,993 [INFO ] pool-3-thread-1 TS_METRICS - DiskUsage.Gigabytes:102.49512481689453|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:46,993 [INFO ] pool-3-thread-1 TS_METRICS - DiskUtilization.Percent:21.1|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:46,993 [INFO ] pool-3-thread-1 TS_METRICS - MemoryAvailable.Megabytes:60458.7421875|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:46,993 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUsed.Megabytes:2140.98828125|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:46,994 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUtilization.Percent:4.5|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366646
2024-02-08T04:30:47,490 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9000, pid=8864
2024-02-08T04:30:47,503 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2024-02-08T04:30:47,485 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9001, pid=8881
2024-02-08T04:30:47,487 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9006, pid=8861
2024-02-08T04:30:47,504 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9001
2024-02-08T04:30:47,504 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9006
2024-02-08T04:30:47,513 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9013, pid=8883
2024-02-08T04:30:47,513 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9013
2024-02-08T04:30:47,518 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9015, pid=8878
2024-02-08T04:30:47,519 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9015
2024-02-08T04:30:47,537 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9002, pid=8885
2024-02-08T04:30:47,538 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9002
2024-02-08T04:30:47,539 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9010, pid=8862
2024-02-08T04:30:47,540 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9010
2024-02-08T04:30:47,544 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9009, pid=8863
2024-02-08T04:30:47,544 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9009
2024-02-08T04:30:47,547 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9004, pid=8880
2024-02-08T04:30:47,547 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9004
2024-02-08T04:30:47,553 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,554 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - [PID]8881
2024-02-08T04:30:47,554 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,554 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,554 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,559 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2024-02-08T04:30:47,560 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9014, pid=8886
2024-02-08T04:30:47,560 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9014
2024-02-08T04:30:47,561 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,562 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - [PID]8861
2024-02-08T04:30:47,562 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,563 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,563 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9006
2024-02-08T04:30:47,563 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,568 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,568 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - [PID]8864
2024-02-08T04:30:47,569 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,569 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,569 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,569 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000
2024-02-08T04:30:47,574 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9005, pid=8884
2024-02-08T04:30:47,577 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9006.
2024-02-08T04:30:47,577 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9000.
2024-02-08T04:30:47,578 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9001.
2024-02-08T04:30:47,577 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9005
2024-02-08T04:30:47,584 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,584 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - [PID]8883
2024-02-08T04:30:47,585 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,585 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9013-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,585 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,585 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9013
2024-02-08T04:30:47,586 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647586
2024-02-08T04:30:47,589 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647589
2024-02-08T04:30:47,589 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647589
2024-02-08T04:30:47,591 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,591 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - [PID]8878
2024-02-08T04:30:47,591 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,591 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,591 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9015-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,591 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9015
2024-02-08T04:30:47,593 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647593
2024-02-08T04:30:47,594 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9013.
2024-02-08T04:30:47,596 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9015.
2024-02-08T04:30:47,596 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647596
2024-02-08T04:30:47,600 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9011, pid=8882
2024-02-08T04:30:47,600 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9011
2024-02-08T04:30:47,600 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9012, pid=8877
2024-02-08T04:30:47,601 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9012
2024-02-08T04:30:47,609 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,610 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - [PID]8885
2024-02-08T04:30:47,611 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,611 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,611 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,611 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9002
2024-02-08T04:30:47,622 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,622 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9007, pid=8876
2024-02-08T04:30:47,622 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9003, pid=8887
2024-02-08T04:30:47,622 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9002.
2024-02-08T04:30:47,622 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9008, pid=8879
2024-02-08T04:30:47,623 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,623 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9007
2024-02-08T04:30:47,623 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9008
2024-02-08T04:30:47,623 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9003
2024-02-08T04:30:47,623 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647623
2024-02-08T04:30:47,623 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - [PID]8862
2024-02-08T04:30:47,623 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,623 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,624 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9010
2024-02-08T04:30:47,624 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,624 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,623 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - [PID]8863
2024-02-08T04:30:47,624 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,624 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - [PID]8880
2024-02-08T04:30:47,624 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,624 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,624 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,625 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9004
2024-02-08T04:30:47,625 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9009
2024-02-08T04:30:47,624 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,625 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,625 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,633 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,633 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - [PID]8886
2024-02-08T04:30:47,633 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,633 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,633 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,633 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9014-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,640 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,640 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,640 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9014
2024-02-08T04:30:47,640 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,641 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,649 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,649 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647649
2024-02-08T04:30:47,649 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - [PID]8884
2024-02-08T04:30:47,649 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9004.
2024-02-08T04:30:47,650 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,650 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,650 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9010.
2024-02-08T04:30:47,650 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9005
2024-02-08T04:30:47,650 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,650 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647650
2024-02-08T04:30:47,668 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9014.
2024-02-08T04:30:47,671 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647671
2024-02-08T04:30:47,671 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,671 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9005.
2024-02-08T04:30:47,671 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647671
2024-02-08T04:30:47,668 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9009.
2024-02-08T04:30:47,672 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,675 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,675 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - [PID]8877
2024-02-08T04:30:47,676 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,676 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9012-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,676 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,676 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,676 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9012
2024-02-08T04:30:47,677 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - [PID]8882
2024-02-08T04:30:47,677 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647677
2024-02-08T04:30:47,677 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,677 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,677 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,677 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9011
2024-02-08T04:30:47,679 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,680 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - [PID]8879
2024-02-08T04:30:47,680 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,680 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,680 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9008
2024-02-08T04:30:47,680 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,687 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,688 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - [PID]8876
2024-02-08T04:30:47,695 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:30:47,702 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,702 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,703 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,706 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647706
2024-02-08T04:30:47,706 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9012.
2024-02-08T04:30:47,706 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,706 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - [PID]8887
2024-02-08T04:30:47,695 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,706 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647706
2024-02-08T04:30:47,706 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:30:47,703 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647703
2024-02-08T04:30:47,706 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:30:47,703 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9008.
2024-02-08T04:30:47,707 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9003
2024-02-08T04:30:47,703 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9011.
2024-02-08T04:30:47,711 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,711 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9007
2024-02-08T04:30:47,712 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:30:47,737 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,752 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,755 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647755
2024-02-08T04:30:47,756 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9003.
2024-02-08T04:30:47,756 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9007.
2024-02-08T04:30:47,756 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366647755
2024-02-08T04:30:47,755 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,779 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:47,784 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1
2024-02-08T04:30:48,657 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,657 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,661 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,662 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,662 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,662 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,670 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,671 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,672 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,673 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,674 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,674 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,687 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,688 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,692 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,698 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,715 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,715 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,729 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,730 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,730 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1089
2024-02-08T04:30:48,730 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,730 [INFO ] W-9002-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3204.0|#WorkerName:W-9002-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,731 [INFO ] W-9002-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:19.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,731 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,732 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,732 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1091
2024-02-08T04:30:48,732 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,732 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,732 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,732 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1099
2024-02-08T04:30:48,732 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9013-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,732 [INFO ] W-9001-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3207.0|#WorkerName:W-9001-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,733 [INFO ] W-9001-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:56.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,732 [INFO ] W-9013-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3203.0|#WorkerName:W-9013-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,733 [INFO ] W-9013-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:41.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,739 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,739 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,739 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,739 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1099
2024-02-08T04:30:48,741 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,742 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,745 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,741 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,745 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1108
2024-02-08T04:30:48,745 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9015-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,746 [INFO ] W-9015-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3216.0|#WorkerName:W-9015-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,746 [INFO ] W-9015-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:42.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,745 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,746 [INFO ] W-9006-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3219.0|#WorkerName:W-9006-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,746 [INFO ] W-9006-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:58.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,749 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,749 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1125
2024-02-08T04:30:48,749 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,749 [INFO ] W-9000-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3225.0|#WorkerName:W-9000-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,749 [INFO ] W-9000-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:35.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,751 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,751 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,755 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,756 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,756 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1088
2024-02-08T04:30:48,756 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,756 [INFO ] W-9004-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3229.0|#WorkerName:W-9004-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,756 [INFO ] W-9004-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:19.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,765 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,765 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,765 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1091
2024-02-08T04:30:48,765 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,765 [INFO ] W-9010-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3237.0|#WorkerName:W-9010-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,765 [INFO ] W-9010-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:24.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,772 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,772 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,772 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1069
2024-02-08T04:30:48,772 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,772 [INFO ] W-9009-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3244.0|#WorkerName:W-9009-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,773 [INFO ] W-9009-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:33.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,779 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,779 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,781 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,782 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,782 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1080
2024-02-08T04:30:48,782 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9014-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,782 [INFO ] W-9014-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3253.0|#WorkerName:W-9014-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,783 [INFO ] W-9014-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:26.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,783 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,783 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,784 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-02-08T04:30:48,784 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-02-08T04:30:48,787 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,787 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,787 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1085
2024-02-08T04:30:48,787 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
2024-02-08T04:30:48,787 [INFO ] W-9005-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3260.0|#WorkerName:W-9005-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,787 [INFO ] W-9005-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:31.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648
2024-02-08T04:30:48,795 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name.
2024-02-08T04:30:48,796 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true
2024-02-08T04:30:48,796 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1059 | |
2024-02-08T04:30:48,796 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9012-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:30:48,796 [INFO ] W-9012-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3267.0|#WorkerName:W-9012-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,797 [INFO ] W-9012-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:32.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,800 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:30:48,801 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:30:48,801 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1045 | |
2024-02-08T04:30:48,801 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:30:48,801 [INFO ] W-9008-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3273.0|#WorkerName:W-9008-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,802 [INFO ] W-9008-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:51.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,805 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:30:48,806 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:30:48,806 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 996 | |
2024-02-08T04:30:48,806 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:30:48,807 [INFO ] W-9003-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3281.0|#WorkerName:W-9003-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,807 [INFO ] W-9003-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:56.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,820 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:30:48,821 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:30:48,825 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:30:48,825 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:30:48,832 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:30:48,832 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:30:48,832 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1080 | |
2024-02-08T04:30:48,832 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:30:48,833 [INFO ] W-9011-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3304.0|#WorkerName:W-9011-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,833 [INFO ] W-9011-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:50.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,837 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - '/tmp/models/77f809570e2241eeabe297f1de11b356/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:30:48,837 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:30:48,837 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1058 | |
2024-02-08T04:30:48,837 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:30:48,837 [INFO ] W-9007-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3310.0|#WorkerName:W-9007-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 | |
2024-02-08T04:30:48,837 [INFO ] W-9007-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:24.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366648 |
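The shutdown snapshots restored in the runs captured here are Java properties files whose `model_snapshot` value is JSON stored with `\:` and `\n` escapes (the restore output in the second log below prints the decoded form, and the `runtimeType` field is what differs between the two snapshot configs in this gist). The snippet below is a minimal sketch, not TorchServe code, of decoding such a line; the shortened sample string is a hypothetical stand-in for a real snapshot.

```python
import json

# Hypothetical, abbreviated model_snapshot line as it appears in a
# snapshot .cfg file (raw string: backslashes are literal characters).
sample = r'model_snapshot={\n "models"\: {\n "mnist"\: {\n "1.0"\: {\n "marName"\: "mnist.mar",\n "runtimeType"\: "python"\n }\n }\n }\n}'

def parse_snapshot(line: str) -> dict:
    """Decode the JSON value of a model_snapshot properties entry."""
    _, raw = line.split("=", 1)
    # Undo java.util.Properties-style escaping: "\:" -> ":" and "\n" -> newline.
    return json.loads(raw.replace(r"\:", ":").replace(r"\n", "\n"))

snap = parse_snapshot(sample)
print(snap["models"]["mnist"]["1.0"].get("runtimeType"))  # prints: python
```

Checking for the presence of `runtimeType` this way makes it easy to compare a snapshot written by one TorchServe version against a snapshot written by another.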
$ torchserve --model-store model_store
(serve0.9.0) ubuntu@ip-172-31-55-226:~/serve$ WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2024-02-08T04:33:41,779 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Loading snapshot serializer plugin...
2024-02-08T04:33:41,813 [WARN ] main org.pytorch.serve.util.ConfigManager - Your torchserve instance can access any URL to load models. When deploying to production, make sure to limit the set of allowed_urls in config.properties
2024-02-08T04:33:41,813 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2024-02-08T04:33:41,854 [INFO ] main org.pytorch.serve.metrics.configuration.MetricConfiguration - Successfully loaded metrics configuration from /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml
2024-02-08T04:33:41,956 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.9.0
TS Home: /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages
Current directory: /home/ubuntu/serve
Temp directory: /tmp
Metrics config path: /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml
Number of GPUs: 0
Number of CPUs: 16
Max heap size: 15820 M
Python executable: /home/ubuntu/miniconda3/envs/serve0.9.0/bin/python
Config file: logs/config/20240208043104864-shutdown.cfg
Inference address: http://127.0.0.1:8080
Management address: http://127.0.0.1:8081
Metrics address: http://127.0.0.1:8082
Model Store: /home/ubuntu/serve/model_store
Initial Models: mnist=mnist.mar
Log dir: /home/ubuntu/serve/logs
Metrics dir: /home/ubuntu/serve/logs
Netty threads: 0
Netty client threads: 0
Default workers per model: 16
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Limit Maximum Image Pixels: true
Prefer direct buffer: false
Allowed Urls: [file://.*|http(s)?://.*]
Custom python dependency for model allowed: false
Enable metrics API: true
Metrics mode: log
Disable system metrics: false
Workflow Store: /home/ubuntu/serve/model_store
Model config: N/A
2024-02-08T04:33:41,961 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Started restoring models from snapshot {
  "name": "20240208043104864-shutdown.cfg",
  "modelCount": 1,
  "created": 1707366664864,
  "models": {
    "mnist": {
      "1.0": {
        "defaultVersion": true,
        "marName": "mnist.mar",
        "minWorkers": 16,
        "maxWorkers": 16,
        "batchSize": 1,
        "maxBatchDelay": 100,
        "responseTimeout": 120
      }
    }
  }
}
2024-02-08T04:33:41,968 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Validating snapshot 20240208043104864-shutdown.cfg
2024-02-08T04:33:41,968 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Snapshot 20240208043104864-shutdown.cfg validated successfully
2024-02-08T04:33:42,043 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model mnist
2024-02-08T04:33:42,044 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model mnist
2024-02-08T04:33:42,044 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model mnist
2024-02-08T04:33:42,044 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model mnist loaded.
2024-02-08T04:33:42,044 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: mnist, count: 16
2024-02-08T04:33:42,053 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9000, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,056 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9002, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,056 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9006, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,056 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9001, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,056 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9007, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,051 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9003, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,051 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9004, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,057 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9008, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,056 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9005, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,070 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9010, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,070 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9009, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,103 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9011, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,124 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9012, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,126 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9013, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,129 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9014, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,165 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2024-02-08T04:33:42,187 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [/home/ubuntu/miniconda3/envs/serve0.9.0/bin/python, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/model_service_worker.py, --sock-type, unix, --sock-name, /tmp/.ts.sock.9015, --metrics-config, /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml]
2024-02-08T04:33:42,408 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8080
2024-02-08T04:33:42,409 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2024-02-08T04:33:42,423 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081
2024-02-08T04:33:42,423 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2024-02-08T04:33:42,424 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
Model server started.
2024-02-08T04:33:43,020 [WARN ] pool-3-thread-1 org.pytorch.serve.metrics.MetricCollector - worker pid is not available yet.
2024-02-08T04:33:43,068 [INFO ] pool-3-thread-1 TS_METRICS - CPUUtilization.Percent:100.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,069 [INFO ] pool-3-thread-1 TS_METRICS - DiskAvailable.Gigabytes:382.1188659667969|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,069 [INFO ] pool-3-thread-1 TS_METRICS - DiskUsage.Gigabytes:102.49505996704102|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,070 [INFO ] pool-3-thread-1 TS_METRICS - DiskUtilization.Percent:21.1|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,071 [INFO ] pool-3-thread-1 TS_METRICS - MemoryAvailable.Megabytes:60942.53515625|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,071 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUsed.Megabytes:1656.3125|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,072 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUtilization.Percent:3.7|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366823
2024-02-08T04:33:43,952 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9007, pid=12784
2024-02-08T04:33:43,953 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9007
2024-02-08T04:33:43,981 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9002, pid=12782
2024-02-08T04:33:43,982 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9002
2024-02-08T04:33:43,989 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9010, pid=12795
2024-02-08T04:33:43,990 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9010
2024-02-08T04:33:43,999 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9001, pid=12787
2024-02-08T04:33:44,000 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9001
2024-02-08T04:33:44,003 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9006, pid=12785
2024-02-08T04:33:44,003 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9006
2024-02-08T04:33:44,017 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:33:44,017 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9003, pid=12798
2024-02-08T04:33:44,018 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - [PID]12784
2024-02-08T04:33:44,018 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9003
2024-02-08T04:33:44,018 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:33:44,019 [DEBUG] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:33:44,019 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:33:44,024 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9007
2024-02-08T04:33:44,034 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9007.
2024-02-08T04:33:44,039 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9000, pid=12801
2024-02-08T04:33:44,040 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9005, pid=12790
2024-02-08T04:33:44,044 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9005
2024-02-08T04:33:44,044 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9000
2024-02-08T04:33:44,045 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9004, pid=12800
2024-02-08T04:33:44,046 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9004
2024-02-08T04:33:44,047 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9009, pid=12799
2024-02-08T04:33:44,047 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9009
2024-02-08T04:33:44,048 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9008, pid=12786
2024-02-08T04:33:44,048 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9008
2024-02-08T04:33:44,049 [INFO ] W-9007-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824049
2024-02-08T04:33:44,055 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:33:44,056 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - [PID]12782
2024-02-08T04:33:44,056 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:33:44,056 [DEBUG] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:33:44,056 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9002
2024-02-08T04:33:44,057 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:33:44,057 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9012, pid=12825
2024-02-08T04:33:44,059 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9012
2024-02-08T04:33:44,059 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9002.
2024-02-08T04:33:44,059 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:33:44,059 [INFO ] W-9002-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824059
2024-02-08T04:33:44,060 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - [PID]12795
2024-02-08T04:33:44,060 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:33:44,060 [DEBUG] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:33:44,060 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9010
2024-02-08T04:33:44,060 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:33:44,062 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9013, pid=12831
2024-02-08T04:33:44,063 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9013
2024-02-08T04:33:44,066 [INFO ] W-9010-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824066
2024-02-08T04:33:44,066 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9010.
2024-02-08T04:33:44,071 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:33:44,072 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - [PID]12787
2024-02-08T04:33:44,072 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:33:44,072 [DEBUG] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:33:44,072 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9001
2024-02-08T04:33:44,072 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:33:44,074 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml.
2024-02-08T04:33:44,074 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - [PID]12785
2024-02-08T04:33:44,075 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch worker started.
2024-02-08T04:33:44,075 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-mnist_1.0 State change null -> WORKER_STARTED
2024-02-08T04:33:44,075 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13
2024-02-08T04:33:44,075 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9006
2024-02-08T04:33:44,075 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,075 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,075 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,075 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9006 | |
2024-02-08T04:33:44,088 [INFO ] W-9001-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824088 | |
2024-02-08T04:33:44,088 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9006. | |
2024-02-08T04:33:44,088 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9001. | |
2024-02-08T04:33:44,089 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,089 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824089 | |
2024-02-08T04:33:44,089 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - [PID]12798 | |
2024-02-08T04:33:44,089 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,089 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,089 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,089 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9003 | |
2024-02-08T04:33:44,106 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9011, pid=12807 | |
2024-02-08T04:33:44,106 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9011 | |
2024-02-08T04:33:44,107 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,115 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,115 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,116 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9003. | |
2024-02-08T04:33:44,116 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,116 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - [PID]12800 | |
2024-02-08T04:33:44,115 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,116 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,116 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - [PID]12786 | |
2024-02-08T04:33:44,116 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,116 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,116 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824116 | |
2024-02-08T04:33:44,116 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,116 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,116 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,117 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9004 | |
2024-02-08T04:33:44,117 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,118 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - [PID]12801 | |
2024-02-08T04:33:44,118 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,118 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,118 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9000 | |
2024-02-08T04:33:44,119 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,119 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9008 | |
2024-02-08T04:33:44,122 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,129 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,123 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - [PID]12790 | |
2024-02-08T04:33:44,130 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - [PID]12799 | |
2024-02-08T04:33:44,131 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824131 | |
2024-02-08T04:33:44,131 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9008. | |
2024-02-08T04:33:44,131 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9004. | |
2024-02-08T04:33:44,139 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,131 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,139 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,139 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,139 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,139 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9009 | |
2024-02-08T04:33:44,139 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,139 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - [PID]12831 | |
2024-02-08T04:33:44,140 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,131 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824131 | |
2024-02-08T04:33:44,140 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,139 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,140 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - [PID]12825 | |
2024-02-08T04:33:44,140 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,140 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,131 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,139 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,140 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,140 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9013-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,140 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9012-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,140 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9005 | |
2024-02-08T04:33:44,140 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9012 | |
2024-02-08T04:33:44,140 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9013 | |
2024-02-08T04:33:44,149 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9014, pid=12833 | |
2024-02-08T04:33:44,157 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9009. | |
2024-02-08T04:33:44,164 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9014 | |
2024-02-08T04:33:44,165 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824165 | |
2024-02-08T04:33:44,165 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824165 | |
2024-02-08T04:33:44,166 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9005. | |
2024-02-08T04:33:44,157 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,158 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9000. | |
2024-02-08T04:33:44,166 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9012. | |
2024-02-08T04:33:44,165 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824165 | |
2024-02-08T04:33:44,165 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824165 | |
2024-02-08T04:33:44,164 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,165 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9013. | |
2024-02-08T04:33:44,164 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824158 | |
2024-02-08T04:33:44,169 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,170 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - [PID]12807 | |
2024-02-08T04:33:44,170 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,170 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,170 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9011 | |
2024-02-08T04:33:44,170 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,165 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,196 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9011. | |
2024-02-08T04:33:44,196 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,196 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824196 | |
2024-02-08T04:33:44,197 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,205 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,208 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,208 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,208 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - [PID]12833 | |
2024-02-08T04:33:44,209 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,209 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9014-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,209 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,209 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9014 | |
2024-02-08T04:33:44,217 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,217 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,217 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9014. | |
2024-02-08T04:33:44,218 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824218 | |
2024-02-08T04:33:44,242 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:44,256 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - s_name_part0=/tmp/.ts.sock, s_name_part1=9015, pid=12839 | |
2024-02-08T04:33:44,256 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Listening on port: /tmp/.ts.sock.9015 | |
2024-02-08T04:33:44,320 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Successfully loaded /home/ubuntu/miniconda3/envs/serve0.9.0/lib/python3.10/site-packages/ts/configs/metrics.yaml. | |
2024-02-08T04:33:44,320 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - [PID]12839 | |
2024-02-08T04:33:44,320 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Torch worker started. | |
2024-02-08T04:33:44,320 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9015-mnist_1.0 State change null -> WORKER_STARTED | |
2024-02-08T04:33:44,320 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Python runtime: 3.10.13 | |
2024-02-08T04:33:44,321 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /tmp/.ts.sock.9015 | |
2024-02-08T04:33:44,333 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Connection accepted: /tmp/.ts.sock.9015. | |
2024-02-08T04:33:44,333 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD to backend at: 1707366824333 | |
2024-02-08T04:33:44,355 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - model_name: mnist, batchSize: 1 | |
2024-02-08T04:33:45,116 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,117 [INFO ] W-9002-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,124 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,124 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,130 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,131 [INFO ] W-9007-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,132 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,133 [INFO ] W-9001-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,140 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,140 [INFO ] W-9010-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,153 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,158 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,172 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,173 [INFO ] W-9008-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,184 [INFO ] W-9006-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,188 [DEBUG] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,189 [INFO ] W-9006-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1072 | |
2024-02-08T04:33:45,224 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,224 [INFO ] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1060 | |
2024-02-08T04:33:45,224 [DEBUG] W-9008-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,224 [INFO ] W-9008-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3171.0|#WorkerName:W-9008-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,224 [INFO ] W-9008-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:33.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,229 [INFO ] W-9003-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,230 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,230 [INFO ] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1090 | |
2024-02-08T04:33:45,230 [DEBUG] W-9003-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,230 [INFO ] W-9003-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3181.0|#WorkerName:W-9003-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,231 [INFO ] W-9003-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:25.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,231 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,231 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,238 [INFO ] W-9004-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,238 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,238 [INFO ] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1073 | |
2024-02-08T04:33:45,239 [DEBUG] W-9004-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,239 [INFO ] W-9004-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3189.0|#WorkerName:W-9004-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,239 [INFO ] W-9004-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:35.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,241 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,241 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,244 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,244 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,251 [INFO ] W-9005-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,252 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,252 [INFO ] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1034 | |
2024-02-08T04:33:45,252 [DEBUG] W-9005-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,252 [INFO ] W-9005-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3202.0|#WorkerName:W-9005-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,252 [INFO ] W-9005-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:53.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,253 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,253 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,257 [INFO ] W-9012-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,257 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,257 [INFO ] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1061 | |
2024-02-08T04:33:45,257 [DEBUG] W-9012-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9012-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,257 [INFO ] W-9012-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3188.0|#WorkerName:W-9012-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,258 [INFO ] W-9012-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:32.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,258 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,258 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,263 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,263 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,264 [INFO ] W-9013-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,264 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,264 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,264 [INFO ] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1055 | |
2024-02-08T04:33:45,264 [DEBUG] W-9013-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9013-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,264 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,264 [INFO ] W-9013-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3145.0|#WorkerName:W-9013-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,265 [INFO ] W-9013-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:44.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,273 [INFO ] W-9011-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,273 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,273 [INFO ] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1055 | |
2024-02-08T04:33:45,273 [DEBUG] W-9011-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,273 [INFO ] W-9011-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3211.0|#WorkerName:W-9011-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,273 [INFO ] W-9011-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:22.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,278 [INFO ] W-9009-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,278 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,278 [INFO ] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1071 | |
2024-02-08T04:33:45,279 [DEBUG] W-9009-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,279 [INFO ] W-9009-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3223.0|#WorkerName:W-9009-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,279 [INFO ] W-9009-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:50.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,283 [INFO ] W-9014-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,283 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,283 [INFO ] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1033 | |
2024-02-08T04:33:45,283 [DEBUG] W-9014-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9014-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,283 [INFO ] W-9014-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3164.0|#WorkerName:W-9014-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,284 [INFO ] W-9014-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:33.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,288 [INFO ] W-9000-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,288 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,288 [INFO ] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 1091 | |
2024-02-08T04:33:45,288 [DEBUG] W-9000-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,288 [INFO ] W-9000-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3240.0|#WorkerName:W-9000-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,289 [INFO ] W-9000-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:33.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,330 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - proceeding without onnxruntime | |
2024-02-08T04:33:45,330 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - Torch TensorRT not enabled | |
2024-02-08T04:33:45,341 [INFO ] W-9015-mnist_1.0-stdout MODEL_LOG - '/tmp/models/fa5a752f9ba84c3cbfb6e6d94d90df3e/index_to_name.json' is missing. Inference output will not include class name. | |
2024-02-08T04:33:45,341 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - sent a reply, jobdone: true | |
2024-02-08T04:33:45,341 [INFO ] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 971 | |
2024-02-08T04:33:45,341 [DEBUG] W-9015-mnist_1.0 org.pytorch.serve.wlm.WorkerThread - W-9015-mnist_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED | |
2024-02-08T04:33:45,341 [INFO ] W-9015-mnist_1.0 TS_METRICS - WorkerLoadTime.Milliseconds:3214.0|#WorkerName:W-9015-mnist_1.0,Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 | |
2024-02-08T04:33:45,342 [INFO ] W-9015-mnist_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:38.0|#Level:Host|#hostname:ip-172-31-55-226,timestamp:1707366825 |