
@piercelamb
Created December 20, 2022 17:47
run_inference
from sagemaker.processing import ProcessingInput, ProcessingOutput, ScriptProcessor

# `config`, `get_role`, and SAGEMAKER_LOCAL_INFERENCE_DATA_DIR are defined elsewhere in the project.
if config.run_inference:
    role = get_role(config.execution_role)
    # Processor that runs inference.py inside the custom Docker image.
    processor = ScriptProcessor(
        command=['python3'],
        image_uri=config.docker_image_path,
        role=role.arn,
        instance_count=1,
        instance_type=config.preparation_instance,
        volume_size_in_gb=config.storage_size,
        max_runtime_in_seconds=config.preparation_runtime,
    )
    # S3 locations for the encoded test data (input) and the inference results (output).
    input_source_dir = f"s3://{config.bucket}/{config.s3_parent_dir}/data/prepared_data/{config.encoded_data_dir}/encoded_data/test/"
    output_source_dir = f"s3://{config.bucket}/{config.s3_parent_dir}/prepared_data/{config.encoded_data_dir}/inference_output"
    processor.run(
        code="/opt/ml/code/viso_ml_train/inference/inference.py",
        inputs=[ProcessingInput(
            source=input_source_dir,
            destination=SAGEMAKER_LOCAL_INFERENCE_DATA_DIR)],
        outputs=[ProcessingOutput(
            source='/opt/ml/processing/processed_data',
            destination=output_source_dir)],
    )
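
For context, here is a minimal sketch of the container-side contract that inference.py would follow: the ProcessingInput downloads the S3 test data into its destination directory inside the container, and anything the script writes under /opt/ml/processing/processed_data is uploaded to output_source_dir when the job completes. The paths and file handling below are assumptions for illustration, not the actual inference script.

# Hypothetical sketch of inference.py's I/O contract inside the processing container.
import json
import os

# Assumed to match SAGEMAKER_LOCAL_INFERENCE_DATA_DIR in the driver script.
INPUT_DIR = "/opt/ml/processing/input/data"
# Must match the ProcessingOutput source path so results are uploaded to S3.
OUTPUT_DIR = "/opt/ml/processing/processed_data"

def main():
    os.makedirs(OUTPUT_DIR, exist_ok=True)
    predictions = []
    for file_name in sorted(os.listdir(INPUT_DIR)):
        input_path = os.path.join(INPUT_DIR, file_name)
        # Hypothetical: load the encoded test example at input_path and run the model on it.
        predictions.append({"file": file_name, "prediction": None})
    # Everything written here is picked up by ProcessingOutput and pushed to output_source_dir.
    with open(os.path.join(OUTPUT_DIR, "predictions.json"), "w") as f:
        json.dump(predictions, f)

if __name__ == "__main__":
    main()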