```
$ cd $SRC
$ git clone https://github.com/grpc/grpc-java.git
Cloning into 'grpc-java'...
remote: Enumerating objects: 166, done.
remote: Counting objects: 100% (166/166), done.
remote: Compressing objects: 100% (121/121), done.
remote: Total 84096 (delta 66), reused 92 (delta 25), pack-reused 83930
Receiving objects: 100% (84096/84096), 31.18 MiB | 23.14 MiB/s, done.
Resolving deltas: 100% (38843/38843), done.
```
```
PROJECT_ROOT=$SRC/tensorflow-serve-client
$PROJECT_ROOT/src/main/proto/tensorflow_serving/
$PROJECT_ROOT/src/main/proto/tensorflow/core/lib/core/
$PROJECT_ROOT/src/main/proto/tensorflow/core/framework/
$PROJECT_ROOT/src/main/proto/tensorflow/core/protobuf/
$PROJECT_ROOT/src/main/proto/tensorflow/core/example/
```
Note that I peeled one directory layer off the source paths; this is necessary for the `import` statements inside those `.proto` files to resolve. I used rsync to copy only the files with a particular extension while keeping the directory structure.
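A sketch of such an rsync invocation, shown here with throwaway temp directories standing in for the real checkouts (the `--include='*/'` rule keeps the directory tree, `--include='*.proto'` selects the extension, and the trailing `--exclude='*'` drops everything else; the trailing slash on the source is what peels off the top directory layer):

```shell
# demo setup: a miniature tree standing in for $SRC/serving (hypothetical contents)
SRC=$(mktemp -d)
PROJECT_ROOT=$(mktemp -d)
mkdir -p "$SRC/serving/tensorflow_serving/apis"
touch "$SRC/serving/tensorflow_serving/apis/predict.proto" \
      "$SRC/serving/tensorflow_serving/apis/BUILD"

# copy only .proto files, keeping the directory structure
mkdir -p "$PROJECT_ROOT/src/main/proto/tensorflow_serving"
rsync -a --include='*/' --include='*.proto' --exclude='*' \
  "$SRC/serving/tensorflow_serving/" \
  "$PROJECT_ROOT/src/main/proto/tensorflow_serving/"
```

The same filter rules work for each of the `tensorflow/core/...` subtrees listed above; only the source and target paths change.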
```
$ export SRC=~/Documents/source_code/
$ mkdir -p $SRC
$ cd $SRC
$ git clone git@github.com:tensorflow/serving.git
$ cd serving
$ git checkout tags/1.13.0
# another repo
$ cd $SRC
```
```xml
<properties>
  <grpc.version>1.20.0</grpc.version>
</properties>

<dependencies>
  <!-- gRPC protobuf client -->
  <dependency>
    <groupId>io.grpc</groupId>
    <artifactId>grpc-protobuf</artifactId>
    <version>${grpc.version}</version>
  </dependency>
</dependencies>

<build>
  <extensions>
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.6.2</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <!-- ... -->
    </plugin>
  </plugins>
</build>
```
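The `<plugin>` entry above is truncated in the gist. With the os-maven-plugin extension in place, the usual companion is the protobuf-maven-plugin, which consumes the `${os.detected.classifier}` property that the extension sets to pick the right protoc binary; a sketch, where the plugin version and protoc version are my assumptions, not taken from the original:

```xml
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <version>0.6.1</version>
  <configuration>
    <protocArtifact>com.google.protobuf:protoc:3.7.1:exe:${os.detected.classifier}</protocArtifact>
    <pluginId>grpc-java</pluginId>
    <pluginArtifact>io.grpc:protoc-gen-grpc-java:${grpc.version}:exe:${os.detected.classifier}</pluginArtifact>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>compile-custom</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this plugin bound to the build, Maven regenerates the message and stub classes on every `mvn compile`, which is an alternative to running protoc by hand as shown later in this gist.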
```
$SRC/serving/tensorflow_serving/
$SRC/tensorflow/tensorflow/core/lib/core/
$SRC/tensorflow/tensorflow/core/framework/
$SRC/tensorflow/tensorflow/core/protobuf/
$SRC/tensorflow/tensorflow/core/example/
```
```
$ cd $PROJECT_ROOT/src/main/proto/
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow/core/example/*.proto
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow_serving/apis/model_management.proto
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow_serving/apis/predict.proto
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow_serving/apis/model.proto
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow_serving/apis/regression.proto
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow_serving/apis/classification.proto
$ protoc --java_out $PROJECT_ROOT/src/main/java --proto_path ./ ./tensorflow_serving/apis/inference.proto
```
```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

String host = "localhost";
// gRPC port of TensorFlow Serving (8500 by default; 8501 is the REST API port)
int port = 8500;
// the model's name
String modelName = "cool_model";
// the model's version
long modelVersion = 123456789;
// assume this model takes free-text input and makes a sentiment prediction
String modelInput = "some text input to make prediction with";
// create a channel to the server
ManagedChannel channel = ManagedChannelBuilder
        .forAddress(host, port)
        .usePlaintext()
        .build();
```