@@ -25,7 +25,7 @@ of building Tensorflow Serving Docker image.
 ### Run container
 
 We build a base image `$USER/tensorflow-serving-devel` using
-[Dockerfile.devel](https://github.com/tensorflow/serving/tree/tensorflow_serving/tools/docker/Dockerfile.devel).
+[Dockerfile.devel](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/tools/docker/Dockerfile.devel).
 And then start a container locally using the built image.
 
 ```shell
@@ -36,7 +36,7 @@ $ docker run --name=inception_container -it $USER/tensorflow-serving-devel
 ### Clone, configure and build Tensorflow Serving in container
 
 In the running container, we clone, configure and build Tensorflow Serving.
-Then test run [inception_inference.cc](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_inference.cc).
+Then test run [inception_inference.cc](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_inference.cc).
 
 ```shell
 root@c97d8e820ced:/# git clone --recurse-submodules https://github.com/tensorflow/serving
@@ -54,7 +54,7 @@ E tensorflow_serving/example/inception_inference.cc:362] Usage: inception_infere
 ### Export Inception model in container
 
 In the running container, we run
-[inception_export.py](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_export.py)
+[inception_export.py](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_export.py)
 to export the inception model using the released
 [Inception model training checkpoint](http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz).
 Instead of training from scratch, we use the readily available checkpoints
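The export step above consumes the released checkpoint tarball linked in the hunk. A minimal Python sketch of just the unpack half of that step, assuming the archive has already been downloaded into the container; `extract_checkpoint` is an illustrative helper name, not a function from the tutorial:

```python
import os
import tarfile

def extract_checkpoint(tar_path, dest_dir):
    """Extract a gzipped checkpoint tarball (e.g. the released
    inception-v3-2016-03-01.tar.gz) into dest_dir and return the
    list of member names it contained."""
    os.makedirs(dest_dir, exist_ok=True)
    with tarfile.open(tar_path, "r:gz") as tar:
        members = tar.getnames()
        tar.extractall(dest_dir)
    return members
```

The exporter script is then pointed at the extracted checkpoint directory rather than training from scratch.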
@@ -105,7 +105,7 @@ root@f07eec53fd95:/serving# bazel-bin/tensorflow_serving/example/inception_infer
 
 ### Query the server
 
-Query the server with [inception_client.py](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_client.py).
+Query the server with [inception_client.py](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_client.py).
 The client sends an image specified by the command line parameter to the server
 over gRPC for classification. It then looks up the
 [ImageNet](http://www.image-net.org/) synset and metadata files and returns
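The client prints one `score : label` line per returned class, as in the sample transcript at the end of this diff. A small Python sketch of that output formatting, under the assumption that results arrive as (score, label) pairs; `format_topk` is an illustrative name, not a function in inception_client.py:

```python
def format_topk(scores_and_labels, k=5):
    """Sort (score, label) pairs by descending score and render the
    top k as 'score : label' lines, matching the sample output."""
    top = sorted(scores_and_labels, key=lambda p: p[0], reverse=True)[:k]
    return ["%f : %s" % (score, label) for score, label in top]
```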
@@ -194,7 +194,7 @@ along with an
 [External Load Balancer](http://kubernetes.io/docs/user-guide/load-balancer/).
 
 We create them using the example Kubernetes config
-[inception_k8s.json](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_k8s.json).
+[inception_k8s.json](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_k8s.json).
 
 ```shell
 $ kubectl create -f tensorflow_serving/example/inception_k8s.json
@@ -255,9 +255,7 @@ IP address is listed next to LoadBalancer Ingress.
 We can now query the service at its external address from our local host.
 
 ```shell
-$ bazel-bin/tensorflow_serving/example/inception_client \
-  --server=146.148.88.232:9000
-  --image=/path/to/my_cat_image.jpg
+$ bazel-bin/tensorflow_serving/example/inception_client --server=146.148.88.232:9000 --image=/path/to/my_cat_image.jpg
 8.976576 : tabby, tabby cat
 8.725506 : Egyptian cat
 6.883981 : tiger cat