
Commit b0c6793

Merge pull request #20 from fangweili/master
update serving_inception.md
2 parents 94559ea + 5b23f65

1 file changed: +6 -8 lines


tensorflow_serving/g3doc/serving_inception.md

Lines changed: 6 additions & 8 deletions
@@ -25,7 +25,7 @@ of building Tensorflow Serving Docker image.
 ### Run container
 
 We build a based image `$USER/tensorflow-serving-devel` using
-[Dockerfile.devel](https://github.com/tensorflow/serving/tree/tensorflow_serving/tools/docker/Dockerfile.devel).
+[Dockerfile.devel](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/tools/docker/Dockerfile.devel).
 And then start a container locally using the built image.
 
 ```shell
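For orientation, the build-and-run step that this hunk's corrected link points at looks roughly like the sketch below. Only the `docker run` line is taken from the next hunk's context; the `docker build` flags are an assumption, not part of this commit.

```shell
# Build the development image from Dockerfile.devel (exact flags are an assumption).
$ docker build -t $USER/tensorflow-serving-devel \
    -f tensorflow_serving/tools/docker/Dockerfile.devel .
# Start a container from the built image (this line appears as diff context below).
$ docker run --name=inception_container -it $USER/tensorflow-serving-devel
```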
@@ -36,7 +36,7 @@ $ docker run --name=inception_container -it $USER/tensorflow-serving-devel
 ### Clone, configure and build Tensorflow Serving in container
 
 In the running container, we clone, configure and build Tensorflow Serving.
-Then test run [inception_inference.cc](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_inference.cc).
+Then test run [inception_inference.cc](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_inference.cc).
 
 ```shell
 root@c97d8e820ced:/# git clone --recurse-submodules https://github.com/tensorflow/serving
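As a rough sketch of the clone-configure-build sequence this hunk refers to: only the `git clone` line is in the diff context; the configure step and Bazel target below are assumptions about the surrounding tutorial text.

```shell
# Configure the bundled TensorFlow checkout, then build the example binaries.
root@c97d8e820ced:/# cd serving/tensorflow
root@c97d8e820ced:/serving/tensorflow# ./configure
root@c97d8e820ced:/serving/tensorflow# cd ..
# The exact Bazel target is an assumption.
root@c97d8e820ced:/serving# bazel build tensorflow_serving/example/...
```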
@@ -54,7 +54,7 @@ E tensorflow_serving/example/inception_inference.cc:362] Usage: inception_infere
 ### Export Inception model in container
 
 In the running container, we run
-[inception_export.py](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_export.py)
+[inception_export.py](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_export.py)
 to export the inception model using the released
 [Inception model training checkpoint](http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz).
 Instead of training from scratch, we use the readily available checkpoints
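A minimal sketch of the export step described in this hunk, assuming the checkpoint tarball from the linked URL is downloaded and unpacked first; the `inception_export` flag names below are assumptions and may not match the script exactly.

```shell
# Download and unpack the released Inception v3 checkpoint (URL is from the hunk above).
root@c97d8e820ced:/tmp# curl -O http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz
root@c97d8e820ced:/tmp# tar xzf inception-v3-2016-03-01.tar.gz
# Export a servable model from the checkpoint; flag names here are assumptions.
root@c97d8e820ced:/serving# bazel-bin/tensorflow_serving/example/inception_export \
    --checkpoint_dir=/tmp/inception-v3 --export_dir=/tmp/inception-export
```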
@@ -105,7 +105,7 @@ root@f07eec53fd95:/serving# bazel-bin/tensorflow_serving/example/inception_infer
 
 ### Query the server
 
-Query the server with [inception_client.py](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_client.py).
+Query the server with [inception_client.py](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_client.py).
 The client sends an image specified by the command line parameter to the server
 over gRPC for classification. It then looks up the
 [ImageNet](http://www.image-net.org/) synset and metadata files and returns
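The local query mirrors the external one reworked in the last hunk of this commit; `localhost:9000` is an assumption for the in-container server address.

```shell
# Query the locally running inference server with an image over gRPC
# (server address is an assumption; the flags are taken from the final hunk).
$ bazel-bin/tensorflow_serving/example/inception_client --server=localhost:9000 --image=/path/to/my_cat_image.jpg
```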
@@ -194,7 +194,7 @@ along with an
 [External Load Balancer](http://kubernetes.io/docs/user-guide/load-balancer/).
 
 We create them using the example Kubernetes config
-[inception_k8s.json](https://github.com/tensorflow/serving/tree/tensorflow_serving/example/inception_k8s.json).
+[inception_k8s.json](https://github.com/tensorflow/serving/tree/master/tensorflow_serving/example/inception_k8s.json).
 
 ```shell
 $ kubectl create -f tensorflow_serving/example/inception_k8s.json
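To locate the external IP mentioned in the next hunk's header ("LoadBalancer Ingress"), something like the sketch below works; the service name `inception-service` is hypothetical and should be read from the actual kubectl output.

```shell
# List the resources created from inception_k8s.json and inspect the load balancer.
$ kubectl get services
# The external IP appears next to "LoadBalancer Ingress" (service name is hypothetical).
$ kubectl describe service inception-service
```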
@@ -255,9 +255,7 @@ IP address is listed next to LoadBalancer Ingress.
 We can now query the service at its external address from our local host.
 
 ```shell
-$ bazel-bin/tensorflow_serving/example/inception_client \
---server=146.148.88.232:9000
---image=/path/to/my_cat_image.jpg
+$ bazel-bin/tensorflow_serving/example/inception_client --server=146.148.88.232:9000 --image=/path/to/my_cat_image.jpg
 8.976576 : tabby, tabby cat
 8.725506 : Egyptian cat
 6.883981 : tiger cat
