Cannot run Docker with a model exported from GCP AutoML Vision

Problem description

I trained an image classification model with GCP AutoML Vision, and I want to deploy it into my own web application using Docker. Following GCP's tutorial, I exported my AutoML Vision model as saved_model.pb and copied it to my local drive. I then ran the container with:

sudo docker run --rm --name ${CONTAINER_NAME} -p ${PORT}:8501 -v ${YOUR_MODEL_PATH}:/tmp/mounted_model/0001 -t ${CPU_DOCKER_GCR_PATH}
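For context, the `${...}` variables in the command above have to be set beforehand. A minimal sketch with hypothetical placeholder values (the container name, model directory, and serving-image path below are illustrative, not the actual values from the GCP tutorial):

```shell
# Hypothetical placeholder values -- substitute your own container name,
# exported-model directory, and the CPU serving image path from GCP's tutorial.
CONTAINER_NAME="automl_serve"
PORT=8501
YOUR_MODEL_PATH="$(pwd)/model_dir"                      # must contain saved_model.pb
CPU_DOCKER_GCR_PATH="gcr.io/example/cpu-serving-image"  # hypothetical image path

# Print the fully expanded command as a dry run before executing it,
# so you can verify the mount path and image name are what you expect:
echo sudo docker run --rm --name "${CONTAINER_NAME}" \
  -p "${PORT}":8501 \
  -v "${YOUR_MODEL_PATH}":/tmp/mounted_model/0001 \
  -t "${CPU_DOCKER_GCR_PATH}"
```

Note that the model directory is mounted as version `0001` inside the container, which is the versioned layout TensorFlow Serving expects.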

When I try to run the Docker image, I get an error. The error message is below:

2020-03-18 06:52:52.851811: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2020-03-18 06:52:52.851825: I tensorflow_serving/model_servers/server_core.cc:559]  (Re-)adding model: default
2020-03-18 06:52:52.859873: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: default version: 1}
2020-03-18 06:52:52.859923: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: default version: 1}
2020-03-18 06:52:52.859938: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: default version: 1}
2020-03-18 06:52:52.860387: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /tmp/mounted_model/0001
2020-03-18 06:52:52.860426: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /tmp/mounted_model/0001
2020-03-18 06:52:52.861256: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-03-18 06:52:52.861345: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:310] SavedModel load for tags { serve }; Status: fail. Took 916 microseconds.
2020-03-18 06:52:52.861357: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: default version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`

I did some research online, and the problem seems to be in the export step; GCP does not provide any options when exporting the model. I could really use some help, thank you all.

Tags: docker, tensorflow, google-cloud-automl, automl

Solution


It seems the model does not contain a graph matching the serve tag.

I found a similar issue on the TensorFlow GitHub page. To inspect the available tag-sets in the saved model, you can use the SavedModel CLI (saved_model_cli):

$ saved_model_cli show --dir ./modelDir

I also found out how to add a serving tag to a model from TensorFlow Hub; it seems that using transfer learning can help you export or save a model with the serve tag.
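To illustrate what a correctly tagged export looks like, here is a minimal sketch (TF 2.x, not the AutoML export flow itself): `tf.saved_model.save()` writes the MetaGraph with the `serve` tag by default, so TensorFlow Serving can load the result. The `Doubler` module and the `./modelDir` path are hypothetical, stand-ins for your own model and export directory.

```python
import tensorflow as tf

# A trivial stand-in for a real model: doubles its float input.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x

model = Doubler()

# tf.saved_model.save() attaches the "serve" tag to the exported
# MetaGraph automatically, which is the tag TF Serving looks for.
tf.saved_model.save(model, "./modelDir")

# Sanity check: the directory now holds a loadable SavedModel.
print(tf.saved_model.contains_saved_model("./modelDir"))  # True
```

Running `saved_model_cli show --dir ./modelDir` against this export should list a tag-set containing `serve`, which is exactly what the failing AutoML export was missing.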
