TensorFlow study notes: the SavedModel format explained, and model loading and serving with TF Serving

發(fā)布時(shí)間:2023/12/13 综合教程 27 生活家
生活随笔 收集整理的這篇文章主要介紹了 tensorflow学习笔记-SavedModel文件解释及TFServing的模型加载、使用 小編覺(jué)得挺不錯(cuò)的,現(xiàn)在分享給大家,幫大家做個(gè)參考.

TensorFlow basic concepts: https://www.cnblogs.com/wanyu416/p/8954098.html (part of a series of articles)

Saving and loading a TensorFlow SavedModel: https://www.jianshu.com/p/83cfd9571158

How to load an offline model in TensorFlow: https://www.zhihu.com/question/300914772

Cross-platform deployment of TensorFlow models: https://zhuanlan.zhihu.com/p/40481765

TensorFlow program structure: http://c.biancheng.net/view/1883.html

The SavedModel format: https://www.tensorflow.org/guide/saved_model

A SavedModel is a directory containing serialized signatures and the state needed to run them, including variable values and vocabularies. The directory contains:

saved_model.pb stores the actual TensorFlow program (the model) together with a set of named signatures, each identifying a function that accepts tensor inputs and produces tensor outputs.
The variables directory contains a standard training checkpoint.
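
For reference, a freshly exported SavedModel directory typically looks like the following (the assets/ subdirectory appears only when the model depends on external files such as vocabulary lists):

foo/
├── saved_model.pb
├── assets/
└── variables/
    ├── variables.data-00000-of-00001
    └── variables.index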
Glossary:
signatures: the signatures saved with the SavedModel. Applies only to the "tf" save format; see tf.saved_model.save for details.
checkpoint: saving a model is not limited to after training finishes. You also need to save during training, because a TensorFlow training run can be interrupted, and you naturally want to keep the parameters trained so far rather than start over next time. Saving the model during training is conventionally called saving a checkpoint.
TensorFlow: adding checkpoints to a model with Checkpoint: https://www.cnblogs.com/baby-lily/p/10930591.html

Model loading in TF Serving:
tensorflow/tensorflow/cc/saved_model/loader.cc

/// Checks whether the provided directory could contain a SavedModel. Note that
/// the method does not load any data by itself. If the method returns `false`,
/// the export directory definitely does not contain a SavedModel. If the method
/// returns `true`, the export directory may contain a SavedModel but provides
/// no guarantee that it can be loaded.
bool MaybeSavedModelDirectory(const string& export_dir);

That is, the check inspects the directory structure without loading any data: a false return means the export directory definitely does not contain a SavedModel; a true return means it may contain one, but with no guarantee that it can actually be loaded.
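
As a sketch, this check can gate the more expensive load; the path models/my_model below is a placeholder:

#include <tensorflow/cc/saved_model/loader.h>

#include <iostream>
#include <string>

int main() {
    const std::string dir = "models/my_model";  // hypothetical export directory
    if (!tensorflow::MaybeSavedModelDirectory(dir)) {
        std::cerr << dir << " definitely does not contain a SavedModel\n";
        return 1;
    }
    // The directory may contain a SavedModel; only LoadSavedModel can confirm.
    return 0;
}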

/// Loads a SavedModel from the specified export directory. The meta graph def
/// to be loaded is identified by the supplied tags, corresponding exactly to
/// the set of tags used at SavedModel build time. Returns a SavedModel bundle
/// with a session and the requested meta graph def, if found.
Status LoadSavedModel(const SessionOptions& session_options,
                      const RunOptions& run_options, const string& export_dir,
                      const std::unordered_set<string>& tags,
                      SavedModelBundle* const bundle);

That is, it loads a SavedModel from the given export directory. The meta graph def to load is identified by the supplied tags, which must correspond exactly to the tag set used when the SavedModel was built; if found, it returns a SavedModel bundle containing a session and the requested meta graph def.
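
A minimal load sketch, assuming a model exported with the serve tag under the hypothetical path models/my_model:

#include <tensorflow/cc/saved_model/loader.h>
#include <tensorflow/cc/saved_model/tag_constants.h>
#include <tensorflow/core/public/session_options.h>

int main() {
    tensorflow::SavedModelBundle bundle;         // populated on success
    tensorflow::SessionOptions session_options;  // defaults: local TensorFlow runtime
    tensorflow::RunOptions run_options;          // defaults are fine here

    TF_CHECK_OK(tensorflow::LoadSavedModel(
        session_options, run_options, "models/my_model",
        {tensorflow::kSavedModelTagServe},  // must match the tags used at export time
        &bundle));

    // bundle.session can now serve inference via Run(); bundle.meta_graph_def
    // carries the named signatures.
    return 0;
}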

Example: https://gist.github.com/OneRaynyDay/c79346890dda095aecc6e9249a9ff3e1. The key calls are:
tensorflow::MaybeSavedModelDirectory
tensorflow::LoadSavedModel
bundle.session->Run

The full listing:
#include <tensorflow/cc/saved_model/loader.h>
#include <tensorflow/cc/saved_model/tag_constants.h>
#include <tensorflow/core/public/session_options.h>
#include <tensorflow/core/framework/tensor.h>

#include <xtensor/xarray.hpp>
#include <xtensor/xnpy.hpp>

#include <string>
#include <iostream>
#include <vector>
#include <cfloat>

static const int IMG_SIZE = 784;
static const int NUM_SAMPLES = 10000;

// Loads a .npy file of shape {NUM_SAMPLES, IMG_SIZE} into a DT_FLOAT tensor,
// copying element by element.
tensorflow::Tensor load_npy_img(const std::string& filename) {
    auto data = xt::load_npy<float>(filename);
    tensorflow::Tensor t(tensorflow::DT_FLOAT, tensorflow::TensorShape({NUM_SAMPLES, IMG_SIZE}));

    for (int i = 0; i < NUM_SAMPLES; i++)
        for (int j = 0; j < IMG_SIZE; j++)
            t.tensor<float, 2>()(i,j) = data(i, j);

    return t;
}

// Returns the size of each dimension of the tensor's shape.
std::vector<int> get_tensor_shape(const tensorflow::Tensor& tensor)
{
    std::vector<int> shape;
    auto num_dimensions = tensor.shape().dims();
    for(int i=0; i < num_dimensions; i++) {
        shape.push_back(tensor.shape().dim_size(i));
    }
    return shape;
}

// Prints every key in a signature map.
template <typename M>
void print_keys(const M& sig_map) {
    for (auto const& p : sig_map) {
        std::cout << "key : " << p.first << std::endl;
    }
}

// Returns true if key k is present in map m (helper; unused in this example).
template <typename K, typename M>
bool assert_in(const K& k, const M& m) {
    return m.find(k) != m.end();
}

std::string _input_name = "digits";
std::string _output_name = "predictions";

int main() {
    // This is passed into LoadSavedModel to be populated.
    tensorflow::SavedModelBundle bundle;

    // From docs: "If 'target' is empty or unspecified, the local TensorFlow runtime
    // implementation will be used.  Otherwise, the TensorFlow engine
    // defined by 'target' will be used to perform all computations."
    tensorflow::SessionOptions session_options;

    // Run option flags here: https://www.tensorflow.org/api_docs/python/tf/compat/v1/RunOptions
    // We don't need any of these yet.
    tensorflow::RunOptions run_options;

    // Fills in this from a session run call
    std::vector<tensorflow::Tensor> out;

    std::string dir = "pyfiles/foo";
    std::string npy_file = "pyfiles/data.npy";
    std::string prediction_npy_file = "pyfiles/predictions.npy";

    std::cout << "Found model: " << tensorflow::MaybeSavedModelDirectory(dir) << std::endl;
    // TF_CHECK_OK takes the status and checks whether it works.
    TF_CHECK_OK(tensorflow::LoadSavedModel(session_options,
                                           run_options,
                                           dir,
                                           // Refer to tag_constants. We just want to serve the model.
                                           {tensorflow::kSavedModelTagServe},
                                           &bundle));

    auto sig_map = bundle.meta_graph_def.signature_def();

    // Print the available signature keys; "serving_default" is the default
    // signature name assigned at export time.
    print_keys(sig_map);
    std::string sig_def = "serving_default";
    auto model_def = sig_map.at(sig_def);
    auto inputs = model_def.inputs().at(_input_name);
    auto input_name = inputs.name();
    auto outputs = model_def.outputs().at(_output_name);
    auto output_name = outputs.name();

    auto input = load_npy_img(npy_file);

    TF_CHECK_OK(bundle.session->Run({{input_name, input}},
                        {output_name},
                        {},
                        &out));
    std::cout << out[0].DebugString() << std::endl;

    auto res = out[0];
    auto shape = get_tensor_shape(res);
    // we only care about the first dimension of shape
    xt::xarray<float> predictions = xt::zeros<float>({shape[0]});
    for(int row = 0; row < shape[0]; row++) {
        float max = -FLT_MAX;  // lowest float; FLT_MIN is the smallest positive float, not the most negative
        int max_idx = -1;
        for(int col = 0; col < shape[1]; col++) {
            auto val = res.tensor<float, 2>()(row, col);
            if(max < val) {
                max_idx = col;
                max = val;
            }
        }
        predictions(row) = max_idx;
    }
    xt::dump_npy(prediction_npy_file, predictions);
}
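
Note that building this example requires linking against TensorFlow's C++ library; xtensor is a header-only library used here only for .npy input/output. The pyfiles/foo model directory and pyfiles/data.npy input are presumably produced by the companion Python script in the gist.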
