
Building MXNet from Source with VS2017 on Windows 10 (C++)

Published: 2023/11/27

MXNet is an open-source deep learning framework whose core is implemented in C++. The MXNet documentation recommends building with VS2015 or VS2017, because the source uses some C++14 features that VS2013 does not support. The steps for building with VS2017 are as follows:

1. Build OpenCV, version 3.4.2; for reference see https://blog.csdn.net/fengbingchun/article/details/78163217 , and note that the opencv_contrib module must be included;

2. Build OpenBLAS, version 0.3.3; for reference see: https://blog.csdn.net/fengbingchun/article/details/55509764 ;

3. Build dmlc-core, version 0.3;

4. Download mshadow; it must be the master branch rather than 1.0 or 1.1, because those versions' file names differ. It contains header files only;

5. Build tvm, version 0.4; when building MXNet, currently only the files in the c_api, core, and pass directories under nnvm/src are needed;

6. Build dlpack, latest master, commit bee4d1d;

7. Build MXNet, version 1.3.0;

8. Use mxnet/cpp-package/scripts/OpWrapperGenerator.py to generate the op.h file under mxnet/cpp-package/include/mxnet-cpp. The steps are:

(1). Copy the two dynamic libraries libmxnet.dll and libopenblas.dll from the lib/rel/x64 directory into mxnet/cpp-package/scripts/;

(2). Open a command prompt in the mxnet/cpp-package/scripts/ directory and run:

python OpWrapperGenerator.py libmxnet.dll

(3). Modify the two UpSampling functions in the generated op.h file: change the parameter scale to int scale, and change the parameter num_filter = 0 to int num_filter = 0. Note: when op.h is generated from source obtained directly with the following command, no modification is needed; the reason remains to be investigated:

git clone --recursive https://github.com/apache/incubator-mxnet

Notes:

(1). For an introduction to the dependency libraries used by MXNet and their usage, see: https://blog.csdn.net/fengbingchun/article/details/84981969

(2). To get the whole project to build cleanly, some of the source code was slightly adjusted;

(3). The versions of all project dependencies are as follows:

1. OpenBLAS: commit fd8d186, version 0.3.3, date 2018.08.31, url: https://github.com/xianyi/OpenBLAS/releases
2. dlpack: commit bee4d1d, version master, date 2018.08.24, url: https://github.com/dmlc/dlpack
3. mshadow: commit 2e3a895, version master, date 2018.11.08, url: https://github.com/dmlc/mshadow
4. dmlc-core: commit 85c1180, version 0.3, date 2018.07.18, url: https://github.com/dmlc/dmlc-core/releases
5. tvm: commit 60769b7, version 0.4, date 2018.09.04, url: https://github.com/dmlc/tvm/releases
6. HalideIR: commit a08e26e, version master, date 2018.11.28, url: https://github.com/dmlc/HalideIR
7. mxnet: commit b3be92f, version 1.3.0, date 2018.09.12, url: https://github.com/apache/incubator-mxnet/releases

(4). The whole project can be cloned from https://github.com/fengbingchun/MXNet_Test into the E:/GitCode directory and built directly.

The test code below uses the generated MXNet.dll dynamic library to train on MNIST:

#include "funset.hpp"
#include <chrono>
#include <map>
#include <string>
#include <fstream>
#include <vector>
#include "mxnet-cpp/MxNetCpp.h"

namespace {

bool isFileExists(const std::string &filename)
{
	std::ifstream fhandle(filename.c_str());
	return fhandle.good();
}

bool check_datafiles(const std::vector<std::string> &data_files)
{
	for (size_t index = 0; index < data_files.size(); index++) {
		if (!(isFileExists(data_files[index]))) {
			LG << "Error: File does not exist: " << data_files[index];
			return false;
		}
	}
	return true;
}

bool setDataIter(mxnet::cpp::MXDataIter *iter, std::string useType, const std::vector<std::string> &data_files, int batch_size)
{
	if (!check_datafiles(data_files)) return false;

	iter->SetParam("batch_size", batch_size);
	iter->SetParam("shuffle", 1);
	iter->SetParam("flat", 1);

	if (useType == "Train") {
		iter->SetParam("image", data_files[0]);
		iter->SetParam("label", data_files[1]);
	} else if (useType == "Label") {
		iter->SetParam("image", data_files[2]);
		iter->SetParam("label", data_files[3]);
	}

	iter->CreateDataIter();
	return true;
}

} // namespace

// mnist
/* reference: https://mxnet.incubator.apache.org/tutorials/c%2B%2B/basics.html
   mxnet_source/cpp-package/example/mlp_cpu.cpp
*/
namespace {

mxnet::cpp::Symbol mlp(const std::vector<int> &layers)
{
	auto x = mxnet::cpp::Symbol::Variable("X");
	auto label = mxnet::cpp::Symbol::Variable("label");

	std::vector<mxnet::cpp::Symbol> weights(layers.size());
	std::vector<mxnet::cpp::Symbol> biases(layers.size());
	std::vector<mxnet::cpp::Symbol> outputs(layers.size());

	for (size_t i = 0; i < layers.size(); ++i) {
		weights[i] = mxnet::cpp::Symbol::Variable("w" + std::to_string(i));
		biases[i] = mxnet::cpp::Symbol::Variable("b" + std::to_string(i));
		mxnet::cpp::Symbol fc = mxnet::cpp::FullyConnected(i == 0 ? x : outputs[i - 1], weights[i], biases[i], layers[i]);
		outputs[i] = i == layers.size() - 1 ? fc : mxnet::cpp::Activation(fc, mxnet::cpp::ActivationActType::kRelu);
	}

	return mxnet::cpp::SoftmaxOutput(outputs.back(), label);
}

} // namespace

int test_mnist_train()
{
	const int image_size = 28;
	const std::vector<int> layers{ 128, 64, 10 };
	const int batch_size = 100;
	const int max_epoch = 20;
	const float learning_rate = 0.1;
	const float weight_decay = 1e-2;

	std::vector<std::string> data_files = { "E:/GitCode/MXNet_Test/data/mnist/train-images.idx3-ubyte",
		"E:/GitCode/MXNet_Test/data/mnist/train-labels.idx1-ubyte",
		"E:/GitCode/MXNet_Test/data/mnist/t10k-images.idx3-ubyte",
		"E:/GitCode/MXNet_Test/data/mnist/t10k-labels.idx1-ubyte" };

	auto train_iter = mxnet::cpp::MXDataIter("MNISTIter");
	setDataIter(&train_iter, "Train", data_files, batch_size);

	auto val_iter = mxnet::cpp::MXDataIter("MNISTIter");
	setDataIter(&val_iter, "Label", data_files, batch_size);

	auto net = mlp(layers);

	mxnet::cpp::Context ctx = mxnet::cpp::Context::cpu();  // Use CPU for training

	std::map<std::string, mxnet::cpp::NDArray> args;
	args["X"] = mxnet::cpp::NDArray(mxnet::cpp::Shape(batch_size, image_size*image_size), ctx);
	args["label"] = mxnet::cpp::NDArray(mxnet::cpp::Shape(batch_size), ctx);
	// Let MXNet infer shapes of other parameters such as weights
	net.InferArgsMap(ctx, &args, args);

	// Initialize all parameters with uniform distribution U(-0.01, 0.01)
	auto initializer = mxnet::cpp::Uniform(0.01);
	for (auto& arg : args) {
		// arg.first is parameter name, and arg.second is the value
		initializer(arg.first, &arg.second);
	}

	// Create sgd optimizer
	mxnet::cpp::Optimizer* opt = mxnet::cpp::OptimizerRegistry::Find("sgd");
	opt->SetParam("rescale_grad", 1.0 / batch_size)->SetParam("lr", learning_rate)->SetParam("wd", weight_decay);

	// Create executor by binding parameters to the model
	auto *exec = net.SimpleBind(ctx, args);
	auto arg_names = net.ListArguments();

	// Start training
	for (int iter = 0; iter < max_epoch; ++iter) {
		int samples = 0;
		train_iter.Reset();

		auto tic = std::chrono::system_clock::now();
		while (train_iter.Next()) {
			samples += batch_size;
			auto data_batch = train_iter.GetDataBatch();
			// Set data and label
			data_batch.data.CopyTo(&args["X"]);
			data_batch.label.CopyTo(&args["label"]);

			// Compute gradients
			exec->Forward(true);
			exec->Backward();
			// Update parameters
			for (size_t i = 0; i < arg_names.size(); ++i) {
				if (arg_names[i] == "X" || arg_names[i] == "label") continue;
				opt->Update(i, exec->arg_arrays[i], exec->grad_arrays[i]);
			}
		}
		auto toc = std::chrono::system_clock::now();

		mxnet::cpp::Accuracy acc;
		val_iter.Reset();
		while (val_iter.Next()) {
			auto data_batch = val_iter.GetDataBatch();
			data_batch.data.CopyTo(&args["X"]);
			data_batch.label.CopyTo(&args["label"]);
			// Forward pass is enough as no gradient is needed when evaluating
			exec->Forward(false);
			acc.Update(data_batch.label, exec->outputs[0]);
		}

		float duration = std::chrono::duration_cast<std::chrono::milliseconds>(toc - tic).count() / 1000.0;
		LG << "Epoch: " << iter << " " << samples / duration << " samples/sec Accuracy: " << acc.Get();
	}

	std::string json_file{ "E:/GitCode/MXNet_Test/data/mnist.json" };
	std::string param_file{ "E:/GitCode/MXNet_Test/data/mnist.params" };
	net.Save(json_file);
	mxnet::cpp::NDArray::Save(param_file, exec->arg_arrays);

	delete exec;
	MXNotifyShutdown();
	return 0;
}

The execution results are as follows:

Notes on building the MXNet_Test project on Windows:

(1). Clone MXNet_Test into the E:/GitCode directory;

(2). MXNet_Test can be built with VS2017 on both Windows 7 x64 and Windows 10 x64;

(3). Because the Windows SDK version installed with VS2017 differs from machine to machine, an error like "error MSB8036: could not find Windows SDK version 10.0.17134.0" may occur. To fix it, select the affected project, open its property pages, and under Configuration Properties -> General -> Windows SDK Version re-select a Windows SDK version you actually have installed.

GitHub: https://github.com/fengbingchun/MXNet_Test
