

Intel RealSense D435 C/C++ code examples (with code) (pitfalls everywhere!!!) (fix for inverted colors in the OpenCV display)

Published: 2025/3/20 · c/c++ · 豆豆

https://dev.intelrealsense.com/docs/rs-hello-realsense
👆 The code at the link above is for reference only — don't use it as-is. Download the full source code instead 👇

First install the Intel RealSense SDK 2.0, then download the source code.

Contents

    • rs-hello-realsense (C++) — open the camera and print the depth distance at the image center
    • Distance (C) — test the depth stream (the SDK installer is missing example.h; it only ships with the source code download)
    • Color (C) — test the color stream
    • Capture (C++) — display the color and depth images (you even have to build one of the required libraries yourself; also, don't use the incomplete code on the website — use the code from the source download!)
    • imshow (C++)
      • Why OpenCV shows the colors inverted — cv::imshow expects BGR channel order while RealSense streams RGB8 by default; you can fix it at the source with RealSense's enable_stream (note RS2_FORMAT_BGR8)

rs-hello-realsense (C++) — open the camera and print the depth distance at the image center

Demonstrates the basics of connecting to a RealSense device and using depth data

realsense_test.cpp

#include <stdio.h>
#include <iostream>
#include <librealsense2/rs.hpp> // Include Intel RealSense Cross Platform API

int main()
{
    // Create a Pipeline - this serves as a top-level API for streaming and processing frames
    rs2::pipeline p;

    // Configure and start the pipeline
    p.start();

    // Block program until frames arrive
    rs2::frameset frames = p.wait_for_frames();

    // Try to get a frame of a depth image
    rs2::depth_frame depth = frames.get_depth_frame();

    // Get the depth frame's dimensions
    float width = depth.get_width();
    float height = depth.get_height();

    // Query the distance from the camera to the object in the center of the image
    float dist_to_center = depth.get_distance(width / 2, height / 2);

    // Print the distance
    std::cout << "The camera is facing an object " << dist_to_center << " meters away \r";

    return 0;
}

Output:

The camera is facing an object 3.602 meters away

Distance (C) — test the depth stream (the SDK installer is missing example.h — good grief, it isn't bundled with the installer; you have to download the source code to get this file)

Equivalent to hello-realsense but rewritten for C users

Intel® RealSense® SDK 2.0 (v2.49.0)

Just copy example.h into your project folder, next to the .c file.

Then it compiled and ran on the first try (well, almost — the realsense2.dll on my machine was version 2.48, but this build requires 2.49, so the first run failed; copying over the 2.49 realsense2.dll fixed it).

realsense_test.c

// License: Apache 2.0. See LICENSE file in root directory.
// Copyright(c) 2017 Intel Corporation. All Rights Reserved.

/* Include the librealsense C header files */
#include <librealsense2/rs.h>
#include <librealsense2/h/rs_pipeline.h>
#include <librealsense2/h/rs_option.h>
#include <librealsense2/h/rs_frame.h>
#include "example.h"

#include <stdlib.h>
#include <stdint.h>
#include <stdio.h>

// These parameters are reconfigurable                //
#define STREAM          RS2_STREAM_DEPTH  // rs2_stream is a type of data provided by RealSense device            //
#define FORMAT          RS2_FORMAT_Z16    // rs2_format identifies how binary data is encoded within a frame      //
#define WIDTH           640               // Defines the number of columns for each frame or zero for auto resolve//
#define HEIGHT          0                 // Defines the number of lines for each frame or zero for auto resolve  //
#define FPS             30                // Defines the rate of frames per second                                //
#define STREAM_INDEX    0                 // Defines the stream index, used for multiple streams of the same type //

int main()
{
    rs2_error* e = 0;

    // Create a context object. This object owns the handles to all connected realsense devices.
    // The returned object should be released with rs2_delete_context(...)
    rs2_context* ctx = rs2_create_context(RS2_API_VERSION, &e);
    check_error(e);

    /* Get a list of all the connected devices. */
    // The returned object should be released with rs2_delete_device_list(...)
    rs2_device_list* device_list = rs2_query_devices(ctx, &e);
    check_error(e);

    int dev_count = rs2_get_device_count(device_list, &e);
    check_error(e);
    printf("There are %d connected RealSense devices.\n", dev_count);
    if (0 == dev_count)
        return EXIT_FAILURE;

    // Get the first connected device
    // The returned object should be released with rs2_delete_device(...)
    rs2_device* dev = rs2_create_device(device_list, 0, &e);
    check_error(e);

    print_device_info(dev);

    // Create a pipeline to configure, start and stop camera streaming
    // The returned object should be released with rs2_delete_pipeline(...)
    rs2_pipeline* pipeline = rs2_create_pipeline(ctx, &e);
    check_error(e);

    // Create a config instance, used to specify hardware configuration
    // The returned object should be released with rs2_delete_config(...)
    rs2_config* config = rs2_create_config(&e);
    check_error(e);

    // Request a specific configuration
    rs2_config_enable_stream(config, STREAM, STREAM_INDEX, WIDTH, HEIGHT, FORMAT, FPS, &e);
    check_error(e);

    // Start the pipeline streaming
    // The returned object should be released with rs2_delete_pipeline_profile(...)
    rs2_pipeline_profile* pipeline_profile = rs2_pipeline_start_with_config(pipeline, config, &e);
    if (e)
    {
        printf("The connected device doesn't support depth streaming!\n");
        exit(EXIT_FAILURE);
    }

    while (1)
    {
        // This call waits until a new composite_frame is available.
        // composite_frame holds a set of frames; it is used to prevent frame drops.
        // The returned object should be released with rs2_release_frame(...)
        rs2_frame* frames = rs2_pipeline_wait_for_frames(pipeline, RS2_DEFAULT_TIMEOUT, &e);
        check_error(e);

        // Returns the number of frames embedded within the composite frame
        int num_of_frames = rs2_embedded_frames_count(frames, &e);
        check_error(e);

        int i;
        for (i = 0; i < num_of_frames; ++i)
        {
            // The returned object should be released with rs2_release_frame(...)
            rs2_frame* frame = rs2_extract_frame(frames, i, &e);
            check_error(e);

            // Check if the given frame can be extended to the depth frame interface.
            // Accept only depth frames and skip other frames
            // (release the frame before skipping so it isn't leaked)
            if (0 == rs2_is_frame_extendable_to(frame, RS2_EXTENSION_DEPTH_FRAME, &e))
            {
                rs2_release_frame(frame);
                continue;
            }

            // Get the depth frame's dimensions
            int width = rs2_get_frame_width(frame, &e);
            check_error(e);
            int height = rs2_get_frame_height(frame, &e);
            check_error(e);

            // Query the distance from the camera to the object in the center of the image
            float dist_to_center = rs2_depth_frame_get_distance(frame, width / 2, height / 2, &e);
            check_error(e);

            // Print the distance
            printf("The camera is facing an object %.3f meters away.\n", dist_to_center);

            rs2_release_frame(frame);
        }

        rs2_release_frame(frames);
    }

    // Stop the pipeline streaming
    rs2_pipeline_stop(pipeline, &e);
    check_error(e);

    // Release resources
    rs2_delete_pipeline_profile(pipeline_profile);
    rs2_delete_config(config);
    rs2_delete_pipeline(pipeline);
    rs2_delete_device(dev);
    rs2_delete_device_list(device_list);
    rs2_delete_context(ctx);

    return EXIT_SUCCESS;
}

Output:

Color (C) — test the color stream

Demonstrate how to stream color data and prints some frame information

realsense_test.c

// License: Apache 2.0. See LICENSE file in root directory.
// Copyright(c) 2017 Intel Corporation. All Rights Reserved.

/* Include the librealsense C header files */
#include <librealsense2/rs.h>
#include <librealsense2/h/rs_pipeline.h>
#include <librealsense2/h/rs_frame.h>

#include <stdlib.h>
#include <stdint.h>
#include <stdio.h>
#include "example.h"

// These parameters are reconfigurable          //
#define STREAM          RS2_STREAM_COLOR  // rs2_stream is a type of data provided by RealSense device            //
#define FORMAT          RS2_FORMAT_RGB8   // rs2_format identifies how binary data is encoded within a frame      //
#define WIDTH           640               // Defines the number of columns for each frame                         //
#define HEIGHT          480               // Defines the number of lines for each frame                           //
#define FPS             30                // Defines the rate of frames per second                                //
#define STREAM_INDEX    0                 // Defines the stream index, used for multiple streams of the same type //

int main()
{
    rs2_error* e = 0;

    // Create a context object. This object owns the handles to all connected realsense devices.
    // The returned object should be released with rs2_delete_context(...)
    rs2_context* ctx = rs2_create_context(RS2_API_VERSION, &e);
    check_error(e);

    /* Get a list of all the connected devices. */
    // The returned object should be released with rs2_delete_device_list(...)
    rs2_device_list* device_list = rs2_query_devices(ctx, &e);
    check_error(e);

    int dev_count = rs2_get_device_count(device_list, &e);
    check_error(e);
    printf("There are %d connected RealSense devices.\n", dev_count);
    if (0 == dev_count)
        return EXIT_FAILURE;

    // Get the first connected device
    // The returned object should be released with rs2_delete_device(...)
    rs2_device* dev = rs2_create_device(device_list, 0, &e);
    check_error(e);

    print_device_info(dev);

    // Create a pipeline to configure, start and stop camera streaming
    // The returned object should be released with rs2_delete_pipeline(...)
    rs2_pipeline* pipeline = rs2_create_pipeline(ctx, &e);
    check_error(e);

    // Create a config instance, used to specify hardware configuration
    // The returned object should be released with rs2_delete_config(...)
    rs2_config* config = rs2_create_config(&e);
    check_error(e);

    // Request a specific configuration
    rs2_config_enable_stream(config, STREAM, STREAM_INDEX, WIDTH, HEIGHT, FORMAT, FPS, &e);
    check_error(e);

    // Start the pipeline streaming
    // The returned object should be released with rs2_delete_pipeline_profile(...)
    rs2_pipeline_profile* pipeline_profile = rs2_pipeline_start_with_config(pipeline, config, &e);
    if (e)
    {
        printf("The connected device doesn't support color streaming!\n");
        exit(EXIT_FAILURE);
    }

    while (1)
    {
        // This call waits until a new composite_frame is available.
        // composite_frame holds a set of frames; it is used to prevent frame drops.
        // The returned object should be released with rs2_release_frame(...)
        rs2_frame* frames = rs2_pipeline_wait_for_frames(pipeline, RS2_DEFAULT_TIMEOUT, &e);
        check_error(e);

        // Returns the number of frames embedded within the composite frame
        int num_of_frames = rs2_embedded_frames_count(frames, &e);
        check_error(e);

        int i;
        for (i = 0; i < num_of_frames; ++i)
        {
            // The returned object should be released with rs2_release_frame(...)
            rs2_frame* frame = rs2_extract_frame(frames, i, &e);
            check_error(e);

            const uint8_t* rgb_frame_data = (const uint8_t*)(rs2_get_frame_data(frame, &e));
            check_error(e);

            unsigned long long frame_number = rs2_get_frame_number(frame, &e);
            check_error(e);

            rs2_time_t frame_timestamp = rs2_get_frame_timestamp(frame, &e);
            check_error(e);

            // Specifies the clock in relation to which the frame timestamp was measured
            rs2_timestamp_domain frame_timestamp_domain = rs2_get_frame_timestamp_domain(frame, &e);
            check_error(e);
            const char* frame_timestamp_domain_str = rs2_timestamp_domain_to_string(frame_timestamp_domain);

            rs2_metadata_type frame_metadata_time_of_arrival = rs2_get_frame_metadata(frame, RS2_FRAME_METADATA_TIME_OF_ARRIVAL, &e);
            check_error(e);

            printf("RGB frame arrived.\n");
            printf("First 10 bytes: ");
            int b; // separate index so it doesn't shadow the outer loop counter
            for (b = 0; b < 10; ++b)
                printf("%02x ", rgb_frame_data[b]);

            printf("\nFrame No: %llu\n", frame_number);
            printf("Timestamp: %f\n", frame_timestamp);
            printf("Timestamp domain: %s\n", frame_timestamp_domain_str);
            printf("Time of arrival: %lld\n\n", frame_metadata_time_of_arrival);

            rs2_release_frame(frame);
        }

        rs2_release_frame(frames);
    }

    // Stop the pipeline streaming
    rs2_pipeline_stop(pipeline, &e);
    check_error(e);

    // Release resources
    rs2_delete_pipeline_profile(pipeline_profile);
    rs2_delete_config(config);
    rs2_delete_pipeline(pipeline);
    rs2_delete_device(dev);
    rs2_delete_device_list(device_list);
    rs2_delete_context(ctx);

    return EXIT_SUCCESS;
}

Output:

Capture (C++) — display the color and depth images (ugh — one of the libraries the official sample links against has to be built by yourself! Also, don't use the incomplete code on the website — use the code from the source download!)

Shows how to synchronize and render multiple streams: left, right, depth and RGB streams

example.hpp can't simply be copied into the source folder; you have to reference it via an include path.

Ugh — errors everywhere.

So forget the installer's files and use the source code directly.


Little tip: a folder named include is very likely one you need to add to your include paths… 😅 — like this one.


經(jīng)過同事的點撥,終于知道為啥報錯了,因為還有一些關(guān)于其他比如opencv的頭文件、庫沒包含進(jìn)去。。。

要把這三文件里的內(nèi)容都給加進(jìn)去,為了方便,就不手動添加了,直接把這三加進(jìn)屬性表管理器里(添加順序沒有要求!)

然后編譯還是不行,報錯!

通過報錯的情況來看,應(yīng)該跟glfw這玩意有關(guān)

在SDK中找到第三方庫,發(fā)現(xiàn)這貨居然沒編譯(沒有生成.lib或.dll文件)

幸好它有工程項目文件,打開它直接用VS編譯

編譯后生成了glfw-imgui.lib文件

把它的文件名和它所在的目錄分別添加到屬性頁的輸入-附加依賴項和常規(guī)-附加庫目錄中


然后編譯運行程序,大功告成

// License: Apache 2.0. See LICENSE file in root directory.
// Copyright(c) 2017 Intel Corporation. All Rights Reserved.

#include <librealsense2/rs.hpp> // Include RealSense Cross Platform API
#include "example.hpp"          // Include short list of convenience functions for rendering
#include <iostream>             // For std::cerr in the catch blocks

// Capture Example demonstrates how to
// capture depth and color video streams and render them to the screen
int main(int argc, char* argv[]) try
{
    rs2::log_to_console(RS2_LOG_SEVERITY_ERROR);

    // Create a simple OpenGL window for rendering:
    window app(1280, 720, "RealSense Capture Example");

    // Declare depth colorizer for pretty visualization of depth data
    rs2::colorizer color_map;
    // Declare rates printer for showing streaming rates of the enabled streams.
    rs2::rates_printer printer;

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;

    // Start streaming with default recommended configuration.
    // The default video configuration contains Depth and Color streams.
    // If a device is capable of streaming IMU data, both Gyro and Accelerometer are enabled by default.
    pipe.start();

    while (app) // Application still alive?
    {
        rs2::frameset data = pipe.wait_for_frames().  // Wait for next set of frames from the camera
                             apply_filter(printer).   // Print each enabled stream frame rate
                             apply_filter(color_map); // Find and colorize the depth data

        // The show method, when applied on a frameset, breaks it into frames
        // and uploads each frame into a gl texture.
        // Each texture is displayed on a different viewport according to its stream's unique id.
        app.show(data);
    }

    return EXIT_SUCCESS;
}
catch (const rs2::error& e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}

Output:

imshow (C++)

// License: Apache 2.0. See LICENSE file in root directory.
// Copyright(c) 2017 Intel Corporation. All Rights Reserved.

#include <librealsense2/rs.hpp> // Include RealSense Cross Platform API
#include <opencv2/opencv.hpp>   // Include OpenCV API
#include <iostream>             // For std::cerr in the catch blocks

int main(int argc, char* argv[]) try
{
    // Declare depth colorizer for pretty visualization of depth data
    //rs2::colorizer color_map;

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;
    // Start streaming with default recommended configuration
    pipe.start();

    using namespace cv;
    const auto window_name = "Display Image";
    namedWindow(window_name, WINDOW_AUTOSIZE);

    while (waitKey(1) < 0 && getWindowProperty(window_name, WND_PROP_AUTOSIZE) >= 0)
    {
        rs2::frameset data = pipe.wait_for_frames(); // Wait for next set of frames from the camera
        //rs2::frame depth = data.get_depth_frame().apply_filter(color_map);
        rs2::frame color = data.get_color_frame();

        // Query frame size (width and height)
        //const int w = depth.as<rs2::video_frame>().get_width();
        const int w = color.as<rs2::video_frame>().get_width();
        //const int h = depth.as<rs2::video_frame>().get_height();
        const int h = color.as<rs2::video_frame>().get_height();

        // Create OpenCV matrix of size (w,h) from the color data
        //Mat image(Size(w, h), CV_8UC3, (void*)depth.get_data(), Mat::AUTO_STEP);
        Mat image(Size(w, h), CV_8UC3, (void*)color.get_data(), Mat::AUTO_STEP);

        // Update the window with new data
        imshow(window_name, image);
    }

    return EXIT_SUCCESS;
}
catch (const rs2::error& e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}


Why is the channel order reversed?

After adding a cvtColor call to swap the channels (RGB → BGR), it displays correctly:

// License: Apache 2.0. See LICENSE file in root directory.
// Copyright(c) 2017 Intel Corporation. All Rights Reserved.

#include <librealsense2/rs.hpp> // Include RealSense Cross Platform API
#include <opencv2/opencv.hpp>   // Include OpenCV API
#include <iostream>             // For std::cerr in the catch blocks

// Dontla 20210827 (cvtColor)
// Type the function name, right-click "Peek Definition" to see which directory its header
// lives in, add that directory to the Additional Include Directories, and #include it here too
#include <opencv2/imgproc.hpp>

int main(int argc, char* argv[]) try
{
    // Declare depth colorizer for pretty visualization of depth data
    //rs2::colorizer color_map;

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;
    // Start streaming with default recommended configuration
    pipe.start();

    using namespace cv;
    const auto window_name = "Display Image";
    namedWindow(window_name, WINDOW_AUTOSIZE);

    while (waitKey(1) < 0 && getWindowProperty(window_name, WND_PROP_AUTOSIZE) >= 0)
    {
        rs2::frameset data = pipe.wait_for_frames(); // Wait for next set of frames from the camera
        //rs2::frame depth = data.get_depth_frame().apply_filter(color_map);
        rs2::frame color = data.get_color_frame();

        // Query frame size (width and height)
        //const int w = depth.as<rs2::video_frame>().get_width();
        const int w = color.as<rs2::video_frame>().get_width();
        //const int h = depth.as<rs2::video_frame>().get_height();
        const int h = color.as<rs2::video_frame>().get_height();

        // Create OpenCV matrix of size (w,h) from the color data
        //Mat image(Size(w, h), CV_8UC3, (void*)depth.get_data(), Mat::AUTO_STEP);
        Mat image(Size(w, h), CV_8UC3, (void*)color.get_data(), Mat::AUTO_STEP);

        // Dontla 20210827: swap the channel order (RGB -> BGR) so imshow displays it correctly
        // (COLOR_RGB2BGR is the modern constant; the legacy CV_BGR2RGB macro does the same swap
        // but needs opencv2/imgproc/types_c.h in OpenCV 4)
        Mat output_image;
        cvtColor(image, output_image, COLOR_RGB2BGR);

        // Update the window with new data
        imshow(window_name, output_image);
    }

    return EXIT_SUCCESS;
}
catch (const rs2::error& e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}

Why does OpenCV show the colors inverted? It's not that OpenCV deliberately inverts images — cv::imshow expects pixels in BGR channel order, while the RealSense pipeline streams RGB8 by default, so red and blue end up swapped. Instead of converting every frame with cvtColor, you can request BGR data straight from the SDK with RealSense's enable_stream (note the RS2_FORMAT_BGR8):

rs2::config tempcfg;
tempcfg.enable_stream(RS2_STREAM_DEPTH, 640, 480, RS2_FORMAT_Z16, 30);
tempcfg.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_BGR8, 30);

Summary

That wraps up these Intel RealSense D435 C/C++ code examples (and the OpenCV inverted-color fix); hopefully they help you solve the problems you've run into.
