Category: openvino

Hello, I’m currently working on YOLOv5 with OpenVINO using C++. I got this error: "Unhandled exception thrown: write access violation. blob_data was 0x11102E62ECB8182". The type of blob_data is float*. I also had this message in the output: "Exception thrown at 0x00007FFF03654ED9 in ylov5_openvino.exe: Microsoft C++ exception: InferenceEngine::NotImplemented at memory location 0x000000BF095FBEC0". I used try{} catch{} ..

Read more

I tried to use the OpenVINO Inference Engine to accelerate my DL inference. It works with one image, but I want to create a batch of two images and then run inference on it. This is my code: InferenceEngine::Core core; InferenceEngine::CNNNetwork network = core.ReadNetwork("path/to/model.xml"); InferenceEngine::InputInfo::Ptr input_info = network.getInputsInfo().begin()->second; std::string input_name = network.getInputsInfo().begin()->first; InferenceEngine::DataPtr output_info = network.getOutputsInfo().begin()->second; std::string ..

Read more

I am creating an AppImage on my system for an application which uses libraries from OpenVINO. When deploying it on another system, it gives me a symbol lookup error in the OpenCV Mat() function. The target system has OpenVINO installed, but it is an older version than the one I used. So when running, the AppImage or AppRun from the AppDir folder uses ..

Read more

I have no problems when working with the dnn module, but I have downloaded OpenVINO to use dnn with the Inference Engine, and I can’t load the opencv_dnn452d.dll library. When I go to the opencv subdirectory in OpenVINO and execute opencv_version_win32d.exe, I get output saying that the Inference Engine has 3 backends (ONETBB, TBB and OPENM) ..

Read more

I use C++ OpenCV dnn to call the license-plate-recognition-barrier model provided by OpenVINO to recognize license plates. The width and height of the Mat returned by net.forward() are both -1. What is the reason for this? cv::dnn::Net net = cv::dnn::readNetFromModelOptimizer(lprModelXml, lprModelBin); net.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE); net.setPreferableTarget(cv::dnn::DNN_TARGET_CPU); cv::Mat img = cv::imread(imagePath); cv::Mat inputBlob = cv::dnn::blobFromImage(img, 0, cv::Size(94, 24), CV_8U); net.setInput(inputBlob, ..

Read more

I’m trying to run inference on the Intel Neural Compute Stick 2 (Myriad X chip) connected to a Raspberry Pi 4B using ONNX Runtime and OpenVINO. I have everything set up: the OpenVINO provider is recognized by onnxruntime, and I can see the Myriad in the list of available devices. However, I always get some kind of memory ..

Read more