libtorch initialize Adam Optimizer

  adam, c++, libtorch, pytorch, tensor

I am using libtorch to implement some optimization code in C++.
I need to create a tensor whose values lie in (-0.5, 0.5), to act as a jitter for another tensor.
I implemented it this way:

  1. I have two tensors (shape {8,3}) called vtxp_tensor and vtxc_tensor:

torch::Tensor vtxp_tensor = torch::from_blob(vtxp_data.data(), {8, 3}, torch::TensorOptions().dtype(torch::kFloat32)).requires_grad_(true);
torch::Tensor vtxc_tensor = torch::from_blob(vtxc_data.data(), {8, 3}, torch::TensorOptions().dtype(torch::kFloat32)).requires_grad_(true);
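As an aside, as far as I know torch::from_blob does not copy or take ownership of the buffer, so vtxp_data and vtxc_data have to outlive these tensors. If that were ever a concern, one could clone into tensor-owned storage instead; a sketch, assuming vtxp_data is a std::vector<float> with 8 * 3 = 24 elements:

// Copy the blob into memory owned by the tensor, then mark it trainable.
torch::Tensor vtxp_owned = torch::from_blob(vtxp_data.data(), {8, 3},
        torch::TensorOptions().dtype(torch::kFloat32))
    .clone()                // copy so the tensor owns its storage
    .requires_grad_(true);  // mark the copy as a trainable leaf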

  2. Create two tensors called vtx_pos_opt and vtx_col_opt:

auto vtx_pos_opt = torch::rand({8, 3}, torch::TensorOptions().dtype(torch::kFloat32).requires_grad(true)) - 0.5 + vtxp_tensor;

auto vtx_col_opt = torch::rand({8, 3}, torch::TensorOptions().dtype(torch::kFloat32).requires_grad(true));

std::vector<torch::Tensor> optim_tensors{vtx_pos_opt, vtx_col_opt};

  3. Initialize the Adam optimizer:

    auto optimizer = torch::optim::Adam(optim_tensors, torch::optim::AdamOptions(1e-2));
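For context, the loop I intend to run after this is the usual Adam loop (a sketch; compute_loss stands in for my real objective and is not a libtorch function):

// Hypothetical training loop; compute_loss is a placeholder for my objective.
for (int iter = 0; iter < 100; ++iter) {
    optimizer.zero_grad();                                    // reset accumulated grads
    torch::Tensor loss = compute_loss(vtx_pos_opt, vtx_col_opt);
    loss.backward();                                          // fill .grad on the parameters
    optimizer.step();                                         // apply the Adam update
}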

When the program reaches the Adam optimizer initialization line, it core dumps with an exception I can't figure out:
Exception: Exception 0xe06d7363 encountered at address 0x7ff82a154f69
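As far as I know, 0xE06D7363 is just the SEH code MSVC uses for any C++ exception, so wrapping the construction in a try/catch should surface the actual message. A sketch (assumes <torch/torch.h> and <iostream> are included):

// Catch the underlying C++ exception to see libtorch's real error message.
try {
    auto optimizer = torch::optim::Adam(optim_tensors, torch::optim::AdamOptions(1e-2));
} catch (const c10::Error& e) {
    std::cerr << "libtorch error: " << e.what() << std::endl;
} catch (const std::exception& e) {
    std::cerr << "std::exception: " << e.what() << std::endl;
}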

But if I just define vtx_pos_opt like this:

auto vtx_pos_opt = torch::rand({8, 3}, torch::TensorOptions().dtype(torch::kFloat32).requires_grad(true));
Then everything goes fine.

So what is different?
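My guess is that it has to do with whether the parameters are leaf tensors: the plain torch::rand result is a leaf, while rand - 0.5 + vtxp_tensor is the output of autograd ops and is not. A sketch of how I would probe that, plus one possible workaround (not verified):

// Check the leaf status of the candidate parameter.
std::cout << "is_leaf: " << vtx_pos_opt.is_leaf() << std::endl;

// Possible workaround: detach the jittered values from the autograd graph
// and re-mark the result as a trainable leaf.
auto vtx_pos_leaf = (torch::rand({8, 3}) - 0.5 + vtxp_tensor)
                        .detach()               // drop the autograd history
                        .requires_grad_(true);  // trainable leaf tensor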
