Wednesday, February 7, 2024

_projective plugins are available in Mitsuba 3.5 instead of _reparam plugins

I wrote about my problem with the _reparam plugins in Mitsuba 3 here two days ago.

https://geblog3.blogspot.com/2024/02/mitsuba3-removed-reparameterizations.html

After that, I realized that I had been reading the tutorial for the stable version while having installed the latest version of the library. The mismatch confused me.

While _reparam is still available in the stable release, version 3.4.1, the latest version, 3.5.0, replaces it with new plugins: the projective plugins. The latest tutorial already describes them.

https://mitsuba.readthedocs.io/en/latest/src/inverse_rendering/projective_sampling_integrators.html

Thanks to this tutorial, I solved my problem. These plugins let me optimize shapes toward the desired targets.
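As a rough illustration, the projective integrators in Mitsuba 3.5 are configured like any other integrator, via a plugin dictionary. This is a minimal sketch, assuming the plugin name "direct_projective" and the sample-count parameters described in the projective sampling tutorial; the specific values here are illustrative, not tuned:

```python
# Hedged sketch of a projective integrator configuration for Mitsuba 3.5.
# The parameter names (sppc/sppp/sppi) are taken from the projective sampling
# tutorial; check your installed version's docs if they differ.
integrator = {
    "type": "direct_projective",  # projective sampling of visibility discontinuities
    "sppc": 32,    # samples per pixel for the continuous derivative term
    "sppp": 32,    # samples for primary visibility boundaries
    "sppi": 128,   # samples for indirect visibility boundaries
}

# With Mitsuba installed, this would be loaded roughly as:
#   import mitsuba as mi
#   mi.set_variant("cuda_ad_rgb")
#   integ = mi.load_dict(integrator)
```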


Monday, February 5, 2024

mitsuba3 removed reparameterizations

I wrote a program for retargeting shapes using a reparameterization plugin in Mitsuba3, but it did not work. I only knew that the line using the reparameterization plugin stopped the program, but I did not understand what the problem was, because the program followed the official tutorial.

At first, I did not suspect that the official tutorial might be outdated. A day later, the suspicion resurfaced, and I eventually found a page confirming it. The official tutorial was indeed outdated.

https://github.com/mitsuba-renderer/mitsuba3/pull/997

According to this, "Reparameterizations were completely removed from the codebase." OK, goodbye. 

Now, I have to reconsider what I use for retargeting shapes. 

----------------

The problem was solved with _projective plugins. https://geblog3.blogspot.com/2024/02/projective-plugings-are-available-in.html

Wednesday, January 3, 2024

How I fixed libtorch's is_available() returning False

On Windows, I started using libtorch, the C++ API of PyTorch, the well-known machine learning library, but I keep running into puzzling issues.

The most recent one: even after setting up the library versions, drivers, and so on properly, the function that asks whether CUDA can be used,

torch::cuda::is_available()

returned 0 (false), so for some reason CUDA was judged unusable.

To be sure, I ran the following in Python:

import torch

torch.cuda.is_available()

and it returned True.

In other words, as far as torch is concerned, the machine's environment has CUDA properly set up, yet something goes wrong when going through libtorch.

Thinking about it, the DLLs might be resolving to different copies, which suggests that the DLLs inside the libtorch folder were not being found first.

So I checked the system environment variable Path, found that libtorch/lib was not on it, and added it at the front.
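The fix works because Windows resolves DLLs by scanning the directories on Path in order, so whichever directory containing a given DLL appears first wins. The diagnosis can be sketched in Python; find_dll_dirs is a hypothetical helper (not part of PyTorch or libtorch), and torch_cuda.dll is just an example DLL name:

```python
import os

def find_dll_dirs(dll_name, path=None):
    """Return the PATH entries containing dll_name, in search order.

    Windows loads the copy from the FIRST directory returned, so if a stale
    copy appears before libtorch\\lib, the wrong DLL gets loaded.
    """
    raw = path if path is not None else os.environ.get("PATH", "")
    hits = []
    for d in raw.split(os.pathsep):
        if d and os.path.isfile(os.path.join(d, dll_name)):
            hits.append(d)
    return hits

# Example usage: the first entry printed is the one Windows will actually use.
# print(find_dll_dirs("torch_cuda.dll"))
```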

As a result, it now correctly returned 1 (true), and furthermore, with

torch::DeviceType device_type = torch::kCUDA;

torch::Device device(device_type);

model->to(device);

std::cout << "Is module CUDA? " << model->parameters()[0].is_cuda() << std::endl;

I was able to confirm that the transfer to the GPU was also working.