
How to ensure Gazebo is utilizing a graphics card?

Okay, so I think I solved the problem. After having the graphics card driver reinstalled, it seems to function properly (gzclient and gzserver are both listed as processes on the GPU after running nvidia-smi). That said, it looks like only 8-9% of the GPU is being utilized (about 550 MiB / 7982 MiB of memory in use). I'm wondering if people have achieved better utilization of their GPU, or if this looks about right?
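For reference, here is a minimal sketch (not from the original post) of how that check can be scripted, assuming nvidia-smi is installed and on the PATH; it just runs nvidia-smi once and prints the lines that mention gzserver/gzclient plus the memory summary:

```python
#!/usr/bin/env python
# Minimal sketch (assumes nvidia-smi is on the PATH): run nvidia-smi once and
# print only the lines mentioning gzserver/gzclient plus the memory summary,
# to confirm Gazebo really ended up on the NVIDIA card.
import subprocess

output = subprocess.check_output(["nvidia-smi"], text=True)

for line in output.splitlines():
    if "gzserver" in line or "gzclient" in line or "MiB /" in line:
        print(line)
```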

Edit: I wanted to add that the whole reason for this question is that the RTF (real-time factor) of the simulation started out at 0.9 and would deteriorate to 0.1. I know that I'm adding models continuously, which definitely contributes to the problem, but I'm deleting them at the same time, so the performance shouldn't deteriorate that much over time. Is there something else I'm missing or could be improving on? I appreciate any thoughts.
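For illustration only (this is not from the original post): assuming a ROS 1 setup with the gazebo_ros plugins running, a spawn/delete cycle along these lines keeps every spawned model paired with a deletion by the same name, so models don't silently accumulate and drag the RTF down:

```python
#!/usr/bin/env python
# Illustrative sketch only (assumes ROS 1 with gazebo_ros running): spawn a
# small box and delete it again by the same name, so continuously added
# models do not accumulate in the world and degrade the real-time factor.
import rospy
from gazebo_msgs.srv import SpawnModel, DeleteModel
from geometry_msgs.msg import Pose

BOX_SDF = """<sdf version='1.6'><model name='box'>
  <link name='link'>
    <collision name='c'><geometry><box><size>0.1 0.1 0.1</size></box></geometry></collision>
    <visual name='v'><geometry><box><size>0.1 0.1 0.1</size></box></geometry></visual>
  </link>
</model></sdf>"""

rospy.init_node("spawn_delete_cycle")
rospy.wait_for_service("/gazebo/spawn_sdf_model")
rospy.wait_for_service("/gazebo/delete_model")
spawn = rospy.ServiceProxy("/gazebo/spawn_sdf_model", SpawnModel)
delete = rospy.ServiceProxy("/gazebo/delete_model", DeleteModel)

pose = Pose()
pose.orientation.w = 1.0  # identity orientation

for i in range(100):
    name = "box_%d" % i
    spawn(model_name=name, model_xml=BOX_SDF, robot_namespace="",
          initial_pose=pose, reference_frame="world")
    rospy.sleep(1.0)
    # Delete by the exact spawn name; if deletions fail, the world keeps
    # growing and the simulation slows down over time.
    delete(model_name=name)
```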

Asked by luna on 2020-08-17 16:29:25 UTC

Comments

Another note: I was previously using NoMachine so that I could see the graphics generated by the Ubuntu server I was running the simulation on. For some reason, running nvidia-smi while NoMachine was active showed lower GPU utilization (8-9%). Without NoMachine, the GPU utilization was about 25-35%, which seems much better, although I didn't notice a speed-up in the RTF of the simulation. My guess is that NoMachine also uses the GPU/CPU of whatever machine it runs on.
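To make that comparison easier to reproduce, a hedged sketch (again not part of the original thread, and assuming nvidia-smi is available) that logs overall GPU utilization and memory once per second, so runs with and without NoMachine can be compared side by side:

```python
#!/usr/bin/env python
# Hedged sketch (assumes nvidia-smi is on the PATH): poll overall GPU
# utilization and memory once per second, e.g. to compare a run with
# NoMachine attached against a run without it.
import subprocess
import time

while True:
    output = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    # One line per GPU; take the first GPU on a single-card machine.
    util, used, total = [v.strip() for v in output.splitlines()[0].split(",")]
    print("%s  util=%s%%  mem=%s/%s MiB"
          % (time.strftime("%H:%M:%S"), util, used, total))
    time.sleep(1.0)
```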

Asked by luna on 2020-08-19 14:11:12 UTC

I think that Gazebo uses the GPU only for rendering purposes, for example if you have a camera or laser in your setup, or simply to display the gzclient. But the physics engine itself uses only the CPU.

Asked by Clément Rolinat on 2020-08-26 03:17:34 UTC
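A hedged follow-up sketch on that point (not from the thread; it assumes a ROS 1 setup with gazebo_ros): the physics step settings Gazebo is running with can be read back through the /gazebo/get_physics_properties service, and time_step * max_update_rate gives the RTF the CPU-side physics loop is aiming for. If the measured RTF falls well below that target, the bottleneck is usually CPU-side physics or plugins rather than the GPU:

```python
#!/usr/bin/env python
# Hedged sketch (assumes ROS 1 with gazebo_ros): read back the physics
# settings Gazebo is running with; the target real-time factor is
# time_step * max_update_rate, and the physics loop runs entirely on the CPU.
import rospy
from gazebo_msgs.srv import GetPhysicsProperties

rospy.init_node("physics_check")
rospy.wait_for_service("/gazebo/get_physics_properties")
get_props = rospy.ServiceProxy("/gazebo/get_physics_properties",
                               GetPhysicsProperties)
props = get_props()

print("time_step:        %s" % props.time_step)
print("max_update_rate:  %s" % props.max_update_rate)
print("target RTF:       %s" % (props.time_step * props.max_update_rate))
```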
