
one_d_rpm breaks with different masses for HoverAviary #225

Open
KevinHan1209 opened this issue Jun 28, 2024 · 3 comments

Comments

@KevinHan1209

Hi,

I am running into an odd issue: if I change the mass by a small amount, say 0.05 kg, using the pybullet.changeDynamics() method inside the _housekeeping() method, training with one_d_rpm gets stuck at a very small reward and never improves. In some cases, during rendering/evaluation the drone just oscillates above and below 1 m. I have no idea why. one_d_pid does not seem to have this issue and works well. I'm also slightly confused about the 5% parameter used when converting actions to RPMs relative to the hover RPM. Does anyone have any experience with this?
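For context, here is my rough reading of how one_d_rpm actions become RPMs. This is a paraphrase rather than the actual source: the constants are the Crazyflie values from the URDF, and attribute names like HOVER_RPM and KF may differ between versions. The key point, if I'm reading it right, is that the hover RPM and the ±5% band around it are computed from the URDF mass at construction time and are never updated after changeDynamics():

```python
# Rough sketch of my understanding of the one_d_rpm mapping (paraphrased from
# BaseAviary / BaseRLAviary; exact names and values may differ by version).
import numpy as np

G = 9.8           # gravity [m/s^2]
M_URDF = 0.027    # Crazyflie mass as read from the URDF [kg]
KF = 3.16e-10     # thrust coefficient from the URDF [N / RPM^2]

# The hover RPM is derived from the *URDF* mass when the env is built.
HOVER_RPM = np.sqrt(G * M_URDF / (4 * KF))

def action_to_rpm(action):
    """one_d_rpm: a scalar action in [-1, 1] drives all four motors in a +/-5% band."""
    return HOVER_RPM * (1 + 0.05 * action)

# If the mass is later increased with p.changeDynamics() (e.g. +0.05 kg),
# the RPM actually required to hover moves far outside that band:
M_NEW = M_URDF + 0.05
RPM_NEEDED = np.sqrt(G * M_NEW / (4 * KF))

print(f"max reachable RPM  : {action_to_rpm(1.0):.0f}")
print(f"RPM needed to hover: {RPM_NEEDED:.0f}")
```

With these numbers the band tops out at 1.05 × HOVER_RPM, while a drone that is 0.05 kg heavier would need roughly 1.7 × HOVER_RPM just to hold altitude, so the policy could never produce enough thrust and the reward would stay near zero. If that reading is right, it might also explain why one_d_pid copes better: the controller's feedback can partially compensate for the mass mismatch.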

@abdul-mannan-khan commented Jul 18, 2024

Yes, there is a bug in HoverAviary.py: it only works for the given example. If you make small changes to the mass, the action, or even the goal point, it will not work. In particular, if you change the goal point from (0, 0, 1) to (1, 0, 1), training fails. I guess you have to modify the reward function.
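For what it's worth, in the version I looked at the goal is hard-coded as TARGET_POS = (0, 0, 1) inside HoverAviary, and the reward is just a clipped function of the distance to it. Also note that with the one_d_* action types all four rotors receive the same command, so the drone can only move vertically, and a goal like (1, 0, 1) is unreachable no matter how the reward is written. Below is a minimal, untested sketch of parameterizing the goal by subclassing instead of editing the file; GoalHoverAviary is my own name, and the attribute/method names follow my reading of the repo, so they may need adjusting:

```python
# Untested sketch: parameterize the goal instead of editing HoverAviary in place.
# Names follow my reading of gym-pybullet-drones and may differ by version.
import numpy as np
from gym_pybullet_drones.envs.HoverAviary import HoverAviary

class GoalHoverAviary(HoverAviary):
    def __init__(self, target_pos=(0.0, 0.0, 1.0), **kwargs):
        super().__init__(**kwargs)
        self.TARGET_POS = np.array(target_pos)  # overrides the hard-coded (0, 0, 1)

    def _computeReward(self):
        # Same shape as the stock reward in the version I looked at:
        # dense, peaks at the target, clipped at zero far away.
        state = self._getDroneStateVector(0)
        dist = np.linalg.norm(self.TARGET_POS - state[0:3])
        return max(0.0, 2.0 - dist**4)
```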

@KevinHan1209 (Author)

I don't think the policy or the environment (reward function) is the problem, but rather the simulation settings themselves. It doesn't make sense that a reward function would suddenly fail with marginal changes in mass. I suspect the PyBullet physics implementation may be wrong. I also saw that the thrust2weight ratio is set as a constant in the URDF file, which doesn't make sense to me either.
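If the problem really is the cached constants rather than the physics itself, one workaround I'm considering is re-deriving them right after changing the mass. This is an untested sketch; the attribute names (DRONE_IDS, CLIENT, G, M, KF, GRAVITY, HOVER_RPM) follow my reading of BaseAviary and may need adjusting for your version:

```python
# Untested sketch: add mass in _housekeeping() and keep the cached constants
# consistent, so the +/-5% action band is centred on the new hover RPM.
import numpy as np
import pybullet as p
from gym_pybullet_drones.envs.HoverAviary import HoverAviary

class HeavyHoverAviary(HoverAviary):
    EXTRA_MASS = 0.05  # kg added to the base link

    def _housekeeping(self):
        super()._housekeeping()
        # Add mass to the base link of drone 0.
        p.changeDynamics(self.DRONE_IDS[0], -1,
                         mass=self.M + self.EXTRA_MASS,
                         physicsClientId=self.CLIENT)
        # Read the mass back and re-derive the constants that depend on it.
        new_mass = p.getDynamicsInfo(self.DRONE_IDS[0], -1,
                                     physicsClientId=self.CLIENT)[0]
        self.GRAVITY = self.G * new_mass
        self.HOVER_RPM = np.sqrt(self.GRAVITY / (4 * self.KF))
```

As far as I can tell, the thrust2weight value from the URDF only feeds into the MAX_RPM/MAX_THRUST limits rather than into the forces actually applied, but I haven't verified that.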

@abdul-mannan-khan

I agree with you; I am facing the same problem. Given that the developers are busy, I think it would be better to write your own environment, perhaps with help from ChatGPT.
