About 3D-LiDAR Sensor #150
Comments
Hi,

I am working on implementing the orientation-related state processing described in your paper "You Are Not Alone: Towards Cleaning Robot Navigation in Shared Environments through Deep Reinforcement Learning." I am not sure whether there are any errors in the code I wrote and would like to confirm it with you. Thank you.
Hi,

```python
import numpy as np
from squaternion import Quaternion  # assuming the squaternion package


def get_sincos_dist(self, odom, goal_x, goal_y):
    # Calculate robot heading from odometry data
    odom_x = odom.pose.pose.position.x
    odom_y = odom.pose.pose.position.y
    quaternion = Quaternion(
        odom.pose.pose.orientation.w,
        odom.pose.pose.orientation.x,
        odom.pose.pose.orientation.y,
        odom.pose.pose.orientation.z,
    )
    euler = quaternion.to_euler(degrees=False)
    angle = round(euler[2], 4)  # yaw
    pose_vector = [np.cos(angle), np.sin(angle)]
    goal_vector = [goal_x - odom_x, goal_y - odom_y]
    distance = np.linalg.norm(goal_vector)
    cos, sin = self.cossin(pose_vector, goal_vector)
    return distance, cos, sin, angle


@staticmethod
def cossin(vec1, vec2):
    # Normalize both vectors, then take dot (cos) and 2D cross (sin)
    vec1 = vec1 / np.linalg.norm(vec1)
    vec2 = vec2 / np.linalg.norm(vec2)
    cos = np.dot(vec1, vec2)
    sin = np.cross(vec1, vec2).item()
    return cos, sin
```
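One quick way to sanity-check the cossin logic is a standalone sketch with plain NumPy (the input values below are hypothetical; the 2D cross product is written out explicitly, which is equivalent to `np.cross` on 2-vectors):

```python
import numpy as np

def cossin(vec1, vec2):
    # Normalize both vectors, then take dot (cos) and 2D cross (sin)
    vec1 = np.asarray(vec1, dtype=float)
    vec2 = np.asarray(vec2, dtype=float)
    vec1 = vec1 / np.linalg.norm(vec1)
    vec2 = vec2 / np.linalg.norm(vec2)
    cos = float(np.dot(vec1, vec2))
    sin = float(vec1[0] * vec2[1] - vec1[1] * vec2[0])  # 2D cross product
    return cos, sin

# Robot facing +x, goal directly to its left (+y): expect cos = 0, sin = 1
cos, sin = cossin([1.0, 0.0], [0.0, 2.0])
print(round(cos, 6), round(sin, 6))  # 0.0 1.0
```

A positive sin means the goal is to the robot's left, a negative sin to its right, which is what makes the (cos, sin) pair a useful orientation feature for the policy input.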
Thank you for your contribution!
I have two questions regarding the 3D LiDAR:
In the environment code, the point-cloud processing keeps only points whose Z value is greater than -0.2. If I raise the 3D LiDAR mount from 0.23 to 0.35, do I need to change the -0.2 to -0.3?
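On the first question: assuming the Z threshold is expressed in the sensor frame (i.e. the filter keeps points with z > -0.2 relative to the LiDAR, not the floor), raising the sensor by 0.12 m means the threshold must drop by the same amount to keep cutting at the same absolute height. A sketch of that arithmetic (variable names are hypothetical):

```python
# Hypothetical sketch: if the z filter is in the sensor frame, raising the
# sensor means lowering the threshold by the same amount so the cut stays
# at the same height above the floor.
old_sensor_height = 0.23
new_sensor_height = 0.35
old_z_threshold = -0.2  # points with z > -0.2 (sensor frame) are kept

# Absolute cut height above the floor stays fixed:
cut_height = old_sensor_height + old_z_threshold  # 0.03 m above the floor
new_z_threshold = cut_height - new_sensor_height
print(round(new_z_threshold, 2))  # -0.32
```

So under that assumption the new value would be -0.32 rather than -0.3.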
Regardless of whether the input covers 180° or 360°, by using the self.garps angle-range limits, the output is the 20 per-gap minimum values over the 180° directly in front. Is that correct?
I ask because I widened the 3D LiDAR's field of view to 360°, but I only need the minimum values for the front 180°.
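On the second question, the gap idea can be sketched independently of the repository's actual callback (the function name, the 20-bin count, and the 10.0 "no return" cap are assumptions here): bin each point's horizontal angle into 20 gaps spanning the front 180° and keep the per-gap minimum range, discarding points behind the robot, so a 360° input still reduces to front-180° minima.

```python
import numpy as np

def min_ranges_front(points, n_bins=20):
    # points: iterable of (x, y) in the sensor frame, +x pointing forward
    ranges = np.full(n_bins, 10.0)  # assumed "no return" cap value
    edges = np.linspace(-np.pi / 2, np.pi / 2, n_bins + 1)  # front 180 deg
    for x, y in points:
        angle = np.arctan2(y, x)
        if -np.pi / 2 <= angle <= np.pi / 2:  # drop points behind the robot
            b = min(np.searchsorted(edges, angle, side="right") - 1, n_bins - 1)
            ranges[b] = min(ranges[b], np.hypot(x, y))
    return ranges

# One obstacle straight ahead at 1 m, one behind (ignored):
r = min_ranges_front([(1.0, 0.0), (2.0, 0.0), (-1.0, 0.0)])
```

With this binning, the rear half of a 360° scan simply never lands in any gap, so widening the sensor's field of view would not change the 20 output values.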
I was reviewing the callback code and found that normalization is done using the angle between the robot's origin and each point in the cloud, so I wanted to confirm this.
Thank you.