Implement Ali's RWHE calibration method #939

Open
Kazadhum opened this issue May 3, 2024 · 10 comments

@Kazadhum
Collaborator

Kazadhum commented May 3, 2024

The idea is to implement the RWHE-Calib method for Hand-Eye calibration in ATOM.

Similarly to OpenCV, they have two different methods:

  • Hand-Eye, which solves the $AX=XB$ equation;
  • Robot-World/Hand-Eye, which solves the $AX=ZB$ equation.
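
For reference, OpenCV exposes these as two separate functions: cv2.calibrateHandEye for $AX=XB$ and cv2.calibrateRobotWorldHandEye for $AX=ZB$. Below is a minimal sketch (not ATOM code) of how they are called; it generates synthetic, consistent poses from made-up ground-truth transforms just so the calls run, and the argument names follow the OpenCV documentation.

# Rough sketch: the two OpenCV hand-eye entry points, exercised on synthetic data.
import cv2
import numpy as np

rng = np.random.default_rng(0)

def random_T():
    # Random rigid transform as a 4x4 homogeneous matrix.
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rng.uniform(-1.0, 1.0, 3))
    T[:3, 3] = rng.uniform(-0.5, 0.5, 3)
    return T

# Made-up ground-truth unknowns in OpenCV's AX = ZB convention (eye-in-hand):
# X = base->world (pattern) and Z = gripper->camera.
X_gt, Z_gt = random_T(), random_T()

R_world2cam, t_world2cam, R_base2gripper, t_base2gripper = [], [], [], []
R_gripper2base, t_gripper2base, R_target2cam, t_target2cam = [], [], [], []
for _ in range(10):
    B = random_T()                      # base->gripper for one robot pose
    A = Z_gt @ B @ np.linalg.inv(X_gt)  # world->camera consistent with A X = Z B
    R_world2cam.append(A[:3, :3])
    t_world2cam.append(A[:3, 3:])
    R_base2gripper.append(B[:3, :3])
    t_base2gripper.append(B[:3, 3:])
    Binv = np.linalg.inv(B)             # same poses in calibrateHandEye's convention
    R_gripper2base.append(Binv[:3, :3])
    t_gripper2base.append(Binv[:3, 3:])
    R_target2cam.append(A[:3, :3])
    t_target2cam.append(A[:3, 3:])

# Hand-Eye (AX = XB): estimates only the camera-to-gripper transform.
R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
    R_gripper2base, t_gripper2base, R_target2cam, t_target2cam)

# Robot-World/Hand-Eye (AX = ZB): estimates base->world and gripper->camera at once.
R_base2world, t_base2world, R_gripper2cam, t_gripper2cam = cv2.calibrateRobotWorldHandEye(
    R_world2cam, t_world2cam, R_base2gripper, t_base2gripper)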

Am I correct in assuming we only want the second, @miguelriemoliveira? I remember we spoke about this on Monday because of the OpenCV calibration and, if I recall correctly, we landed on just using RWHE.

By the way, ATOM has a script called "convert_from_rwhe_dataset.py". Does that script refer to this method?

@Kazadhum self-assigned this May 3, 2024
@Kazadhum added the enhancement (New feature or request) label May 3, 2024
@miguelriemoliveira
Member

Am I correct in assuming we only want the second, @miguelriemoliveira? I remember we spoke about this on Monday because of the OpenCV calibration and, if I recall correctly, we landed on just using RWHE.

I think the hand-eye calibration should solve both variants, eye-in-hand and eye-to-world. In any case, you can use Robot-World/Hand-Eye; I think it's better.

I have been fighting with OpenCV's method and still could not make it work.

By the way, ATOM has a script called "convert_from_rwhe_dataset.py". Does that script refer to this method?

Yes, we used Ali's method, but in the MATLAB script.

@Kazadhum
Collaborator Author

Kazadhum commented May 3, 2024

Hi @miguelriemoliveira!

I think the hand-eye calibration should solve both variants, eye-in-hand and eye-to-world. In any case, you can use Robot-World/Hand-Eye; I think it's better.

Ok, that sounds good.

I have been fighting with OpenCV's method and still could not make it work.

If you want, I could join you on Zoom after lunch to try to help. Have you made any progress, or is it as we left it?

Yes, we used Ali's method, but in the MATLAB script.

Got it! These scripts will help a lot, I think.

@miguelriemoliveira
Member

If you want, I could join you on Zoom after lunch to try to help. Have you made any progress, or is it as we left it?

I changed a lot, trying to clean up the code. It did not help.

I will call if I have some time this afternoon.

@Kazadhum
Collaborator Author

Kazadhum commented May 3, 2024

Ok, sounds good!

@Kazadhum
Collaborator Author

Kazadhum commented May 8, 2024

Picking this back up...

@Kazadhum
Collaborator Author

Kazadhum commented May 9, 2024

Possible progress! The Li calibration method from that codebase is now translated to Python (hopefully correctly!). Now I need to get the comparison working so I can test whether it behaves properly.
I can't say I understand the code in every detail, but I think the translation to Python is correct.
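
For future reference, here is a rough sketch (not the actual translated script) of the Kronecker-product linear formulation that, as far as I understand, underlies Li's method: one 12-row linear system per collection is stacked in the unknowns vec(RX), vec(RZ), tX, tZ, solved by least squares, and the rotation blocks are projected back onto SO(3). A_list and B_list are assumed to be lists of 4x4 homogeneous transforms, one pair per collection.

import numpy as np

def li_ax_zb(A_list, B_list):
    # Rough sketch of a least-squares solution of A_i X = Z B_i (not the ATOM script).
    # Unknown vector: w = [vec(RX), vec(RZ), tX, tZ], column-major vec (24 values).
    rows, rhs = [], []
    for A, B in zip(A_list, B_list):
        RA, tA = A[:3, :3], A[:3, 3]
        RB, tB = B[:3, :3], B[:3, 3]
        # Rotation block: (I3 kron RA) vec(RX) - (RB^T kron I3) vec(RZ) = 0
        rot = np.zeros((9, 24))
        rot[:, 0:9] = np.kron(np.eye(3), RA)
        rot[:, 9:18] = -np.kron(RB.T, np.eye(3))
        rows.append(rot)
        rhs.append(np.zeros(9))
        # Translation block: RA tX - (tB^T kron I3) vec(RZ) - tZ = -tA
        tra = np.zeros((3, 24))
        tra[:, 9:18] = -np.kron(tB.reshape(1, 3), np.eye(3))
        tra[:, 18:21] = RA
        tra[:, 21:24] = -np.eye(3)
        rows.append(tra)
        rhs.append(-tA)
    w, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)

    def nearest_rotation(v):
        # Project the estimated 3x3 block onto SO(3) via SVD.
        U, _, Vt = np.linalg.svd(v.reshape(3, 3, order='F'))
        return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

    X, Z = np.eye(4), np.eye(4)
    X[:3, :3], X[:3, 3] = nearest_rotation(w[0:9]), w[18:21]
    Z[:3, :3], Z[:3, 3] = nearest_rotation(w[9:18]), w[21:24]
    return X, Z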

@miguelriemoliveira
Member

That's good, but running it is the real test.

Looking forward to seeing if it runs correctly.

@Kazadhum
Collaborator Author

Kazadhum commented May 13, 2024

Hello all! I think I closed this by accident this morning...

Some notes:

  • I just found out the method I was implementing from this repository was not Ali's, but Li's, so I'll work on Ali's next (I did some debugging by running the MATLAB code with the A and B matrices hardcoded);
  • I got Li's method working for the eye-in-hand case; I'll make another script for the eye-to-hand case.

For the eye-in-hand case, running:

rosrun atom_evaluation li_eye_in_hand.py -c rgb_hand -p pattern_1 -bl base_link -hl flange -json $ATOM_DATASETS/rihbot/train_test_opencv/dataset.json -ctgt

we get:

After filtering, will use 5 collections: ['000', '001', '002', '003', '004']
Selected collection key is 000
Calculating A and B matrices for collection 000...
Calculating A and B matrices for collection 001...
Calculating A and B matrices for collection 002...
Calculating A and B matrices for collection 003...
Calculating A and B matrices for collection 004...
Ground Truth h_T_c=
[[ 0.00000000e+00  1.11022302e-16  1.00000000e+00 -2.00000000e-02]
 [-1.00000000e+00 -2.22044605e-16  1.11022302e-16  0.00000000e+00]
 [ 0.00000000e+00 -1.00000000e+00  0.00000000e+00  6.50000000e-02]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
estimated h_T_c=
[[ 1.49042521e-04 -3.88510070e-04  9.99999913e-01 -1.96416458e-02]
 [-9.99999898e-01 -4.25657727e-04  1.48877146e-04 -9.81041564e-04]
 [ 4.25599850e-04 -9.99999834e-01 -3.88573471e-04  6.44773915e-02]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
Etrans = 0.621 (mm)
Erot = 0.018 (deg)
+----------------------+-------------+---------+----------+-------------+------------+
|      Transform       | Description | Et0 [m] |  Et [m]  | Rrot0 [rad] | Erot [rad] |
+----------------------+-------------+---------+----------+-------------+------------+
| flange-rgb_hand_link |   rgb_hand  |   0.0   | 0.000382 |     0.0     |  0.000321  |
+----------------------+-------------+---------+----------+-------------+------------+
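
(For context, translation/rotation errors of this kind are typically computed along the following lines; this is a generic sketch, not necessarily the exact formula the ATOM evaluation code uses.)

import numpy as np

def transform_errors(T_gt, T_est):
    # Translation error (mm) and rotation error (deg) between two 4x4 transforms.
    # Generic convention, not necessarily what the ATOM scripts compute.
    etrans = np.linalg.norm(T_gt[:3, 3] - T_est[:3, 3]) * 1000.0
    R_delta = T_gt[:3, :3].T @ T_est[:3, :3]
    # Angle of the residual rotation, clipped for numerical safety.
    cos_angle = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    erot = np.degrees(np.arccos(cos_angle))
    return etrans, erot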

Now creating a copy of this script for the eye-to-hand case and adapting it accordingly...

@Kazadhum
Collaborator Author

Kazadhum commented May 15, 2024

Working on implementing Shah's method, also from that repo. Since it is similar to Li's, the implementation should be relatively simple.

@Kazadhum
Collaborator Author

Fixing bugs in Li's method: testing on the (simulated) riwmpbot revealed issues.

Kazadhum added a commit that referenced this issue Jun 19, 2024