Question about reproducing results with resnet backbone #12
Hi @hh23333, can you share your full configuration?
Hi, Vladimir. Thank you for your prompt reply. My full configuration is as follows:
Hi @hh23333, sorry for the delayed answer, I was looking through my old Wandb experiments to find which hyperparameters changed. Here are some things you can try: change 'bn_foreg' to 'foreg', try random flip ('rf') in the transforms, use a bigger output feature size (dim_reduce_output: 1024), or disable the dim_reduce layer (dim_reduce: 'none'). Let me know if you manage to make this work!
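Collected as a config fragment, the suggestions above look roughly like this. Note this is a sketch only: the parameter names are taken from the comment itself, but their exact nesting inside the project's YAML config files is an assumption, not verified against the repo.

```yaml
# Hedged sketch of the suggested changes -- exact YAML hierarchy may differ.
transforms: ['rc', 'rf', 're']   # add random flip 'rf' alongside the existing transforms
dim_reduce_output: 1024          # larger output feature size ...
dim_reduce: 'none'               # ... or disable the reduction layer entirely
# and try the plain foreground embedding instead of the batch-normalized one:
# 'foreg' in place of 'bn_foreg'
```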
Hi, Vladimir. Thank you for your reply. I will try your suggestions and see if they improve the situation. If I get any results, I will let you know.
Hi @VlSomers, I have run the experiments following your suggestions and obtained the following results (I ran each experiment twice):
Hi @hh23333, it is strange that you get a much better rank-1 than reported in the paper while still having 1% less mAP. The ResNet-50 experiments come from an older version of the code, before a big refactoring, so there might be some changes in the implementation. I looked at the config used at that time to check for differences with yours, but it's hard to be sure because some parameters have new names. Have you tried "- dim_reduce + rf + foreg - bn_foreg" all at once? When you say '+ rf', did you use only 'rf' in the transforms, or did you add it to the other transforms? You should use ["rc","rf","re"]. Can you also try "dist_metric: cosine" with all the previous params?
Hi @VlSomers, '+rf' means ["rc","rf","re"]. The results with the full config ("- dim_reduce + rf + foreg - bn_foreg + cosine") are: It seems that adding 'rf' slightly degrades the results, and using 'foreg' instead of 'bn_foreg' greatly reduces the accuracy.
@hh23333 Sorry to interrupt you, but I ran into an issue: The pretrained model I used is "bpbreid_occluded_duke_hrnet32_10670.pth". I don't actually know the size of the masks this project provides, since I did not generate the masks myself. I wonder if you have met this issue before.
@TInaWangxue, setting is_check_shapes=False works for me.
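For context: that flag belongs to albumentations' `Compose`, which by default raises an error when the image and mask passed through the pipeline have different spatial sizes. The toy class below is NOT the library's real implementation; it only illustrates why disabling the check lets an image and a coarser part-mask of different sizes pass through together.

```python
# Toy stand-in for albumentations' Compose shape check (illustration only).
# Real albumentations: A.Compose(transforms, is_check_shapes=False)

def _hw(arr):
    """Height and width of a nested-list 'array'."""
    return (len(arr), len(arr[0]))

class Compose:
    def __init__(self, transforms, is_check_shapes=True):
        self.transforms = transforms
        self.is_check_shapes = is_check_shapes

    def __call__(self, image, mask):
        # The check that triggers the reported error: image and mask must
        # share the same (height, width) unless is_check_shapes=False.
        if self.is_check_shapes and _hw(image) != _hw(mask):
            raise ValueError(
                "Height and Width of image and mask should be equal, "
                "or pass is_check_shapes=False to Compose."
            )
        for t in self.transforms:
            image, mask = t(image, mask)
        return image, mask

# A 256x128 input image with a much coarser body-part mask:
img = [[0] * 128 for _ in range(256)]
msk = [[0] * 8 for _ in range(16)]
out_img, out_msk = Compose([], is_check_shapes=False)(img, msk)  # no error
```

With `is_check_shapes=True` the same call raises `ValueError`, which matches the behavior described above.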
Hello, Vladimir. I followed your paper, changed the backbone to ResNet and the input size to 256x128. I repeated the experiment twice and attached the results below:
The rank-1 is similar, but the mAP is lower. Could you please tell me what other settings I need to change? Thank you for your help.