Crawlers / Bots Getting Sessions - crawler-session Not Working #1506
Maybe it's time to update the user agents we are checking? From here, it seems Googlebot alone might not be enough to match every Google-related bot.
Thanks for your answer @miguelbalparda! I just updated our crawler user agents list to:
This should definitely catch all Google bots. However, based on the number of sessions created, I think the current implementation simply does not work properly, especially given the user agents I already excluded. Any other idea, @miguelbalparda?
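For illustration, a regex of the shape Turpentine's crawler user-agent setting accepts can be sanity-checked offline. The pattern and user-agent strings below are examples only, not the exact list discussed in this thread:

```python
import re

# Illustrative crawler pattern covering several Google bot variants.
# This is NOT the list posted in the issue (that list was not preserved).
CRAWLER_RE = re.compile(
    r"Googlebot|AdsBot-Google|Mediapartners-Google|APIs-Google|"
    r"Googlebot-Image|Googlebot-News|Googlebot-Video|Storebot-Google",
    re.IGNORECASE,
)

AGENTS = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; Googlebot-Image/1.0)",
    "AdsBot-Google (+http://www.google.com/adsbot.html)",
    "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/90.0 Safari/537.36",
]

for ua in AGENTS:
    # The first three should match, the regular Chrome UA should not.
    print(bool(CRAWLER_RE.search(ua)), "-", ua[:45])
```

Checking candidate regexes against real user-agent strings like this before deploying them avoids guessing about coverage.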
Unfortunately, the updated crawler user agent list did not help. In the meantime, I tried to debug the issue in a clean test environment (only Magento 1.9.3.9 with sample data and Turpentine 0.7.4). I could reproduce that multiple sessions are generated when the site is opened with a crawler user agent. Two ideas (just guesses for now):
If you, @miguelbalparda, or anyone else have any input, I am more than thankful.
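The reproduction described above can be scripted. In this sketch the shop URL is a placeholder and the request is only built, never actually sent:

```python
import urllib.request

# Hypothetical reproduction helper; "magento.test" is a placeholder,
# not a URL from this thread.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def build_crawler_request(url, user_agent=GOOGLEBOT_UA):
    """Build a GET request that impersonates a crawler user agent."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

req = build_crawler_request("http://magento.test/")
# urllib normalizes header names via str.capitalize(), hence "User-agent".
print(req.get_header("User-agent"))
# To reproduce against a real test shop:
#   urllib.request.urlopen(req)
# then compare the number of files in var/session before and after.
```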
I can confirm that idea 1 is the issue. Each ESI request leads to a real user session - the fake crawler session does not work for ESI requests, even though the fake frontend cookie IS added to the Magento request:
It is ignored by Magento:
Any idea?
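The ESI behaviour described above can be sketched as a toy model (all names are illustrative, this is not Turpentine's or Magento's actual code): if the dummy cookie is honoured only on the parent request, every ESI subrequest spawns a fresh session.

```python
# Set of session identifiers "Magento" has created so far.
sessions = set()

def handle_request(cookie, honour_cookie):
    """Simulate session handling for one (sub)request."""
    if honour_cookie and cookie == "frontend=crawler-session":
        sessions.add("crawler-session")          # shared dummy session reused
    else:
        sessions.add("sess_%d" % len(sessions))  # new real session created

def crawl_page(esi_blocks, honour_cookie_on_esi):
    """One crawler page view: a parent request plus its ESI subrequests."""
    handle_request("frontend=crawler-session", honour_cookie=True)
    for _ in range(esi_blocks):
        handle_request("frontend=crawler-session", honour_cookie_on_esi)

# Expected behaviour: one shared session, no matter how many ESI blocks.
sessions.clear()
crawl_page(esi_blocks=3, honour_cookie_on_esi=True)
print(len(sessions))  # 1

# Observed behaviour: each ESI subrequest adds a session.
sessions.clear()
crawl_page(esi_blocks=3, honour_cookie_on_esi=False)
print(len(sessions))  # 4
```

This matches the reported symptom: session counts grow with every crawler hit multiplied by the number of ESI blocks per page.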
@sprankhub I know this issue is a bit older, but did you perhaps find a solution in the meantime? I am experiencing exactly the same issue in one of my projects.
No, unfortunately not, @christophmassmann :-(
Alright, thanks for your feedback anyhow, @sprankhub!
We encountered a huge number of sessions at a customer's shop and analysed where they come from by running the following under var/session:

grep -Rl 'Googlebot' | wc -l
We have a pretty much standard Turpentine setup without any major customisations. We use Apache as the backend server and nginx for SSL offloading. We use Varnish 4.1 and Turpentine 0.7.3. Here is our VCL:
If I understand correctly, the following part should prevent session generation for all known crawlers - all crawlers should get a dummy crawler-session:

However, sessions are still created for crawlers, which leads to various issues. Did anyone encounter this behaviour and know how to fix it?
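The intended logic of that crawler-handling part can be sketched as follows (the regex and cookie value are illustrative, not copied from the actual VCL, and this Python stands in for the VCL decision only):

```python
import re

# Hypothetical crawler pattern; the real setup uses Turpentine's configured
# crawler user-agent regexes instead.
CRAWLER_RE = re.compile(r"Googlebot|bingbot|YandexBot", re.IGNORECASE)

def cookie_for_request(user_agent, cookie_header):
    """Return the Cookie header the backend should see.

    Known crawlers without a session get a fixed dummy cookie so Magento
    reuses one shared session instead of creating a new one per hit.
    """
    if "frontend=" in (cookie_header or ""):
        return cookie_header                 # real visitor with a session
    if CRAWLER_RE.search(user_agent or ""):
        return "frontend=crawler-session"    # dummy shared crawler session
    return cookie_header                     # normal first-time visitor

print(cookie_for_request("Googlebot/2.1", None))  # frontend=crawler-session
```

If this logic worked as intended, the grep above should find at most one crawler session; the reported numbers suggest the dummy cookie is being bypassed somewhere (the later comments point at ESI subrequests).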