Handling Automated Traffic
The following describes best practices for monitoring and handling increased automated traffic to your site.
E-commerce traffic is shifting from traditional search engines to AI agents and chatbots. These systems can enhance the shopping experience, but they also strain back-end systems with duplicate queries, increase costs, and distort your analytics.
To manage this, implement caching to handle repeated queries efficiently, use rate limiting, monitor traffic, and employ bot filtering. These strategies support beneficial bots while keeping your infrastructure stable, your data accurate, and malicious activity in check.
Failure to address these issues could lead to:
Exceeded licensed query limits
Distorted search analytics
Performance and cost issues
How you apply the following best practice guidelines will depend on your systems and how you've integrated with Fredhopper.
As part of this, you should also implement traffic monitoring on your integration with Fredhopper.
Managing Traffic
Secure your integration and filter out unnecessary traffic before it reaches Fredhopper (FHR). This helps keep your system efficient and prevents unwanted requests from impacting performance.
Access Control & Validation
Secure your proxy and middleware so that unvalidated requests are never forwarded (a sketch follows this list):
Validate requests before forwarding them to Fredhopper.
Block unauthenticated or anonymous traffic.
Use API keys, session tokens, or referrer checks.
Keep query logic in the backend to reduce exposure and ensure sanitisation.
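A minimal sketch of this pattern, assuming a Node/Express proxy sitting in front of Fredhopper. The header name, key store, and FHR_ENDPOINT URL are placeholders for your own setup:

```typescript
import express from "express";

const app = express();

// Placeholders: substitute your own key store and Fredhopper query endpoint.
const VALID_API_KEYS = new Set(["replace-with-your-keys"]);
const FHR_ENDPOINT = "https://query.example.com/fredhopper";

// Validate every request before it can reach Fredhopper.
app.use((req, res, next) => {
  const apiKey = req.header("x-api-key");

  // Block unauthenticated or anonymous traffic outright.
  if (!apiKey || !VALID_API_KEYS.has(apiKey)) {
    res.status(401).json({ error: "Unauthorised" });
    return;
  }
  next();
});

// Query logic stays in the backend: the client sends only a search term,
// and the server builds and sanitises the actual Fredhopper query.
app.get("/search", async (req, res) => {
  const term = String(req.query.q ?? "").trim().slice(0, 100); // basic sanitisation
  if (!term) {
    res.status(400).json({ error: "Missing query" });
    return;
  }
  const upstream = await fetch(`${FHR_ENDPOINT}?search=${encodeURIComponent(term)}`);
  res.status(upstream.status).json(await upstream.json());
});

app.listen(3000);
```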
Traffic Filtering
Decide which automated bots, if any, may crawl your site.
Filter bot traffic using a CDN or WAF (see the sketch after this list):
Detect and block scrapers and headless browsers.
Monitor and block traffic spikes from unknown agents.
Use robots.txt and user-agent filtering to control bot access.
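As a rough illustration, again assuming an Express proxy: a simple user-agent deny-list plus a robots.txt route. The patterns here are examples only; a CDN or WAF with managed bot rules will detect scrapers and headless browsers far more reliably:

```typescript
import express from "express";

const app = express();

// Example patterns only; managed CDN/WAF bot rules are kept up to date for you.
const BLOCKED_AGENTS = [/headlesschrome/i, /python-requests/i, /scrapy/i];

app.use((req, res, next) => {
  const ua = req.header("user-agent") ?? "";

  // A missing user agent is itself suspicious for browser traffic.
  if (!ua || BLOCKED_AGENTS.some((pattern) => pattern.test(ua))) {
    res.status(403).send("Forbidden");
    return;
  }
  next();
});

// Tell well-behaved bots which paths they may crawl.
app.get("/robots.txt", (_req, res) => {
  res.type("text/plain").send("User-agent: *\nDisallow: /search\n");
});

app.listen(3000);
```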
Caching and Query Optimisation
Use caching, and structure your queries so they can be re-used and served from cache more often (a sketch follows this list):
Cache popular category queries to reduce duplicate requests.
Apply short TTLs (e.g., 30–60 seconds) for freshness without excess hits.
Avoid overly complex or deeply nested queries. Simpler queries can be re-used and cached more easily.
Keep your query structures consistent across types.
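A sketch of a short-TTL cache keyed on a normalised query string. It is in-memory for brevity; in production a shared cache such as Redis lets all app servers benefit from the same entries:

```typescript
// One cached response per normalised query, valid until expiresAt.
type CacheEntry = { body: unknown; expiresAt: number };

const cache = new Map<string, CacheEntry>();
const TTL_MS = 45_000; // within the 30–60 second range: fresh, but absorbs duplicates

// Lower-casing and sorting the parameters means the same logical query
// always produces the same key, so near-identical requests share one entry.
function cacheKey(category: string, params: Record<string, string>): string {
  const sorted = Object.entries(params)
    .map(([k, v]) => `${k.toLowerCase()}=${v.toLowerCase()}`)
    .sort()
    .join("&");
  return `${category.toLowerCase()}?${sorted}`;
}

async function cachedQuery(
  category: string,
  params: Record<string, string>,
  fetchFromFredhopper: () => Promise<unknown>
): Promise<unknown> {
  const key = cacheKey(category, params);
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.body; // served from cache

  const body = await fetchFromFredhopper(); // single upstream call on a miss
  cache.set(key, { body, expiresAt: Date.now() + TTL_MS });
  return body;
}
```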
Rate Limits & Throttling
Manage request volume to your site and to Fredhopper to avoid overload (a rate-limiting sketch follows this list).
Set reasonable query limits per user/session/device.
Define thresholds to avoid overload.
Throttle abnormal behaviour automatically, using heuristics or behaviour-based rules.
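For illustration, a fixed-window limiter keyed per session; the window and limit are placeholder values to tune against your own traffic:

```typescript
const WINDOW_MS = 60_000; // 1-minute window (illustrative)
const MAX_REQUESTS = 60;  // per session per window (illustrative)

const windows = new Map<string, { count: number; resetAt: number }>();

// Returns true if this session is still under its limit for the current window.
function allowRequest(sessionId: string): boolean {
  const now = Date.now();
  const entry = windows.get(sessionId);

  if (!entry || entry.resetAt <= now) {
    windows.set(sessionId, { count: 1, resetAt: now + WINDOW_MS });
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_REQUESTS;
}
```

When allowRequest returns false, respond with HTTP 429 rather than forwarding the request to Fredhopper.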
Fail gracefully when you do hit limits (a sketch follows these points):
Use Fredhopper’s built-in fail-safe features to handle timeouts, bad queries, or network issues without retry storms.
Provide fallback experiences in the UI if needed.
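A client-side sketch of that behaviour, complementing (not replacing) Fredhopper's built-in fail-safes. It assumes Node 18+ for AbortSignal.timeout: one time-boxed call, then a fallback response instead of a retry:

```typescript
// Time-box the Fredhopper call and degrade to a fallback (e.g. a cached or
// static response) on any failure, so errors never turn into retry storms.
async function searchWithFallback(
  url: string,
  fallback: unknown,
  timeoutMs = 2_000
): Promise<unknown> {
  try {
    const response = await fetch(url, { signal: AbortSignal.timeout(timeoutMs) });
    if (!response.ok) return fallback; // bad query or server error: no retry
    return await response.json();
  } catch {
    return fallback; // timeout or network issue: degrade gracefully
  }
}
```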
Please also see our Front-End Integration Best Practice guide.