Query API Fail Safe Features
This section describes the fail-safe setup of the Query API endpoints of Fredhopper Services and provides a guideline for the integration of this API.
The Query API accepts requests, usually sent from the front-end, and responds with the results. The diagram below gives an overview of the infrastructure setup (see the Infrastructural overview section below), traces the path of a query request (see the Mechanisms section below) and indicates the fail-safe components:
By front-end we mean the complete set of components in the web application on the customer's side involved in the integration of the Query API: the application servers and their underlying infrastructure, proxies, other network appliances, and so on.
The front-end web application connects to the Query API endpoint via the Internet.
We register the DNS names for the API endpoints with a third-party DNS service provider. To ensure high availability, each DNS record for the Query API endpoint contains a set of public IP addresses for our load balancers and follows the Round-Robin DNS principle: each time the DNS name is resolved, it may return a different IP address than the previous lookup. To minimise the chance of issues related to DNS caching, we set the TTL to 30 seconds. Your Technical Consultant can provide you with a full list of the IP addresses in our range so that you can whitelist them on your side, if required.
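To illustrate how Round-Robin DNS and the 30-second TTL interact, here is a minimal simulation of a resolver-side cache. This is a toy model, not our actual DNS infrastructure; the IP addresses are documentation examples from the 198.51.100.0/24 range, not addresses from our real range.

```python
import itertools
import time


class RoundRobinResolver:
    """Toy model of Round-Robin DNS with a short TTL.

    A cached answer is reused until the TTL expires, after which the
    next IP address in the rotation is handed out.
    """

    def __init__(self, ips, ttl=30):
        self._rotation = itertools.cycle(ips)
        self._ttl = ttl
        self._cached = None
        self._cached_at = 0.0

    def resolve(self, now=None):
        now = time.monotonic() if now is None else now
        # Honour the cached answer only while the TTL has not expired.
        if self._cached is None or now - self._cached_at >= self._ttl:
            self._cached = next(self._rotation)
            self._cached_at = now
        return self._cached


resolver = RoundRobinResolver(["198.51.100.10", "198.51.100.11"], ttl=30)
first = resolver.resolve(now=0)    # fresh lookup
second = resolver.resolve(now=10)  # within TTL: cached answer is reused
third = resolver.resolve(now=31)   # TTL expired: next IP in the rotation
```

The short TTL is the key design choice here: it bounds how long a client can keep sending traffic to a load balancer that has been taken out of the DNS record.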
The hostname alone does not identify any specific service instance. Instead, the service instance is referenced in the path, e.g. https://query.<configuration state>.<service instance>.<service>.<region>.fredhopperservices.com
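Following the template above, the endpoint can be assembled from its components. The concrete values below (`live`, `instance1`, `fas`, `eu1`) are hypothetical placeholders; your Technical Consultant provides the real ones for your account.

```python
def query_endpoint(configuration_state, service_instance, service, region):
    """Assemble a Query API endpoint URL from its components,
    following the template in the text above. All example values
    used with this function are hypothetical."""
    return (f"https://query.{configuration_state}.{service_instance}"
            f".{service}.{region}.fredhopperservices.com")


url = query_endpoint("live", "instance1", "fas", "eu1")
```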
Each of our load balancers fulfils the following functions:
Authentication - only authenticated requests are accepted.
Load balancing - once authenticated, the load balancer passes the request to the correct service instance. Depending on the required capacity, we can scale out the load-balancing tier. This tier consists of multiple identical components for both capacity and redundancy purposes: all load balancers have identical configuration and functionality and are placed in different geographical locations. When a load balancer needs to undergo maintenance, its IP address is removed from the DNS record and maintenance starts once traffic stops flowing to it. The IP address is added back to the DNS record when the maintenance is over.
By default, the nature of the load balancing tier is multi-tenant.
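The maintenance flow described above, remove the IP address from the DNS record first, then wait for traffic to drain before starting work, can be sketched as follows. This is an illustrative model, not our operational tooling.

```python
def drain_for_maintenance(dns_record, ip, open_connections):
    """Sketch of the maintenance drain described above.

    Returns the DNS record with the load balancer's IP removed, and a
    flag saying whether maintenance may start (i.e. no traffic is still
    flowing to that IP). All names and addresses are illustrative.
    """
    record = [addr for addr in dns_record if addr != ip]  # stop advertising the IP
    can_start = open_connections.get(ip, 0) == 0          # wait until traffic stops
    return record, can_start


record, ok = drain_for_maintenance(
    ["198.51.100.10", "198.51.100.11"],
    "198.51.100.10",
    {"198.51.100.10": 0},
)
```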
Each service instance consists of a single indexer and a set of query servers. We can adjust the capacity of the service instance based on the load, which means scaling out the query-server tier. This tier consists of multiple identical components for both capacity and redundancy purposes: the query servers are identical and are placed in different geographical locations. During reconfiguration of the service instance, each of its components is removed from the records in the load-balancing tier in turn; this sequential approach ensures the continuous availability of the service instance.
Each service instance is dedicated to a customer account.
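The sequential reconfiguration described above can be sketched as a rolling update: each query server is taken out of rotation, reconfigured and returned, one at a time, so the pool is never empty. The server names and the `reconfigure` callback are illustrative, not part of any real API.

```python
def rolling_reconfigure(servers, reconfigure):
    """Reconfigure query servers one at a time.

    At every step, all servers except the one under reconfiguration
    remain in rotation, so the service instance stays available.
    """
    for i, server in enumerate(servers):
        in_rotation = servers[:i] + servers[i + 1:]  # remove only this server
        assert in_rotation, "at least one server must stay in rotation"
        reconfigure(server)
    return servers


reconfigured = []
rolling_reconfigure(["qs-1", "qs-2", "qs-3"], reconfigured.append)
```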
The front-end web application composes the query and prepares it to be sent to the Query API endpoint.
The Query API endpoint DNS name gets resolved. The DNS authority answers with the external IP of one of the load balancers.
The front-end sends the query to the IP address, using the 'Host' header to indicate which service instance to use and supplying the account credentials.
The load balancer receives the query, authenticates the request and uses the 'Host' value in the HTTP request headers to decide to which service instance the request should be forwarded. It then passes the query to the active query server in the pool with the fewest open connections.
The query server processes the query, logs it and sends the response back to the load balancer.
The load balancer logs the transaction and sends the response back to the front-end.
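The client-side part of the steps above can be sketched as building the HTTP request: set the 'Host' header so the load balancer can route to the right service instance, and attach the account credentials. This sketch assumes HTTP Basic authentication purely for illustration; the hostname, path and credentials are hypothetical, and the real values and authentication scheme come from your Technical Consultant.

```python
import base64


def build_query_request(hostname, path, account, secret):
    """Compose a Query API request as described in the steps above.

    The 'Host' header is the routing key the load balancer uses to pick
    the service instance; the credentials authenticate the request.
    Basic auth is an illustrative assumption, not a confirmed scheme.
    """
    token = base64.b64encode(f"{account}:{secret}".encode()).decode()
    headers = {
        "Host": hostname,                   # routing key for the load balancer
        "Authorization": f"Basic {token}",  # only authenticated requests are accepted
    }
    return "GET", path, headers


method, path, headers = build_query_request(
    "query.live.instance1.fas.eu1.fredhopperservices.com",
    "/query",          # hypothetical path
    "account", "secret",
)
```

From here, the request would be sent to one of the load balancer IP addresses obtained via DNS resolution, as described in the steps above.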