Scalable cloud-based architectures have converged on a common design
meta-pattern built from the following components, each explained below:
Forward-Cache Proxy
Elastic Service Load Balancer
Stateless Server Facade Nodes
Shared Memory State Cache
Client requests first hit the Forward-Cache Proxy, which returns a
memoized response for any request it has already served. If a new
request signature passes the forward-cache proxy, it hits the Load
Balancer, which dynamically allocates the request to one of the
Server Facade Nodes. The server facade nodes provide a REST API
surface over the targeted back-end components. They are stateless,
with all state stored in a distributed Shared Memory State Cache.
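This end-to-end flow can be sketched in a few lines. Every name here (the cache dict, the load-balancer generator, the node callables) is a hypothetical stand-in for illustration, not an API from any real framework:

```python
def handle_request(request, forward_cache, load_balancer, nodes):
    """Sketch of the request path: forward-cache check, then balanced dispatch."""
    signature = (request["method"], request["path"])
    cached = forward_cache.get(signature)
    if cached is not None:
        return cached                      # memoized response: backend untouched
    node = nodes[next(load_balancer)]      # load balancer picks a facade node
    response = node(request)
    forward_cache[signature] = response    # memoize for duplicate request signatures
    return response
```

A duplicate request signature is answered straight from the cache, so the load balancer and facade nodes never see it.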
Forward-Cache Proxy
A client-facing, TTL-expirable content cache – like a CDN. Rapidly
returns previously cached responses for duplicate requests, without
hitting the backend.
Cloud-based solutions offer redundancy and distributed locality
(e.g. leveraging Akamai) for faster hits.
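The TTL-expiry behaviour can be sketched as a small in-process cache keyed by request signature – a toy stand-in for a real CDN or proxy tier, with the `now` parameter added purely to make expiry testable:

```python
import time

class TTLCache:
    """A minimal TTL-expirable response cache, keyed by request signature."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.entries = {}  # signature -> (expiry_time, response)

    def get(self, signature, now=None):
        now = time.monotonic() if now is None else now
        entry = self.entries.get(signature)
        if entry is None:
            return None              # never seen: forward to the backend
        expiry, response = entry
        if now >= expiry:
            del self.entries[signature]
            return None              # TTL expired: treat as a miss
        return response              # fresh hit: backend untouched

    def put(self, signature, response, now=None):
        now = time.monotonic() if now is None else now
        self.entries[signature] = (now + self.ttl, response)
```

Real CDNs add invalidation, cache-control header handling, and edge distribution on top of this basic expire-on-read idea.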
Elastic Service Load Balancer
Distributes incoming traffic across multiple instances of a service.
Dynamic scheduling based on weighted round robin for resource-balanced
distribution to non-busy host nodes, via a policy-driven scheduler.
Supports failover / fault tolerance, request priority queues,
auto-scaling, config policies, and service virtualization.
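One common realization of weighted round robin is the "smooth" variant popularized by nginx, which spreads picks in proportion to weight without long runs on one host. This sketch assumes static, operator-supplied weights (dynamic resource-based weighting would adjust them at runtime):

```python
def smooth_weighted_round_robin(nodes):
    """Generator yielding node names in proportion to their weights.

    nodes: dict of name -> weight (higher weight = more capacity).
    Each pick goes to the node with the highest running score; the
    winner's score is docked by the total so traffic stays interleaved.
    """
    current = {name: 0 for name in nodes}
    total = sum(nodes.values())
    while True:
        for name, weight in nodes.items():
            current[name] += weight
        best = max(current, key=current.get)
        current[best] -= total
        yield best
```

With weights {a: 3, b: 1}, every four consecutive picks contain three for a and one for b, interleaved rather than batched.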
Stateless Server Facade Nodes
The facade logic (custom code) is implemented here – a
REST service layer shim over some back-end component.
Recommended to leverage PaaS (Platform as a Service) – easily
deployable micro-service components that slot into an existing host
server, without the complexity of building the infrastructure.
Stateless (all state is stored in another component – the
distributed Shared Memory State Cache) → avoids the need for complex
sticky load balancing; multiple nodes can be spun up on demand.
Note: it is possible to set up complex orchestrations and
virtualizations across micro-services.
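To illustrate why statelessness removes the need for sticky sessions, here is a toy facade whose per-session state lives only in a shared cache. All names are hypothetical, and a plain dict stands in for the distributed cache:

```python
class FacadeNode:
    """A stateless REST-style facade shim: all session state lives in an
    external shared cache, so any node can serve any request."""

    def __init__(self, state_cache, backend):
        self.cache = state_cache    # shared distributed cache (dict stand-in)
        self.backend = backend      # the targeted back-end component

    def handle(self, session_id, request):
        # Pull session state from the shared cache, never node-local memory.
        session = self.cache.get(session_id, {})
        response = self.backend(request, session)
        # Write state back so a *different* node can serve the next request.
        self.cache[session_id] = session
        return response
```

Because two distinct nodes sharing one cache behave identically, the load balancer is free to route each request anywhere.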
Shared Memory State Cache
A distributed in-memory key-value cache stores all state for
server facade nodes – allowing them to scale out / failover without
worrying about sticky load balancing.
Server facade nodes check this cache first and go to the back end
only on a cache miss.
Cache can span multiple server nodes and grow dynamically.
Supports replication and failover; it is possible to set up a
background-thread mechanism to push updates to a persistent store.
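The background-push idea is commonly called write-behind caching. A minimal single-process sketch, where the `persist` callable is a hypothetical stand-in for the real persistent store:

```python
import queue
import threading

class WriteBehindCache:
    """In-memory key-value cache that asynchronously pushes writes to a
    persistent store via a background thread (write-behind)."""

    def __init__(self, persist):
        self.data = {}
        self.persist = persist              # callable(key, value)
        self.pending = queue.Queue()
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def put(self, key, value):
        self.data[key] = value              # immediately visible to readers
        self.pending.put((key, value))      # persisted later, off the hot path

    def get(self, key):
        return self.data.get(key)

    def flush(self):
        """Block until all queued writes have reached the store."""
        self.pending.join()

    def _drain(self):
        while True:
            key, value = self.pending.get()
            self.persist(key, value)
            self.pending.task_done()
```

Writes return as soon as the in-memory map is updated; durability lags behind, which is the usual write-behind trade-off.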