Scaling ElastiCache

5 minutes · 5 questions

Scaling ElastiCache means adjusting the capacity and performance of your cache cluster to meet the changing demands of your application. You can scale an ElastiCache deployment horizontally by adding or removing cache nodes in the cluster, or vertically by changing to a larger or smaller node type. Scaling …
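
As a rough sketch of how those two paths look in practice, the boto3 calls below scale a Redis replication group horizontally (more read replicas) and vertically (a larger node type). The replication group ID "my-redis-group", the replica count, and the node type are placeholder assumptions, not values taken from this course.

```python
# Minimal sketch: horizontal vs. vertical scaling for an ElastiCache for Redis
# replication group using boto3. The group ID, replica count, and node type
# below are placeholders.
import boto3

elasticache = boto3.client("elasticache")

# Horizontal scaling: add read replicas to spread read traffic across more nodes.
elasticache.increase_replica_count(
    ReplicationGroupId="my-redis-group",
    NewReplicaCount=3,          # desired replicas per node group after the change
    ApplyImmediately=True,
)

# Vertical scaling: move every node in the group to a larger node type.
elasticache.modify_replication_group(
    ReplicationGroupId="my-redis-group",
    CacheNodeType="cache.m5.xlarge",
    ApplyImmediately=True,
)
```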

AWS Certified Solutions Architect - Scaling ElastiCache Example Questions

Test your knowledge of Scaling ElastiCache

Question 1

An online store has implemented Amazon ElastiCache for Redis to cache frequently used data. The store needs to ensure that the cache scales horizontally and reliably as read traffic increases. Which solution should be used?
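
As one illustration of the horizontal read-scaling operation this scenario points toward, the sketch below raises the replica count per shard on a cluster-mode-enabled replication group. The group name "store-cache" and the shard IDs are placeholders, and the question itself leaves the exact topology open.

```python
# Hedged sketch: scale reads horizontally by adding replicas per shard on a
# cluster-mode-enabled replication group. "store-cache" and the shard IDs are
# placeholders, not details given in the question.
import boto3

elasticache = boto3.client("elasticache")

elasticache.increase_replica_count(
    ReplicationGroupId="store-cache",
    ReplicaConfiguration=[
        # One entry per shard (node group); the operation is performed online.
        {"NodeGroupId": "0001", "NewReplicaCount": 2},
        {"NodeGroupId": "0002", "NewReplicaCount": 2},
    ],
    ApplyImmediately=True,
)
```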

Question 2

An application uses Amazon ElastiCache for Redis. During peak hours, p99 latency for GET/SET operations increases. Amazon CloudWatch shows CurrConnections consistently near the maxclients limit, while CPU utilization stays below 40% and the cache hit ratio remains stable above 95%. The cluster currently uses the default parameter group; you plan to create and apply a custom parameter group to tune settings. Which configuration change should you implement to reduce latency under peak load?
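
The kind of parameter-group change the scenario describes could be sketched as follows. The group names, the redis7 family, and the 300-second idle timeout are assumptions for illustration only; the question does not name the parameter to tune.

```python
# Hedged sketch: create a custom parameter group, tune a connection-related
# setting, and attach it to the replication group. Names and values are
# placeholders chosen for this example.
import boto3

elasticache = boto3.client("elasticache")

# Create a custom parameter group (default parameter groups cannot be modified).
elasticache.create_cache_parameter_group(
    CacheParameterGroupName="app-redis-tuned",
    CacheParameterGroupFamily="redis7",
    Description="Custom settings for peak-hour connection pressure",
)

# Example tweak: close connections idle for 5 minutes so they stop counting
# toward the connection limit (maxclients itself is not tunable in ElastiCache).
elasticache.modify_cache_parameter_group(
    CacheParameterGroupName="app-redis-tuned",
    ParameterNameValues=[
        {"ParameterName": "timeout", "ParameterValue": "300"},
    ],
)

# Attach the custom parameter group to the replication group.
elasticache.modify_replication_group(
    ReplicationGroupId="app-cache",
    CacheParameterGroupName="app-redis-tuned",
    ApplyImmediately=True,
)
```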

Question 3

A gaming company runs Amazon ElastiCache for Redis (cluster mode disabled) in a single AWS Region using an m5.large primary with one read replica. Every 5 minutes, a leaderboard update triggers a brief but heavy spike of read requests, causing increased GET latency. The company wants to reduce read latency by scaling read throughput horizontally without changing node types, enabling cluster mode, or using cross-Region features. What should they do?
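
A sketch of how read throughput could be scaled in this cluster-mode-disabled setup, assuming the replication group is named "leaderboard-cache" (a placeholder) and that read traffic is directed at the group's reader endpoint:

```python
# Hedged sketch for the cluster-mode-disabled case: add read replicas (up to 5
# per primary) and point read traffic at the group's reader endpoint.
# "leaderboard-cache" and the replica count are placeholders.
import boto3

elasticache = boto3.client("elasticache")

# Horizontal read scaling without changing node types or enabling cluster mode.
elasticache.increase_replica_count(
    ReplicationGroupId="leaderboard-cache",
    NewReplicaCount=4,          # total replicas behind the single primary
    ApplyImmediately=True,
)

# The reader endpoint spreads read connections across all replicas in the group.
group = elasticache.describe_replication_groups(
    ReplicationGroupId="leaderboard-cache"
)["ReplicationGroups"][0]
print(group["NodeGroups"][0]["ReaderEndpoint"])
```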

More Scaling ElastiCache questions: 15 questions in total