Integrating Azure Cache for Redis with Azure Functions and serverless apps

Azure Cache for Redis provides an in-memory data storage service powered by Redis, delivering low latency, high availability, and high throughput by caching the data applications need to process requests.

Azure Functions provides a serverless computing service that allows for the execution of event-driven programming models without having to manage complex infrastructure, delivering seamless and cost-effective scalability.

Combined, these two enable faster development lifecycles, reducing complex infrastructure management. In this article, we explore both Azure Cache for Redis and Azure Functions, including their ability to accelerate microservice deployments, improve scale, and optimize performance.

Overview of Azure serverless platform with Azure Functions

Azure Functions provides capabilities for event-driven application development using languages such as Java, C#, Python, PowerShell, and JavaScript with seamless scaling and hosting options. As a serverless platform, it also unlocks the potential of using Azure data analytics for AI platforms to develop intelligent apps. Azure Functions can be integrated with Azure DevOps pipelines to streamline the development, deployment, and monitoring of the application lifecycle.

Azure Functions’ serverless model means less infrastructure to manage and greater cost-effectiveness. Through seamless integration of triggers and bindings, it accelerates the development of independent microservice modules that can be scaled individually.

Azure Functions use cases

Azure Functions is ideal for multiple scenarios, including:

  • File processing in Azure Blob Storage: Functions uses a BlobTrigger to process files as they are uploaded to Azure Blob Storage.
  • Real-time data streaming: Telemetry data can be ingested through Azure Event Hubs and processed using event triggers; Azure SignalR Service can push output messages, and results can be stored in a low-latency, multi-model NoSQL database like Azure Cosmos DB.
  • Scheduled task processing: Functions can execute scheduled tasks using timer triggers configured with CRON expressions.
  • Serverless workflow automation: Azure Functions works as a compute component for serverless workflow automation of Logic Apps, which often leverages its Durable Functions extension to deploy long-running orchestrations.
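As an illustration of the scheduled-task scenario, the stdlib scheduler below mimics a recurring timer. In Azure Functions this would instead be a timer trigger configured with an NCRONTAB expression; the task and intervals here are purely illustrative:

```python
# Sketch: recurring scheduled work, mimicking what a timer trigger provides.
import sched
import time

runs = []
scheduler = sched.scheduler(time.monotonic, time.sleep)

def task():
    runs.append(time.monotonic())   # record each execution

# Schedule three runs, 10 ms apart, standing in for a recurring schedule.
for i in range(3):
    scheduler.enter(0.01 * i, 1, task)

scheduler.run()
print(len(runs))  # -> 3
```

In a real function app, the platform owns the schedule and invokes your handler, so no scheduler loop lives in your code.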

You can learn more about Azure Functions use cases on Microsoft’s dedicated page.

Azure Functions triggers and bindings

Every function has exactly one trigger, which defines how the function is invoked; the data associated with the trigger is typically provided to the function as its payload. Bindings declaratively connect a function to other resources, as input bindings, output bindings, or both, without requiring custom integration code. Data from bindings is provided to the function as parameters, with supported resources including Blob Storage, Azure SQL, and Queue Storage.
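The trigger-and-binding flow can be sketched conceptually: a trigger payload invokes the function, an input binding fetches related data, and both arrive as parameters. This is a stdlib simulation with hypothetical names, not the Azure Functions SDK:

```python
# Conceptual sketch (illustrative names, not the Azure Functions SDK):
# a trigger fires the function, and bindings supply data as parameters.

def make_function(handler, input_binding):
    """Wrap a handler so trigger payloads and bound inputs arrive as parameters."""
    def invoke(trigger_payload):
        bound_input = input_binding(trigger_payload)   # e.g., fetch a blob keyed by the payload
        return handler(trigger_payload, bound_input)   # return value acts like an output binding
    return invoke

# Hypothetical "blob" store acting as the input binding's source.
blob_store = {"orders/42": "widget"}

process_order = make_function(
    handler=lambda payload, item: f"processed {item} from {payload}",
    input_binding=lambda payload: blob_store[f"orders/{payload}"],
)

print(process_order("42"))  # -> processed widget from 42
```

The point of the sketch is the separation of concerns: the handler only sees parameters, while the declarative binding layer decides where those parameters come from and where results go.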

Visit Microsoft’s documentation for a detailed overview on Azure Functions triggers and bindings.

Benefits of Azure Functions

The advantages of implementing Azure Functions are many:

  • Serverless platform: As a serverless event-driven platform, it allows for the cost-effective execution of code with scalability and durability.
  • Event-driven code execution: Code runs in response to events, e.g., webhook triggers, HTTP triggers, queue triggers, etc.
  • Automated scaling: Functions can scale automatically based on load requirements or application traffic.
  • Seamless integration: As a serverless compute platform, it can integrate with Azure App Service and other Azure PaaS services, e.g., Azure Logic Apps, Azure SQL Database, Cosmos DB, Event Grid, Event Hubs, Azure AI/ML services.
  • Pay-per-use pricing: Azure Functions has a flexible pricing model; on the Consumption plan, you are charged only for the duration of the function’s execution.

A detailed description of the benefits of Azure Functions is available in Microsoft’s documentation.

Azure Functions best practices

For optimized design and performance efficiency when using Azure Functions, the following steps are recommended.

Choose the appropriate hosting plan

Carefully select the hosting plan for Azure Functions (Consumption, Flex Consumption, Premium, or App Service); the plan determines how the app scales with demand, what resources are available, and whether advanced networking features like VNet injection and private endpoints are supported.

Use dedicated storage accounts

Use a dedicated storage account for each function app. This is especially important for Event Hubs-triggered functions, or when application logic interacts with the storage account directly, since heavy application traffic can interfere with the Functions host's own use of the account.

Understand memory requirements

Inspect memory requirements when planning the deployment of functions within an App Service plan; too many functions in the same plan may cause memory bottlenecks. For high-performance requirements, consider a dedicated App Service plan for each function.

Note: For single-threaded language workers, set the FUNCTIONS_WORKER_PROCESS_COUNT app setting to run multiple worker processes, increasing concurrency and maximizing throughput.
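To illustrate why multiple worker processes raise concurrency for single-threaded workloads, the sketch below fans CPU-bound work out across a process pool. This is only an analogy for what raising FUNCTIONS_WORKER_PROCESS_COUNT does inside the Functions host:

```python
# Analogy: multiple worker processes let single-threaded, CPU-bound work
# run concurrently, the way extra Functions worker processes do.
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Four workers process four requests concurrently instead of serially.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(cpu_bound, [10_000] * 4))
    print(len(results))  # -> 4
```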

Azure Cache for Redis: Brief review

Azure Cache for Redis offers in-memory caching capabilities that improve application and database performance through dynamic scalability and reliability.

It provides a scalable, low-latency dedicated Redis server engine for applications heavily dependent on backend data storage by maximizing throughput via data caching, content caching, and session storage. Azure Cache for Redis is offered both as a Redis Open Source platform and a Redis Enterprise dedicated cluster instance managed by Microsoft Azure.

Real-time scenarios using Azure Cache for Redis

Azure Cache for Redis can optimize database performance via in-memory content cache and session store caching. It is ideal in the following use cases:

  • Data caching: Azure Cache for Redis offers data caching capabilities to reduce database performance bottlenecks and maximize throughput using cache-aside design patterns.
  • Content caching for static websites: With such sites, Azure Cache for Redis lets you cache static content like headers, footers, and images.
  • Storing session data in cache: As an in-memory store, Azure Cache for Redis improves web application performance by caching session data instead of storing it in cookies.
  • Distributed queues: Azure Cache for Redis employs a distributed queue pattern for long-running tasks.
  • Atomic transactions: The cache service also supports atomic transactions for database caching and allows concurrency.
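The cache-aside pattern mentioned above can be sketched as follows; a plain dict stands in for the Redis client, and the function names are illustrative:

```python
# Cache-aside sketch: check the cache first, fall back to the database on a
# miss, then populate the cache with a TTL. A dict stands in for Redis here.
import time

cache = {}          # key -> (value, expiry_timestamp)
TTL_SECONDS = 60

def slow_database_read(key):
    return f"row-for-{key}"   # stand-in for an expensive backend query

def get_with_cache_aside(key):
    entry = cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                      # cache hit
    value = slow_database_read(key)          # cache miss: read through to the database
    cache[key] = (value, time.monotonic() + TTL_SECONDS)  # populate with a TTL
    return value

print(get_with_cache_aside("user:1"))  # miss -> reads database, fills cache
print(get_with_cache_aside("user:1"))  # hit  -> served from cache
```

With a real Redis client the dict operations become GET/SET calls with an expiry (e.g., SETEX), but the read-miss-populate flow is the same.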

Azure Cache for Redis service tiers

There are five different service tiers available for Azure Cache for Redis:

  • Basic: No SLA; a single OSS Redis cache runs in a VM; only suitable for PoC or dev/test workloads
  • Standard: Up to 99.9% availability; OSS Redis cache runs in two VMs with replication
  • Premium: 99.9% availability; a high-performance OSS Redis cache runs on high-performance VM series for low latency and high throughput
  • Enterprise: High-performance Redis with 99.999% availability; supports modules including RedisJSON, RediSearch, and RedisTimeSeries
  • Enterprise Flash: Cost-effective caching for memory-heavy workloads with 99.999% availability; supports Redis storage on non-volatile memory in addition to DRAM on the VM

Scenarios of Azure Cache for Redis for optimized infrastructure capacity

Azure Cache for Redis runs on dedicated vCPUs (except the C0 tier), with maximum cache sizes ranging from 53 GB on the Standard tier up to 4.5 TB on the Enterprise Flash tier. Companies should choose a tier based on:

  • Memory requirements: Applications with high traffic and client connections require more memory for Redis cache. Hence, the optimized tiers should be chosen for deployment as follows:
    • Standard (supports 250 MB - 53 GB)
    • Premium (6 GB-1.2 TB)
    • Enterprise (1 GB - 2 TB)
  • Maximum performance: The Premium and Enterprise tiers offer faster hardware and more cores for vCPUs, allowing optimized application performance.
  • CPU cores: A higher number of vCPUs generally enhances the performance of Azure Cache for Redis.
  • Scaling: Scaling out Azure Cache for Redis (clustering) provides a performance boost in addition to moving up a tier. Vertical scaling (scaling up) is most effective on the Enterprise and Enterprise Flash tiers, which can exploit additional vCPUs, whereas horizontal scaling of the cache instance is generally preferred on the OSS tiers.
  • Maximum network bandwidth: For enterprise workloads requiring high throughput, low latency, and scaling, the Premium and Enterprise tiers offer larger cache sizes, which in turn facilitates more bandwidth, reducing network latency and saturation.

Scaling & performance optimization of Azure Cache for Redis

The different tiers of Azure Cache for Redis impact scaling and performance. Azure Cache for Redis allows for instances to be scaled via vertical scaling (scaling up) and horizontal scaling (scaling out).

Scaling best practices

Monitor performance through Azure Monitor before scaling Azure Cache for Redis, paying attention to cache size, CPU load, and memory usage.

Some of the typical scaling conditions are based on the scenarios given below.

  • High server load: Because each Redis server instance is single-threaded, scaling out distributes work such as TLS encryption/decryption across multiple Redis processes, relieving a heavily loaded server and improving cache speed.
  • High usage of memory: Horizontal or vertical scaling is helpful for cache scenarios requiring high memory.
  • Concurrent client connections: Vertical scaling, or scaling up the cache tier, is useful when the number of concurrent client connections approaches the limit for the current cache size.
  • Bandwidth: When the Redis server exhausts the available network bandwidth, either scale out (horizontal scaling) or scale up (vertical scaling) to a higher cache tier.

Achieving optimized Redis cache performances

Scaling up the Redis tier, especially to the Enterprise or Enterprise Flash tier, can boost performance and maximize throughput, because the Enterprise software stack is not single-threaded and can take advantage of multiple vCPUs.

Scaling out is recommended for the Standard or Premium tier of Azure Cache for Redis due to single-threaded Redis server instances; clustering can enable distributed functions across multiple Redis processes to accelerate performance.

Lastly, frequently opening and closing client connections is expensive for Redis servers and throttles server performance. New client connections should be staggered to prevent a steep spike in the number of client connections.
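A minimal sketch of staggering connection creation, assuming a hypothetical open_connection helper; adding a small random delay between attempts spreads the connection spike over time:

```python
# Sketch: stagger new client connections with jitter to avoid a connection
# spike against the Redis server. open_connection is a hypothetical stand-in.
import random
import time

def open_connection(i):
    return f"conn-{i}"   # stand-in for creating a real Redis connection

def open_connections_staggered(count, max_jitter=0.01):
    conns = []
    for i in range(count):
        time.sleep(random.uniform(0, max_jitter))  # spread attempts over time
        conns.append(open_connection(i))
    return conns

pool = open_connections_staggered(5)
print(len(pool))  # -> 5
```

In practice, a long-lived connection pool (as provided by most Redis client libraries) avoids the churn entirely; jitter matters most during startup or recovery, when many clients reconnect at once.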

Monitoring server load performance through cache

Redis server load and data size values are important parameters to consider when monitoring server load performance through Azure Cache for Redis. The number of client connections and the use of command pipelining are also significant factors. Different parameters influence the throughput and latency of Redis:

  • Many small cached values (e.g., under 100 KB each) are preferable to a few large values, which can increase latency.
  • Retry policies provide transient fault handling for Azure Cache for Redis and improve application reliability.
  • It is always recommended to avoid long-running commands, as they cause latencies or timeouts for caching.
  • Monitor CPU usage of the Redis server.
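The retry guidance above can be sketched as an exponential-backoff wrapper; the flaky operation and delays are illustrative, and production client libraries typically ship their own retry policies:

```python
# Sketch of a retry policy with exponential backoff for transient cache
# errors (illustrative; real clients provide configurable retry policies).
import time

def with_retries(operation, attempts=3, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise                                # out of retries: surface the fault
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff

calls = {"n": 0}
def flaky_get():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient fault")
    return "cached-value"

print(with_retries(flaky_get))  # -> cached-value (after two transient failures)
```

Backoff keeps retries from hammering a cache that is already under load, which is exactly the scenario the monitoring guidance above is meant to catch.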

Performance testing of Azure Cache for Redis

There are several key factors to consider when testing performance. The number of client connections, cached data sizes, and the network path between the client and the cache will all impact the overall latency and throughput metrics for Azure Cache for Redis.

Two popular performance benchmark tools are redis-benchmark and memtier_benchmark.

Per Redis benchmarking recommendations, it’s advisable to use the Premium or Enterprise tier, whose larger cache sizes run on more advanced VM series and therefore offer better network latency and throughput. The Enterprise tier has the best performance, as multiple vCPUs and Redis modules accelerate application performance with 99.999% availability.

Conclusion

Azure Cache for Redis facilitates serverless workflows by integrating with Azure Functions triggers. This enables advanced data caching patterns for enterprises, such as write-behind and cache-aside caching.

This caching system integrates with Azure Functions to deliver applications with high performance, low latency, and maximized throughput.

In Part 2 of this article, we will walk through the integration of Azure Cache for Redis with Azure Functions in detail, including in-depth coverage of observability and monitoring.
