Flexible Load Balancer

Oracle Cloud Infrastructure (OCI) Flexible Load Balancer is a highly available, cloud native service that automatically distributes incoming application connections, arriving from the internet or from within OCI, across multiple compute resources for resiliency and performance. Load balancers can distribute traffic across multiple fault domains, availability domains, and OCI regions based on persistence, request, and URL characteristics.

Use cases for OCI Flexible Load Balancer

OCI Flexible Load Balancer use cases diagram, description below

This image shows three common ways customers use flexible load balancers. These use cases are:

  1. Automatically distributing application load across resources
  2. Modernizing and creating resilient applications
  3. Distributing requests based on traffic characteristics

Automatically distributing application load across resources

In the first of three use cases, a virtual cloud network is shown. It contains a flexible load balancer that is bidirectionally connected to two virtual machines, which are in the same virtual cloud network.

The load balancer is bidirectionally connected to an external user outside of the virtual cloud network.

Requests come in from the external user to the load balancer, which can send the request to either virtual machine. This enables the application to support more users than a single virtual machine can handle.

Modernizing and creating resilient applications

In the second of three use cases, a virtual cloud network is shown. It contains a flexible load balancer. In the same virtual cloud network are two groups. Each group has a virtual machine and a database. These represent two instances of a legacy, non-cloud-native application.

The load balancer is bidirectionally connected to each group.

Requests come in from users to the load balancer, which can send the request to either legacy application. This enables a legacy application to support more users than a single instance could handle by distributing user requests across multiple instances of the legacy application.

Distributing requests based on traffic characteristics

In the third of three use cases, a virtual cloud network is shown. It contains a flexible load balancer that is bidirectionally connected to three virtual machines.

Requests are sent to the first virtual machine based upon values in the HTTP header of the request.

Requests are sent to the second virtual machine based upon values in the virtual hostname of the request.

Requests are sent to the third virtual machine based upon values in the URL of the request.

This enables requests to be directed to a different resource based upon HTTP-related values in the request itself.
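To make this third use case concrete, here is a minimal, purely illustrative Python sketch of content-based routing. The backend addresses, header name, hostname, and path are hypothetical; an actual OCI load balancer applies equivalent rules through its listener and routing policy configuration rather than application code.

```python
# Illustrative only: route a request to a backend based on an HTTP header,
# the virtual hostname (Host header), or the URL path.
from urllib.parse import urlparse

BACKENDS = {"vm1": "10.0.1.10", "vm2": "10.0.1.11", "vm3": "10.0.1.12"}  # hypothetical

def choose_backend(headers: dict, url: str) -> str:
    host = headers.get("Host", "")
    path = urlparse(url).path

    # Rule 1: a custom header value steers traffic to the first VM.
    if headers.get("X-Client-Type") == "mobile":
        return BACKENDS["vm1"]
    # Rule 2: a virtual hostname steers traffic to the second VM.
    if host == "api.example.com":
        return BACKENDS["vm2"]
    # Rule 3: a URL path prefix steers traffic to the third VM.
    if path.startswith("/video"):
        return BACKENDS["vm3"]
    # Otherwise fall back to the first VM.
    return BACKENDS["vm1"]

print(choose_backend({"Host": "api.example.com"}, "https://api.example.com/users"))
```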

Benefits of Flexible Load Balancer


1. High-performance, automatic application distribution

A load balancer improves resource utilization by directing requests across application services that operate in parallel. As demand increases, the number of application services can be increased, and the load balancer will use them to balance the processing of requests.

2. Modern, highly resilient applications

Monolithic legacy applications typically scale by running on larger hardware. With a load balancer, multiple smaller instances can run in parallel while still presenting a single entry point. For both legacy and cloud native application resources, the load balancer stops using backend resources that become non-responsive and directs requests to healthy resources.

3. Hybrid and multicloud applications

Application services can live in multiple locations, including OCI, on-premises, and other clouds. A load balancer provides a convenient, single point of entry, and can direct requests to the appropriate backend, which can be in OCI, on-premises, or on other clouds.

How does OCI Flexible Load Balancer work?

OCI Flexible Load Balancer supports web requests (HTTP, HTTPS) and application-layer traffic using TCP. A public load balancer accepts traffic from the internet while a private load balancer does not.

A load balancer has listeners, each of which accepts a single traffic type (HTTP, HTTPS, or TCP). A load balancer can have multiple listeners to accept multiple traffic streams.
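As an illustration, listeners can be sketched with the OCI Python SDK roughly as follows. The backend set names and ports are placeholders, and the exact model fields should be confirmed against the SDK reference.

```python
import oci

# One listener per traffic type; each forwards to a backend set defined separately.
http_listener = oci.load_balancer.models.ListenerDetails(
    protocol="HTTP",                            # a listener accepts a single traffic type
    port=80,
    default_backend_set_name="web-backends",    # hypothetical backend set name
)
tcp_listener = oci.load_balancer.models.ListenerDetails(
    protocol="TCP",
    port=8080,
    default_backend_set_name="app-backends",    # hypothetical backend set name
)
listeners = {"http": http_listener, "tcp": tcp_listener}
```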

Load balancers are regional services. Each load balancer has two load balancer devices that provide failover capability. In a region with multiple availability domains, the devices will be automatically distributed among two of the availability domains.

Define one or more backend sets and include compute resources as backend servers in those sets. You can then define health checks so the load balancer can determine whether a compute resource is operational or should be excluded.
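A minimal sketch of a backend set with a health check, using the OCI Python SDK's load_balancer models; the IP addresses, port, and health endpoint path are placeholders, and field names should be checked against the current SDK reference.

```python
import oci

models = oci.load_balancer.models

# Two backend servers checked over HTTP; servers that fail the check are
# taken out of rotation until they become healthy again.
web_backends = models.BackendSetDetails(
    policy="ROUND_ROBIN",                          # distribution policy for this set
    backends=[
        models.BackendDetails(ip_address="10.0.3.10", port=8080, weight=1),
        models.BackendDetails(ip_address="10.0.4.10", port=8080, weight=1),
    ],
    health_checker=models.HealthCheckerDetails(
        protocol="HTTP",
        port=8080,
        url_path="/healthz",                       # hypothetical health endpoint
        return_code=200,
        interval_in_millis=10000,
        timeout_in_millis=3000,
        retries=3,
    ),
)
backend_sets = {"web-backends": web_backends}
```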

Session persistence is available, which helps ensure that requests from a particular client will always go to the same compute resource.
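As a sketch, application-cookie session persistence is attached to a backend set; the cookie name below is hypothetical, and the field names should be verified against the OCI Python SDK reference.

```python
import oci

# Requests carrying this application cookie are pinned to the backend that set it.
persistence = oci.load_balancer.models.SessionPersistenceConfigurationDetails(
    cookie_name="APP_SESSION",      # hypothetical cookie issued by the application
    disable_fallback=False,         # fall back to normal balancing if the pinned backend is down
)
# Attach the object to a backend set via its session_persistence_configuration field.
```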

Requests are directed to the compute resources according to a routing policy, such as round robin, least connections, or IP hash.
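To illustrate how a least-connections strategy behaves, the following self-contained Python sketch picks the healthy backend with the fewest active connections. It is conceptual only; the service is configured through a policy setting, not application code.

```python
# Conceptual sketch: least-connections selection among healthy backends.
backends = [
    {"name": "vm1", "healthy": True,  "active_connections": 12},
    {"name": "vm2", "healthy": True,  "active_connections": 4},
    {"name": "vm3", "healthy": False, "active_connections": 0},   # excluded by health checks
]

def pick_backend(candidates):
    healthy = [b for b in candidates if b["healthy"]]
    return min(healthy, key=lambda b: b["active_connections"])

print(pick_backend(backends)["name"])   # -> "vm2"
```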

Optionally, you can define routing policies based on HTTP headers or the URL to further direct requests to specific compute resources.
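URL-based routing can be sketched with a path route set that maps URL prefixes to backend sets. The paths and backend set names are placeholders, and the exact match-type values should be confirmed in the OCI Python SDK reference.

```python
import oci

models = oci.load_balancer.models

# Route /video requests to a dedicated backend set; everything else uses the
# listener's default backend set.
video_routes = models.PathRouteSetDetails(
    path_routes=[
        models.PathRoute(
            path="/video",
            path_match_type=models.PathMatchType(match_type="PREFIX_MATCH"),  # assumed value
            backend_set_name="video-backends",     # hypothetical backend set
        )
    ]
)
path_route_sets = {"video-routes": video_routes}
```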

Read the documentation

OCI Flexible Load Balancer diagram, description below

This image shows a logical layout of resources and connections in a typical flexible load balancer architecture.

An OCI region is shown. Inside the region are three separate availability domains numbered one, two, and three.

A virtual cloud network crosses and includes all three availability domains. The virtual cloud network also includes an internet gateway that is bidirectionally connected to the internet.

Within the first availability domain is the first subnet. It contains the primary flexible load balancer.

Within the second availability domain is the second subnet. It contains the failover load balancer.

Between the two availability domains and subnets is a listener with a public IP address. The listener is bidirectionally connected to the internet gateway. It can receive requests from the internet.

The listener is part of a group that includes both the primary and failover flexible load balancers. This group acts as a logical load balancer that continues operating even if the primary load balancer fails.

There are three more subnets, one in each availability domain. They are numbered three, four, and five. Each subnet has compute resources.

The primary load balancer is bidirectionally connected to subnets three, four, and five. The failover load balancer is bidirectionally connected to subnets three, four, and five.

Requests come in from the internet to the listener and are sent to the primary load balancer. The load balancer then routes the request to one of the subnets with resources based on weights or HTTP characteristics.

Product tour

Set up your flexible load balancer

Create a load balancer - Add details view

Choose a public or private load balancer

Pick whether the load balancer is public or private, what kind of IP address to use, and which subnet to use.

Select the minimum and maximum bandwidth between which you want the load balancer to scale.
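The same step can be sketched with the OCI Python SDK: the snippet below creates a public flexible load balancer with a minimum and maximum bandwidth. The OCIDs are placeholders, listeners and backend sets can be added afterward, and the call is asynchronous, so treat this as a starting point rather than a complete setup.

```python
import oci

config = oci.config.from_file()                        # reads ~/.oci/config by default
client = oci.load_balancer.LoadBalancerClient(config)
models = oci.load_balancer.models

details = models.CreateLoadBalancerDetails(
    compartment_id="ocid1.compartment.oc1..example",   # placeholder OCID
    display_name="demo-flexible-lb",
    is_private=False,                                   # public load balancer
    shape_name="flexible",
    shape_details=models.ShapeDetails(
        minimum_bandwidth_in_mbps=10,                   # scale-down floor
        maximum_bandwidth_in_mbps=100,                  # scale-up ceiling
    ),
    subnet_ids=["ocid1.subnet.oc1..example"],           # placeholder subnet OCID
    # Listeners and backend sets can be supplied here or added after creation.
)

response = client.create_load_balancer(details)
print(response.headers.get("opc-work-request-id"))      # provisioning runs asynchronously
```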

Create a load balancer - Choose backends view

Choose the distribution type

Select the type of distribution that the load balancer will use for incoming traffic.

You can also select the type of health check that will verify the condition of each back end.

Create a load balancer - Configure listener view

Configure the listener

Choose the type of traffic the listener will accept (HTTP, HTTPS, or TCP) and the port on which it will listen.

Create a load balancer - Manage logging view

Manage logging

Optionally enable error and access logs to help you monitor and troubleshoot the load balancer.

Reference architectures and solution playbooks

See all reference architectures

Deploy a highly available web application

This reference architecture shows a highly available web application running in OCI using load balancers.

Understand modern app deployment strategies with OCI DevOps

This reference architecture shows how to implement modern DevOps architecture using load balancers.

Implement a custom error page for a load balancer using cloud native services

This reference architecture uses native OCI monitoring and notification services to respond to load balancer threshold conditions, call Oracle Functions to evaluate the condition, and use redirect rules to forward custom error messages stored in OCI Object Storage.


Get started with Flexible Load Balancer


Oracle Cloud Free Tier

Build, test, and deploy applications on Oracle Cloud—for free. Sign up once, get access to two free offers.


Contact sales

Interested in learning more about Oracle Cloud Infrastructure? Let one of our experts help.