Tips for Creating Effective SLAs

You can’t manage what you can’t measure, which is why service level agreements (SLAs) are so vital in outsourcing relationships. Creating SLAs that help the bottom line requires focusing on business outcomes, adapting SLAs as business needs change, and making them as specific and quantitative as possible.

Because outsourcing buyers use service levels to measure the performance of a provider, the SLA is one of the buyer’s fundamental vendor-performance management tools. An SLA’s agreed-upon quantitative provider requirements establish the baseline performance levels and define the monetary credits or other remedies associated with a provider’s failure to meet those standards. The principal role of an SLA is to align buyer and provider objectives.
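To make that concrete, here is a minimal sketch in Python of how a quantitative requirement might translate into a service credit; the availability target, credit percentage, and step size are illustrative assumptions, not terms from any actual agreement.

# Hypothetical example: credit 5% of the monthly fee for every 0.1
# percentage points by which measured availability misses the target.
def service_credit(measured, target, monthly_fee, credit_pct=0.05, step=0.1):
    if measured >= target:
        return 0.0  # Target met: no remedy owed.
    shortfall = target - measured
    steps_missed = int(shortfall / step) + 1  # Partial steps count as full steps.
    return round(monthly_fee * credit_pct * steps_missed, 2)

# 99.2% measured against a 99.5% target on a $100,000 monthly fee -> 15000.0
print(service_credit(99.2, 99.5, 100_000))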

First Things First: Constructing a Relevant Service Level Portfolio

Developing an effective SLA starts with creating service levels that relate as closely as possible to the buyer’s key business imperatives. After determining those imperatives, the buyer can effectively construct individual service levels.

Here are a few tips on setting truly relevant service levels:

• Service levels should focus the provider on understanding and meeting the buyer’s desired business outcomes.

• Service level design should be driven by requirements, not data. Just because a parameter is easily measured doesn’t make it suitable as a service level, and relating provider performance to its impact on the buyer’s business is more relevant than focusing on commodity transaction counts.

• An ideal set of service levels should be both collectively exhaustive, meaning a provider cannot fail to meet expectations without missing at least one service level, and mutually exclusive, meaning no two service levels measure different aspects of the same underlying symptom.

• In general, eight to 10 service levels should be sufficient to align the overall goals of the buyer and the provider.
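As a rough sketch of what such a portfolio might look like in a buyer’s own tracking tooling, each entry below ties a measurable target to the business outcome it protects, and the portfolio is deliberately kept small. The service level names and targets are hypothetical placeholders, not recommendations.

from dataclasses import dataclass

@dataclass
class ServiceLevel:
    name: str              # Short reference name
    business_outcome: str  # The buyer outcome this metric protects
    target: float          # Required performance level
    unit: str              # Unit of measure

portfolio = [
    ServiceLevel("Order-to-cash cycle time", "Cash flow", 3.0, "days"),
    ServiceLevel("First-contact resolution", "Customer retention", 80.0, "%"),
    ServiceLevel("Critical incident resolution", "Service continuity", 4.0, "hours"),
    ServiceLevel("Invoice accuracy", "Billing integrity", 99.0, "%"),
]

# Guard against sprawl: roughly eight to ten service levels is the guideline.
assert len(portfolio) <= 10, "Too many service levels dilute focus on key outcomes"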

Keep the Structure Flexible

When initially designing an SLA framework, the buyer should keep the structure flexible enough to allow service levels to be swapped for alternative metrics as its needs change, especially in broad, complex, and/or long-term service relationships. These alternative metrics should include key performance indicators (KPIs): meaningful metrics that are not initially considered as critical as service levels. Each KPI has a performance target but no associated credit, although credits may be implemented for missing an aggregate number of KPIs.

Alternative metrics should also include reports: metrics the provider is required to report against to flag potential issues, but which carry no explicit performance targets or credits/remedies. The circumstances under which a buyer can exchange service levels and KPIs should be defined, such as a cap on the number of exchanges per year and the automatic promotion of a KPI to a service level if the provider fails to meet the prescribed performance a certain number of times within a specific period.
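The automatic-promotion rule lends itself to a simple illustration. The sketch below tracks monthly KPI results and flags the metric for promotion to a credit-bearing service level once the agreed number of misses is exceeded; the three-miss threshold and 12-month rolling window are assumptions chosen for illustration.

from collections import deque

class KpiTracker:
    def __init__(self, name, misses_allowed=3, window_months=12):
        self.name = name
        self.misses_allowed = misses_allowed
        self.window = deque(maxlen=window_months)  # Rolling monthly results
        self.promoted = False  # True once elevated to a service level

    def record_month(self, target_met):
        self.window.append(target_met)
        misses = sum(1 for met in self.window if not met)
        if misses > self.misses_allowed:
            self.promoted = True  # Elevate to a credit-bearing service level

kpi = KpiTracker("Backup completion rate")
for met in [True, False, True, False, False, True, False]:  # Four misses
    kpi.record_month(met)
print(kpi.promoted)  # True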

Building Individual Service Levels

Once the service level portfolio is defined, focus shifts to making individual service levels measurable and relevant. The effort expended to define and implement high-quality service levels will be repaid many times through the contract period.

Following are three key pointers on how buyers can properly construct individual service levels:

1. Take time to ensure the service levels are as unambiguous and quantitative as possible. While “user satisfaction” might seem to be the ultimate expression of a desired business outcome, the practical details of measuring it typically render it meaningless as written. With sufficient effort, the buyer can generally define what “user satisfaction” really means and convert that into measurable parameters, such as responsiveness or availability.

2. Do not be bound by convention in making service levels closely reflect the user experience across the entire scope of the provider’s responsibilities. For example, SaaS vendors (to the extent that they offer SLAs) will typically measure availability, response time, etc., from within their data centers. This methodology excludes the performance of the provider’s Internet connection, routers, and security/VPN termination infrastructure.

3. To minimize ambiguity, include the following components in each service level definition:

• Short name reference (for manageability and ease of reference)

• Full definition (description of the measurement, points of demarcation)

• Measurement parameters (data sources and data fields used, frequency of measurement)

• Calculation (calculation frequency, averaging approach used, preferably in the form of a formula)

• Required performance from the provider

• Specific exceptions/exclusions (beyond broad exclusions dealt with in the master agreement, such as Force Majeure), with performance excused only to the extent affected by, and only for the duration of, the exception circumstance
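One way to keep these components from being overlooked is to capture them in a structured template. The sketch below is one possible representation; the field values, including the availability formula, are hypothetical examples rather than recommended contract language.

from dataclasses import dataclass, field

@dataclass
class ServiceLevelDefinition:
    short_name: str          # Short name reference
    definition: str          # What is measured and the points of demarcation
    data_sources: list       # Measurement parameters: where the data comes from
    measurement_frequency: str
    calculation: str         # Stated as an explicit formula to avoid ambiguity
    required_performance: float
    exclusions: list = field(default_factory=list)

availability_sl = ServiceLevelDefinition(
    short_name="APP-AVAIL",
    definition="End-to-end application availability measured from the user's network edge",
    data_sources=["synthetic transaction monitor", "incident management system"],
    measurement_frequency="5-minute polling intervals",
    calculation="(scheduled minutes - unscheduled downtime) / scheduled minutes * 100",
    required_performance=99.5,
    exclusions=["agreed maintenance windows", "Force Majeure per the master agreement"],
)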

In BPO agreements where services are often labor-based, service levels relating to the timeliness (productivity) or quality (data entry accuracy) of execution provide meaningful measures of delivery performance.
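For instance, a minimal sketch of such measures, assuming a simple per-transaction record of turnaround time and entry errors (the record structure here is hypothetical), might look like this:

def timeliness(transactions):
    # Percentage of transactions completed within the agreed turnaround time.
    on_time = sum(1 for t in transactions if t["completed_hours"] <= t["agreed_hours"])
    return on_time / len(transactions) * 100

def accuracy(transactions):
    # Percentage of transactions entered without a data-entry error.
    correct = sum(1 for t in transactions if not t["entry_errors"])
    return correct / len(transactions) * 100

sample = [
    {"completed_hours": 20, "agreed_hours": 24, "entry_errors": 0},
    {"completed_hours": 30, "agreed_hours": 24, "entry_errors": 1},
    {"completed_hours": 22, "agreed_hours": 24, "entry_errors": 0},
    {"completed_hours": 18, "agreed_hours": 24, "entry_errors": 0},
]
print(timeliness(sample), accuracy(sample))  # 75.0 75.0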

Source: Global Delivery Report
