
Service Levels: The Devil is in the Details


Service Level Agreements (SLAs)

1. Overview

Service level agreements (SLAs) set forth mutually agreed performance metrics to measure the licensor’s performance. In a software transaction, they usually measure whether the software meets certain mutually agreed performance standards. They incentivize the licensor to meet the standards by providing the customer financial credits for the licensor’s failure to do so. Service levels should not be attached to every aspect of the licensor’s performance but rather to those aspects of the licensor’s performance that have the greatest impact on the licensee.

From the customer’s perspective, service levels should measure important aspects of the vendor’s performance that will determine if the vendor has successfully met its contractual obligations. The amount of the service level credit should be large enough to influence the vendor’s actions, and such credits should not be declared to be liquidated damages and the customer’s sole and exclusive remedy. The remedy for chronic service level failures should not be service credits but rather termination of the agreement for cause.

SLAs set forth the Service Level Credit (SLC) that the licensee is eligible to receive and the Service Level Bonus (SLB) that the licensor is eligible to receive for performing at or above a set level. These credits are usually more common in outsourcing transactions, Internet service provider (ISP) agreements, and application service provider (ASP) agreements than in general software license agreements. Common metrics covered by SLAs are application availability (downtime limits), response time, refresh rates, maintenance, help desk response, network availability, business/operational processes, and mean time to repair (MTTR). The nature of the applicable SLAs will depend on the type of transaction.

From the customer’s perspective, the principal purpose of SLAs is to ensure that the customer receives the same level of services that it contracted for or received prior to the vendor providing the services in question. Often, however, the customer lacks the necessary information to quantitatively document the prior level of services it received. To the extent the customer can quantify prior performance levels, the data is often not detailed enough to create meaningful benchmarks. As such, customers often agree to SLAs without a full understanding of the performance standards the vendor should meet or of the metrics that should be measured.

The SLA should set forth each party’s obligations, such as notifying the other party of its noncompliance, corrective actions and response obligations. Further, the SLA should clearly state the measurement methodologies, such as daily, monthly and yearly calculations, as well as the type of credit. Most licensors will agree only to a credit against future services versus a cash payment to the customer. The SLA needs to be carefully drafted to address any factors outside the licensor’s control, as the licensor’s performance could be affected by a number of factors, such as hardware and collateral third-party software.

In defining its obligations, the licensor should exclude from the calculation of any time-sensitive service level obligations: third-party problems such as hardware, telecommunications, and infrastructure links; routine maintenance; emergency maintenance; and the like. Further, the licensor should clearly set forth any requirement or obligation of the customer on which its obligations are premised, for example, a certain hardware configuration.

From the licensor’s perspective, the payment of service level credits should be in full satisfaction of any liability on the licensor’s behalf for the failure to meet the stated metrics. From the licensee’s perspective, service credits are not damages and are not intended to compensate the licensee for any harm it has incurred. Further, the service level credits should not be the licensee’s exclusive remedy. In addition, the customer should insist that once the service level credits reach a certain level, the customer may terminate the agreement.

2. Limitation of Liability

A common issue is whether service level credits count toward the limit of liability contained in the agreement or are in addition to any limit of liability. While the licensor seeks to limit its liability, the customer has a strong argument that service level credits represent liquidated damages for failure to meet short-term SLAs and that the overall limit of liability addresses the failure of the project as a whole. Further, “credits” are not cash. The limit of liability goes to the amount of money a licensee can recover from the licensor in the event of a breach. As such, the customer should insist that any service credits not count toward any cap.

Many licensees believe that there should not be separate caps on the different types of damages set forth throughout the agreement. This belief is founded on the reasoning that the parties have negotiated a mutually agreeable risk allocation and a cap on direct damages. There should not be separate caps for specific types of failures. By having many “sub” limitations, the parties would be forced to spend a significant amount of time negotiating separate caps for the likely failures of both parties.

The licensor should also insist that the SLAs contain a recapture provision, allowing the licensor to recover credits paid to the customer if the licensor performs at a level higher than contractually required either during the period in question or over the term of the contract.

3. The Law of Small Numbers

Licensors should ensure that the service level agreement provides that service level failures resulting from violation of the “Law of Small Numbers” will not trigger a performance credit. The Law of Small Numbers provides that if the measurement of a service level is based on the calculation of a percentage of successful events across a total number of events, a sufficient number of total events must be recorded to permit the potential achievement of the service level with a performance of less than 100 percent (i.e., perfection). For example, if an objective for a service level is 95 percent, there must be at least 20 measured events for the licensor to potentially perform at less than perfection (in this instance, meeting the service level for 19 events and only missing 1) and still achieve the service level. In short, there must be a meaningful number of events to provide a valid representation of the vendor’s performance, as a small sample can unfairly skew the numbers and unfairly penalize the vendor.
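The arithmetic behind the 95 percent/20-event example generalizes to any target percentage. The short sketch below illustrates it; the function name and the use of exact fractions are my own, not anything from the text.

```python
import math
from fractions import Fraction

def min_events(target_pct) -> int:
    """Smallest event count n such that missing a single event
    (i.e., achieving (n - 1)/n) still meets the target percentage."""
    t = Fraction(str(target_pct)) / 100  # exact decimal arithmetic
    # (n - 1)/n >= t  is equivalent to  n >= 1 / (1 - t)
    return math.ceil(Fraction(1) / (1 - t))

print(min_events(95))    # 20 events: 19 of 20 = exactly 95%
print(min_events(99.9))  # 1,000 events
```

With fewer events than this threshold, any single miss forces the measured percentage below the target, which is exactly the unfairness the Law of Small Numbers is meant to avoid.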

From the licensor’s perspective, the failure of the parties to agree on whether the Law of Small Numbers should apply to a service level failure within 30 days should form the basis of a “dispute.” If the Law of Small Numbers applies to a service level failure, the results and reporting should be carried over to subsequent month(s) until a sufficient number of total events are recorded in one or more subsequent months to not have the Law of Small Numbers apply to such service level. The achievement of the service level should then be determined for the final such subsequent month (and not for prior months) as though the total number of events all occurred during such month.

From the licensor’s perspective, the credits should not be cumulative. The licensor can avoid this problem by providing that any failure or success will be determined in the month in which the total number of events exceeds the threshold. Therefore, if a credit accrues it will only be calculated as if the credit occurred in that month and is not cumulative for all months.
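The carry-over mechanics described in the two paragraphs above can be sketched as follows. This is a hypothetical illustration, not contract language; the function name, the 20-event threshold, and the sample monthly data are all assumptions.

```python
def evaluate_with_carryover(months, threshold, target_pct):
    """months: list of (events, misses) tuples, one per month.
    Events are pooled across months until the threshold is reached;
    the service level is then evaluated once, in that final month,
    as though all pooled events occurred then.
    Returns (month_index, met_service_level) for each evaluation."""
    results = []
    pooled_events = pooled_misses = 0
    for i, (events, misses) in enumerate(months):
        pooled_events += events
        pooled_misses += misses
        if pooled_events >= threshold:          # enough data to measure
            successes = pooled_events - pooled_misses
            met = 100 * successes / pooled_events >= target_pct
            results.append((i, met))
            pooled_events = pooled_misses = 0   # reset; not cumulative

    return results

# Three months of 8, 7, and 9 events: evaluation happens only in the
# third month, when the pooled total (24 events, 1 miss) reaches 20.
print(evaluate_with_carryover([(8, 0), (7, 1), (9, 0)], 20, 95))
```

Resetting the pool after each evaluation reflects the non-cumulative treatment the licensor would want: a credit, if one accrues, is calculated only in the month in which the threshold is crossed.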

4. Negotiating SLAs

As the old adage suggests: The devil is in the details. So goes the creation of SLAs. The parties must focus not only on the particular metric of an SLA but also on how the SLA is defined and how it will be calculated. For example, how often will “availability” be measured? Every 10 minutes, hourly, daily, monthly, yearly? Who is responsible for tracking downtime? As strange as it may seem, some licensors seek to have the licensee track availability/downtime and have the licensee notify the licensor of any downtime. As to availability, how is it calculated? Scheduled maintenance is usually excluded, but what about unscheduled maintenance, an event of force majeure, or third-party action or inaction?
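To see why the definition matters as much as the metric, consider one common way availability might be calculated once exclusions are agreed: excluded time (such as scheduled maintenance) is removed from the measurement period before the percentage is computed. The sketch below is illustrative only; the figures and the decision to exclude scheduled maintenance from the denominator are assumptions, and the contract language controls.

```python
def availability_pct(period_min, downtime_min, excluded_min):
    """Availability over a measurement period, in percent, with
    excluded time (e.g., scheduled maintenance) removed from the
    measured period before computing the ratio."""
    measured = period_min - excluded_min
    return 100 * (measured - downtime_min) / measured

# A 30-day month (43,200 minutes) with 240 minutes of scheduled
# maintenance excluded and 90 minutes of unscheduled downtime:
print(availability_pct(43_200, 90, 240))
```

Note that whether the 240 excluded minutes come out of the denominator, the numerator, or both can move the result across a contractual threshold, which is precisely why the calculation method must be spelled out.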

It is also important to understand the role of the “9s.” How many 9s does the licensee require the licensor to meet? For example, 99 percent availability means the system may be down seven hours a month and 87 hours a year. See Stross, 99.999 Reliable? Don’t Hold Your Breath, N.Y. Times, Jan. 9, 2011, Bus. Section, p. 3.

The table below indicates “availability” after the inclusion of an additional digit.

Availability/9s - Annual Downtime

99.999% - 5.26 minutes

99.99% - 52.56 minutes

99.9% - 8.76 hours

99.5% - 1.83 days

99% - 3.65 days

For a detailed discussion of service level agreements, see Deckelman, Negotiating Effective Service Level Agreements (SLAs), http://www.outsourcing-center.com/1997-11-negotiating-effective-service-level-agreements-slas-article-38377.html (last visited October 22, 2015).

