Why Customer Satisfaction and Service Review Are Foundational
ISO 20000 is ultimately about delivering value to customers. Service review and satisfaction measurement practices close the feedback loop between what the SMS produces and what customers actually experience. Without this feedback, the SMS can achieve its internal objectives while failing its customers—SLAs met on paper but customer experience poor, incident metrics green but customer complaints rising. The service review and satisfaction measurement requirement in Clause 8.3.1 ensures the SMS remains customer-focused.
ISO 20000's Requirements
Clause 8.3.1 requires the organization to establish and maintain customer relationships; conduct service reviews; measure customer satisfaction; handle customer feedback and complaints. These are all required, not optional recommendations. An SMS without service reviews and satisfaction measurement is non-compliant. During Stage 2 audit, auditors will ask to see evidence of service reviews (documented meeting minutes) and satisfaction measurement (survey results, trend data).
Service Review Meetings: Formal Structure
The service review is a formal periodic meeting between the service provider and the customer. Minimum frequency: quarterly for most customers; monthly for strategic accounts or where performance is challenged. Required agenda items:

- SLA performance review: actual performance against each SLO, pass or fail, trend over the last 3 months
- Incident summary: volume, major incidents, average resolution time, top incident categories
- Change activity: changes affecting the customer, success rate
- Problem status: open problems impacting the customer, root cause, workaround status, expected fix date
- Service improvement activities initiated in response to previous reviews
- Customer concerns and requests
- Action items from the previous review: status of closure

The agenda should be circulated in advance; both parties should come prepared.
Service Review Meeting Outputs
Meeting minutes are required documented evidence. Minutes must include: attendees (names and titles from both sides), date, SLA performance data reviewed, incident trends discussed, change activity summary, action items with a specific owner and due date, and customer acknowledgment of SLA performance. The acknowledgment is critical: the customer is not necessarily agreeing with the service provider's performance, but is acknowledging that the metrics have been presented and discussed. Minutes must be distributed within 3 working days and maintained in the document management system.
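The required minutes fields above lend themselves to a completeness check before filing. The following is a minimal sketch, assuming hypothetical record types and field names (`ReviewMinutes`, `ActionItem` are illustrative, not mandated by ISO 20000):

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record types for illustration; field names are assumptions,
# not terminology from the standard.
@dataclass
class ActionItem:
    description: str
    owner: str        # a named individual, not a team
    due_date: date

@dataclass
class ReviewMinutes:
    meeting_date: date
    attendees: list[str]            # names and titles from both sides
    sla_data_reviewed: bool
    customer_acknowledged: bool     # metrics presented and discussed
    action_items: list[ActionItem] = field(default_factory=list)

def minutes_complete(m: ReviewMinutes) -> list[str]:
    """Return a list of missing evidence items (empty list = audit-ready)."""
    gaps = []
    if not m.attendees:
        gaps.append("attendees missing")
    if not m.sla_data_reviewed:
        gaps.append("SLA performance data not recorded")
    if not m.customer_acknowledged:
        gaps.append("customer acknowledgment missing")
    for item in m.action_items:
        if not item.owner or not item.due_date:
            gaps.append(f"action item lacks owner/due date: {item.description}")
    return gaps
```

A check like this could run when minutes are uploaded to the document management system, flagging incomplete records before the 3-working-day distribution deadline.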
| KEY CONCEPT | Customer satisfaction measurement is not about producing a favorable score. It is about identifying where the SMS is and is not delivering value from the customer's perspective. Acting on feedback is what demonstrates genuine customer focus. |
Customer Satisfaction Measurement
What to measure: overall satisfaction with the service, satisfaction with individual service dimensions (availability, support quality, communication, value for money, responsiveness to issues). Measurement frequency: minimum annually; quarterly for active accounts; monthly post-incident surveys for real-time feedback. Survey design principles: concise (5–10 questions), consistent (same questions each period for trend tracking), actionable (questions that identify specific improvement areas, not vague). Example: "How satisfied are you with incident resolution speed?" is better than "Are you happy?" because it guides improvement (if resolution speed rating is low, investigation can focus on bottlenecks in the incident process).
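The design principles above (concise, consistent, actionable) can be captured as a fixed, dimension-tagged question set. This is a minimal sketch; the question wording and dimension keys are illustrative assumptions:

```python
# A dimension-tagged question set (wording is illustrative).
# Keeping the same keys and questions every period is what makes
# trend tracking possible; each key maps to an improvable dimension.
SURVEY_QUESTIONS = {
    "overall":        "Overall, how satisfied are you with the service? (1-5)",
    "availability":   "How satisfied are you with service availability? (1-5)",
    "support":        "How satisfied are you with support quality? (1-5)",
    "communication":  "How satisfied are you with our communication? (1-5)",
    "responsiveness": "How satisfied are you with incident resolution speed? (1-5)",
    "value":          "How satisfied are you with value for money? (1-5)",
}

# Concise by design: the principles above call for 5-10 questions.
assert 5 <= len(SURVEY_QUESTIONS) <= 10
```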
Survey Methods
Multiple methods: online survey tools (distributed after service review or quarterly to customer contacts), post-incident satisfaction mini-surveys (single question automatic survey at incident closure—"Was this issue fully resolved?"), formal annual/quarterly survey (comprehensive survey covering all service dimensions), and customer interviews for in-depth qualitative feedback (especially valuable for strategic customers). NPS (Net Promoter Score—"How likely are you to recommend this service to others?") is a longitudinal relationship health metric; tracking NPS over time shows whether the customer relationship is strengthening or deteriorating.
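NPS has a standard calculation: respondents scoring 9-10 are promoters, 0-6 are detractors, 7-8 are passives, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score from 0-10 responses:
    % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count only in the denominator."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Example: 3 promoters, 1 passive, 1 detractor out of 5 responses
nps([10, 9, 8, 6, 10])  # -> 40.0
```

Because passives dilute the score without adding to it, tracking the same customer contacts over successive periods gives the longitudinal relationship-health signal described above.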
Analyzing and Acting on Satisfaction Data
Aggregating results by service and by customer identifies which services or customer segments are underperforming. Trend analysis shows whether satisfaction is improving, stable, or declining. Low-scoring dimensions (e.g., if communication satisfaction is consistently 2.5 / 5) are targets for improvement. Connect satisfaction results to the improvement register: if customers rate communication poorly, create an improvement to enhance communication frequency or clarity. Share aggregate satisfaction data at management review; this data should drive improvement prioritization. Respond to individual customer survey responses where issues are raised; if a customer gives a low score with a comment, the service manager should contact them to understand and address the concern.
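The aggregation and trend analysis described above can be sketched as follows. The row layout and example data are hypothetical assumptions for illustration:

```python
from statistics import mean

# Hypothetical survey rows: (period, service, dimension, score on 1-5).
responses = [
    ("2024Q1", "email", "communication", 2),
    ("2024Q1", "email", "availability", 4),
    ("2024Q2", "email", "communication", 3),
    ("2024Q2", "email", "availability", 4),
]

def dimension_trend(rows, service, dimension):
    """Average score per period for one service dimension, in period order."""
    by_period = {}
    for period, svc, dim, score in rows:
        if svc == service and dim == dimension:
            by_period.setdefault(period, []).append(score)
    return {p: mean(v) for p, v in sorted(by_period.items())}

# A dimension that stays below a threshold (e.g. communication under 3.0)
# is a candidate entry for the improvement register.
dimension_trend(responses, "email", "communication")  # -> {"2024Q1": 2, "2024Q2": 3}
```

The same grouping applied per customer instead of per service identifies underperforming customer segments for the management review.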
Customer Complaint Management
Clause 8.3.1 explicitly requires complaint handling. Process: complaint receipt (from any channel—email, phone, service review meeting), acknowledgment to the customer within 2 working days, investigation of the complaint (what is the complaint specifically, what caused it?), resolution (what will the SMS do to address it?), and closure (customer acknowledges resolution). Complaint records are documented information; complaint root causes become problem management triggers; complaint trend analysis at management review (if three customers complain about the same issue, it is a systemic problem).
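The 2-working-day acknowledgment deadline above is a working-day calculation, not a calendar one. A minimal sketch (weekends skipped; public holidays deliberately ignored here, a real implementation would consult a holiday calendar):

```python
from datetime import date, timedelta

def acknowledgment_due(received: date, working_days: int = 2) -> date:
    """Latest acknowledgment date: N working days after receipt.
    Weekends are skipped; public holidays are ignored in this sketch."""
    d = received
    remaining = working_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday
            remaining -= 1
    return d

# A complaint received on Friday 2024-03-01 must be acknowledged
# by Tuesday 2024-03-05 (Saturday and Sunday do not count).
acknowledgment_due(date(2024, 3, 1))  # -> date(2024, 3, 5)
```

Tracking this due date per complaint record makes the acknowledgment commitment measurable, and missed acknowledgments become their own trend line at management review.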
Connecting Satisfaction to the Broader SMS
Low satisfaction scores in specific areas should trigger improvement actions tracked in the improvement register. Repeated complaint themes should generate problem records. SLA breach patterns identified in service reviews should trigger SLA review and service improvement planning. These connections ensure that the customer's voice drives the SMS, not just internal metrics.
| IMPORTANT | Auditors at Stage 2 will ask to see evidence of customer satisfaction measurement and will review service review meeting minutes. If neither exists, this is a nonconformity under Clause 8.3.1. The absence of formal service reviews is one of the more common findings in IT organizations undergoing ISO 20000 assessment. |
Service Review Content: Practical Examples
Example from an email service review: SLA performance showed 99.8% availability (target 99.9%, a breach of 0.1 percentage points), P1 incidents (unplanned outages) averaged 2 per month, major changes implemented totaled 8 (7 successful, 1 rolled back), the problem database shows one open problem related to intermittent authentication delays with an estimated fix in 3 weeks, and the improvement action from the previous review to upgrade the email infrastructure is on schedule for Q2. The customer acknowledges the data, raises a concern about authentication delays affecting 50 users, and requests weekly status updates on the fix. Action items: service manager to provide weekly fix status updates, infrastructure team to prioritize the authentication fix, service provider to review whether the SLA target should be 99.95% given current infrastructure.
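Converting availability percentages into downtime makes the 0.1-point breach in the example concrete. A minimal sketch, assuming a 30-day month for round numbers:

```python
# Downtime budget implied by an availability percentage,
# assuming a 30-day month (43,200 minutes).
MONTH_MINUTES = 30 * 24 * 60  # 43,200

def downtime_minutes(availability_pct: float) -> float:
    """Minutes of downtime in a 30-day month at the given availability."""
    return MONTH_MINUTES * (100.0 - availability_pct) / 100.0

downtime_minutes(99.9)   # target: ~43.2 minutes allowed per month
downtime_minutes(99.8)   # actual: ~86.4 minutes, double the budget
downtime_minutes(99.95)  # proposed 99.95% target: ~21.6 minutes
```

This framing helps both sides of the review: the 0.1-point breach is roughly 43 extra minutes of monthly downtime, and tightening the target to 99.95% would halve the current downtime budget, which is why the action item asks whether the infrastructure can support it.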
| BITLION INSIGHT | Bitlion GRC provides customer satisfaction survey management with pre-built survey templates, post-incident automation, SLA performance dashboard integration, and service review agenda/minutes templates with customer signature capability. |