IT Support Performance Measurement in Nigeria
Your company pays ₦350,000 monthly for IT support. Every month, your vendor sends a report showing 95% of tickets closed on time, response times within SLA, everything looking great.
But your finance manager’s laptop keeps freezing despite three “fixes.” Your CRM goes down every Monday. When you raise concerns, your vendor points to those closed ticket numbers and says everything is fine.
Here’s the problem: You don’t know if you’re getting what you paid for. You have clear SLAs (Part 1) and protective contracts (Part 2). But you still don’t have a clear picture of how the vendor is performing.
Most Nigerian businesses track one metric: “Are tickets closed?” This is one of the most common mistakes Nigerian small business owners make: closure counts say nothing about quality, user satisfaction, or cost efficiency. By the time owners realize they’ve been paying for poor performance, they’ve wasted months and money.
Managing IT support performance in Nigeria requires more than accepting vendor reports at face value. This article shows you what to measure, practical tracking tools, how to build a vendor scorecard, and when to escalate.
Legal Disclaimer: This article provides practical guidance. It’s not a substitute for professional advice tailored to your situation.
Beyond SLA Compliance: What Matters
Your vendor meets SLA targets. 95% of tickets answered within 30 minutes, 90% resolved within 4 hours. But users are still unhappy. Why? Because SLAs measure speed, not quality.
Here’s what SLAs don’t tell you: Was the problem fixed or just marked “resolved”? Will the same issue happen again next week? Was the technician helpful? Did the vendor proactively prevent the issue? Is the cost reasonable?
Quality Metrics: Are Problems Actually Fixed?
First-Call Resolution (FCR) Rate measures the percentage of issues resolved without reopening or escalation. Target 70-80%. A score below 60% means you’re paying for the same work multiple times.
A finance company in Lagos had 85% SLA compliance but 45% FCR. Users called back constantly. The vendor “fixed” the same email sync problem three times before finding the root cause.
Customer Satisfaction Score (CSAT): After each ticket, ask “How satisfied were you?” on a 1-5 scale. Target 4.0+. Below 3.5 indicates deeper problems.
Many Nigerian SMEs don’t conduct user surveys at all. They rely on complaints reaching management, which only captures the worst cases. You’re missing the broader dissatisfaction that builds up until people stop reporting issues and just work around them.
Mean Time to Resolution (MTTR) measures the time to actual fixes, not ticket closures. A ticket “resolved” in four hours that reopens twice? Your real MTTR is three days, not four hours.
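If your vendor’s reports only show time-to-closure, you can recompute real MTTR yourself from each incident’s first-open and final-close timestamps. A minimal sketch (the timestamp format and figures below are illustrative assumptions, not taken from any particular ticketing tool):

```python
# Sketch: "real" MTTR measures first open to final close, so a ticket
# reopened twice counts as one incident that stayed open the whole time.
from datetime import datetime

def real_mttr_hours(incidents: list[tuple[str, str]]) -> float:
    """incidents: (first_opened, finally_closed) timestamps per incident."""
    fmt = "%Y-%m-%d %H:%M"
    total_seconds = sum(
        (datetime.strptime(closed, fmt) - datetime.strptime(opened, fmt)).total_seconds()
        for opened, closed in incidents
    )
    return total_seconds / 3600 / len(incidents)

# One ticket "resolved" in four hours but reopened twice, finally fixed
# three days after it was first raised: real MTTR is 72 hours, not 4.
incidents = [("2025-03-03 09:00", "2025-03-06 09:00")]
print(real_mttr_hours(incidents))  # 72.0
```

The point is not the script itself but the definition: measure from the moment the problem first appeared to the moment it stopped recurring.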
Proactive Work: Prevention vs. Firefighting
Proactive versus Reactive Work Ratio: What percentage of vendor time goes to preventing problems versus fixing them? Mature IT support should be 30-40% proactive, including monitoring, patching, and preventive maintenance. This aligns with ITIL service management best practices. If your vendor is 100% reactive, you’re getting break-fix support at managed services prices.
Response Quality: Beyond Speed
Did the technician communicate clearly? Document the solution? Provide user training to prevent recurrence? Many vendors close tickets fast with band-aid fixes. Three months later, nobody remembers how it was solved.
Cost Efficiency: Value Per Naira
Cost Metrics: Cost per ticket (monthly fee ÷ tickets closed), cost per user (monthly fee ÷ users supported), and cost trend over time.
The key point: You need a balanced scorecard. Speed (SLAs) plus quality (FCR, CSAT) plus cost efficiency. Your SLAs from Part 1 set the speed baseline. These metrics tell you if that speed delivers value.
Tracking Tools for Nigerian Companies
Let’s be realistic about what works at different budget levels.
For Small Businesses (10-30 Users, ₦150K-300K Monthly)
Excel or Google Sheets are free and work offline during power outages. Create a monthly scorecard tab. Track 5-8 key metrics. Update weekly. Takes 15 minutes.
Free-tier tools like Zoho Desk Free or Freshdesk Free work if your vendor will use them.
WhatsApp analysis: If communication happens via WhatsApp (common in Nigeria), export chats monthly. Count requests, note patterns.
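As an illustration of that monthly WhatsApp tally, a few lines of Python can count requests per sender in an exported chat. This is purely a sketch: WhatsApp’s exported timestamp format varies by phone locale, so the pattern below is an assumption you will likely need to adjust, and the sample messages are invented.

```python
# Sketch: count support requests per sender in a WhatsApp chat export,
# excluding the vendor's own replies. Export format assumed (locale-dependent):
# "DD/MM/YYYY, HH:MM - Name: message"
import re
from collections import Counter

LINE = re.compile(r"^\d{2}/\d{2}/\d{4}, \d{2}:\d{2} - (?P<sender>[^:]+): (?P<msg>.*)$")

def count_requests(chat_text: str, vendor_names: set[str]) -> Counter:
    """Tally messages per sender, skipping anyone in vendor_names."""
    senders = Counter()
    for line in chat_text.splitlines():
        m = LINE.match(line)
        if m and m.group("sender") not in vendor_names:
            senders[m.group("sender")] += 1
    return senders

sample = (
    "03/03/2025, 09:14 - Ada: printer offline again\n"
    "03/03/2025, 09:20 - Vendor: on it\n"
    "03/03/2025, 11:02 - Ada: CRM is down\n"
)
print(count_requests(sample, {"Vendor"}))  # Counter({'Ada': 2})
```

Even a rough count like this surfaces patterns, such as the same user or the same device generating requests week after week.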
For Medium Businesses (30-100 Users, ₦300K-1M Monthly)
Affordable cloud tools like Zoho Desk (₦3,000 – ₦8,000 per agent monthly) or Freshdesk become viable options. Insist your vendor uses your tool or gives you read access to theirs.
Microsoft 365 integration: Microsoft Forms for surveys, Power BI for dashboards, and SharePoint for ticket tracking.
For Larger Companies (100+ Users, ₦1M+ Monthly)
Enterprise platforms like ServiceNow or Jira Service Management are justified at this scale.
The Power Question
Cloud tools fail when power or internet goes down. Have an offline backup: Excel templates, printed scorecards, SMS logs. This isn’t paranoia; it’s planning.
Decision framework:
- Under ₦300K: Excel and manual tracking
- ₦300K-1M: Affordable cloud tool
- ₦1M+: Enterprise platform
Don’t let vendors say “our system doesn’t export reports.” Your contract should require data access (Part 2). With the Nigeria Data Protection Act in effect, you have legal rights to your own data.
Measuring IT Support Performance in Nigeria: Building Your Scorecard
Rather than tracking dozens of metrics, build a simple scorecard covering speed, quality, cost, and communication.
Core principle: If you can’t review it in 15 minutes, you won’t review it consistently.
Vendor Performance Scorecard Template
| Metric Category | Specific Metric | Target | This Month | Last Month | Trend | Weight |
|---|---|---|---|---|---|---|
| Speed (SLAs) | Response Time (% met) | 95% | | | | 15% |
| | Resolution Time (% met) | 90% | | | | 15% |
| Quality | First-Call Resolution Rate | 75% | | | | 20% |
| | Customer Satisfaction (1-5) | 4.0+ | | | | 20% |
| | Ticket Reopen Rate | <10% | | | | 10% |
| Cost | Cost per Ticket | ₦X | | | | 10% |
| | Cost per User | ₦X | | | | 5% |
| Communication | Monthly Report On Time | Yes | | | | 5% |
| OVERALL SCORE | | 85%+ | XX% | XX% | ↑/↓ | 100% |
How to Use This Scorecard
Quality metrics (FCR and CSAT) receive the highest weight (40% combined) because they predict satisfaction and prevent unnecessary rework. Speed (SLAs) gets 30% – important, but not everything. Cost gets 15%. Communication is set at 5% as the expected baseline.
Adjust weights for your priorities. If cost is a critical constraint, increase cost weighting. If you’re in a high-uptime industry (e.g., healthcare, finance), increase the SLA weighting.
Scoring: Hit the target = 100%. Within 5 percentage points of target = 80%. Within 10 points = 60%. More than 10 points off = 0%.
Example: Target FCR is 75%, actual is 71%. Score: 80% (within 5 points). Weighted contribution: 80% × 20% = 16 points toward the overall score.
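The arithmetic is easy enough in a spreadsheet, but here is the same scoring rule as a short Python sketch. The band thresholds, weights, and FCR figures are the ones from this article; the `higher_is_better` flag is an addition for metrics where lower is better, such as reopen rate and cost.

```python
# Sketch of the scorecard maths: band each metric, then sum band x weight.

def band_score(target: float, actual: float, higher_is_better: bool = True) -> float:
    """Hit target = 100; within 5 points = 80; within 10 = 60; else 0."""
    gap = (target - actual) if higher_is_better else (actual - target)
    if gap <= 0:
        return 100.0
    if gap <= 5:
        return 80.0
    if gap <= 10:
        return 60.0
    return 0.0

def overall_score(metrics: list[dict]) -> float:
    """Weighted sum of banded scores; weights should add up to 1.0."""
    return sum(
        band_score(m["target"], m["actual"], m.get("higher_is_better", True)) * m["weight"]
        for m in metrics
    )

# The FCR example from the text: target 75%, actual 71%.
print(band_score(75, 71))         # 80.0 (within 5 points)
print(band_score(75, 71) * 0.20)  # 16.0 points toward the overall score
```

Summing band score × weight across all eight metrics gives the overall score in the bottom row of the scorecard.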
The trend column matters more than point-in-time scores. A vendor at 82% trending up beats one at 88% trending down. Use arrows: ↑ (improving over last 3 months), → (stable), ↓ (declining).
Setting targets: Start with SLA targets from your contract. For others, use FCR 70-80%, CSAT 4.0/5.0, and cost per ticket calculated from your baseline (current monthly fee ÷ monthly tickets), targeting a 10% reduction year-over-year.
Update: Monthly minimum. Weekly for the first 90 days with the new vendor.
Running Effective Monthly Vendor Reviews
Typical scenario: 30-minute call. Vendor presents great metrics. You mention complaints. Vendor promises to “look into it.” Nothing documented. This isn’t vendor management; it’s vendor theater.
Monthly Vendor Review Meeting Agenda
| Agenda Item | Time | Who Leads | Required Data |
|---|---|---|---|
| Scorecard Review | 10 min | Client | Completed scorecard with all metrics |
| Trend Analysis | 10 min | Vendor | 3-month trend charts for key metrics |
| Deep Dive: Specific Issues | 15 min | Both | Top 3 recurring issues or user complaints |
| Root Cause Discussion | 10 min | Vendor | Analysis of why issues occurred |
| Improvement Actions | 10 min | Both | Specific commitments with deadlines |
| Cost Review | 5 min | Client | Invoice reconciliation versus contract |
| Next Month Preview | 5 min | Vendor | Planned maintenance, changes, projects |
| Action Items Review | 5 min | Client | Document all commitments and owners |
| TOTAL | 70 min | | |
You don’t need the full 70 minutes every month if your vendor performs well. Scale it down when performance is steady, maybe 30-40 minutes. The key is consistency.
Before the meeting: Send your scorecard 48 hours ahead so the vendor can prepare responses to red metrics. Collect user feedback through formal surveys or informal pulse checks. Review last month’s action items – what was completed versus what wasn’t.
During the meeting: Start with data, not opinions. “CSAT dropped from 4.2 to 3.7. What happened?” is better than “Users seem unhappy.” Focus on trends, not incidents. One bad week isn’t a pattern. Three months of declining FCR is a problem.
Seek root causes, not blame. “Why are we seeing a 25% ticket reopen rate?” is a productive question. Maybe they lack training. Maybe your internal processes create confusion. Find out together.
Document everything: Someone takes notes. Action items get recorded with specific actions (not “improve communication” but “implement daily status emails for P1 tickets”), owners (person’s name, not “vendor team”), deadlines (not “soon” but “by March 15”), and success criteria (how will we know it’s done?).
After the meeting: Email summary within 24 hours. Add action items to next month’s agenda.
Red flag: Vendor consistently cancels or reschedules reviews.
Frequency: New vendor gets weekly for month 1, bi-weekly for months 2-3, then monthly. Your contract (Part 2) should specify this requirement.
Reading the Warning Signs
Not every performance dip means your vendor is failing. But some patterns indicate structural problems.
Red Flags versus Normal Issues
| Situation | Normal Growing Pains | Red Flag Warning Sign |
|---|---|---|
| Response Time Misses | Missed target 1 week due to sick leave | Missing target 3+ months, no plan |
| Quality Issues | FCR dips during office move | FCR declining for 3+ months |
| Cost Increases | Price escalation per contract (15% annual, explained) | Unexpected charges monthly, vague explanations |
| Communication | Occasional late report (1-2 days) | Consistently missing meetings, data not provided |
| Staff Turnover | One technician leaves, replaced in 2 weeks | Multiple technicians leave, no knowledge transfer |
| Documentation | Some tickets lack detail during busy period | Systematic poor documentation |
| Proactive Work | Delayed one month due to priority incident | No proactive work despite contract |
| Responsiveness | Slow response to non-urgent request | Slow response to urgent issues |
The 3-Month Rule
If a performance issue persists for 3+ months after you’ve documented it, the vendor has committed to a fix, and you’ve provided support, it’s a pattern, not a temporary problem. Escalate.
Escalation path:
- Account manager
- Vendor senior management
- Vendor executive team
- Consider termination
When to Coach Instead
Good coaching situations: Vendor is trying but struggling with specific skill gaps. They’re responsive. They want to improve.
Example: A vendor struggled with documentation quality because your template was confusing. Simple fix: Better template plus training. Performance improved immediately.
Ask: “Is this vendor trying to improve but needs help, or hoping I’ll accept poor performance?”
Cost Analysis: Are You Getting Value?
You know what you’re paying. Do you know what you’re getting per Naira?
Cost per ticket: ₦400,000 monthly ÷ 80 tickets = ₦5,000 per ticket.
What’s reasonable?
- Simple issues: ₦2,000 – ₦4,000
- Medium complexity: ₦5,000 – ₦8,000
- Complex issues: ₦10,000 – ₦20,000+
Track by category. A logistics company in Lagos found that 35 of 90 tickets were “printer offline” for the same three printers. They bought new printers for ₦180K; ticket volume dropped 40%, and the printers paid for themselves within five months.
Cost per user: ₦400,000 ÷ 45 users = ₦8,889 per user monthly.
Nigerian benchmarks:
- 10-30 users: ₦6,000 – ₦10,000 per user
- 30-100 users: ₦4,000 – ₦7,000 per user
- 100+ users: ₦3,000 – ₦5,000 per user
Hidden costs: Track onsite visits, after-hours support, extra projects. If “extras” consistently exceed 25% of the base fee, your contract scope is wrong.
ROI Calculation
The cost of poor IT support includes downtime (hours down × employees affected × hourly productivity cost), repeated issues, and user frustration. This is why business continuity planning matters.
Budget variance: ₦350K budgeted but ₦450K actual (after extras) = roughly +29% variance. Acceptable is ±10% monthly, ±5% annually.
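The cost arithmetic in this section is simple enough to do by hand, but here it is in one place as a small Python sketch, using the figures quoted above (swap in your own invoice numbers):

```python
# Sketch: the cost metrics from this section, using the article's figures.
monthly_fee = 400_000          # NGN base fee
tickets_closed = 80
users_supported = 45

cost_per_ticket = monthly_fee / tickets_closed   # NGN 5,000 per ticket
cost_per_user = monthly_fee / users_supported    # ~NGN 8,889 per user

budgeted, actual = 350_000, 450_000              # actual includes "extras"
variance_pct = (actual - budgeted) / budgeted * 100  # well outside +/-10%

print(f"Cost per ticket: NGN {cost_per_ticket:,.0f}")
print(f"Cost per user:   NGN {cost_per_user:,.0f}")
print(f"Budget variance: {variance_pct:+.1f}%")
```

Run monthly, these three numbers plus a ticket-category breakdown are usually enough to spot where money is leaking.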
If cost analysis consistently shows poor value, this informs vendor switching. For guidance on selecting the right IT vendor, our vendor selection guide provides a framework.
Continuous Improvement: Working WITH Your Vendor
Performance measurement isn’t just policing. It’s building a system where both parties improve together.
Quarterly business reviews: Review trends, discuss strategic improvements, plan priorities, and celebrate wins.
Root cause analysis: When the same problem appears 3+ times, implement a long-term fix. Example: Password reset tickets high → root cause: policy too complex → solution: password manager → result: 40% reduction.
Knowledge transfer: Good vendors train your team to handle simple issues. This reduces ticket volume, empowers staff, and builds partnerships.
Vendor feedback loop: Tell vendors what’s working. “Response times are excellent this quarter.” “John is particularly helpful.” Positive feedback encourages good behavior.
Performance Improvement Plans
When the vendor consistently underperforms:
- Document specific issues
- Set measurable targets
- Define timeline (60-90 days)
- Schedule weekly check-ins
- Support the vendor with the needed resources
- Measure progress objectively
PIPs work when the vendor is capable but misaligned. They don’t work when the vendor simply can’t deliver.
When Performance Data Signals It’s Time to Act
Green Zone (85%+, Stable): Maintain approach, monthly meetings, focus on improvement.
Yellow Zone (70-84%, Declining): Implement improvement plan, bi-weekly check-ins, 60 days to return to green.
Red Zone (<70%, Persistent Decline): Formal escalation plus PIP, weekly tracking, 30-90 days to improve or transition.
A manufacturing company in Port Harcourt watched their vendor slide from 88% to 76% to 64% over six months. They implemented a 90-day PIP. The vendor identified staffing problems, hired experienced technicians, and climbed back to 82%.
Critical Failure (Safety, Security, or Compliance Risk): Immediate escalation. Review termination clauses and act in days, not months.
The Data-Driven Conversation
Instead of “We’re unhappy,” try “Your score is 68% this quarter, down from 82%, primarily driven by FCR dropping from 76% to 58%. Let’s discuss the root cause.”
Data removes emotion, focuses on facts, and enables problem-solving.
Documentation: Monthly scorecards become evidence, meeting notes document commitments, and improvement plans show you gave the opportunity to fix. Good performance measurement isn’t just operational management. It’s legal protection if the relationship deteriorates.
What’s Next in This Series
You now understand how to set SLAs (Part 1), structure protective contracts (Part 2), and measure ongoing performance.
Next: “When to Switch IT Vendors in Nigeria” – Part 4 covers the decision framework for when performance issues warrant switching vendors, calculating switching costs versus staying with underperformers, and managing transitions without disrupting your business.
Conclusion
Here’s what you need to remember: Measuring IT support performance in Nigeria isn’t about becoming a taskmaster. It’s about finally knowing whether you’re getting what you’re paying for.
Those vendor reports showing 95% closure? Meaningless if your finance manager is on her third “fix.” What matters is whether issues get solved, users are satisfied, and you’re spending money efficiently.
Start simple. Pick 5-8 metrics from the scorecard. Review monthly. Track trends. You don’t need expensive tools. Excel works fine for most Nigerian SMEs. You just need consistent measurement.
The real value isn’t catching vendors doing wrong. It’s creating clarity about what “good” looks like. When your vendor scores 76% three months running, you have a data-based conversation. When they improve to 88%, you have evidence for your CFO.
Your SLAs from Part 1 set expectations. Your contract from Part 2 protects you legally. But without measurement? Those documents are just paper. This framework makes them real.
Need Help Measuring Your IT Vendor’s Performance?
If you’re looking at this scorecard thinking, “I don’t have time to build this” or “I’m not sure our vendor will cooperate,” we can help.
PlanetWeb works with Nigerian companies to implement practical performance measurement systems. We build custom scorecards, run vendor performance audits, and help you have data-driven conversations with vendors who aren’t delivering.
Our IT consulting services include vendor performance audits, scorecard development, and ongoing management support.
Contact our team to discuss how we can help you implement performance measurement for your IT support.