Created & Analyzed by Saddam Ansari @Aspiring Data Analyst LinkedIn
Live dashboard on Novypro: Live_link_Novypro
- Objective
- Dataset Overview
- Data Preparation and Cleaning Steps
- Dashboard Overview
- Detailed Insights Explanation
- Recommendations
- How This Project's Insights and Recommendations Benefit Stakeholders
- My Learnings
- How you can help me
- Bottom
The objective of this project is to analyze the performance of a Technical Support Centre by examining ticket volumes, resolution times, and customer satisfaction rates.
The analysis will identify peak ticket creation times, compare response and resolution times against Service Level Agreements (SLAs), and explore customer satisfaction across various categories.
Additionally, the project will evaluate daily, weekly, and monthly ticket trends, assess differences between workday and weekend volumes, analyze ticket topics and support channels, and evaluate agent performance in adhering to SLAs.
This comprehensive analysis aims to provide actionable insights to improve the efficiency and effectiveness of the Technical Support Centre.
The provided dataset consists of 2,330 rows of technical support data, capturing a comprehensive range of information related to support tickets. Each row in the dataset represents a unique support ticket with the following columns:
| Column name | Description |
| --- | --- |
| Status | Indicates the current state of the ticket (e.g., Closed, In progress, Resolved). |
| Ticket ID | A unique identifier for each support ticket. |
| Priority | The priority level of the ticket (e.g., Low, Medium, High). |
| Source | The medium through which the ticket was created (e.g., Email, Phone, Chat). |
| Topic | The main subject or issue of the support request (e.g., Feature request, Product setup, Purchasing and invoicing). |
| Agent Group | The support team or line handling the ticket (e.g., 1st line support, 2nd line support). |
| Agent Name | The name of the support agent handling the ticket. |
| Created time | The timestamp when the ticket was created. |
| Expected SLA to resolve | The expected Service Level Agreement (SLA) time to resolve the ticket. |
| Expected SLA to first response | The expected SLA time for the first response. |
| First response time | The actual timestamp when the first response was made. |
| SLA for first response | Indicates whether the first response was within the SLA. |
| Resolution time | The actual time taken to resolve the ticket. |
| SLA for resolution | Indicates whether the resolution was within the SLA. |
| Close time | The timestamp when the ticket was closed. |
| Agent interactions | The number of interactions the agent had with the ticket. |
| Survey results | Customer satisfaction survey results associated with the ticket. |
| Product group | The category of the product involved in the support request (e.g., Custom software development, Ready to use Software). |
| Support Level | The tier level of the support provided (e.g., Tier 1, Tier 2). |
| Country | The country from which the ticket was submitted. |
| Latitude | The latitude coordinate of the ticket's origin. |
| Longitude | The longitude coordinate of the ticket's origin. |
This dataset offers a rich source of information for analyzing various aspects of technical support performance.
During the project, several data preparation and cleaning steps were performed after loading the dataset into Power BI to ensure accurate analysis. These steps included correcting data types, handling blank values, and creating new columns for more detailed insights:
Upon loading the dataset into Power BI, it was observed that many columns had incorrect data types. These were systematically corrected to ensure proper analysis. For example, date and time fields were converted to DateTime data types, numerical fields were corrected to appropriate numerical types, and categorical fields were converted to text.
Some columns contained blank values. These blanks were left as they were, either because they were not critical for the analysis or because imputing them might have introduced bias or inaccuracies.
- Time Difference in Minutes: A new column was created to calculate the time difference in minutes between the ticket creation time and the first response time. This column helps in evaluating the responsiveness of the support team.
- Time Difference in Hours: Another new column was created to calculate the time difference in hours between the ticket creation time and the resolution time. This metric is essential for understanding how long it takes to resolve issues.
- Weekday or Weekend: A column was added to differentiate between weekdays and weekends. Saturdays and Sundays were marked as weekends, while the remaining days were marked as weekdays. This distinction helps in analyzing ticket volume and performance variations across different days of the week.
- Working Hours Indicator: A column named "Working Hours" was created to indicate whether the ticket creation time fell within standard working hours (9 AM to 5 PM). Tickets created between 9 AM and 5 PM were marked as "Working Hours," while those created outside this timeframe were marked as "After Working Hours." This helps in assessing the support team's performance during and outside of regular business hours.
By performing these data preparation and cleaning steps, the dataset was refined to support a more accurate and insightful analysis of the Technical Support Centre's performance.
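For readers who want to reproduce these calculated columns, here is a minimal DAX sketch. It assumes the data sits in a table named `Tickets` with the column names from the dataset overview, and that `Resolution time` is stored as a timestamp; the report may equally have built these fields in Power Query, so treat this as one possible implementation rather than the exact formulas behind the dashboard.

```DAX
-- Minutes from ticket creation to first response (responsiveness metric)
Time Difference (Minutes) =
DATEDIFF ( Tickets[Created time], Tickets[First response time], MINUTE )

-- Hours from ticket creation to resolution
-- (assumes Resolution time is a timestamp, not a duration)
Time Difference (Hours) =
DATEDIFF ( Tickets[Created time], Tickets[Resolution time], HOUR )

-- Weekday vs. weekend flag; WEEKDAY with return type 2 gives Monday = 1 ... Sunday = 7
Day Type =
IF ( WEEKDAY ( Tickets[Created time], 2 ) >= 6, "Weekend", "Weekday" )

-- Working-hours flag: tickets created from 9 AM up to 5 PM count as "Working Hours"
Working Hours =
IF (
    HOUR ( Tickets[Created time] ) >= 9
        && HOUR ( Tickets[Created time] ) < 17,
    "Working Hours",
    "After Working Hours"
)
```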
Based on the project requirements, I have created a single-page Power BI report-type dashboard. While it is somewhat lengthy, with a height of approximately 3950 pixels and a width of 1280 pixels, it comprehensively covers and presents all the insights needed by stakeholders.
This section presents ticket volume trends from various perspectives. Let's dive into the insights:
During the period from January 2, 2023, to December 31, 2023, a total of 2,330 tickets were created. These tickets represent customer service requests.
- Closed: 50.3% of the total tickets, amounting to 1,173 tickets, have been closed.
- Resolved: 31.7% of the total tickets, which equals 739 tickets, have been resolved.
- In Progress: 17.2% of the total tickets, translating to 400 tickets, are still in progress.
- Open: 0.8% of the total tickets, or 18 tickets, remain open.
Using a doughnut chart, we can easily see that:
- 84.94% of the total requests, amounting to 1,979 tickets, were generated on workdays.
- 15.06%, which equals 351 tickets, were created on weekends.
📝 Note: For this analysis, weekends are considered to be Saturday and Sunday, while all other days are classified as workdays.
Another doughnut chart reveals that:
- 67.21% of the total tickets, amounting to 1,566 tickets, were generated after work hours.
- 32.79%, or 764 tickets, were created during work hours.
📝 Note: In this context, "after work hours" refers to any time after 5 PM and before 9 AM. These insights provide a clear understanding of when support requests are most frequently made, aiding in better resource planning and allocation.
A line chart was utilized to visualize the trend in ticket volumes over the months, revealing the following insights:
- The highest ticket volume occurred in January, with a total of 224 tickets, representing 9.6% of the total.
- The second-highest month was May, with 219 tickets, followed by November with 213 tickets.
- The lowest ticket volume was observed in February, with 159 tickets, accounting for **6.8% of the total**.
These observations provide valuable insights into monthly variations in support ticket volumes, aiding in resource planning and performance evaluation.
A line chart was utilized to visualize the total ticket volumes across different weeks. Here are the key insights:
- Week 3 had the highest number of tickets, totaling 61.
- Following closely, Week 18 recorded 58 tickets, indicating the second-highest volume.
- Conversely, Week 52 experienced the lowest ticket volume, with only 24 tickets recorded.
These observations shed light on weekly fluctuations in ticket volumes, providing valuable insights for resource management and operational planning.
The purpose of this analysis is to identify peak creation times and peak creation days. Here are the key insights:
📝 Note: Before interpreting this visualization, it's important to note that darker shades of blue indicate high ticket creation, while lighter shades represent low ticket creation.
- Peak Creation Time: Analysis reveals that, excluding weekends, ticket creation peaks around 3 PM on almost every day.
- Peak Creation Day: Wednesday emerges as the day with the highest ticket generation, totaling 439 tickets. Following closely, Monday sees 410 tickets, making it the second-highest day. Friday ranks third with a notable ticket count.
These insights provide valuable information for scheduling support staff and optimizing resource allocation to address peak demand periods effectively.
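A similar day-by-hour heatmap can be assembled from two small helper columns. The sketch below is one typical way to build such a visual (the `Tickets` table name is an assumption), not a description of the exact report internals:

```DAX
-- Hour of day (0-23) when the ticket was created; the heatmap's column axis
Created Hour =
HOUR ( Tickets[Created time] )

-- Day name ("Monday" ... "Sunday"); the heatmap's row axis
Created Day =
FORMAT ( Tickets[Created time], "dddd" )
```

Counting Ticket ID against these two columns in a matrix visual with conditional background colors reproduces the blue-shaded grid described above.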
In this section, we will explore ticket distribution based on content and insights related to ticket resolution time. Let's delve into the insights:
The analysis reveals the distribution of tickets across different topics:
- Product Setup: This group has the highest ticket volume, comprising **630 tickets**, which accounts for **27% of the total**. It indicates a significant number of support requests related to setting up products.
- Training Request: Conversely, this topic has the lowest ticket volume, suggesting fewer support requests related to training.
📝 Note: Furthermore, users can drill down into the ticket distribution based on specific product groups, enabling a more detailed analysis of support needs within each product category.
This breakdown provides valuable insights into the areas where customers require the most assistance, facilitating targeted support strategies and resource allocation to address prevalent issues effectively.
📝 Note: Before proceeding, it's important to note that the "Source" indicates the channel through which customers connect for help.
- The majority of tickets, totaling 1,234 and representing 52% of the total, were generated via email.
- Chat accounted for 850 tickets, roughly 36.5% of the total.
- Phone calls resulted in the lowest ticket volume, with only 246 tickets generated, representing 10.56% of the total.
These insights provide valuable information about the preferred channels through which customers seek support, aiding in resource allocation and service optimization to enhance customer satisfaction.
📝 Note: Before proceeding, it's important to note that the "Support Level" indicates the difficulty level of tickets.
- A significant majority of tickets, totaling 1,770 and representing 75.97% of the total, were generated at Tier 1 support level.
- Tier 2 support level accounted for 560 tickets, comprising 24.03% of the total.
These insights provide an understanding of the distribution of support requests based on difficulty levels, aiding in resource allocation and staffing decisions to ensure efficient handling of tickets across different support tiers.
đź“ť Note: Before proceeding, it's important to note that "Priority" indicates the urgency of a ticket, reflecting how urgently a customer requires service.
- Low Priority: The majority of tickets, totaling 1,192 and representing 51.2% of the total, were categorized as low priority.
- Medium Priority: Medium priority tickets accounted for 722 tickets, comprising 31.0% of the total.
- High Priority: Tickets categorized as high priority totaled **416**, making up **17.9% of the total**.
These insights provide an understanding of the urgency levels of support requests, enabling appropriate prioritization and allocation of resources to ensure timely resolution of customer issues.
For a detailed analysis, a table has been prepared to illustrate the average first response time in minutes based on SLA:
- SLA Violated: The average first response time for tickets where SLA was violated is 93.42 minutes, involving a total of 311 tickets.
- Within SLA: Tickets resolved within SLA have an average first response time of 16.30 minutes, encompassing a total of 2,019 tickets.
These findings provide valuable insights into the adherence to SLA targets and the efficiency of the support team in responding to customer queries promptly.
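The table above reduces to a simple average sliced by the SLA column. A minimal measure sketch, reusing the calculated columns created during data preparation (table, column, and label names are assumptions):

```DAX
-- Average minutes to first response; place in a matrix with
-- Tickets[SLA for first response] on the rows to split Within SLA vs. Violated
Avg First Response (Minutes) =
AVERAGE ( Tickets[Time Difference (Minutes)] )

-- Average hours to resolution; slice by Tickets[SLA for resolution] instead
Avg Resolution (Hours) =
AVERAGE ( Tickets[Time Difference (Hours)] )
```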
Additionally, when analyzing first response time by ticket priority within SLA, there isn't a significant difference: the response times for high, medium, and low priority tickets are nearly the same. However, when examining SLA violations, medium priority tickets exhibit a notably longer first response time of 131.22 minutes.
Furthermore, we are interested in how first response time varies by ticket source:
- Phone tickets have the shortest response time, with an average of only 0.99 minutes within SLA.
- Chat tickets follow closely, with a response time of **1.02 minutes within SLA**.
- Email tickets experience a **slightly longer response time within SLA**, averaging 30 minutes.
- On the other hand, in cases of SLA violation, **email tickets have the longest response time, averaging 253 minutes**.
- Within SLA: The average ticket resolution time within the SLA is 33.51 hours.
- SLA Violated: In cases where the SLA was violated, the average ticket resolution time is 31.43 hours.
These findings provide valuable insights into the efficiency of the support team in resolving tickets within the agreed upon SLA timeframe.
When investigating the impact of Ticket Priority on ticket resolution time (hours), the following observations are made:
- Within SLA: Upon examining resolution times within SLA by priority level, there isn't a significant difference; medium priority tickets take slightly longer to resolve than high and low priority tickets.
- SLA Violated: Resolution times vary across priority levels:
  - High priority tickets have an average resolution time of 32.50 hours.
  - Low priority tickets have an average resolution time of 32.59 hours.
  - Medium priority tickets have a comparatively shorter average resolution time of 28.86 hours.
These findings indicate that while there isn't a substantial difference in resolution times within SLA based on priority level, there is a notable variation in resolution times for tickets where the SLA was violated.
The analysis of Ticket Resolution Time (hours) based on Ticket Source reveals the following insights. For tickets resolved within SLA:
- Tickets received via chat have an average resolution time of 27.51 hours.
- Email-based tickets have a longer average resolution time of 36.35 hours.
- Phone-based tickets exhibit the longest average resolution time of 39.45 hours.
For tickets where the SLA was violated:
- Chat-based tickets have an average resolution time of 28.62 hours.
- Email-based tickets show an average resolution time of 32.75 hours.
- Phone-based tickets have an average resolution time of 35.67 hours.
These findings indicate variations in resolution times based on the source of the ticket, with chat-based tickets generally having shorter resolution times than email- and phone-based tickets, both within SLA and in cases where the SLA was violated.
The purpose of this analysis is to explore whether there is any relationship between specific topics and countries. Let's delve into our findings:
- Top Ticket-Generating Countries: Germany, Italy, Poland, and the United States are the countries where the highest number of tickets were generated.
- Popular Topics in Top Countries: In these four countries, the most common ticket topics are Product Setup, Pricing and Licensing, and Feature Requests.
- Drill-Down Analysis: We can further investigate the distribution of tickets based on product group, priority, and ticket source to gain deeper insights into the support needs of specific countries and topics.
These findings provide valuable insights into the correlation between ticket topics and countries, guiding support teams in prioritizing and addressing customer needs effectively.
In this section, we will delve into agent performance metrics, along with various other insights.
For each agent, I've provided detailed insights such as:
- Agent Name
- Total Tickets Received
- Ticket Distribution by Status
- Average First Response Time (in minutes)
- Average Ticket Resolution Time (in hours)
- First Response Time and Ticket Resolution Time by Priority
- First Response Time and Resolution Time by Ticket Source
- Average Rating
These insights are presented in separate sections for each agent, allowing for a comprehensive view of their performance metrics.
For a more detailed view, please visit the live dashboard or refer to the image provided below. 👇
The percentage of tickets that received their first response within the Service Level Agreement (SLA) stands at 86.7%. This metric indicates the efficiency of the support team in adhering to response time commitments outlined in the SLA.
### Percentage of Tickets Resolved Within SLA
The analysis reveals that approximately 64.4% of tickets were resolved within the Service Level Agreement (SLA) timeframe. This metric serves as a key indicator of the efficiency and effectiveness of the support team in meeting customer expectations and service standards.
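Both SLA-adherence percentages can be written as ratio measures. A hedged sketch, again assuming a `Tickets` table and a literal "Within SLA" label in the SLA columns (the real column values may differ):

```DAX
-- Share of tickets whose first response met the SLA (reported above as 86.7%)
% First Response Within SLA =
DIVIDE (
    CALCULATE (
        COUNTROWS ( Tickets ),
        Tickets[SLA for first response] = "Within SLA"
    ),
    COUNTROWS ( Tickets )
)

-- Share of tickets resolved within the SLA (reported above as 64.4%)
% Resolved Within SLA =
DIVIDE (
    CALCULATE (
        COUNTROWS ( Tickets ),
        Tickets[SLA for resolution] = "Within SLA"
    ),
    COUNTROWS ( Tickets )
)
```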
The purpose of this analysis is to determine which topics have the highest customer satisfaction ratings.
- Purchasing and Invoicing: This topic has the highest rating of 3.75 out of 5, indicating high customer satisfaction.
- Bug Report: Customers also express satisfaction with the Bug Report topic, with a rating of 3.55.
- Feature Request: Similarly, Feature Request receives a high rating of 3.54.
- Other: The "Other" topic follows closely with a rating of 3.49.
- Pricing and Licensing: Customers are generally satisfied with Pricing and Licensing, giving it a **rating of 3.49**.
- Training Request: This topic has a respectable rating of **3.46**.
- Product Setup: Product Setup has the lowest rating among the topics analyzed, with a rating of 3.4 out of 5.
Users can further drill down into the ratings based on product group to gain deeper insights into customer satisfaction levels across topics.
Tier 1 support has a slightly higher rating of 3.54 compared to Tier 2 support, which has a rating of 3.41.
High priority tickets have the highest rating of 3.66, followed by medium priority tickets with a rating of 3.54, and low priority tickets with the lowest rating of 3.44.
The 1st line support agent group has a rating of 3.54, whereas the 2nd line support group has a slightly lower rating of 3.41.
Among different ticket sources, chat has the highest rating of 3.54, followed by email with a rating of 3.51, and phone with the lowest rating of 3.42.
Poland has the highest overall rating score of 3.64, while Australia has the lowest rating of 3.24.
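Every rating breakdown in this section can come from a single measure placed against a different slicing column. A minimal sketch, assuming `Tickets[Survey results]` stores the numeric 1-5 score and is blank when no survey was returned:

```DAX
-- Average customer satisfaction; AVERAGE ignores blank (unanswered) surveys
Average Rating =
AVERAGE ( Tickets[Survey results] )
```

Dropping this measure against Topic, Support Level, Priority, Agent Group, Source, or Country yields each of the figures above.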
Based on the insights derived from the analysis, here are the top recommendations for improving service level and overall rating:
- Insight: 27% of tickets (630) are about product setup, but it has the lowest satisfaction rating (3.4).
- Recommendation: Develop comprehensive documentation, including user guides and FAQs. Create step-by-step video tutorials on YouTube for common setup tasks.
- Insight: Over 50% of tickets come through email, but it has the slowest first response time compared to chat and phone.
- Recommendation: Implement a target response time of 5 minutes for emails to improve customer experience. Focus on reducing overall resolution times for email inquiries.
- Insight: Only 86.7% of tickets receive a first response within SLA, impacting satisfaction.
- Recommendation: Utilize chatbots and AI-powered tools to handle routine inquiries, improving response rates and meeting SLA targets. This frees up human agents for more complex issues.
- Insight: Performance metrics reveal lower-performing agents, such as Kristos Westoll.
- Recommendation: Provide targeted training programs to address skill gaps and improve the performance of lower-scoring agents.
- Insight: Metrics like resolution time and ratings identify the top-performing agents.
- Recommendation: Assign more complex tickets to top performers to enhance overall efficiency and customer satisfaction.
- Insight: Low priority tickets have the lowest customer satisfaction rating (3.44).
- Recommendation: Implement a prioritization system that ensures high-priority tickets receive prompt attention without letting low priority tickets languish.
- Insight: Phone support has the lowest customer satisfaction rating (3.42) compared to chat (3.54) and email (3.51).
- Recommendation: Investigate the reasons behind lower phone satisfaction. Consider agent training on phone communication skills and empathy.
- Insight: Poland has the highest rating (3.64), while Australia has the lowest (3.24).
- Recommendation: Analyze the reasons behind lower ratings in specific countries. Consider cultural differences or language barriers, and tailor support strategies accordingly.
- Insight: Only 64.4% of tickets are resolved within SLA.
- Recommendation: Implement stricter performance tracking and accountability measures for agents to ensure SLA adherence. Consider incentives or rewards for consistent SLA compliance.
- Insight: 67.21% of tickets are created after work hours.
- Recommendation: Expand chat support availability during after-hours and weekends to address the surge in ticket volume. Consider implementing live chat features or virtual assistants.
- Improved Customer Satisfaction: By implementing the recommendations, such as enhancing product setup support, reducing email response times, and prioritizing high-priority tickets, you can significantly improve customer satisfaction. This will lead to happier customers, increased customer retention, and potentially more positive brand perception.
- Enhanced Efficiency and Productivity: The recommendations around leveraging chatbots and AI, targeted agent training, and optimizing chat support for after-hours will improve the efficiency of your technical support center. This will allow agents to handle more complex issues and reduce overall resolution times.
- Data-Driven Decision Making: This project provides stakeholders with valuable data and insights into the performance of the technical support center. This data can be used to make informed decisions about resource allocation, service level agreements (SLAs), and overall support strategies.
- Identification of Areas for Improvement: The analysis highlights areas where the technical support center can improve, such as product setup support and phone support satisfaction. By addressing these areas, stakeholders can ensure the support center is meeting the needs of customers.
- Technical Skills: I have gained valuable skills in data analysis, data visualization, and technical support center operations. This includes working with data sets, using Power BI for data analysis and reporting, and understanding key metrics related to technical support performance.
- Problem-Solving: I have learned how to approach a complex problem, such as analyzing technical support performance, by breaking it down into smaller, more manageable tasks. I have also identified the root causes of issues and developed solutions to address them.
- Industry Knowledge: I've gained valuable insights into the inner workings of a technical support center. I understand the different channels customers use to contact support, the types of issues they encounter, and the metrics used to measure performance.
This project is the result of 5-7 days of hard work, and I invite you to 👍 like, share, and connect with me on LinkedIn.
I hope scrolling through this project provides you with insightful understanding.
Thank you for taking the time to view my project.
I've successfully completed over 55 Power BI projects, all showcased in my Novypro portfolio. You're all invited to visit my portfolio and explore these amazing projects!
Additionally, I'm currently seeking internship or entry-level opportunities. If you have any opportunities available or need a freelance Power BI project completed, please connect with me on LinkedIn.
Looking forward to connecting with you all!
Saddam Ansari @Aspiring Data Analyst LinkedIn
Location: India
THE END