Dashboard
The dashboard contains the following information:
Quote - Displays quotes from different sources and includes a button that leads to the problems added to the hub.
Events - Shows upcoming scheduled events, such as sessions and other activities.
Team solutions - Number of problems the team solved together.
Time Spent for Group - Number of minutes the group dedicated to problem solving.
Group Rating - Cumulative contest rating for the group.
Daily solve - Displays, for a given date, the number of students who solved at least one problem from the hub and the number who did not.
Weekly Problem-Solving Comparison - Tracks the group's performance over several weeks by counting the problems students solve each day and comparing each day's count with the same day of the previous week. The data is presented graphically, offering a clear and engaging view of the group's progress and trends over time.
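The week-over-week comparison described above can be sketched as follows. This is a minimal illustration, not the hub's actual code; the `daily_solves` mapping and the function name are assumptions.

```python
from datetime import date, timedelta

def week_over_week(daily_solves: dict[date, int], day: date) -> int:
    """Difference between a day's solve count and the count on the
    same weekday one week earlier (missing dates count as zero)."""
    return daily_solves.get(day, 0) - daily_solves.get(day - timedelta(days=7), 0)
```

For example, if the group solved 12 problems on a Wednesday and 9 on the previous Wednesday, the comparison for that day would be +3.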
Summary
Attendance
The attendance feature provides a detailed overview of participants' status during scheduled sessions, indicating whether they were Present, Absent, Excused, or Late. It calculates each individual's attendance percentage based on their actual attendance records and automatically sorts participants according to their attendance rate. This feature offers a clear, organized way to track and evaluate participants' commitment and consistency.
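A rough sketch of how the attendance percentage and automatic sorting described above could be computed. The function names, status strings, and input shapes are assumptions of this sketch, not the hub's actual API; in particular, whether Excused or Late should count toward the rate is a design choice the hub may make differently.

```python
def attendance_rate(records: list[str]) -> float:
    """Percentage of sessions a participant actually attended.
    Only 'Present' counts toward the rate in this sketch."""
    if not records:
        return 0.0
    present = sum(1 for status in records if status == "Present")
    return round(100 * present / len(records), 1)

def rank_by_attendance(participants: dict[str, list[str]]) -> list[tuple[str, float]]:
    """Sort participants by attendance rate, highest first."""
    return sorted(
        ((name, attendance_rate(recs)) for name, recs in participants.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
```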
Consistency
The consistency feature monitors the number of problems students solve, both from the hub and competitive programming platforms like LeetCode, on a specific date. This metric serves as a strong indicator of students' dedication and commitment to continuous practice. By tracking their problem-solving activity over time, this feature provides valuable insights into their progress and consistency, making it an excellent tool for measuring ongoing engagement and effort.
Completions
The Completion feature of the dashboard showcases the progress of students in terms of the percentage and total number of problems they've solved for the specific track they've been assigned. Each student follows a designated track as part of the year-long education program, which is divided into five distinct phases: Python Track, Education Phase I, Camp I, Education Phase II, and Camp II. This feature allows for easy tracking of each student's progress within these phases, offering a clear snapshot of how much of their assigned curriculum they have completed. It's an essential tool for monitoring overall advancement through the program.
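The percentage-plus-count pairing described above reduces to a simple calculation; the function below is an illustrative sketch, not the hub's implementation.

```python
def completion(solved: int, assigned: int) -> tuple[int, float]:
    """Return (problems solved, percent of the assigned track completed).
    An empty track is treated as 0% complete."""
    pct = 100 * solved / assigned if assigned else 0.0
    return solved, round(pct, 1)
```

So a student who has solved 45 of 60 assigned problems would show as 45 problems and 75.0% complete.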
Exercise
The Exercises feature allows students to track and access questions added to the hub, linking them directly to platforms like LeetCode, HackerRank, and Codeforces. After solving a problem, students can submit their solutions to the hub. Once submitted, they can view not only their own submission but also those of other students. This includes valuable details such as the code used to solve the problem, the time taken to complete it, and the number of attempts required. This feature encourages collaborative learning by providing insights into different approaches and fostering a deeper understanding of problem-solving techniques.
Upsolving
The Upsolving Summary provides a comprehensive overview of contest participation, detailing both in-person and virtual attendees. It highlights how many and which problems each participant solved during the contest, along with key performance metrics such as the number of attempts made and the time taken to solve each problem. This feature offers a clear and detailed snapshot of individual and group performance, helping to assess participation and efficiency in problem-solving during competitive events.
Divisions
The divisions are created by assessing students' performance across various contests, grouping them according to their skill levels and results. This system allows for a more tailored learning experience, ensuring that students are placed in divisions that reflect their current abilities, encouraging growth and providing appropriate challenges based on their contest performance.
Group Comparison
-> This feature of the dashboard enables the Head of Academy to compare multiple selected groups.
-> To compare groups, navigate to the homepage and scroll down to the 'Group Comparison' section. On the right, a dropdown menu lets you select multiple groups for comparison. After selecting the desired groups, choose your preferred comparison mode, either Table or Graph, and set the comparison range, from the past week to all time (since the groups were created). The groups are then compared according to the mode you select, as described below.
Summary
-> This Mode provides a detailed summary for multiple selected groups, analyzing key metrics such as consistency percentage, track completion, real attendance, total attendance, submission count, and contest rating.
Attendance
-> This mode compares groups based on various attendance metrics, including Real Attendance, Total Attendance, Number of Presents, Number of Excused, Number of Absentees, and the Check-in/Check-out Ratio.
-> Real Attendance: Only includes those who were present.
-> Total Attendance: The combined total of present and excused attendees.
-> Absent: The number of people who missed the session without proper notice.
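The attendance definitions above can be sketched in code. This is a minimal illustration under the stated definitions; the status strings and return keys are assumptions, not the hub's schema.

```python
def attendance_metrics(statuses: list[str]) -> dict[str, float]:
    """Real attendance counts only 'Present'; total attendance
    adds 'Excused', per the definitions above."""
    n = len(statuses)
    present = statuses.count("Present")
    excused = statuses.count("Excused")
    return {
        "real_attendance_pct": round(100 * present / n, 1) if n else 0.0,
        "total_attendance_pct": round(100 * (present + excused) / n, 1) if n else 0.0,
        "presents": present,
        "excused": excused,
        "absents": statuses.count("Absent"),
    }
```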
Contest conversion
-> Contest conversion refers to the percentage and ratio of problems solved by students during A2SV-facilitated contests.
Track completion
-> Track completion measures the progress of students in each group based on their performance in solving assigned problems. It is assessed through various metrics, including:
Average Track Completion per Group: The overall average of problems completed by students in the group.
Average Exercises Solved per Student: The number of problems solved by each student on average.
Completion Percentage: The percentage of the track that has been completed by the group.
Minimum and Maximum Completion Percentage: The lowest and highest completion rates within the group.
Number of Students Who Completed All Problems: The count of students who have fully completed the assigned track.
This provides a comprehensive view of the group's progress and helps identify both strong performers and areas where improvement is needed.
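The metrics listed above can be aggregated as in the sketch below. The input shape (a mapping of student to problems solved, plus the track size) and the output keys are assumptions for illustration, not the hub's actual data model.

```python
def track_completion_summary(solved_per_student: dict[str, int],
                             track_size: int) -> dict:
    """Aggregate per-group track-completion metrics:
    average solved, completion percentage, min/max, and
    the count of students who finished the whole track."""
    pcts = [100 * s / track_size for s in solved_per_student.values()]
    return {
        "avg_exercises_solved": sum(solved_per_student.values()) / len(solved_per_student),
        "completion_pct": round(sum(pcts) / len(pcts), 1),
        "min_pct": round(min(pcts), 1),
        "max_pct": round(max(pcts), 1),
        "finished_all": sum(1 for s in solved_per_student.values() if s == track_size),
    }
```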
Consistency
-> This mode compares and ranks the selected groups based on several key metrics, including:
Average Consistency: The overall consistency of students across the group.
Minimum and Maximum Consistency: The lowest and highest consistency levels within the group.
Average Daily Solves per Student: The average number of problems solved by each student daily.
Total Expected Solves: The number of problems expected to be solved by the group.
Actual Solves: The total number of problems actually solved by the group.
These metrics provide insights into each group's performance and consistency.
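One way the consistency metrics above could be computed is sketched below. The definition of consistency used here (percent of days with at least one solve) and the default expectation of one solve per student per day are assumptions of this sketch, not the hub's actual formulas.

```python
def consistency_pct(days: list[int]) -> float:
    """Percent of days on which the student solved at least one problem."""
    if not days:
        return 0.0
    return round(100 * sum(1 for d in days if d > 0) / len(days), 1)

def group_consistency(solves: dict[str, list[int]],
                      expected_per_day: int = 1) -> dict:
    """Aggregate the group's consistency metrics from per-student
    daily solve counts."""
    pcts = [consistency_pct(d) for d in solves.values()]
    n_days = max(len(d) for d in solves.values())
    total = sum(sum(d) for d in solves.values())
    return {
        "avg_consistency": round(sum(pcts) / len(pcts), 1),
        "min_consistency": min(pcts),
        "max_consistency": max(pcts),
        "avg_daily_solves": round(total / (len(solves) * n_days), 2),
        "expected_solves": expected_per_day * n_days * len(solves),
        "actual_solves": total,
    }
```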
Latest Submissions
This feature showcases the most recent submissions and activities from different students on the hub, with entries automatically sorted by submission time. It provides a real-time view of the latest problem-solving efforts.