Interpreting the Assessment Overview Page

This article provides detailed information for an Admin or Manager on the data provided on the Assessment’s Overview page.

Written by FifthDomain

Terminology

The Overview page uses the following terminology:

  • Assessment Rank: For every assessment, all candidates are ranked based on their assessment score.

  • Assessment Score: A comprehensive score that is a function of Success (challenges successfully solved), Efficiency (scoring points with the fewest tries), and Speed (quickness in completing the assessment).

    This is a number score between 0 and 100.

  • Success: Success is measured as a number score between 0-100.

    Success is defined by the points earned by successfully solving challenges, where 0 = no points achieved and 100 = all points achieved.

  • Efficiency: Efficiency is measured as a number score between 0-100.

    Efficiency is the individual's ability to score points in the fewest tries, where 0 = all tasks entered but none solved and 100 = all tasks entered are solved.

  • Speed: Speed is measured as a number score between 0-100.

    Speed is determined by the individual's capability to complete the assessment in the quickest possible time, where 0 = 0 minutes spent and 100 = the full assessment duration.

  • Score of a skill: Like the assessment score, a score in a skill is calculated as a function of:

    • Success in solving challenges that assess that skill.

    • Efficiency in solving challenges that assess that skill, measured by the ability to solve with the fewest tries.

    • Speed in solving challenges that assess that skill, measured by quickness in solving these challenges.

  • Professional Specialty: Professional specialties denote broad categories of cyber operations. Each challenge aligns with one of these specialties, providing a structured framework for classification.

  • Skills: Skills indicate the specific, acquired ability necessary to solve a challenge within a given timeframe or effort. Each challenge should necessitate one or two specific skills linked to the relevant professional specialty.

  • Techniques: Techniques refer to the specific methods or strategies required to resolve a challenge. There is a preset list of techniques to select from, with the option to add more if needed.

  • Technologies: Technologies include the environments (e.g., Windows, Linux, Docker) and tools (e.g., Splunk, Nessus, Python) incorporated within the challenge, crucial for its resolution.
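As defined above, the assessment score combines Success, Efficiency and Speed, each on a 0-100 scale. This article does not publish the exact weighting, so the sketch below assumes a simple equal-weight average purely for illustration:

```python
def assessment_score(success: float, efficiency: float, speed: float) -> float:
    """Combine the three 0-100 component scores into one 0-100 score.

    The exact formula is not published; this sketch assumes an
    equal-weight average of the three components for illustration.
    """
    for name, value in (("success", success),
                        ("efficiency", efficiency),
                        ("speed", speed)):
        if not 0 <= value <= 100:
            raise ValueError(f"{name} must be between 0 and 100, got {value}")
    return (success + efficiency + speed) / 3

print(assessment_score(60, 90, 30))  # → 60.0
```

A candidate strong in efficiency but slow overall would land mid-range under this weighting; the platform may weight the components differently.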

Overview Bar

The Overview Bar, shown in the image below, displays the following information:

  • Assessment Status [1]: This section shows the assessment's current status.

  • Start Date [2]: This section shows the assessment's start date and time. Participants or teams cannot attempt the assessment before this time.

  • End Date [3]: This section shows when the assessment is scheduled to end. Participants or teams cannot start the assessment after this time. However, if a participant or team starts the assessment before the end date and time, they can continue completing the assessment after the end date and time until the duration period expires.

  • Duration [4]: This section shows how much time participants or teams have to complete the assessment once they start the assessment.

Overall Engagement and Performance section

This section gives you a glimpse of the cohort's success so far in this assessment, in terms of the assessment score, which is a function of the number of solves and how efficient and mindful of time participants have been.

Performance Triangle [1]:

The performance triangle gives a glimpse of how the candidates performed overall in the assessment based on their Success, Efficiency and Speed. These are plotted on a triangle where the centre of the triangle is 0 and the vertices are 100. The blue-coloured part of the triangle shows the overall spectrum or range of the cohort's scores, where the outer triangle is the highest score and the inner triangle is the lowest score. The thick grey line inside represents the cohort average.

For example, in the diagram below, the highest Success score for the entire cohort is 56, and the lowest is 0. The cohort's average Success is 25.

Performance across cohort [2]:

This is a histogram that shows the number of candidates in each assessment score range. On the x-axis, you have the assessment score ranges: 0-20, 20-40, 40-60, 60-80 and 80-100. At a glance, managers can see how calibre is spread across the cohort.
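The banding behind this histogram can be sketched in a few lines. The article does not say how boundary scores (exactly 20, 40, etc.) are assigned, so the sketch below places them in the lower band, and the cohort scores are hypothetical:

```python
from collections import Counter

def score_band(score: float) -> str:
    """Map a 0-100 assessment score to the histogram bands used on the page.

    Boundary handling (a score of exactly 20, 40, ...) is an assumption:
    here a boundary score falls into the lower band.
    """
    bands = [(20, "0-20"), (40, "20-40"), (60, "40-60"),
             (80, "60-80"), (100, "80-100")]
    for upper, label in bands:
        if score <= upper:
            return label
    raise ValueError("score must be between 0 and 100")

# Hypothetical cohort scores, purely for illustration.
scores = [12, 35, 38, 55, 61, 64, 72, 91]
histogram = Counter(score_band(s) for s in scores)
print(histogram)
```

Here three of the eight candidates fall into the 60-80 band, so that bar would be the tallest.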

Leaderboard

This section gives you the list of the most successful candidates. Success is defined by the number of solves and how efficient and mindful of time they have been in their approach.

Top bar

  • Average Assessment Score [1]: The comprehensive average score of the cohort based on success, efficiency and speed.

  • Average Solves [2]: This card shows the average number of challenges (numerator) solved by participants or teams who have completed the assessment, compared to the total number of tasks in the assessment (denominator).

  • Average Success [3]: The average success score of the cohort (between 0-100), which is based on successful solves.

  • Average Efficiency [4]: The average efficiency score of the cohort (between 0-100), which is based on the ability to score points with the fewest tries.

  • Average Speed [5]: The average speed score of the cohort (between 0-100), which is based on the ability to complete the assessment as fast as possible.

Candidate Ranking table

You can see the following columns in the ranking table:

Candidate Rank[1]

In this column, you will find the names of the candidates along with their ranks, listed in descending order. Ranking is based on the assessment score: the higher the score, the higher the rank.

Assessment Score[2]

Comprehensive score based on performance metrics like success, efficiency and speed.

Success[3], Efficiency[4] and Speed[5] distribution bars

At a glance, the distribution bars show the average score range of the cohort and the candidate's position relative to that average on the scale.

Success is represented by red, Efficiency is represented by yellow and Speed is represented by green.

Reading the distribution bar

In the distribution bars, the light-coloured part at the base represents the range covering the lowest and highest scores of the cohort.

The darker region represents one standard deviation above and below the cohort average. Simply put, a wider shaded region indicates that the scores of the cohort are more spread out from the average. Conversely, a smaller shaded region suggests that the majority of scores are clustered closely around the average.
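The three regions described above can be computed directly from the cohort's scores for a given measure. A minimal sketch (the region names are illustrative, not taken from the platform):

```python
from statistics import mean, stdev

def distribution_bar_regions(scores: list[float]) -> dict:
    """Compute the regions of a distribution bar for one measure.

    - "full_range": the light base, lowest to highest cohort score
    - "one_sigma": the darker region, one standard deviation either
      side of the cohort average, clamped to the 0-100 scale
    - "average": the cohort average itself

    Requires at least two scores, since stdev is undefined for one.
    """
    avg, sd = mean(scores), stdev(scores)
    return {
        "full_range": (min(scores), max(scores)),
        "one_sigma": (max(0, avg - sd), min(100, avg + sd)),
        "average": avg,
    }

# Hypothetical cohort Success scores, purely for illustration.
regions = distribution_bar_regions([10, 25, 30, 45, 80])
```

A tightly clustered cohort produces a narrow "one_sigma" region inside a possibly wide "full_range", which is exactly the contrast the bar is designed to show.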

The dark line on the bar shows where the candidate's score is for that measure.

Viewing Individual’s Insights

You can go to an individual's insights page by simply clicking the View Performance button. On this page, you can find further details about that candidate's approach and performance.

Comparing candidates with each other

To compare candidates with each other, simply select the checkboxes[1] next to their names. You can select up to 5 candidates to compare.

After selecting the candidates, their performance will be plotted for each skill assessed in the assessment in the skill line chart[2].

For demonstration purposes, let's select the top two candidates[1] on the leaderboard. You will see a line appear for each candidate in the skill line chart[2]. Each candidate is represented by a unique colour allotted by the platform. You can see the colour key[3], which tells you which colour represents each candidate selected for comparison.

On the line chart, the x-axis shows the names of the skills as abbreviations.

The y-axis represents the score in a skill. This skill score for a candidate, a number between 0-100, is calculated by accounting for the candidate's success, efficiency and speed in solving challenges that assess that particular skill.

Refer to the terminology section at the top of this article for more on how this score is calculated. Each node on a line represents the score in a particular skill. In the above diagram, there are three lines: coloured lines representing the candidates, and a dashed line representing the cohort average. You can hover over each node to see details such as the name of the person, their score, and the average score.
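As with the overall score, a per-skill score aggregates success, efficiency and speed, but only over the challenges tagged with that skill. A hypothetical sketch, where both the challenge record shape and the equal weighting are assumptions, not the platform's actual data model:

```python
def skill_score(challenges: list[dict], skill: str) -> float:
    """Average the per-challenge component scores for one skill.

    `challenges` is a hypothetical list of records such as
    {"skills": ["SQLi"], "success": 80, "efficiency": 50, "speed": 40};
    the platform's real data model is not described in this article.
    """
    relevant = [c for c in challenges if skill in c["skills"]]
    if not relevant:
        return 0.0
    per_challenge = [
        (c["success"] + c["efficiency"] + c["speed"]) / 3 for c in relevant
    ]
    return sum(per_challenge) / len(per_challenge)

# Illustrative records for one candidate.
data = [
    {"skills": ["SQLi"], "success": 90, "efficiency": 60, "speed": 30},
    {"skills": ["XSS"], "success": 30, "efficiency": 30, "speed": 30},
]
print(skill_score(data, "SQLi"))  # → 60.0
```

Each node on a candidate's line corresponds to one such per-skill value.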

Challenge Completion table

The challenge completion table captures the time spent on each task by a candidate, whether or not they attempted a challenge, and whether or not they successfully solved it.

  • The first column has the list of names of the candidates.

  • The other columns are challenges and the performance of a candidate in those challenges.

  • Each cell shows the time spent by a candidate on a particular challenge in 0h00m format.

  • Each cell is colour-coded by the candidate's success rate in attempting that challenge, which is determined by the number of attempts by the candidate.

The colour key to read the table:

  • Grey: Not attempted

  • Red: Attempted but not solved

  • Light green: Solved with success rate up to 33%

  • Green: Solved with success rate 33-66%

  • Dark Green: Solved with success rate over 66%
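This article does not define "success rate" precisely; one common convention is the reciprocal of the number of attempts, so solving on the first try counts as 100%. The sketch below maps a candidate's record on one challenge to the colour key under that assumption:

```python
def cell_colour(attempts: int, solved: bool) -> str:
    """Map one candidate's record on one challenge to the table's colour key.

    Assumes success rate = 100 / attempts for a solved challenge
    (one attempt = 100%). This definition is an illustrative guess;
    the article only says the rate is determined by number of attempts.
    """
    if attempts == 0:
        return "grey"        # not attempted
    if not solved:
        return "red"         # attempted but not solved
    success_rate = 100 / attempts
    if success_rate > 66:
        return "dark green"  # solved with success rate over 66%
    if success_rate > 33:
        return "green"       # solved with success rate 33-66%
    return "light green"     # solved with success rate up to 33%

print(cell_colour(1, True))  # → dark green
```

Under this convention, a solve on the second attempt (50%) renders green, and a solve after four or more attempts renders light green.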

At a glance, a lot of grey in the table tells you that many candidates have not even attempted a challenge. A lot of red in a column would suggest the challenge was perhaps too difficult, as many attempted it without success. Depending on how dark the green is, you can read the cohort's performance against the difficulty of solving a challenge.

To see the approach and performance of a candidate of your choice in detail, simply click on their name, which will navigate to that candidate's individual insights page.
