During exam creation and configuration, you set up your exam to fit your needs: add exercises with different variants, register students, generate student exams, and conduct test runs. For more information, see 1.2 Create and Configure Exam.
Artemis supports two exam modes: normal exams and test exams.
The normal exam mode is suitable for conducting end-of-semester exams. Students can view and work on the exam within the configured working time. Afterwards, you can perform a manual or automated evaluation of the students’ submissions. The results are published on the specified date.
Test exams give students a practice opportunity for the end-of-semester exam. The main difference is that you define a working window within which students can freely start the exam. Students then have the configured working time to complete it. After submitting, students immediately receive the automated assessment for programming and quiz exercises. Manual correction of the submissions is not necessary.
When you click the corresponding button, you are presented with the Create Exam view. Here you can set the basic information such as title and examiner.
You can choose between the normal exam mode and the test exam mode.
The timeline of the exam is defined by the dates: visible from, start of working time, end of working time, release date of results, begin of student review, and end of student review.
The first three dates are mandatory when you create an exam. The rest can be set when required.
The grace period defines how much time students have to hand in their exam after the working time is over. It is set to 3 minutes by default.
Before the exam’s assessment, you can choose the number of correction rounds in the exam. If you want two tutors to assess a student’s exam one after the other, set the number to two here. This enables the second correction.
You can also define the number of exercises in the exam. You can leave this out initially; however, it must be set before you can generate the student exams. For more information, see 1.3 Exercise Groups.
Artemis randomizes the order of the exercise groups for each student if you activate randomize order of exercise groups.
Finally, you can fill out the exam start text and end text. Artemis presents these texts to the students during the exam conduction, on the start and end pages respectively.
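For example, the timeline of a 90-minute exam could look like this (all dates and values are purely illustrative):

    visible from:              2024-03-14 08:30
    start of working time:     2024-03-14 09:00
    end of working time:       2024-03-14 10:30  (plus the 3-minute grace period)
    release date of results:   2024-03-21 18:00
    begin of student review:   2024-03-25 09:00
    end of student review:     2024-03-27 18:00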
Instead of creating a new exam, you can import an existing exam from any of the courses you are an instructor in by clicking the corresponding import button.
Artemis displays a list of all available exams. To select a specific exam for the import, click the corresponding button.
You are now presented with the Import Exam view. All information except for the dates is copied from the exam you selected for the import. You can find more information about this view in the section Create Exam.
Additionally, you can select or deselect the exercises that are imported alongside the exam. You can find more information about the exercise import in the section on the exercise group import.
Artemis exam mode allows you to define multiple exercise variants so that each student can receive a unique exam. Artemis achieves this through exercise groups. An exercise group represents one exercise slot in each student exam. Within an exercise group, you can define different exercise variants.
To generate the individual student exams, Artemis randomly selects one exercise per exercise group.
You can distinguish between mandatory exercise groups and non-mandatory exercise groups.
Artemis always includes mandatory exercise groups in the individual exam of a student.
Non-mandatory exercise groups can be left out if there are more exercise groups than the number of exercises defined in the exam configuration.
By default, every exercise group is mandatory. You can set the mandatory flag when you add an exercise group initially, or later by clicking on the exercise group.
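Conceptually, the generation of an individual student exam can be sketched as follows (a minimal Python sketch with simplified data structures, not the actual Artemis implementation):

    import random

    def generate_student_exam(exercise_groups, number_of_exercises, randomize_order=False):
        """Pick one exercise per selected exercise group for a single student exam.

        exercise_groups: list of dicts such as
            {"title": "Group 1", "mandatory": True, "exercises": ["variant A", "variant B"]}
        number_of_exercises: the number of exercises configured for the exam.
        """
        mandatory = [g for g in exercise_groups if g["mandatory"]]
        optional = [g for g in exercise_groups if not g["mandatory"]]

        # Mandatory groups are always part of the student exam; the remaining
        # slots are filled with a random selection of non-mandatory groups
        # (assuming a valid configuration with enough groups available).
        remaining_slots = max(0, number_of_exercises - len(mandatory))
        selected_groups = mandatory + random.sample(optional, remaining_slots)

        if randomize_order:
            random.shuffle(selected_groups)

        # One exercise variant is chosen at random from each selected group.
        return [random.choice(group["exercises"]) for group in selected_groups]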
Artemis exam mode allows you to import one or more exercise groups from an existing exam.
The import process consists of two steps.
Step 1: Select Exam
When you click the import button, you can select the exam from which the exercise group(s) should be imported.
To select an exam, click the corresponding button.
Step 2: Select Exercises and Exercise Groups
In the next step you can select or deselect exercises which should be imported alongside the exercise groups.
You can also change the title and the mandatory flag (isMandatory) of an exercise group, as well as the title (and short name for programming exercises) of the individual exercises.
The title and short name of programming exercises must be unique. If you want to import an exercise group into the same course, you must change the title and short name before you can import the exercise group.
After you start the import by clicking the corresponding button, Artemis checks whether the titles and short names of the selected programming exercise(s) are unique. If they are not, a warning is displayed and you have to change the corresponding title and short name.
Note
Further changes to the individual exercises can be made after the import by editing the respective exercise.
Programming exercises are imported with their initial configuration. This import functionality cannot be used to change the submission policy, to activate or deactivate static code analysis, or to create new build plans. In these cases, please import the exercises individually into the exercise groups.
Exercise groups can contain multiple exercises. For every student exam, Artemis will randomly select one exercise per exercise group.
Note
If you want all students to have the same exam, define only one exercise per exercise group.
To add exercises, navigate to the Exercise Groups of the exam. In the header of each exercise group you will find the available exercise types. You can choose between creating a new exercise or importing an existing one from your courses.
For the exercise types text and modeling, you can also define example submissions and example assessments to guide your assessor team.
Assessors will review the example submissions and assessments in order to familiarise themselves with the exercise and assessment instructions, before they can assess the real submissions.
1.4.1 Programming Exercises
Programming exercises have multiple special options to adjust their behaviour:
You can check the option to allow manual assessment.
Note
If you do not set this flag, your assessors will not be able to manually assess the students’ submissions during the assessment process.
You can activate Run Tests once after Due Date. This compiles and runs the test suite on all student submissions once after the set date.
After you add a programming exercise, you can configure its grading via the Configure Grading screen.
In the Configure Grading screen, you can tweak the weight of the tests, adjust the bonus multiplier, and add bonus points.
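As a rough illustration of how these parameters could interact (a simplified sketch, not necessarily Artemis’ exact grading formula):

    def score_for_submission(test_cases, max_points):
        """Compute points for one submission from weighted test-case results.

        test_cases: list of dicts like
            {"passed": True, "weight": 2.0, "bonus_multiplier": 1.0, "bonus_points": 0.0}
        """
        total_weight = sum(tc["weight"] for tc in test_cases)
        points = 0.0
        for tc in test_cases:
            if tc["passed"]:
                # Illustrative weighted scheme: each passed test contributes its
                # weighted share of the exercise points, scaled by its bonus
                # multiplier, plus any flat bonus points.
                points += (tc["weight"] / total_weight) * max_points * tc["bonus_multiplier"]
                points += tc["bonus_points"]
        return points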
You can hide tests so that they are not executed during the exam conduction. Students cannot receive feedback from hidden tests during the exam conduction.
Note
If you hide all tests, the students will only be able to see whether their submission compiles during the conduction. Set Run Tests once after Due Date to a date after the exam end date to achieve this effect.
To register students for the exam, navigate from the exam management to the Students page. Artemis offers several options to register students. You can:
Add students manually by searching via the search bar.
Import multiple students by clicking the corresponding button and providing a CSV file in the Import Users dialog. The required fields in the CSV file are registrationNumber and login, while email, firstname, lastname, seat, and room are optional. Note that the room and seat fields are only needed for the exam participation check. You can find an example file here: csv (an illustrative sample is also shown after this list). To begin the import, press the button in the dialog.
Register every student in the course by pressing the button.
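An import file could look like the following (the column names follow the fields listed above; the exact headers accepted by the Import Users dialog may differ):

    registrationNumber,login,firstname,lastname,email,room,seat
    01234567,ga12abc,Erika,Mustermann,erika.mustermann@example.com,HS1,A17
    01234568,ga34xyz,Max,Mustermann,max.mustermann@example.com,HS1,A18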
Note
Registering students for the exam alone does not allow them to participate. The individual student exams must be generated first.
Note
Artemis also supports validating participants’ signatures for on-site exams.
For more information, please see Exam Participation Checker.
You can also remove students from the exam. When you do so, you can optionally also delete the participations and submissions linked to the student’s exam.
A student exam represents the exam of an individual student. It consists of an individual set of exercises based on the configured exercise groups.
Student exams are managed via the Student Exams page.
Here you have an overview of all student exams. When you press View on a student exam, you can see the details of the student, the allocated working time, their participation status, their summary, and their scores. Additionally, you can see which assessor is responsible for each exercise.
Note
You can change the individual working time of students from here. The screenshot Individual Working Time below shows where you can do that.
To generate the student exams, click the corresponding button. This triggers Artemis to create a student exam for every registered user.
This button is locked once the exam becomes visible to the students. You cannot change student exams once the exam conduction has started.
If you have added more students recently, you can choose to generate student exams only for the newly added students.
A separate preparation step creates a participation for each exercise for every registered user, based on their assigned exercises. It also creates the individual repositories and build plans for programming exercises. This action can take a while if there are many registered students, due to the communication between the version control (VC) and continuous integration (CI) servers.
On the Student Exams page, you can also maintain the repositories of the student exams. This functionality only affects programming exercises. You can choose to lock or unlock all student repositories.
Note
Artemis locks and unlocks the student repositories automatically based on the individual exam start and end date. These buttons are typically not necessary unless something went wrong.
Test runs are designed to offer the instructors confidence that the exam conduction will run smoothly. They allow you to experience the exam from the student’s perspective. A test run is distinct from a student exam and is not taken into consideration during the calculation of the exam scores.
You can manage your test runs from the Test Run page.
To create a new test run, press the corresponding button. This opens a popup where you can select an exercise for each exercise group. You can also set the working time. A test run has as many exercises as there are exercise groups; it does not consider the number of exercises set in the exam configuration.
Note
Exercise groups with no exercises are ignored.
Create test run popup with one exercise variant selected for each exercise group.
When you start the test run, you conduct the exam just as a student would. You can create submissions for the different exercises and end the test run.
An instructor can also assess their test run submissions. To do this, you must have completed at least one test run. To navigate to the assessment screen of the test runs, click the corresponding button.
Test run conduction marked with the banner on the top left.
Note
Only the creator of the test run is able to assess their submissions.
You can view the results of the test run assessment by clicking the corresponding button. This page simulates the Student Exam Summary, where students can view their submissions and results once they are published.
Here, instructors can also use the complaint feature and respond to complaints in order to go through the full exam timeline.
Note
You should delete test runs before the actual exam conduction takes place.
After you create an exam, the exam checklist appears at the top of the exam’s detail page.
The exam checklist helps you oversee and ensure every step of the exam is executed correctly.
You can track the progress of the steps mentioned in this document and spot missed steps easily.
Each row of the checklist includes the name of the task, a description and a short summary where applicable, and a page column that navigates instructors to the relevant action.
Going through each task from the start up to the current one and making sure the description column contains no warnings or errors helps instructors conduct the exam smoothly.
The exam conduction starts when the exam becomes visible to the students and ends when the latest working time is over. Once the conduction has started, you can no longer change the exam configuration or the individual student exams. Students can then access and start their exam and submit solutions to the exercises within their individual working time. After a student submits the exam, they cannot change their exercise submissions. For more information, see participating in the online exam.
In case you have to update a programming, modeling, text, or file-upload exercise during the exam, you can go to the exercise detail page and click the edit button. At the bottom of the exercise edit page, you can enter a notification text that is shown to the students in the exam mode. The screenshot below shows an example notification. You can see how the updated problem statement looks for the student in Updated Problem Statement during the Exam.
The assessment begins as soon as the latest student exam working time is over.
During this period, your team can assess the submissions of the students and provide results.
Artemis executes the test suites for programming exercises automatically and grades these.
You can enhance the automatic grading with a manual review.
You can also trigger the automatic grading of the quiz exercises via the Manage Student Exams Screen.
If you want, you can also enable the second correction feature for the exam.
Once the exam conduction is over and the latest individual working time has passed, your team can begin the assessment process.
This is done through the Assessment Dashboard.
Note
If the exam conduction is not over, you will not be able to access this page.
The assessment process is anonymized: Artemis hides personal student data from the assessors.
The Assessment Dashboard provides an overview of the current assessment progress per exercise. For each exercise, you can see how many submissions have already been assessed and how many are still open. The status of the student complaints is also displayed here.
Additionally, once the exam conduction ends, you can click the corresponding button. This action evaluates the quiz submissions of all student exams and assigns an automatic result.
Note
If you do not press this button, the students’ quiz exercises will not be graded.
After the exam conduction ends, you can click the corresponding button. This action automatically grades the submissions of unsubmitted student exams with 0 points. Additionally, empty submissions are automatically graded with 0 points.
Note
If you do not press this button, the unsubmitted student submissions and the empty submissions will appear in the assessment dashboard of the exam, which leads to unnecessary effort during grading.
To assess a submission for an exercise, click the corresponding button.
Your assessors must first complete the example submissions and assessments if you have attached them to the exercise; see 1.4 Add Exercises.
If there is a submission that has not been assessed yet, you can click the respective button. This fetches a random student submission of this exercise, which you can then assess.
Artemis grades programming exercises automatically. However, if the exercise allows a manual assessment, you can review and enhance the automatic results.
You can trigger Artemis to automatically grade quiz exercises via the Manage Student Exams Screen. Therefore, quiz exercises do not appear in the Assessment Dashboard.
Set the number of correction rounds of the exam to 2.
When the second correction is enabled, the assessment progress can be observed in the Assessment Dashboard.
There you can see the state of the individual correction rounds, and the state of the complaints.
You can toggle whether tutors can assess specific exercises in the second round. Disabling the second correction again does not affect already created second assessments.
The second correction round can be enabled or disabled at any time.
To assess a submission a second time, go to the exercise assessment dashboard. When the second correction is enabled, a corresponding button is visible for the second correction round.
The new second assessment has all the feedback copied from the first assessment. This feedback can be overridden, and new feedback can be added as well. This does not override the original result but saves a separate second result.
Within the second correction round review, instructors and tutors can highlight which feedback was created in which correction round. This is displayed as a badge at the bottom of every feedback item. This view can be enabled or disabled at any time during the second correction round review by pressing the button at the top of the page. The feature is currently available for text, modeling, and file-upload exercises.
You can access each assessment of both rounds by navigating from the exam to the respective exercise.
You can check for suspicious behavior in the suspicious behavior dashboard. You can find it as the first step of the exam correction on the exam checklist page.
To open the suspicious behavior dashboard, click the corresponding button.
The dashboard allows you to detect exam sessions that fulfill certain criteria and gives an overview of the plagiarism detection.
An exam session is a unique combination of IP address, user agent, instance id, session token, and browser fingerprint. It is created whenever a student enters their exam.
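Conceptually, an exam session can be pictured as a record like the following (an illustrative Python sketch only):

    from dataclasses import dataclass

    @dataclass
    class ExamSession:
        """One exam session; a new one is created each time a student enters their exam."""
        ip_address: str
        user_agent: str
        instance_id: str
        session_token: str
        browser_fingerprint: str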
You can see the available analysis options in the first screenshot below. To start the analysis, click the button. When the analysis detects at least one suspicious case, the instructor can click the button to see the details of the suspicious exam sessions.
The second screenshot shows an example of analysis results.
In the lower half of the dashboard, the instructor can view the plagiarism detection overview. It only lists exercises that support plagiarism detection. The number of potential plagiarism results is the number of results returned when running the detection. The number of plagiarism results is the number of cases an instructor has classified and confirmed as plagiarism.
To view the current detection results or run a detection, click the corresponding button. This navigates to the plagiarism detection page of the exercise.
Once the instructor has confirmed at least one case, a button that navigates to all confirmed plagiarism cases of the exam becomes visible at the bottom left of the table.
Artemis also allows you to detect plagiarism attempts.
Artemis does this by analyzing the similarity between all student submissions and flagging pairs that exceed a given threshold. You can compare all flagged submissions side by side and confirm plagiarism attempts.
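The basic idea behind the flagging step can be sketched like this (illustrative Python only; Artemis uses its own comparison algorithms and configurable thresholds):

    from itertools import combinations

    def flag_suspicious_pairs(submissions, similarity, threshold=0.8):
        """Return all pairs of submissions whose similarity meets or exceeds the threshold.

        submissions: dict mapping a student login to their submission content.
        similarity:  a function (content_a, content_b) -> similarity value in [0, 1].
        """
        flagged = []
        for (student_a, text_a), (student_b, text_b) in combinations(submissions.items(), 2):
            score = similarity(text_a, text_b)
            if score >= threshold:
                flagged.append((student_a, student_b, score))
        # Flagged pairs can then be reviewed side by side and confirmed or rejected.
        return flagged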
Instructors can download a CSV report of accepted and rejected plagiarism attempts for further processing on external systems.
To apply the plagiarism check, you must navigate to the individual exercise. This can be done by navigating from the exam’s exercise groups to the respective exercise title.
Detecting Plagiarism attempts on Modeling Exercises
At the bottom of the page, you will find the option to check for plagiarism.
Artemis stores the current state of a submission for text, modeling, and quiz exercises every 30s or whenever the student clicks the save button.
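The saving behaviour can be pictured roughly as follows (a simplified Python sketch assuming a hypothetical save_draft function; the actual client works differently):

    import time
    from datetime import datetime

    def autosave(get_current_answer, save_draft, interval_seconds=30):
        """Periodically store a timestamped snapshot of the current answer.

        get_current_answer: returns the student's current (possibly unsaved) answer.
        save_draft:         persists the snapshot; clicking the save button
                            triggers the same call immediately.
        """
        while True:
            save_draft({"saved_at": datetime.now(), "content": get_current_answer()})
            time.sleep(interval_seconds)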
As an instructor, you can view all those states, as well as the submissions for file-upload and programming exercises, using the exam timeline.
The exam timeline is available via the corresponding button on the detail page of a student exam once the student exam has been submitted. If the exam has not been submitted yet, the exam timeline button is disabled and shows an explanatory tooltip.
The exam timeline shows all submissions of the student in chronological order. You can navigate between the different timestamps using the slider, or switch between exercises using the navigation bar below the timeline.
For programming exercises, you can view a git diff between the currently selected submission and the previous submission or the template of this exercise.
Optionally, you can create a grading key for your exam by clicking the corresponding button at the top of the exam’s detail page.
Defining a grading key allows Artemis to convert the exam score to a grade automatically. Students are then able to see their own grades after the specified release date of results.
Using a grading key also enhances the generated statistics so that the instructor is able to view grade distributions.
For an easy out-of-the-box configuration, you can apply the default grading key and then click Save.
By default, grades are defined as percentages of the total obtainable score. You can also display their point equivalent if you specify the maximum number of points for the exam.
If you would like to define custom grade steps, you can use the button and modify the grade step intervals.
Note
Keep an eye out for the warnings at the bottom of the page to ensure that the grading key is valid.
The Inclusivity field allows you to decide which grade is assigned if the student’s score is exactly equal to a boundary value between two grades.
There are two grade types you can use: Grade and Bonus. The Grade type allows you to set a final grade for the exam with custom grade step names, while the Bonus type allows you to assign bonus points to each grade step so they can contribute to the grade of another course or exam.
Note
If the grade type is Grade, you should set the first passing grade.
For more fine-grained control, you can switch to the detailed editing mode and set the grade step bounds manually.
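To illustrate how a grading key maps an achieved score to a grade (a minimal Python sketch with made-up grade steps; the actual mapping is whatever you define in the editor):

    def grade_for_score(score_percent, grade_steps, lower_bound_inclusive=True):
        """Map an achieved score (in percent) to a grade step.

        grade_steps: list of (lower_bound, upper_bound, grade_name) covering 0-100 %.
        lower_bound_inclusive: corresponds to the Inclusivity setting and decides
            which grade applies when the score hits a boundary value exactly.
        """
        for lower, upper, grade in grade_steps:
            if lower_bound_inclusive:
                if lower <= score_percent < upper or (upper == 100 and score_percent == 100):
                    return grade
            else:
                if lower < score_percent <= upper or (lower == 0 and score_percent == 0):
                    return grade
        return None

    # Illustrative five-step key with "4.0" as the first passing grade:
    steps = [(0, 50, "5.0"), (50, 65, "4.0"), (65, 80, "3.0"), (80, 90, "2.0"), (90, 100, "1.0")]
    print(grade_for_score(72.5, steps))  # -> 3.0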
The export and import buttons enable you to save the grading key as a CSV file and re-use it in other courses and exams.
You can specify the moment when Artemis publishes the results of the exam; see 1.2.2 Create Exam. This is usually when the exam assessment ends, but you can set it to any point in time. Once the results are published, students can view them on their summary page. You can also view the exam statistics on the exam Scores page and export the data as a CSV file to external platforms such as TUM Online; see 4.1 Exam Scores.
You can access the exam scores by clicking the corresponding button. This view aggregates the results of the students to provide an overview of their performance.
You can view the spread between different achieved scores, the average results per exercise, as well as the individual students’ results.
Additionally, you can choose to modify the dataset by selecting only include submitted exams or only include exercises with at least one non-empty submission.
Note
Unsubmitted exams are not eligible for the assessment process.
Review student performance using various metrics such as average, median and standard deviation.
Unsubmitted exams are not eligible for assessment and therefore appear as having no score. The corresponding students are assigned a special no-participation grade if a grading key exists. It can also happen that an exercise is not part of any student exam; this is the case when Artemis selects a different exercise of the same exercise group for every student exam. Similarly to unsubmitted exams, such exercises can distort the results and statistics of the exam. By excluding unsubmitted exams and exercises that were not part of the exam conduction, you gain a more realistic overview of the students’ performance.
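A simplified sketch of this kind of filtering (illustrative Python only):

    from statistics import mean, median, pstdev

    def exam_statistics(student_exams):
        """Compute basic score statistics, excluding unsubmitted exams.

        student_exams: list of dicts such as {"submitted": True, "score": 78.5}
        """
        scores = [exam["score"] for exam in student_exams if exam["submitted"]]
        if not scores:
            return None
        return {
            "average": mean(scores),
            "median": median(scores),
            "standard_deviation": pstdev(scores),
        }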
Review the students’ perceived difficulty of every exercise to improve future exams.
The exam scores can also be exported via the corresponding button. This is useful for uploading the results as a CSV file to university systems like TUM Online.
The exported CSV file includes the students’ name, username, email, registration number, their assigned exercises, their score for every exercise, overall exam points, overall exam score, grades (before bonus, if a bonus is configured), presentation score, submitted (yes/no), and passed (yes/no) values.
If a bonus is configured, the file also contains bonus grades and the final grade.
If there is at least one plagiarism verdict in the exam, the file also contains the plagiarism verdicts.
If there is at least one plagiarism verdict in the bonus source, the file also contains the plagiarism verdicts in the bonus course/exam.
The exported CSV file also contains aggregated statistics of the exam conduction, such as the number of participations and the average score per exercise.
Optionally, you can publish the example solutions of text, modeling, file-upload, and programming exercises to students after a desired date by setting the example solution publication date of the exam to a non-empty date.
All example solutions of these exercises are published according to this single date set in the exam, as opposed to course exercises, which have their own individual example solution publication dates.
The example solution publication date can be empty; in this case, solutions are never published. This is the default.
If set, the example solution publication date must be the same as or later than visible from and end of working time, if those are set.
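The rule can be sketched as a small validation check (illustrative Python, not the actual server-side validation):

    from datetime import datetime
    from typing import Optional

    def publication_date_is_valid(publication_date: Optional[datetime],
                                  visible_from: Optional[datetime],
                                  end_of_working_time: Optional[datetime]) -> bool:
        """An empty publication date means the example solutions are never published."""
        if publication_date is None:
            return True
        for other_date in (visible_from, end_of_working_time):
            if other_date is not None and publication_date < other_date:
                return False
        return True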
During the review period, students have the opportunity to review the assessment of their exam. If they find inconsistencies, they can submit complaints about perceived mistakes made in the assessment. Students can provide their reasoning through a text message to clarify their objections. You can set the student review period in the exam configuration, see 1.2.2 Create Exam.
Students can submit complaints about their assessment in the Summary page.
During the student review, a complaint button will appear for every manually assessed exercise.
Students cannot submit complaints for automatically assessed exercises like quiz and programming exercises.
Students can submit a complaint for programming exercises if the automatic result has been manually reviewed by an assessor. This is only possible if manual assessment is enabled for the programming exercise.
Note
If you have found a mistake in the automatic assessment of quiz and programming exercises, you can edit those and re-trigger the evaluation for all participants.
For more information on how students can participate in the student review and submit complaints, see student summary guide.
Artemis collects the complaints submitted by the students during the student review. You can access and review the complaints similarly to the submissions from the Assessment Dashboard. Every assessor can evaluate a complaint about the assessments of their peers and either accept or reject it. Artemis automatically updates the results of accepted complaints. You can view the updated scores immediately on the Scores page. There you can also export the updated data in CSV format; see 4.1 Exam Scores.
The complaints appear below the exercise submissions.
The original assessor of an assessment cannot respond to the complaint. A second assessor must review the complaint and respond to it.
Artemis tracks the progress of the complaint assessment and displays a progress bar in the Assessment Dashboard. This allows you to keep track of the complaint assessment and see how many open complaints are left.