themis's People

Contributors

alexanderg2207, alexexcellence, botrufus, crustypluto19, huber-florian, katjanakosic, maximiliansoelch, mtze, pal03377, tcw0, terlan98, tomrudnick


themis's Issues

`Programming and Text Exercise`: Context menu issues

  • The context menu interferes with highlighting when the user keeps their finger on the submission content for too long before dragging. We should either fix this or disable the context menu.

    highlight.context.menu.mov
  • For programming exercises, we should also remove the 'Feedback' option from the context menu. It is not available for other exercise types and can cause invalid selections and confusion.

This issue was found during the user study.

Diff not displayed in read-only mode for student changes to template code

We have identified a bug in our app where the diff highlighting changes students made to the template code is not displayed in read-only mode. This issue affects instructors' ability to review and provide feedback on student work.

When a student modifies the template code, a diff is generated to highlight those changes. This is an important feature for instructors, as it helps them understand what the student changed and provide targeted feedback.

However, the diff is not displayed in read-only mode, so instructors cannot see the students' changes there. This is a critical issue because it hinders instructors' ability to review student work.

Improve performance when resizing CodeView with larger files

We have identified a performance issue in our platform that affects the user experience when resizing the CodeView with larger files. When the CodeView is resized, which happens when the sidebars are adapted, there is a noticeable lag in the redrawing of the line gutter and the source code.

This issue is affecting the user experience for users who need to grade larger files, and it can slow down the grading and review process. We have received feedback that this lag is frustrating and hinders the ability to work efficiently.

To improve the performance of the CodeView, we propose implementing a series of optimizations that will reduce the lag when resizing the view. These optimizations could include implementing more efficient rendering algorithms, optimizing memory usage, and improving the performance of the backend logic that handles the redrawing of the line gutter and source code.
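One concrete optimization along these lines would be to redraw only the lines that are actually visible instead of the whole file. A minimal sketch; all parameter names (`scrollOffset`, `viewportHeight`, `lineHeight`) are hypothetical and not taken from the Themis codebase:

```swift
// Sketch: instead of redrawing every line of a large file on each resize
// event, compute the range of lines that is actually visible and redraw
// only those. In the app, the offsets would typically be CGFloats.
func visibleLineRange(scrollOffset: Double,
                      viewportHeight: Double,
                      lineHeight: Double,
                      totalLines: Int) -> Range<Int> {
    guard lineHeight > 0, totalLines > 0 else { return 0..<0 }
    // First line whose top edge is at or above the viewport's top.
    let first = max(0, Int(scrollOffset / lineHeight))
    // Last line that still intersects the bottom of the viewport.
    let last = min(totalLines, Int(((scrollOffset + viewportHeight) / lineHeight).rounded(.up)))
    return first..<max(first, last)
}
```

Coalescing resize events (redrawing only once the sidebar drag settles) would complement this approach.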

`General`: UX Issues

The following issues were found during the user study:

Desirable:

  1. #245
  2. Device gets very warm (could be because of screen sharing during the study)
  3. Make dark mode togglable
  4. Add settings that can be changed from the Settings app (wipe temporary files, change between dev and normal mode)

Please create individual GitHub issues for those listed above and link them here.

`Assessment View`: Apple Pencil interferes with the inline feedback mode

If the user has an Apple Pencil, the inline feedback mode seems to get toggled automatically. This leads to the following issues:

  • The toolbar button doesn't work as expected. The user needs to toggle it on and off several times to manually enable/disable the feedback mode

    toggle_bug.mov
  • Inline feedback can be added to files belonging to the solution repository

    solution_repo_bug.mov

This bug was found during the user study. I could not reproduce it myself since I don't have access to a real device with Apple Pencil.

`Assessment View`: UX Issues

The following issues were found during the user study:

Critical:

  1. The inline feedback button is hard to find for new users – #219
  2. Feedback mode should be enabled by default (for non-programming exercises) – #203

Moderate:

  1. #202
  2. Pencil icon next to the exercise name is confusing as it looks like a button – #218

Desirable:

  1. #208
  2. #209
  3. #210
  4. Score picker gets smaller when the keyboard appears
  5. Save button should be greyed out if there are no new changes
  6. Cannot add feedback without text
  7. Make the exercise name horizontally scrollable so the full name can be read
  8. File tree icons have poor visibility in dark mode
  9. Grey toolbar buttons look as if they are disabled; use a different color to represent the inactive state
  10. Undo and redo buttons should be individual for each file
  11. Add a magnified peek into what's under my finger when highlighting
  12. Save button has no indication of success

Please create individual GitHub issues for those listed above and link them here.

Allow users to accept grading instructions in-app

Currently, when a user starts grading a new exercise, they first need to switch to the web client to accept the grading instructions. This extra step is time-consuming and interrupts the grading process, leading to delays in completing assessments.

To improve the user experience and streamline the grading process, we propose adding a feature that allows users to accept grading instructions directly within the app. This could be achieved via an alert that appears within the app when the user is ready to start grading.

The alert would display the grading instructions along with a button to accept them. Once the user accepts, they will be able to continue with the grading process without having to switch to the web client.

This feature will save users time and make the grading process more efficient, which will be especially useful for users who grade projects frequently.
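The state behind such an alert could be as simple as a flag on the grading session. A minimal sketch; the `GradingSession` type is hypothetical, and in the app the flag would drive a SwiftUI alert showing the instructions together with an Accept button:

```swift
// Hypothetical model sketch: grading is gated behind accepting the
// grading instructions, so the alert only needs to flip one flag.
struct GradingSession {
    let exerciseName: String
    let gradingInstructions: String
    private(set) var instructionsAccepted = false

    // Grading is only allowed once the instructions have been accepted.
    var canStartGrading: Bool { instructionsAccepted }

    init(exerciseName: String, gradingInstructions: String) {
        self.exerciseName = exerciseName
        self.gradingInstructions = gradingInstructions
    }

    // Called from the alert's Accept button.
    mutating func acceptInstructions() {
        instructionsAccepted = true
    }
}
```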

Improve Default Text for Correction Guideline Feedback Templates

Currently, when a user selects a predefined correction guideline feedback template with a blank "Feedback" field, the default text shown in the feedback box is not very helpful. It reads "Add feedback for students here (visible for students)", which is too generic and provides no guidance to either the tutor or the student.

As a result, we would like to improve the placeholder text for correction guideline feedback templates. Our goal is to provide students with more specific feedback on how to improve their submission following the correction guidelines, even when there is no text provided by the instructor.

To achieve this goal, we propose the following solution:

Modify the placeholder text based on the selected correction guideline feedback template.
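As an illustration of what this could look like, here is a minimal sketch assuming a hypothetical `GradingCriterion` type; the placeholder wording itself is only an example:

```swift
// Hypothetical sketch: when the selected template's "Feedback" field is
// blank, derive the placeholder from the guideline instead of showing
// the generic default text.
struct GradingCriterion {
    let title: String
}

func placeholderText(for selectedCriterion: GradingCriterion?) -> String {
    guard let criterion = selectedCriterion else {
        // Generic default when no guideline template is selected.
        return "Add feedback for students here (visible for students)"
    }
    // Guideline-specific placeholder (wording is an illustration only).
    return "Explain how the submission relates to '\(criterion.title)' (visible for students)"
}
```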

`General`: Make UI suitable for small screen sizes and enable split view

Split view would be useful for scenarios where the user wants to see an earlier assessment while assessing a new submission. Enabling it is quite straightforward: remove the tick from the 'Requires full screen' checkbox in the project target settings (General tab):
Screenshot 2023-10-27 at 14 09 33
However, Themis is made for the iPad and does not work well on small screen sizes. Splitting the screen leads to a result similar to running the app on an iPhone, so the app needs to be adjusted for small screen sizes first.

`Programming Exercise`: Switching files after browsing a different repository leads to a crash

Requirements:

  • 1 assessment with inline feedbacks

Steps to reproduce:

  1. Change to solution or template repo
  2. Change back to the student repo
  3. Tap on inline feedbacks on the correction sidebar to open files containing inline references
  4. Notice that the filetree does not highlight the currently open file
  5. Tap on the currently open file name on the file tree
  6. Notice that the code editor is stuck with the 'Loading...' text
  7. Tap on any other file on the file tree
  8. 💥
repository_crash.mov

Switching between repos somehow detaches the file tree from the correction sidebar, leaving it unaware of which files are currently open.

This bug was found during the user study.

`General`: Create a short tutorial video for users

One of the user study participants mentioned that it would be nice to have a short tutorial video to train tutors to use Themis. We were already planning to include such videos in the documentation, so this issue will be closed with a documentation PR.

`Programming Exercise`: Show static code analysis- and submission policy-related automatic feedbacks properly

Currently, we use the `testCase.testName`, `text`, and `detailText` properties of automatic feedbacks to show them on the correction sidebar. However, we have no logic for handling automatic feedback related to static code analysis (SCA) or submission policy issues. Such feedbacks contain prefixes in their `text` field, defined in `Feedback.java`:
Screenshot 2023-09-26 at 19 16 03
We should check for these prefixes and handle them.
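A minimal sketch of such prefix handling; the prefix strings below are assumptions for illustration, and the authoritative values are the identifier constants defined in `Feedback.java`:

```swift
// Hypothetical sketch: classify automatic feedback by the prefix of its
// text field. The exact prefix strings are assumptions; check the
// identifier constants in Artemis' Feedback.java for the real values.
enum AutomaticFeedbackKind: Equatable {
    case staticCodeAnalysis
    case submissionPolicy
    case testCase
}

func classify(feedbackText: String) -> AutomaticFeedbackKind {
    if feedbackText.hasPrefix("SCAFeedbackIdentifier:") {     // assumed prefix
        return .staticCodeAnalysis
    }
    if feedbackText.hasPrefix("SubPolFeedbackIdentifier:") {  // assumed prefix
        return .submissionPolicy
    }
    // Everything else is treated as regular test-case feedback.
    return .testCase
}
```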

This is how the automatic feedbacks with test cases currently look:
Screenshot 2023-09-26 at 19 12 37

This issue was reported by @Strohgelaender

`Course View`: UX Issues

The following issues were found during the user study:

Critical:

  1. #201

Desirable:

  1. Logout button can be tapped accidentally
  2. No search function
  3. Urgent exercises should be highlighted
  4. Show semesters
  5. Long tap gesture on exercise names should copy link to the exercise on Artemis
  6. Support deep links for Artemis exercise URLs

Please create individual GitHub issues for those listed above and link them here.

`Programming Exercise`: UX Issues

The following issues were found during the user study:

Moderate:

  1. #205
  2. #221

Desirable:

  1. #204
  2. #206
  3. Highlight files changed by the student on the file tree
  4. Add pinch gesture to change font size
  5. View student repo and solution repo side-by-side
  6. Long tap on a file on file tree should copy file name
  7. Add a toolbar button to increase the line height

Please create individual GitHub issues for those listed above and link them here.

`Text Exercise`: UX Issues

The following issues were found during the user study:

Critical:

  1. #212

Moderate:

  1. #211
  2. #221

Desirable:

  1. Make the right sidebar collapsible

Please create individual GitHub issues for those listed above and link them here.

CourseView does not group the exercises correctly

CourseView shows some exercises in the 'Currently in Assessment' section even though their submission deadline has not passed yet. When the user tries to assess such an exercise, the app shows an unknown error.

exercise.grouping.bug.mov
not.due.exercise.error.mov

`Assessment View`: Feedback mode should be enabled by default

Some users expect the feedback mode to be enabled by default and try highlighting elements directly after starting an assessment. They suggested making it enabled by default. However, I think this should only be done for non-programming exercises since the users are likely to open files and scroll to read some code before adding inline feedback.

toolbar-feedback-button
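The proposed default could come down to a one-line check on the exercise type. A minimal sketch, assuming a hypothetical `ExerciseType` enum:

```swift
// Hypothetical sketch: enable the feedback mode up front for every
// exercise type except programming, where users usually open files and
// read code before adding inline feedback.
enum ExerciseType {
    case programming
    case text
    case modeling
    case fileUpload
}

func feedbackModeEnabledByDefault(for type: ExerciseType) -> Bool {
    return type != .programming
}
```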

ExerciseView does not update the submissions on pull-to-refresh

When the user pulls to refresh the ExerciseView, the submissions are not updated. This creates problems when, for example, a submission gets assessed over Artemis and the user wants to see the latest state on Themis without going back and forth between CourseView and ExerciseView.

refresh.bug.mov

`Text Exercise`: Single words are hard to select

Currently, the highlight feature is not triggered when the user simply taps on text. To select a word, the user needs to put their finger on a word and slightly drag to trigger it.

This issue was found during the user study.

Improve User Interface by Separating Static Code Analysis Feedback from Test Case Feedback

Currently, our user interface displays both static code analysis feedback and test case feedback in the same location. While this design may have worked well in the past, it has become difficult for users to distinguish between the two types of feedback.

As a result, we would like to improve our user interface by visually separating the static code analysis feedback from the test case feedback. This will make it easier for users to quickly identify which feedback corresponds to which type of analysis.

To achieve this goal, we propose the following change to our user interface:

Add a new section to the user interface specifically for static code analysis feedback. This section will be visually distinct from the test case feedback section, making it clear to users which feedback corresponds to static code analysis.

This change will make it easier for users to interpret and act on the feedback provided by our application, leading to a more efficient and effective grading process.

`Course View`: The course picker is not very obvious to new users

We found in the user study that people have trouble figuring out how to switch courses. A simple fix could be adding a 'Course:' label in front of the picker to make it more obvious. One study participant suggested making it similar to Slack's channel selection.

course-picker

`Login View`: UX Issues

The following issues were found during the user study:

  1. The 'Select university' button is too close to the home button in landscape mode on small iPads.

  2. The select button in the university selection sheet does not state whether it selects a university or the custom instance; change it to 'Select Custom Instance'.

  3. Test servers should be included in the university selection sheet by default in the developer mode (we also need to create this mode first)

  4. Adjust input validation based on the selected university

Please create individual GitHub issues for those listed above and link them here.

`Assessment View`: Show the submission ID

Currently, the user can't see which submission they are assessing. This is especially inconvenient when there are multiple open submissions in the Exercise View and the user selects one to assess. A label with the submission ID on the Assessment View could help ensure that the user did not accidentally open the wrong submission.

Add Option to Edit Selected Range for Inline Feedback

Currently, when giving inline feedback, it can be difficult to precisely select the intended range for feedback. Users are only able to check if the correct range is selected by sliding down the feedback sheet and peeking at the highlighted code. This process is time-consuming and can be frustrating for users who want to provide precise feedback.

As a result, we would like to add an option to edit the selected range for inline feedback. This will allow users to make more precise selections and provide more specific feedback.

To achieve this goal, we propose the following solutions:

  • Add a split view option that displays both the code and the feedback sheet. This will allow users to see the selected range in context and make adjustments as necessary.

  • Provide users with the ability to edit the selected range directly in the split view. This could be done by allowing users to drag range markers to adjust the selection.

  • Update the feedback sheet to display the edited range in real-time. This will allow users to see the changes they have made and ensure that the selected range is accurate.

Overall, these changes will make it easier for users to provide precise feedback by allowing them to edit the selected range in a split view. This will improve the overall user experience and lead to more effective feedback.
