User Journeys 2022 (Draft)
This page outlines user journeys in the ARIA-AT App for the user personas defined in the working mode. The specific scenarios and use cases are also derived from the working mode. These user journeys will be used to prioritize the design and development of new features, user interfaces, and user flows for ARIA-AT App in 2022.
Status: DRAFT
- Use Case Overview: When a Test Developer indicates a Test Plan is ready for community review, the Admin will then make it available through the Test Queue in the app.
- Trigger: The Admin has been notified that a new Test Plan is ready for review and has been merged into the main branch.
- Precondition: The Test Plan has been merged into main.
- User Journey:
- Navigates to the Test Queue.
- Opens "Add Test Plans to the Test Queue" disclosure.
- Selects a Test Plan from the Test Plan dropdown.
- Selects a Test Plan Version from the Test Plan Version dropdown.
- Selects an Assistive Technology from the dropdown.
- Selects a Browser from the dropdown.
- Clicks the "Add Test Plan to Test Queue" button.
- Use Case Overview: When at least two testers have fully executed a Test Plan in at least one browser and results from the different testers conflict with one another due to tester interpretation.
- Trigger: The Admin has been notified (or noticed themself) that a Test Plan has been fully executed by at least two testers and there are conflicts.
- Precondition: A Test Plan run is fully executed by at least two testers and the "Conflicts" label is displayed for the Test Plan in the Test Queue. A sketch of how conflicting results can be detected follows this journey.
- User Journey:
- Navigates to the Test Queue.
- Identifies the Test Plan with conflicts.
- Clicks the "Opens run as" button.
- Selects a Tester from the "Open run as dropdown" who has completed the Test Run.
- Lands on the Test Run Page.
- Navigates through the Test Navigator to identify tests with conflicts.
- Clicks a Test with conflicts in the Test Navigator.
- Clicks the "Review Conflicts" button in the Alert displayed above the Test.
- Reviews the conflict details displayed in the modal.
- Facilitates conversation with testers to identify which result should be modified.
- Click the "Edit Results" button underneath the Test.
- Modifies the test results.
- Clicks the "Submit Results" or the "Next Test" button.
- Use Case Overview: When at least two testers have fully executed a Test Plan in at least one browser and results from the different testers conflict with one another due to an error in the test.
- Trigger: The Admin has been notified (or noticed themself) that a Test Plan has been fully executed by at least two testers and there are conflicts.
- Precondition: A Test Plan run is fully executed by at least two testers and the "Conflicts" label is displayed for the Test Plan in the Test Queue.
- User Journey:
- Navigates to the Test Queue.
- Identifies the Test Plan with conflicts.
- Clicks the "Opens run as" button.
- Selects a Tester from the "Open run as dropdown" who has completed the Test Run.
- Lands on the Test Run Page.
- Navigates through the Test Navigator to identify tests with conflicts.
- Clicks a Test with conflicts in the Test Navigator.
- Clicks the "Review Conflicts" button in the Alert displayed above the Test.
- Reviews the conflict details displayed in the modal.
- Clicks the "Raise an Issue for Conflict" to create a Github Issue.
- Facilitates Community Group conversation about the failure.
- Use Case Overview: When at least two testers have fully executed a Test Plan in at least one browser with each in-scope assistive technology and all review issues are closed.
- Trigger: The Admin has been notified (or noticed themself) that a Test Plan no longer has any conflicts.
- Precondition: All review issues are closed and at least two testers have generated equivalent test results in at least one browser with each in-scope assistive technology.
- User Journey:
- Navigates to the Test Queue.
- Identifies the Test Plan that went through review.
- Clicks the "Mark as Candidate" button.
- User Journey gaps and/or suggested changes: We need to change "Mark as in Review" to be "Mark as Candidate".
- Use Case Overview: When a test plan has been in the Candidate phase for at least 120 days and there are no open test plan issues.
- Trigger: The Admin has been notified (or noticed themself) that a Test Plan has been in the Candidate phase for at least 120 days.
- Precondition: The test plan has been in the Candidate phase for at least 120 days and all issues have been resolved. A minimal eligibility check is sketched after this journey.
- User Journey:
- Navigates to the Test Queue.
- Identifies the Candidate Test Plan that is ready to be moved to the Recommended phase.
- Clicks the "Mark as Recommended" button.
- User Journey gaps and/or suggested changes: We need to change "Mark as Finalized" to be "Mark as Recommended".
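The promotion rule above amounts to a simple eligibility check: at least 120 days in the Candidate phase and zero open issues. Below is a minimal sketch, assuming the app stores the date a Test Plan entered the Candidate phase and an open-issue count; the input shapes are assumptions, only the 120-day rule comes from the use case.

```typescript
// Minimal sketch of the "ready for Recommended" check described above.
const CANDIDATE_REVIEW_PERIOD_DAYS = 120;
const MS_PER_DAY = 24 * 60 * 60 * 1000;

function isReadyForRecommended(
  candidateSince: Date,
  openIssueCount: number,
  now: Date = new Date()
): boolean {
  const daysInCandidate = (now.getTime() - candidateSince.getTime()) / MS_PER_DAY;
  return daysInCandidate >= CANDIDATE_REVIEW_PERIOD_DAYS && openIssueCount === 0;
}

// Example: promoted to Candidate on 2022-01-01, checked on 2022-06-01 with no open issues.
isReadyForRecommended(new Date('2022-01-01'), 0, new Date('2022-06-01')); // true (151 days)
```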
- Use Case Overview: When a test plan has been promoted to Recommended, it is ready to be published.
- Trigger: An AT Developer has promoted a Candidate Test Plan to Recommended.
- Precondition: A Candidate Test Plan has been promoted to Recommended.
- User Journey:
- Navigates to the Test Queue.
- Identifies Recommended Test Plan.
- Clicks the "Publish Report" button.
- User Journey gaps and/or suggested changes: Implement a "Publish Report" action to make a Test Plan Report available on the Reports page.
- Use Case Overview: When a new version of an in-scope AT is released, the Test Admin needs to make it available for testers.
- Trigger: A new version of an AT has been released.
- Precondition: None.
- User Journey:
- Navigates to the Test Queue.
- Opens "Manage Assistive Technology Versions" disclosure.
- Selects an AT from the dropdown.
- Clicks the "Add a New Version" link.
- Pastes the version number in the input field.
- Clicks the "Add Version" button.
- Use Case Overview: When an AT Developer wants to know which Candidate Test Plans are waiting for their review.
- Precondition: The Test Plan has been promoted to Candidate.
- User Journey:
- Navigates to the Test Reports page.
- User Journey gaps and/or suggested changes:
- Incorporate “phase” labels for Test Plans on the Reports page so users can differentiate Candidate from Recommended Test Plans.
- Incorporate “status” labels for Test Plans on the Reports page so AT Developers know if their help is needed and can identify progress. The proposed labels are:
- Ready for review: When a new Test Plan is moved to the Candidate phase or when all issues have been resolved.
- X open issues: When an AT Developer requests changes or leaves feedback, both of these actions culminate in the creation of a new GitHub Issue. The number of open issues should be automatically updated in the app.
- Approved: When an AT Developer has reviewed a Candidate Test Plan and considers it ready to be moved to the Recommended phase by the Admin. A sketch of how these labels could be derived follows this list.
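The proposed status labels could be derived from data the app would already track: the number of open review issues synced from GitHub and whether an AT Developer has approved the plan. A minimal sketch with assumed inputs; the parameter names are not the app's actual model.

```typescript
// Derives the proposed Reports-page status label for a Candidate Test Plan.
// Inputs are assumptions: an open review-issue count synced from GitHub and
// a flag set when an AT Developer marks the plan as Approved.
function candidateStatusLabel(openIssueCount: number, approvedByAtDeveloper: boolean): string {
  if (approvedByAtDeveloper) return 'Approved';
  if (openIssueCount > 0) {
    return `${openIssueCount} open issue${openIssueCount === 1 ? '' : 's'}`;
  }
  return 'Ready for review';
}

candidateStatusLabel(0, false); // "Ready for review"
candidateStatusLabel(3, false); // "3 open issues"
candidateStatusLabel(0, true);  // "Approved"
```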
- Use Case Overview: When an AT Developer wants to mark their review of a Candidate Test Plan.
- Precondition: The AT Developer has reviewed a Candidate Test Plan.
- User Journey:
- Navigates to the Test Reports page.
- Identifies a Candidate Test Plan from the table.
- Clicks the Candidate Test Plan name.
- Scrolls through the Report to identify the AT/Browser combination to review.
- Marks their review with one of the three available statuses: Approved, Request Changes, or Leave Feedback.
- User Journey gaps and/or suggested changes:
- Incorporate 3 actions for Candidate Test Plans with the following functionalities:
- Approved: This action will mark a Candidate Test Plan as ready to be moved to the Recommended phase by the Admin.
- Request Changes:
- This option should be available in two places: at a high level, so the AT Developer can request changes for a Test Plan in general, and within the report’s detail page, so they can do the same for a single test.
- When the AT Developer requests changes, they will be taken to a pre-populated GitHub Issue where they can elaborate. A “Request Changes” label will be automatically added to the GitHub Issue (a URL sketch follows below).
- Once the issue is saved, the “open issues” indicator in the app should reflect this. When an issue gets closed, the app should reflect this update as well.
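GitHub's new-issue page accepts `title`, `body`, and `labels` query parameters, so the pre-populated issue can be built as a plain link. A sketch with a placeholder repository path and wording; note that the `labels` parameter only takes effect for users who can set labels on the repository, so the app (or a bot) may need to apply the “Request Changes” label after the issue is created.

```typescript
// Builds a link to a pre-populated "Request Changes" issue.
// The repository and issue wording below are placeholders, not the app's actual values.
function requestChangesIssueUrl(
  testPlanTitle: string,
  atName: string,
  browserName: string
): string {
  const params = new URLSearchParams({
    title: `Changes requested: ${testPlanTitle} (${atName} / ${browserName})`,
    body: [
      '## Test Plan',
      testPlanTitle,
      '',
      '## AT / Browser',
      `${atName} / ${browserName}`,
      '',
      '## Requested changes',
      '<!-- Describe the changes you are requesting. -->',
    ].join('\n'),
    // Only applied automatically for users who can set labels on the repository.
    labels: 'Request Changes',
  });
  return `https://github.com/w3c/aria-at/issues/new?${params.toString()}`;
}
```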
- Leave Feedback:
- Like the “Request Changes” action, this option should also be available in two places: at a high level, so the AT Developer can leave feedback for a Test Plan in general, and within the report’s detail page, so they can do the same for a single test.
- When this option is clicked, the AT Developer will be taken to a pre-populated GitHub Issue where they can elaborate on the feedback they want to provide. A “Feedback” label should be automatically added to the GitHub Issue. Once the issue is saved, the “open issues” indicator in the app should reflect this. When an issue gets closed, the app should reflect this update as well (a sketch of keeping this count in sync follows below).
- There is currently a “Raise an issue” button when looking at a single test in a Test Plan Report. This should be removed to avoid confusion.
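One way to keep the “open issues” indicator in sync is to count matching open issues with GitHub's issue search API and refresh that count on page load or on a schedule. A sketch only: the repository, label, and title filter below are assumptions.

```typescript
// Counts open review issues for a Test Plan via GitHub's issue search API.
// The repository, label, and title filter below are assumptions for illustration.
async function countOpenReviewIssues(testPlanTitle: string): Promise<number> {
  const query = [
    'repo:w3c/aria-at',
    'is:issue',
    'is:open',
    'label:"Request Changes"', // a second query could count the "Feedback" label
    `"${testPlanTitle}" in:title`,
  ].join(' ');
  const response = await fetch(
    `https://api.github.com/search/issues?q=${encodeURIComponent(query)}`,
    { headers: { Accept: 'application/vnd.github+json' } }
  );
  if (!response.ok) {
    throw new Error(`GitHub search failed with status ${response.status}`);
  }
  const data = (await response.json()) as { total_count: number };
  return data.total_count;
}
```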
- Use Case Overview: When an AT Developer is reviewing a Test Plan report and wants to view the underlying Test Plan Run.
- Precondition: The AT Developer is reviewing a Test Plan report.
- User Journey:
- Navigates to the Test Reports page.
- Identifies a Candidate Test Plan from the table.
- Clicks the Candidate Test Plan name.
- Scrolls through the Report to identify the AT/Browser combination to review.
- Clicks “Open Test Plan Run”.
- User Journey gaps and/or suggested changes:
- Add an option to the Reports page so an AT Developer can review a Test Run. A read-only sketch follows this list.
- This option could be available in two places:
- Under the AT and Browser details heading, which would take the AT Developer to the first test in the Test Run page.
- Under the Test Name heading in the details page, which would take the AT Developer to that particular test in the Test Run page.
- They should not be able to make any edits.
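As a minimal sketch of the read-only behavior described above, the existing Test Run form could be rendered with its controls disabled and its editing actions hidden when opened from a report. The DOM-level approach and selectors below are assumptions, not the app's actual implementation (which would more likely be a flag on the Test Run page component).

```typescript
// Minimal sketch: switching the existing Test Run form into a read-only review mode.
// The selectors are assumptions, not the app's actual markup.
function applyReadOnlyReviewMode(testRunForm: HTMLFormElement): void {
  // Keep recorded results visible but not editable.
  for (const field of Array.from(testRunForm.elements)) {
    if (
      field instanceof HTMLInputElement ||
      field instanceof HTMLSelectElement ||
      field instanceof HTMLTextAreaElement
    ) {
      field.disabled = true;
    }
  }
  // Hide editing actions such as "Submit Results" entirely.
  testRunForm
    .querySelectorAll<HTMLButtonElement>('button[type="submit"]')
    .forEach((button) => {
      button.hidden = true;
    });
}
```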
- Use Case Overview: When an AT Developer encounters a bug in a screen reader while reviewing a Test Plan report.
- Precondition: The AT Developer is reviewing a Test Plan report.
- User Journey:
- Navigates to the Test Reports page.
- Identifies a Candidate Test Plan from the table.
- Clicks the Candidate Test Plan name.
- Scrolls through the Report to identify the AT/Browser combination to review.
- Goes through the tests in the Reports Table to review the passing and failing assertions.
- Clicks on one of the Test Names.
- Clicks the “Open Test” button.
- Identifies a screen reader bug.
- Clicks the “File Screen Reader Bug” button.
- User Journey gaps and/or suggested changes:
- We need to incorporate the “File Screen Reader Bug” button, which should probably go where the current “Raise an issue” button is.
- This button should take the AT Developer to the screen reader’s repository. A mapping sketch follows below.
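A sketch of the routing this button implies: each assistive technology maps to the tracker where its screen reader bugs are filed. Only the NVDA repository below is a known public GitHub repository; the other entries are placeholders that would need to be confirmed with each vendor.

```typescript
// Maps each assistive technology to the tracker where screen reader bugs are filed.
// Only the NVDA entry is a known public repository; the others are placeholders.
const SCREEN_READER_BUG_TRACKERS: Record<string, string> = {
  NVDA: 'https://github.com/nvaccess/nvda/issues/new',
  JAWS: 'https://example.com/jaws-bug-tracker', // placeholder
  VoiceOver: 'https://example.com/voiceover-feedback', // placeholder
};

// Returns the URL the "File Screen Reader Bug" button should open, if one is configured.
function fileScreenReaderBugUrl(atName: string): string | undefined {
  return SCREEN_READER_BUG_TRACKERS[atName];
}
```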