Challenge Summary
Overview
- User interface and user experience design
- 10-15 unique screens
- Web application
- All source files should be in Sketch only.
Round 1
Submit your initial designs for checkpoint review for the Desktop application:
- Dashboard Review Process
- Create New Review Process
- Clone Existing Review Process
- Dashboard Review Scorecards
- Create New Scorecard
Important: As part of your checkpoint submission, you must upload your submission to MarvelApp so we can provide direct feedback on your designs.
- Request a MarvelApp prototype link as described in the MarvelApp Presentation section below. You will receive 1 MarvelApp prototype for Desktop.
- Important: make sure to include a URL/link to your design within your notes.txt!
- Make sure all pages have the correct flow! Use correct file numbering (00, 01, 02, 03).
Round 2
Submit your final designs, with checkpoint feedback implemented, for the Desktop application:
- Dashboard Review Process
- Create New Review Process
- Clone Existing Review Process
- Edit Review Process
- Delete Review Process
- Dashboard Review Scorecards
- Create New Scorecard
- Clone an Existing Scorecard
- Edit Scorecard
- Delete Scorecard
Important: As part of your final submission, you must update your submission to MarvelApp.
- Use the same MarvelApp prototype you received in the checkpoint.
- Important: make sure to include a URL/link to your design within your notes.txt!
- Make sure all pages have the correct flow! Use correct file numbering (00, 01, 02, 03).
Project Background
- The purpose of this challenge is to design the complete management of the Review Process and of the Manual Review Scorecards
- By using a review process, the members' submissions will be rated
- The application will be used by copilots, reviewers and challenge architects
Background Information
Right now, on Topcoder, we have only Community Member Review (which is now called Manual Member Review). For the design track, you can think of screeners, who evaluate your submissions based on a scorecard you often see, like the General Studio Screening Scorecard. The other tracks have different manual scorecards. In addition to Manual Member Review, automated systems such as Virus Scan, SonarQube (code quality testing), API Test Harnesses, Checkmarx, and Blackduck could be used and can weigh in on a member’s submission. Also, automated tests can be set up on Marathon Matches, for example.
These review steps can be gated behind events thrown by the challenge engine itself at different moments: submission received, submission phase closes, review phase closes, etc. The user interface we are designing in this challenge will facilitate the whole review process, both manual and automatic.
Glossary
- Review Process - The entire review system encapsulating members reviewing manually as well as automated systems, which provide weighted scores for the member’s submission
- Review Step - One of the systems used to provide an evaluation of a submission
- System Event - A challenge lifecycle event (see System Events below) that can trigger review steps
- Member Review Scorecard - The existing scorecard that a reviewer fills out
Review Steps
Here are some of the initial review steps that can be wired together to build a review process, and some of what they should be able to do:
- Antivirus Scan - When a member uploads a submission, it’s sent to a virus scanner where the results are pass/fail
- SonarQube - An automatic code quality review system. The submission will be sent to a server and the results provided back to the member.
- Checkmarx - Code security review
- Blackduck - Code security review, etc.
- Manual Review Scorecard - A member downloads the submission and fills out the review scorecard
- API Test Suite - Configure an API to receive notice that there’s a submission for it to grab, test, and submit a score. NOTE - you do not need to design out what this flow looks like. Just put this in the list of Review Steps to be used.
System Events
System events are events about a challenge. They are what this review system / autopilot will monitor in order to trigger the review steps.
- Submission Phase Opened
- Submission Uploaded
- Submission Phase Closed
- Review Phase Opened
- Review Complete
- Review Phase Closed
- Appeals Phase Opened
- Appeal Submitted
- Appeal Response Submitted
- Appeals Phase Closed
- Challenge Closed
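Since these events and the review steps above are the building blocks of the whole builder UI, here is a minimal TypeScript sketch of how they might be represented. The identifiers are hypothetical, purely to make the vocabulary concrete for the design, and are not an actual Topcoder API:

```typescript
// Hypothetical identifiers, for illustration only.
type SystemEvent =
  | "SUBMISSION_PHASE_OPENED"
  | "SUBMISSION_UPLOADED"
  | "SUBMISSION_PHASE_CLOSED"
  | "REVIEW_PHASE_OPENED"
  | "REVIEW_COMPLETE"
  | "REVIEW_PHASE_CLOSED"
  | "APPEALS_PHASE_OPENED"
  | "APPEAL_SUBMITTED"
  | "APPEAL_RESPONSE_SUBMITTED"
  | "APPEALS_PHASE_CLOSED"
  | "CHALLENGE_CLOSED";

// The review steps listed above, as they might be wired under an event.
type ReviewStepKind =
  | "ANTIVIRUS_SCAN"
  | "SONARQUBE"
  | "CHECKMARX"
  | "BLACKDUCK"
  | "MANUAL_REVIEW_SCORECARD"
  | "API_TEST_SUITE";
```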
In this challenge we would like to focus on 2 Use Cases:
1) Review Processes Management (Automatic):
- View existing Review Processes, filterable by challenge type
- Drag together System Events and Review Steps, then name the process and save it to be used on challenges
- Clone an existing Review Process
2) Manual Review Scorecard Management:
- List existing Scorecards, filterable by challenge type, searchable by name
- Create a new scorecard: create question groups, create questions in those groups, and provide weights. A question makes up a % of a group, and a group makes up a % of the total scorecard score (see the worked example after this list).
- Clone an existing scorecard for reuse
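To make the weighting rule concrete: a question weighted 50% of its group, inside a group weighted 40% of the scorecard, contributes 0.5 × 0.4 = 20% of the total score.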
On all screens we will need a navigation containing the Review Processes and Manual Review Scorecards items so the user can go to those pages. When the user opens the application, he will land on the Review Processes dashboard page.
USE CASE 1: Review Processes Management
1. Dashboard Review Process
When the user comes to Review Processes page:
- The user will see a list/table of the existing processes, showing their name, creation date, creator's username, track type, and the actions the user can take on them (Edit, Clone, Delete).
- The Review Processes will be of 2 types: some are predefined (they are actually Review Steps as mentioned above: Antivirus Scan, SonarQube, Checkmarx, etc.) and those that are customized (meaning a user can create a new Review Process by mixing several Review Steps). Note: the predefined processes cannot be deleted.
- The user has the option to Create New (Review Process)
- The user has the option to search review processes by name or filter them by track type (design, development, data science)
2. Create New Review Process
This will include several steps:
- At first the user will have the option to Start from Scratch versus Clone Existing
- User selects Start from Scratch:
- He will then enter the name for his new review process and a short description
- He is then asked to select the track the review process is applicable to (radio buttons)
- He is then presented with two UI sections. On the left are the System Events and Review Steps. On the right is the review process, into which he can drag and drop events and steps from the left.
- Before he can add a review step, he must first select a system event. Under this system event, he then defines the review steps.
- There can be multiple review steps under an event.
- An event can only contain review steps; it cannot contain other events
- Review steps must always sit under an event
- Each review step will have a customizable weight (as a %), and for each event the sum of all its steps' weights should always be 100%. The system will throw an error when the review process is saved if this condition is not met (see the validation sketch after this list).
- Once he is done, he gives the review process a name and a description and can Save it as a Draft or Publish it.
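To make the save-time rule unambiguous, here is a minimal TypeScript sketch of the validation, reusing the SystemEvent and ReviewStepKind types from the sketch above (EventNode, ReviewStep, and invalidEvents are hypothetical names for illustration, not a prescribed implementation):

```typescript
// Hypothetical model: an event owns review steps, never other events.
interface ReviewStep {
  kind: ReviewStepKind; // e.g. "SONARQUBE"
  weight: number;       // percent of the parent event's score
}

interface EventNode {
  event: SystemEvent;   // e.g. "SUBMISSION_UPLOADED"
  steps: ReviewStep[];  // one or more review steps under this event
}

// Per the rule above: for each event, its steps' weights must sum to 100%.
// Returns the events that should raise an error when the process is saved.
function invalidEvents(process: EventNode[]): SystemEvent[] {
  return process
    .filter((n) => n.steps.reduce((sum, s) => sum + s.weight, 0) !== 100)
    .map((n) => n.event);
}
```

For example, an event holding two steps weighted 60% and 30% would be reported, while 60% and 40% would pass.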
3. Clone Existing Review Process
When creating a new review process, instead of building one from scratch the user can choose the option to clone an existing process.
- He is presented with the existing review processes and needs to select the one that will be cloned
- After cloning, he gets the same UI as when starting from scratch, except the review process already has its events and review steps populated.
- The name and description fields are empty and not cloned.
4. Edit Review Process
This will be the same as “Clone Existing” except the name and description fields are populated.
5. Delete Review Process
The user will see a confirmation window for deleting one of the review processes from his dashboard.
USE CASE 2: Manual Review Scorecard Management
6. Dashboard Review Scorecards
- The user will see a list of the existing scorecards, showing their name, creation date, creator's username, track type, and the actions the user can take on them (Edit, Clone, Delete).
- All the scorecards are manually defined by the user.
- The user can search the scorecards by name and he can also filter the scorecards by track type (design, development, data science).
- The user has the option to create a new scorecard
7. Create New Scorecard
Right now the scorecards are built in Direct (link – do not copy! We look for original ideas!). It allows you to add groups, sections, and questions to sections, and to provide a weight for each question. Then the reviewer uses Online Review to fill out that scorecard for each submission. You can use an existing scorecard, like the General Studio Screening scorecard, as an example when creating a new one; there, Copyright/Licensing Issues is a group and Stock Art is a section.
When creating a new scorecard, the user should be able to enter the scorecard name and a short description and select the system event. Then he would be able to create new groups, place new sections inside them, and add several questions to each section.
Each question will have:
- Name
- Answers: some scorecards would have only Yes/No for all answers (like our Screening Scorecard examples) and some would have ratings from 1 to 10 for all answers (for Code Review). Show us how each of these options would look (see the scoring sketch after this list).
- Weight
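To show how the weights and the two answer styles might combine into a total score, here is a minimal TypeScript sketch under one possible reading of the rules (group weight as a % of the scorecard, question weight as a % of its group, sections purely organizational; all names and the 1-to-10 normalization are hypothetical):

```typescript
// Hypothetical scorecard model: groups contain sections, sections contain questions.
type Answer =
  | { kind: "yesNo"; value: boolean } // e.g. Screening scorecards
  | { kind: "scale"; value: number }; // 1..10, e.g. Code Review

interface Question { name: string; weight: number; answer: Answer }
interface Section  { name: string; questions: Question[] }
interface Group    { name: string; weight: number; sections: Section[] }

// Normalize any answer to a 0..1 score (one possible mapping: 1 -> 0, 10 -> 1).
function answerScore(a: Answer): number {
  return a.kind === "yesNo" ? (a.value ? 1 : 0) : (a.value - 1) / 9;
}

// Total score as a percentage, assuming weights sum to 100 at each level.
function scorecardScore(groups: Group[]): number {
  return groups.reduce((total, g) => {
    const questions = g.sections.flatMap((s) => s.questions);
    const groupScore = questions.reduce(
      (acc, q) => acc + (q.weight / 100) * answerScore(q.answer),
      0,
    );
    return total + (g.weight / 100) * groupScore * 100;
  }, 0);
}
```

A scorecard with perfect answers and weights that sum to 100 at each level scores exactly 100.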
The user will have the option to Preview and Save the scorecard. We would like to see how the Preview screen looks as well.
8. Clone an Existing Scorecard
This view will show the cloned scorecard with all fields populated, with the exception of Name and Description. The user is able to edit all fields.
9. Edit Scorecard
This view will show the selected scorecard with all fields populated, and the user can update any of them.
10. Delete Scorecard
The user will see a confirmation window for deleting a scorecard from his dashboard.
Key Items to Focus on
- This web application must look on-brand with the new Topcoder community site.
- Create an intuitive user experience for the copilots
- Target desktop web devices with a minimum width of 1280px; height can be increased if needed.
Branding
- Colors: it is mandatory that you follow the exact Topcoder colors from the GUI KIT
- All source files should be in Sketch only.
- Fonts: use the Roboto font, as that is the standard font we are using for the new Topcoder site, and keep the text only in the TC colors (shades of gray). Use only the existing predefined sizes.
- Use the new navigation we have that is provided in the GUI KIT
- Buttons: when using a button, choose one from the existing GUI KIT and just change its label
- The GUI KIT contains a couple of example pages as well - please take a look to get an idea for the style we are looking for
- If there is something else you need, or you have questions, ask in the forum.
- Stock photos and stock icons are allowed from the approved sources.
Note: we recently ran a similar design challenge. We have since had internal discussions around the review scorecards and processes and how things should work, but the designer did a good job of following the current GUI KIT. Here’s the winner’s design. Do not copy it or get lost in the business logic from that challenge, as we have totally changed it; use it as a reference for the visual style only.
Target Audience
Topcoder copilots and challenge architects.
Judging Criteria
Your submission will be judged by:
- How appealing does the user interface look?
- How easy is it to understand the flow of information and how to create new scorecards and review processes?
- Is the design consistent and does it follow the GUI designs?
- Hierarchy and organization of content
MarvelApp Presentation
- Request a MarvelApp prototype from me using this link: https://tc-marvel-app.herokuapp.com/challenge/30100119
- Do not use the forums to request a MarvelApp prototype unless you face issues with the above link.
- Provide clickable spots (hotzones) to link your screens and showcase the flow of the solution.
- Provide the MarvelApp shareable link in your notes during submission upload.
Please read the challenge specification carefully and watch the forums for any questions or feedback concerning this challenge. It is important that you monitor any updates provided by the client or Studio Admins in the forums. Please post any questions you might have for the client in the forums.