The Topcoder review process helps ensure that materials delivered to our customers are of the highest quality. Our review board individually inspects and rates every submission. Both reviewers and submitters learn from the review process through lively discussions and feedback mechanisms, which lead to higher quality output and improved skills.
The review board is made up of experienced members chosen from the community and trained on the process. Membership in the review board is voluntary, but strict eligibility criteria must be met before anyone is granted membership.
Some reviews have a mix of review board members, project staff and/or the customer.
All parties (submitters, reviewers, and challenge managers) have responsibilities within the review process.
You are notified via email when review phases start and end. MAKE SURE YOU KNOW THE TIMELINE.
If you are selected as a reviewer on a challenge, you MUST prepare before the review phase starts. Make sure you can set up environments, understand the requirements, review forum discussions, etc. Ask questions early.
Be thorough and specific in your reviews. If a submission is scored down on any question, you must be clear in the scorecard comments as to why.
Delays can cause a reduction in payment.
To apply for a review spot, visit https://www.topcoder.com/challenges/?bucket=reviewOpportunities
To access any of your reviews you can visit Online Review (OR) at http://software.topcoder.com
If you reviewed a contest that failed and you had access to other members’ submissions, you may not submit for the reposted contests (but you may still review them). Please check the article related to eligibility criteria.
For detailed information about reviewer roles and responsibilities, please see Development Review - Role & Responsibilities
Review opportunities become available after challenges open for participation. In order to review a challenge, you first need to apply for an open spot at https://www.topcoder.com/challenges/?bucket=reviewOpportunities. Once you apply, you will be notified if you have been accepted to review the challenge. In order to increase your chances of being selected, make sure you continue to compete in the track and technologies for the challenges you are interested in reviewing.
NOTE: If you are unable to perform a review according to the review schedule, you must withdraw at least 24 hours prior to the start of the review phase by managing your applications on the Review Opportunities page. If the page is not allowing you to withdraw your application, please email your request to support@topcoder.com.
Logging into https://www.topcoder.com/my-dashboard will bring you straight to your dashboard which is your hub for Topcoder activity. Here you can find all of the projects you are involved in under the “My Challenges” section. You can access the forums and the challenge details from links found in this section. To access your reviews directly, please visit Online Review at http://software.topcoder.com.
After submitting to a challenge, submissions will go through some variation of these phases depending on the configuration of the challenge:
Screening
Review
Appeals
Appeals Response
Aggregation
Final Fixes
Final Review
Certain challenge types include a screening phase to make a quick pass through the submissions and judge whether or not they are worthy of being reviewed. A member of the review board screens each submission to ensure it is non-trivial and possibly capable of meeting the requirements. Submissions that show a significant amount of work will move on to the review phase for a more in-depth analysis, whereas obviously incomplete or incorrect submissions will not.
During the screening phase, submitters may use the Contact link on the project’s Online Review page to file an appeal if they believe the screening decision is incorrect. Appeals must be filed promptly: no appeals will be accepted more than 24 hours after the screening phase ends.
After the screening phase, reviewers assess the submissions that passed screening against a detailed review scorecard to determine whether each solution meets the requirements. Each submission receives one scorecard from each reviewer assigned to the challenge. Challenges typically use two or three reviewers, so each submitter will have at least two scorecards. Reviewers score each submission independently and do not compare their scores with each other. Once all of the reviewers have completed their scorecards and the review phase is over, the submitter is notified via email.
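To make the mechanics concrete, here is a minimal sketch of how a submission’s scores from several scorecards could be combined. The exact scoring formula belongs to Topcoder and is not given in this article; this sketch simply assumes the overall score is the arithmetic mean of the individual scorecard scores.

```python
def overall_score(scorecard_scores):
    """Combine the scores a submission received from its reviewers.

    Assumption (for illustration only): the overall score is the plain
    arithmetic mean of the individual scorecard scores.
    """
    if not scorecard_scores:
        raise ValueError("a submission needs at least one scorecard")
    return sum(scorecard_scores) / len(scorecard_scores)

# A challenge with three reviewers: the submitter gets three scorecards.
print(overall_score([88.5, 92.0, 90.1]))
```

With three reviewers, a single harsh or generous scorecard still moves the mean, which is one reason the appeals phase described below matters.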
During the review phase, reviewers focus on assigning scores that are fair and consistent with the scorecard guidelines. Reviewers must provide justifications for all of their scores, and those justifications need to be as specific as possible. If a submission is marked down without the reason being made perfectly clear, the challenge manager and the submitter will not be able to understand why the submission received that score, which can cause delays to the challenge.
A reviewer may add more than one comment to a scorecard question. Each comment must be marked as:
Required: This means that the fix must happen for the software to be usable.
Recommendation: This means that the fix should be done but it does not affect the current requirements and could be done later.
Comment: This means that the reviewer wanted to provide some general feedback for the contest manager or submitter, but it does not require a fix.
In some cases, you may see an iterative review phase for First 2 Finish (F2F) challenges. This review phase differs slightly: if the first submission passes review, no other submissions are reviewed and the project moves forward. If it does not pass review, the remaining submissions are reviewed in the order received until a passing submission is found.
The last type of review, a peer review, is associated with “Show Your Skills” challenges. These follow a different process than the standard community reviews by review board members. Show Your Skills challenges are reviewed by fellow submitters to the challenge, not members of the review board. When you submit to these challenges, you are taking on the responsibility of reviewing the work of your peers. Each submission is evaluated by five other submitters, and the outlying review scores are dropped. These challenges are marked as “Peer Review” instead of “Community Review” on the challenge detail page.
As part of the challenge, all submitters are asked to not only complete their submissions, but to provide a peer review of each other’s work. Typically, these challenges are for skill-building purposes and result in badges, rather than submitting work to customers.
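The peer-review scoring above can be sketched in a few lines. Note that the article does not define exactly which scores count as “outlying”; this sketch assumes the single highest and single lowest of the five scores are dropped and the remaining three are averaged (a trimmed mean), which is one common interpretation.

```python
def peer_review_score(scores):
    """Score a submission from its peer reviews, dropping outliers.

    Assumption (for illustration only): "outlying scores are dropped"
    is interpreted as discarding the single highest and single lowest
    score, then averaging the rest (a trimmed mean).
    """
    if len(scores) < 3:
        raise ValueError("need at least three scores to drop outliers")
    trimmed = sorted(scores)[1:-1]  # drop the lowest and highest scores
    return sum(trimmed) / len(trimmed)

# Five peers reviewed this submission; one low and one high outlier.
print(peer_review_score([40.0, 85.0, 88.0, 90.0, 100.0]))
```

Dropping the extremes limits how much a single unusually harsh or generous peer can swing the final score.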
The appeals phase opens after the review phase is complete. During this phase, submitters have 24 hours to view the scorecards completed by each reviewer for their own submission. Reviewers leave comments for each item on which they deduct points. If submitters feel something was marked down inappropriately, they can appeal to the reviewer, but they must provide a strong reason based on objective evidence (not just opinion), or the reviewer will reject the appeal. Although you may not appeal another competitor’s scorecards (only your own submission’s scorecards may be appealed), you may report an inconsistent review if the reviewer scored submissions differently for the same issue. If it is determined that the reviewer did score submissions differently for the same issue, Topcoder will evaluate and correct the scorecard at its discretion.
It is the submitter’s responsibility to ensure that every appeal has all the information the reviewer needs to answer it. The submitter should provide links to standards, forum discussions, etc. If the reviewer hasn’t provided enough information in the scorecard to know what to appeal, contact the challenge manager immediately (and appeal with whatever information is provided).
Following the appeals phase is the appeals response phase. This phase allows the reviewer to review any appeals and decide on a final action. During the appeals response phase, the reviewers will reject any appeal not based on objective fact, and arguments on matters of opinion will be dismissed. Any appeal based on fact must be addressed thoroughly, and the reviewer must provide all the information the submitter needs to understand their decision (links, etc.).
In the appeals response phase, reviewers will sometimes find that an appeal covers more than one submission. For example, a change to a test case can affect the score of every submission reviewed. Since a reviewer can only change scores when somebody submits an appeal, reviewers might not be able to fully correct this problem. In this case, a reviewer’s best option is to use the Contact Manager link in Online Review and explain the problem.
Each appeal will be flagged as follows:
Accepted: The appeal was valid and the reviewer will adjust their score.
Accepted - No score change: The appeal is valid, but not enough to adjust the score.
Rejected: The appeal is not valid and the reviewer does not believe the submitter is correct in their appeal.
Once the appeals and appeals response phases are complete, the final scores are calculated and a winner is determined. Then, the primary reviewer compiles all of the comments from the winner’s scorecards that mark incomplete or incorrect areas of the solution into a single scorecard to facilitate final fixes.
This process is called aggregation. All of the comments are combined into one final, aggregated scorecard for the winner. Each issue must be marked as “Accepted”, “Rejected”, or “Duplicate”. The winner is responsible for fixing all “Accepted” items during the final fix phase. “Duplicate” is used when two reviewers found the same issue: one of those comments is marked “Accepted” and the others “Duplicate”. A comment is marked “Rejected” if it was inaccurate or not needed for this version of the software. “Duplicate” and “Rejected” comments are dropped from the final scorecard.
This aggregated scorecard is used for the next phase: final fixes.
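The aggregation rules above amount to a simple filter over all reviewers’ comments. The data layout below (text/status pairs) is a hypothetical representation for illustration, not Topcoder’s actual data model:

```python
def aggregate(comments):
    """Build the winner's final-fix scorecard from all reviewers' comments.

    Each comment is a (text, status) pair, where status is one of
    "Accepted", "Rejected", or "Duplicate" (hypothetical representation).
    Only "Accepted" comments survive; the rest are dropped.
    """
    return [text for text, status in comments if status == "Accepted"]

reviews = [
    ("Missing null check in parser", "Accepted"),
    ("Missing null check in parser", "Duplicate"),  # same issue, second reviewer
    ("Rename variable x", "Rejected"),              # not needed for this version
]
print(aggregate(reviews))  # only the accepted item remains
```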
The final fixes phase follows aggregation and is the process to wrap up a challenge before the end result is delivered to a client. The final fix scorecard will contain items that are required and/or recommended before the submission can be considered complete. A submission can only be presented to the client for approval after it has been finalized.
Every required fix must be addressed; otherwise the submission will be rejected by the final reviewer. A rejection reverts the challenge to the final fix phase and penalizes the submitter for causing a delay. Items marked as recommended are a lower priority than required fixes, but submitters should attempt them; if there is sufficient time, they should be included in the final fix.
Communication during the final fixes stage is extremely important. If for any reason the submitter has a question or comment on the review, it should be posted as soon as possible. If there is a blocking issue, contact the manager and make a post to the forum. The challenge manager will look into the issue and ensure that the primary reviewer is involved as necessary. If the submitter feels that a required fix is impossible, the issue needs to be raised on the forums.
A submission should never be resubmitted without the fixes completed or a comment in a readme file. Communication between the submitter, reviewer, and a copilot is essential to make sure this phase goes smoothly. The Review Board is always open to communication; reviewers may have overlooked artifacts or misinterpreted the specification. Without communication, the Review Board doesn’t know that fixes are being finished and can’t mark the fixes as complete.
The format of the final submission is identical to the initial submission, but with the fixes completed to address the review board’s feedback. Submitters are encouraged to add a text file or spreadsheet (the latter is less cumbersome) that thoroughly lists all the changes made, to expedite the final review. If all of the requirements have been met, the contest is complete.
If anything is not done, the challenge makes a return trip to the Final Fixes phase and the submitter will be penalized with a “delayed final fix” warning by Online Review.
Once the fixes are complete and the winner submits the final solution, the reviewer approves it, the payment is set and the challenge is complete!
The challenge manager will deliver the final deliverable to the client. The client and the challenge manager will ensure the solution works in the client’s environment and then the client will accept the work. During this time there are no tasks assigned to Topcoder members but if any issues do occur, you may be asked some clarifying questions.
Challenges at Topcoder occasionally fail for different reasons. When a challenge fails, it enters the post-mortem phase. During this phase, feedback is collected from various parties to help determine why the challenge failed. Answers to these surveys should provide as much detail as possible to ensure that future challenges are successful.
If there are concerns at any step in the process, use the Contact Managers link in Online Review to contact support or email support@topcoder.com to help resolve any issues.