Challenge Overview
Project Overview
TopCoder and the TopCoder community have worked hard to get the platform to its current level of maturity, but we're far from done. It's time to take the platform to the next level. TopCoder is going to start taking steps to open up the platform API to outside and community developers so they can incorporate it into their websites and applications, or build their own applications (web, mobile or desktop).
The ultimate goal is to open up and build an "API" that targets all the different types of audiences - Software and Studio Competitors, SRM/MM competitors, Copilots, Admins and TopCoder partners. Each audience will have different interests and uses for the API, so this will be a huge project and we need to make sure we are heading in the right direction from the beginning.
Competition Task Overview
We currently provide installation guides for different environments, such as CentOS 6.x, Ubuntu, Mac OS and Windows. Please check https://github.com/cloudspokes/tc-api/wiki
The updated code must still deploy and work on Heroku. Any submission that can't be deployed to Heroku successfully will be failed in the screening phase; the primary reviewer must check this.
The implementation will be based on the node.js version of the TC platform API - https://github.com/cloudspokes/tc-api. Please follow the existing actionhero pattern for your development.
User Algo Challenges API
This is a new API; the route will look like /:apiVersion/user/:userId/challenges/algo (a skeleton following the existing actionhero pattern is sketched below).
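For orientation, a new endpoint in tc-api is normally implemented as an actionhero action plus a route entry and a SQL file under the queries directory. The skeleton below is only a sketch of that pattern; the action name, input names, custom properties and helper signatures (api.dataAccess.executeQuery, api.helper.handleError) are assumptions, so mirror whatever the existing actions in the repo actually do.

/*jslint node: true, nomen: true */
"use strict";

/**
 * Sketch of an actionhero action for the new route. All names below
 * (inputs, query name, helper calls) are illustrative assumptions --
 * copy the exact conventions used by the existing actions in tc-api.
 */
exports.getUserAlgoChallenges = {
    name: "getUserAlgoChallenges",
    description: "getUserAlgoChallenges",
    inputs: {
        required: ["userId"],
        optional: ["sortColumn", "sortOrder", "pageIndex", "pageSize"]
    },
    cacheEnabled: false,
    databases: ["topcoder_dw"],
    run: function (api, connection, next) {
        api.log("Execute getUserAlgoChallenges#run", "debug");
        // Run the new SQL file (underscore naming) against topcoder_dw and
        // return camelCase JSON, as required in the list below.
        api.dataAccess.executeQuery("get_user_algo_challenges",
            { uid: connection.params.userId },
            connection.dbConnectionMap,
            function (err, rows) {
                if (err) {
                    api.helper.handleError(api, connection, err);
                } else {
                    connection.response = { total: rows.length, data: rows };
                }
                next(connection, true);
            });
    }
};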
Following are the general requirements for this API.
1. This API returns the SRMs the member has participated in, as well as their placement in them, including SRM-related info.
2. Sorting and pagination should be supported for this API (one possible approach is sketched after this list).
3. The query will run against the topcoder_dw database.
4. The following information needs to be returned for each challenge:
{
    id,
    type,            // round type
    placement,
    prize,           // bool, whether the placement had a prize
    numContestants,  // number of other registrants
    numSubmitters,   // number of other people who submitted; could be numSubmissions if this isn't tracked
    codingDuration,  // the time between the start of the contest and their last submission
    platforms: ['wordpress'],
    technologies: ['node.js', 'javascript']
}
5. The following query is used for the competition history page, e.g. http://community.topcoder.com/tc?module=AlgoCompetitionHistory&cr=10574855; you can use it as a reference:
SELECT cal.date
     , c.name contest_name
     , r.name round_name
     , r.short_name
     , rr.room_placed
     , rr.new_rating
     , (select rth.vol
          from rating_history rth
         where rth.coder_id = rr.coder_id
           and rth.round_id = rr.round_id) as vol
     , rkh.rank
     , rr.paid
     , rr.payment_type_desc
     , rr.round_id
     , rr.room_id
     , d.division_desc as division
     , rkh.percentile
     , rr.division_placed
     , r.rating_order
  FROM contest c
     , round r
     , calendar cal
     , room_result rr
     , division_lu d
     , coder_rank_history rkh
 WHERE rr.coder_id = @cr@
   AND r.round_id = rr.round_id
   AND c.contest_id = r.contest_id
   AND cal.calendar_id = r.calendar_id
   AND rr.division_id = d.division_id
   AND r.round_type_id IN (1,2,10,20)
   AND rkh.coder_id = rr.coder_id
   AND rkh.round_id = rr.round_id
   AND rkh.coder_rank_type_id = 2
 ORDER BY @sc@ @sd@, rr.round_id desc
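The query above returns one row per rated round. For the new API, the rows then need to be mapped into the camelCase structure from item 4 and paged per item 2. Below is a minimal mapping sketch; the snake_case column aliases on the right-hand side are assumptions about how the new query might name its columns, not actual column names.

// Sketch only: maps rows from the new topcoder_dw query (assumed snake_case
// column aliases) into the camelCase objects required by item 4, then pages them.
function mapAlgoChallenges(rows, pageIndex, pageSize) {
    "use strict";
    var data = rows.map(function (row) {
        return {
            id: row.round_id,
            type: row.round_type,                // round type
            placement: row.placement,
            prize: row.paid > 0,                 // bool: whether the placement had a prize
            numContestants: row.num_contestants,
            numSubmitters: row.num_submitters,
            codingDuration: row.coding_duration, // contest start -> last submission
            platforms: [],                       // kept for a consistent response shape
            technologies: []
        };
    });
    // Simple in-memory paging shown for clarity; in practice the sorting and
    // paging (driven by @sc@/@sd@ style parameters) would normally be pushed
    // into the SQL query itself.
    return data.slice((pageIndex - 1) * pageSize, pageIndex * pageSize);
}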
Update API Doc
The apiary.apib file should be updated to describe the new API.
Standardize Query Naming Convention
We use the underscore approach for all SQL queries under the queries directory; please follow the same approach.
Note: for the JSON data returned, we use the camelCase approach.
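For example (the file and field names below are hypothetical), the two conventions would appear side by side like this:

// SQL file under queries/ (underscore naming):  queries/get_user_algo_challenges
// JSON fields in the response (camelCase):      { "numContestants": 10, "codingDuration": 3600 }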
Testing
The API Framework supports tests. Use supertest with mocha. Don't install mocha globally.
You must use mocha's BDD style (which is the default); within that, you can optionally use chai.
Please check the Test Creation Guide page in the wiki.
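A minimal test sketch in mocha's BDD style using supertest is shown below; the endpoint URL, version prefix and user id are placeholders, so follow the conventions of the existing files under the test directory.

/*jslint node: true, nomen: true */
"use strict";

var supertest = require("supertest");
var API_ENDPOINT = process.env.API_ENDPOINT || "http://localhost:8080";

describe("User Algo Challenges API", function () {
    this.timeout(30000); // database-backed tests can be slow

    it("should return 200 and a list of algo challenges", function (done) {
        supertest(API_ENDPOINT)
            .get("/v2/user/10574855/challenges/algo")
            .expect("Content-Type", /json/)
            .expect(200)
            .end(function (err, res) {
                if (err) {
                    done(err);
                    return;
                }
                // Basic sanity check on the camelCase response shape.
                if (!res.body || !res.body.data) {
                    done(new Error("missing data field in response"));
                    return;
                }
                done();
            });
    });
});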
Code Format
All code must pass jslint. You may use "nomen: true".
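For example, new files can start with a JSLint directive comment like the one below; match the exact option set used by the existing source files.

/*jslint node: true, nomen: true */
"use strict";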
Winner Only
The winner will create a pull request against the main GitHub repo during the final fix phase and help merge it into the dev branch. The changed files should use Unix-style line endings; you can use the dos2unix command to convert them before committing.
Virtual Machines (VMs)
To use the Arena VM, please follow http://apps.topcoder.com/wiki/display/docs/Competition+Engine+VM+Setup
VM specific information is found here: http://www.topcoder.com/wiki/display/docs/VM+Image+2.5
Upon registration as a submitter or reviewer, you will need to request a VM based on the TopCoder systems image. The VM will be active through the aggregation review, after which it will be terminated, except for the winner's and the reviewers'. To request your image, please post in the contest forum.
Before requesting your VM, you need to ensure that you have an SSH key created and in your member profile. Instructions to do so are here: http://www.topcoder.com/wiki/display/projects/Generate+SSH+Key, and instructions to connect afterwards are here: http://www.topcoder.com/wiki/display/projects/Connect+Using+SSH+Key.
Please realize that VMs are currently issued manually. We make every attempt to issue the VM as soon as it is requested; however, there may be delays of up to 12 hours depending on the time of day when you make the request. We encourage everyone to request a VM as soon as possible to minimize any such delays.
Final Submission Guidelines
Submission Deliverables
Below is an overview of the deliverables:
- Source code; be sure to include the commit hash that your submission is based on.
- A deployment guide explaining how to configure and verify the application.
Final Submission
For each member, the final submission should be uploaded to the Online Review Tool.