Challenge Overview
Project Overview
TopCoder and the TopCoder community have worked hard to get the platform to its current level of maturity, but we're far from done. It's time to take the platform to the next level. TopCoder is going to start taking steps to open up the platform API to outside and community developers so they can incorporate it into their websites and applications, or build their own applications (web, mobile, or desktop).
The ultimate goal is to open up and build an "API" that targets all the different types of audiences - Software and Studio competitors, SRM/MM competitors, copilots, admins, and TopCoder partners. Each audience will have different interests and uses for the API, so this will be a huge project, and we need to make sure we are headed in the right direction from the beginning.
This contest will implement the Challenge Terms API and the Challenge Term Detail API, which will be used to display terms before a user registers for a challenge (Software and Studio).
Competition Task Overview
Please raise questions as early as you can. I am familiar with the related database and code base, and I will provide as much support as I can.
The updated code must still deploy and work on Heroku - any submission that cannot be deployed to Heroku successfully will fail the screening phase; the primary reviewer must check this.
The implementation will be based on the Node.js version of the TC platform API - https://github.com/cloudspokes/tc-api. Please follow the existing actionhero pattern for your development.
For this contest, you are expected to implement the Challenge Terms API and provide tests for the API.
Challenge Terms API
The API path will be like /:apiVersion/terms/:challengeId
It will take an extra optional role parameter; its value can be submitter, reviewer, etc.
The response will look like:
{
    "terms": [
        {
            "title": "Standard Terms for TopCoder Competitions v1.0",
            "termsOfUseId": 20703,
            "agreeabilityType": "NON_AGREEABLE",
            "url": "http://www.topcoder.com/terms",
            "agreed": "no"
        },
        {
            "title": "Standard Terms for CloudSpokes Competitions v1.0",
            "termsOfUseId": 20704,
            "agreeabilityType": "ELEC_AGREEABLE",
            "url": "http://www.topcoder.com/terms",
            "agreed": "yes"
        },
        ...
    ]
}
Challenge Term Detail API
The API path will be like /:apiVersion/terms/detail/:termId
The response will look like:
{
    "title": "Standard Terms for TopCoder Competitions v1.0",
    "termsOfUseId": 20703,
    "agreeabilityType": "NON_AGREEABLE",
    "url": "http://www.topcoder.com/terms",
    "text": "This is a Dummy Term"
}
Existing Code Reference
Please find the following code for reference.
If you are forbidden from accessing the following links, please send a request to subversion@topcoder.com for SVN access.
- ViewRegistration and Base (Software)
- ViewRegistration and BaseTermsOfUse (Studio)
- Related Tables (under common_oltp database)
- project_role_terms_of_use_xref
- terms_of_use_dependency
- terms_of_use
- terms_of_use_type
- user_terms_of_use_xref
- terms_of_use_agreeability_type_lu
Update API Doc
The apiary.apib file should be updated to include a description of the Challenge Terms API.
Standardize Query Naming Convention
We want to use the underscore approach in all SQL queries under the queries directory; please follow the same approach.
Note: for the JSON data returned, we will use the camelCase approach.
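To illustrate the two conventions side by side (underscore column names in SQL results, camelCase keys in JSON responses), here is a small hypothetical helper that is not part of the codebase, just a sketch of the mapping:

```javascript
// Convert underscore_style keys from a SQL result row into camelCase keys
// for the JSON response. Illustrative only; not from the tc-api codebase.
function toCamelCase(row) {
    "use strict";
    var out = {};
    Object.keys(row).forEach(function (key) {
        var camel = key.replace(/_([a-z])/g, function (match, letter) {
            return letter.toUpperCase();
        });
        out[camel] = row[key];
    });
    return out;
}

// A row selected from terms_of_use with underscore columns...
var row = { terms_of_use_id: 20703, agreeability_type: "NON_AGREEABLE" };
// ...becomes a camelCase object for the JSON response:
var json = toCamelCase(row);
// json.termsOfUseId === 20703, json.agreeabilityType === "NON_AGREEABLE"
```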
Testing
The API framework supports tests. Use supertest with mocha. Do not install mocha globally.
You must use the mocha BDD style (which is the default); within that, you may optionally use chai.
Note: Tests must follow this standard - Tests Creation Guide.docx
Code Format
All code must pass JSLint. You may use "nomen: true".
Winner Only
The winner will create a pull request against the main GitHub repo in the final fix phase and help merge it into master. The changed files should use Unix-style line endings; you can use the dos2unix command to convert them before committing.
Reviewer Responsibilities
Reviewers need to write/update supertest tests for these APIs.
There are three roles:
- Accuracy - Tests the accuracy of the results produced by the implementation.
- Failure - Tests the implementation's ability to handle bad data and incorrect usage.
- Security - Tests OAuth, SQL injection, and other security-related requirements.
A reviewer can send a preferred role via Contact Manager after the system has selected the reviewers.
The copilot will assign roles to any reviewers who did not send preferred-role information.
Reviewers must create a pull request on GitHub for the tests.
Virtual Machines (VMs)
VM specific information is found here: http://www.topcoder.com/wiki/display/docs/VM+Image+2.5
Upon registration as a submitter or reviewer, you will need to request a VM based on the TopCoder systems image. The VM will remain active through the aggregation review, after which it will be terminated, except for the winner's and the reviewers'. To request your image, please post in the contest forum.
Before requesting your VM, you need to ensure that you have an SSH key created and in your member profile. Instructions to do so are here: http://www.topcoder.com/wiki/display/projects/Generate+SSH+Key, and instructions to connect afterwards are here: http://www.topcoder.com/wiki/display/projects/Connect+Using+SSH+Key.
Please be aware that VMs are currently issued manually. We make every attempt to issue a VM as soon as it is requested; however, there may be delays of up to 12 hours depending on the time of day when you request. We encourage everyone to request a VM as soon as possible to minimize any such delays.
Technology Overview
- JavaScript
- Node.js 0.10.x
- actionhero.js framework
- supertest
- mocha
Documentation Provided
Please check the deployment guide in the codebase for reference.
Final Submission Guidelines
Submission Deliverables
A complete list of deliverables can be viewed in the TopCoder Assembly competition Tutorial at: http://apps.topcoder.com/wiki/display/tc/Assembly+Competition+Tutorials
Below is an overview of the deliverables:
- Source Code.
- Deployment guide to configure and verify the application.
Final Submission
For each member, the final submission should be uploaded to the Online Review Tool.