
Challenge Overview

A previous challenge implemented a set of REST APIs for handling video assets, covering both storage and management (create, retrieve, update, delete).  This challenge will implement new calls to manage data in the admin portion of the website, update the Swagger file accordingly, and add unit tests for the REST API.

Existing API

The existing Node application and deployment details are in Gitlab, and the URL to the repository can be found in the forum.
 
API updates

We are going to add and document APIs for the following functionality.  Note that all of these API calls will require user authentication, similar to what already exists in the admin portion of the site.  The functionality described below already exists; we just need to formalise it against the endpoints below and make sure that it's documented via Swagger and tested.  Illustrative route sketches follow each list below.

/scraper endpoint

1.  Adding a scraper (regular or live)
  a.  POST to /scraper
2.  Disabling or re-enabling a scraper (regular or live)
  b.  POST to /scraper/{id}?disable=true/false
3.  Updating a scraper (regular or live)
  c.  PUT to /scraper/{id}
4.  Deleting a scraper (regular or live)
  d.  DELETE to /scraper/{id}
5.  Getting all scrapers
  e.  GET to /scraper
6.  Getting all scrapers by type (regular or live)
  f.  GET to /scraper?type=regular/live
7.  Getting an individual scraper by its ID (regular or live)
  g.  GET to /scraper/{id}
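
As a sketch only, the routes above might be wired up in Express as follows.  The requireAuth middleware and scraperService module are placeholder names standing in for the authentication and data-access code that already exists in the repository; match the real names when implementing.

    const express = require('express');
    const router = express.Router();

    // Placeholder names; use the repository's real auth middleware and data layer.
    const requireAuth = require('../middleware/requireAuth');
    const scraperService = require('../services/scraperService');

    router.use(requireAuth); // every /scraper call requires authentication

    // 1. POST /scraper: add a scraper (regular or live)
    router.post('/', async (req, res) => {
      res.status(201).json(await scraperService.create(req.body));
    });

    // 2. POST /scraper/{id}?disable=true/false: disable or re-enable
    router.post('/:id', async (req, res) => {
      const disabled = req.query.disable === 'true';
      res.json(await scraperService.setDisabled(req.params.id, disabled));
    });

    // 3. PUT /scraper/{id}: update
    router.put('/:id', async (req, res) => {
      res.json(await scraperService.update(req.params.id, req.body));
    });

    // 4. DELETE /scraper/{id}: delete
    router.delete('/:id', async (req, res) => {
      await scraperService.remove(req.params.id);
      res.status(204).end();
    });

    // 5 and 6. GET /scraper, optionally filtered by ?type=regular/live
    router.get('/', async (req, res) => {
      res.json(await scraperService.list(req.query.type));
    });

    // 7. GET /scraper/{id}
    router.get('/:id', async (req, res) => {
      const scraper = await scraperService.get(req.params.id);
      if (!scraper) return res.status(404).json({ error: 'scraper not found' });
      res.json(scraper);
    });

    module.exports = router;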
  
/user endpoint

1.  Adding a user 
  a.  POST to /user
2.  Changing a user's password
  b.  PUT to /user
3.  Deleting a user
  c.  DELETE to /user/{id}
4.  Getting all users
  d.  GET to /user
5.  Getting an individual user by ID
  e.  GET to /user/{id}
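
Each of these endpoints needs a matching entry in the Swagger file.  Assuming the project uses a Swagger 2.0 YAML definition (check the repository for the actual format and security scheme names), the entry for GET /user/{id} might look roughly like this:

    /user/{id}:
      get:
        summary: Get an individual user by ID
        security:
          - Bearer: []          # placeholder; reuse the existing security definition
        parameters:
          - name: id
            in: path
            required: true
            type: string
        responses:
          '200':
            description: The requested user
            schema:
              $ref: '#/definitions/User'
          '401':
            description: Authentication required
          '404':
            description: User not found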

Export and import

When setting up a new server, it takes time to set up all the individual scrapers.  This challenge will make that easier by adding a new "Import / Export" section to the admin panel.  "Export" will generate a JSON file containing all the information about the scrapers, both live and regular.  "Import" will accept an upload of an exported JSON file, parse it, and create the scrapers it describes.  Any duplicates will be silently ignored.
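
A minimal sketch of the two handlers, assuming Express with multer for the file upload and the same placeholder scraperService as above (the field used for duplicate detection is an assumption; pick whatever uniquely identifies a scraper in the real schema):

    const multer = require('multer');
    const upload = multer({ storage: multer.memoryStorage() });

    // Export: send every scraper (regular and live) as a downloadable JSON file
    router.get('/export', requireAuth, async (req, res) => {
      const scrapers = await scraperService.list();
      res.setHeader('Content-Disposition', 'attachment; filename="scrapers.json"');
      res.json(scrapers);
    });

    // Import: parse an uploaded export file and recreate the scrapers
    router.post('/import', requireAuth, upload.single('file'), async (req, res) => {
      const scrapers = JSON.parse(req.file.buffer.toString('utf8'));
      for (const scraper of scrapers) {
        // Silently skip duplicates; matching by name is an assumption
        const existing = await scraperService.findByName(scraper.name);
        if (!existing) {
          await scraperService.create(scraper);
        }
      }
      res.json({ status: 'imported' });
    });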
 
README

Make sure the README is updated with verification steps for the new features and with the configuration information needed to set them up easily.

Unit tests

Unit tests are required for these new changes and for the existing API.  Your tests must cover positive and negative cases, and should do a reasonably good job of covering edge cases and error cases.
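
For illustration, a positive/negative pair using mocha, chai, and supertest, a common stack for this kind of Node API testing (if the repository already has a test stack, use that instead; the app entry point and auth token here are assumptions):

    const request = require('supertest');
    const { expect } = require('chai');
    const app = require('../app');                // assumed entry point
    const testToken = process.env.TEST_TOKEN;     // assumed test credential

    describe('GET /scraper/:id', () => {
      it('returns the scraper for a valid id', async () => {
        const res = await request(app)
          .get('/scraper/1')
          .set('Authorization', 'Bearer ' + testToken);
        expect(res.status).to.equal(200);
        expect(res.body).to.have.property('id');
      });

      it('rejects unauthenticated requests', async () => {
        const res = await request(app).get('/scraper/1');
        expect(res.status).to.equal(401);
      });

      it('returns 404 for an unknown id', async () => {
        const res = await request(app)
          .get('/scraper/no-such-id')
          .set('Authorization', 'Bearer ' + testToken);
        expect(res.status).to.equal(404);
      });
    });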

Heroku deploy

Make sure the Heroku deployment information is up to date, and keep the package.json up to date as well.  Deployment should require nothing more than "npm install" / "npm start" locally and "git push heroku master" for Heroku.
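
Concretely, that usually comes down to a scripts block in package.json plus a one-line Procfile.  The entry point name below is an assumption; keep whatever the repository already uses.  In package.json:

    "scripts": {
      "start": "node app.js",
      "test": "mocha test"
    }

And in the Procfile:

    web: npm start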

Submission format

Your submission should be provided as a Git patch file against the commit hash mentioned in the forum.  MAKE SURE TO TEST YOUR PATCH FILE!
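
Generating and verifying the patch typically looks like this, where <commit-hash> is the hash posted in the forum:

    # Create a single patch file containing your commits on top of the base hash
    git format-patch <commit-hash> --stdout > submission.patch

    # In a fresh clone checked out at that same hash, verify it applies cleanly
    git apply --check submission.patch
    git am submission.patch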
 


Final Submission Guidelines

Please see above

ELIGIBLE EVENTS:

2017 TopCoder(R) Open

REVIEW STYLE:

Final Review: Community Review Board

Approval: User Sign-Off

ID: 30055564