Challenge Overview

Challenge Summary

Welcome, everyone, to the “Dragonet Data Management - Rest API Development” challenge! The goal of this challenge is to develop the REST APIs from the given API design in the Swagger file.

Project Overview

The project will develop a web-based desktop application platform that will help the Information Security Risk & Compliance team of a global company manage a large amount of data that will be used to generate key risk indicators (KRIs).

In this series of challenges, we build:

  • Background jobs:
    • to pull the data from different sources
    • to compute the KRI values from the pulled data
  • The REST API for the frontend
  • The frontend using React JS

In the previous challenge, we developed the background job to pull the data from different sources and store it in our database. In another challenge, we are implementing the job that will compute the KRI values from that pulled data.

Now, in this challenge, we will develop the REST API to expose this data to the frontend.

Technology Stack

.NET Core 3.1, SQL Server 2019, Hangfire, log4net, Entity Framework

Device/OS Requirements

Windows Server

Assets

The assets are shared in the forum.

Individual Requirements

Implement the following:

  • ALL REST APIs defined in the Swagger file
  • The APIs should enforce role-based access control

Authentication:

  • Configure Okta SSO for authorizing the backend API.
  • The frontend will send a token; you will need to verify that token with Okta.

Authorization:

  • Verify that the received token is valid by validating it with Okta
  • Role-based authorization should be performed; check the architecture document’s authorization section (see the configuration sketch below)
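Below is a minimal sketch of how the Okta token validation and a role-based policy might be wired up in ASP.NET Core 3.1. It is a sketch only: the configuration keys (Okta:Authority, Okta:Audience), the method name, and the role/policy names are placeholders, and the real values must come from your Okta setup and the architecture document’s authorization section.

```csharp
// Startup.cs (excerpt) -- hedged sketch; config keys and role names are assumptions.
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public partial class Startup
{
    public void ConfigureAuth(IServiceCollection services, IConfiguration configuration)
    {
        services
            .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
            .AddJwtBearer(options =>
            {
                // The JWT middleware downloads Okta's signing keys from the
                // authority's OpenID Connect metadata and validates each
                // incoming bearer token (issuer, audience, signature, expiry).
                options.Authority = configuration["Okta:Authority"]; // e.g. https://{yourOktaDomain}/oauth2/default
                options.Audience  = configuration["Okta:Audience"];  // e.g. api://default
            });

        services.AddAuthorization(options =>
        {
            // Placeholder policy -- align the role names with the
            // authorization section of the architecture document.
            options.AddPolicy("AdminOnly", policy => policy.RequireRole("Admin"));
        });
    }
}
```

Remember to call app.UseAuthentication() and app.UseAuthorization() in Configure (between UseRouting and UseEndpoints) and to decorate controllers or actions with [Authorize] so the checks are actually enforced.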

GET /me

  • Based on the email in the authorization token, find the user with a matching email and return their details (see the sketch after this list)
  • If the user’s status is “Inactive”, return 401
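A hedged sketch of GET /me under these rules. AppDbContext, Users, Email, and Status are hypothetical placeholders for the project’s actual Entity Framework entities, and the claim that carries the email depends on how the Okta token is configured.

```csharp
// UsersController.cs (sketch) -- entity and context names are hypothetical placeholders.
using System.Linq;
using System.Security.Claims;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Authorize]
public class UsersController : ControllerBase
{
    private readonly AppDbContext _db;   // hypothetical EF Core context

    public UsersController(AppDbContext db) => _db = db;

    /// <summary>Returns the details of the currently authenticated user.</summary>
    [HttpGet("me")]
    public IActionResult GetMe()
    {
        // The email claim name depends on the Okta token configuration.
        var email = User.FindFirst(ClaimTypes.Email)?.Value ?? User.FindFirst("email")?.Value;

        var user = _db.Users.FirstOrDefault(u => u.Email == email);

        // Unknown users and users whose status is "Inactive" are rejected with 401.
        if (user == null || user.Status == "Inactive")
        {
            return Unauthorized();
        }

        return Ok(user);
    }
}
```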

POST /masterData/{sourceId}/import

  • Based on the source Id, get the source details from the data_source_detail table
  • Check the fetch method of the existing import services. Parse the uploaded file using the respective import service (CSV or XLSX) based on the extraction_type
  • Then validate the records fetched from the uploaded file
  • If there is an error, return the error message; there is no need to save it in error_logs
  • If all validations pass, save the records in the database
  • NOTE: This is only applicable for data_source entries with extraction_type csv or XLSX; also check that “data_source_tables.canEditData” is true, else throw the respective errors. (See the sketch after this list.)
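A hedged sketch of the import flow described above. The entity and service names (AppDbContext, DataSourceDetails, DataSourceTable, IImportService, IRecordValidator) are assumptions standing in for the project’s actual EF context and the existing CSV/XLSX import services; the fetch method’s exact signature must be taken from the existing code.

```csharp
// MasterDataController.cs (sketch) -- the types below are hypothetical stand-ins
// for the existing import services and EF entities; only the control flow matters.
using System.Linq;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Authorize]
public class MasterDataController : ControllerBase
{
    private readonly AppDbContext _db;              // hypothetical EF context
    private readonly IImportService _csvImport;     // existing CSV import service (assumed interface)
    private readonly IImportService _xlsxImport;    // existing XLSX import service (assumed interface)
    private readonly IRecordValidator _validator;   // hypothetical validation helper

    public MasterDataController(AppDbContext db, IImportService csvImport,
                                IImportService xlsxImport, IRecordValidator validator)
    {
        _db = db;
        _csvImport = csvImport;
        _xlsxImport = xlsxImport;
        _validator = validator;
    }

    /// <summary>Imports the uploaded CSV/XLSX file for the given data source.</summary>
    [HttpPost("masterData/{sourceId}/import")]
    public IActionResult Import(int sourceId, IFormFile file)
    {
        // 1. Load the source details from data_source_detail.
        var detail = _db.DataSourceDetails.FirstOrDefault(d => d.DataSourceId == sourceId);
        if (detail == null) return NotFound();

        // 2. Only CSV/XLSX sources whose table allows editing may be imported.
        var type = detail.ExtractionType?.ToLowerInvariant();
        if (type != "csv" && type != "xlsx")
            return BadRequest("Import is only supported for CSV and XLSX data sources.");
        if (!detail.DataSourceTable.CanEditData)
            return BadRequest("Editing data is not allowed for this data source table.");

        // 3. Parse the uploaded file with the matching import service's fetch logic.
        var importService = type == "csv" ? _csvImport : _xlsxImport;
        var records = importService.Fetch(file.OpenReadStream(), detail);

        // 4. Validate; on any error, return the messages without writing to error_logs.
        var errors = _validator.Validate(records, detail);
        if (errors.Any()) return BadRequest(errors);

        // 5. All validations passed -- persist the records.
        _db.AddRange(records);
        _db.SaveChanges();
        return Ok();
    }
}
```

In practice the parsing, validation, and saving would likely live in a service rather than in the controller; the sketch only pins down the required checks and their order.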

Other Endpoints

The other endpoints are straightforward; if there is any confusion, let's discuss it on the forum.

The architecture details are shared on the forum. If there is any confusion, let's clarify it on the forum.

General Requirements

  • Use C# best practices
  • The code should be well documented with XML comments
  • Unit tests are NOT required
  • Performance must be carefully considered
  • Only use third-party libraries that are accepted by Topcoder; please confirm before using a PAID service or library

Winner Responsibility

  • The winner will need to submit the merge requests on GitLab
  • The winner has to fix the issues identified by the reviewer as a final fix within 24 hours


Final Submission Guidelines

Submit a zip containing:

  • The updated full source code with a detailed README covering setup and deployment, including the Okta setup
  • A Postman script to test the APIs

ELIGIBLE EVENTS:

2021 Topcoder(R) Open

REVIEW STYLE:

Final Review:

Community Review Board

Approval:

User Sign-Off


ID: 30144218