Challenge Overview

Introduction

Welcome to the fourth Rapid Development Match Challenge!!

Rapid Development Matches, or RDMs (as we hope the community will come to call them), are fast, timed software development competitions that rank and score developers on code accuracy, speed, and proficiency with particular technologies, platforms, and development languages. The goal is clearly defined in the problems to be solved and the requirements to be achieved.

This is the Easy - 250 point problem of RDM-4.
Please Note - The files linked in the specs will only be accessible after you register for the challenge.

Topcoder Bug Hunt Scoring Application Development - Easy

Problem Detail

Topcoder Bug Hunt Challenges use GitLab to track the bugs reported by each challenge participant. Bugs are filed as issues (tickets) on a GitLab project managed by the Topcoder challenge admin, reviewer, or copilot. At the end of the challenge, they review the listed tickets, check their validity and duplicate status, and mark each raised ticket as "Valid", "Duplicate", or "Invalid" using labels. In addition, participants add label codes such as P1, P2, P3, P4, and P5 based on the challenge description and the severity of the bug; for valid bugs, these codes map to points and a payment amount. Some challenges, especially those involving website or web application testing, require participants to test in different web browsers (e.g. Chrome, Firefox, Opera, Edge) and to record, as labels, the browsers in which each bug was found. As the number of tickets grows, it becomes time-consuming for the reviewer/copilot to calculate each participant's final score manually.

GitLab can export the list of tickets, with their details, as a CSV (Comma-Separated Values) file. This file is well structured, with the data laid out in columns and rows, and the first row contains the field names. The main task is to create a script or application that reads this CSV file and displays its contents grouped by participant name (Author Username).
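As an illustration of the reading step (a sketch, not the required implementation), here is a minimal CSV line parser in JavaScript. GitLab quotes any field that contains commas, so a plain `split(",")` is not enough; this version handles quoted fields and doubled quotes within a single line, but not newlines embedded inside a field:

```javascript
// Minimal CSV line parser (sketch). Handles quoted fields and the
// "" escape for a literal quote; does not handle multi-line fields.
function parseCsvLine(line) {
  const fields = [];
  let cur = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"' && line[i + 1] === '"') { cur += '"'; i++; } // escaped quote
      else if (ch === '"') { inQuotes = false; }                  // closing quote
      else { cur += ch; }
    } else if (ch === '"') {
      inQuotes = true;                                            // opening quote
    } else if (ch === ",") {
      fields.push(cur);                                           // field boundary
      cur = "";
    } else {
      cur += ch;
    }
  }
  fields.push(cur);
  return fields;
}
```

A real submission would need to handle fields containing newlines as well (a description field often does), so a tested CSV library may be a safer choice.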

Easy Part

So in the Easy part of this challenge, the task is to create a web page that reads the given CSV (Comma-Separated Values) file and displays it in tabular form, grouped by Author Username. You can use any programming language that renders a page matching the given constraints.

Allowed Technologies

Web Application: PHP, JavaScript, C#-ASP.Net, Python, HTML, CSS
Any technology is allowed for the backend; testing is performed only against the front-end.

CSV File Information

The field names in the given CSV file are as follows:

Issue Id
URL
Title
State
Description
Author
Author Username
Assignee
Assignee Username
Confident
Locked
Due Date
Created At(UTC)
Updated At(UTC)
Closed At(UTC)
Milestone
Weight
Labels
Time Estimate
Time Spent

Of the above fields, the fields of interest are:

Issue Id
URL
Title
Description
Author
Author Username
Created At(UTC)
Updated At(UTC)
Labels
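To avoid depending on column order in the export, one approach is to resolve the columns of interest by header name. A JavaScript sketch using the names listed above:

```javascript
// Columns of interest, named exactly as in the CSV header row above.
const COLUMNS_OF_INTEREST = [
  "Issue Id", "URL", "Title", "Description",
  "Author", "Author Username",
  "Created At(UTC)", "Updated At(UTC)", "Labels",
];

// Map each column of interest to its position in the header row,
// so the rest of the code never hard-codes column positions.
function columnIndexes(headerFields) {
  const indexes = {};
  for (const name of COLUMNS_OF_INTEREST) {
    const i = headerFields.indexOf(name);
    if (i === -1) throw new Error(`Missing expected column: ${name}`);
    indexes[name] = i;
  }
  return indexes;
}
```

With this map, a data row parsed into an array of fields can be turned into an object keyed by column name before grouping.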

The following image shows the input data table:

The desired output should have the following fields (see the image examples):
Author Username 1
Issue ID
URL
Title
Description
Created At (UTC)
Updated At (UTC)
Labels


The final output should look something like this:

Make sure only the desired columns are displayed in your solution!

Steps

The exact steps you need to reproduce for the testing:

  • Host your app on port 8080 with the page at the root level (index.html)
  • Include an HTML input element with id "ticketFile" for selecting the input file
  • Include an HTML button with id "btnRead" that reads the input file and displays the table with the desired columns
  • Make sure the button has exactly the text "Read File & Display Result"
  • Include a div with id "results"
  • Within this div, for each ticket author, include a div element with its id set to their username
  • Within the div for an author, include a label with the format "username-name" (replace username with the author's username)
  • Within the div for an author, below the label, include a table with id "username-results" (replace username with the author's username) containing all of the tickets submitted by that author. Display only the columns of interest. Display order does not matter in this problem.
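The grouped structure described in the steps can be sketched in JavaScript as an HTML string, which makes the grouping logic easy to test. Rows here are plain objects keyed by column name. Note one assumption: the spec is ambiguous about whether "username-name" is the label's id or its text, so this sketch uses it as the id; verify against the sample tester before relying on it.

```javascript
// The columns to display, per the steps above.
const SHOWN_COLUMNS = [
  "Issue Id", "URL", "Title", "Description",
  "Created At(UTC)", "Updated At(UTC)", "Labels",
];

// Group parsed rows by the "Author Username" field.
function groupByAuthor(rows) {
  const groups = {};
  for (const row of rows) {
    const user = row["Author Username"];
    (groups[user] = groups[user] || []).push(row);
  }
  return groups;
}

// Build the markup for the #results div: one div per author (id = username),
// a label (id = "username-name", assumed), and a table (id = "username-results").
function renderResults(rows) {
  let html = "";
  for (const [user, tickets] of Object.entries(groupByAuthor(rows))) {
    html += `<div id="${user}">`;
    html += `<label id="${user}-name">${tickets[0]["Author"]}</label>`;
    html += `<table id="${user}-results"><tr>` +
      SHOWN_COLUMNS.map((c) => `<th>${c}</th>`).join("") + "</tr>";
    for (const t of tickets) {
      html += "<tr>" + SHOWN_COLUMNS.map((c) => `<td>${t[c]}</td>`).join("") + "</tr>";
    }
    html += "</table></div>";
  }
  return html;
}
```

In the page itself, the "btnRead" click handler would read the "ticketFile" input with a FileReader, parse the CSV text into row objects, and assign the returned string to the innerHTML of the "results" div.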

Sample Submission

This challenge uses a purely automated testing approach to score submissions, so we are providing a sample submission and an automated tester with a basic test case assembled to simulate the final testing. Docker is used to achieve this. Please read the README.md file to find out how to run the setup.
The sample submission is in the code folder and should be extended to implement the requirements. The README.md file contains some guidelines on how to extend the sample submission.

Sample submission with local tester for this challenge: [rdm4-easy.zip](https://assets.ctfassets.net/b5f1djy59z3a/5hJmzLqZroTeYeXZPP8qGD/e151823299275e912aadd242cdbb21b0/rdm4-easy.zip)

Sample Submission to submit on the platform: submission.zip

Final Submission Guidelines

Submission Deliverables

Your submission must be a single ZIP file, no larger than 10 MB, containing just the code folder with the same structure as the one from the sample submission. The sample tester should not be included in the submission. Also make sure you don't submit any locally generated build folders such as node_modules, dist, etc. You must follow this submission folder structure so our automated test process can score your submission:

Create a folder named "code", and inside it include a file named Dockerfile; this is the Docker file used to build your submission. Refer to the Docker file provided in the Sample Submission for each level. Zip that "code" folder and submit it to the challenge.

Execution Details and Submission Logs

Each time you submit, the platform will use Docker to run your code. The execution logs will be saved as "Artifacts" that can be downloaded from the Submission Review App: https://submission-review.topcoder.com/.
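The folder-and-zip packaging steps can be sketched with a single shell command (the file names inside code/ are illustrative; check the sample submission for the exact contents):

```shell
# Assumed layout, mirroring the sample submission:
#   code/
#     Dockerfile
#     index.html        (plus any other front-end assets)
# Zip the "code" folder itself (not just its contents), excluding
# locally generated build folders:
zip -r submission.zip code -x "code/node_modules/*" -x "code/dist/*"
```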

Checking Passing and Failing Test Cases

Using the Submission Review App (https://submission-review.topcoder.com/), navigate to the specific challenge, then to your submission, and then to the Artifacts for your submission. The zip file you download will contain information about your submission including a result.json file with the test results.

Docker Structure

Make sure you can run your submission with Docker from a clean slate. Your Docker image must expose port 8080 and must build on a completely clean machine when the platform runs your submission. If you use anything locally to build and run your submission, make sure it is included in your Docker configuration as well. Please look at the sample submission to understand the structure better.

ELIGIBLE EVENTS:

2022 Topcoder(R) Open

Review style

Final Review

Community Review Board

Approval

User Sign-Off

ID: 30235731