Challenge Overview

Challenge Objectives

  • Hunt bugs in the provided YAML Config Editor VSCode extension and webapp.


Project Background

“Configuration as code” is a technique for storing environment variables, feature flags, and other configuration items in an SCM, the same way the code is stored. When the code is deployed, the CI process generates the specific configuration for the proper environment from the config SCM, and the CD process then deploys it to the appropriate hosts alongside the new code. The significant advantage of this approach is that configuration changes are tracked over time, which is critical for support.

Our client wants to build a configuration tool that gives developers and DevOps a standard, hierarchical approach and an intuitive, form-like user interface for managing and maintaining complex configuration environments. This tool would be called by the DevOps pipeline to generate JSON configurations for deployment to each environment. The Spring Cloud Config project serves as inspiration for this project.
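To make the idea concrete, here is a minimal sketch (TypeScript, using js-yaml) of how such a tool might resolve a hierarchical YAML file into an environment-specific JSON document. The file layout shown (a 'default' block plus per-environment override blocks) and the function names are illustrative assumptions, not the client's confirmed format:

// Sketch only (illustrative): resolve a hierarchical YAML config into
// environment-specific JSON. The "default" + "environments" layout and the
// js-yaml dependency are assumptions, not the confirmed file format.
import * as yaml from 'js-yaml';

const source = `
default:
  db.host: localhost
  feature.newCheckout: false
environments:
  prod:
    db.host: prod-db.internal
    feature.newCheckout: true
`;

interface ConfigDoc {
  default: Record<string, unknown>;
  environments?: Record<string, Record<string, unknown>>;
}

// Environment-specific values override the defaults.
function resolve(doc: ConfigDoc, env: string): Record<string, unknown> {
  return { ...doc.default, ...(doc.environments?.[env] ?? {}) };
}

const doc = yaml.load(source) as ConfigDoc;
console.log(JSON.stringify(resolve(doc, 'prod'), null, 2));
// => { "db.host": "prod-db.internal", "feature.newCheckout": true }

The point of the hierarchy is that an environment only has to declare the values it overrides; everything else falls through to the defaults.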

In this project, we are building a YAML editor hosted by a NodeJS server running in a docker container. When the user requests the UI with an HTTP GET request from the browser, the NodeJS server (HapiJS) delivers index.html and all static assets for the Angular application. Once the application is running in the browser, the user wants to use it to edit config files for their deployed application. For this, the user will need to provide a git repo URL and login credentials. The app then sends a message back to the NodeJS server to perform a shallow clone (we do not need history here) from the git origin (much the same way Jenkins does when it builds your source code).
Once the repo is cloned, the NodeJS server can deliver a file list to the browser (only YAML files are delivered; other files are ignored). The user can then select a file to edit, create a new one, or remove a file. The server reads the file (or creates a new one) and sends the content to the browser. We only need to work with files in the root directory of the git repo; there is no need to support folders for now. The app should not work on the master branch of the git repo; it should work on a default 'admin' branch.
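As a rough sketch of that server-side flow (the endpoint path, payload fields, working directory, and the hapi v17-style API used here are assumptions for illustration; credential handling and error handling are omitted):

// Sketch only: shallow-clone the repo's 'admin' branch and report the
// YAML files found in its root directory. Endpoint name and payload
// shape are hypothetical.
import * as Hapi from 'hapi';
import { execFileSync } from 'child_process';
import * as fs from 'fs';

const server = new Hapi.Server({ port: 3000 });

server.route({
  method: 'POST',
  path: '/api/clone',                       // hypothetical endpoint
  handler: (request) => {
    const { repoUrl } = request.payload as { repoUrl: string };
    // Shallow clone of the 'admin' branch only -- no history is needed.
    execFileSync('git', ['clone', '--depth', '1', '--branch', 'admin',
                         '--single-branch', repoUrl, '/tmp/yaml-repo']);
    // Only YAML files in the repo root are returned; other files are ignored.
    const files = fs.readdirSync('/tmp/yaml-repo').filter(f => /\.ya?ml$/i.test(f));
    return { files };
  }
});

server.start().then(() => console.log(`Server running at ${server.info.uri}`));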

The app then parses the content and allows the user to edit/create properties within the default block, or select properties in the default block to copy to an environment for edit/override, or select an environment from a list of defined environments to add to this config file.
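As an illustration of that editing model (again assuming the same 'default' + 'environments' document shape as the sketch above; the helper name is hypothetical), copying a default property into an environment for override could look like this:

// Sketch only: copy a property from the default block into an environment
// block so it can be overridden there. Document shape is an assumption.
type ConfigDoc = {
  default: Record<string, unknown>;
  environments?: Record<string, Record<string, unknown>>;
};

function copyToEnvironment(doc: ConfigDoc, env: string, key: string): void {
  doc.environments = doc.environments ?? {};
  doc.environments[env] = doc.environments[env] ?? {};
  // Seed the override with the current default value; the user edits it afterwards.
  doc.environments[env][key] = doc.default[key];
}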
 

Technology Stack

  • TypeScript
  • Git
  • Angular 6
  • VSCode Extension
 

Assets

  • Please request access to the GitLab repo
 

What to Test (Scope) 

We previously launched the following challenges to build the YAML Config Editor:
  • REST API Dev: http://www.topcoder.com/challenge-details/30070171/?type=develop&noncache=true
  • UI Prototype Challenge: http://www.topcoder.com/challenge-details/30070647/?type=develop&noncache=true
  • UI Prototype Update Challenge: http://www.topcoder.com/challenge-details/30070674/?type=develop&noncache=true
  • Bug Fixes
    • http://www.topcoder.com/challenge-details/30071175/?type=develop&noncache=true
    • http://www.topcoder.com/challenge-details/30072065/?type=develop&noncache=true
You will need to go through all of the challenge specifications above to learn about the YAML Config Editor.
The editor app has also gone through many private challenges for further updates.

You should test the vscode branch. The codebase now contains both the webapp and the VSCode extension; please read its README file to see how to build the VSCode extension for testing.
You should also check that the features in the webapp are not broken.
Here is a video showing the editor in VSCode:
https://monosnap.com/file/tIaNOQE3rKST80TEvnldt3G8Rqkmm5 . The editor has the same functionality as the web page.

Before logging issues, please check the closed issues to make sure yours are not duplicates.
 
How to Create a New Bug Report
 
1. You need a GitLab account.
2. Issues/bugs found in this extension must be created here: https://gitlab.com/leviastan/yaml-editor/issues. DON'T use any other link to create new issues or submit a document; those won't be counted and won't be paid.
3. Please label issues with the appropriate bug type.  
 
Issue Reporting Guidelines
 
For each report of a limitation or bug, we need the following information:
  1. Steps to reproduce, including any needed information (you must list all the steps needed to reproduce the bug; DON'T list only the URL without test data)
  2. Current results before the bug is fixed
  3. Expected results, after the bug is fixed
  4. Attach the high-level labels.
  5. If it is a comparison, you must provide the URL and a screenshot/video of that location.
IMPORTANT NOTE:
Missing or incorrect details in ANY of the above fields will cause the bug report to be marked as INCOMPLETE.
For example: incorrect steps, incorrect actual or expected results, etc.
 
Be careful when providing only a direct URL without listing the steps to reach that particular page in the 'Steps to reproduce' section. Sometimes a URL with parameters won't load the page for the reviewer, and the bug may be closed as 'CAN'T REPRODUCE'. It is better to list all the steps from start to finish, or at least double-check that the URL loads.
 
Issue Weights and Scoring
  • Scoring will be based on the number of bugs reported, weighted by type. Be sure to attach the correct weight to your bug. The delivery team has the right to change a severity at their discretion.
  • Only verified issues will be counted. Tickets created for enhancements or that are not bugs will not be counted. Duplicate issues will be closed and not counted. Log issues according to the guidelines above; issues that do not follow these guidelines may be rejected due to lack of information.
  • For challenge scoring, the user with the most verified issues will be selected as the winner. If two users submit the same issue, the user that submitted the issue first will receive credit.
  • Please focus on functionality testing based on the requirements; bug reports based on your own assumptions will be rejected.
 
  • Functional Issues - 10 Points
  • User Interface Issues - 5 Points
  • Usability/UX Issues - 2 Points
  • Content Bugs - 1 Point
Submitters that do not take 1st or 2nd place will be paid $5 for each non-duplicate, verified issue, up to a maximum of $100. The 1st and 2nd place submissions must raise bugs worth at least 100 points to win the 1st/2nd prizes.
 
Important Notice
  • Follow the standard Topcoder Bug Hunt Rules.
  • If you do not properly document your bug reports, they will likely be rejected due to lack of information or documentation. If you submit the same bug in multiple areas/pages (for instance, the same validation issue of a form found on different pages/sections), you will likely get credit for the original bug report only. The others will all be closed as duplicates.
  • If you duplicate an issue on a platform or browser that hasn’t been tested yet, you should create a new issue and add a link/reference in the issue description to the existing issue number. Our copilot will review these items and consolidate them later. Please don’t make adjustments or change labels of existing issues logged by other competitors.
  • DON'T RE-OPEN the issues in the review phase and anyone who RE-OPENS a ticket will be disqualified from the challenge.
  • If mobile and tablet testing are available, DON'T create the same issue on different platforms; instead, merge them into one. All the others will be marked as duplicates.
  • If you see multiple broken links on the same page combine them into one ticket. Others will be marked as DUPLICATE.
  • You must not edit the bug report once created, so make sure you enter all the details at the time you create the issue; otherwise, your issue will be moved to the end of the queue. If you really need to edit an issue, you must use the comments section for this (i.e. add a comment to describe any changes you want to make to the issue), and we'll decide whether the changes are major enough to move the issue to the end of the queue. You are allowed to add screenshots in the comments section though, assuming your issue report contains all the details when created.
  • You must specify the test data you have used in the 'Reproduction Steps'. All issues will be marked as 'Incomplete' if the correct test data is not provided.
  • Keep an eye on the issues being submitted by other participants to minimize the time you may be spending on duplicate efforts. Knowing what has already been reported will allow you to better focus your time on finding yet undiscovered issues.
  • There will be no appeals phase. The decision of the PM/Copilot on the validity and severity of each filed issue will be final.


Final Submission Guidelines

Submit all your bugs directly to GitLab. When you are done with your submissions, please submit a .txt file using the “Submit” button before the submission phase ends. In this file include: 
  • Copies of all Test Execution Summaries in which you participated.
  • Your Topcoder handle (the one displayed in the top right corner near the profile picture).
  • The GitLab handle used to raise the issues.
 
- ALL SUBMISSIONS WITHOUT THE ABOVE INFORMATION WILL BE REJECTED AND WON'T BE PAID.
- IMPORTANT: Submit the above details before the Submission Phase ends. If you can't submit due to technical difficulties within the Submission Phase, please email your submission with the above details to support@topcoder.com.
- Participants who haven't submitted will not be paid.
- DON'T use any other link to create new issues or submit them as a document; they won't be counted and won't be paid.

ELIGIBLE EVENTS:

Topcoder Open 2019

REVIEW STYLE:

Final Review:

Community Review Board

Approval:

User Sign-Off

ID: 30077661