BONUS: 5 CHECKPOINTS AWARDED WORTH $100 EACH


Challenge Summary

Welcome to the "MUSE - Test Plan Management Platform Design Concept Challenge". In this challenge, we need your help to refine, re-think, and improve the UI/UX for several screens of our MUSE application, which our internal team will use to create, execute, and analyze automated test cases for quality assurance of all software and apps in our company.

Read the challenge specification carefully and watch the forums for any questions or feedback concerning this challenge. Let us know if you have any questions in the challenge forum!

Round 1

Submit your initial designs for checkpoint feedback:
1) Homepage
2) SUT Screen
3) Test Plan Management Screen
4) Topology Screen
5) Test Case Screen
- As part of your checkpoint submission, you must upload your submission to MarvelApp so we can provide direct feedback on your designs. Please include the MarvelApp URL in your notes.txt.
- Make sure all pages have a correct flow. Use proper file numbers (1, 2, 3, etc.).

Round 2

Submit your Final Design plus Checkpoint Updates.
1) Homepage
2) SUT Screen
3) Test Plan Management Screen
4) Topology Screen
5) Test Case Screen
6) Test Suite Screens
7) Execution Scheduling Screen
8) Result Analysis Screen
9) EQA Communique Add, Saved Drafts Screen
10) Lab Notification Screen
- As part of your final submission, you must replace your checkpoint submission in MarvelApp with the final submission so we can provide direct feedback on your designs. Please include the MarvelApp URL in your notes.txt.
- Make sure all pages have a correct flow. Use proper file numbers (1, 2, 3, etc.).


Background Overview
For this challenge, we want to re-think and improve the UI/UX layout for several screens from our MUSE platform. Specifically, we want to create a new design concept for the following features in our MUSE platform:
1) Create New Test Suite and Associate corresponding Test Cases
2) Execute and view the Results
 
The MUSE tool is used for the creation, execution, and analysis of automated test cases. Once a test is executed, the test case results are stored for future analysis. The tool provides multi-level options for test case and test data creation and maintenance.
 
Apart from test management, many new features have recently been added to MUSE. Over time, all customer operations (new user creation, lab monitoring, access management, etc.) will be folded into the tool, so it will no longer be only a test management tool.
 
Challenge Goals 
Create better UI/UX for several screens in our Test Plan platform.
 
User Flow
More details about the user flow are available in "Test-Tool-Description.pdf".
 
#1 Manual Tester Perspective:
Step 1: Create a SUT if not already created
Step 2: Create a Test Plan if not already created
Step 3: Create a Topology if not already created
Step 4: Create Test Cases manually or from an Excel upload if not already created
Step 5: Create a Test Suite and add the respective test cases if not already created
Step 6: Execute the Test Suite from the Execution Scheduler
Step 7: Analyze the results on the Result Analysis page
Step 8: Watch the overall Pass/Fail chart for all nodes on the Home Screen
 
#2 Automation Developer Perspective:
Step 1: SUT & Release – The SUT (System Under Test) is the node to be tested; the Release is the specific version on which the SUT is tested.
Step 2: Test Plan – Create a test plan under each specific domain and node.
Step 3: Topology – Topologies are created and maintained here. A topology contains Test Resources and is tagged to specific test plans at creation.
Step 4: Test Case – Create and maintain test cases for specific test plans (Functional Area).
Step 5: Test Suite – A test suite contains a list of test cases.
Step 6: Execution Scheduling – Execute all test suites from this page.
Step 7: Result Analysis – All executed tests can be viewed here.
Step 8: Dashboard – Different reports are generated and displayed.
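The entity hierarchy implied by the two flows above (SUT → Test Plan → Topology/Test Case → Test Suite) can be sketched as plain data classes. This is only an illustrative model to help designers reason about which screen owns which relationship; the class and field names are assumptions, not the actual MUSE data model.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the MUSE entity hierarchy; names are assumed.

@dataclass
class TestCase:
    name: str
    functional_area: str  # test cases belong to a functional area of a plan

@dataclass
class TestSuite:
    name: str
    test_cases: List[TestCase] = field(default_factory=list)  # a suite is a list of test cases

@dataclass
class Topology:
    name: str
    test_resources: List[str] = field(default_factory=list)  # device names scripts run on

@dataclass
class TestPlan:
    name: str
    domain: str
    topologies: List[Topology] = field(default_factory=list)  # tagged at topology creation
    suites: List[TestSuite] = field(default_factory=list)

@dataclass
class SUT:
    name: str     # the node under test
    release: str  # the specific version on which the SUT is tested
    test_plans: List[TestPlan] = field(default_factory=list)
```

For example, using the sample values from the Test Plan section: `SUT("NSIM", "3.07.01.00", test_plans=[TestPlan("Auroral", "CoreApps")])`.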
 
Design Considerations
- How quickly can users find information?
- The tool should be easily customizable for UI modification.
- The interface should be easy and intuitive to navigate.
- Give importance to the overall layout and think about how a user would interact with the content on the page.
- Show all the screens and provide a user flow/click-path and navigation, so we can see how the interactions fit together in the application.
- The flow from page to page should be easy to understand.
- No page should be overcrowded with information.
 
Target Devices
- Desktop: 1366px width with Height adjusted accordingly
 
Branding Guidelines:
- Modern, crisp look and feel adhering to our standards.
- Follow our "Branding Guidelines.pdf" for colors
- Font and style are open to the designer's choice.
 
Challenge Forum
If you have any doubts or questions regarding the challenge requirements, please ask in our challenge forum.
 
Screen Requirements
For this challenge, we are looking for the pages below to be designed/considered in your concepts. The screen functionality details listed below are suggestions for consideration. Do not let these suggestions limit the creativity of your design; if there is a better way to accomplish the same goal, feel free to take creative liberties.
 
1) Homepage
Reference Screens "Test-Tool-Description.pdf" & "Screenshots2.pdf" (Page 1)
- Main Navigation: should we keep it or change it? Some menus contain sub-menus; how can we make this clearer?
- Different reports are generated and displayed.
- Ability to watch the overall Pass/Fail chart for all nodes on the Home Screen.
- Chart display: currently there are 6 kinds of charts available; should we keep their appearance or provide a clearer chart design? Should each chart have a different design?
- What if a chart has no content? How can we make the empty state less empty?
 
2) SUT Screen
Reference Screens "Test-Tool-Description.pdf"
- SUT (System Under Test) is the node that is going to be tested.
- Release is the specific version on which the SUT is tested.
- SUT – displayed under the Device Management tab.
- SUT Summary – Search for existing SUTs.
- If no SUT is found, the user should be able to create a new SUT or create/add a specific release for the SUT.
- The user should be able to delete a SUT.
 
3) Test Plan Management Screen
Reference Screens Test-Tool-Description.pdf & Screenshots.pdf (Page 2)
- Test Plan – Create a test plan under each specific domain and node.
- Test Plan – displayed under the Test Data Management tab.
- Test Plan Summary – Search for existing test plans.
- Test Plan Create – Create a new test plan if none is found.
- Test Plan Modify – Modify an existing test plan.
- Test Plan Delete – Delete an existing test plan.
- Mandatory Fields:
-- DOMAIN: CoreApps
-- Vendor: ERRICSSON
-- TEST PLAN: <Any Name>, e.g., Auroral
-- SUT NAME: NSIM or any relevant Project.
-- SUT Release: 3.07.01.00
-- Show Build mapping Checkbox
-- Global tcvariable and tcstream are mandatory
-- Objective and Test Scope Text Area
 
4) Topology Screen
Reference Screens Test-Tool-Description.pdf & Screenshots.pdf (Page 4)
- This screen lists all topologies that have been created across all domains.
- Topologies are created and maintained here. A topology contains Test Resources and is tagged to specific test plans at creation.
- Topology – displayed under the Device Management tab.
- Topology Search – Search for existing topologies.
- Topology Create – Create a new topology if none is found. The input file is the physical directory path for the file.
- Mandatory fields: Domain, Vendor, Test Plan, Topology Name, Input File, Description.
- Other UI elements: List of Available Resources search field (Device Names – we can choose the device on which the script has to be executed), tables (select, name, mode, domain, family, model).
 
5) Test Case Screen
Reference Screens Test-Tool-Description.pdf & Page 3 (Screenshots.pdf)
- Test Case – Create and maintain test cases for specific test plans (Functional Area).
- Test Case – displayed under the Test Data Management tab.
- Test Case View (Summary & Details) – Search for existing test cases. The Summary provides the full test case list.
- Test Case Create – Create a new test case if none is found from the search.
- Test Case Modify – Modify an existing test case.
- Test Case Delete – Delete an existing test case.
- Mandatory fields: Domain, Vendor, Test Plan, Create a Functional Area, Create a Seq Number.
- Other mandatory fields: Test Case, Description, Enabled, Classification, Category, Critical, Partner, Automated, Expected Duration, Test Status, Acceptance Status, Priority, and Mac Document.
 
6) Test Suite Screens
Reference Screens Test-Tool-Description.pdf & Page 5 (Screenshots.pdf)
- Test Suite – A test suite contains a list of test cases.
- Test Suite (Summary & Details) – displayed under the Test Data Management tab. Summary provides the full test suite list; Details provides all details for a specific test suite.
- Test Suite View – Search for existing test suites.
- Test Suite Create – Create a new test suite.
- Test Suite Modify – Modify an existing test suite.
- Test Suite Delete – Delete an existing test suite.
- Mandatory fields: Domain, Vendor, Test Plan, Functional Area.
- Test suite search textbox.
- Test Suite:
-- Input File Association
-- Test Case Association
-- Topology Resource Association
 
7) Execution Scheduling Screen
Reference Screens Test-Tool-Description.pdf, Screenshots.pdf (Page 6), Screenshots2.pdf (Page 2)
- Execute all test suites from this page.
- Mandatory fields: Domain, Vendor, Test Plan, Functional Area, Suite Type.
- How do we make the search form more accessible to the user?
- How should the search results appear?
 
8) Result Analysis Screen
Reference Screens Test-Tool-Description.pdf, Screenshots.pdf (Page 7), Screenshots2.pdf (Page 2)
- All executed tests can be viewed here.
- How can we provide a simpler search form to the user?
- How can we mark mandatory fields such as Domain, Vendor, Test Plan, etc.?
- Show us the best way to display all result analysis.
- Can we hide the search form when it is not needed? Can we give the search results more screen real estate?
 
9) EQA Communique Add, Saved Drafts Screen
Reference Screens Screenshots2.pdf (Page 3-6)
- Right now the user needs to do a lot of scrolling to see the content.
- There is too much content here; how can we simplify it and keep the user focused?
 
10) Lab Notification Screen
Reference Screen Screenshots2.pdf (Page 7)
- Show us the best way to arrange Node Health area and Key Incidents section.
 
Important:
- Keep things consistent. This means all graphic styles should work together.
- All graphics should have a similar feel and general aesthetic appearance.
 
Marvel Prototype
- We need you to upload your screens to Marvel App.
- Please send your marvel app request to fajar.mln@gmail.com
- You MUST include your Marvel app URL (in your Marvel app prototype, click Share, copy the link, and include it in your notes/comments when you upload).
 
Documentation
- Test-Tool-Description.pdf (Description of Test Tool and Details for user flow)
- Existing Screenshot (Screenshots.pdf & Screenshots2.pdf)
- Branding Guidelines Document
 
Target Audience
- QA Teams
 
Judging Criteria
Your submission will be judged on the following criteria:
- Overall idea and execution of concepts
- How well does your design align with the objectives of the challenge
- Execution and thoughtfulness put into solving the problem
- Overall design and user experience
- Cleanliness of screen design and user flow
 
Submission & Source Files
Preview Image
Please create your preview image as one (1) 1024x1024px JPG or PNG file in RGB color mode at 72dpi and place a screenshot of your submission within it.
 
Submission File
- Submit JPG/PNG image files based on Challenge submission requirements stated above.
- MarvelApp link for review and to provide feedback
 
Source Files
All source files of all graphics created in Adobe Photoshop, Illustrator, or Sketch, saved with editable layers.
 
Final Fixes
As part of the final fixes phase, you may be asked to modify your graphics (sizes or colors) or modify overall colors.

Please read the challenge specification carefully and watch the forums for any questions or feedback concerning this challenge. It is important that you monitor any updates provided by the client or Studio Admins in the forums. Please post any questions you might have for the client in the forums.

How To Submit

  • New to Studio? Learn how to compete here
  • Upload your submission in three parts (Learn more here). Your design should be finalized and should contain only a single design concept (do not include multiple designs in a single submission).
  • If your submission wins, your source files must be correct and “Final Fixes” (if applicable) must be completed before payment can be released.
  • You may submit as many times as you'd like during the submission phase, but only the number of files listed above in the Submission Limit that you rank the highest will be considered. You can change the order of your submissions at any time during the submission phase. If you make revisions to your design, please delete submissions you are replacing.

Winner Selection

Submissions are viewable to the client as they are entered into the challenge. Winners are selected by the client and are chosen solely at the client's discretion.

ELIGIBLE EVENTS:

2018 Topcoder(R) Open


Submission format

Your Design Files:

  1. Look for instructions in this challenge regarding what files to provide.
  2. Place your submission files into a "Submission.zip" file.
  3. Place all of your source files into a "Source.zip" file.
  4. Declare your fonts, stock photos, and icons in a "Declaration.txt" file.
  5. Create a JPG preview file.
  6. Place the 4 files you just created into a single zip file. This will be what you upload.
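The packaging steps above can be sketched with Python's standard `zipfile` module. This is a hedged illustration, assuming the four files (`Submission.zip`, `Source.zip`, `Declaration.txt`, and a preview JPG, here assumed to be named `preview.jpg`) already exist in the working directory:

```python
import zipfile
from pathlib import Path

# Sketch of step 6: bundle the four submission files into one upload zip.
# "preview.jpg" and "Upload.zip" are assumed names, not mandated by the spec.

def package_upload(out_name: str = "Upload.zip") -> Path:
    parts = ["Submission.zip", "Source.zip", "Declaration.txt", "preview.jpg"]
    out = Path(out_name)
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        for part in parts:
            zf.write(part)  # each file sits at the root of the upload zip
    return out
```

Note that `Submission.zip` and `Source.zip` are themselves zips nested inside the final upload, matching steps 2, 3, and 6 above.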

Trouble formatting your submission or want to learn more? Read the FAQ.

Fonts, Stock Photos, and Icons:

All fonts, stock photos, and icons within your design must be declared when you submit. DO NOT include any 3rd party files in your submission or source files. Read about the policy.

Screening:

All submissions are screened for eligibility before the challenge holder picks winners. Don't let your hard work go to waste. Learn more about how to pass screening.


Questions? Ask in the Challenge Discussion Forums.

Source files

  • Layered PSD files created in Adobe Photoshop or similar
  • AI files created in Adobe Illustrator or similar
  • Sketch

You must include all source files with your submission.

Submission limit

Unlimited

ID: 30060993