Challenge Overview

Welcome to the "Fisheries Application - Automated QA Idea Generation Challenge". Due to the limited size of the SPC Oceanic Fisheries Programme Data Management unit's development team, a gap has emerged in software testing capabilities, particularly automated testing.

Larger development teams tend to have staff members dedicated to aspects of testing, which involves specific skills and knowledge, but that isn't currently an option for the Fisheries team.

The need for improved testing of the fisheries applications has become more acute with the release of the mobile applications Tails and OnBoard. These applications cannot be updated as readily with bug fixes, since each release must go through the Google Play deployment process; more importantly, they will be used in environments where users may not be able to update the applications at all, e.g. at sea. In these cases it is particularly important that updates don't have major bugs. In addition, the dependence of multiple applications (Tufman 2 web, Tails and OnBoard) on the Tufman 2 API creates a potential single point of failure, so bugs introduced there by updates also need close attention.


Challenge goal

The goal of this challenge is to have competitors:
* Assess the fisheries applications and devise a strategy for testing them, focusing especially on automated testing
* Recommend infrastructure and products for developing testing tools in our environment; this includes which testing libraries or frameworks to use for the various parts of the system (e.g. API, mobile apps, website) and how they would be used to build and run the tests (e.g. security aspects, system access, where to run the tests from, etc.)
* Advise on and document how these tools could fit into the developers' overall development and deployment workflow

Application overview

Tufman 2 API:

* REST API running on .NET Web API
* WebAPI module-based permissions
* Authentication through either direct DB username / password or Google
* Quartz-based scheduler to handle cleanup and importing data via email, among other tasks
* SQL Server is used for the DB, behind an NHibernate ORM layer
* A few unit tests exist for the API and are run through either Visual Studio or ReSharper
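
To make this layer concrete, here is a minimal sketch of what an automated API test could look like, using Mocha, SuperTest and Chai as one possible framework choice. The base URL and endpoint paths below are placeholders, not the real API surface:

```typescript
// Hypothetical API test sketch using Mocha, SuperTest, and Chai.
// The base URL and endpoint paths are placeholders, not the real API surface.
import request from 'supertest';
import { expect } from 'chai';

const api = request('https://tufman2.example.org'); // placeholder base URL

describe('Tufman 2 API (hypothetical endpoints)', () => {
  it('serves read-only data without authentication', async () => {
    const res = await api.get('/api/trips').expect(200); // assumed path
    expect(res.body).to.be.an('array');
  });

  it('rejects writes that lack an auth token', async () => {
    await api.post('/api/trips').send({ vessel: 'TEST' }).expect(401);
  });
});
```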

Tufman 2 web:

* Angular 1-based UI that consumes the Tufman 2 API
* The only fully supported browser is Chrome
* Users are country-specific and see only the data for the country associated with their account
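
As an illustration of UI automation against an Angular 1 application, a hypothetical Protractor spec might look like the following. The URL, ng-model names, and CSS selector are assumptions:

```typescript
// Hypothetical Protractor spec for the Tufman 2 web UI. Protractor waits for
// AngularJS $http/$digest cycles automatically, which suits an Angular 1 app.
// The URL, ng-model names, and CSS selector below are assumptions.
import { browser, element, by } from 'protractor';

describe('Tufman 2 web login (hypothetical)', () => {
  it('logs in and shows country-specific data', async () => {
    await browser.get('https://tufman2.example.org/'); // placeholder URL
    await element(by.model('credentials.username')).sendKeys('test-user');
    await element(by.model('credentials.password')).sendKeys('test-pass');
    await element(by.buttonText('Log in')).click();
    // After login the user should only see data for their own country.
    expect(await element(by.css('.country-name')).getText()).not.toEqual('');
  });
});
```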

Tails and OnBoard:

* Android mobile applications
* Built using Angular, Ionic, and Cordova
* Read-only access to Tufman 2 API data is available without logging in
* Saving data requires a token from the authentication layer
* Local storage is used to save data when offline; the data is synchronized when a network connection becomes available

https://play.google.com/store/apps/details?id=spc.ofp.onBoard
https://play.google.com/store/apps/details?id=spc.ofp.tails
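
For the Android layer, Appium is one option. A hypothetical session driving the OnBoard APK through WebdriverIO could look like this; the capabilities, APK path, and accessibility id are all assumptions:

```typescript
// Hypothetical Appium session driving the OnBoard APK via WebdriverIO.
// The capabilities, APK path, and accessibility id are all assumptions.
import { remote } from 'webdriverio';

async function smokeTestOnBoard(): Promise<void> {
  const driver = await remote({
    hostname: 'localhost',
    port: 4723, // default Appium server port
    capabilities: {
      platformName: 'Android',
      'appium:automationName': 'UiAutomator2',
      'appium:app': 'C:\\apks\\onboard.apk', // placeholder path on a Windows host
    },
  });
  try {
    // Read-only data should load without a login, per the app description.
    const list = await driver.$('~trip-list'); // '~' selects by accessibility id
    await list.waitForDisplayed({ timeout: 15000 });
  } finally {
    await driver.deleteSession();
  }
}

smokeTestOnBoard().catch((err) => {
  console.error(err);
  process.exit(1);
});
```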

Authentication

Authentication happens via a custom layer called O2Auth, which is similar to OAuth. It can use either user accounts stored in the SQL Server database or Google accounts. For the purposes of this challenge, you can assume that automated tests will be able to log in and retrieve an API token.
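
As a sketch of how a test suite might obtain such a token, here is a hypothetical login helper. The O2Auth endpoint path and response field are assumptions, and it relies on a global fetch (Node 18+) or a polyfill:

```typescript
// Hypothetical helper that logs in against the O2Auth layer and returns a
// token for use in automated tests. The endpoint path and response field are
// assumptions; requires a global fetch (Node 18+) or a polyfill.
async function getTestToken(username: string, password: string): Promise<string> {
  const res = await fetch('https://tufman2.example.org/o2auth/token', { // placeholder URL
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ username, password }),
  });
  if (!res.ok) {
    throw new Error(`Authentication failed with status ${res.status}`);
  }
  const body = await res.json();
  return body.access_token; // assumed response field
}
```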

Requirements

Process recommendations:
The client is looking for recommendations for unit testing and QA tools that can be used to formally validate the full stack of applications:

* During development
* Before production deployments
* Smoke-testing after a deployment has completed

Your submission must include documentation on how the recommended tools, with implemented automated tests, can fit into the three areas above.
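
As one illustration of the third area, a hypothetical post-deployment smoke test could simply verify that the API answers and the web UI is being served. The base URL and version endpoint are placeholders:

```typescript
// Hypothetical post-deployment smoke test: verifies that the API answers and
// the web UI is being served. The base URL and version endpoint are placeholders.
import request from 'supertest';

const base = 'https://tufman2.example.org'; // placeholder base URL

describe('Post-deployment smoke test (hypothetical)', () => {
  it('API responds', async () => {
    await request(base).get('/api/version').expect(200); // assumed endpoint
  });

  it('web UI is served', async () => {
    await request(base).get('/').expect(200);
  });
});
```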

Framework recommendations:

Recommendations for frameworks and products that can be used for automated testing of each layer:
* JSON REST API (Tufman 2 API)
* Web application UI (Tufman 2 Web)
* Android applications (Tails and OnBoard)

Setup details: 

* Setup and installation details, targeting Windows
* Example tests that show how a test can be created, including setup and teardown of each test (a sketch follows this list)
* Any other details necessary to document how the tests can be set up and run
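
For example, a per-test setup and teardown pattern in Mocha might look like this. The fixture endpoints are hypothetical; a real suite might instead seed and reset test data directly in the SQL Server database:

```typescript
// Hypothetical per-test setup/teardown pattern with Mocha and SuperTest.
// The fixture endpoints are assumptions; a real suite might instead seed and
// reset test data directly in the SQL Server database.
import request from 'supertest';

const api = request('https://tufman2-test.example.org'); // placeholder test instance

describe('Trip editing (hypothetical)', () => {
  let tripId: string;

  beforeEach(async () => {
    // Setup: seed one trip record so the test starts from a known state.
    const res = await api.post('/test/fixtures/trips').expect(201); // assumed endpoint
    tripId = res.body.id;
  });

  afterEach(async () => {
    // Teardown: remove the seeded record so tests stay independent.
    await api.delete(`/test/fixtures/trips/${tripId}`);
  });

  it('updates a seeded trip', async () => {
    await api.put(`/api/trips/${tripId}`).send({ vessel: 'UPDATED' }).expect(200);
  });
});
```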

Code Coverage

Your submission must include details on how coverage reports can be obtained for each layer of the application using the tools you recommend. Proper code coverage is key to a well-tested application, so coverage reports will be an important part of the process.

Documentation

The more documentation, examples, and explanation of how your recommendations fit into the development process your submission contains, the more likely the customer is to choose it as a winner. Think about how someone will implement tests using your documented flow, and how they will use the tests they write to properly validate the applications. Documentation beyond what is listed above is welcome as well: architecture diagrams, flow charts, and similar artifacts can all help describe your ideas fully and help the client understand what you are recommending.

Scorecard

Note - Although this challenge is in the "Code" track, the "Idea Generation" scorecard will be used for scoring. Please make sure you are aware of this and plan your submission accordingly.

Final Submission Guidelines

Please see above

ELIGIBLE EVENTS:

2018 Topcoder(R) Open

Review style: Final Review
Final Review: Community Review Board
Approval: User Sign-Off

ID: 30058590