Challenge Overview
Project Overview
Data Fabric enables customers to respond to data changes and innovate more quickly. From the CIO's (Chief Information Officer's) perspective, Data Fabric helps them manage and maintain their existing projects and project details. Our customer wants to build a Data Fabric platform with a microservice architecture that exposes APIs for its operations. The Data Fabric platform will provide a dashboard of all existing projects that allows users to view CIO-level metrics and data.
Challenge Objectives
- Write a document (in .md format) to define and consolidate the microservice architecture, including:
  - Service Discovery: Eureka
  - Session-Centric Persistence: we prefer to use MySQL for data persistence.
  - Inter-service Communications: which technologies should we use?
  - How to divide the microservices: define which service provides which functionality, and give the reasons for your division, including the pros and cons.
  - Circuit Breakers: we use Netflix Hystrix. It helps monitor service failures and tolerate faults (see the sketch after this list).
  - Logging & Monitoring
  - Cache: ElasticSearch
- Build a Swagger spec in OpenAPI 3.0 format based on the details provided below.
- Containerize: each service should be containerized in a Docker container. Note that Kubernetes, not Docker Compose, should be used for container orchestration.
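For illustration only (not a required part of the submission), here is a minimal sketch, assuming Spring Cloud Netflix, of how a service could register with Eureka and wrap an inter-service call in a Hystrix circuit breaker over a load-balanced `RestTemplate`. The service name `project-service`, the class names, and the endpoint path are assumptions made for this sketch.

```java
// Minimal sketch (assumed names): a Spring Cloud service that registers with Eureka
// and calls another service behind a Hystrix circuit breaker.
// Assumed dependencies: spring-cloud-starter-netflix-eureka-client,
// spring-cloud-starter-netflix-hystrix.
import com.netflix.hystrix.contrib.javanica.annotation.HystrixCommand;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.circuitbreaker.EnableCircuitBreaker;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
import org.springframework.cloud.client.loadbalancer.LoadBalanced;
import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@SpringBootApplication
@EnableDiscoveryClient   // register this service with the Eureka server
@EnableCircuitBreaker    // enable Hystrix circuit breakers
public class DashboardApplication {

    public static void main(String[] args) {
        SpringApplication.run(DashboardApplication.class, args);
    }

    @Bean
    @LoadBalanced // resolve logical service names (e.g. "project-service") via Eureka
    public RestTemplate restTemplate() {
        return new RestTemplate();
    }
}

@Service
class ProjectClient {

    private final RestTemplate restTemplate;

    ProjectClient(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    // Inter-service call protected by a Hystrix circuit breaker; falls back to a
    // default value when "project-service" (an assumed service name) is unavailable.
    @HystrixCommand(fallbackMethod = "defaultSummary")
    public String fetchProjectSummary(long projectId) {
        return restTemplate.getForObject(
                "http://project-service/projects/" + projectId + "/summary", String.class);
    }

    String defaultSummary(long projectId) {
        return "Project summary temporarily unavailable";
    }
}
```

The architecture document should justify the actual service boundaries and communication choices; this sketch only shows how the named Netflix OSS pieces plug together.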
Technology Stack
- REST
- Swagger
- Java
- Spring Cloud
- MySQL
- Microservice
- Docker
- Kubernetes
- ElasticSearch
- Netflix OSS, including:
  - Eureka
  - Hystrix
  - ...
Dashboard of projects
The project dashboards are divided into three levels.
1. Level 1 - Project Dashboard
The Project Dashboard shows the overall status of each project. For each project, it provides the following information:
- Schedule status
- Budget status
- Project name
- PR Number
- Size
- Number of apps impacted
- Project Manager
- DTD VP
- Business Sponsor VP
- Project descriptions/capabilities
Please see the following two pictures for more details.
For the API in this level, only `GET` operations should be provided.
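As a non-prescriptive sketch of what a Level 1 read-only endpoint could look like in the Spring stack listed above, the snippet below exposes a single `GET /projects` operation. The path, class names, and field names are assumptions derived from the attribute list above.

```java
// Hypothetical sketch of the Level 1 read-only dashboard API.
// The /projects path and all field names are assumptions based on the list above.
import java.util.List;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ProjectDashboardController {

    // GET /projects - overall status of every project (Level 1 exposes only reads).
    @GetMapping("/projects")
    public List<ProjectDashboardItem> listProjects() {
        // In a real service this would be loaded from MySQL (and possibly cached
        // in ElasticSearch); an empty list keeps the sketch self-contained.
        return List.of();
    }
}

// One row of the Level 1 dashboard, mirroring the attributes listed above.
class ProjectDashboardItem {
    public String projectName;
    public String prNumber;
    public String scheduleStatus;
    public String budgetStatus;
    public String size;
    public int numberOfAppsImpacted;
    public String projectManager;
    public String dtdVp;
    public String businessSponsorVp;
    public String description;
}
```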
2. Level 2 - Project Details
Project details should provide detailed project information, which includes:
- Current phase(s)
- Status summary
- Risks and mitigations
- Impacted apps
- Budget status

The project details should also provide the high-level status for each active phase:
- On/off schedule
- Days off plan

Please see the following two pictures for more details.
For the APIs in this level, only `GET` operations should be provided.
Please decompose this level into three APIs (ProjectPhaseSummary, ProjectReleases, and ProjectRisks), where:
- ProjectPhaseSummary includes the following items
- Current phase(s)
- Status summary
- ProjectReleases includes
- Impacted apps
- Budget status
- On/off schedule
- Days off plan
- ProjectRisks includes
- Risks
- Mitigations
Note: this is our basic API division; if you have any doubts, please ask in the forum.
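The hedged sketch below shows one possible controller layout for this three-way decomposition; the `/projects/{projectId}/...` paths and class names are assumptions, and the real models should carry the fields listed above.

```java
// Hypothetical sketch of the Level 2 decomposition into three GET-only resources.
// Paths and class names are assumptions; the response models would hold the fields
// listed above (phase summary, releases, risks).
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/projects/{projectId}")
public class ProjectDetailsController {

    // ProjectPhaseSummary: current phase(s) and status summary.
    @GetMapping("/phase-summary")
    public ProjectPhaseSummary getPhaseSummary(@PathVariable long projectId) {
        return new ProjectPhaseSummary();
    }

    // ProjectReleases: impacted apps, budget status, on/off schedule, days off plan.
    @GetMapping("/releases")
    public ProjectReleases getReleases(@PathVariable long projectId) {
        return new ProjectReleases();
    }

    // ProjectRisks: risks and their mitigations.
    @GetMapping("/risks")
    public ProjectRisks getRisks(@PathVariable long projectId) {
        return new ProjectRisks();
    }
}

// Placeholder models; the actual fields come from the lists above.
class ProjectPhaseSummary { }
class ProjectReleases { }
class ProjectRisks { }
```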
3. Level 3 - Project Phase-Wise Details
This level should provide details for each phase: Funding, Requirements, Design, Development, Testing, Deployment.
3.1 Funding Phase
3.2 Requirements Phase
3.3 Design Phase
3.4 Development Phase
3.5 Testing Phase
3.6 Deployment Phase
For the APIs in this level, only `GET` operations should be provided.
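As one possible (not prescribed) shape for these APIs, the sketch below exposes the phase as a path parameter; six separate endpoints, one per phase, would be an equally valid design. The path and all names are assumptions.

```java
// Hypothetical sketch of a Level 3 phase-wise GET endpoint. The path, the Phase enum,
// and the PhaseDetails model are assumptions for illustration only.
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ProjectPhaseController {

    enum Phase { FUNDING, REQUIREMENTS, DESIGN, DEVELOPMENT, TESTING, DEPLOYMENT }

    // GET /projects/{projectId}/phases/{phase} - details for a single phase.
    @GetMapping("/projects/{projectId}/phases/{phase}")
    public PhaseDetails getPhaseDetails(@PathVariable long projectId,
                                        @PathVariable Phase phase) {
        return new PhaseDetails();
    }
}

// Placeholder model; the real fields depend on the phase-specific details in the schema.
class PhaseDetails { }
```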
Database entity information
See the following two pictures.
You can also refer to the schema that the client provided for more details.
Individual Requirements
- Design robust and performance-oriented API(s) for the given set(s) of functionalities.
- Build a Swagger spec in OpenAPI 3.0 format based on the details provided above.
- Make sure to include appropriate models as part of your Swagger. Also, make sure that your Swagger handles error cases for each endpoint as applicable (see the error-handling sketch after this list).
- For responses, you should cover all the cases, not just the 200 status code.
- Create sample request bodies for each API path.
- Write a document (in .md format) to support your API spec and its mapping to functionalities.
- Write another document (in .md format) to define and consolidate the microservice architecture, including all required information as mentioned above.
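To illustrate the kind of non-200 handling each endpoint's Swagger definition should document, here is a hedged sketch of a shared error model and exception handler; `ErrorResponse`, `ProjectNotFoundException`, and the chosen status codes are assumptions for illustration, not part of the required spec.

```java
// Hypothetical sketch: a shared error model plus a Spring @RestControllerAdvice that
// maps exceptions to 404, 400, and 500 responses. All names are assumptions.
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

// Error payload that every non-200 response could share in the Swagger models.
class ErrorResponse {
    public int status;
    public String message;

    ErrorResponse(int status, String message) {
        this.status = status;
        this.message = message;
    }
}

// Assumed domain exception for an unknown project id.
class ProjectNotFoundException extends RuntimeException {
    ProjectNotFoundException(String message) { super(message); }
}

@RestControllerAdvice
class ApiExceptionHandler {

    // 404 - unknown project id.
    @ExceptionHandler(ProjectNotFoundException.class)
    public ResponseEntity<ErrorResponse> notFound(ProjectNotFoundException ex) {
        return ResponseEntity.status(HttpStatus.NOT_FOUND)
                .body(new ErrorResponse(404, ex.getMessage()));
    }

    // 400 - invalid request parameters.
    @ExceptionHandler(IllegalArgumentException.class)
    public ResponseEntity<ErrorResponse> badRequest(IllegalArgumentException ex) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST)
                .body(new ErrorResponse(400, ex.getMessage()));
    }

    // 500 - any other unhandled failure.
    @ExceptionHandler(Exception.class)
    public ResponseEntity<ErrorResponse> serverError(Exception ex) {
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body(new ErrorResponse(500, "Unexpected server error"));
    }
}
```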
Scorecard & Review Criteria
We’ll follow a subjective scorecard (1-10) for this challenge. Submissions will be evaluated by the copilot (and the client) on the basis of:
- Completeness of Swagger file
- Adherence to best practices for designing REST API
- Quality of documentation and mapping file
- Quality and rationality of the microservice architecture
As we are performing a subjective review, there won't be Appeals and Appeals Response phases.
Final Submission Guidelines
Please zip all of the following files in your submission.
- Swagger spec
- The document (in .md format) to support your API spec and its mapping to functionalities.
- The document (in .md format) to define and consolidate the microservice architecture, including all required information as mentioned above.