Challenge Overview

The goal of this challenge is to build a small NodeJS application that accepts an input file of devices/streams and emulates sensors (devices) by sending data.

The application will also serve a small UI that allows a file to be uploaded and provides a screen for managing streams/devices.

Challenge Requirements

You will address the following in this challenge:

NodeJS Framework and Modules

  • - nodejs

  • - expressjs (latest version)

  • - async (latest version)

  • - JSON DB (latest version)
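
For reference, a minimal package.json dependency block covering these modules (plus socket.io, which is required later in this spec) might look as follows; the name, start script, and version ranges are illustrative, and the latest published versions should be used at submission time:

    {
      "name": "sensor-emulator",
      "version": "1.0.0",
      "scripts": {
        "start": "node app.js data/input.csv"
      },
      "dependencies": {
        "async": "2.x",
        "express": "4.x",
        "lowdb": "1.x",
        "socket.io": "1.x"
      }
    }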

Input File

  • - The node application will expose an endpoint that accepts an input file in CSV or JSON format with multiple device/stream pairs, data values, etc.

  • - A sample CSV/JSON file is provided in the challenge forums (direct link: https://drive.google.com/open?id=0B73tfGLY7t1HbUlOSDdLUHMwekU).

  • - Column clarifications (an illustrative row appears after this list):

    • - API Key - M2X API key

    • - DeviceID - Device ID / key in M2X

    • - Stream Name - name of the stream in M2X

    • - h00 - h24 - these are the 24 hourly value columns

    • - Randomizer value (e.g. 0.5) - the +/- threshold for a random number based on the hour's value (see the data generation section)

    • - Boolean: trend (yes/no) - see the data generation section
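
As an illustration only (the authoritative sample is the file posted in the challenge forums), a header and one data row following these columns could look like the lines below, where the ellipsis stands in for the remaining hourly columns and all values are made up:

    API Key,DeviceID,Stream Name,h00,h01,...,h23,h24,Randomizer,Trend
    0123456789abcdef,device01,temperature,70,69,...,71,72,0.5,Y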

REST API Backend

Write a nodejs/expressjs application meeting the following requirements:

  • - The application should accept the input file as an argument when run, e.g. node app.js <input file>

    • - If the file is in CSV format, it should be converted to JSON and stored in the JSON DB.

  • - Configuration (file only, no UI; a sample config file appears after this list)

    • - Port number - port the app is listening on

    • - Send Interval - number representing the time interval (in minutes) at which data is generated and sent [default = 1 minute].

  • - Use the JSON DB lowdb (https://github.com/typicode/lowdb) as the database in this application.

  • - REST API Endpoints

    • - POST /StopAll - stop sending data for all streams (idle until start is called)

      • - Basically this means stop the data generator scheduler job.

    • - POST /StartAll - start / restart sending data for all streams

      • - Logic:

        • - Load the new input file into the JSON DB (converting it to JSON if it is CSV)

        • - Mark all streams as enabled

        • - Start the data generator job

    • - POST /StartStream/{stream_name}

      • - This starts/restarts the data generation job for the given stream.

      • - Logic:

        • - Mark the stream as enabled in the JSON DB.

    • - POST /StopStream/{stream_name}

      • - This stops the data generation job for the given stream.

      • - Logic:

        • - Mark the stream as disabled in the JSON DB.

    • - POST /ImportFile - receives a new input file

      • - Add validation for the CSV/JSON file; it should have all the required fields present in the provided sample file.

      • - File logic:

        • - Calls StopAll.

        • - Stores the file locally (overwriting any existing file).

        • - Calls StartAll.

    • - PUT /Stream/{stream_id}

      • - Used to update the hourly value of the given stream.

      • - The passed parameters are the stream id, the hourly key, and the hourly value.

      • - Logic:

        • - Stop the given stream.

        • - Update the stream object in the JSON DB.

        • - Start the stream again.

    • - Move shared logic into a utility module so it can be reused by different endpoints; e.g. /ImportFile uses the same logic as /StopAll and /StartAll (see the sketch after this list).

  • - Socket.io

    • - Write a real-time Socket.IO emitter that pushes the latest generated value for each stream to connected clients.

  • - Sensitive data should be configurable.

  • - Add proper error handling.

  • - Add proper logging.
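
For instance, a configuration file along these lines would cover the two settings mentioned above (the file name and key names are illustrative, not prescribed):

    {
      "port": 3000,
      "sendIntervalMinutes": 1
    }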
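
Below is a minimal sketch of how the start/stop endpoints and the shared utility module could fit together. It assumes lowdb 1.x with the FileSync adapter; the module name streams-util.js, the shape of the streams collection, and the response bodies are illustrative assumptions, not part of the spec:

    // streams-util.js - shared start/stop logic reused by several endpoints
    const low = require('lowdb');
    const FileSync = require('lowdb/adapters/FileSync');

    const db = low(new FileSync('db.json'));
    db.defaults({ streams: [] }).write();

    // Mark a single stream as enabled/disabled in the JSON DB.
    function setStreamEnabled(name, enabled) {
      db.get('streams').find({ name: name }).assign({ enabled: enabled }).write();
    }

    // Mark every stream at once (used by /StartAll and /StopAll).
    function setAllEnabled(enabled) {
      db.get('streams').each(function (s) { s.enabled = enabled; }).write();
    }

    module.exports = { db: db, setStreamEnabled: setStreamEnabled, setAllEnabled: setAllEnabled };

    // app.js (excerpt) - endpoints delegate to the shared utility module
    const express = require('express');
    const util = require('./streams-util');
    const app = express();

    app.post('/StartStream/:stream_name', function (req, res) {
      util.setStreamEnabled(req.params.stream_name, true);
      res.json({ status: 'started', stream: req.params.stream_name });
    });

    app.post('/StopStream/:stream_name', function (req, res) {
      util.setStreamEnabled(req.params.stream_name, false);
      res.json({ status: 'stopped', stream: req.params.stream_name });
    });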

User Interface

  • - Use the default Bootstrap theme to deliver a basic but good look and feel.

  • - Use AngularJS for front-end communication with the backend.

  • - You will build a single admin page

    • - No authentication or authorization required.

    • - Simple Upload File

      • - The user will browse and upload a CSV/JSON file.

      • - There should be an upload progress indicator.

      • - The status message should reflect backend progress and workflow updates.

    • - A table listing all streams, allowing the user to “interrupt” the data generation by updating hourly values or turning a stream on/off.

      • - Show a button at the top to Stop All / Start All depending on the streams' state.

      • - Each row (stream) should display:

        • - Stream Name - as in the file.

        • - Last Value - the last generated value sent to M2X for this stream; this value will be updated in real time via Socket.IO (see the client sketch after this list).

        • - Stop / Start button - stops/starts only this stream.

        • - Update button - allows updating the stream data in the backend when the user changes any hourly value.

          • - It is preferred that this button is enabled only when the user changes an existing hourly field.

        • - Values for h00-h24 from the file.

          • - The fields are editable; only digits are allowed.
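
A minimal sketch of how the Last Value column could receive its real-time updates, assuming the backend emits a 'value' event whose payload carries the stream name and the new value (the event name and payload shape are assumptions shared with the generator sketch below):

    // Angular controller excerpt: keep the streams table current in real time.
    angular.module('adminApp', [])
      .controller('StreamsCtrl', ['$scope', function ($scope) {
        $scope.streams = []; // loaded from the backend on startup (not shown)

        var socket = io(); // socket.io client script served by the same backend
        socket.on('value', function (msg) {
          // msg is assumed to look like { stream: 'temperature', value: 71.8 }
          $scope.$apply(function () {
            $scope.streams.forEach(function (s) {
              if (s.name === msg.stream) { s.lastValue = msg.value; }
            });
          });
        });
      }]);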

Data Generator Job Scheduler

We will have a single job scheduler that runs periodically, using the configured “Send Interval” variable as the interval time.

The generator will generate data and send it to the stream using this API endpoint: https://m2x.att.com/developer/documentation/v2/device#Post-Data-Stream-Values

Each time the job scheduler runs, it will load all “enabled” streams from the JSON DB.

The logic to generate data is as follows (a sketch of the generator appears after this list):

  1. Random Value

    1. The input file contains a value for each hour in a 24-hour period. The app will generate and send a random number to M2X every X minutes (default 1 minute).
      A random number should be generated based on the correct hourly value (h00 - h24) for the current time, +/- the randomizer value. Let's say for the 12pm hour (h12), the temperature in the store is supposed to be 72 degrees and the randomizer value is set at 0.5. This means that every minute during h12, the simulator will create a temperature value within the range of 71.5 to 72.5 degrees (i.e. +/- 0.5 degrees from 72).

    2. Store the latest generated value in the JSON DB for this stream.

    3. Use socket.io to emit the generated value to connected clients.

  2. Trending value transition

    1. If the [TREND] column in the file for this stream is ‘Y’ (TRUE), the transition to the next hourly value (i.e. going from h01 to h02) is accomplished over 20 intervals (the default send interval is 1 minute). The increment is calculated as ABS([NewValue] - [OldValue]) / 20. If h00 = 72 and h01 = 82, the value is incremented by 0.5 each time data is sent, for 20 send intervals, creating an upward trend when graphed: {72.5, 73, 73.5, 74, …}

    2. After the 20 intervals are completed, the data generation logic should reset to the Random Value logic.

    3. Store the latest generated value in the JSON DB for this stream.

    4. Use socket.io to emit the generated value to connected clients.
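
The sketch below ties the two branches above together for a single scheduler tick. It assumes the request HTTP client and lowdb via the hypothetical streams-util module from the backend sketch; the M2X URL follows the Post Data Stream Values endpoint linked above, the record fields (apiKey, deviceId, name, randomizer, trend) mirror the input file columns but are naming assumptions, and a signed step is used so downward transitions also trend correctly (the spec's ABS formula gives the step's magnitude):

    // generator.js (excerpt) - one tick of the data generator job
    const request = require('request');      // any HTTP client would do
    const util = require('./streams-util');  // hypothetical module from the backend sketch

    const TREND_INTERVALS = 20;

    function hourKey(h) { return 'h' + (h < 10 ? '0' + h : h); }

    function generateValue(stream, now) {
      var hour = now.getHours();
      var base = Number(stream[hourKey(hour)]);

      if (stream.trend === 'Y') {
        // Trend branch: step from the previous hour's value toward this hour's
        // value over 20 intervals, then fall back to the random branch.
        var prev = Number(stream[hourKey((hour + 23) % 24)]);
        var step = (base - prev) / TREND_INTERVALS; // |step| = ABS(new - old) / 20
        var tick = stream.trendTick || 0;
        if (tick < TREND_INTERVALS) {
          stream.trendTick = tick + 1; // in-memory counter; a full implementation resets it on each hour change
          return prev + step * (tick + 1); // e.g. 72.5, 73, 73.5, ... for 72 -> 82
        }
      }

      // Random branch: hourly value +/- randomizer, e.g. 72 +/- 0.5 -> [71.5, 72.5].
      var r = Number(stream.randomizer);
      return base - r + Math.random() * 2 * r;
    }

    function tick(io) {
      // Only "enabled" streams take part in a tick.
      util.db.get('streams').filter({ enabled: true }).value().forEach(function (stream) {
        var value = generateValue(stream, new Date());

        // Post the value to M2X (Post Data Stream Values endpoint).
        request.post({
          url: 'https://api-m2x.att.com/v2/devices/' + stream.deviceId +
               '/streams/' + stream.name + '/values',
          headers: { 'X-M2X-KEY': stream.apiKey },
          json: { values: [{ timestamp: new Date().toISOString(), value: value }] }
        });

        // Persist the latest value and push it to connected UI clients.
        util.db.get('streams').find({ name: stream.name }).assign({ lastValue: value }).write();
        io.emit('value', { stream: stream.name, value: value });
      });
    }

    // Run every "Send Interval" minutes (default 1), per the configuration file.
    module.exports = function (io, sendIntervalMinutes) {
      return setInterval(function () { tick(io); }, (sendIntervalMinutes || 1) * 60 * 1000);
    };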

M2X Demo Account

  1. Sign up for a free account with AT&T M2X

  2. Create a device (hardware / other)

  3. Create a stream (note the endpoints for logging values to this stream)

  4. Review the API

  5. Review code samples from AT&T: https://m2x.att.com/developer/sample-code

Hosting

Provide deployment steps to host and run the application on Heroku (an outline follows).
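
A minimal outline, assuming the Heroku CLI is installed and the project is a git repository (the app name, file paths, and config-var names below are illustrative):

  1. Add a Procfile at the repository root so Heroku knows how to start the app, e.g. a single line: web: node app.js data/input.csv
  2. Make the app listen on process.env.PORT when it is set (Heroku assigns the port), falling back to the configured port locally.
  3. Create the app and deploy: run heroku create, then git push heroku master.
  4. Keep sensitive data out of the repository by using config vars, e.g. heroku config:set M2X_API_KEY=<your key>.
  5. Run heroku open and verify the admin page loads.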

Documents

A sample input CSV/JSON file is provided in the challenge forums.



Final Submission Guidelines

Deliverable

  • - All source code that implements the requirements.

  • - A README in a markup language.

  • - A verification document containing steps to verify your solution.

  • - Sample input file.

ELIGIBLE EVENTS:

2016 TopCoder(R) Open

REVIEW STYLE:

Final Review: Community Review Board

Approval: User Sign-Off

ID: 30051898