Topcoder - Create Logstash Config For Project Events


Challenge Overview

Background 

Previously, in Create Challenge Log Table And Triggers, we created the tcs_catalog:project_event_log table and the related triggers that dynamically insert records into it. Whenever a change happens to any table related to challenges, a record is added to the tcs_catalog:project_event_log table.

The tcs_catalog:project_event_log table has the following structure:

  • PROJECT_ID: Long. Matches project.project_id in the tcs_catalog:project table;
  • OPERATION: String representing the operation being performed on the challenge: INSERT, UPDATE, or DELETE;
  • DATE: date / time at which the operation happened;
  • SOURCE: String with the name of the source table in which the change happened. This could be the resource table in the case of a new person registering for a challenge, the project_info table in case the challenge name changed, the phase table in case the phase changed from submission to review, etc.;
  • SOURCE_ID: Long. PK of the item that triggered the event in the table identified in SOURCE (no foreign key constraint).

The tcs_catalog:project_event_log table is fairly standalone, with no constraints to other tables.

Here is the script to create the table:

create table 'informix'.project_event_log (
    project_id INT,
    operation VARCHAR(10) not null,
    date DATETIME YEAR TO FRACTION default CURRENT YEAR TO FRACTION not null,
    source VARCHAR(64) not null, 
    source_id INT not null
)
extent size 64 next size 64
lock mode row;
revoke all on project_event_log from 'public';

grant select,insert,update,delete on project_event_log to 'public' as 'informix';

Scope

For this challenge, we would like to create a Logstash process that polls that table at 30-second intervals and generates Kafka events every time new records are found in it.

The Logstash config shall be built to perform the following steps (a minimal sketch follows the list):

  1. Fetches all rows in the tcs_catalog:project_event_log table
  2. For each row in the tcs_catalog:project_event_log table, generates a JSON representation of the event that just happened. See the attached specification for how the event should look
  3. Generates one Kafka event per row, in ascending DATE order (earliest first)
  4. Deletes all successfully processed records from the tcs_catalog:project_event_log table. Successfully processed means the related event was delivered to Kafka
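As a starting point, here is a minimal Logstash config sketch using the jdbc input (polling on a 30-second schedule) and the kafka output. The Informix JDBC driver path, the INFORMIXSERVER name, the Kafka topic, and the filter are assumptions, not part of the challenge spec; the actual event shape must follow the attached specification, and the deletion of successfully processed rows (step 4) is not performed by the jdbc input itself and is left to the submission.

input {
  jdbc {
    # The Informix JDBC driver is not bundled with Logstash; this jar path is an assumption
    jdbc_driver_library => "/opt/informix/jdbc/ifxjdbc.jar"
    jdbc_driver_class => "com.informix.jdbc.IfxDriver"
    # The INFORMIXSERVER name below is an assumption - use the server name of your instance
    jdbc_connection_string => "jdbc:informix-sqli://localhost:2020/tcs_catalog:INFORMIXSERVER=informixoltp_tcp"
    jdbc_user => "informix"
    jdbc_password => "1nf0rm1x"
    # rufus-scheduler cron with a leading seconds field: poll every 30 seconds
    schedule => "*/30 * * * * *"
    # Earliest rows first, so Kafka receives events in ascending DATE order (step 3)
    statement => "SELECT project_id, operation, date, source, source_id FROM project_event_log ORDER BY date ASC"
  }
}

filter {
  # Placeholder only - shape the event according to the attached specification
  mutate {
    rename => { "project_id" => "projectId" }
  }
}

output {
  kafka {
    bootstrap_servers => "localhost:9092"
    # Topic name is an assumption
    topic_id => "tc.project.event"
    codec => json
  }
}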

Setup Environment

Generally, everything can be tested with Docker images.

Informix on Docker

We have a Docker image with Informix installed; you can use it for testing purposes. Please use the following steps after you have installed the Docker Tools.

  • docker run -it -p 2020:2020 mdesiderio/arena:informix bash
  • Inside the container, switch to the informix user: sudo su - informix
  • In the informix user's home folder, run the start-informix.sh script
  • You can now use dbaccess to work with Informix
  • You can also connect to Informix at localhost:2020

The password for the informix user is 1nf0rm1x

Kafka on Docker

Please use https://hub.docker.com/r/spotify/kafka/
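
For reference, that image runs ZooKeeper and Kafka in a single container and is typically started along these lines (the ADVERTISED_HOST value assumes a local, single-host setup; adjust it to your Docker host):

docker run -p 2181:2181 -p 9092:9092 --env ADVERTISED_HOST=localhost --env ADVERTISED_PORT=9092 spotify/kafka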

Logstash on Docker

Please use https://hub.docker.com/_/logstash/
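
For example, the config can be mounted from the host and passed to the image with -f; the file name project-events.conf is just a placeholder, and the Informix JDBC driver jar also needs to be made available to the container (e.g. via an additional volume mount):

docker run -it --rm -v "$PWD":/config-dir logstash -f /config-dir/project-events.conf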



Final Submission Guidelines

  • Logstash config
  • Detailed deployment guide and verification steps
  • Test script files, such as an SQL file to insert records into the project_event_log table (a minimal example follows).
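
A minimal test script sketch, based on the table definition above; the project_id and source_id values are made up, and the date column is omitted so it falls back to its CURRENT YEAR TO FRACTION default:

-- Hypothetical sample rows for verifying the Logstash pipeline
insert into project_event_log (project_id, operation, source, source_id)
  values (30005520, 'UPDATE', 'project_info', 123456);
insert into project_event_log (project_id, operation, source, source_id)
  values (30005520, 'INSERT', 'resource', 654321);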

ELIGIBLE EVENTS: 2016 TopCoder(R) Open

Review style: Final Review
Final Review: Community Review Board
Approval: User Sign-Off

ID: 30053273