
Challenge Overview

"Help us find where the Eagle has landed, Again!!"

 

Have you ever looked at images on your favorite mapping website and noticed how the world they depict has changed? Maybe you looked at your childhood home and spotted an old car in the driveway, or a street where stores have since been built. More dramatically, have you noticed changes in the landscape, like the differences in these glaciers over time?

 

NASA is currently collecting data from the Moon and also has images from the 1960s. They are looking to develop a software application, the Lunar Mission Coregistration Tool (LMCT), that will process publicly available imagery files from past lunar missions and enable manual comparison with imagery from the ongoing Lunar Reconnaissance Orbiter (LRO) mission. Check out this introductory video to find out more!

 

The imagery processed by this tool will be used in existing citizen science applications to identify long-lost spacecraft components, such as the Eagle, the Apollo 11 lunar module that returned Buzz Aldrin and Neil Armstrong to the command module and has since impacted the lunar surface, though we don’t know where. It will also be used to identify and examine recent natural impact features on the lunar surface by comparing the old images against the new ones. This is known as image (co-)registration in the field of computer vision. The successful result of this project will allow us to better understand what has been going on on the Moon for the past sixty years (maybe now is when we’ll discover if it’s really made of cheese).
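
To make the term concrete, here is a minimal sketch of generic feature-based image registration in Python with OpenCV. It only illustrates the idea described above; it is not the LMCT codebase or NASA's actual pipeline, and the file names are placeholders.

    # Generic feature-based registration sketch (NOT the LMCT codebase):
    # detect keypoints in an old and a new lunar frame, match them, estimate
    # a homography, and warp the old frame into the new frame for comparison.
    # The file names below are placeholders.
    import cv2
    import numpy as np

    old_img = cv2.imread("apollo_era_frame.png", cv2.IMREAD_GRAYSCALE)
    new_img = cv2.imread("lro_frame.png", cv2.IMREAD_GRAYSCALE)

    # Detect and describe keypoints in both images.
    orb = cv2.ORB_create(5000)
    kp1, des1 = orb.detectAndCompute(old_img, None)
    kp2, des2 = orb.detectAndCompute(new_img, None)

    # Match descriptors and keep the strongest matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

    # Estimate a homography that maps the old image onto the new one.
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp the old image into the new image's frame for side-by-side comparison.
    registered = cv2.warpPerspective(old_img, H, (new_img.shape[1], new_img.shape[0]))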

Task Detail

As you may know, we have run a series of challenges (ideation, code challenge, and code refinement) before this one. Please refer to the previous challenge specs for background information.

 

We are happy with the winning solution so far. It is based on the USGS Integrated Software for Imagers and Spectrometers (ISIS) 3 library and supports both the USGS ISIS Cube image format and the JPEG2000 image format.
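
If it helps when preparing test inputs, both supported formats can also be read with common open-source tooling. The hedged sketch below uses GDAL (assuming a build with the ISIS3 and JPEG2000 drivers available), not the winning solution's own I/O layer, and the file names are placeholders.

    # Sketch of reading both supported formats for test-case preparation,
    # assuming a GDAL build with the ISIS3 and JPEG2000 drivers available.
    # This is independent of the winning solution's own I/O code, and the
    # file names are placeholders.
    from osgeo import gdal

    def load_band(path):
        """Open an image (ISIS .cub or .jp2) and return band 1 as a NumPy array."""
        ds = gdal.Open(path)
        if ds is None:
            raise IOError(f"GDAL could not open {path}")
        return ds.GetRasterBand(1).ReadAsArray()

    cube = load_band("lro_nac_sample.cub")      # USGS ISIS Cube
    scan = load_band("apollo_scan_sample.jp2")  # JPEG2000
    print(cube.shape, cube.dtype, scan.shape, scan.dtype)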


In this challenge, we would like you to provide more test cases so that we can better evaluate the current codebase and identify the necessary improvements. Here are some ideation reports that could be useful.

Final Submission Guidelines

Submission

Your submission should contain:

  • A set of test cases. For each test case, please describe its major focus and the expected output (see the sketch after this list for one possible layout).

  • A detailed evaluation report of the current codebase. How does it perform on your test cases? If it is not performing as expected, what insights do you have about how to improve it? If it throws errors, what changes to the codebase do you suggest?

  • Detailed deployment instructions. Which libraries are required? How are they installed? How do we run your evaluations?
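
As a rough illustration of the test-case descriptions requested above, the sketch below packages each case's focus, inputs, and expected output in a small Python structure. Every name, path, and threshold here is hypothetical and not part of the current codebase.

    # Hypothetical sketch of a test-case manifest: each entry states the
    # case's focus, its inputs, and the expected output, as requested above.
    # Paths, names, and thresholds are placeholders, not part of the codebase.
    TEST_CASES = [
        {
            "name": "cross_mission_scale_difference",
            "focus": "Apollo-era frame vs. modern LRO NAC frame with large "
                     "resolution and illumination differences",
            "inputs": ["data/apollo15_metric.cub", "data/lro_nac.cub"],
            "expected": "co-registered output produced, RMSE below ~2 px",
        },
        {
            "name": "jpeg2000_input_path",
            "focus": "exercise the JPEG2000 decoding path",
            "inputs": ["data/apollo_scan.jp2", "data/lro_wac.cub"],
            "expected": ".jp2 input accepted, no errors, co-registered output",
        },
    ]

    # A simple summary of what will be exercised; an actual runner would call
    # the codebase under test for each entry and compare against "expected".
    for case in TEST_CASES:
        print(f"{case['name']}: {case['focus']} -> {case['expected']}")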

Judging Criteria

You will be judged on the quality of your test case designs and evaluation reports. It is important to provide insightful suggestions for the future code refinement challenges. Note that the evaluation in this challenge may involve some subjectivity from the client and Topcoder; however, the judging criteria below will largely be the basis for the judgment.

 
  1. Comprehensiveness (40%)

    1. Do your test cases cover many different scenarios for the possible input images?

    2. Do your test cases cover many different scenarios for the engineering design?

  2. Evaluation Reports (40%)

    1. Have you conducted the right evaluations and documented the results?

    2. Have you proposed constructive suggestions where the current codebase runs into issues?

  3. Clarity (20%)

    1. Please make sure your evaluation reports are easy to follow. Figures, charts, and tables are welcome.

 

Submission Guideline

We will only evaluate your last submission. Please include your full solution and as much detail as possible in a single submission.

 

ELIGIBLE EVENTS:

2021 Topcoder(R) Open

Review style: Final Review (Community Review Board)

Approval: User Sign-Off

ID: 30175660