
Challenge Overview

Welcome to the “Remote Expert Mobile iOS ARKit PoC #2 Challenge”. In this challenge, we need your help exploring a library to implement video calls with ARKit support.
 

CONTEXT

PROJECT CONTEXT

This project aims to build an iOS (iPhone) application that helps field service personnel identify documents, videos, and/or remote experts to assist them in solving issues in the field while they're working.

The app should leverage ARKit so that AR objects can be shown during live video calls, improving the ability of remote experts to provide contextual cues to the user during a call.
 

CHALLENGE CONTEXT

In this challenge we are going to build a PoC application that makes video calls and has ARKit functionality on top of them. This is the first application on the way to building the entire application.


CHALLENGE DETAILS

TECHNOLOGY

  • iOS (Swift)
  • ARKit
  • OpenTok

INDIVIDUAL REQUIREMENTS

  • You need to build a sample iOS application leveraging the following technology: OpenTok. Use the latest version of the Swift language (see the OpenTok sketch after this list).
  • The application should be able to connect two iOS (iPhone) devices in a video call.
  • You are required to build an annotation capability that can be used during the video call.
  • Use Case to be implemented:
    • A user (i.e., a field technician) performing a task on an object is unable to complete the task and requires the help of another user (i.e., an expert) who is located somewhere else (anywhere in the world).
    • Thus, the field technician initiates a video call with the remote expert. Unlike a traditional video call, this call is not for the two users to see each other; it is so that the field technician can broadcast live video of the object he or she is working on, using the native video/camera capability of his or her mobile device.
    • The remote expert will see the live video and will then be able to add AR annotation(s) during the live video that both the remote expert and the user will see as an overlay on the live video.
    • Note: The critical part is that everything happens in real time during the live video (i.e., there are no still photos that are shared, annotated, and sent back).
  • Once the annotation action button is tapped, the annotation menu options appear on the screen. Note: The user will have the option of selecting:
    • Pen/Marker or Highlighter
    • Eraser
    • Object (for insertion) (Optional)
    • Color Palette (optional)
  • IMPORTANT: All annotations added to the video should leverage augmented reality (i.e., ARKit) to remain anchored to a specific space/location (i.e., they should be spatially aware).
  • For example, if the user moves the camera, the annotation may go out of view, but when the user moves the camera back, the annotation is still in the space where it was originally added (see the ARKit sketch after this list). Example screenshots are given below.
  • We don't have a hardcoded design; you can use your creativity based on the screens below. Use any publicly available icons or libraries (MIT or Apache license).
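
For reference, here is a minimal sketch (in Swift) of the standard OpenTok publish/subscribe pattern for connecting two devices in a video call, plus sharing annotation data over the OpenTok signaling channel. The API key, session ID, token, the CallViewController class, and the "annotation" signal type are illustrative assumptions, not part of the challenge requirements; integrating ARKit-rendered frames into the published video (e.g., via a custom OpenTok video capturer) is left to your solution.

import OpenTok
import UIKit

// Placeholder credentials: in a real build these come from your own
// OpenTok (Vonage) project and a token server; they are assumptions here.
let kApiKey = "YOUR_API_KEY"
let kSessionId = "YOUR_SESSION_ID"
let kToken = "YOUR_TOKEN"

class CallViewController: UIViewController {

    lazy var session: OTSession = {
        OTSession(apiKey: kApiKey, sessionId: kSessionId, delegate: self)!
    }()
    var publisher: OTPublisher?
    var subscriber: OTSubscriber?

    override func viewDidLoad() {
        super.viewDidLoad()
        var error: OTError?
        session.connect(withToken: kToken, error: &error)
        if let error = error { print("Connect failed: \(error)") }
    }

    // The "annotation" signal type is an illustrative choice for sharing
    // annotation data (e.g., JSON-encoded stroke points) with the peer.
    func sendAnnotation(_ json: String) {
        var error: OTError?
        session.signal(withType: "annotation", string: json, connection: nil, error: &error)
    }
}

extension CallViewController: OTSessionDelegate {
    func sessionDidConnect(_ session: OTSession) {
        // Publish the local camera feed once the session is connected.
        let settings = OTPublisherSettings()
        settings.name = UIDevice.current.name
        guard let publisher = OTPublisher(delegate: self, settings: settings) else { return }
        self.publisher = publisher
        var error: OTError?
        session.publish(publisher, error: &error)
    }

    func session(_ session: OTSession, streamCreated stream: OTStream) {
        // Subscribe to the other participant's stream (technician <-> expert).
        guard let subscriber = OTSubscriber(stream: stream, delegate: self) else { return }
        self.subscriber = subscriber
        var error: OTError?
        session.subscribe(subscriber, error: &error)
        if let view = subscriber.view { self.view.addSubview(view) }
    }

    func session(_ session: OTSession, receivedSignalType type: String?,
                 from connection: OTConnection?, with string: String?) {
        // Annotation data sent by the peer arrives here in real time.
        if type == "annotation", let json = string { print("Annotation received: \(json)") }
    }

    func session(_ session: OTSession, streamDestroyed stream: OTStream) { subscriber = nil }
    func sessionDidDisconnect(_ session: OTSession) {}
    func session(_ session: OTSession, didFailWithError error: OTError) { print(error) }
}

extension CallViewController: OTPublisherDelegate, OTSubscriberDelegate {
    func publisher(_ publisher: OTPublisherKit, didFailWithError error: OTError) { print(error) }
    func subscriber(_ subscriber: OTSubscriberKit, didFailWithError error: OTError) { print(error) }
    func subscriberDidConnect(toStream subscriberKit: OTSubscriberKit) {}
}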
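
For the spatial-anchoring requirement, the sketch below shows one possible approach, assuming the technician's side renders the camera feed through an ARSCNView: a tap is raycast into the ARKit session and the annotation node is added in world coordinates, so it stays put when the camera moves away and back. The addAnnotation(at:in:) helper and the sphere marker are hypothetical stand-ins for real pen/marker strokes.

import ARKit
import SceneKit
import UIKit

// Illustrative helper, assuming the technician's device renders its camera
// feed through an ARSCNView. The sphere geometry stands in for a real
// pen/marker stroke or inserted object.
func addAnnotation(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Raycast from the tapped screen point onto estimated planes so the
    // annotation lands on the real-world surface under the finger.
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }

    let marker = SCNNode(geometry: SCNSphere(radius: 0.005))
    marker.geometry?.firstMaterial?.diffuse.contents = UIColor.systemRed
    marker.simdTransform = result.worldTransform

    // The node is placed in world coordinates, so it remains at the same
    // physical spot even if the camera pans away and comes back.
    sceneView.scene.rootNode.addChildNode(marker)
}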
 

[Example screenshots]



Final Submission Guidelines

Zip file with:
  • iOS application
  • Detailed instructions for validating the application
  • Video recording link (optional)

ELIGIBLE EVENTS:

2020 Topcoder(R) Open

Review style: Final Review (Community Review Board)

Approval: User Sign-Off

ID: 30122533