AG Agentic Developer Assessment


Challenge Summary:

Overview

In this assessment, you will be provided with a simplified version of a payment gateway dashboard's source code. We reduced the complexity of the code, but also introduced intentional issues across the codebase. You will need to use AI coding agents (Claude Code, Codex, Cursor, etc.) to study the existing codebase, understand the project flow, identify the issues, and fix them accordingly. This assessment covers the full Software Development Lifecycle (Planning & Analysis, Development, Testing, Code Review, Debugging, Deployment, and Documentation) and is designed to evaluate your ability to effectively direct and validate AI agent output.

In short, you will have:

- 3 x entry challenges (Setup)

- 8 x main challenges (Development, Testing, Review, Deployment)

- Bonus challenges

- Compulsory deliverables (Prompt logs, AI Retrospective, Git commits)

You will be given 3 days to complete the assessment.


Here is a brief overview of the relationships between modules to help you understand the code:

Group -> User -> Job -> Job Log

Group -> Payment Method -> Payment Account


Challenge Checklist:

Entry SETUP

  1. Download this Laravel code and set it up (small catch here: you will need to bypass a platform error)
  2. Find a way to log in to the dashboard
  3. Recover/update the password to get into the dashboard
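If the existing credentials turn out to be unusable, one common way to reset a Laravel user's password is through the `php artisan tinker` REPL. The sketch below only prints the snippet rather than executing it, and the model path `App\Models\User` and the choice of resetting the first user are assumptions to verify against this codebase:

```shell
# Print a tinker snippet for resetting the dashboard password.
# Assumptions: the user model lives at App\Models\User and resetting
# the first user is acceptable -- check both against the actual project.
SNIPPET=$(cat <<'EOF'
php artisan tinker
>>> $user = App\Models\User::first();
>>> $user->password = Hash::make('new-password');
>>> $user->save();
EOF
)
printf '%s\n' "$SNIPPET"
```

Older Laravel versions keep the model at `App\User` instead, so confirm the namespace before running the snippet.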

Challenges

  1. Study the code and figure out how to perform CRUD operations on payment accounts via the API. Use ag-dashboard's group (MCP) in the API call for testing, and document it (best to attach a Postman screenshot / URL). DEVELOPMENT
  2. The payment account API above has a CORS issue when embedded as AJAX on the frontend; apply a fix for it. DEBUGGING
  3. Create an hourly report with filters to highlight successful / failed transactions (jobs); refer to the video below. You can test the report based on the 25-05-2021 ~ 01-06-2021 data. DEVELOPMENT
  4. Chart filter requirements:
    • Date
    • Group
    • Bank
    • Job Type
  5. Make the chart's points clickable, linking each point to its corresponding job (open the job view in a _blank page). DEVELOPMENT
  6. Use an AI agent to scan the codebase for bugs, logic errors, and security issues. Document your findings clearly: what the AI agent found vs. what you found by manually reviewing the AI's output. CODE REVIEW
    There are multiple intentional bugs embedded in the codebase. The more you find, the better.
  7. Write a test suite for core modules (API endpoints, model relationships) using AI agents. Must include at least one edge case the AI initially missed that you caught and added manually. TESTING
    Show us you can evaluate AI-generated tests critically, not just accept them blindly.
  8. Based on the chart you made, perform some analysis: identify failed jobs without a stated reason, investigate via the limited info in the job log, and list down possible causes and fixes (documentation). DEBUGGING
  9. Containerize the application: create a Dockerfile + docker-compose.yml using an AI agent, then host it in the cloud (any platform will do). Attach the login credentials that you recovered in the setup step for us to review. DEPLOYMENT
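For the CRUD challenge, the API can be exercised from the command line before building a Postman collection. The endpoint path `/payment-accounts`, the bearer-token auth scheme, and the JSON field names below are all assumptions (confirm the real ones in routes/api.php); the script only prints the commands rather than executing them:

```shell
# Dry-run sketch of CRUD calls against the payment account API.
# BASE, the /payment-accounts path, the token placeholder, and the JSON
# fields are assumptions -- confirm them against routes/api.php.
BASE="${BASE:-http://localhost:8000/api}"
AUTH='Authorization: Bearer <token>'
CMDS=""

# Collect and print each command instead of executing it (dry run).
show() { CMDS="$CMDS $*"; printf '+ %s\n' "$*"; }

show curl -X POST   "$BASE/payment-accounts" -H "$AUTH" -H 'Content-Type: application/json' -d '{"name":"Test","payment_method_id":1}'
show curl -X GET    "$BASE/payment-accounts/1" -H "$AUTH"
show curl -X PUT    "$BASE/payment-accounts/1" -H "$AUTH" -H 'Content-Type: application/json' -d '{"name":"Renamed"}'
show curl -X DELETE "$BASE/payment-accounts/1" -H "$AUTH"
```

Dropping the `show` wrapper runs the calls for real; the same four requests map directly onto a Postman collection for the documentation deliverable.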

Bonus Challenge

  1. Crawl https://fpxstatus.billplz.com at a regular (configurable) interval.
  2. Store the crawl results (database, JSON file, etc.).
  3. Create a page in ag-dashboard and display the crawled results accordingly.
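The bonus crawler's loop can be sketched as below. The fetch step is stubbed with a fixed string so the sketch runs without network access, and the output file name and interval are assumptions; in the real solution the fetch would be a `curl` call and the loop would run under cron or a Laravel scheduled command:

```shell
# Polling-loop sketch for the bonus challenge: fetch the status page on an
# interval and append timestamped snapshots as JSON lines.
URL="https://fpxstatus.billplz.com"
OUT="fpx_status.jsonl"                 # output file name is an assumption
INTERVAL="${INTERVAL:-300}"            # seconds between crawls (assumption)

fetch() { echo '{"status":"stub"}'; }  # stub; for real use: curl -s "$URL"

for i in 1 2 3; do                     # real use: while true; do ...; sleep "$INTERVAL"; done
  printf '{"ts":"%s","body":%s}\n' "$(date -u +%FT%TZ)" "$(fetch)" >> "$OUT"
done
```

Storing one JSON object per line keeps ingestion trivial; the ag-dashboard page then only has to read and render the stored rows.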

*Compulsory Deliverables*

1. AI Prompt/Output Log DOCUMENTATION

Share your AI conversation history for each challenge. We want to see how you direct AI agents, not just the final result.

Accepted formats:

  • Conversation share link (e.g. Claude chat link, ChatGPT share link)
  • Exported PDF / HTML of the conversation
  • Conversation dump / transcript file (Markdown, text, etc.)

2. AI Retrospective Document

Write a short document covering:

  • Where AI helped most
  • Where AI failed or produced incorrect output
  • Where you had to intervene manually
  • What you'd do differently next time

3. Git & Submission
  • Git commit for each solution
  • Register an account here https://gitlab.agsmartit.com and upload your modified source code to AG GitLab
    Note: After registering, please ping us to approve your GitLab account before you can push your code.