Dev Log #2 | Autonomous AI Architecture & Strategic Module System

1. Paradigm Shift: Logic over Physics

Today I made a major architectural decision. Given that the target platform is Android (mobile), heavy use of the physics engine (Rigidbody/AddForce) would consume a lot of battery and CPU power.

  • Problem: AddForce on a Rigidbody was often hard for the AI to predict and felt “heavy” on mobile devices due to the constant inertia and friction calculations.
  • Solution: Switch to manual kinematics using Transform.Translate and Lerp.
  • Result: The drone’s movement is now fully deterministic, sharp, and 100% under the control of the AI script, while still looking natural thanks to the Visual Tilting system.
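The switch described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual component: field names (`moveSpeed`, `tiltAngle`, `visualBody`) and the 90°/s turn rate are assumptions.

```csharp
using UnityEngine;

// Sketch of the kinematic movement approach: translate and rotate the root
// directly, and fake the "physical" feel by banking a child mesh with Lerp.
public class KinematicMover : MonoBehaviour
{
    public float moveSpeed = 5f;      // forward units per second (assumed value)
    public float tiltAngle = 15f;     // max visual bank angle, in degrees
    public float tiltSmoothing = 8f;  // Lerp factor for the visual tilt
    public Transform visualBody;      // child mesh that tilts; the root only translates

    float steerInput; // -1..1, set by the AI each frame

    public void SetSteering(float steer) => steerInput = Mathf.Clamp(steer, -1f, 1f);

    void Update()
    {
        // Deterministic motion: no Rigidbody, so no inertia or friction solving.
        transform.Translate(Vector3.forward * moveSpeed * Time.deltaTime, Space.Self);
        transform.Rotate(0f, steerInput * 90f * Time.deltaTime, 0f);

        // Visual Tilting: bank the mesh into the turn so the motion still reads as physical.
        Quaternion targetTilt = Quaternion.Euler(0f, 0f, -steerInput * tiltAngle);
        visualBody.localRotation = Quaternion.Lerp(
            visualBody.localRotation, targetTilt, tiltSmoothing * Time.deltaTime);
    }
}
```

Because only the child mesh rotates for the tilt, the root transform stays flat and the AI's heading math never has to account for the bank angle.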

2. Digital Senses: Multi-Raycast Sensors

Drones no longer have “tunnel vision.” I implemented a Multi-Raycast (Fan Sensor) system to provide a wider spatial perception.

  • Configuration: 3-way sensor (left −30°, center 0°, right +30°).
  • Logic: The AI can now tell which side an obstacle is on. If the left ray detects a hit, the AI automatically commands a right rotation.
  • Optimization: A dedicated “Obstacle” LayerMask restricts the raycasts to relevant colliders, keeping their CPU overhead negligible.
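A fan sensor along these lines could look like the sketch below. The class and field names are illustrative assumptions; only the −30°/0°/+30° angles and the LayerMask filtering come from the log.

```csharp
using UnityEngine;

// Sketch of the 3-way fan sensor: three raycasts spread around the forward
// axis, filtered by the "Obstacle" LayerMask so irrelevant colliders are skipped.
public class FanSensor : MonoBehaviour
{
    public float range = 6f;            // assumed sensor range
    public LayerMask obstacleMask;      // set to the "Obstacle" layer in the Inspector

    static readonly float[] angles = { -30f, 0f, 30f }; // left, center, right

    // Returns one hit flag per ray: [left, center, right].
    public bool[] Sense()
    {
        bool[] hits = new bool[angles.Length];
        for (int i = 0; i < angles.Length; i++)
        {
            // Rotate the forward vector by each ray's angle around the up axis.
            Vector3 dir = Quaternion.AngleAxis(angles[i], transform.up) * transform.forward;
            hits[i] = Physics.Raycast(transform.position, dir, range, obstacleMask);
        }
        return hits;
    }
}
```

The AI then only needs the pattern of flags: a hit on the left ray alone means "steer right," a hit on all three means "turn hard."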

3. Module System: Energy & Weight Balance

This is the core of the strategic layer of Drones Lab. Every decision in the Hangar has a real impact on the drone’s performance in the arena. I designed a three-way trade-off system:

  • Weight: The heavier the module loadout, the greater the penalty on maximum speed (Weight Penalty).
  • Energy (Power): The battery produces power, while the sensors and motor consume it. If energy hits 0, the drone is completely dead.
  • Speed (Engine): The engine module provides a base thrust that is recalculated against the drone’s total load.
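The trade-off above can be expressed as a small plain-C# model. The linear weight penalty and all numeric defaults here are illustrative assumptions; the project's actual tuning formula is not shown in this log.

```csharp
// Sketch of the three-way trade-off: weight scales speed down,
// and the energy balance decides whether the drone stays alive.
public class DroneLoadout
{
    public float baseThrust;      // from the engine module
    public float totalWeight;     // sum of all installed module weights
    public float batteryOutput;   // power produced per second
    public float powerDraw;       // sensor + motor consumption per second
    public float energy;          // currently stored energy

    // Weight Penalty: heavier loadouts reduce the engine's effective speed.
    // A linear penalty is assumed here purely for illustration.
    public float EffectiveSpeed(float weightPenaltyPerUnit = 0.1f)
    {
        float speed = baseThrust - totalWeight * weightPenaltyPerUnit;
        return speed > 0f ? speed : 0f;
    }

    // Energy balance per tick: battery income minus consumption.
    // Returns false once energy reaches 0, i.e. the drone is completely dead.
    public bool Tick(float deltaTime)
    {
        energy += (batteryOutput - powerDraw) * deltaTime;
        if (energy < 0f) energy = 0f;
        return energy > 0f;
    }
}
```

Because `EffectiveSpeed` is derived rather than stored, swapping a module in the Hangar immediately changes arena performance without any extra bookkeeping.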

4. Refactoring: Clean Architecture

To keep the code scalable, I did a major cleanup of the DroneHardware and DroneBrain classes.

  • DroneBrain: Acts as “Cognition.” It only processes data (sensors, weight) and makes decisions (thrust & steering). It doesn’t touch any Unity functionality at all (pure C#).
  • DroneHardware: Acts as the “Actuator.” It receives commands from the Brain and executes them as visual movement and coordinate translation. It uses Unity’s MonoBehaviour.
  • GameManager: Manages the configuration of the currently active modules (MonoBehaviour).
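The Cognition/Actuator split could be wired up roughly as below. The `Decision` struct, method names, and the hard-coded sensor inputs are assumptions for illustration, not the project's actual API.

```csharp
using UnityEngine;

// "Cognition": plain C#, no UnityEngine types. Data in, decisions out,
// which also makes this class unit-testable outside the engine.
public class DroneBrain
{
    public struct Decision { public float Thrust; public float Steering; }

    public Decision Decide(bool leftHit, bool centerHit, bool rightHit, float energy)
    {
        var d = new Decision { Thrust = energy > 0f ? 1f : 0f, Steering = 0f };
        if (leftHit) d.Steering = 1f;         // obstacle on the left  -> steer right
        else if (rightHit) d.Steering = -1f;  // obstacle on the right -> steer left
        else if (centerHit) d.Steering = 1f;  // obstacle dead ahead   -> pick a side
        return d;
    }
}

// "Actuator": a MonoBehaviour that turns Brain decisions into movement.
public class DroneHardware : MonoBehaviour
{
    public float moveSpeed = 5f;
    public float turnSpeed = 90f;
    readonly DroneBrain brain = new DroneBrain();

    void Update()
    {
        // In the real project the inputs come from the fan sensor and the
        // module system; they are hard-coded here to keep the sketch short.
        var d = brain.Decide(false, false, false, energy: 1f);
        transform.Rotate(0f, d.Steering * turnSpeed * Time.deltaTime, 0f);
        transform.Translate(Vector3.forward * d.Thrust * moveSpeed * Time.deltaTime, Space.Self);
    }
}
```

Keeping DroneBrain free of UnityEngine references means its decision logic can be exercised in plain unit tests, with no scene or play mode required.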

Conclusion & Next Steps

By the end of this morning’s session, the drone can avoid walls on its own while managing its remaining battery. It’s no longer just a mobile robot, but a simulation system.

Next Goals:

  • Implementation of the combat system (ranged/melee weapon modules), so we can finally have some fun!
  • Creation of a more complex State Machine (Wander, Chase, Attack).
  • Development of the Hangar UI for real-time module customization.
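As a forward-looking note on the second goal, the planned Wander/Chase/Attack loop could start as simply as this. The states come from the list above; the transition inputs (`targetVisible`, `targetInRange`) are assumptions about what the future combat sensors will report.

```csharp
// Sketch of the planned state machine. Pure C#, so it can slot into
// DroneBrain without touching any Unity functionality.
public enum DroneState { Wander, Chase, Attack }

public class DroneStateMachine
{
    public DroneState State { get; private set; } = DroneState.Wander;

    public void Tick(bool targetVisible, bool targetInRange)
    {
        switch (State)
        {
            case DroneState.Wander:
                if (targetVisible) State = DroneState.Chase;
                break;
            case DroneState.Chase:
                if (!targetVisible) State = DroneState.Wander;
                else if (targetInRange) State = DroneState.Attack;
                break;
            case DroneState.Attack:
                if (!targetInRange)
                    State = targetVisible ? DroneState.Chase : DroneState.Wander;
                break;
        }
    }
}
```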

Developer’s Note:
“It’s not about how well we maneuver the drone, but how intelligently we design the logic and choose modules within the limited slots. That is the core result I want to achieve by creating this simulation.”
