Welcome to ATEM, the adaptive task execution and machine learning package designed for FTC robotics and beyond. Follow this quick-start guide to get up and running with ATEM in your project.
pip install atem
ATEM provides a Python API to train a TensorFlow Lite model based on your tasks.
- Create a tasks.json file to define the tasks your robot will perform. Here’s an example:
{
  "tasks": [
    { "name": "Observation Zone", "time": 5, "points": 3 },
    { "name": "Net Zone", "time": 4, "points": 2 },
    { "name": "Low Basket", "time": 5, "points": 4 },
    { "name": "High Basket", "time": 7, "points": 9 }
  ]
}
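The file is plain JSON, so you can sanity-check it before training (a minimal sketch):
import json

with open("tasks.json") as f:
    tasks = json.load(f)["tasks"]

print(f"Loaded {len(tasks)} tasks:", [task["name"] for task in tasks])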
- Train a Model: Use the following Python script to train and save a TensorFlow Lite model:
from atem.model_train import ModelTrainer
# Paths to tasks file and output model
tasks_file = "tasks.json"
output_model_path = "adaptive_model.tflite"
# Initialize ModelTrainer
trainer = ModelTrainer(tasks_file=tasks_file, output_model_path=output_model_path)
# Train the model and save it
trainer.train_and_save_model(epochs=20, batch_size=16)
print("Task-to-index mappings:")
print(trainer.get_task_mappings())
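Once training finishes, you can confirm the exported .tflite file loads and inspect its tensor shapes with TensorFlow's standard TFLite interpreter (a quick check, not part of the ATEM API):
import tensorflow as tf

# Load the exported model and print its input/output tensor details.
interpreter = tf.lite.Interpreter(model_path="adaptive_model.tflite")
interpreter.allocate_tensors()

print("Inputs: ", interpreter.get_input_details())
print("Outputs:", interpreter.get_output_details())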
ATEM models can be integrated into Java-based FTC projects using TensorFlow Lite for inference.
- Step 1: Add TensorFlow Lite Dependencies
dependencies {
    implementation 'org.tensorflow:tensorflow-lite:2.11.0'
}
- Step 2: Include the Trained Model in Your Project. Place the adaptive_model.tflite file in the assets directory of your FTC project:
TeamCode/src/main/assets/adaptive_model.tflite
- Step 3: Implement Model Inference. Use the following Java code to load the trained model and run inference:
import org.tensorflow.lite.Interpreter;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
public class ATEMModelInterpreter {
    private final Interpreter interpreter;
    public ATEMModelInterpreter(String modelPath) throws Exception {
        // TFLite requires a direct (or memory-mapped) buffer, so copy the model bytes into one.
        byte[] modelBytes = Files.readAllBytes(Paths.get(modelPath));
        ByteBuffer modelBuffer = ByteBuffer.allocateDirect(modelBytes.length).order(ByteOrder.nativeOrder());
        modelBuffer.put(modelBytes);
        this.interpreter = new Interpreter(modelBuffer);
    }
    public String predictNextTask(String currentTask, Map<String, Double> sensorData,
                                  Map<String, Integer> taskToIndex, Map<Integer, String> indexToTask, int maxLength) {
        // Fixed-length input: current task index followed by the sensor values, zero-padded to
        // maxLength. Adjust this layout to match how your model was actually trained.
        float[][] input = new float[1][maxLength];
        input[0][0] = taskToIndex.get(currentTask);
        int i = 1;
        for (double value : sensorData.values()) {
            if (i < maxLength) input[0][i++] = (float) value;
        }
        // One confidence score per known task; the highest-scoring task is the prediction.
        float[][] output = new float[1][indexToTask.size()];
        interpreter.run(input, output);
        int best = 0;
        for (int j = 1; j < output[0].length; j++) if (output[0][j] > output[0][best]) best = j;
        return indexToTask.get(best);
    }
}
Use the trained model to predict and execute tasks dynamically.
- Step 1: Initialize the Model Interpreter
ATEMModelInterpreter modelInterpreter = new ATEMModelInterpreter("adaptive_model.tflite");
- Step 2: Use Predictions to Execute Tasks
String nextTask = modelInterpreter.predictNextTask(currentTask, sensorData, taskToIndex, indexToTask, 5);
System.out.println("Predicted Next Task: " + nextTask);
- Model Training:
  - Train a TensorFlow model to optimize task sequences for maximum points.
  - Tasks and their respective points are dynamically loaded from a JSON file.
  - Outputs a TensorFlow Lite model for lightweight deployment.
- Model Interpretation:
  - Given a list of tasks, predicts the optimal sequence and total points.
  - Outputs human-readable task orders and scores.
The tasks.json file defines the tasks available for the autonomous phase:
{
  "tasks": [
    { "name": "High Basket", "points": 10, "time": 5 },
    { "name": "Low Basket", "points": 5, "time": 3 },
    "..."
  ]
}
The model uses task data to train on sequences of tasks that maximize points within a time limit (a sketch of the sequence-generation step follows the list below):
- Loads tasks from the tasks.json file.
- Generates random task sequences within the given time constraint.
- Encodes tasks and trains a model to predict scores based on sequences.
- Outputs a TensorFlow Lite model for deployment.
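The sequence generation is handled internally by ModelTrainer's training pipeline; the sketch below only illustrates the idea, and the 30-second budget is a placeholder value:
import json
import random

TIME_LIMIT = 30  # placeholder time budget in seconds; use your real autonomous-period length

with open("tasks.json") as f:
    tasks = json.load(f)["tasks"]

def random_sequence(tasks, time_limit=TIME_LIMIT):
    # Shuffle the task list and greedily keep tasks while they still fit in the time budget.
    order, elapsed = [], 0
    for task in random.sample(tasks, len(tasks)):
        if elapsed + task["time"] <= time_limit:
            order.append(task)
            elapsed += task["time"]
    return order

sequence = random_sequence(tasks)
print([t["name"] for t in sequence], "->", sum(t["points"] for t in sequence), "points")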
The interpreter script takes a sequence of tasks, predicts the total points, and outputs the best sequence in human-readable format.
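A stripped-down sketch of that idea, with predict_points standing in for the trained model's score prediction (it is a hypothetical callable, not an ATEM function):
from itertools import permutations

def best_sequence(tasks, predict_points, time_limit=30):
    # Exhaustive search over orderings that fit the time budget; fine for small task lists.
    best, best_score = [], float("-inf")
    for length in range(1, len(tasks) + 1):
        for candidate in permutations(tasks, length):
            if sum(t["time"] for t in candidate) > time_limit:
                continue
            score = predict_points(candidate)  # stand-in for the model's predicted total points
            if score > best_score:
                best, best_score = list(candidate), score
    return [t["name"] for t in best], best_score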
Model Architecture
- Input:
  - Task indices (embedded into dense vectors).
  - Task times (numeric values).
- Hidden Layers:
  - Dense layers for feature extraction and sequence analysis.
- Output:
  - Predicted total points for a given task sequence.
Data Encoding
- Task names are encoded as numerical indices.
- Task times are padded to a fixed length for uniform input.
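The actual architecture lives in atem.model_train, but a minimal Keras model consistent with this description might look like the following (the layer widths and lengths below are illustrative, not ATEM's real values):
import tensorflow as tf
from tensorflow.keras import layers

MAX_LEN = 5      # illustrative fixed sequence length
NUM_TASKS = 10   # illustrative task-vocabulary size

task_ids = layers.Input(shape=(MAX_LEN,), name="task_indices")
task_times = layers.Input(shape=(MAX_LEN,), name="task_times")

# Embed task indices into dense vectors, then combine them with the numeric times.
embedded = layers.Flatten()(layers.Embedding(input_dim=NUM_TASKS, output_dim=16)(task_ids))
features = layers.Concatenate()([embedded, task_times])

# Dense hidden layers for feature extraction / sequence analysis.
hidden = layers.Dense(64, activation="relu")(features)
hidden = layers.Dense(32, activation="relu")(hidden)

# Single regression output: predicted total points for the sequence.
points = layers.Dense(1, name="predicted_points")(hidden)

model = tf.keras.Model(inputs=[task_ids, task_times], outputs=points)
model.compile(optimizer="adam", loss="mse")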
The Adaptive Task Prediction Model is designed to enable real-time decision-making for autonomous robots. It processes sensor data after each task completion, predicts the next optimal task, and adjusts its strategy based on the robot’s current state and environmental feedback.
This dynamic approach ensures the robot maximizes performance, conserves resources, and adapts to unexpected changes in real-world scenarios.
After completing each task, the robot gathers sensor data to provide a snapshot of its current state:
- Time Elapsed: Time taken to complete the task.
- Distance to Target: The robot's proximity to the next goal.
- Gyro Angle: Orientation relative to the reference.
- Battery Level: Remaining energy for task prioritization.
- Additional sensor inputs like vision or LIDAR can be incorporated.
Sensor data and the current task ID are encoded into a format compatible with the machine learning model (see the sketch after this list):
- Continuous values are normalized for consistent input ranges.
- Categorical values are converted to embeddings or indices.
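For illustration, an encoding step along these lines might look like this (the feature order and scaling constants are placeholders; the real values are fixed by how the model was trained):
def encode_state(current_task, sensor_data, task_to_index, max_length=5):
    # Pack the current task ID and normalized sensor readings into one fixed-length vector.
    features = [
        float(task_to_index[current_task]),
        sensor_data["time_elapsed"] / 30.0,        # scale by the period length (placeholder: 30 s)
        sensor_data["distance_to_target"] / 5.0,   # scale by a maximum expected distance (placeholder)
        sensor_data["gyro_angle"] / 360.0,
        sensor_data["battery_level"] / 100.0,
    ]
    # Pad (or truncate) so the vector always matches the model's expected input length.
    features += [0.0] * (max_length - len(features))
    return features[:max_length]

state = encode_state(
    "Observation Zone",
    {"time_elapsed": 20, "distance_to_target": 0.5, "gyro_angle": 45, "battery_level": 70},
    task_to_index={"Observation Zone": 0, "Net Zone": 1, "Low Basket": 2, "High Basket": 3},
)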
The model processes the encoded input to:
- Predict the Next Task:
  - Outputs the most likely task to maximize performance.
- Provide Task Scores:
  - Confidence levels for all possible tasks.
Example:
Input:
- Current Task: "Observation Zone"
- Sensor Data: {time_elapsed: 20, distance_to_target: 0.5, gyro_angle: 45, battery_level: 70}
Output:
- Predicted Next Task: "High Basket"
- Task Scores: [0.1, 0.8, 0.1]
The Adaptive Task Prediction Model utilizes a TensorFlow Lite (TFLite) model for efficient inference. This lightweight, optimized model is specifically designed for resource-constrained environments like robotics systems, ensuring fast and accurate predictions in real time.
- Current Task: Encoded as a numerical ID using the task_to_index mapping.
- Sensor Data: Real-time inputs such as:
  - time_elapsed: Normalized elapsed time.
  - distance_to_target: Scaled distance to the next target.
  - gyro_angle: Angle, normalized to a fixed range.
  - battery_level: Percentage value normalized between 0 and 1.
- The inputs are padded to match the model's expected dimensions if needed.
- The interpreter runs the input through the pre-trained model.
- The output includes:
  - Predicted Task Scores: Confidence scores for each possible task.
  - Selected Task: The task with the highest score (see the Python sketch below).
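The same flow can be exercised off-robot with TensorFlow's Python TFLite interpreter (a sketch; the input vector must match the layout the model was trained with):
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="adaptive_model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Encoded current task + sensor vector (see encode_state above); the shape and ordering
# must match input_details[0]["shape"] for your trained model.
state = np.array([[0.0, 20 / 30.0, 0.5 / 5.0, 45 / 360.0, 70 / 100.0]], dtype=np.float32)

interpreter.set_tensor(input_details[0]["index"], state)
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])[0]

index_to_task = {0: "Observation Zone", 1: "Net Zone", 2: "Low Basket", 3: "High Basket"}  # from training
print("Task scores:", scores)
print("Predicted next task:", index_to_task[int(np.argmax(scores))])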
- After completing a task, the robot feeds its current state (task + sensor data) into the model.
- The AI processes the input and:
  - Predicts the next task to perform.
  - Scores all potential tasks to indicate confidence levels.
- The robot executes the predicted task with the highest score, as in the loop sketched below.
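Put together, the on-robot decision loop looks roughly like this (a Python sketch; execute_task, read_sensors, and predict_next_task stand in for your robot's own control, sensing, and inference code):
import time

def run_autonomous(predict_next_task, execute_task, read_sensors, start_task, time_limit=30.0):
    # Repeatedly execute the model's highest-scoring task until the time budget runs out.
    current_task = start_task
    start = time.time()
    while time.time() - start < time_limit:
        execute_task(current_task)        # robot-specific actuation (stand-in)
        sensor_data = read_sensors()      # e.g. time_elapsed, distance_to_target, gyro_angle, battery_level
        current_task = predict_next_task(current_task, sensor_data)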
- Adaptive Task Model: Predict the next task based on sensor data and task history.
- Task Training: Train custom machine learning models using a tasks.json file.
- Real-time Adaptation: Simulate real-world scenarios for task execution.
- Pathfinding Integration: Extendable for integration with A* pathfinding for robotics.
- Lightweight TensorFlow Lite Integration: For efficient model inference.
pip install atem