Merge pull request #80 from AccelerationConsortium/update-course3-tutorials

Update course3 tutorials
sgbaird authored Oct 9, 2024
2 parents 948b0a2 + 8f0db7d commit 2d3275d
Showing 4 changed files with 242 additions and 17 deletions.
1 change: 1 addition & 0 deletions docs/conf.py
@@ -154,6 +154,7 @@
"1.4-hardware-software-communication.ipynb", # MicroPython code
"1.4.1-onboard-led-temp.ipynb", # assumes a MCU is actively receiving
"1.5-data-logging.ipynb", # MicroPython code
"1.5.1-aws-lambda-read.ipynb", # just not tested
"2.*", # TODO: Bayes opt notebooks
"3.*", # TODO: Robotics notebooks
"4.*", # TODO: Software dev notebooks
42 changes: 36 additions & 6 deletions docs/courses/robotics/3.4-mobile-robotics.md
@@ -15,7 +15,7 @@ In this module, you will develop software to:

### Bill of Materials

- [MyCobot 280 Pi - World's Smallest and Lightest Six-Axis Collaborative Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot)
A versatile and compact six-axis robot, ideal for mobile robotics applications.

- [Camera Flange 2.0](https://shop.elephantrobotics.com/en-ca/collections/camera-modules/products/camera-flange-2-0)
@@ -30,6 +30,8 @@ In this module, you will develop software to:
- [LIDAR sensor for obstacle detection]
Used for real-time obstacle detection and mapping in mobile robotics.

- Printed AprilTags (can be generated and printed from [AprilTag Generation](https://github.com/AprilRobotics/apriltag-generation))

- USB-A to micro USB-B cable:
Used to connect and power devices such as the Raspberry Pi or peripherals.

@@ -44,8 +46,6 @@ In this module, you will develop software to:
- [AprilTags Python Library](https://pypi.org/project/apriltag/)
A computer vision library for identifying and tracking AprilTags, used for spatial referencing and navigation.

- [ROS2 Humble](https://docs.ros.org/en/humble/Installation.html)
ROS2 (Robot Operating System) is an open-source framework for building robot applications. **Humble Hawksbill** is the currently recommended version due to its stability and long-term support.

@@ -62,7 +62,6 @@ In this module, you will develop software to:
Guide for controlling the adaptive gripper using Python commands (a short control sketch follows the notes below).



#### Notes
These materials provide a comprehensive setup for controlling a mobile cobot, including vision systems, robotic arms, grippers, and obstacle detection sensors. The setup integrates ROS and workflow orchestration using Prefect, enabling asynchronous task execution and complex robot control in various environments, such as autonomous labs or educational settings.
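
As a quick, hedged illustration of the Python control mentioned above — the serial port, baud rate, joint angles, and speeds below are placeholder assumptions, not values from the course guides — a `pymycobot` session might look like this:

```python
from pymycobot.mycobot import MyCobot

# Port and baud rate are typical for the Raspberry Pi version, but verify for your unit
mc = MyCobot("/dev/ttyAMA0", 1000000)

# Move all six joints to a neutral pose at a moderate speed
mc.send_angles([0, 0, 0, 0, 0, 0], 50)

# Open, then close, the adaptive gripper (0 = open, 1 = close)
mc.set_gripper_state(0, 50)
mc.set_gripper_state(1, 50)

# Read back the current joint angles
print(mc.get_angles())
```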

@@ -788,9 +787,40 @@ Process Description:
Through Prefect workflow orchestration, these tasks can be easily managed and scheduled to achieve complex laboratory automation task flows.
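
As a hedged sketch of what such orchestration can look like (the task names and station steps below are hypothetical, not taken from the course code), a simple Prefect flow might chain the steps in order:

```python
from prefect import flow, task


@task
def move_to_station(station: str):
    print(f"AGV moving to {station}")  # placeholder for the real AGV/ROS call


@task
def transfer_sample(sample_id: str):
    print(f"Arm transferring sample {sample_id}")  # placeholder for MyCobot control


@task
def log_result(sample_id: str, station: str):
    print(f"Sample {sample_id} processed at {station}")


@flow
def lab_automation_flow(sample_id: str = "S-001"):
    # Tasks run in sequence; Prefect records state and scheduling for each step
    move_to_station("dispensing")
    transfer_sample(sample_id)
    log_result(sample_id, "dispensing")


if __name__ == "__main__":
    lab_automation_flow()
```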
## 🚀 Quiz

::::{tab-set}
:sync-group: category

:::{tab-item} W 2024
:sync: w2024

:::

:::{tab-item} Sp/Su 2024
:sync: sp2024

https://q.utoronto.ca/courses/348619/assignments/1385146?display=full_width
:::

::::

## 📄 Assignment

::::{tab-set}
:sync-group: category

:::{tab-item} W 2024
:sync: w2024

:::

:::{tab-item} Sp/Su 2024
:sync: sp2024

:::

::::

For this assignment, you'll develop a more complex control system for a mobile robot using **ROS** and **Isaac Sim**. You'll implement the following:

1. **Control the TurtleBot3**: Create a ROS node that moves the robot in a complex path (e.g., a figure-eight pattern) while avoiding obstacles using LIDAR (a minimal starting sketch follows below).
2. **Simulate and Test in Isaac Sim**: Deploy the robot in Isaac Sim’s photorealistic environment and simulate tasks like object detection and path planning.
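
A minimal, hedged starting point for item 1 is sketched below; it assumes a standard TurtleBot3 bringup publishing `/scan` and listening on `/cmd_vel`, and the velocity and distance values are placeholders:

```python
import math

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan


class FigureEightDriver(Node):
    """Drives a rough figure-eight by periodically flipping the turn direction,
    and stops when the LIDAR reports an obstacle closer than a threshold."""

    def __init__(self):
        super().__init__("figure_eight_driver")
        self.cmd_pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.scan_sub = self.create_subscription(LaserScan, "/scan", self.on_scan, 10)
        self.timer = self.create_timer(0.1, self.on_timer)
        self.t = 0.0
        self.obstacle_close = False

    def on_scan(self, msg: LaserScan):
        valid = [r for r in msg.ranges if not math.isinf(r) and not math.isnan(r)]
        self.obstacle_close = bool(valid) and min(valid) < 0.4  # 0.4 m is an assumption

    def on_timer(self):
        cmd = Twist()
        if not self.obstacle_close:
            cmd.linear.x = 0.15
            # Flip the turn direction every ~10 s to trace the two lobes of the eight
            cmd.angular.z = 0.5 if (self.t % 20.0) < 10.0 else -0.5
        self.cmd_pub.publish(cmd)
        self.t += 0.1


def main():
    rclpy.init()
    node = FigureEightDriver()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```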
38 changes: 34 additions & 4 deletions docs/courses/robotics/3.5-computer-vision.md
@@ -30,8 +30,6 @@ OpenCV is cross-platform and supports multiple programming languages, including

#### Bill of Materials

- [OpenCV](https://pypi.org/project/opencv-python/) (for Python installation: `pip install opencv-python`)
- [AprilTag Python Library](https://pypi.org/project/apriltag/) (a minimal detection sketch follows this list)
- [Motorized Microscope]
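
A minimal detection sketch with the `apriltag` package listed above (the image path and tag family are assumptions, not values from the course code):

```python
import apriltag
import cv2

# Load a grayscale image of the workspace (path is illustrative)
image = cv2.imread("workspace.png", cv2.IMREAD_GRAYSCALE)

# tag36h11 is a commonly used family; match it to the tags you actually printed
detector = apriltag.Detector(apriltag.DetectorOptions(families="tag36h11"))
detections = detector.detect(image)

for det in detections:
    # Each detection carries the tag ID, the tag centre, and the four corner points
    print(f"tag_id={det.tag_id}, center={det.center}, corners={det.corners.tolist()}")
```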
@@ -245,11 +243,13 @@ In this section, we'll combine OpenCV image processing techniques with a motoriz

#### Bill of Materials

- [OpenFlexure Microscope](https://openflexure.org/)
The OpenFlexure Microscope is a customisable, open-source optical microscope that uses either very cheap webcam optics or lab-quality, RMS-threaded microscope objectives. It has an inverted geometry and a high-quality mechanical stage that can be motorised using low-cost geared stepper motors.

#### Demo

✅ Watch [Building the OpenFlexure Microscope](https://www.youtube.com/watch?v=aQEyoch3iuo&ab_channel=TinkerTechTrove)

This example code combines OpenCV image processing techniques with simulated microscope functionality to implement automated monitoring of microscopic particles. The main features include:
1. Image preprocessing: Using Gaussian blur and adaptive thresholding to enhance image quality.
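
A rough sketch of the preprocessing in step 1 (the file name, kernel size, and threshold parameters are illustrative assumptions, not the course code):

```python
import cv2

# Load a grayscale frame captured from the microscope camera (path is illustrative)
frame = cv2.imread("microscope_frame.png", cv2.IMREAD_GRAYSCALE)

# Gaussian blur suppresses sensor noise before thresholding
blurred = cv2.GaussianBlur(frame, (5, 5), 0)

# Adaptive thresholding handles uneven illumination across the field of view
binary = cv2.adaptiveThreshold(
    blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY_INV, 11, 2
)

# Contours give a rough count of candidate particles
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print(f"Detected {len(contours)} candidate particles")
```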
@@ -421,5 +421,35 @@ For more information on camera calibration, refer to the OpenCV documentation on

## 🚀 Quiz

::::{tab-set}
:sync-group: category

:::{tab-item} W 2024
:sync: w2024

:::

:::{tab-item} Sp/Su 2024
:sync: sp2024

https://q.utoronto.ca/courses/348619/assignments/1385147?display=full_width
:::

::::

## 📄 Assignment

::::{tab-set}
:sync-group: category

:::{tab-item} W 2024
:sync: w2024

:::

:::{tab-item} Sp/Su 2024
:sync: sp2024

:::

::::
178 changes: 171 additions & 7 deletions docs/courses/robotics/3.6-solid-sample-transfer.md
@@ -11,8 +11,8 @@
This module revolves around the "transfer of solid samples" and completes complex automation tasks through ROS, AprilTags, multi-axis robots, and a workflow orchestration platform. The goal is to learn how to coordinate different modules (including robot control, visual recognition, and workflow management) to complete complex solid sample transfer processes, such as grabbing a sample at one workstation and moving it to another for processing (for example, screwing on a bottle cap or dispensing a liquid). This is a multi-step automation process involving the coordinated operation of multiple hardware devices and software platforms.

### System Overview
- MyCobot 280 Pi: For precise sample handling and manipulation
- MyAGV (Automated Guided Vehicle): For transporting samples between stations
- Liquid Handler: For adding liquids to samples
- Powder Dispenser: For adding powders to samples
- AprilTags: For identifying and locating different stations and samples
@@ -36,6 +36,8 @@ This module revolves around "transfer of solid samples", and completes complex a
- [Multi-Axis Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot)
A robotic arm designed for executing precise sample movements across multiple axes.

- Printed AprilTags (can be generated and printed from [AprilTag Generation](https://github.com/AprilRobotics/apriltag-generation))

- [liquid handler]

- [powder dispenser]
@@ -48,8 +50,6 @@ This module revolves around "transfer of solid samples", and completes complex a
- [AprilTags Python Library](https://pypi.org/project/apriltag/)
Used for spatial referencing and tracking of sample containers using AprilTags for accurate positioning.

- [ROS2 Humble](https://docs.ros.org/en/humble/Installation.html)
ROS2 (Robot Operating System) is an open-source framework for building robot applications. **Humble Hawksbill** is the currently recommended version due to its stability and long-term support.

Expand Down Expand Up @@ -316,6 +316,82 @@ This tutorial provides a basic framework for automated solid sample transfer. Ho
- For larger-scale operations, consider implementing a multi-robot system with centralized control.
- Utilize a database to track sample information and processing history.

This example code snippet demonstrates basic CRUD (Create, Read, Update, Delete) operations using pymongo, the Python driver for MongoDB. Here's what each section does:

- Connection: We establish a connection to the MongoDB server. Replace 'mongodb://localhost:27017/' with your actual MongoDB connection string.
- Database and Collection Selection: We select the database and collection we want to work with. Replace 'your_database_name' and 'your_collection_name' with your actual database and collection names.
- Insert Operation: We insert a new document into the collection and print the ID of the inserted document.
- Query Operation: We search for a document in the collection based on a specific query and print the result.
- Update Operation: We update a document in the collection, changing the 'data' field, and print the number of modified documents.
- Delete Operation: We delete a document from the collection based on a query and print the number of deleted documents.
- Connection Closure: Finally, we close the connection to the MongoDB server.

This snippet provides a basic overview of working with MongoDB using pymongo. In a real-world scenario, you would typically include error handling and possibly use context managers for the database connection. Also, ensure that you have the necessary permissions to perform these operations on your MongoDB instance.

```python
from pymongo import MongoClient

# Connect to MongoDB
client = MongoClient(
    "mongodb://localhost:27017/"
)  # Replace with your MongoDB connection string

# Select database and collection
db = client["your_database_name"]
collection = db["your_collection_name"]

# Insert a document
document = {"course_id": "DEMO101", "data": "Some sample data"}
insert_result = collection.insert_one(document)
print(f"Inserted document ID: {insert_result.inserted_id}")

# Query a document
query = {"course_id": "DEMO101"}
result = collection.find_one(query)
print(f"Found document: {result}")

# Update a document
update_query = {"course_id": "DEMO101"}
new_values = {"$set": {"data": "Updated sample data"}}
update_result = collection.update_one(update_query, new_values)
print(f"Modified {update_result.modified_count} document(s)")

# Delete a document
delete_query = {"course_id": "DEMO101"}
delete_result = collection.delete_one(delete_query)
print(f"Deleted {delete_result.deleted_count} document(s)")

# Close the connection
client.close()
```

### Tips
When dealing with multiple robots, deciding which parts should be implemented in ROS2 and which in Prefect is an important consideration. Here are some factors to consider and recommendations:

**ROS2 (Robot Operating System 2):**

1. Real-time control and communication: ROS2 is better suited for handling the real-time control and low-latency communication needs of robots.
2. Robot-specific functionalities: Tasks such as navigation, motion planning, and sensor data processing should be implemented in ROS2.
3. Inter-robot collaboration: If multiple robots need to communicate directly or coordinate actions with each other, this is typically best handled within ROS2.
4. Hardware abstraction: ROS2 provides a good hardware abstraction layer, allowing the same code to run on different robot platforms.

**Prefect:**

1. Workflow orchestration: Prefect is more suitable for handling high-level task orchestration and workflow management.
2. Long-term task scheduling: For tasks that need to run for extended periods or execute periodically, Prefect is a better choice.
3. Data processing pipelines: If complex data processing or analysis workflows are involved, these can be implemented in Prefect.
4. Integration with external systems: Prefect may be more suitable for handling integration with databases, cloud services, or other external systems.
5. Error handling and retry mechanisms: Prefect provides robust error handling and task retry capabilities, suitable for long-running tasks that may fail.

**Recommendations for combined use:**

1. Use ROS2 to handle low-level control and real-time operations for each robot.
2. Use Prefect to manage high-level tasks and workflows across multiple robots.
3. Create a ROS2 node to act as a bridge between Prefect and the ROS2 system, handling command dispatching and status reporting (a minimal sketch follows this section).
4. Implement long-running data collection, analysis, and reporting tasks in Prefect.
5. Use Prefect to manage complex task sequences that require collaboration between multiple robots.

**Example scenario:** Consider an automated warehouse system with multiple robots:

- In ROS2: Implement navigation, item grasping, obstacle avoidance, etc., for each robot.
- In Prefect: Manage the overall workflow, such as task allocation, progress tracking, data collection, and report generation.

By using this approach, you can take full advantage of ROS2's real-time performance and robot control capabilities, while leveraging Prefect's workflow management and task orchestration features to handle higher-level system coordination.
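
A minimal sketch of the bridge idea from recommendation 3 above — the topic names and message format are assumptions, not part of any course-provided interface:

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class PrefectBridge(Node):
    """Relays high-level commands (e.g. issued by a Prefect task) to the robot
    stack and reports status back. Topic names here are illustrative only."""

    def __init__(self):
        super().__init__("prefect_bridge")
        # Commands arriving from the orchestration layer
        self.command_sub = self.create_subscription(
            String, "/orchestrator/command", self.on_command, 10
        )
        # Status reported back to the orchestration layer
        self.status_pub = self.create_publisher(String, "/orchestrator/status", 10)

    def on_command(self, msg: String):
        self.get_logger().info(f"Dispatching command: {msg.data}")
        # ... forward to navigation / manipulation nodes here ...
        status = String()
        status.data = f"accepted:{msg.data}"
        self.status_pub.publish(status)


def main():
    rclpy.init()
    node = PrefectBridge()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```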


### 2. Error Handling:
- Implement robust error handling and recovery mechanisms.
- Use sensors to detect collisions or unexpected obstacles.
@@ -383,6 +459,76 @@ By implementing these error handling mechanisms, we can create a more robust and

Similar error handling strategies can be applied to other stations in the automated lab, such as the powder dispenser or the robotic arm. Each component should have its own error checks, logging, and recovery mechanisms tailored to its specific operations and potential failure modes.
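
As a hedged illustration of such station-level checks (the dispenser interface, scale reading, tolerance, and retry counts below are hypothetical placeholders):

```python
import logging
import random
import time

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("powder_dispenser")


def read_scale_grams():
    # Placeholder for a real balance/scale reading
    return 100.0 + random.uniform(-3.0, 3.0)


def dispense_powder(target_mass_g, tolerance_g=1.0, max_retries=3):
    """Dispense powder and verify the result by weighing, retrying on failure."""
    for attempt in range(1, max_retries + 1):
        logger.info("Dispense attempt %d: target %.1f g", attempt, target_mass_g)
        # ... trigger the dispenser hardware here ...
        time.sleep(1)  # allow the dispense to finish (placeholder)
        measured = read_scale_grams()
        if abs(measured - target_mass_g) <= tolerance_g:
            logger.info("Dispensed %.1f g (within tolerance)", measured)
            return measured
        logger.warning("Out of tolerance (%.1f g); retrying", measured)
    raise RuntimeError(f"Powder dispensing failed after {max_retries} attempts")
```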

Here is another example of human-in-the-loop workflow orchestration using Prefect, demonstrating how to handle an error by prompting an operator to check it out:
```python
from prefect import task, flow
from prefect.tasks import task_input_hash
from datetime import timedelta
import time
import random


@task(cache_key_fn=task_input_hash, cache_expiration=timedelta(hours=1))
def robot_task(task_id):
    # Simulate a robot task that might fail
    if random.random() < 0.3:  # 30% chance of failure
        raise Exception(f"Robot task {task_id} failed")
    time.sleep(2)  # Simulate task duration
    return f"Task {task_id} completed successfully"


@task
def notify_operator(task_id):
    print(f"ALERT: Task {task_id} has failed. An operator needs to check the robot.")
    # In a real scenario, this could send an email, SMS, or trigger an alert system
    operator_response = input("Has the issue been resolved? (yes/no): ")
    return operator_response.lower() == "yes"


@flow
def robot_workflow():
    futures = [robot_task.submit(i) for i in range(5)]

    for i, future in enumerate(futures):
        try:
            result = future.result()
            print(result)
        except Exception as e:
            print(f"Error in task {i}: {str(e)}")
            resolved = False
            while not resolved:
                resolved = notify_operator(i)
                if not resolved:
                    print("Please resolve the issue before continuing.")
                    time.sleep(5)  # Wait before checking again

            # Retry the task after the operator has resolved the issue
            retry_result = robot_task(i)
            print(f"Retry result: {retry_result}")


if __name__ == "__main__":
    robot_workflow()
```
This example demonstrates:

- A `robot_task` that simulates a robot operation with a 30% chance of failure.
- A `notify_operator` task that alerts an operator when a task fails and waits for confirmation that the issue has been resolved.
- A `robot_workflow` that:
  - Submits multiple robot tasks
  - Checks the results of each task
  - If a task fails, notifies an operator and waits for confirmation before retrying the task

In a real-world scenario, you would replace the print statements and input function with actual notification systems (e.g., email, SMS, or integration with an alerting platform) and a more sophisticated method for the operator to confirm that the issue has been resolved (e.g., through a web interface or mobile app).

This workflow demonstrates how Prefect can be used to:
- Orchestrate multiple robot tasks
- Handle errors gracefully
- Incorporate human intervention when needed
- Retry tasks after issues have been resolved

This approach ensures that the workflow can continue even when unexpected issues occur, by bringing in human expertise to resolve problems that the automated system can't handle on its own.


### 3. Precision and Accuracy:
@@ -472,14 +618,32 @@ By considering these aspects, you can create a more robust, flexible, and scalab
::::{tab-set}
:sync-group: category

:::{tab-item} W 2024
:sync: w2024

:::

:::{tab-item} Sp/Su 2024
:sync: sp2024

https://q.utoronto.ca/courses/348619/assignments/1385149?display=full_width
:::

::::

---

## 📄 Assignment

::::{tab-set}
:sync-group: category

:::{tab-item} W 2024
:sync: w2024

:::

:::{tab-item} Sp/Su 2024
:sync: sp2024

:::

::::
