From d29755c093412340eff8479cdf9f77a85b3029a9 Mon Sep 17 00:00:00 2001 From: SissiFeng Date: Mon, 7 Oct 2024 15:42:10 -0400 Subject: [PATCH 1/3] Update Module 4 documentation --- .../software-dev/4.1-deep-dive-into-github.md | 216 ++++++++++++++ docs/courses/software-dev/4.2-vscode-setup.md | 187 +++++++++++++ .../software-dev/4.3-vscode-debugging.md | 213 ++++++++++++++ docs/courses/software-dev/4.4-unit-testing.md | 229 +++++++++++++++ .../software-dev/4.5-automated-docs.md | 224 +++++++++++++++ .../4.6-continuous-integration.md | 166 +++++++++++ .../software-dev/4.7-project-templates.md | 230 +++++++++++++++ docs/courses/software-dev/4.8-cloud-server.md | 223 +++++++++++++++ .../software-dev/4.9-cloud-simulations.md | 264 ++++++++++++++++++ 9 files changed, 1952 insertions(+) create mode 100644 docs/courses/software-dev/4.1-deep-dive-into-github.md create mode 100644 docs/courses/software-dev/4.2-vscode-setup.md create mode 100644 docs/courses/software-dev/4.3-vscode-debugging.md create mode 100644 docs/courses/software-dev/4.4-unit-testing.md create mode 100644 docs/courses/software-dev/4.5-automated-docs.md create mode 100644 docs/courses/software-dev/4.6-continuous-integration.md create mode 100644 docs/courses/software-dev/4.7-project-templates.md create mode 100644 docs/courses/software-dev/4.8-cloud-server.md create mode 100644 docs/courses/software-dev/4.9-cloud-simulations.md diff --git a/docs/courses/software-dev/4.1-deep-dive-into-github.md b/docs/courses/software-dev/4.1-deep-dive-into-github.md new file mode 100644 index 0000000..0b5484b --- /dev/null +++ b/docs/courses/software-dev/4.1-deep-dive-into-github.md @@ -0,0 +1,216 @@ +(4.1-github-deep-dive)= +# 🧩 4.1 GitHub Deep Dive Into Github + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn advanced GitHub features and collaborative workflows. By the end of this module, you'll be able to: + +1. Work with GitHub Issues +2. Create and manage Pull Requests +3. Collaborate effectively with other developers +4. Use branches for feature development +5. Resolve merge conflicts +6. Utilize GitHub for project management + +### GitHub Issues + +GitHub Issues are a great way to track tasks, enhancements, and bugs for your projects. + +#### Steps to create an issue: +1. Navigate to the main page of the repository. +2. Click on the "Issues" tab. +3. Click the "New issue" button. +4. Enter a title and description for your issue. +5. Assign labels, milestones, and assignees if needed. +6. Click "Submit new issue". + +#### Example: +```markdown +Title: Add user authentication feature + +Description: +We need to implement user authentication for our web application. This should include: +- User registration +- Login/Logout functionality +- Password reset option + +Please use JWT for token-based authentication. +``` + +**Documentation**: [About Issues](https://docs.github.com/en/issues/tracking-your-work-with-issues/about-issues) + +### Pull Requests + +Pull Requests (PRs) let you tell others about changes you've pushed to a branch in a repository on GitHub. + +#### Steps to create a Pull Request: +1. Create a new branch and make your changes. +2. Push the branch to GitHub. +3. Navigate to the main page of the repository. +4. Click "Pull requests" and then "New pull request". +5. Select your branch to compare with the base branch. +6. Enter a title and description for your PR. +7. Click "Create pull request". 
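
If you prefer to script this step, the same pull request can also be opened through GitHub's REST API. Below is a minimal sketch using the `requests` library; the repository name, token, and branch names are placeholders for your own values.

```python
import requests

# Placeholders -- substitute your own repository and a personal access token.
GITHUB_TOKEN = "ghp_your_token_here"
REPO = "your-username/your-repo"

response = requests.post(
    f"https://api.github.com/repos/{REPO}/pulls",
    headers={
        "Authorization": f"Bearer {GITHUB_TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    json={
        "title": "Add user authentication feature",
        "head": "feature/user-authentication",  # branch containing your changes
        "base": "main",  # branch you want to merge into
        "body": "Implements JWT-based user authentication.",
    },
    timeout=30,
)

response.raise_for_status()
print("Pull request created:", response.json()["html_url"])
```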
+ +#### Example PR Description: +```markdown +## Description +This PR implements user authentication using JWT. + +## Changes +- Added user registration endpoint +- Implemented login/logout functionality +- Created password reset feature + +## Testing +- All new endpoints have been tested manually +- Added unit tests for authentication functions + +Please review and let me know if any changes are needed. +``` + +**Documentation**: [About Pull Requests](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/about-pull-requests) + +### Collaborative Coding + +Collaborative coding on GitHub involves working with others on the same project, often simultaneously. + +#### Best Practices: +1. Communicate clearly in issues and pull requests. +2. Use descriptive commit messages. +3. Review others' code thoroughly and provide constructive feedback. +4. Keep pull requests focused and small when possible. + +#### Example of a Good Commit Message: +``` +Add password strength checker to registration form + +- Implement zxcvbn library for password strength calculation +- Display strength meter below password field +- Show suggestions for stronger passwords +``` + +**Video Tutorial**: [Collaborative Coding with GitHub](https://www.youtube.com/watch?v=MnUd31TvBoU) + +### Branches + +Branches allow you to develop features, fix bugs, or safely experiment with new ideas in a contained area of your repository. + +#### Steps to create a new branch: +1. Open your terminal. +2. Navigate to your repository. +3. Create and switch to a new branch: + +```bash +git checkout -b feature/user-authentication +``` + +4. Make your changes and commit them: + +```bash +git add . +git commit -m "Implement user registration" +``` + +5. Push the branch to GitHub: + +```bash +git push -u origin feature/user-authentication +``` + +**Documentation**: [About Branches](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/about-branches) + +### Merge Conflicts + +Merge conflicts occur when competing changes are made to the same line of a file, or when one person edits a file and another person deletes the same file. + +#### Steps to resolve a merge conflict: +1. Open the conflicting file in your text editor. +2. Look for the conflict markers: `<<<<<<<`, `=======`, and `>>>>>>>`. +3. Decide which changes to keep. +4. Remove the conflict markers. +5. Add your changes and commit: + +```bash +git add . +git commit -m "Resolve merge conflict in user authentication" +``` + +#### Example of a Merge Conflict: +```python +<<<<<<< HEAD +def authenticate_user(username, password): + # Implementation using bcrypt +======= +def authenticate_user(username, password): + # Implementation using argon2 +>>>>>>> feature/improve-authentication +``` + +**Video Tutorial**: [Resolving Merge Conflicts in GitHub](https://www.youtube.com/watch?v=xNVM5UxlFSA) + +### Project Management + +GitHub provides tools for project management, including project boards, milestones, and labels. + +#### Steps to create a new project: +1. On GitHub, navigate to the main page of your repository. +2. Click on the "Projects" tab. +3. Click "New project". +4. Choose a project template or start from scratch. +5. Name your project and add a description. +6. Click "Create project". + +#### Example Project Setup: +1. Create a new project named "Web Application Development". +2. Add columns: "To Do", "In Progress", "Review", and "Done". +3. 
Create issues for different features and add them to the "To Do" column. +4. Assign team members to issues and move them to "In Progress" when work begins. + +**Documentation**: [About Project Boards](https://docs.github.com/en/issues/organizing-your-work-with-project-boards/managing-project-boards/about-project-boards) + +### Additional Resources + +- [GitHub Skills](https://skills.github.com/) +- [GitHub Guides](https://guides.github.com/) +- [Pro Git Book](https://git-scm.com/book/en/v2) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393622?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393619?display=full_width +::: + +:::: \ No newline at end of file diff --git a/docs/courses/software-dev/4.2-vscode-setup.md b/docs/courses/software-dev/4.2-vscode-setup.md new file mode 100644 index 0000000..4161e29 --- /dev/null +++ b/docs/courses/software-dev/4.2-vscode-setup.md @@ -0,0 +1,187 @@ + +(4.2-setup-vscode)= +# 🧩 4.2 Setting Up VS Code + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn how to set up VS Code and optimize it for Python development using tools like Miniconda, various extensions, and advanced features. + +1. Set up VS Code +2. Install Miniconda for environment management +3. Install key VS Code extensions for Python development +4. Configure SSH for remote development +5. Explore advanced tools such as Black formatter, Pylance, and GitHub Copilot Chat + +### Setting Up VS Code + +First, you will download and install Visual Studio Code (VS Code), a lightweight and powerful editor. Then, you'll learn how to configure it for efficient Python development. + +#### Steps: + +1. Download and install VS Code from the official site: [VS Code Download](https://code.visualstudio.com/). +2. Open VS Code and get familiar with the interface, including the command palette, sidebar, and settings menu. + +**Video Tutorial**: [Getting Started with VS Code](https://www.youtube.com/watch?v=VqCgcpAypFQ) + +### Installing Miniconda + +Miniconda is a minimal installer for Conda, which is an open-source package management system and environment management system. + +1. Visit the [Miniconda download page](https://docs.conda.io/en/latest/miniconda.html). +2. Download the appropriate installer for your operating system. +3. Run the installer and follow the prompts. +4. Verify the installation by opening a new terminal window and running: + ``` + conda --version + ``` + +**Video Tutorial**: [How to Install Miniconda](https://www.youtube.com/watch?v=oHHbsMfyNR4&pp=ygUYSG93IHRvIEluc3RhbGwgTWluaWNvbmRh) + +#### Steps: + +1. Download Miniconda from the [official site](https://docs.conda.io/en/latest/miniconda.html). +2. Install Miniconda by following the instructions for your operating system. +3. After installation, create a new Python environment: + ```bash + conda create -n myenv python=3.9 + conda activate myenv + ``` + +4. Ensure that VS Code is set up to use this environment for Python development. + +For detailed instructions on using Conda environments with VS Code, refer to the [official VS Code documentation on Python environments](https://code.visualstudio.com/docs/python/environments). 
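
A quick way to confirm that VS Code is really using the new environment is to run a tiny script (or the integrated terminal) and inspect the active interpreter:

```python
import sys

# This path should point inside your "myenv" conda environment
# (e.g., .../miniconda3/envs/myenv/...) if the interpreter was selected correctly.
print(sys.executable)
print(sys.version)
```

If it points at the base interpreter instead, reselect the environment with the "Python: Select Interpreter" command from the Command Palette.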
+ +**Video Tutorial**: [How to Install Miniconda](https://www.youtube.com/watch?v=oHHbsMfyNR4&pp=ygUYSG93IHRvIEluc3RhbGwgTWluaWNvbmRh) + +### Installing Essential VS Code Extensions + +Extensions can enhance the functionality of VS Code, especially for Python development. Below are some essential extensions to install: + +- [Python Extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python): Adds Python language support. +- [Pylance](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance): A high-performance language server with type checking and autocompletion. +- [Black Formatter](https://marketplace.visualstudio.com/items?itemName=ms-python.black-formatter): Automatically formats your Python code. +- [autoDocstring](https://marketplace.visualstudio.com/items?itemName=njpwerner.autodocstring): Generates docstrings for your Python functions and methods. +- [GitHub Copilot](https://marketplace.visualstudio.com/items?itemName=GitHub.copilot): An AI-powered coding assistant. +- [Remote - SSH](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-ssh): For remote development using SSH. + +To install these extensions: +1. Open the **Extensions** view in VS Code (`Ctrl + Shift + X`). +2. Search for each extension by name and click **Install**. + +**Video Tutorial**: [Best VS Code Extensions for Python](https://www.youtube.com/watch?v=fj2tuTIcUys&pp=ygUiQmVzdCBWUyBDb2RlIEV4dGVuc2lvbnMgZm9yIFB5dGhvbg%3D%3D) + +### Using SSH for Remote Development + +VS Code supports remote development, allowing you to write and run code on a remote machine as if it were local. + +Usecase: Raspberry Pi 5 ssh with VS Code extension + +This setup allows you to develop directly on your Raspberry Pi 5 from your main computer, combining the power of VS Code with the versatility of the Raspberry Pi. + +Steps: +1. Ensure your Raspberry Pi 5 is set up and connected to your network. +2. Install the "Remote - SSH" extension in VS Code. +3. Open the Command Palette (Ctrl+Shift+P) and select "Remote-SSH: Connect to Host..." +4. Enter the SSH connection string: `pi@raspberrypi.local` (or your Pi's IP address) +5. Select the platform of the remote host (Linux) +6. Enter your password when prompted + +Once connected, you can create a new Python file on the Pi and start coding. Here's an example: + +```python +# File: /home/pi/hello_pi.py + +import RPi.GPIO as GPIO +import time + +# Set up GPIO mode +GPIO.setmode(GPIO.BCM) + +# Set up pin 18 as an output +GPIO.setup(18, GPIO.OUT) + +# Blink LED connected to pin 18 +try: + while True: + GPIO.output(18, GPIO.HIGH) # Turn on + time.sleep(1) + GPIO.output(18, GPIO.LOW) # Turn off + time.sleep(1) +except KeyboardInterrupt: + GPIO.cleanup() # Clean up on exit +``` + +This script will blink an LED connected to GPIO pin 18 on your Raspberry Pi 5. You can edit, run, and debug this code directly from VS Code on your main computer. + + +**Video Tutorial**: [VS Code Remote Development with SSH](https://www.youtube.com/watch?v=miyD4c1dnTU&pp=ygUjVlMgQ29kZSBSZW1vdGUgRGV2ZWxvcG1lbnQgd2l0aCBTU0g%3D) + +### Configuring Black Formatter + +Black is an opinionated Python code formatter that ensures code is formatted consistently. Here’s how to configure it in VS Code: + +1. Open the VS Code **Settings** (`Ctrl + ,`). +2. Search for `Python Formatting Provider` and set it to `Black`. +3. You can also configure VS Code to format your code on save by enabling the "Format on Save" option in settings. 
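
To get a feel for what Black changes, here is a small before/after sketch (the function is just a stand-in). With "Format on Save" enabled, the rewrite happens automatically every time you save:

```python
def mix(a, b, scale=1.0):
    return (a + b) * scale


# Hand-written call with inconsistent spacing:
#   value=mix( 1,2,scale = 0.5 )

# What Black produces on save -- normalized spacing around commas and keyword arguments:
value = mix(1, 2, scale=0.5)
print(value)
```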
+ +**Video Tutorial**: [How to Use Black in VS Code](https://www.youtube.com/watch?v=esZLCuWs_2Y) + +### GitHub Copilot Chat + +GitHub Copilot can assist you by suggesting code snippets, entire functions, and even refactoring your code. With **GitHub Copilot Chat**, you can interact with Copilot via a chat interface to get suggestions and resolve issues. + +#### Steps: + +1. Install **GitHub Copilot** and **GitHub Copilot Chat** extensions. +2. After installation, sign in to your GitHub account in VS Code. +3. Open the Copilot Chat interface and ask for help with coding, debugging, or writing functions. + +**Video Tutorial**: [GitHub Copilot Chat](https://www.youtube.com/watch?v=a2DDYMEPwbE&pp=ygUTR2l0SHViIENvcGlsb3QgQ2hhdA%3D%3D) + +### Additional Resources + +- [VS Code Documentation](https://code.visualstudio.com/docs) +- [Conda Environments Guide](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html) +- [Setting up VS Code for Python](https://code.visualstudio.com/docs/python/python-tutorial) +- [Black Formatter Documentation](https://black.readthedocs.io/en/stable/) +- [GitHub Copilot Documentation](https://docs.github.com/en/copilot) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393623?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394226?display=full_width +::: + +:::: \ No newline at end of file diff --git a/docs/courses/software-dev/4.3-vscode-debugging.md b/docs/courses/software-dev/4.3-vscode-debugging.md new file mode 100644 index 0000000..3a8ec15 --- /dev/null +++ b/docs/courses/software-dev/4.3-vscode-debugging.md @@ -0,0 +1,213 @@ + +(4.3-debugging-vscode)= +# 🧩 4.3 Debugging in VS Code + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn how to debug Python code using VS Code's built-in debugging tools. By the end of this module, you'll be able to: + +1. Use print statements to debug code +2. Set breakpoints to pause code execution at specific lines +3. Inspect variables to check their values during execution +4. Step through code line by line +5. Use the debug console for live interaction with your running code +6. Configure custom debug settings in VS Code + +### Debugging with Print Statements + +The simplest form of debugging is using print statements to display the values of variables at different stages of execution. This method is great for tracking the flow of the program and spotting where things might be going wrong. + +#### Steps: +1. Add `print()` statements throughout your code to output the values of key variables. +2. Run your Python file and check the terminal output for these printed messages. 
+ +#### Example: +```python +def calculate_area(radius): + area = 3.14 * radius**2 + print(f"Calculated area: {area}") # Debugging with a print statement + return area + + +radius = 5 +print(f"Radius: {radius}") # Print the value of radius +area = calculate_area(radius) +print(f"Final area: {area}") # Print the final calculated area +``` + +**Video Tutorial**: [Debugging Python with Print Statements](https://www.youtube.com/watch?v=GbyXP3_7SBg) + +### Setting Breakpoints + +Breakpoints allow you to pause the execution of your code at specific lines so that you can inspect the state of the program and variables at that moment. Unlike print statements, breakpoints provide more flexibility and control during debugging. + +#### Steps: +1. Open your Python script in VS Code. +2. Click in the margin next to the line number where you want to set a breakpoint. A red dot will appear to indicate the breakpoint. +3. Run the debugger by pressing `F5`, and the code will pause when it hits the breakpoint. + +#### Example: +```python +def calculate_area(radius): + area = 3.14 * radius**2 + return area + + +radius = 5 +area = calculate_area(radius) # Set a breakpoint on this line +print(f"Area: {area}") +``` + +**Video Tutorial**: [VS Code Breakpoints Tutorial](https://www.youtube.com/watch?v=cZhMgXgKQdI) + +### Inspecting Variables + +Once a breakpoint is hit, you can inspect the values of variables at that specific point in time. This helps in understanding the current state of the program and diagnosing any issues. + +#### Steps: +1. After the code pauses at a breakpoint, hover over any variable in the editor to see its current value. +2. Alternatively, use the **Variables** section in the Debug Sidebar to see the values of all local and global variables. + +#### Example: +```python +def calculate_area(radius): + area = 3.14 * radius**2 + return area + + +radius = 10 # Inspect the value of radius here +area = calculate_area(radius) +``` + +**Video Tutorial**: [Inspecting Variables in VS Code](https://www.youtube.com/watch?v=qw--VYLpxG4) + +### Stepping Through Code + +Stepping allows you to move through your code line by line, giving you control over the pace of execution. You can "step into" functions, "step over" them, or "step out" of them. + +#### Steps: +1. Once the code hits a breakpoint, use the toolbar buttons to: + - **Step Over** (`F10`): Skip over a function call without entering it. + - **Step Into** (`F11`): Enter into a function to debug it. + - **Step Out** (`Shift + F11`): Exit the current function and return to the caller. + +#### Example: +```python +def multiply(a, b): + return a * b + + +def main(): + x = 5 + y = 10 + result = multiply(x, y) # Step into this function + print(f"Result: {result}") + + +main() +``` + +**Video Tutorial**: [Stepping Through Code in VS Code](https://www.youtube.com/watch?v=E8dGNupbI4U) + +### Using the Debug Console + +The debug console allows you to interact with your code while it’s paused. You can execute commands, print variable values, or run functions in real-time. + +#### Steps: +1. While the code is paused at a breakpoint, open the **Debug Console** from the bottom panel. +2. In the console, type in variable names to check their values or run Python expressions to see the output. 
+ +#### Example: +```python +def divide(a, b): + return a / b + + +x = 50 +y = 0 # Potential division by zero error +result = divide( + x, y +) # Set a breakpoint here and use the debug console to check variable values +``` + +**Video Tutorial**: [Using the Debug Console in VS Code](https://www.youtube.com/watch?v=-QRyPL5qupU) + +### Debug Configurations + +You can create custom debug configurations in VS Code to define how your program should be debugged. These configurations are defined in a `launch.json` file. + +#### Steps: +1. Open the Command Palette (`Ctrl + Shift + P`) and search for **"Debug: Open launch.json"**. +2. Select the Python environment. +3. Customize the configuration based on your project needs. + +#### Example Configuration (`launch.json`): + +```json +{ + "version": "0.2.0", + "configurations": [ + { + "name": "Python: Current File", + "type": "python", + "request": "launch", + "program": "${file}", + "console": "integratedTerminal", + "args": ["--verbose"], + "env": {"DEBUG": "true"}, + } + ], +} +``` + + + + +**Video Tutorial**: [Setting Up Debug Configurations in VS Code](https://www.youtube.com/watch?v=gLNIRwX3oM4) + +### Additional Resources + +- [VS Code Debugging Documentation](https://code.visualstudio.com/docs/editor/debugging) +- [Python Debugging in VS Code](https://code.visualstudio.com/docs/python/debugging) +- [Advanced Debugging in VS Code](https://www.youtube.com/watch?v=2oFKNL7vYV8) + + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393624?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394227?display=full_width +::: + +:::: \ No newline at end of file diff --git a/docs/courses/software-dev/4.4-unit-testing.md b/docs/courses/software-dev/4.4-unit-testing.md new file mode 100644 index 0000000..798d4c5 --- /dev/null +++ b/docs/courses/software-dev/4.4-unit-testing.md @@ -0,0 +1,229 @@ + +(4.4-unit-testing)= +# 🧩 4.4 Unit Testing + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +Unit testing is a crucial skill for software developers, helping ensure code quality and reliability. This module introduces you to unit testing in Python using the pytest framework, and explores the concept of test-driven development (TDD). + +In this module, you will learn the fundamentals of unit testing in Python using the **pytest** framework. You will also explore test-driven development (TDD) and how it can be applied to real-world projects. + +1. Explain the purpose of unit tests +2. Write unit tests for the light-mixing demo +3. Run and interpret unit tests to fix code +4. Explain test-driven development (TDD) + +### Purpose of Unit Tests + +Unit tests are small, isolated tests that validate the functionality of a specific section (unit) of your code, such as a function or a class. They help ensure that individual components of your program behave as expected, making debugging easier and reducing the chances of introducing bugs when making changes. + +#### Key Benefits: +- Validate the correctness of your code. +- Catch issues early in the development cycle. +- Provide a safety net for refactoring. +- Improve code quality and maintainability. 
+ +**Example:** +```python +def add(a, b): + return a + b +``` + +A unit test for this function would check whether it correctly adds two numbers: +```python +def test_add(): + assert add(2, 3) == 5 +``` + +**Video Tutorial**: [What is Unit Testing?](https://www.youtube.com/watch?v=1Lfv5tUGsn8) + +### Writing Unit Tests with pytest + +**pytest** is a popular Python testing framework that simplifies the process of writing and running tests. It automatically discovers test files and functions, and provides a clean, easy-to-use syntax. + +#### Steps: +1. Install **pytest** using pip: + ```bash + pip install pytest + ``` +2. Create a test file (e.g., `test_light_mixing.py`) and define your unit tests inside it. +3. Use the `assert` statement to check whether the function outputs match the expected results. + +#### Example Unit Test for the Light-Mixing Demo: +```python +# light_mixing.py +def mix_colors(color1, color2): + if color1 == "red" and color2 == "blue": + return "purple" + elif color1 == "blue" and color2 == "yellow": + return "green" + elif color1 == "red" and color2 == "yellow": + return "orange" + else: + return "unknown" + +# test_light_mixing.py +def test_mix_colors(): + assert mix_colors("red", "blue") == "purple" + assert mix_colors("blue", "yellow") == "green" + assert mix_colors("red", "yellow") == "orange" +``` + +#### Running Tests: +1. Navigate to the folder containing your test file. +2. Run pytest from the command line: + ```bash + pytest + ``` + +pytest will automatically discover all files starting with `test_` and execute the tests. + +#### Naming Conventions: +- Test files should be named `test_*.py` or `*_test.py` +- Test functions should start with `test_` +- Test classes should start with `Test` + +These naming conventions help pytest automatically discover your tests. + +**Video Tutorial**: [Introduction to pytest](https://www.youtube.com/watch?v=Kt6QqGoAlvI&ab_channel=teclado) + +### Interpreting Test Results + +When you run pytest, it will display the results of your tests in the terminal. Here's how to interpret the output: + +- **Green (PASSED)**: The test passed successfully. +- **Red (FAILED)**: The test failed. pytest will show you the expected result and the actual result so you can identify the issue. +- **Yellow (SKIPPED)**: The test was skipped (typically because of a specific condition, such as platform dependency). + +#### Example Output: +```bash +============================= test session starts ============================== +collected 3 items + +test_light_mixing.py ... [100%] + +============================== 3 passed in 0.03s =============================== +``` + +If a test fails, pytest will provide a detailed report: +```bash +def test_mix_colors(): +> assert mix_colors("blue", "yellow") == "purple" +E AssertionError: assert 'green' == 'purple' + +test_light_mixing.py:5: AssertionError +``` + +In this case, you can see that the test expected "purple" but received "green," indicating an issue with the `mix_colors()` function. + +#### Code Coverage: +pytest can be used with the `pytest-cov` plugin to measure code coverage, which indicates how much of your code is being tested: + +```bash +pip install pytest-cov +pytest --cov=myproject tests/ +``` + +This will show you the percentage of your code that is covered by tests, helping you identify areas that need more testing. 
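
One straightforward way to broaden coverage is to parametrize a single test over many input combinations. Here is a sketch that assumes the `mix_colors()` function from `light_mixing.py` above; the last case simply exercises the fallback branch:

```python
import pytest

from light_mixing import mix_colors


@pytest.mark.parametrize(
    "color1, color2, expected",
    [
        ("red", "blue", "purple"),
        ("blue", "yellow", "green"),
        ("red", "yellow", "orange"),
        ("green", "green", "unknown"),  # falls through to the default branch
    ],
)
def test_mix_colors_parametrized(color1, color2, expected):
    assert mix_colors(color1, color2) == expected
```

Each tuple runs as a separate test case, so pytest reports exactly which combination failed.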
+ +**Video Tutorial**: [Understanding pytest Output](https://www.youtube.com/watch?v=dN-pVt7i4Us&ab_channel=anthonywritescode) + +### Debugging with pytest + +When a test fails, you can use pytest's built-in debugging capabilities to help identify the issue. The `--pdb` flag can be used to drop into the Python debugger when a test fails, allowing you to inspect variables and step through the code. + +#### Steps: +1. Run pytest with the `--pdb` flag: + ```bash + pytest --pdb + ``` +2. When a test fails, pytest will drop into an interactive debugging session where you can inspect variables and explore the state of your program. + +#### Example: +```bash +(Pdb) print(color1) +'blue' +(Pdb) print(color2) +'yellow' +``` + +**Video Tutorial**: [Debugging with pytest](https://www.youtube.com/watch?v=by9ZU7h1cgk&ab_channel=SuperEngineer) + +### Test-Driven Development (TDD) + +**Test-driven development (TDD)** is a software development approach where you write tests before writing the actual code. This ensures that the code you write is directly influenced by the tests, leading to a more robust and bug-free implementation. + +#### TDD Workflow: +1. **Write a Test**: Begin by writing a test for the new functionality you want to implement. +2. **Run the Test**: Since the functionality doesn't exist yet, the test will fail. +3. **Write Code**: Write just enough code to make the test pass. +4. **Run the Tests Again**: Ensure all tests pass. If any tests fail, fix the code until they pass. +5. **Refactor**: Clean up the code while ensuring that the tests continue to pass. + +#### Example of TDD for Light Mixing: +1. **Write the Test**: + ```python + def test_mix_colors(): + assert mix_colors("red", "blue") == "purple" + ``` +2. **Run the Test**: It will fail because the `mix_colors()` function doesn’t exist yet. +3. **Write the Function**: + ```python + def mix_colors(color1, color2): + if color1 == "red" and color2 == "blue": + return "purple" + return "unknown" + ``` +4. **Run the Test Again**: The test should now pass. +5. **Refactor the Code**: Improve the `mix_colors()` function while ensuring the test still passes. 
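
As an illustration of step 5, one possible refactor replaces the growing `if`/`elif` chain with a lookup table. The behavior is unchanged, so the test written in step 1 still passes:

```python
def mix_colors(color1, color2):
    """Return the color produced by mixing two input colors."""
    combinations = {
        ("red", "blue"): "purple",
        ("blue", "yellow"): "green",
        ("red", "yellow"): "orange",
    }
    return combinations.get((color1, color2), "unknown")
```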
+ +**Video Tutorial**: [Test-Driven Development with pytest](https://www.youtube.com/watch?v=uEFrE6cgVNY) + +### Additional Resources + +- [pytest Documentation](https://docs.pytest.org/en/6.2.x/) +- [Unit Testing in Python](https://realpython.com/python-testing/) +- [Test-Driven Development Explained](https://martinfowler.com/bliki/TestDrivenDevelopment.html) +- [Python Testing with pytest: Simple, Rapid, Effective, and Scalable](https://pragprog.com/titles/bopytest/python-testing-with-pytest/) (book) +- [Effective Python Testing With Pytest](https://realpython.com/pytest-python-testing/) (in-depth tutorial) +- [Property-Based Testing With Hypothesis and Pytest](https://semaphoreci.com/community/tutorials/property-based-testing-with-hypothesis-and-pytest) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393625?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394230?display=full_width +::: + +:::: diff --git a/docs/courses/software-dev/4.5-automated-docs.md b/docs/courses/software-dev/4.5-automated-docs.md new file mode 100644 index 0000000..b01ac0e --- /dev/null +++ b/docs/courses/software-dev/4.5-automated-docs.md @@ -0,0 +1,224 @@ + +(4.5-automated-documentation)= +# 🧩 4.5 Automated Documentation + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn how to automate documentation creation for your Python projects using **Sphinx** and **Readthedocs**. You will also explore writing docstrings, generating documentation in **Markdown**, and understanding the concept of **documentation as code**. + +Learning Objectives: +1. Write documentation in Markdown +2. Explain what "documentation as code" means +3. Write a docstring for a Python function +4. Set up a Readthedocs account and publish a Readthedocs page + +### Writing Documentation in Markdown + +Markdown is a lightweight markup language that uses plain text formatting to create formatted documents. It is widely used for writing documentation because of its simplicity and ease of use. + +#### Why Markdown? +- **Simplicity**: Markdown files are easy to read and write without the need for complex syntax. +- **Compatibility**: Markdown is supported by many platforms (e.g., GitHub, Readthedocs). +- **Efficiency**: Markdown allows you to focus on content without worrying about formatting. + +#### Basic Markdown Syntax: +- **Headings**: Use `#` for headings. +- **Lists**: Use `-` for unordered lists or `1.` for ordered lists. +- **Code blocks**: Enclose code in triple backticks. +- **Links**: Create links with `[Link text](URL)`. + +#### Example Markdown Document: +```markdown +# Project Title + +## Description +This is a brief description of the project. + +## Installation +To install the project, run the following command: +```bash +pip install -r requirements.txt +``` + + +## Usage +Run the following command to start the application: +```bash +python main.py +``` + +[More about Markdown](https://www.markdownguide.org) + + +Video Tutorial: [How to Write Markdown](https://www.youtube.com/watch?v=HUBNt18RFbo) + +### What Documentation as Code Means + +Documentation as code is a practice where documentation is written, version-controlled, and maintained in the same way as code. 
This approach encourages continuous updates, collaboration, and automation of the documentation process. + +#### Key Points: +- Version Control: Documentation can be versioned along with code in Git. +- Automation: Tools like **Sphinx** and **Readthedocs** can automatically generate documentation from the code. +- Consistency: Since the documentation is written close to the code, it is easier to keep both in sync. + +By treating documentation like code, it becomes part of the development workflow, allowing it to evolve as the project grows. + +Video Tutorial: [Documentation as Code Explained](https://www.youtube.com/watch?v=XU5xt1kBuyI&pp=ygUfRG9jdW1lbnRhdGlvbiBhcyBDb2RlIEV4cGxhaW5lZA%3D%3D) + +### Writing Python Docstrings + +Docstrings are comments within Python functions, classes, or modules that describe their behavior. Tools like **Sphinx** can extract these docstrings and generate API documentation from them. + +#### Example of a Function Docstring: +```python +def multiply(a: int, b: int) -> int: + """ + Multiplies two numbers and returns the result. + + Parameters: + a (int): The first number. + b (int): The second number. + + Returns: + int: The result of multiplying a and b. + """ + return a * b +``` + +#### Steps for Writing Good Docstrings: +1. Start with a summary: Begin with a one-line summary of the function. +2. Parameters section: Describe each argument, including its type and purpose. +3. Returns section: Indicate what the function returns, including the type and expected value. + +### More Docstring Examples + +#### Example for a class: +```python +class Calculator: + """ + A simple calculator class that supports addition and subtraction. + """ + + def add(self, a: int, b: int) -> int: + """ + Adds two numbers. + + Parameters: + a (int): The first number. + b (int): The second number. + + Returns: + int: The sum of the two numbers. + """ + return a + b +``` + +Video Tutorial: [Writing Python Docstrings](https://www.youtube.com/watch?v=QZhANCk5OXc&pp=ygUZV3JpdGluZyBQeXRob24gRG9jc3RyaW5ncw%3D%3D) + +### Setting Up Sphinx for Automated Documentation + +Sphinx is a documentation generator that converts reStructuredText or Markdown files into various output formats (HTML, PDF, etc.). It can extract Python docstrings to create clean, organized API documentation. + +#### Steps to Set Up Sphinx: + +1. Install Sphinx: + ```bash + pip install sphinx + ``` +2. Initialize Sphinx Project: + Run `sphinx-quickstart` in your project folder and follow the prompts to set up the basic structure: + ```bash + sphinx-quickstart + ``` + This will generate the necessary configuration files. + +3. Configure Sphinx: + - Modify the `conf.py` file to include the source directory where your Python files are located. + - Example modification: + ```python + import os + import sys + sys.path.insert(0, os.path.abspath('../src')) + ``` + +4. Add Extensions: + To use Markdown files or automate docstring extraction, add extensions in `conf.py`: + ```python + extensions = ['sphinx.ext.autodoc', 'myst_parser'] + ``` + +5. Build the Documentation: + After making changes, run: + ```bash + make html + ``` + This command will generate HTML documentation in the `_build/html/` directory. + +Video Tutorial: [Sphinx Documentation Setup](https://www.youtube.com/watch?v=BWIrhgCAae0&pp=ygUaU3BoaW54IERvY3VtZW50YXRpb24gU2V0dXA%3D) + +### Publishing Documentation with Readthedocs + +Readthedocs is a popular platform for hosting and automatically building documentation from your repository. 
It integrates seamlessly with Sphinx and GitHub. + +#### Steps to Set Up and Publish on Readthedocs: +1. Create an Account: Go to [Readthedocs](https://readthedocs.org/) and sign up for an account. +2. Connect Your Repository: After signing in, connect your GitHub repository with Readthedocs. +3. Add a `docs/` folder: + Ensure that your project contains a `docs/` folder with the Sphinx configuration files (`conf.py`, `index.rst` or `index.md`). + +4. Set Up Readthedocs: + - In your Readthedocs dashboard, import the project by selecting your GitHub repository. + - Follow the prompts to configure your build settings (e.g., Python version, Sphinx builder). + +5. Build the Documentation: + - Click "Build" in Readthedocs to generate the documentation. + - Once the build is complete, your documentation will be available at a public URL. + +Video Tutorial: [Host your documentation on ReadTheDocs](https://www.youtube.com/watch?v=PO4Zd-6a6fA&pp=ygULUmVhZHRoZWRvY3M%3D) + +### Additional Resources + +- [Sphinx Documentation](https://www.sphinx-doc.org/en/master/) +- [Readthedocs Documentation](https://docs.readthedocs.io/en/stable/) +- [Markdown Guide](https://www.markdownguide.org/) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393627?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394233?display=full_width +::: + +:::: + diff --git a/docs/courses/software-dev/4.6-continuous-integration.md b/docs/courses/software-dev/4.6-continuous-integration.md new file mode 100644 index 0000000..1fbca97 --- /dev/null +++ b/docs/courses/software-dev/4.6-continuous-integration.md @@ -0,0 +1,166 @@ + +(4.6-continuous-integration)= +# 🧩 4.6 Continuous Integration + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn about **Continuous Integration (CI)** and how to use **GitHub Actions** to automate tasks like running unit tests and building documentation. By the end of this module, you'll be able to: + +1. Explain the purpose of continuous integration +2. Set up a GitHub Actions workflow +3. Run unit tests and documentation builds on GitHub Actions + +### What is Continuous Integration? + +Continuous Integration (CI) is a software development practice where code changes are automatically tested and integrated into a shared repository frequently. This practice helps catch issues early, improve code quality, and ensure the integrity of the project. + +#### Purpose of Continuous Integration: +- **Early Detection of Errors**: Running automated tests ensures that new changes don't introduce bugs. +- **Consistent Code Quality**: CI tools run linters and other static analysis tools to enforce code quality standards. +- **Faster Development**: Automating testing and integration allows developers to focus on writing code rather than manually running tests. +- **Team Collaboration**: By frequently integrating code, CI reduces the chances of conflicts when multiple team members are working on the same codebase. 
+ +Video Tutorial: [Continuous Integration Explained](https://www.youtube.com/watch?v=1er2cjUq1UI) + +### Setting Up a GitHub Actions Workflow + +**GitHub Actions** is a powerful CI/CD tool that allows you to automate workflows directly in your GitHub repository. You can use it to run tests, lint code, build documentation, and more every time new code is pushed to the repository. + +#### Steps to Set Up a GitHub Actions Workflow: + +1. **Create a Workflow File**: + In your GitHub repository, create a `.github/workflows/` directory and add a YAML file for your workflow. For example, create a `ci.yml` file. + +2. **Define Workflow**: + In the `ci.yml` file, define the steps to run the CI process. Below is an example of a workflow that runs unit tests and builds documentation. + +```yaml +name: CI Workflow + +on: + push: + branches: + - main + pull_request: + branches: + - main + +jobs: + build: + runs-on: ubuntu-latest + + steps: + - name: Checkout code + uses: actions/checkout@v2 + + - name: Set up Python + uses: actions/setup-python@v2 + with: + python-version: '3.8' + + - name: Install dependencies + run: | + python -m pip install --upgrade pip + pip install -r requirements.txt + + - name: Run unit tests + run: | + pytest + + - name: Build documentation + run: | + cd docs + make html +``` + +This workflow will: +- Trigger on `push` or `pull request` events targeting the `main` branch. +- Set up Python 3.8, install dependencies, run unit tests, and build the documentation. + +3. **Push to GitHub**: + Once the workflow file is set up, push the changes to your GitHub repository. GitHub will automatically trigger the workflow. + +4. **Monitor Workflow Execution**: + Go to the "Actions" tab in your GitHub repository to see the workflow running and check its results. + +Video Tutorial: [Getting Started with GitHub Actions](https://www.youtube.com/watch?v=R8_veQiYBjI) + +### Running Unit Tests and Documentation Builds on GitHub Actions + +The CI process often includes running automated tests and building documentation to ensure that everything works as expected after each code change. + +#### Running Unit Tests on GitHub Actions: +1. **Install Testing Framework**: + Make sure that `pytest` or any other testing framework you are using is included in your `requirements.txt` file. + +2. **Define Test Step**: + In the workflow, include a step to run the tests: + ```yaml + - name: Run unit tests + run: | + pytest + ``` + +#### Building Documentation on GitHub Actions: +1. **Install Sphinx**: + Ensure that `Sphinx` and its dependencies are included in your `requirements.txt` file. + +2. **Add Documentation Build Step**: + In the workflow, add a step to build the documentation: + ```yaml + - name: Build documentation + run: | + cd docs + make html + ``` + +This step will navigate to the `docs` folder and run the `make html` command to generate the HTML documentation. 
+ +### Additional Resources + +- [GitHub Actions Documentation](https://docs.github.com/en/actions) +- [Continuous Integration Overview](https://www.atlassian.com/continuous-delivery/continuous-integration) +- [Pytest Documentation](https://docs.pytest.org/en/stable/) +- [Sphinx Documentation](https://www.sphinx-doc.org/en/master/) +- [CI/CD Best Practices](https://www.redhat.com/en/topics/devops/what-is-ci-cd) +- [GitHub Actions for Python Applications](https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393628?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394236?display=full_width +::: + +:::: \ No newline at end of file diff --git a/docs/courses/software-dev/4.7-project-templates.md b/docs/courses/software-dev/4.7-project-templates.md new file mode 100644 index 0000000..a076eee --- /dev/null +++ b/docs/courses/software-dev/4.7-project-templates.md @@ -0,0 +1,230 @@ + +(4.7-project-templates)= +# 🧩 4.7 Project Templates + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn how to use **PyScaffold** and **Cookiecutter** to create project templates, manage dependencies, and publish Python packages to **PyPI**. By the end of this module, you'll be able to: + +1. Create a project template using PyScaffold +2. Add project content and initialize a Python project +3. Publish a Python package to PyPI +4. Outline the key benefits of dependency management during software development + +### Project Templates with PyScaffold + +**PyScaffold** is a tool that helps create the scaffolding for Python projects by generating a clean project structure, setting up version control, documentation, and package configuration. This simplifies the process of initializing new Python projects. + +#### Steps to Create a Project Template with PyScaffold: + +1. Install PyScaffold: + + ```bash + pip install --upgrade pyscaffold + ``` + +2. Create a New Project: + Run the following command to initialize a new project: + + ```bash + putup my_project + ``` + + This will create a project folder called `my_project` template with everything you need for some serious coding with the following structure: + + ``` + my_project/ + β”œβ”€β”€ src/ + β”œβ”€β”€ tests/ + β”œβ”€β”€ docs/ + β”œβ”€β”€ setup.py + └── README.rst + ``` + +3. Add Features: + PyScaffold allows you to add features like testing frameworks (e.g., `pytest`) or documentation tools (e.g., `Sphinx`) directly during initialization: + + ```bash + putup --with-pytest --with-sphinx my_project + ``` + + This command adds both `pytest` and `Sphinx` configurations to the project. + +Quick Start: [PyScaffold Demo Project](https://github.com/pyscaffold/pyscaffold-demo) + +### Project Templates with Cookiecutter + +Cookiecutter is another powerful tool for creating project templates, but it is more customizable than PyScaffold. Cookiecutter allows you to define templates that can be reused across different projects. + +#### Steps to Create a Project Template with Cookiecutter: + +1. Install Cookiecutter: + + ```bash + pip install cookiecutter + ``` + +2. 
Use an Existing Template: + You can initialize a new project from an existing template by running: + + ```bash + cookiecutter https://github.com/audreyr/cookiecutter-pypackage + ``` + + Follow the prompts to customize the project (e.g., project name, author, license, etc.). + +3. Create Your Own Template: + You can also create your own project template. For example, create a folder structure like: + + ``` + my_template/ + β”œβ”€β”€ {{cookiecutter.project_name}}/ + β”‚ └── __init__.py + └── cookiecutter.json + ``` + + The `cookiecutter.json` file defines the variables for your template, such as: + + ```json + { + "project_name": "MyProject", + "author_name": "Your Name" + } + ``` + + You can use this template across different projects by running: + ```bash + cookiecutter path/to/my_template + ``` + +Video Tutorial: [Using Cookiecutter for Python Projects](https://www.youtube.com/watch?v=KpGAEsysxpY) + +### Adding Project Content and Initialization + +After setting up a project template using PyScaffold or Cookiecutter, the next step is to add your own code, documentation, and tests. The structure generated by these tools encourages best practices for organizing code and keeping everything modular and maintainable. + +#### Example of Adding Content: + +1. Source Code**: Add your Python code under the `src/` directory. + + ```bash + src/my_project/ + __init__.py + module.py + ``` + +2. Tests**: Place your unit tests under the `tests/` directory. + + ```bash + tests/ + test_module.py + ``` + +3. Documentation**: Sphinx setup will generate documentation files in the `docs/` directory. + +4. Version Control: Use Git for version control. The template already includes a `.gitignore` file to exclude unnecessary files from your repository. + +### Publishing a Python Package to PyPI + +PyPI (Python Package Index) is a repository where you can publish your Python packages so they can be easily installed by others using `pip`. Here's how to package and publish your project to PyPI: + +#### Steps to Publish a Package to PyPI: + +1. Install Build Tools: + First, install the required tools for packaging: + + ```bash + pip install setuptools wheel twine + ``` + +2. Build Your Package: + Inside your project directory, run the following command to create a distribution package: + + ```bash + python setup.py sdist bdist_wheel + ``` + + This will generate a `dist/` directory containing `.tar.gz` and `.whl` files. + +3. Upload to PyPI: + Create an account on PyPI if you don’t already have one. Then, use `twine` to upload the package: + + ```bash + twine upload dist/* + ``` + + Your package is now published on PyPI and can be installed using: + + ```bash + pip install your-package-name + ``` + +Video Tutorial: [How to Publish a Package on PyPI](https://www.youtube.com/watch?v=GIF3LaRqgXo) + +### Dependency Management in Python + +Dependency management ensures that your project has all the necessary external libraries installed, and that these dependencies are consistent across different environments. Proper dependency management is essential for the smooth development and deployment of software projects. + +#### Key Benefits of Dependency Management: + +1. Consistency: Ensures that everyone working on the project is using the same versions of dependencies. +2. Isolation: Dependencies are isolated in virtual environments (e.g., `venv`, `conda`), preventing conflicts with other projects. +3. 
Reproducibility: With a `requirements.txt` or `pyproject.toml`, anyone can recreate the exact environment needed to run the project. +4. Security: Dependency management tools can automatically check for outdated or vulnerable libraries, helping keep your project secure. + +#### Tools for Managing Dependencies: +- `requirements.txt`: Lists all the dependencies of the project, which can be installed using: + ```bash + pip install -r requirements.txt + ``` + +- `pyproject.toml`: A modern way to manage dependencies and build system requirements, supported by tools like **Poetry**. + +Video Tutorial: [Dependency Management in Python](https://www.youtube.com/watch?v=fKl2JW_qrso) + +### Additional Resources + +- [PyScaffold Documentation](https://pyscaffold.org/en/stable/) +- [Cookiecutter Documentation](https://cookiecutter.readthedocs.io/en/1.7.2/) +- [Publishing Python Packages to PyPI](https://packaging.python.org/tutorials/packaging-projects/) +- [Dependency Management with Poetry](https://python-poetry.org/) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393629?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394237?display=full_width +::: + +:::: diff --git a/docs/courses/software-dev/4.8-cloud-server.md b/docs/courses/software-dev/4.8-cloud-server.md new file mode 100644 index 0000000..6b13107 --- /dev/null +++ b/docs/courses/software-dev/4.8-cloud-server.md @@ -0,0 +1,223 @@ + +(4.8-launching-free-cloud-server)= +# 🧩 4.8 Launching a Free Cloud Server + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn how to launch a free cloud server, leverage **serverless computing**, and deploy applications using **PythonAnywhere** and **Hugging Face**. By the end of this module, you'll be able to: + +1. Launch a free cloud server +2. Use a container to deploy applications +3. Create a container for your app +4. Deploy a materials discovery campaign on a cloud server + +### Launching a Free Cloud Server + +A cloud server is a virtual server that you can access over the internet. Many cloud platforms offer free tiers for users to launch servers, allowing you to run and deploy your applications in the cloud without incurring costs. + +#### Free Cloud Server Providers: +1. **PythonAnywhere**: A Python-centric cloud platform that offers a free plan with enough resources for small-scale applications. +2. **Google Cloud Platform (GCP)**: Provides free credits and the option to launch a small virtual machine for free. +3. **AWS Free Tier**: Offers free EC2 instances for the first 12 months. +4. **Hugging Face Spaces**: Provides a free option to deploy machine learning models and web applications using the Gradio or Streamlit framework. + +#### Steps to Launch a Free Cloud Server on PythonAnywhere: + +1. **Sign up for PythonAnywhere**: + Go to [PythonAnywhere](https://www.pythonanywhere.com) and create a free account. + +2. **Start a New Console**: + Once logged in, go to the **Consoles** tab and start a new **Bash** console. + +3. **Set Up Your Application**: + You can set up your Python environment and install necessary packages: + ```bash + pip install -r requirements.txt + ``` + +4. 
**Deploy Your Application**: + You can deploy web apps or Python scripts directly by using the **Web** tab to set up a web application or schedule your scripts to run. + +**Video Tutorial**: [PythonAnywhere Overview](https://www.youtube.com/watch?v=yhqYFyo7dAM&pp=ygUOUHl0aG9uQW55d2hlcmU%3D) + +### Serverless Computing + +**Serverless computing** allows you to run your code without having to manage the underlying infrastructure. You only pay for the compute resources when your code is executed, making it ideal for small-scale or on-demand applications. + +#### Benefits of Serverless Computing: +1. **Cost Efficiency**: You are only billed for actual compute time, rather than for server uptime. +2. **Scalability**: Serverless platforms can automatically scale your application based on demand. +3. **Ease of Use**: You don't need to manage or configure the server, allowing you to focus on application development. + +#### Popular Serverless Platforms: + +1. **AWS Lambda**: Executes your code in response to events (such as HTTP requests or file uploads) without requiring you to provision or manage servers. AWS Lambda automatically scales your application by running code in response to each trigger, making it highly reliable and cost-effective for event-driven workloads. + +2. **Google Cloud Functions**: A lightweight, event-driven serverless compute platform similar to AWS Lambda. It's great for tasks like executing code in response to HTTP requests, Cloud Storage events, or Pub/Sub messages. It’s well integrated with other Google Cloud services and offers seamless scalability. + +3. **Azure Functions**: Microsoft's serverless platform that also supports event-driven execution, with deep integration into the Azure ecosystem. It supports a wide range of triggers (e.g., HTTP requests, queue messages) and can be used to execute code in response to changes in Azure Storage or Cosmos DB, making it ideal for cloud-native applications running on Azure. + +4. **Hugging Face Spaces**: A serverless platform specifically designed for deploying machine learning models and web applications. Hugging Face Spaces supports frameworks like Gradio and Streamlit, which makes it easy to deploy and share ML-powered applications with others. + +### Using Containers + +**Containers** provide a consistent environment for your applications, making it easier to deploy them across different systems, including cloud servers. A container includes everything your application needs to run, such as libraries, dependencies, and configuration files. + +To better understand what a container does, think of it as a **pizza box**. Just as a pizza box holds and protects a pizza, allowing it to be transported anywhere while keeping the pizza intact, a container holds your application and its environment, ensuring that the app can run anywhere without changesβ€”whether on your local machine, a colleague’s system, or a cloud server. + +#### Steps to Use a Container: +1. **Install Docker**: + Docker is the most widely used container platform. Install Docker on your local machine: + + ```bash + sudo apt-get update + sudo apt-get install docker-ce docker-ce-cli containerd.io + ``` + +2. **Create a Dockerfile**: + A Dockerfile is a text file that contains instructions for building a Docker image. 
Here’s an example Dockerfile for a Python Flask app: + + ```dockerfile + # Use an official Python runtime as a parent image + FROM python:3.8-slim + + # Set the working directory + WORKDIR /app + + # Copy the current directory contents into the container at /app + COPY . /app + + # Install any needed packages specified in requirements.txt + RUN pip install --no-cache-dir -r requirements.txt + + # Make port 5000 available to the world outside this container + EXPOSE 5000 + + # Run app.py when the container launches + CMD ["python", "app.py"] + ``` + +3. **Build the Docker Image**: + Run the following command to build your Docker image: + + ```bash + docker build -t my_flask_app . + ``` + +4. **Run the Container**: + After building the image, you can run the container: + + ```bash + docker run -p 5000:5000 my_flask_app + ``` + +**Video Tutorial**: [Getting Started with Docker](https://www.youtube.com/watch?v=fqMOX6JJhGo) + +### Creating a Container + +Containers are useful for creating reproducible environments that work on any cloud platform or local machine. You can define all dependencies and settings in a container, ensuring that your application will run smoothly, regardless of the environment. + +#### Steps to Create a Container for Your App: + +1. **Create a Dockerfile**: + Create a `Dockerfile` in your project directory that includes the necessary instructions for your application (see the Dockerfile example above). + +2. **Build the Image**: + Use the `docker build` command to build your image from the Dockerfile. + +3. **Test Locally**: + Before deploying your container to the cloud, run it locally to ensure that everything works as expected. + +**Video Tutorial**: [Build Your Own Container](https://www.youtube.com/watch?v=SnSH8Ht3MIc&ab_channel=TechnoTim) + +### Deploying a Materials Discovery Campaign on a Cloud Server + +In this example, we will deploy a simplified **materials discovery campaign** where the goal is to analyze a dataset of material properties (e.g., conductivity, hardness, and thermal resistance) and predict optimal materials for specific applications. We will deploy this application on a cloud server, enabling users to submit data and receive predictions from a trained machine learning model hosted in the cloud. + +#### Steps to Deploy a Materials Discovery Campaign: + +1. **Dataset Preparation**: + The campaign starts by loading a dataset of materials with their respective properties. The dataset could be a CSV file containing columns like `Material Name`, `Conductivity`, `Hardness`, and `Thermal Resistance`. + +2. **Model Training**: + Train a machine learning model (e.g., a Random Forest or Neural Network) that can predict the best material for a given set of conditions (e.g., optimal material for high conductivity and low thermal resistance). Save the trained model as a `.pkl` file. + +3. **Create a Flask Application**: + Develop a Python Flask application that will serve as the front-end for the campaign. Users can submit material properties through a web form, and the model will return a prediction of the best material. 
+ + ```python + from flask import Flask, request, jsonify + import pickle + + app = Flask(__name__) + + # Load the trained model + with open("model.pkl", "rb") as f: + model = pickle.load(f) + + + @app.route("/predict", methods=["POST"]) + def predict(): + data = request.json + prediction = model.predict([data["properties"]]) + return jsonify({"predicted_material": prediction[0]}) + + + if __name__ == "__main__": + app.run(debug=True) + ``` + +4. **Containerize the Application**: + Use Docker to containerize the Flask application (refer to the Dockerfile example above). + +5. **Deploy the Application**: + Deploy the containerized application on PythonAnywhere or Hugging Face Spaces. Ensure the web interface is accessible and that users can submit data and receive predictions in real-time. + +**Video Tutorial**: [Flask Deploy to Huggingface Cloud](https://youtu.be/pWnE9FHnGcQ) + +### Additional Resources + +- [PythonAnywhere Documentation](https://help.pythonanywhere.com/pages/) +- [Hugging Face Spaces Documentation](https://huggingface.co/docs) +- [Docker Documentation](https://docs.docker.com/) +- [Serverless Framework Documentation](https://www.serverless.com/framework/docs/) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393631?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394238?display=full_width +::: + +:::: diff --git a/docs/courses/software-dev/4.9-cloud-simulations.md b/docs/courses/software-dev/4.9-cloud-simulations.md new file mode 100644 index 0000000..8caffa8 --- /dev/null +++ b/docs/courses/software-dev/4.9-cloud-simulations.md @@ -0,0 +1,264 @@ + +(4.9-on-demand-cloud-simulations)= +# 🧩 4.9 On-demand Cloud Simulations + +```{contents} +:depth: 3 +``` + +## πŸ”° Tutorial + +In this module, you will learn how to set up an AWS account and use cloud computing services like **AWS Lambda**, **Docker containers**, **Apptainer**, and **Prefect** to run **on-demand cloud simulations**. You will also explore how to integrate these simulations into a materials discovery campaign. + +1. Set up an AWS account +2. Use AWS Lambda for serverless cloud computing +3. Run Docker containers and Apptainer for cloud simulations +4. Automate workflows with Prefect +5. Integrate cloud simulations into a materials discovery campaign + +### Setting Up an AWS Account + +**AWS (Amazon Web Services)** is a popular cloud computing platform that provides a wide range of cloud services, including compute, storage, and databases. To run on-demand simulations, you’ll first need to set up an AWS account. + +#### Steps to Set Up an AWS Account: + +1. **Sign up for AWS**: + - Visit [AWS Signup](https://aws.amazon.com/free) and create an account. You’ll need to provide billing information, but AWS offers a free tier that includes limited usage for services like **AWS Lambda** and **EC2**. + +2. **Create an IAM User**: + - After signing up, create an **IAM User** for secure access to your AWS resources. + - Go to **IAM Management Console** β†’ **Users** β†’ **Add User**, and assign the necessary permissions (e.g., for Lambda, S3, and EC2). + +3. 
**Set Up AWS CLI**: + - To manage AWS services from your terminal, install and configure the **AWS CLI**: + + ```bash + pip install awscli + aws configure + ``` + + Enter your **Access Key ID** and **Secret Access Key** from the IAM user you created. + +4. **Security Best Practices**: + - Enable Multi-Factor Authentication (MFA) for your root account and IAM users. + - Regularly rotate access keys and passwords. + - Use AWS CloudTrail to monitor and log API activity across your AWS infrastructure. + +**Video Tutorial**: [How to Set Up an AWS Free Tier](https://www.youtube.com/watch?v=SFaSB6vgp8k&pp=ygUVYXdzIGZyZWUgdGllciBzaWduIHVw) + +### AWS Lambda: Serverless Cloud Computing + +**AWS Lambda** is a serverless compute service that lets you run code in response to events without provisioning or managing servers. It’s ideal for running simulations on-demand based on user requests or data changes. + +#### Running Cloud Simulations with AWS Lambda: + +1. **Create a Lambda Function**: + - Go to **Lambda Console** β†’ **Create Function**, choose **Author from scratch**, and select **Python** as the runtime. + +2. **Write Simulation Code**: + - Write the code for your simulation in Python. Here's a simple example that simulates material property calculations: + + ```python + import json + + + def lambda_handler(event, context): + # Simulate material property calculations + material_data = event["material_properties"] + conductivity = material_data["conductivity"] * 0.9 # Example calculation + hardness = material_data["hardness"] * 1.1 + + return { + "statusCode": 200, + "body": json.dumps( + {"optimized_conductivity": conductivity, "optimized_hardness": hardness} + ), + } + ``` + +3. **Deploy the Function**: + - Once your code is ready, deploy the Lambda function and trigger it using an **API Gateway**, **S3 events**, or **manual invocation**. + +4. **Lambda Limitations**: + - Execution time is limited to 15 minutes per invocation. + - Memory allocation ranges from 128MB to 10GB. + - Temporary disk space (/tmp) is limited to 512MB. + +**Video Tutorial**: [AWS Lambda Tutorial for Beginners](https://www.youtube.com/watch?v=eOBq__h4OJ4) + +### Docker Containers for Cloud Simulations + +Docker containers help package your simulations with all necessary dependencies, making it easy to run them on any cloud server. This allows for reproducible and scalable simulations. + +#### Steps to Run Cloud Simulations in Docker: + +1. **Create a Dockerfile**: + Define a Dockerfile for your simulation environment: + + ```dockerfile + FROM python:3.8-slim + + WORKDIR /simulation + + COPY . /simulation + + RUN pip install -r requirements.txt + + CMD ["python", "run_simulation.py"] + ``` + +2. **Build the Docker Image**: + + ```bash + docker build -t material_simulation . + ``` + +3. **Run the Simulation Locally**: + + ```bash + docker run -d material_simulation + ``` + +4. **Deploy the Docker Container on AWS**: + You can deploy your Docker container on **AWS Fargate** or **EC2** for on-demand simulations. Fargate is a serverless compute engine for containers, while EC2 provides virtual machines. + +5. **Using Docker Compose**: + For more complex simulations involving multiple containers, consider using Docker Compose to define and run multi-container Docker applications. 
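To make step 5 concrete, here is a minimal `docker-compose.yml` sketch that pairs the simulation container with a results database. The service names, the `mongo` image, and the mounted `results` directory are illustrative assumptions rather than part of a prescribed setup:

```yaml
services:
  simulation:
    build: .                      # uses the Dockerfile shown above
    command: python run_simulation.py
    volumes:
      - ./results:/simulation/results   # persist simulation output on the host
    depends_on:
      - database
  database:
    image: mongo:7                # hypothetical results store
    ports:
      - "27017:27017"
```

With this file in the project directory, `docker compose up --build` starts both containers together, and `docker compose down` stops and removes them.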
**Video Tutorial**: [Docker on AWS EC2](https://www.youtube.com/watch?v=qNIniDftAcU&pp=ygURRG9ja2VyIG9uIEFXUyBFQzI%3D)

### Apptainer for Scientific Simulations

**Apptainer** (formerly **Singularity**) is a container platform tailored for high-performance computing (HPC) and scientific simulations. It’s commonly used in research environments where Docker cannot be used due to security constraints.

#### Steps to Run Simulations with Apptainer:

1. **Install Apptainer**:
   Install Apptainer on your system by following the instructions in the [official documentation](https://apptainer.org/docs/admin/main/installation.html).

2. **Create an Apptainer Container**:
   Similar to Docker, Apptainer uses definition files (`.def`) to describe container environments. Here’s an example:

   ```def
   Bootstrap: docker
   From: python:3.8-slim

   %post
       pip install numpy scipy

   %runscript
       exec python run_simulation.py
   ```

3. **Build and Run the Container**:

   ```bash
   apptainer build material_simulation.sif material_simulation.def
   apptainer run material_simulation.sif
   ```

**Video Tutorial**: [Apptainer/Singularity Tutorial](https://www.youtube.com/watch?v=g0cCErlveiI&list=PLKZ9c4ONm-VkxWW98Gcn9H6WwykMiqtnF)

### Workflow Automation with Prefect

**Prefect** is a modern workflow orchestration tool that helps automate complex workflows, such as running multiple cloud simulations or orchestrating data pipelines. You can use Prefect to trigger cloud simulations, handle retries, and manage dependencies.

#### Steps to Automate Cloud Simulations with Prefect:

1. **Install Prefect**:

   ```bash
   pip install -U prefect
   ```

2. **Define a Prefect Flow**:
   A **Prefect flow** defines the sequence of tasks to run. Here’s an example of running a cloud simulation as part of a workflow, using the `@task` and `@flow` decorators from Prefect 2+ (the version installed by the command above):

   ```python
   from prefect import flow, task


   @task
   def run_simulation():
       print("Running cloud simulation...")
       # Simulate a material discovery task here
       return "Simulation Completed"


   @flow(name="material_discovery")
   def material_discovery():
       run_simulation()


   if __name__ == "__main__":
       material_discovery()
   ```

3. **Run the Flow**:
   Run the flow locally or deploy it on **Prefect Cloud** for distributed execution.

**Video Tutorial**: [Getting Started with Prefect](https://www.youtube.com/watch?v=4yIW34WcmBQ&pp=ygUcR2V0dGluZyBTdGFydGVkIHdpdGggUHJlZmVjdA%3D%3D)

### Integrating Cloud Simulations into a Materials Discovery Campaign

In a materials discovery campaign, you might need to perform large-scale simulations to identify the best materials for specific properties. By integrating cloud simulations, you can scale your computational efforts and automate the discovery process.

#### Example: Integrating a Simulation into a Campaign

1. **Use Case**:
   Imagine you are searching for materials with optimal thermal conductivity and hardness. You have a dataset of potential materials, and you want to run simulations to find the top candidates.

2. **Simulation Setup**:
   - Load the material properties from storage (e.g., an AWS S3 bucket).
   - Run the simulation using AWS Lambda or Docker containers on EC2/Fargate.
   - Use the simulation to calculate optimized properties (e.g., using machine learning models trained on historical material data).

3. **Automate the Process**:
   - Use **Prefect** to automate the workflow, including triggering simulations based on incoming data, aggregating the results, and storing them in a database.
+ - Each simulation run generates predictions for a subset of materials, and the best candidates are selected for further analysis. + +4. **Error Handling and Monitoring**: + - Implement robust error handling in your simulations to manage unexpected issues. + - Use AWS CloudWatch to monitor the performance and health of your cloud resources. + - Set up alerts to notify you of any issues or anomalies in your simulations. + +### Additional Resources + +- [AWS Lambda Documentation](https://docs.aws.amazon.com/lambda/) +- [Docker Documentation](https://docs.docker.com/) +- [Apptainer Documentation](https://apptainer.org/) +- [Prefect Documentation](https://docs.prefect.io/) +- [EC2 User Guide](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/concepts.html) + +## πŸš€ Quiz + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1393632?display=full_width +::: + +:::: + +## πŸ“„ Assignment + +::::{tab-set} +:sync-group: category + +:::{tab-item} W 2024 +:sync: w2024 + +::: + +:::{tab-item} Sp/Su 2024 +:sync: sp2024 + +https://q.utoronto.ca/courses/370070/assignments/1394239?display=full_width +::: + +:::: \ No newline at end of file From ccd52949e3325f70f94acc871d94c578e78a19df Mon Sep 17 00:00:00 2001 From: "Sterling G. Baird" Date: Wed, 9 Oct 2024 11:19:29 -0400 Subject: [PATCH 2/3] precommit --- docs/courses/hello-world/1.5-data-logging.md | 32 +++-- .../hello-world/1.5.1-aws-lambda-read.json | 2 +- docs/courses/hello-world/index.md | 2 - docs/courses/robotics/3.4-mobile-robotics.md | 92 ++++++------ docs/courses/robotics/3.5-computer-vision.md | 134 ++++++++++++------ .../robotics/3.6-solid-sample-transfer.md | 104 ++++++++------ docs/courses/robotics/index.md | 2 - .../software-dev/4.1-deep-dive-into-github.md | 5 +- docs/courses/software-dev/4.2-vscode-setup.md | 6 +- .../software-dev/4.3-vscode-debugging.md | 2 +- docs/courses/software-dev/4.4-unit-testing.md | 1 + .../software-dev/4.5-automated-docs.md | 12 +- .../4.6-continuous-integration.md | 8 +- .../software-dev/4.9-cloud-simulations.md | 2 +- 14 files changed, 243 insertions(+), 161 deletions(-) diff --git a/docs/courses/hello-world/1.5-data-logging.md b/docs/courses/hello-world/1.5-data-logging.md index 2ce6dcc..26d1b99 100644 --- a/docs/courses/hello-world/1.5-data-logging.md +++ b/docs/courses/hello-world/1.5-data-logging.md @@ -119,9 +119,9 @@ for _ in range(num_retries): txt = str(response.text) status_code = response.status_code - if status_code != 200: - print("Retrying in 5 seconds...") - time.sleep(5) + if status_code != 200: + print("Retrying in 5 seconds...") + time.sleep(5) print(f"Response: ({status_code}), msg = {txt}") @@ -151,23 +151,31 @@ import json import pymongo import os + def lambda_handler(event, context): try: - client = pymongo.MongoClient(os.environ['MONGODB_URI']) + client = pymongo.MongoClient(os.environ["MONGODB_URI"]) db = client["your_database_name"] collection = db["your_collection_name"] - - body = json.loads(event['body']) + + body = json.loads(event["body"]) result = collection.insert_one(body) - + return { - 'statusCode': 200, - 'body': json.dumps({'message': 'Document inserted successfully', 'id': str(result.inserted_id)}) + "statusCode": 200, + "body": json.dumps( + { + "message": "Document inserted successfully", + "id": str(result.inserted_id), + } + ), } except Exception as e: return { - 'statusCode': 500, - 'body': json.dumps({'message': 'Error inserting document', 
'error': str(e)}) + "statusCode": 500, + "body": json.dumps( + {"message": "Error inserting document", "error": str(e)} + ), } ``` @@ -248,4 +256,4 @@ https://q.utoronto.ca/courses/350933/assignments/1274182?display=full_width https://q.utoronto.ca/courses/351407/assignments/1286935?display=full_width ::: -:::: \ No newline at end of file +:::: diff --git a/docs/courses/hello-world/1.5.1-aws-lambda-read.json b/docs/courses/hello-world/1.5.1-aws-lambda-read.json index f63d567..41af79c 100644 --- a/docs/courses/hello-world/1.5.1-aws-lambda-read.json +++ b/docs/courses/hello-world/1.5.1-aws-lambda-read.json @@ -141,4 +141,4 @@ }, "nbformat": 4, "nbformat_minor": 4 -} \ No newline at end of file +} diff --git a/docs/courses/hello-world/index.md b/docs/courses/hello-world/index.md index 704b9ce..38db6e3 100644 --- a/docs/courses/hello-world/index.md +++ b/docs/courses/hello-world/index.md @@ -81,5 +81,3 @@ Begin ``` - - diff --git a/docs/courses/robotics/3.4-mobile-robotics.md b/docs/courses/robotics/3.4-mobile-robotics.md index d877e25..c24a8d1 100644 --- a/docs/courses/robotics/3.4-mobile-robotics.md +++ b/docs/courses/robotics/3.4-mobile-robotics.md @@ -15,24 +15,24 @@ In this module, you will develop software to: ### Bill of Materials -- [MyCobot 280 Pi: - World's Smallest and Lightest Six-Axis Collaborative Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot) +- [MyCobot 280 Pi: - World's Smallest and Lightest Six-Axis Collaborative Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot) A versatile and compact six-axis robot, ideal for mobile robotics applications. -- [Camera Flange 2.0](https://shop.elephantrobotics.com/en-ca/collections/camera-modules/products/camera-flange-2-0) +- [Camera Flange 2.0](https://shop.elephantrobotics.com/en-ca/collections/camera-modules/products/camera-flange-2-0) Used for vision-based tasks in mobile robotics, such as object recognition and navigation. -- [Adaptive Gripper](https://shop.elephantrobotics.com/en-ca/collections/grippers/products/adaptive-gripper) +- [Adaptive Gripper](https://shop.elephantrobotics.com/en-ca/collections/grippers/products/adaptive-gripper) A flexible gripper designed for precise manipulation and picking tasks in collaborative robotic systems. -- [G-Shape Base 2.0](https://shop.elephantrobotics.com/en-ca/collections/fixed-bases/products/g-shape-base-2-0) +- [G-Shape Base 2.0](https://shop.elephantrobotics.com/en-ca/collections/fixed-bases/products/g-shape-base-2-0) Provides a sturdy mounting platform for the MyCobot, ensuring stability during robotic operations. -- [LIDAR sensor for obstacle detection] +- [LIDAR sensor for obstacle detection] Used for real-time obstacle detection and mapping in mobile robotics. - Printed AprilTags (can be generated and printed from [AprilTag Generation](https://github.com/AprilRobotics/apriltag-generation)) -- USB-A to micro USB-B cable: +- USB-A to micro USB-B cable: Used to connect and power devices such as the Raspberry Pi or peripherals. - SD Card with Raspbian OS: @@ -40,25 +40,25 @@ In this module, you will develop software to: ### Required Software -- [Prefect](https://www.prefect.io/) +- [Prefect](https://www.prefect.io/) A workflow orchestration tool used to manage and coordinate asynchronous tasks in the system. 
-- [AprilTags Python Library](https://pypi.org/project/apriltag/) +- [AprilTags Python Library](https://pypi.org/project/apriltag/) A computer vision library for identifying and tracking AprilTags, used for spatial referencing and navigation. -- [ROS2 Humble](https://docs.ros.org/en/humble/Installation.html) +- [ROS2 Humble](https://docs.ros.org/en/humble/Installation.html) ROS2 (Robot Operating System) is an open-source framework for building robot applications. **Humble Hawksbill** is the currently recommended version due to its stability and long-term support. -- [Ubuntu 22.04 LTS](https://releases.ubuntu.com/22.04/) +- [Ubuntu 22.04 LTS](https://releases.ubuntu.com/22.04/) Ubuntu 22.04 LTS is the recommended Linux distribution for running ROS2, as it provides a stable and compatible environment. **Ubuntu 24.04** (the latest version) may encounter compatibility issues with ROS2 and other tools. ### Documentation -- [MyCobot Pi Documentation](https://docs.elephantrobotics.com/docs/gitbook-en/2-serialproduct/2.1-280/2.1.2-PI.html) +- [MyCobot Pi Documentation](https://docs.elephantrobotics.com/docs/gitbook-en/2-serialproduct/2.1-280/2.1.2-PI.html) Detailed guide on setting up and operating the MyCobot Pi. -- [Gripper Control via Python](https://docs.elephantrobotics.com/docs/gitbook-en/7-ApplicationBasePython/7.5_gripper.html) +- [Gripper Control via Python](https://docs.elephantrobotics.com/docs/gitbook-en/7-ApplicationBasePython/7.5_gripper.html) Guide for controlling the adaptive gripper using Python commands. @@ -71,7 +71,7 @@ First, you will learn how to use ROS to control a mobile cobot. ROS is an open-s #### ROS2 Overview -ROS2 (Robot Operating System 2) is an extension of the original ROS framework, designed to be modular and scalable for building complex robotic systems. It is composed of numerous **software packages**, which offer a wide range of functionalities like hardware abstraction, device control, message-passing, and software development tools. +ROS2 (Robot Operating System 2) is an extension of the original ROS framework, designed to be modular and scalable for building complex robotic systems. It is composed of numerous **software packages**, which offer a wide range of functionalities like hardware abstraction, device control, message-passing, and software development tools. ROS2 heavily relies on the **Ubuntu Linux** operating system, and it is recommended to use **Ubuntu 22.04 LTS** for stability and compatibility. One of ROS2’s key advantages is its distributed architecture, which allows different components (nodes) to communicate with each other over a network. This distributed nature is essential for enabling flexibility in robot systems development. 
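To make the node-and-topic idea above concrete, here is a minimal sketch of a ROS2 Python node using the `rclpy` client library that ships with ROS2 Humble. The node name, topic name, and message text are placeholders chosen for illustration:

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class StatusPublisher(Node):
    """Publishes a status message once per second on a placeholder topic."""

    def __init__(self):
        super().__init__("status_publisher")
        # Queue depth of 10 messages on the "robot_status" topic
        self.publisher_ = self.create_publisher(String, "robot_status", 10)
        self.timer = self.create_timer(1.0, self.publish_status)

    def publish_status(self):
        msg = String()
        msg.data = "MyCobot is ready"
        self.publisher_.publish(msg)
        self.get_logger().info(f"Published: {msg.data}")


def main():
    rclpy.init()
    node = StatusPublisher()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Running this script in one terminal and `ros2 topic echo /robot_status` in another shows the message-passing behaviour described above; any other node on the network can subscribe to the same topic.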
@@ -207,7 +207,7 @@ microscope = MicroscopeDemo( def detect_apriltag(frame): gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) tags = at_detector.detect(gray) - + for tag in tags: if tag.tag_id == 1: # Assuming we're looking for tag with ID 1 return tag.center @@ -230,13 +230,13 @@ def microscope_scan(): microscope.focus() image = microscope.take_image() image.show() - + print("Scanning area...") scan_images = microscope.scan([2000, 2000], [-2000, -2000]) for i, img in enumerate(scan_images): img.show() print(f"Showing scan image {i+1}/{len(scan_images)}") - + microscope.move(0, 0) def main(): @@ -254,7 +254,7 @@ def main(): move_arm_to_target(x, y) grab_object() print("Object grasped") - + # Perform microscope scan microscope_scan() break @@ -334,56 +334,56 @@ import actionlib class MyCobotNavigator: def __init__(self): rospy.init_node('mycobot_navigator', anonymous=True) - + # Initialize MyCobot self.mycobot = MyCobot("/dev/ttyUSB0", 115200) - + # ROS Publishers self.cmd_vel_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10) self.goal_pub = rospy.Publisher('/move_base_simple/goal', PoseStamped, queue_size=10) - + # ROS Subscribers rospy.Subscriber('/camera/rgb/image_raw', Image, self.image_callback) rospy.Subscriber('/camera/depth/points', PointCloud2, self.pointcloud_callback) rospy.Subscriber('/map', OccupancyGrid, self.map_callback) rospy.Subscriber('/detected_objects', DetectedObjectsArray, self.object_detection_callback) - + # Initialize CV Bridge self.bridge = CvBridge() - + # Initialize MoveBase Action Client self.move_base_client = actionlib.SimpleActionClient('move_base', MoveBaseAction) self.move_base_client.wait_for_server() - + # Initialize TF listener self.tf_listener = tf.TransformListener() - + # State variables self.current_image = None self.current_pointcloud = None self.current_map = None self.detected_objects = [] self.navigation_goal = None - + def image_callback(self, msg): try: self.current_image = self.bridge.imgmsg_to_cv2(msg, "bgr8") self.process_image() except Exception as e: rospy.logerr(f"Error processing image: {e}") - + def pointcloud_callback(self, msg): self.current_pointcloud = msg self.process_pointcloud() - + def map_callback(self, msg): self.current_map = msg self.update_navigation_plan() - + def object_detection_callback(self, msg): self.detected_objects = msg.objects self.process_detected_objects() - + def process_image(self): if self.current_image is not None: # Perform visual navigation tasks @@ -395,41 +395,41 @@ class MyCobotNavigator: for line in lines: x1, y1, x2, y2 = line[0] cv2.line(self.current_image, (x1, y1), (x2, y2), (0, 255, 0), 2) - + cv2.imshow("Navigation View", self.current_image) cv2.waitKey(1) - + def process_pointcloud(self): if self.current_pointcloud is not None: # Process pointcloud for obstacle detection # This is a placeholder for actual pointcloud processing rospy.loginfo("Processing pointcloud for obstacle detection") - + def process_detected_objects(self): for obj in self.detected_objects: rospy.loginfo(f"Detected object: {obj.label} at position: {obj.pose.position}") # Implement logic to interact with detected objects if obj.label == "target_object": self.plan_path_to_object(obj.pose) - + def plan_path_to_object(self, object_pose): goal = MoveBaseGoal() goal.target_pose.header.frame_id = "map" goal.target_pose.header.stamp = rospy.Time.now() goal.target_pose.pose = object_pose - + self.move_base_client.send_goal(goal) rospy.loginfo("Sent goal to move_base") - + def update_navigation_plan(self): if self.navigation_goal 
is not None: self.plan_path_to_goal(self.navigation_goal) - + def plan_path_to_goal(self, goal_pose): start_pose = PoseStamped() start_pose.header.frame_id = "base_link" start_pose.header.stamp = rospy.Time.now() - + try: transformed_pose = self.tf_listener.transformPose("map", start_pose) # Here you would typically call a path planning service @@ -437,12 +437,12 @@ class MyCobotNavigator: rospy.loginfo(f"Planning path from {transformed_pose.pose.position} to {goal_pose.pose.position}") except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e: rospy.logerr(f"TF Error: {e}") - + def move_arm_to_target(self, x, y, z): # Move MyCobot arm to target position self.mycobot.send_coords([x, y, z, 0, 0, 0], 50, 1) # Adjust speed and mode as needed rospy.loginfo(f"Moving arm to coordinates: ({x}, {y}, {z})") - + def run(self): rate = rospy.Rate(10) # 10 Hz while not rospy.is_shutdown(): @@ -485,12 +485,14 @@ from prefect import task, Flow import time import random + @task def control_robot(): for _ in range(10): print("Moving the robot forward...") time.sleep(1) + @task def monitor_sensors(): for _ in range(10): @@ -498,12 +500,14 @@ def monitor_sensors(): print(f"LIDAR sensor reading: {sensor_value}") time.sleep(1) + @task def analyze_data(): for _ in range(10): print("Analyzing real-time data...") time.sleep(1) + with Flow("Asynchronous Robotics Control") as flow: control_robot() monitor_sensors() @@ -532,7 +536,7 @@ To use this script you need to: ### Define Asynchrony in the Context of Hardware Control -Asynchrony in robotics refers to executing tasks independently and concurrently. In the context of hardware control, this is critical for managing complex processes in an autonomous laboratory, where sensors must continually gather data, robots must execute movement commands, and the system needs to react in real-time. +Asynchrony in robotics refers to executing tasks independently and concurrently. In the context of hardware control, this is critical for managing complex processes in an autonomous laboratory, where sensors must continually gather data, robots must execute movement commands, and the system needs to react in real-time. For instance, while a robot is moving, it must be able to continuously monitor its surroundings (e.g., using LIDAR or cameras) and adjust its trajectory without pausing or waiting for other tasks to complete. @@ -593,14 +597,14 @@ bridge = CvBridge() def image_callback(msg): # Convert ROS image messages to OpenCV images img = bridge.imgmsg_to_cv2(msg, "bgr8") - + # Use OpenCV to detect pill bottles (e.g. 
by color or shape) gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) ret, thresh = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY) # Use OpenCV contour detection for medicine bottles contours, _ = cv2.findContours(thresh, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE) - + for contour in contours: x, y, w, h = cv2.boundingRect(contour) if w > 30 and h > 30: # Assume the size of the bottle is within this range @@ -650,11 +654,11 @@ def move_robot(): rospy.init_node('robot_control_node') pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10) rate = rospy.Rate(10) - + twist = Twist() twist.linear.x = 1.0 # Move forward twist.angular.z = 0.5 # Rotation - + while not rospy.is_shutdown(): pub.publish(twist) rate.sleep() diff --git a/docs/courses/robotics/3.5-computer-vision.md b/docs/courses/robotics/3.5-computer-vision.md index 98a56a8..b0c9c22 100644 --- a/docs/courses/robotics/3.5-computer-vision.md +++ b/docs/courses/robotics/3.5-computer-vision.md @@ -50,17 +50,23 @@ In this task, you will use **OpenCV** to perform image preprocessing and identif import cv2 import numpy as np + def preprocess_image(image): gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) blurred = cv2.GaussianBlur(gray, (5, 5), 0) - thresh = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY_INV, 11, 2) + thresh = cv2.adaptiveThreshold( + blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY_INV, 11, 2 + ) return thresh + def find_roi(image): processed = preprocess_image(image) - contours, _ = cv2.findContours(processed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE) + contours, _ = cv2.findContours( + processed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE + ) contours = sorted(contours, key=cv2.contourArea, reverse=True) - + rois = [] for contour in contours[:5]: area = cv2.contourArea(contour) @@ -71,21 +77,23 @@ def find_roi(image): rois.append((x, y, w, h)) return rois + def main(): - image = cv2.imread('sample_image.jpg') + image = cv2.imread("sample_image.jpg") if image is None: print("Error: Could not load image") return - + rois = find_roi(image) for roi in rois: x, y, w, h = roi - cv2.rectangle(image, (x, y), (x+w, y+h), (0, 255, 0), 2) - - cv2.imshow('ROI Detection', image) + cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2) + + cv2.imshow("ROI Detection", image) cv2.waitKey(0) cv2.destroyAllWindows() + if __name__ == "__main__": main() ``` @@ -130,6 +138,7 @@ import cv2 import numpy as np from pupil_apriltags import Detector + def detect_apriltags(image): gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) at_detector = Detector( @@ -139,34 +148,47 @@ def detect_apriltags(image): quad_sigma=0.0, refine_edges=1, decode_sharpening=0.25, - debug=0 + debug=0, ) tags = at_detector.detect(gray) return tags + def spatial_referencing(image, tags, tag_size=0.05): - fx, fy, cx, cy = 1000, 1000, image.shape[1]/2, image.shape[0]/2 + fx, fy, cx, cy = 1000, 1000, image.shape[1] / 2, image.shape[0] / 2 camera_params = [fx, fy, cx, cy] - + for tag in tags: pose, e0, e1 = tag.fit_pose(camera_params, tag_size) rotation_matrix = pose[:3, :3] translation_vector = pose[:3, 3] euler_angles = cv2.Rodrigues(rotation_matrix)[0].flatten() - + cv2.polylines(image, [np.int32(tag.corners)], True, (0, 255, 0), 2) center = tuple(map(int, tag.center)) cv2.circle(image, center, 5, (0, 0, 255), -1) - cv2.putText(image, str(tag.tag_id), (center[0] - 10, center[1] - 10), - cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 2) - + cv2.putText( + image, + str(tag.tag_id), + (center[0] - 10, center[1] - 10), + 
cv2.FONT_HERSHEY_SIMPLEX, + 0.5, + (0, 0, 255), + 2, + ) + print(f"Tag ID: {tag.tag_id}") - print(f"Position: X={translation_vector[0]:.2f}, Y={translation_vector[1]:.2f}, Z={translation_vector[2]:.2f}") - print(f"Rotation: Roll={np.degrees(euler_angles[0]):.2f}, Pitch={np.degrees(euler_angles[1]):.2f}, Yaw={np.degrees(euler_angles[2]):.2f}") + print( + f"Position: X={translation_vector[0]:.2f}, Y={translation_vector[1]:.2f}, Z={translation_vector[2]:.2f}" + ) + print( + f"Rotation: Roll={np.degrees(euler_angles[0]):.2f}, Pitch={np.degrees(euler_angles[1]):.2f}, Yaw={np.degrees(euler_angles[2]):.2f}" + ) print("---") - + return image + # Main function for AprilTag detection and spatial referencing def main_apriltag(): cap = cv2.VideoCapture(0) @@ -175,17 +197,18 @@ def main_apriltag(): if not ret: print("Failed to capture image") break - + tags = detect_apriltags(frame) frame_with_tags = spatial_referencing(frame, tags) - cv2.imshow('AprilTag Spatial Referencing', frame_with_tags) - - if cv2.waitKey(1) & 0xFF == ord('q'): + cv2.imshow("AprilTag Spatial Referencing", frame_with_tags) + + if cv2.waitKey(1) & 0xFF == ord("q"): break - + cap.release() cv2.destroyAllWindows() + if __name__ == "__main__": main_apriltag() ``` @@ -225,7 +248,7 @@ In this section, we'll combine OpenCV image processing techniques with a motoriz #### Demo -βœ… Watch +βœ… Watch [Building the OpenFlexure Microscope](https://www.youtube.com/watch?v=aQEyoch3iuo&ab_channel=TinkerTechTrove) This example code combines OpenCV image processing techniques with simulated microscope functionality to implement automated monitoring of microscopic particles. The main features include: @@ -249,22 +272,36 @@ Adjust the detection parameters in the code based on your specific sample and mi import cv2 import numpy as np from microscope_demo_client import MicroscopeDemo -from my_secrets import HIVEMQ_HOST, HIVEMQ_PASSWORD, HIVEMQ_PORT, HIVEMQ_USERNAME, MICROSCOPE_NAME +from my_secrets import ( + HIVEMQ_HOST, + HIVEMQ_PASSWORD, + HIVEMQ_PORT, + HIVEMQ_USERNAME, + MICROSCOPE_NAME, +) import time from PIL import Image import io -microscope = MicroscopeDemo(HIVEMQ_HOST, HIVEMQ_PORT, HIVEMQ_USERNAME, HIVEMQ_PASSWORD, MICROSCOPE_NAME) +microscope = MicroscopeDemo( + HIVEMQ_HOST, HIVEMQ_PORT, HIVEMQ_USERNAME, HIVEMQ_PASSWORD, MICROSCOPE_NAME +) + def preprocess_image(image): gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) blurred = cv2.GaussianBlur(gray, (5, 5), 0) - thresh = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY_INV, 11, 2) + thresh = cv2.adaptiveThreshold( + blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY_INV, 11, 2 + ) return thresh + def detect_particles(image): processed = preprocess_image(image) - contours, _ = cv2.findContours(processed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE) + contours, _ = cv2.findContours( + processed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE + ) particles = [] for contour in contours: area = cv2.contourArea(contour) @@ -272,54 +309,66 @@ def detect_particles(image): particles.append(contour) return particles + def analyze_growth(previous_count, current_count): - growth_rate = (current_count - previous_count) / previous_count * 100 if previous_count > 0 else 0 + growth_rate = ( + (current_count - previous_count) / previous_count * 100 + if previous_count > 0 + else 0 + ) return growth_rate + def microscope_scan_and_analyze(): print("Starting area scan...") scan_images = microscope.scan([2000, 2000], [-2000, -2000]) - + total_particles = 0 for i, 
img_data in enumerate(scan_images): img = Image.open(io.BytesIO(img_data)) img_cv = cv2.cvtColor(np.array(img), cv2.COLOR_RGB2BGR) - + particles = detect_particles(img_cv) total_particles += len(particles) - + img_with_particles = img_cv.copy() cv2.drawContours(img_with_particles, particles, -1, (0, 255, 0), 2) - - cv2.imshow(f'Scan {i+1}/{len(scan_images)}', img_with_particles) + + cv2.imshow(f"Scan {i+1}/{len(scan_images)}", img_with_particles) cv2.waitKey(1000) - - print(f"Scan image {i+1}/{len(scan_images)}: Detected {len(particles)} microscopic particles") - + + print( + f"Scan image {i+1}/{len(scan_images)}: Detected {len(particles)} microscopic particles" + ) + cv2.destroyAllWindows() return total_particles + def main_microscope(): try: previous_particle_count = 0 while True: microscope.move(0, 0) microscope.focus() - + current_particle_count = microscope_scan_and_analyze() - growth_rate = analyze_growth(previous_particle_count, current_particle_count) - + growth_rate = analyze_growth( + previous_particle_count, current_particle_count + ) + print(f"Total detected microscopic particles: {current_particle_count}") print(f"Growth rate: {growth_rate:.2f}%") - + previous_particle_count = current_particle_count time.sleep(3600) # Wait for 1 hour - + except KeyboardInterrupt: print("Monitoring stopped") finally: microscope.end_connection() + if __name__ == "__main__": main_microscope() ``` @@ -404,4 +453,3 @@ https://q.utoronto.ca/courses/348619/assignments/1385147?display=full_width ::: :::: - diff --git a/docs/courses/robotics/3.6-solid-sample-transfer.md b/docs/courses/robotics/3.6-solid-sample-transfer.md index b5aa64c..f1d7d07 100644 --- a/docs/courses/robotics/3.6-solid-sample-transfer.md +++ b/docs/courses/robotics/3.6-solid-sample-transfer.md @@ -21,19 +21,19 @@ This module revolves around "transfer of solid samples", and completes complex a ### Bill of Materials -- [MyCobot Pi - World's Smallest and Lightest Six-Axis Collaborative Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot) +- [MyCobot Pi - World's Smallest and Lightest Six-Axis Collaborative Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot) A compact and versatile six-axis robot suitable for performing precise sample transfer tasks. -- [Camera Flange 2.0](https://shop.elephantrobotics.com/en-ca/collections/camera-modules/products/camera-flange-2-0) +- [Camera Flange 2.0](https://shop.elephantrobotics.com/en-ca/collections/camera-modules/products/camera-flange-2-0) A camera module attachment for enhanced visual feedback during sample transfer operations. -- [Adaptive Gripper](https://shop.elephantrobotics.com/en-ca/collections/grippers/products/adaptive-gripper) +- [Adaptive Gripper](https://shop.elephantrobotics.com/en-ca/collections/grippers/products/adaptive-gripper) A flexible gripper designed for secure handling and transfer of solid samples during automated processes. -- [G-Shape Base 2.0](https://shop.elephantrobotics.com/en-ca/collections/fixed-bases/products/g-shape-base-2-0) +- [G-Shape Base 2.0](https://shop.elephantrobotics.com/en-ca/collections/fixed-bases/products/g-shape-base-2-0) A stable base for mounting the MyCobot robot, ensuring precision during robotic movements and sample handling. 
-- [Multi-Axis Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot) +- [Multi-Axis Robot](https://shop.elephantrobotics.com/en-ca/collections/mycobot-280/products/mycobot-pi-worlds-smallest-and-lightest-six-axis-collaborative-robot) A robotic arm designed for executing precise sample movements across multiple axes. - Printed AprilTags (can be generated and printed from [AprilTag Generation](https://github.com/AprilRobotics/apriltag-generation)) @@ -44,25 +44,25 @@ This module revolves around "transfer of solid samples", and completes complex a ### Required Software -- [Prefect](https://www.prefect.io/) +- [Prefect](https://www.prefect.io/) A workflow orchestration tool to manage and coordinate asynchronous tasks during the sample transfer process. -- [AprilTags Python Library](https://pypi.org/project/apriltag/) +- [AprilTags Python Library](https://pypi.org/project/apriltag/) Used for spatial referencing and tracking of sample containers using AprilTags for accurate positioning. -- [ROS2 Humble](https://docs.ros.org/en/humble/Installation.html) +- [ROS2 Humble](https://docs.ros.org/en/humble/Installation.html) ROS2 (Robot Operating System) is an open-source framework for building robot applications. **Humble Hawksbill** is the currently recommended version due to its stability and long-term support. -- [Ubuntu 22.04 LTS](https://releases.ubuntu.com/22.04/) +- [Ubuntu 22.04 LTS](https://releases.ubuntu.com/22.04/) Ubuntu 22.04 LTS is the recommended Linux distribution for running ROS2, as it provides a stable and compatible environment. **Ubuntu 24.04** (the latest version) may encounter compatibility issues with ROS2 and other tools. #### Documentation -- [MyCobot Pi Documentation](https://docs.elephantrobotics.com/docs/gitbook-en/2-serialproduct/2.1-280/2.1.2-PI.html) +- [MyCobot Pi Documentation](https://docs.elephantrobotics.com/docs/gitbook-en/2-serialproduct/2.1-280/2.1.2-PI.html) A guide for setting up and controlling the MyCobot robot for sample transfer tasks. -- [Gripper Control via Python](https://docs.elephantrobotics.com/docs/gitbook-en/7-ApplicationBasePython/7.5_gripper.html) +- [Gripper Control via Python](https://docs.elephantrobotics.com/docs/gitbook-en/7-ApplicationBasePython/7.5_gripper.html) Detailed instructions for using Python to control the adaptive gripper during the sample transfer process. @@ -123,17 +123,19 @@ AprilTags are used to identify different stations in our setup. 
Here's an exampl import cv2 from pupil_apriltags import Detector + def detect_apriltags(image): gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) - detector = Detector(families='tag36h11') + detector = Detector(families="tag36h11") results = detector.detect(gray) - + for r in results: # Extract tag ID and pose information tag_id = r.tag_id pose = r.pose_t print(f"Detected AprilTag ID: {tag_id} at position: {pose}") + # Usage cap = cv2.VideoCapture(0) ret, frame = cap.read() @@ -149,6 +151,7 @@ Here's an example of how to control the MyCobot to pick up a sample: ```python from pymycobot.mycobot import MyCobot + def pickup_sample(mycobot, x, y, z): # Move to the sample position mycobot.send_coords([x, y, z, 0, 0, 0], 50, 0) @@ -157,6 +160,7 @@ def pickup_sample(mycobot, x, y, z): # Lift the sample mycobot.send_coords([x, y, z + 50, 0, 0, 0], 50, 0) + # Usage mycobot = MyCobot("/dev/ttyAMA0") pickup_sample(mycobot, 100, 100, 50) @@ -170,20 +174,22 @@ Here's a simple example of AGV navigation using ROS: import rospy from geometry_msgs.msg import Twist + def move_agv(linear_speed, angular_speed, duration): - pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10) - rospy.init_node('agv_controller', anonymous=True) + pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10) + rospy.init_node("agv_controller", anonymous=True) rate = rospy.Rate(10) # 10hz - + twist = Twist() twist.linear.x = linear_speed twist.angular.z = angular_speed - + start_time = rospy.Time.now() while (rospy.Time.now() - start_time).to_sec() < duration: pub.publish(twist) rate.sleep() + # Usage move_agv(0.2, 0, 5) # Move forward for 5 seconds ``` @@ -203,6 +209,7 @@ class LiquidHandler: # Code to control the liquid handler pass + # Usage liquid_handler = LiquidHandler() liquid_handler.dispense_liquid(5) @@ -223,6 +230,7 @@ class PowderDispenser: # Code to control the powder dispenser pass + # Usage powder_dispenser = PowderDispenser() powder_dispenser.dispense_powder(2) @@ -235,46 +243,55 @@ Now, we can use Prefect to orchestrate all these steps: ```python from prefect import task, Flow + @task def identify_station(): # Use the AprilTag detection code here pass + @task def pickup_sample(): # Use the MyCobot control code here pass + @task def move_to_liquid_station(): # Use the AGV navigation code here pass + @task def add_liquid(): liquid_handler = LiquidHandler() liquid_handler.dispense_liquid(5) + @task def move_to_powder_station(): # Use the AGV navigation code here pass + @task def add_powder(): powder_dispenser = PowderDispenser() powder_dispenser.dispense_powder(2) + @task def move_to_final_station(): # Use the AGV navigation code here pass + @task def place_sample(): # Use the MyCobot control code here to place the sample pass + with Flow("Solid Sample Transfer and Processing") as flow: station = identify_station() sample = pickup_sample() @@ -316,11 +333,13 @@ import pymongo from pymongo import MongoClient # Connect to MongoDB -client = MongoClient('mongodb://localhost:27017/') # Replace with your MongoDB connection string +client = MongoClient( + "mongodb://localhost:27017/" +) # Replace with your MongoDB connection string # Select database and collection -db = client['your_database_name'] -collection = db['your_collection_name'] +db = client["your_database_name"] +collection = db["your_collection_name"] # Insert a document document = {"course_id": "DEMO101", "data": "Some sample data"} @@ -347,7 +366,7 @@ print(f"Deleted {delete_result.deleted_count} document(s)") client.close() ``` -### Tips +### Tips When dealing 
with multiple robots, deciding which parts should be implemented in ROS2 and which in Prefect is indeed an important consideration. Here are some factors to consider and recommendations: ROS2 (Robot Operating System 2): 1. Real-time control and communication: ROS2 is better suited for handling real-time control and low-latency communication needs of robots. @@ -385,9 +404,11 @@ import logging from prefect import task from datetime import timedelta + class LiquidHandlerError(Exception): pass + class LiquidHandler: def __init__(self): self.logger = logging.getLogger(__name__) @@ -405,6 +426,7 @@ class LiquidHandler: # Other methods like _check_volume, _check_liquid_level, _dispense... + @task(max_retries=3, retry_delay=timedelta(seconds=5)) def add_liquid(volume_ml: float): liquid_handler = LiquidHandler() @@ -445,6 +467,7 @@ from datetime import timedelta import time import random + @task(cache_key_fn=task_input_hash, cache_expiration=timedelta(hours=1)) def robot_task(task_id): # Simulate a robot task that might fail @@ -453,17 +476,19 @@ def robot_task(task_id): time.sleep(2) # Simulate task duration return f"Task {task_id} completed successfully" + @task def notify_operator(task_id): print(f"ALERT: Task {task_id} has failed. An operator needs to check the robot.") # In a real scenario, this could send an email, SMS, or trigger an alert system operator_response = input("Has the issue been resolved? (yes/no): ") - return operator_response.lower() == 'yes' + return operator_response.lower() == "yes" + @flow def robot_workflow(): tasks = [robot_task.submit(i) for i in range(5)] - + for i, task in enumerate(tasks): try: result = task.result() @@ -476,11 +501,12 @@ def robot_workflow(): if not resolved: print("Please resolve the issue before continuing.") time.sleep(5) # Wait before checking again - + # Retry the task after the operator has resolved the issue retry_result = robot_task(i) print(f"Retry result: {retry_result}") + if __name__ == "__main__": robot_workflow() ``` @@ -503,7 +529,7 @@ This workflow demonstrates how Prefect can be used to: - Retry tasks after issues have been resolved This approach ensures that the workflow can continue even when unexpected issues occur, by bringing in human expertise to resolve problems that the automated system can't handle on its own. - + ### 3. Precision and Accuracy: - Regularly calibrate the robot and sensors to maintain high accuracy. @@ -526,33 +552,36 @@ Here's an example of how you might integrate your automated system with a cloud- import boto3 from prefect import task + class CloudLIMS: def __init__(self): - self.dynamodb = boto3.resource('dynamodb') - self.table = self.dynamodb.Table('SampleData') + self.dynamodb = boto3.resource("dynamodb") + self.table = self.dynamodb.Table("SampleData") def log_sample_data(self, sample_id, operation, result): self.table.put_item( Item={ - 'SampleID': sample_id, - 'Operation': operation, - 'Result': result, - 'Timestamp': datetime.now().isoformat() + "SampleID": sample_id, + "Operation": operation, + "Result": result, + "Timestamp": datetime.now().isoformat(), } ) + @task def process_sample(sample_id): lims = CloudLIMS() - + # Perform sample processing steps... 
- + # Log the results to the cloud LIMS - lims.log_sample_data(sample_id, 'LiquidAddition', 'Success') - lims.log_sample_data(sample_id, 'PowderAddition', 'Success') + lims.log_sample_data(sample_id, "LiquidAddition", "Success") + lims.log_sample_data(sample_id, "PowderAddition", "Success") return f"Sample {sample_id} processed successfully" + # Include this task in your Prefect flow with Flow("Sample Processing") as flow: sample_id = "SAMPLE001" @@ -618,8 +647,3 @@ https://q.utoronto.ca/courses/348619/assignments/1385149?display=full_width ::: :::: - - - - - diff --git a/docs/courses/robotics/index.md b/docs/courses/robotics/index.md index e29e335..15b6553 100644 --- a/docs/courses/robotics/index.md +++ b/docs/courses/robotics/index.md @@ -64,5 +64,3 @@ Waitlist {octicon}`link-external;1em` ``` - - diff --git a/docs/courses/software-dev/4.1-deep-dive-into-github.md b/docs/courses/software-dev/4.1-deep-dive-into-github.md index 0b5484b..262de9c 100644 --- a/docs/courses/software-dev/4.1-deep-dive-into-github.md +++ b/docs/courses/software-dev/4.1-deep-dive-into-github.md @@ -141,7 +141,8 @@ git commit -m "Resolve merge conflict in user authentication" ``` #### Example of a Merge Conflict: -```python + +``` <<<<<<< HEAD def authenticate_user(username, password): # Implementation using bcrypt @@ -213,4 +214,4 @@ https://q.utoronto.ca/courses/370070/assignments/1393622?display=full_width https://q.utoronto.ca/courses/370070/assignments/1393619?display=full_width ::: -:::: \ No newline at end of file +:::: diff --git a/docs/courses/software-dev/4.2-vscode-setup.md b/docs/courses/software-dev/4.2-vscode-setup.md index 4161e29..8973ada 100644 --- a/docs/courses/software-dev/4.2-vscode-setup.md +++ b/docs/courses/software-dev/4.2-vscode-setup.md @@ -8,7 +8,7 @@ ## πŸ”° Tutorial -In this module, you will learn how to set up VS Code and optimize it for Python development using tools like Miniconda, various extensions, and advanced features. +In this module, you will learn how to set up VS Code and optimize it for Python development using tools like Miniconda, various extensions, and advanced features. 1. Set up VS Code 2. 
Install Miniconda for environment management @@ -109,7 +109,7 @@ try: while True: GPIO.output(18, GPIO.HIGH) # Turn on time.sleep(1) - GPIO.output(18, GPIO.LOW) # Turn off + GPIO.output(18, GPIO.LOW) # Turn off time.sleep(1) except KeyboardInterrupt: GPIO.cleanup() # Clean up on exit @@ -184,4 +184,4 @@ https://q.utoronto.ca/courses/370070/assignments/1393623?display=full_width https://q.utoronto.ca/courses/370070/assignments/1394226?display=full_width ::: -:::: \ No newline at end of file +:::: diff --git a/docs/courses/software-dev/4.3-vscode-debugging.md b/docs/courses/software-dev/4.3-vscode-debugging.md index 3a8ec15..fa5c82c 100644 --- a/docs/courses/software-dev/4.3-vscode-debugging.md +++ b/docs/courses/software-dev/4.3-vscode-debugging.md @@ -210,4 +210,4 @@ https://q.utoronto.ca/courses/370070/assignments/1393624?display=full_width https://q.utoronto.ca/courses/370070/assignments/1394227?display=full_width ::: -:::: \ No newline at end of file +:::: diff --git a/docs/courses/software-dev/4.4-unit-testing.md b/docs/courses/software-dev/4.4-unit-testing.md index 798d4c5..f98448b 100644 --- a/docs/courses/software-dev/4.4-unit-testing.md +++ b/docs/courses/software-dev/4.4-unit-testing.md @@ -66,6 +66,7 @@ def mix_colors(color1, color2): else: return "unknown" + # test_light_mixing.py def test_mix_colors(): assert mix_colors("red", "blue") == "purple" diff --git a/docs/courses/software-dev/4.5-automated-docs.md b/docs/courses/software-dev/4.5-automated-docs.md index b01ac0e..6f1cc24 100644 --- a/docs/courses/software-dev/4.5-automated-docs.md +++ b/docs/courses/software-dev/4.5-automated-docs.md @@ -18,7 +18,7 @@ Learning Objectives: ### Writing Documentation in Markdown -Markdown is a lightweight markup language that uses plain text formatting to create formatted documents. It is widely used for writing documentation because of its simplicity and ease of use. +Markdown is a lightweight markup language that uses plain text formatting to create formatted documents. It is widely used for writing documentation because of its simplicity and ease of use. #### Why Markdown? - **Simplicity**: Markdown files are easy to read and write without the need for complex syntax. @@ -142,13 +142,14 @@ Sphinx is a documentation generator that converts reStructuredText or Markdown f ```python import os import sys - sys.path.insert(0, os.path.abspath('../src')) + + sys.path.insert(0, os.path.abspath("../src")) ``` 4. Add Extensions: To use Markdown files or automate docstring extraction, add extensions in `conf.py`: ```python - extensions = ['sphinx.ext.autodoc', 'myst_parser'] + extensions = ["sphinx.ext.autodoc", "myst_parser"] ``` 5. Build the Documentation: @@ -169,11 +170,11 @@ Readthedocs is a popular platform for hosting and automatically building documen 2. Connect Your Repository: After signing in, connect your GitHub repository with Readthedocs. 3. Add a `docs/` folder: Ensure that your project contains a `docs/` folder with the Sphinx configuration files (`conf.py`, `index.rst` or `index.md`). - + 4. Set Up Readthedocs: - In your Readthedocs dashboard, import the project by selecting your GitHub repository. - Follow the prompts to configure your build settings (e.g., Python version, Sphinx builder). - + 5. Build the Documentation: - Click "Build" in Readthedocs to generate the documentation. - Once the build is complete, your documentation will be available at a public URL. 
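If you prefer to keep the build settings in the repository rather than in the dashboard, Read the Docs also reads a `.readthedocs.yaml` file from the project root. The sketch below is a minimal example; the Python version and the `docs/` paths are assumptions that should be adjusted to match your project layout:

```yaml
version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.11"

sphinx:
  configuration: docs/conf.py   # path to the Sphinx conf.py

python:
  install:
    - requirements: docs/requirements.txt   # docs build dependencies
```

Commit the file to the repository and Read the Docs will apply it on the next build.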
@@ -221,4 +222,3 @@ https://q.utoronto.ca/courses/370070/assignments/1394233?display=full_width ::: :::: - diff --git a/docs/courses/software-dev/4.6-continuous-integration.md b/docs/courses/software-dev/4.6-continuous-integration.md index 1fbca97..acac696 100644 --- a/docs/courses/software-dev/4.6-continuous-integration.md +++ b/docs/courses/software-dev/4.6-continuous-integration.md @@ -23,7 +23,7 @@ Continuous Integration (CI) is a software development practice where code change - **Consistent Code Quality**: CI tools run linters and other static analysis tools to enforce code quality standards. - **Faster Development**: Automating testing and integration allows developers to focus on writing code rather than manually running tests. - **Team Collaboration**: By frequently integrating code, CI reduces the chances of conflicts when multiple team members are working on the same codebase. - + Video Tutorial: [Continuous Integration Explained](https://www.youtube.com/watch?v=1er2cjUq1UI) ### Setting Up a GitHub Actions Workflow @@ -96,7 +96,7 @@ The CI process often includes running automated tests and building documentation #### Running Unit Tests on GitHub Actions: 1. **Install Testing Framework**: Make sure that `pytest` or any other testing framework you are using is included in your `requirements.txt` file. - + 2. **Define Test Step**: In the workflow, include a step to run the tests: ```yaml @@ -108,7 +108,7 @@ The CI process often includes running automated tests and building documentation #### Building Documentation on GitHub Actions: 1. **Install Sphinx**: Ensure that `Sphinx` and its dependencies are included in your `requirements.txt` file. - + 2. **Add Documentation Build Step**: In the workflow, add a step to build the documentation: ```yaml @@ -163,4 +163,4 @@ https://q.utoronto.ca/courses/370070/assignments/1393628?display=full_width https://q.utoronto.ca/courses/370070/assignments/1394236?display=full_width ::: -:::: \ No newline at end of file +:::: diff --git a/docs/courses/software-dev/4.9-cloud-simulations.md b/docs/courses/software-dev/4.9-cloud-simulations.md index 8caffa8..f240e88 100644 --- a/docs/courses/software-dev/4.9-cloud-simulations.md +++ b/docs/courses/software-dev/4.9-cloud-simulations.md @@ -261,4 +261,4 @@ https://q.utoronto.ca/courses/370070/assignments/1393632?display=full_width https://q.utoronto.ca/courses/370070/assignments/1394239?display=full_width ::: -:::: \ No newline at end of file +:::: From a56998d93e4bafbf1c6c4874c6daa41c2d6ce065 Mon Sep 17 00:00:00 2001 From: "Sterling G. Baird" Date: Wed, 9 Oct 2024 11:23:29 -0400 Subject: [PATCH 3/3] convert to ipynb --- .../{1.5.1-aws-lambda-read.json => 1.5.1-aws-lambda-read.ipynb} | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename docs/courses/hello-world/{1.5.1-aws-lambda-read.json => 1.5.1-aws-lambda-read.ipynb} (100%) diff --git a/docs/courses/hello-world/1.5.1-aws-lambda-read.json b/docs/courses/hello-world/1.5.1-aws-lambda-read.ipynb similarity index 100% rename from docs/courses/hello-world/1.5.1-aws-lambda-read.json rename to docs/courses/hello-world/1.5.1-aws-lambda-read.ipynb