Merge
SmartManoj committed Oct 30, 2024
2 parents: 36b1f07 + 9c2b48f · commit: 0618511
Showing 52 changed files with 3,177 additions and 2,610 deletions.
48 changes: 47 additions & 1 deletion .github/workflows/ghcr-build.yml
@@ -1,5 +1,5 @@
# Workflow that builds, tests and then pushes the OpenHands and runtime docker images to the ghcr.io repository
name: Build, Test and Publish RT Image
name: Docker

# Always run on "main"
# Always run on tags
@@ -399,3 +399,49 @@ jobs:
run: |
echo "Some runtime tests failed or were cancelled"
exit 1
update_pr_description:
name: Update PR Description
if: github.event_name == 'pull_request'
needs: [ghcr_build_runtime]
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4

- name: Get short SHA
id: short_sha
run: echo "SHORT_SHA=$(echo ${{ github.event.pull_request.head.sha }} | cut -c1-7)" >> $GITHUB_OUTPUT

- name: Update PR Description
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_NUMBER: ${{ github.event.pull_request.number }}
REPO: ${{ github.repository }}
SHORT_SHA: ${{ steps.short_sha.outputs.SHORT_SHA }}
run: |
echo "updating PR description"
DOCKER_RUN_COMMAND="docker run -it --rm \
-p 3000:3000 \
-v /var/run/docker.sock:/var/run/docker.sock \
--add-host host.docker.internal:host-gateway \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:$SHORT_SHA-nikolaik \
--name openhands-app-$SHORT_SHA \
ghcr.io/all-hands-ai/runtime:$SHORT_SHA"
PR_BODY=$(gh pr view $PR_NUMBER --json body --jq .body)
if echo "$PR_BODY" | grep -q "To run this PR locally, use the following command:"; then
UPDATED_PR_BODY=$(echo "${PR_BODY}" | sed -E "s|docker run -it --rm.*|$DOCKER_RUN_COMMAND|")
else
UPDATED_PR_BODY="${PR_BODY}
---
To run this PR locally, use the following command:
\`\`\`
$DOCKER_RUN_COMMAND
\`\`\`"
fi
echo "updated body: $UPDATED_PR_BODY"
gh pr edit $PR_NUMBER --body "$UPDATED_PR_BODY"
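
For illustration, if the PR head SHA shortened to the hypothetical value `abc1234` (a placeholder, not a SHA from this commit), the command this job embeds in the PR description would expand to roughly:

```bash
# Sketch of the expanded DOCKER_RUN_COMMAND from the workflow step above;
# abc1234 stands in for the real short SHA of the PR head commit
docker run -it --rm \
    -p 3000:3000 \
    -v /var/run/docker.sock:/var/run/docker.sock \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:abc1234-nikolaik \
    --name openhands-app-abc1234 \
    ghcr.io/all-hands-ai/runtime:abc1234
```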
28 changes: 12 additions & 16 deletions README.md
@@ -95,37 +95,33 @@ Learn more at [docs.all-hands.dev](https://docs.all-hands.dev), or jump to the [

## ⚡ Quick Start

The easiest way to run OpenHands is in Docker. You can change `WORKSPACE_BASE` below to
point OpenHands to existing code that you'd like to modify.

The easiest way to run OpenHands is in Docker.
See the [Installation](https://docs.all-hands.dev/modules/usage/installation) guide for
system requirements and more information.

```bash
export WORKSPACE_BASE=$(pwd)/workspace

docker pull ghcr.io/all-hands-ai/runtime:0.11-nikolaik
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.11-nikolaik

docker run -it --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:0.11-nikolaik \
-e SANDBOX_USER_ID=$(id -u) \
-e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
-v $WORKSPACE_BASE:/opt/workspace_base \
docker run -it --rm --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.11-nikolaik \
-v /var/run/docker.sock:/var/run/docker.sock \
-p 3000:3000 \
--add-host host.docker.internal:host-gateway \
--name openhands-app-$(date +%Y%m%d%H%M%S) \
ghcr.io/all-hands-ai/openhands:0.11
--name openhands-app \
docker.all-hands.dev/all-hands-ai/openhands:0.11
```

You'll find OpenHands running at [http://localhost:3000](http://localhost:3000)!

You'll need a model provider and API key. One option that works well: [Claude 3.5 Sonnet](https://www.anthropic.com/api), but you have [many options](https://docs.all-hands.dev/modules/usage/llms).
Finally, you'll need a model provider and API key.
[Anthropic's Claude 3.5 Sonnet](https://www.anthropic.com/api) (`anthropic/claude-3-5-sonnet-20241022`)
works best, but you have [many options](https://docs.all-hands.dev/modules/usage/llms).
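
The headless and CLI guides updated in this same commit express the same choice as environment variables; a minimal sketch, assuming an Anthropic model and a placeholder key:

```bash
# Placeholder values -- substitute your own provider, model, and API key
export LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"
export LLM_API_KEY="sk-XXX"
```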

---

You can also run OpenHands in a scriptable [headless mode](https://docs.all-hands.dev/modules/usage/how-to/headless-mode),
or as an [interactive CLI](https://docs.all-hands.dev/modules/usage/how-to/cli-mode).
You can also [connect OpenHands to your local filesystem](https://docs.all-hands.dev/modules/usage/runtimes),
run OpenHands in a scriptable [headless mode](https://docs.all-hands.dev/modules/usage/how-to/headless-mode),
or interact with it via a [friendly CLI](https://docs.all-hands.dev/modules/usage/how-to/cli-mode).

Visit [Installation](https://docs.all-hands.dev/modules/usage/installation) for more information and setup instructions.

1 change: 1 addition & 0 deletions containers/app/Dockerfile
@@ -41,6 +41,7 @@ ENV SANDBOX_LOCAL_RUNTIME_URL=http://host.docker.internal
ENV USE_HOST_NETWORK=false
ENV WORKSPACE_BASE=/opt/workspace_base
ENV OPENHANDS_BUILD_VERSION=$OPENHANDS_BUILD_VERSION
ENV SANDBOX_USER_ID=0
RUN mkdir -p $WORKSPACE_BASE

RUN apt-get update -y \
5 changes: 5 additions & 0 deletions containers/app/entrypoint.sh
@@ -18,6 +18,11 @@ if [ -z "$SANDBOX_USER_ID" ]; then
exit 1
fi

if [ -z "$WORKSPACE_MOUNT_PATH" ]; then
# This is set to /opt/workspace in the Dockerfile. But if the user isn't mounting, we want to unset it so that OpenHands doesn't mount at all
unset WORKSPACE_BASE
fi

if [[ "$SANDBOX_USER_ID" -eq 0 ]]; then
echo "Running OpenHands as root"
export RUN_AS_OPENHANDS=false
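
With this change, `WORKSPACE_BASE` stays set only when the caller actually supplies `WORKSPACE_MOUNT_PATH`. A sketch of opting back into a mounted workspace, assembled from the flags this commit removes from the README (the paths and the 0.11 tag are examples):

```bash
# Mount a local ./workspace directory into the app container so that
# WORKSPACE_MOUNT_PATH is set and the entrypoint keeps WORKSPACE_BASE
export WORKSPACE_BASE=$(pwd)/workspace

docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.11-nikolaik \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.11
```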
@@ -37,7 +37,7 @@ WORKSPACE_BASE=$(pwd)/workspace
2. Set `LLM_MODEL` to the model you want to use:

```bash
LLM_MODEL="anthropic/claude-3-5-sonnet-20240620"
LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"
```

3. Set `LLM_API_KEY` to your API key:
@@ -14,7 +14,8 @@ Voici un exemple de fichier de configuration que vous pouvez utiliser pour défi
```toml
[llm]
# IMPORTANT: add your API key here and set the model you want to evaluate
model = "claude-3-5-sonnet-20240620"
model = "claude-3-5-sonnet-20241022"

api_key = "sk-XXX"

[llm.eval_gpt4_1106_preview_llm]
@@ -278,3 +279,4 @@ Cette fonction fait ce qui suit :
3. If the agent has made multiple attempts, it is given the option to give up

By using this function, you can ensure consistent behavior across multiple evaluation runs and prevent the agent from getting stuck waiting for human input.

@@ -31,7 +31,7 @@ WORKSPACE_BASE=$(pwd)/workspace
2. Set `LLM_MODEL` to the model you want to use:

```bash
LLM_MODEL="anthropic/claude-3-5-sonnet-20240620"
LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"
```

3. Set `LLM_API_KEY` to your API key:
@@ -36,7 +36,7 @@ WORKSPACE_BASE=$(pwd)/workspace
2. Set `LLM_MODEL` to the model you want to use:

```bash
LLM_MODEL="anthropic/claude-3-5-sonnet-20240620"
LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"
```

3. Set `LLM_API_KEY` to your API key:
@@ -12,7 +12,7 @@
```toml
[llm]
# IMPORTANT: add your API key here, and set the model to the one you want to evaluate
model = "claude-3-5-sonnet-20240620"
model = "claude-3-5-sonnet-20241022"
api_key = "sk-XXX"

[llm.eval_gpt4_1106_preview_llm]
@@ -32,7 +32,8 @@ WORKSPACE_BASE=$(pwd)/workspace
2. Set `LLM_MODEL` to the model you want to use:

```bash
LLM_MODEL="anthropic/claude-3-5-sonnet-20240620"
LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"

```

3. Set `LLM_API_KEY` to your API key:
@@ -57,3 +58,4 @@ docker run -it \
ghcr.io/all-hands-ai/openhands:0.11 \
python -m openhands.core.main -t "write a bash script that prints hi"
```

6 changes: 4 additions & 2 deletions docs/modules/usage/how-to/cli-mode.md
@@ -35,7 +35,8 @@ WORKSPACE_BASE=$(pwd)/workspace
2. Set `LLM_MODEL` to the model you want to use:

```bash
LLM_MODEL="anthropic/claude-3-5-sonnet-20240620"
LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"

```

3. Set `LLM_API_KEY` to your API key:
@@ -57,7 +58,7 @@ docker run -it \
-v /var/run/docker.sock:/var/run/docker.sock \
--add-host host.docker.internal:host-gateway \
--name openhands-app-$(date +%Y%m%d%H%M%S) \
ghcr.io/all-hands-ai/openhands:0.11 \
docker.all-hands.dev/all-hands-ai/openhands:0.11 \
python -m openhands.core.cli
```

@@ -106,3 +107,4 @@ Expected Output:
```bash
🤖 An error occurred. Please try again.
```

2 changes: 1 addition & 1 deletion docs/modules/usage/how-to/evaluation-harness.md
@@ -12,7 +12,7 @@ Here's an example configuration file you can use to define and use multiple LLMs
```toml
[llm]
# IMPORTANT: add your API key here, and set the model to the one you want to evaluate
model = "claude-3-5-sonnet-20240620"
model = "claude-3-5-sonnet-20241022"
api_key = "sk-XXX"

[llm.eval_gpt4_1106_preview_llm]
6 changes: 4 additions & 2 deletions docs/modules/usage/how-to/headless-mode.md
@@ -29,7 +29,8 @@ WORKSPACE_BASE=$(pwd)/workspace
2. Set `LLM_MODEL` to the model you want to use:

```bash
LLM_MODEL="anthropic/claude-3-5-sonnet-20240620"
LLM_MODEL="anthropic/claude-3-5-sonnet-20241022"

```

3. Set `LLM_API_KEY` to your API key:
@@ -51,6 +52,7 @@ docker run -it \
-v /var/run/docker.sock:/var/run/docker.sock \
--add-host host.docker.internal:host-gateway \
--name openhands-app-$(date +%Y%m%d%H%M%S) \
ghcr.io/all-hands-ai/openhands:0.11 \
docker.all-hands.dev/all-hands-ai/openhands:0.11 \
python -m openhands.core.main -t "write a bash script that prints hi"
```

12 changes: 6 additions & 6 deletions docs/modules/usage/how-to/openshift-example.md
@@ -150,7 +150,7 @@ metadata:
spec:
containers:
- name: openhands-app-2024
image: ghcr.io/all-hands-ai/openhands:main
image: docker.all-hands.dev/all-hands-ai/openhands:main
env:
- name: SANDBOX_USER_ID
value: "1000"
@@ -164,7 +164,7 @@ spec:
ports:
- containerPort: 3000
- name: openhands-sandbox-2024
image: ghcr.io/all-hands-ai/sandbox:main
image: docker.all-hands.dev/all-hands-ai/runtime:main
ports:
- containerPort: 51963
command: ["/usr/sbin/sshd", "-D", "-p 51963", "-o", "PermitRootLogin=yes"]
@@ -205,10 +205,10 @@ LAST SEEN TYPE REASON OBJECT
9s Normal SuccessfulAttachVolume pod/openhands-app-2024 AttachVolume.Attach succeeded for volume "pvc-2b1d223a-1c8f-4990-8e3d-68061a9ae252"
9s Normal SuccessfulAttachVolume pod/openhands-app-2024 AttachVolume.Attach succeeded for volume "pvc-31f15b25-faad-4665-a25f-201a530379af"
6s Normal AddedInterface pod/openhands-app-2024 Add eth0 [10.128.2.48/23] from openshift-sdn
6s Normal Pulled pod/openhands-app-2024 Container image "ghcr.io/all-hands-ai/openhands:main" already present on machine
6s Normal Pulled pod/openhands-app-2024 Container image "docker.all-hands.dev/all-hands-ai/openhands:main" already present on machine
6s Normal Created pod/openhands-app-2024 Created container openhands-app-2024
6s Normal Started pod/openhands-app-2024 Started container openhands-app-2024
6s Normal Pulled pod/openhands-app-2024 Container image "ghcr.io/all-hands-ai/sandbox:main" already present on machine
6s Normal Pulled pod/openhands-app-2024 Container image "docker.all-hands.dev/all-hands-ai/sandbox:main" already present on machine
5s Normal Created pod/openhands-app-2024 Created container openhands-sandbox-2024
5s Normal Started pod/openhands-app-2024 Started container openhands-sandbox-2024
83s Normal WaitForFirstConsumer persistentvolumeclaim/workspace-pvc waiting for first consumer to be created before binding
@@ -334,7 +334,7 @@ spec:
spec:
containers:
- name: openhands-app-2024
image: ghcr.io/all-hands-ai/openhands:main
image: docker.all-hands.dev/all-hands-ai/openhands:main
env:
- name: SANDBOX_USER_ID
value: "1000"
Expand All @@ -356,7 +356,7 @@ spec:
ports:
- containerPort: 3000
- name: openhands-sandbox-2024
image: ghcr.io/opendevin/sandbox:main
image: docker.all-hands.dev/all-hands-ai/runtime:main
# securityContext:
# privileged: true # Add this to allow privileged access
ports:
25 changes: 8 additions & 17 deletions docs/modules/usage/installation.mdx
@@ -8,24 +8,18 @@

## Start the app

The easiest way to run OpenHands is in Docker. You can change `WORKSPACE_BASE` below to point OpenHands to
existing code that you'd like to modify.
The easiest way to run OpenHands is in Docker.

```bash
export WORKSPACE_BASE=$(pwd)/workspace
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.11-nikolaik

docker pull ghcr.io/all-hands-ai/runtime:0.11-nikolaik

docker run -it --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:0.11-nikolaik \
-e SANDBOX_USER_ID=$(id -u) \
-e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
-v $WORKSPACE_BASE:/opt/workspace_base \
docker run -it --rm --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.11-nikolaik \
-v /var/run/docker.sock:/var/run/docker.sock \
-p 3000:3000 \
--add-host host.docker.internal:host-gateway \
--name openhands-app-$(date +%Y%m%d%H%M%S) \
ghcr.io/all-hands-ai/openhands:0.11
--name openhands-app \
docker.all-hands.dev/all-hands-ai/openhands:0.11
```

You can also run OpenHands in a scriptable [headless mode](https://docs.all-hands.dev/modules/usage/how-to/headless-mode), as an [interactive CLI](https://docs.all-hands.dev/modules/usage/how-to/cli-mode), or using the [OpenHands GitHub Action](https://docs.all-hands.dev/modules/usage/how-to/github-action).
@@ -34,9 +28,6 @@ You can also run OpenHands in a scriptable [headless mode](https://docs.all-hand

After running the command above, you'll find OpenHands running at [http://localhost:3000](http://localhost:3000).

The agent will have access to the `./workspace` folder to do its work. You can copy existing code here, or change `WORKSPACE_BASE` in the
command to point to an existing folder.

Upon launching OpenHands, you'll see a settings modal. You **must** select an `LLM Provider` and `LLM Model` and enter a corresponding `API Key`.
These can be changed at any time by selecting the `Settings` button (gear icon) in the UI.

@@ -52,9 +43,9 @@ The `Advanced Options` also allow you to specify a `Base URL` if required.
## Versions

The command above pulls the most recent stable release of OpenHands. You have other options as well:
- For a specific release, use `ghcr.io/all-hands-ai/openhands:$VERSION`, replacing $VERSION with the version number.
- For a specific release, use `docker.all-hands.dev/all-hands-ai/openhands:$VERSION`, replacing $VERSION with the version number.
- We use semver, and release major, minor, and patch tags. So `0.9` will automatically point to the latest `0.9.x` release, and `0` will point to the latest `0.x.x` release.
- For the most up-to-date development version, you can use `ghcr.io/all-hands-ai/openhands:main`. This version is unstable and is recommended for testing or development purposes only.
- For the most up-to-date development version, you can use `docker.all-hands.dev/all-hands-ai/openhands:main`. This version is unstable and is recommended for testing or development purposes only.

You can choose the tag that best suits your needs based on stability requirements and desired features.
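
As a sketch of the tagging scheme described above (the image names are the ones used elsewhere in this commit; which patch release each tag resolves to depends on the current release):

```bash
docker pull docker.all-hands.dev/all-hands-ai/openhands:0.11   # latest 0.11.x patch release
docker pull docker.all-hands.dev/all-hands-ai/openhands:0      # latest 0.x.x release
docker pull docker.all-hands.dev/all-hands-ai/openhands:main   # unstable development build
```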
