# DimOS

## Installation

Clone the repo:

```bash
git clone -b main --single-branch git@github.com:dimensionalOS/dimos.git
cd dimos
```

### System dependencies

Tested on Ubuntu 22.04/24.04.

```bash
sudo apt update
sudo apt install git-lfs python3-venv python3-pyaudio portaudio19-dev libturbojpeg0-dev
```

### Python dependencies

Install `uv` by [following their instructions](https://docs.astral.sh/uv/getting-started/installation/) or just run:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Install the Python dependencies:

```bash
uv sync
```

Depending on what you want to test, you may also want to install the optional dependency groups (recommended):

```bash
uv sync --extra dev --extra cpu --extra sim --extra drone
```

### Install Foxglove Studio (robot visualization and control)

> **Note:** This will be obsolete once we finish our migration to open source [Rerun](https://rerun.io/).

Download and install [Foxglove Studio](https://foxglove.dev/download):

```bash
wget https://get.foxglove.dev/desktop/latest/foxglove-studio-latest-linux-amd64.deb
sudo apt install ./foxglove-studio-*.deb
```

[Register an account](https://app.foxglove.dev/signup) to use it.

Open Foxglove Studio:

```bash
foxglove-studio
```

To connect and load our dashboard:

1. Click on "Open connection"
2. In the popup window, leave the WebSocket URL as `ws://localhost:8765` and click "Open"
3. In the top right, click on the "Default" dropdown, then "Import from file..."
4. Navigate to the `dimos` repo and select `assets/foxglove_dashboards/unitree.json`

### Test the install

Run the Python tests:

```bash
uv run pytest dimos
```

They should all pass in about 3 minutes.
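
While iterating, you usually don't need the full suite. These are standard pytest options, not dimos-specific flags (the keyword `replay` below is only an illustration; substitute any test name):

```bash
# Run only tests whose names match a keyword expression
uv run pytest dimos -k "replay"

# Stop at the first failure, with verbose output
uv run pytest dimos -x -v
```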

### Test a robot replay

Run the system by playing back recorded data from a robot (the replay data is automatically downloaded via Git LFS):

```bash
uv run dimos --replay run unitree-go2-basic
```

You can visualize the robot data in Foxglove Studio.

### Run a simulation

```bash
uv run dimos --simulation run unitree-go2-basic
```

This will open a MuJoCo simulation window. You can also visualize data in Foxglove.

If you also want to teleoperate the simulated robot, run:

```bash
uv run dimos --simulation run unitree-go2-basic --extra-module keyboard_teleop
```

This will also open a Keyboard Teleop window. Focus the window and use WASD to control the robot.

### Command center

You can also control the robot from the `command-center` Foxglove extension.

First, pull the LFS file:

```bash
git lfs pull --include="assets/dimensional.command-center-extension-0.0.1.foxe"
```

To install it, drag the file onto the Foxglove Studio window; the extension is installed automatically. Then click the "Add panel" icon in the top right and add "command-center".

You can now click on the map to give the robot a travel goal, or click "Start Keyboard Control" to teleoperate it.

### Using `dimos` in your code

If you want to use dimos in your own project (rather than in the cloned repo), you can install it as a dependency:

```bash
uv add dimos
```
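
You can then sanity-check that the package resolves in your project's environment. This assumes the import name matches the distribution name `dimos`:

```bash
# Fails with an ImportError if the package did not install correctly
uv run python -c "import dimos"
```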

Note that a few dependencies do not have PyPI packages and need to be installed from their Git repositories. These are only required for specific features:

- **CLIP** and **detectron2**: required for the Detic open-vocabulary object detector
- **contact_graspnet_pytorch**: required for robotic grasp prediction

You can install them with:

```bash
uv add git+https://github.com/openai/CLIP.git
uv add git+https://github.com/dimensionalOS/contact_graspnet_pytorch.git
uv add git+https://github.com/facebookresearch/detectron2.git
```