LOCKhart07/mech-predict
Autonolas AI Mechs
License: Apache-2.0 Framework: Open Autonomy 0.21.16

This repository contains an AI Mech for the Predict Agent Economy.

Requirements

You need the following requirements installed on your system:

Developing, running and deploying Mechs and Mech tools

The easiest way to create, run, deploy, and test your own Mech and Mech tools is to follow the Mech and Mech tool docs here. The Mech tools dev repo used in those docs greatly simplifies the development flow.

Only continue reading this README if you know what you are doing and you are specifically interested in this repo.

Set up your environment

Follow these instructions to have your local environment prepared to run the demo below, as well as to build your own AI Mech.

  1. Create a Poetry virtual environment and install the dependencies:

    poetry install && poetry shell
  2. Fetch the software packages using the Open Autonomy CLI:

    autonomy packages sync --update-packages

    This will populate the Open Autonomy local registry (folder ./packages) with the required components to run the worker services.

Run the Mech Predict

Using Mech Quickstart (Preferred Method)

To help you integrate your own tools more easily, we’ve created a new base repository that serves as a minimal example of how to run the project. It is streamlined to minimize setup time and give you a clean slate to start from.

Why Use the New Base Repo?

  • Less Configuration: A clean setup that removes unnecessary complexities.
  • Easier to Extend: Perfect for adding your own features and customizations.
  • Clear Example: Start with a working example and build from there.

Feature Comparison

| Feature | New Base Repo (Recommended) | Old Mech Repo (Not Preferred) |
|---------|-----------------------------|-------------------------------|
| Setup Ease | Simplified, minimal setup; quick to start | Requires extra configuration; more error-prone |
| Flexibility & Customization | Easy to extend with your own features | Less streamlined for extensions |
| Future Support | Actively maintained and improved | No longer the focus for updates |
| Complexity | Low complexity, easy to use | More complex setup |

We highly encourage you to start with this base repo for future projects. You can find it here.

Running the old base mech

Warning
The old repo is no longer the recommended approach for running and extending the project. Although it remains available for legacy projects, we advise you to use the new base repo to ensure you are working with the most current and efficient setup. Access the new mech repo here, and start with the preferred method described above.

Follow the instructions below to run the AI Mech demo. Note that AI Mechs can be configured to work in two modes: polling mode, which periodically reads the chain, and websocket mode, which receives event updates from the chain. The default mode used by the demo is polling.

Environment Variables

You may customize the agent's behaviour by setting these environment variables.

| Name | Type | Sample Value | Description |
|------|------|--------------|-------------|
| TOOLS_TO_PACKAGE_HASH | dict | {"prediction-offline":"bafybei...","prediction-online":"bafybei..."} | Tracks the service package hash for each tool. |
| API_KEYS | dict | {"openai":["dummy_api_key"], "google_api_key":["dummy_api_key"]} | Tracks API keys for each service. |
| SERVICE_REGISTRY_ADDRESS | str | "0x9338b5153AE39BB89f50468E608eD9d764B755fD" | Smart contract that registers the services. |
| COMPLEMENTARY_SERVICE_METADATA_ADDRESS | str | "0x0598081D48FB80B0A7E52FAD2905AE9beCd6fC69" | Smart contract that tracks the metadata hash of the mech. |
| MECH_MARKETPLACE_ADDRESS | str | "0x4554fE75c1f5576c1d7F765B2A036c199Adae329" | Marketplace for posting and delivering requests served by Olas mechs. |
| MECH_TO_SUBSCRIPTION | dict | {"0x77af31De935740567Cf4fF1986D04B2c964A786a":{"tokenAddress":"0x0000000000000000000000000000000000000000","tokenId":"1"}} | Tracks the mech's subscription details. |
| MECH_TO_CONFIG | dict | {"0xFf82123dFB52ab75C417195c5fDB87630145ae81":{"use_dynamic_pricing":false,"is_marketplace_mech":false}} | Tracks the mech's config. |

The rest of the common environment variables are defined in service.yaml and can be customized as well.
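The dict-typed variables above are assumed to be passed to the agent as JSON strings, so they need to be parsed before use. A minimal illustrative sketch:

```python
import json
import os

# Illustrative only: set a dict-typed variable as a JSON string,
# as in the sample values above.
os.environ["API_KEYS"] = (
    '{"openai": ["dummy_api_key"], "google_api_key": ["dummy_api_key"]}'
)

# Parse it back into a Python dict before use.
api_keys = json.loads(os.environ["API_KEYS"])
openai_keys = api_keys.get("openai", [])
print(openai_keys)  # ['dummy_api_key']
```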

Warning
The demo service is configured to match a specific on-chain agent (ID 3 on Mech Hub). Since you will not have access to its private key, your local instance will not be able to transact. However, it will be able to receive Requests for AI tasks sent from Mech Hub. These Requests will be executed by your local instance, but you will notice that a failure will occur when it tries to submit the transaction on-chain (Deliver type).

Now, you have two options to run the worker: as a standalone agent or as a service.

Option 1: Run the Mech as a standalone agent

  1. Configure the standalone agent environment file:

    cp .example_agent.env .agentenv
  2. Ensure you have a file with a private key (ethereum_private_key.txt). You can generate a new private key file using the Open Autonomy CLI:

    autonomy generate-key ethereum
  3. Install the mech CLI from mech-server:

    pip install mech-server
  4. Set up and run the mech:

    mech setup -c gnosis
    mech run -c gnosis          # production (Docker deployment)
    mech run -c gnosis --dev    # development (host, Tendermint + agent)
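Before running, it can help to sanity-check the key file from step 2. The sketch below assumes ethereum_private_key.txt holds a single hex-encoded 32-byte private key; the exact format written by `autonomy generate-key ethereum` may differ between versions.

```python
# Hypothetical sanity check for an Ethereum private key file's contents.
def looks_like_private_key(text: str) -> bool:
    key = text.strip()
    if key.startswith("0x"):
        key = key[2:]
    # A raw Ethereum private key is 32 bytes, i.e. 64 hex characters.
    return len(key) == 64 and all(c in "0123456789abcdefABCDEF" for c in key)

print(looks_like_private_key("0x" + "ab" * 32))  # True
print(looks_like_private_key("not-a-key"))       # False
```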

Releases

To create a release, use the aea-helpers CLI:

pip install aea-helpers
aea-helpers make-release --version <VERSION> --env <ENV> --description "<DESCRIPTION>"

Included tools

Tools
packages/dvilela/customs/corcel_request
packages/dvilela/customs/gemini_prediction
packages/napthaai/customs/prediction_request_rag
packages/napthaai/customs/prediction_request_reasoning
packages/napthaai/customs/prediction_url_cot
packages/napthaai/customs/resolve_market_reasoning
packages/nickcom007/customs/prediction_request_sme
packages/valory/customs/prediction_langchain
packages/valory/customs/prediction_request
packages/valory/customs/prepare_tx
packages/valory/customs/resolve_market
packages/valory/customs/superforcaster
packages/victorpolisetty/customs/dalle_request
packages/victorpolisetty/customs/gemini_request

More on tools

  • Prediction request (prediction_request.py): Makes binary predictions on markets using web scraping, RAG, and LLM analysis.

    • prediction-offline: Uses only training data of the model to make the prediction.
    • prediction-online: Also uses online information to improve the prediction.
    • claude-prediction-offline, claude-prediction-online: Legacy aliases.
  • Prediction request SME (prediction_request_sme.py): Generates Subject Matter Expert roles and uses them for market predictions with web search.

    • prediction-offline-sme
    • prediction-online-sme
  • Prediction request reasoning (prediction_request_reasoning.py): Multi-turn reasoning with web scraping for binary market predictions.

    • prediction-request-reasoning
    • prediction-request-reasoning-claude
  • Prediction request RAG (prediction_request_rag.py): RAG-based predictions using FAISS embeddings, web scraping, and PDF extraction.

    • prediction-request-rag
    • prediction-request-rag-claude
  • Prediction URL CoT (prediction_url_cot.py): Chain-of-thought reasoning with web retrieval for market predictions.

    • prediction-url-cot
    • prediction-url-cot-claude
  • Prediction langchain (prediction_langchain.py): Langgraph workflow with researcher and data analyzer agents using Tavily search.

  • Superforcaster (superforcaster.py): Calibrated probability forecasting with web scraping and document retrieval.

    • superforcaster
  • Resolve market (resolve_market.py): Resolves closed markets by fetching news articles and analyzing outcomes.

    • close_market
  • Resolve market reasoning (resolve_market_reasoning.py): Improved market resolution using embeddings and multi-stage reasoning.

    • resolve-market-reasoning-gpt-4.1
  • Prepare tx (prepare_tx.py): Parses natural language into Ethereum transaction data.

    • native_transfer
  • DALL-E request (dalle_request.py): Generates images via the OpenAI DALL-E API.

    • dall-e-2, dall-e-3
  • Gemini request (gemini_request.py): Runs prompts against Google Gemini.

    • gemini-2.0-flash, gemini-2.0-flash-lite
  • Corcel request (corcel_request.py): Makes LLM requests to Corcel.

    • corcel-prediction, corcel-completion
  • Gemini prediction (gemini_prediction.py): Makes LLM prediction requests to Gemini.

    • gemini-prediction, gemini-completion
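A request to a mech selects one of the tool names above. The following is a hypothetical sketch of such a payload; real Mech requests are published to IPFS and referenced on-chain, and the exact schema may differ from this illustration.

```python
import json

# Hypothetical request payload naming one of the tools listed above.
# Field names here are illustrative assumptions, not the canonical schema.
request = {
    "prompt": "Will ETH close above $4000 on 2026-01-01?",
    "tool": "prediction-online",
    "nonce": "example-nonce",
}
payload = json.dumps(request, sort_keys=True)
print(payload)
```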

About

Mech serving Olas Predict
