Developer Guide

Interaction with Dispatcher

API

Submit a task to the Dispatcher

// api/job/submit POST

// request
{
    "from": "projectid", // string: the project id
    "tag": "", // enum (suspicion| malicious) :the tag of the job
    "user": "b77f1799de0148c07bc6ef630fb75ac267f31d147cd28797ad145afe72302632", // string:  the user or device id to verify
    "job": {  // json:  the job deatail of operator to verify
        "tag": "opml", // enum(opml|tee): the job type 
        "prompt": "What is AI?",
        "model": "llama-2-7b-chat.Q4_0.gguf", 
        "params": {
            "temperature": 1.0,
            "top_p": 0.5,
            "max_tokens": 1024
        }
    },
    "verify": "4cc2e1f9-0ac2-4af6-98f8-b814917b49b0" // project api token
}

// response

{
    "code": 200,
    "result": "636a5cf015a1a4c7480317f114d5b06d69476b9d0d5a67dbe53e9b59de72769b" // job id
}
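
For reference, the same request can be sent with curl. This is a minimal sketch: DISPATCHER_URL is a placeholder for your Dispatcher's base address (not specified in this guide), and the field values mirror the example above.

# Hypothetical Dispatcher address; replace with your own.
DISPATCHER_URL="http://localhost:8080"

# Submit a verification job; the response contains the job id.
curl -s -X POST "$DISPATCHER_URL/api/job/submit" \
  -H "Content-Type: application/json" \
  -d '{
    "from": "projectid",
    "tag": "suspicion",
    "user": "b77f1799de0148c07bc6ef630fb75ac267f31d147cd28797ad145afe72302632",
    "job": {
      "tag": "opml",
      "prompt": "What is AI?",
      "model": "llama-2-7b-chat.Q4_0.gguf",
      "params": { "temperature": 1.0, "top_p": 0.5, "max_tokens": 1024 }
    },
    "verify": "4cc2e1f9-0ac2-4af6-98f8-b814917b49b0"
  }'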

Query the job result by job ID

// api/job/result POST

// request
{
    "job_id": "5814075a7a4b7bc4db5450e7bc3ee2893562e3b2d9deb66a14da4e99d7276e0a" // job_id
}

// response

{
    "code": 200,
    "result": [
        {
            "id": "_9935188af8051af60e323429a88f90417eaa08319c2abb360486662538b05532_suspicion",
            "job_id": "9935188af8051af60e323429a88f90417eaa08319c2abb360486662538b05532",
            "operator": "",
            "result": "\\n What is AI?\\n Unterscheidung zwischen AI und KI\\n Introduction to AI\\n What is AI?\\n Artificial Intelligence (AI) is the simulation of human intelligence in machines that are programmed to perform tasks that typically require human intelligence. As per the definition by the National Institute of Artificial Intelligence (NIAI), AI is the ability of a computer to perform tasks that typically require human intelligence, such as understanding, learning, and problem-solving.\\nThe term AI was coined in 1956 by John McCarthy, a computer scientist who organized the first AI conference. Since then, AI has been a rapidly growing field, with many advancements in various areas, including machine learning, natural language processing, computer vision, and robotics.\\nThere are several types of AI, including:\\n1. Narrow or weak AI: This type of AI is designed to perform a specific task, such as playing chess or recognizing faces. Examples of narrow AI include Siri, Alexa, and Google Assistant.\\n2. General or strong AI: This type of AI is designed to perform any intellectual task that a human can. Examples of\\n\\n",
            "verify_id": "",
            "vrf": {
                "selected": false,
                "vrf_prompt_hash": "",
                "vrf_random_value": "",
                "vrf_verify_pubkey": "",
                "vrf_proof": ""
            },
            "clock": {
                "1": "2"
            },
            "signature": "",
            "job_type": "",
            "tag": "suspicion",
            "created_at": "2024-09-27 03:36:04"
        }
    ]
}
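
The result can be polled with curl once a job id has been returned by api/job/submit. A minimal sketch, reusing the DISPATCHER_URL placeholder from above:

# Query the verification result for a previously submitted job.
curl -s -X POST "$DISPATCHER_URL/api/job/result" \
  -H "Content-Type: application/json" \
  -d '{ "job_id": "636a5cf015a1a4c7480317f114d5b06d69476b9d0d5a67dbe53e9b59de72769b" }'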

Query the job request detail by job ID

// api/job/detail POST

// request
{
    "job_id": "08422438b4ebd843ca8a260c8358b75e31acf0565e62b3b492370561c8c36d66"
}

// response 

{
    "code": 200,
    "result": {
        "id": "08422438b4ebd843ca8a260c8358b75e31acf0565e62b3b492370561c8c36d66",
        "job": {
            "tag": "opml",
            "prompt": "What is AI?",
            "model": "ss",
            "params": {
                "temperature": 1.0,
                "top_p": 0.5,
                "max_tokens": 1024
            }
        },
        "user": "b77f1799de0148c07bc6ef630fb75ac267f31d147cd28797ad145afe72302632tee",
        "job_type": "",
        "status": "dispatched",
        "tag": "",
        "clock": {
            "1": "1"
        },
        "created_at": "2024-09-27 06:51:09"
    }
}
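
The detail endpoint can be called the same way (DISPATCHER_URL is again a placeholder):

# Fetch the original request detail for a job.
curl -s -X POST "$DISPATCHER_URL/api/job/detail" \
  -H "Content-Type: application/json" \
  -d '{ "job_id": "08422438b4ebd843ca8a260c8358b75e31acf0565e62b3b492370561c8c36d66" }'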

Class                   | vCPUs (10th gen+) | Memory | Networking Capacity
----------------------- | ----------------- | ------ | -------------------
General Purpose - large | 2                 | 8 GB   | 5 Mbps
General Purpose - xl    | 4                 | 16 GB  | 25 Mbps
General Purpose - 4xl   | 16                | 64 GB  | 5 Gbps

How to deploy an Operator

To provide TEE inference verification services, start the AOS operator by launching both the operator process and the TEE worker.

Before you start, ensure you have the following dependencies installed:

  • TEE Worker

Operator

Install the Cargo Tool:

The Rust toolchain can be installed via Rustup. Execute the following command:

curl https://sh.rustup.rs -sSf | sh

After installation, ensure that the Rust toolchain path is added to your PATH environment variable:

source $HOME/.cargo/env
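
To confirm the toolchain is on your PATH in the current shell, you can check the installed versions:

# Both commands should print a version number if the installation succeeded.
rustc --version
cargo --version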

Install Dependencies:

On Ubuntu systems, run the following command to install dependencies:

sudo apt-get update && sudo apt-get install \
    curl \
    libssl-dev \
    libpq-dev \
    openssl

On Fedora systems, run the following commands:

sudo yum groupinstall 'Development Tools'
sudo yum install openssl-devel postgresql-libs postgresql-devel

Install PostgreSQL:

  • On Ubuntu systems:

    sudo apt install postgresql
  • On Fedora systems:

    sudo yum install postgresql-server
    sudo yum install postgresql16.x86_64 postgresql16-server -y
    sudo postgresql-setup --initdb
    

Start PostgreSQL Service:

sudo systemctl start postgresql
sudo systemctl enable postgresql
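
If the database and role used later in this guide do not exist yet, they can be created as follows. This is a minimal sketch: the 'postgres' user, the 'hetu' password, and the 'operator_db' name simply mirror the example initialization command further below; adjust them to your setup.

# Set a password for the default postgres role and create the operator database.
sudo -u postgres psql -c "ALTER USER postgres WITH PASSWORD 'hetu';"
sudo -u postgres createdb operator_db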

Install Redis:

  • On Ubuntu systems:

    sudo apt-get install redis
  • On Fedora systems:

    sudo yum install -y redis6

Start Redis Service:

sudo systemctl enable redis-server
sudo systemctl start redis-server
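
To verify that Redis is up, you can ping it (assuming the redis-cli binary is available; the binary name may differ depending on the package your distribution installs):

# The server should reply with PONG.
redis-cli ping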

Compile the Source Code:

Use the following command to build the project:

cargo build --release

Update the Configuration File:

Configuration file location: docs/template/config-operator.yaml

Update the following configuration items based on your environment:

pg_db_url: PostgreSQL database URL
dispatcher_url: AOS dispatcher URL
dispatcher_address: AOS dispatcher ID
node_id: Operator address
signer_key: Operator private key
vrf_key: Same as signer_key
chain_rpc_url: Ethereum RPC node
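
For orientation, the snippet below writes placeholder values for each of the items above into the configuration file. Every value is a hypothetical example and must be replaced with your own addresses and keys.

# Hypothetical placeholder values; replace each one before starting the operator.
cat > ./docs/template/config-operator.yaml <<'EOF'
pg_db_url: "postgres://postgres:hetu@localhost:5432/operator_db"  # PostgreSQL database URL
dispatcher_url: "http://dispatcher.example.com:8080"              # AOS dispatcher URL
dispatcher_address: "0x0000000000000000000000000000000000000000"  # AOS dispatcher ID
node_id: "0x0000000000000000000000000000000000000000"             # Operator address
signer_key: "<operator-private-key>"                              # Operator private key
vrf_key: "<operator-private-key>"                                 # Same as signer_key
chain_rpc_url: "https://ethereum-rpc.example.com"                 # Ethereum RPC node
EOF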

Initialize the Database:

Use the following command to initialize the PostgreSQL database. Replace 'postgres:hetu' in the URL below with your actual username and password, adjust the host and port to match your PostgreSQL instance, and replace 'operator_db' with your actual database name.

./target/release/operator-runer -i postgres://postgres:hetu@localhost:5432/operator_db

Run the Operator:

Use the following command to start the operator service:

./target/release/operator-runer -c ./docs/template/config-operator.yaml

TEE Worker Deployment

Install Dependencies:

Run the following script to set up the required environment dependencies:

./scripts/init_env.sh

Download the Model:

Use wget to download the required model file and move it to the models directory:

wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q4_0.gguf
mv llama-2-7b-chat.Q4_0.gguf ./models

Build the TEE Worker:

Compile the TEE Worker in release mode using cargo:

cargo build --release

Start the TEE Environment:

Run the script to initialize the TEE environment:

./scripts/run_tee.sh

Start the TEE Worker Service:

Run the compiled TEE Worker executable:

./target/release/tee-worker
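
At this point both services should be running. A quick sanity check (a simple sketch; adapt to your process manager) is to confirm the two processes are alive:

# List the running operator and TEE worker processes, if any.
pgrep -af operator-runer
pgrep -af tee-worker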

πŸŽ‰ Congrats, you've just spun up your own Operator and requested it to do some work!
