REST Interface (Submit Side)

Josh Hursey edited this page Apr 5, 2016 · 7 revisions

This page describes the REST interface to which the MTT Client can submit test results.

The server is running at:

The interface is broken into the following parts:

  • / - Ping the server to see if it is running
  • /serial - Request a client_serial number
  • /submit - Submit data to the server

/

Title: Ping the server to see if it is running.

URL: /

Method: GET

URL Params: None

Data Params: None

Example JSON Output:

On success:

{
    "status": 0,
    "status_message": "Success"
}

Or, if the server is not running, you will see this plain-text response:

Sorry, the MTT Server does not seem to be running!
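A minimal client-side sketch of interpreting the ping response (the function name `check_ping` is my own; the server's base URL is site-specific, so the actual HTTP GET is omitted):

```python
import json

def check_ping(body):
    """Interpret the body returned by GET / on the MTT server.

    Returns True when the server reports success; False otherwise,
    including the plain-text "not running" message, which is not JSON.
    """
    try:
        reply = json.loads(body)
    except ValueError:  # json.JSONDecodeError subclasses ValueError
        return False
    return reply.get("status") == 0

print(check_ping('{"status": 0, "status_message": "Success"}'))          # True
print(check_ping("Sorry, the MTT Server does not seem to be running!"))  # False
```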

/serial

Title: Request a client_serial number

URL: /serial

Method: POST

URL Params: None

Data Params:

  • Note: The payload is ignored at this time.
  • serial: serial

Data Params Example:

{
    "serial": "serial"
}

Example JSON Output:

  • client_serial Integer
{
    "status": 0,
    "status_message": "Success",
    "client_serial": 1234567
}
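A sketch of the /serial exchange from the client side, using the payload and response shapes shown above (the helper names are my own; the HTTP POST itself is omitted):

```python
import json

def build_serial_request():
    """Build the /serial POST body. The server currently ignores the
    payload, so we send the documented stub."""
    return json.dumps({"serial": "serial"})

def parse_serial_response(body):
    """Extract client_serial from a /serial response."""
    reply = json.loads(body)
    if reply.get("status") != 0:
        raise RuntimeError("serial request failed: %s" % reply.get("status_message"))
    return reply["client_serial"]

response = '{"status": 0, "status_message": "Success", "client_serial": 1234567}'
print(parse_serial_response(response))  # 1234567
```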

/submit

Title: Submit test results to the server

URL: /submit

Method: POST

URL Params: None

Data Params:

  • metadata

    • client_serial A valid integer from a previous call to /serial
    • hostname The hostname where the test was run
    • local_username The local username on the host
    • mtt_client_version Version of the client you are running
    • phase One of the following
      • "MPI Install" MPI Install results
      • "Test Build" Test Build results
      • "Test Run" Test Run results
    • trial If this is a trial run.
      • 0 for false
      • 1 for true
    • platform_name Custom name of the platform (e.g., "my-cluster")
  • data (Below are the data fields for an "MPI Install" phase)

    • Notes:
      • Array of submissions all from the same phase type (identified in the metadata)
      • All fields are required unless otherwise noted as Optional
      • Unless otherwise enumerated, the string passed is defined by the client and does not need to adhere to any specific format.
    • platform_hardware String representation of the hardware (e.g., "x86_64")
    • platform_type String representation of the platform type (e.g., "linux-rhel6.7-x86_64")
    • os_name Common name for the OS (e.g., "Linux")
    • os_version Version information for the OS (e.g., "Linux 2.6.32-573.12.1.el6.x86_64")
    • compiler_name Common name for the compiler (e.g., "gnu")
    • compiler_version Version string for the compiler (e.g., "4.4.7")
    • mpi_name A name for the MPI version (e.g., "ompi-nightly-v1.10")
    • mpi_version Version string reported by MPI (e.g., "v1.10.2-114-gf3bad94")
    • configure_arguments Configure arguments
    • start_timestamp Timestamp when the test started (used to catalog the result)
      • Format (UTC timezone): "Mon Apr 4 16:30:35 2016"
      • Python examples: from datetime import datetime
        • datetime.utcnow().strftime("%c")
        • datetime.utcnow().strftime("%a %b %d %H:%M:%S %Y")
    • result_message A string representation of the test result (e.g., "Success" or "Failed; timeout expired (00:10 DD:HH:MM:SS)")
    • test_result A numerical classification of the test result
      • 0 - Failed
      • 1 - Passed
      • 2 - Skipped
      • 3 - Timed out
      • -1 - unknown
    • exit_value The return code of the process (e.g., "0")
    • (Optional) duration Time taken, as an interval string (e.g., "322 seconds")
    • (Optional) exit_signal Signal associated with the process exit, if any (default: -1)
    • (Optional) bitness The bitness of the machine
      • 1 - 8 bit
      • 2 - 16 bit
      • 4 - 32 bit
      • 6 - 32/64 bit
      • 8 - 64 bit
      • 16 - 128 bit
      • "unknown" - unknown bitness (default)
    • (Optional) endian The endianness of the machine
      • "little" or 1 - Little endian
      • "big" or 2 - Big endian
      • "unknown" or 0 - unknown endianness (default)
    • (Optional) vpath_mode If the code was compiled using a VPATH build
      • "relative" or 1 - relative path
      • "absolute" or 2 - absolute path
      • "unknown" or 0 - unknown (default)
    • (Optional) merge_stdout_stderr If the output was merged
      • 0 for false
      • 1 for true
    • (Optional) result_stdout stdout of the process
    • (Optional) result_stderr stderr of the process
    • (Optional) description Text description of this test
    • (Optional) environment Any environment variables of note (usually this is blank)
  • data (Below are the data fields for a "Test Build" phase)

    • Notes:
      • Array of submissions all from the same phase type (identified in the metadata)
      • All fields are required unless otherwise noted as Optional
      • Unless otherwise enumerated, the string passed is defined by the client and does not need to adhere to any specific format.
      • You can submit more data than is required. It is often just ignored by the server. Sometimes it can be helpful (for example, if you do not send an mpi_install_id but provide enough information to look it up)
    • compiler_name Common name for the compiler (e.g., "gnu")
      • Usually this matches with whatever you submitted for "MPI Install" phase
    • compiler_version Version string for the compiler (e.g., "4.4.7")
      • Usually this matches with whatever you submitted for "MPI Install" phase
    • suite_name A name for the test suite (e.g., "trivial")
    • start_timestamp (same as above)
    • result_message (same as above)
    • test_result (same as above)
    • exit_value (same as above)
    • (Optional, but strongly recommended) mpi_install_id The ID returned by the previous "MPI Install" phase submission.
      • If nothing is submitted, the server will try to look it up. However, it usually just associates the result with a dummy row, since often not enough information is available to actually find it.
    • (Optional) duration (same as above)
    • (Optional) exit_signal (same as above)
    • (Optional) merge_stdout_stderr (same as above)
    • (Optional) result_stdout (same as above)
    • (Optional) result_stderr (same as above)
    • (Optional) description (same as above)
    • (Optional) environment (same as above)
  • data (Below are the data fields for a "Test Run" phase)

    • Notes:
      • Array of submissions all from the same phase type (identified in the metadata)
      • All fields are required unless otherwise noted as Optional
      • Unless otherwise enumerated, the string passed is defined by the client and does not need to adhere to any specific format.
      • You can submit more data than is required. It is often just ignored by the server.
    • test_name A name for the test - usually the binary name (e.g., "hello_c")
    • np Number of processes used
    • command The full command line string used
    • start_timestamp (same as above)
    • result_message (same as above)
    • test_result (same as above)
    • exit_value (same as above)
    • (Optional, but strongly recommended) test_build_id The ID returned by the previous "Test Build" phase submission.
      • If nothing is submitted, the server will try to look it up. However, it usually just associates the result with a dummy row, since often not enough information is available to actually find it.
    • (Optional) duration (same as above)
    • (Optional) exit_signal (same as above)
    • (Optional) launcher Binary name of what was used to launch the process (e.g., "mpirun")
    • (Optional) resource_manager String representation of the resource manager used (e.g., "slurm")
    • (Optional) parameters Breakdown of the command-line parameters (often we just send "", and the server will try to discover them from command)
    • (Optional) network String representation of the network (often we just send "", server will try to discover from command)
    • (Optional) merge_stdout_stderr (same as above)
    • (Optional) result_stdout (same as above)
    • (Optional) result_stderr (same as above)
    • (Optional) description (same as above)
    • (Optional) environment (same as above)
    • (Optional) latency_bandwidth TODO
    • (Optional) message_size TODO
    • (Optional) latency_min TODO
    • (Optional) latency_avg TODO
    • (Optional) latency_max TODO
    • (Optional) bandwidth_min TODO
    • (Optional) bandwidth_avg TODO
    • (Optional) bandwidth_max TODO
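Two fields in the lists above are easy to get wrong: start_timestamp must use the UTC layout shown, and test_result must be one of the enumerated codes. A small sketch (the helper and the `TEST_RESULT` table are my own naming):

```python
from datetime import datetime

# test_result codes as enumerated above.
TEST_RESULT = {"failed": 0, "passed": 1, "skipped": 2, "timed_out": 3, "unknown": -1}

def mtt_timestamp(dt=None):
    """Format a UTC datetime the way the server expects.

    Note: %d zero-pads the day ("Apr 04"); the sample payloads on this page
    show %c-style space padding ("Apr  4"), but the format string below is
    the one this page itself documents.
    """
    if dt is None:
        dt = datetime.utcnow()
    return dt.strftime("%a %b %d %H:%M:%S %Y")

print(mtt_timestamp(datetime(2016, 4, 4, 16, 30, 35)))  # Mon Apr 04 16:30:35 2016
print(TEST_RESULT["timed_out"])                         # 3
```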

Data Params Example:

Below is an example submission for an "MPI Install" phase

{
    "data": [
        {
            "bitness": 8,
            "endian": "little",
            "vpath_mode": "unknown",
            "platform_hardware": "x86_64",
            "platform_type": "linux-rhel6.7-x86_64",
            "os_name": "Linux",
            "os_version": "Linux 2.6.32-573.12.1.el6.x86_64",
            "compiler_name": "gnu",
            "compiler_version": "4.4.7",
            "mpi_name": "ompi-nightly-v1.10",
            "mpi_version": "v1.10.2-114-gf3bad94",
            "configure_arguments": "CFLAGS=-pipe --enable-picky --enable-debug",
            "start_timestamp": "Mon Apr  4 16:29:36 2016",
            "duration": "322 seconds",
            "result_message": "Success",
            "test_result": 1,
            "exit_value": 0,
            "merge_stdout_stderr": 0,
            "result_stderr": "--- make all -on\n",
        }
    ],
    "metadata": {
        "client_serial": 12345,
        "hostname": "node01.mydomain.com",
        "local_username": "fred",
        "mtt_client_version": "4.0a1",
        "phase": "MPI Install",
        "platform_name": "my-awesome-cluster",
        "trial": 0
    }
}

Below is an example submission for a "Test Build" phase

{
    "data": [
        {
            "compiler_name": "gnu",
            "compiler_version": "4.4.7",
            "suite_name": "trivial",
            "start_timestamp": "Mon Apr  4 16:34:59 2016",
            "duration": "1 seconds",
            "result_message": "Success",
            "test_result": 1,
            "exit_value": 0,
            "mpi_install_id": 187669
        }
    ],
    "metadata": {
        "client_serial": 12345,
        "hostname": "node01.mydomain.com",
        "local_username": "fred",
        "mtt_client_version": "4.0a1",
        "phase": "Test Build",
        "platform_name": "my-awesome-cluster",
        "trial": 0
    }
}

Below is an example submission for a "Test Run" phase with 8 individual results

{
    "data": [
       {
            "launcher": "mpirun",
            "resource_manager": "none",
            "parameters": "",
            "network": "",
            "test_name": "hello_usempi",
            "np": "2",
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./hello_usempi ",
            "start_timestamp": "Mon Apr  4 16:35:07 2016",
            "duration": "14 seconds",
            "result_message": "Failed; timeout expired (00:10 DD:HH:MM:SS) )",
            "test_result": 3,
            "exit_value": 0,
            "result_stderr": null,
            "result_stdout": "Killed by signal 15.\r\nKilled by signal 15.\rMTT killed mpirun via SIGTERM",
            "test_build_id": 1026192
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./c_hello ",
            "description": null,
            "duration": "24 seconds",
            "environment": "",
            "exit_value": 0,
            "np": "2",
            "result_message": "Failed; timeout expired (00:10 DD:HH:MM:SS) )",
            "result_stderr": null,
            "result_stdout": "Abort is in progress...hit ctrl-c again within 5 seconds to forcibly terminate\n\nKilled by signal 15.\rMTT killed mpirun via 2 SIGKILLs",
            "start_timestamp": "Mon Apr  4 16:35:24 2016",
            "test_build_id": 1026192,
            "test_name": "c_hello",
            "test_result": 3
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./cxx_hello ",
            "duration": "12 seconds",
            "exit_value": 0,
            "launcher": "mpirun",
            "np": "2",
            "result_message": "Failed; timeout expired (00:10 DD:HH:MM:SS) )",
            "result_stderr": null,
            "result_stdout": "MTT killed mpirun via SIGTERM",
            "start_timestamp": "Mon Apr  4 16:35:50 2016",
            "test_build_id": 1026192,
            "test_name": "cxx_hello",
            "test_result": 3
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./ring_mpifh ",
            "duration": "12 seconds",
            "exit_value": 0,
            "launcher": "mpirun",
            "np": "2",
            "result_message": "Failed; timeout expired (00:10 DD:HH:MM:SS) )",
            "result_stderr": null,
            "result_stdout": "MTT killed mpirun via SIGTERM",
            "start_timestamp": "Mon Apr  4 16:36:04 2016",
            "test_build_id": 1026192,
            "test_name": "ring_mpifh",
            "test_result": 3
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./ring_usempi ",
            "duration": "4 seconds",
            "exit_value": 0,
            "np": "2",
            "result_message": "Passed",
            "result_stderr": null,
            "result_stdout": "",
            "start_timestamp": "Mon Apr  4 16:36:19 2016",
            "test_build_id": 1026192,
            "test_name": "ring_usempi",
            "test_result": 1
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./c_ring ",
            "duration": "2 seconds",
            "exit_value": 0,
            "launcher": "mpirun",
            "np": "2",
            "result_message": "Passed",
            "result_stderr": null,
            "result_stdout": "",
            "start_timestamp": "Mon Apr  4 16:36:23 2016",
            "test_build_id": 1026192,
            "test_name": "c_ring",
            "test_result": 1
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./hello_mpifh ",
            "duration": "2 seconds",
            "exit_signal": -1,
            "exit_value": 0,
            "launcher": "mpirun",
            "np": "2",
            "result_message": "Passed",
            "result_stderr": null,
            "result_stdout": " Hello, Fortran mpif.h world, I am            0  of            2\n Hello, Fortran mpif.h world, I am            1  of            2\n",
            "start_timestamp": "Mon Apr  4 16:36:25 2016",
            "test_build_id": 1026192,
            "test_name": "hello_mpifh",
            "test_result": 1
        },
        {
            "command": "mpirun --host flux1,flux2 -np 2 --mca oob_tcp_if_include eth1 --mca btl_tcp_if_include 192.168.1.0/16 --prefix /home/jjhursey/work/mtt/mtt-scratch/installs/QpR3/install ./cxx_ring ",
            "duration": "2 seconds",
            "exit_value": 0,
            "launcher": "mpirun",
            "np": "2",
            "result_message": "Passed",
            "result_stderr": null,
            "result_stdout": "",
            "start_timestamp": "Mon Apr  4 16:36:27 2016",
            "test_build_id": 1026192,
            "test_name": "cxx_ring",
            "test_result": 1
        }
    ],
    "metadata": {
        "client_serial": 12345,
        "hostname": "node01.mydomain.com",
        "local_username": "fred",
        "mtt_client_version": "4.0a1",
        "phase": "Test Run",
        "platform_name": "my-awesome-cluster",
        "trial": 0
    }
}
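All three submissions above share the same envelope: a metadata block naming the phase, plus a data array of per-result records. A sketch of assembling that envelope (the function name is my own; the HTTP POST is omitted):

```python
import json

VALID_PHASES = ("MPI Install", "Test Build", "Test Run")

def build_submission(phase, client_serial, results, hostname,
                     local_username, mtt_client_version, platform_name, trial=0):
    """Assemble the JSON body for a POST to /submit."""
    if phase not in VALID_PHASES:
        raise ValueError("unknown phase: %r" % phase)
    return json.dumps({
        "metadata": {
            "client_serial": client_serial,
            "hostname": hostname,
            "local_username": local_username,
            "mtt_client_version": mtt_client_version,
            "phase": phase,
            "platform_name": platform_name,
            "trial": trial,
        },
        # One dict per result, with the phase-appropriate fields listed above.
        "data": results,
    })

body = build_submission("Test Build", 12345,
                        [{"suite_name": "trivial", "test_result": 1, "exit_value": 0}],
                        "node01.mydomain.com", "fred", "4.0a1", "my-awesome-cluster")
print(json.loads(body)["metadata"]["phase"])  # Test Build
```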

Example JSON Output:

  • submit_id Integer
  • ids (Array of IDs from the submissions) - will be one of the following
    • mpi_install_id Integer
    • test_build_id Integer
    • test_run_id Integer
  • Below is the output for an "MPI Install" phase
{
    "status": 0,
    "status_message": "Success",
    "submit_id": 9876,
    "ids": [
        {
            "mpi_install_id": 187669
        }
    ]
}
  • Below is the output for a "Test Build" phase
{
    "status": 0,
    "status_message": "Success",
    "submit_id": 9876,
    "ids": [
        {
            "test_build_id": 1026192
        }
    ]
}
  • Below is the output for a "Test Run" phase with multiple individual test results submitted
{
    "status": 0,
    "status_message": "Success",
    "submit_id": 9876,
    "ids": [
        {
          "test_run_id": 341604885
        },
        {
          "test_run_id": 341604886
        },
        {
          "test_run_id": 341604887
        },
        {
          "test_run_id": 341604888
        },
        {
          "test_run_id": 341604889
        },
        {
          "test_run_id": 341604890
        },
        {
          "test_run_id": 341604891
        },
        {
          "test_run_id": 341604892
        }
    ]
}
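Reading the response back is symmetric: ids contains one entry per record in the submitted data array, keyed by the phase-specific ID field. A sketch (the function name is my own):

```python
import json

def collect_ids(body, key):
    """Return the list of IDs (e.g. "test_run_id") from a /submit response."""
    reply = json.loads(body)
    if reply.get("status") != 0:
        raise RuntimeError("submit failed: %s" % reply.get("status_message"))
    return [entry[key] for entry in reply["ids"]]

response = json.dumps({
    "status": 0, "status_message": "Success", "submit_id": 9876,
    "ids": [{"test_run_id": 341604885}, {"test_run_id": 341604886}],
})
print(collect_ids(response, "test_run_id"))  # [341604885, 341604886]
```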