Deploy your containerized application on Tensorkube. This command requires a Dockerfile to be present in the current directory.

Usage

tensorkube deploy [OPTIONS]

Options

  • gpus:

    • Type: INT
    • Default: 0
    • Usage: --gpus

    Number of GPUs needed for the service.

  • gpu_type:

    • Type: Choice(['V100', 'A10G', 'T4', 'L4'])
    • Default: none
    • Usage: --gpu-type

    Type of GPU.

  • cpu:

    • Type: FLOAT
    • Default: 100
    • Usage: --cpu

    Number of CPU millicores to allocate (1000 millicores = 1 CPU).

  • memory:

    • Type: FLOAT
    • Default: 200
    • Usage: --memory

    Amount of RAM in megabytes.

  • min_scale:

    • Type: INT
    • Default: 0
    • Usage: --min-scale

    Minimum number of pods to run.

  • max_scale:

    • Type: INT
    • Default: 3
    • Usage: --max-scale

    Maximum number of pods to run.

  • help:

    • Type: BOOL
    • Default: false
    • Usage: --help

    Show this message and exit.
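Example

Putting the options above together, a deployment that requests one T4 GPU, two CPUs, and 4 GB of RAM might look like the following. The resource values are illustrative examples only, not recommendations:

```shell
# Run from the directory containing your Dockerfile.
# Example values: 1 T4 GPU, 2000 millicores (2 CPUs), 4096 MB of RAM,
# keeping at least one pod warm and scaling up to five.
tensorkube deploy \
  --gpus 1 \
  --gpu-type T4 \
  --cpu 2000 \
  --memory 4096 \
  --min-scale 1 \
  --max-scale 5
```

With the default --min-scale of 0, the service can scale down to zero pods when idle; setting --min-scale 1 as above keeps one pod running to avoid cold starts, at the cost of always-on resource usage.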

CLI Help

Usage: tensorkube deploy [OPTIONS]

  Deploy your containerized application on Tensorkube. This command requires
  a dockerfile to be present in the current directory.

Options:
  --gpus INTEGER                Number of GPUs needed for the service.
  --gpu-type [V100|A10G|T4|L4]  Type of GPU.
  --cpu FLOAT                   Number of CPU millicores. 1000 = 1 CPU
  --memory FLOAT                Amount of RAM in megabytes.
  --min-scale INTEGER           Minimum number of pods to run.
  --max-scale INTEGER           Maximum number of pods to run.
  --help                        Show this message and exit.