    Repositories list

    • core

      Public
      The core library and APIs implementing the Triton Inference Server.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Jan 4, 2025
    • server

      Public
      The Triton Inference Server provides an optimized cloud and edge inferencing solution.
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Jan 4, 2025
    • Python
      BSD 3-Clause "New" or "Revised" License
      Updated Jan 4, 2025
    • C++
      BSD 3-Clause "New" or "Revised" License
      Updated Jan 3, 2025
    • The Triton backend for the ONNX Runtime.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Jan 3, 2025
    • Triton Model Navigator is an inference toolkit for optimizing and deploying deep learning models, with a focus on NVIDIA GPUs.
      Python
      Apache License 2.0
      Updated Jan 3, 2025
    • The Triton backend for the PyTorch TorchScript models.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 30, 2024
    • Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python (a minimal model sketch appears after this list).
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 30, 2024
    • common

      Public
      Common source, scripts and utilities shared across all Triton repositories.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 30, 2024
    • The Triton TensorRT-LLM Backend
      Python
      Apache License 2.0
      Updated Dec 24, 2024
    • The Triton backend for running GPU-accelerated data pre-processing pipelines implemented with DALI's Python API.
      C++
      MIT License
      Updated Dec 24, 2024
    • Triton Model Analyzer is a CLI tool that helps you understand the compute and memory requirements of models served by Triton Inference Server.
      Python
      Apache License 2.0
      Updated Dec 23, 2024
    • Triton CLI is an open-source command-line interface that enables users to create, deploy, and profile models served by the Triton Inference Server.
      Python
      Updated Dec 19, 2024
    • backend

      Public
      Common source, scripts and utilities for creating Triton backends.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 16, 2024
    • OpenVINO backend for Triton.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 16, 2024
    • FIL backend for the Triton Inference Server
      Jupyter Notebook
      Apache License 2.0
      Updated Dec 11, 2024
    • tutorials

      Public
      This repository contains tutorials and examples for Triton Inference Server.
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • Third-party source packages that are modified for use in Triton.
      C
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • The Triton backend for TensorRT.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • The Triton backend for TensorFlow.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • Simple Triton backend used for testing.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • An example Triton backend that demonstrates sending zero, one, or multiple responses for each request.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • Implementation of a Redis cache for Triton Inference Server's TRITONCACHE API.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • Implementation of a local in-memory cache for Triton Inference Server's TRITONCACHE API
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • Example Triton backend that demonstrates most of the Triton Backend API.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • C++
      Updated Dec 7, 2024
    • client

      Public
      Triton Python, C++, and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala (a Python client sketch appears after this list).
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • The Triton repository agent that verifies model checksums.
      C++
      BSD 3-Clause "New" or "Revised" License
      Updated Dec 7, 2024
    • pytriton

      Public
      PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments (a usage sketch appears after this list).
      Python
      Apache License 2.0
      Updated Nov 19, 2024
    • contrib

      Public
      Community contributions to Triton that are not officially supported or maintained by the Triton project.
      Python
      BSD 3-Clause "New" or "Revised" License
      Updated Jun 5, 2024
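
A minimal sketch of what a model file for the Python backend entry above can look like. The tensor names "INPUT0"/"OUTPUT0", the FP32 dtype, and the doubling logic are illustrative assumptions; in a real deployment they must match the model's config.pbtxt.

    # model.py: minimal Python-backend sketch (illustrative names and logic)
    import numpy as np
    import triton_python_backend_utils as pb_utils


    class TritonPythonModel:
        def execute(self, requests):
            # Triton passes a batch of requests; return one InferenceResponse
            # per request, in the same order.
            responses = []
            for request in requests:
                in0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
                out0 = pb_utils.Tensor("OUTPUT0", in0.as_numpy().astype(np.float32) * 2.0)
                responses.append(pb_utils.InferenceResponse(output_tensors=[out0]))
            return responses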
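A minimal sketch of calling a running Triton server with the Python HTTP client from the client libraries entry above (installable as the tritonclient package). The model name "example_model", the tensor names, and the shape are placeholder assumptions.

    import numpy as np
    import tritonclient.http as httpclient

    # Connect to a locally running Triton HTTP endpoint (default port 8000).
    client = httpclient.InferenceServerClient(url="localhost:8000")

    data = np.random.rand(1, 16).astype(np.float32)
    inputs = [httpclient.InferInput("INPUT0", list(data.shape), "FP32")]
    inputs[0].set_data_from_numpy(data)
    outputs = [httpclient.InferRequestedOutput("OUTPUT0")]

    # Run inference against the hypothetical "example_model" and read back OUTPUT0.
    result = client.infer(model_name="example_model", inputs=inputs, outputs=outputs)
    print(result.as_numpy("OUTPUT0"))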
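A minimal sketch of serving a plain Python callable with PyTriton, referenced from the pytriton entry above. The model name "Doubler", the tensor names, and the doubling function are illustrative assumptions; PyTriton binds the callable to a Triton instance that it manages for you.

    import numpy as np
    from pytriton.decorators import batch
    from pytriton.model_config import ModelConfig, Tensor
    from pytriton.triton import Triton


    @batch
    def infer_fn(INPUT0):
        # Receives batched numpy arrays; returns a dict keyed by output names.
        return {"OUTPUT0": INPUT0 * 2.0}


    with Triton() as triton:
        triton.bind(
            model_name="Doubler",  # hypothetical model name
            infer_func=infer_fn,
            inputs=[Tensor(name="INPUT0", dtype=np.float32, shape=(-1,))],
            outputs=[Tensor(name="OUTPUT0", dtype=np.float32, shape=(-1,))],
            config=ModelConfig(max_batch_size=8),
        )
        triton.serve()  # blocks, serving HTTP/gRPC requests until interrupted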