Topic: inference-server (Goto Github)
Something interesting about inference-server
inference-server,Serving AI/ML models in the open standard formats PMML and ONNX with both HTTP (REST API) and gRPC endpoints
Organization: autodeployai
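Most of the servers in this list share the same basic shape: an HTTP endpoint that accepts a request payload, runs a model, and returns predictions. A minimal stdlib-only sketch of that pattern (the `/predict` route, the JSON schema, and the doubling "model" are all hypothetical, not any listed project's actual API):

```python
# Minimal illustrative HTTP prediction endpoint using only the Python
# standard library. The route, payload schema, and "model" below are
# hypothetical stand-ins, not the API of any project listed here.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(inputs):
    """Stand-in for a real model: double every input value."""
    return [2 * x for x in inputs]


class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"outputs": predict(payload["inputs"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for the sketch.
        pass


# Usage: HTTPServer(("127.0.0.1", 8080), PredictHandler).serve_forever()
```

Production servers add the parts this sketch omits: model loading, batching, gRPC, health checks, and metrics.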
inference-server,The simplest way to serve AI/ML models in production
Organization: basetenlabs
Home Page: https://truss.baseten.co
inference-server,This is a repository for an object detection inference API using the TensorFlow framework.
Organization: bmw-innovationlab
inference-server,This is a repository for a no-code object detection inference API using YOLOv4 and YOLOv3 with OpenCV.
Organization: bmw-innovationlab
inference-server,This is a repository for a no-code object detection inference API using YOLOv3 and YOLOv4 with the Darknet framework.
Organization: bmw-innovationlab
inference-server,Session Based Real-time Hotel Recommendation Web Application
User: csy1204
inference-server,Wingman is the fastest and easiest way to run Llama models on your PC or Mac.
User: curtisgray
Home Page: https://curtisgray.github.io/wingman/
inference-server,Serving distributed deep learning models with model parallel swapping.
User: dlzou
inference-server,Friendli: the fastest serving engine for generative AI
Organization: friendliai
Home Page: https://friendli.ai
inference-server,Audio components for geniusrise framework
Organization: geniusrise
Home Page: https://geniusrise.ai/
inference-server,Text components powering LLMs & SLMs for geniusrise framework
Organization: geniusrise
Home Page: https://geniusrise.ai
inference-server,Vision and vision-multi-modal components for geniusrise framework
Organization: geniusrise
Home Page: https://docs.geniusrise.ai
inference-server,Fullstack machine learning inference template
User: haicheviet
inference-server,Advanced inference pipeline using NVIDIA Triton Inference Server for CRAFT text detection (PyTorch), including a converter from PyTorch -> ONNX -> TensorRT and inference pipelines (TensorRT, Triton server - multi-format). Supported model formats for Triton inference: TensorRT engine, TorchScript, ONNX
User: k9ele7en
inference-server,K3ai is a lightweight, fully automated, AI infrastructure-in-a-box solution that allows anyone to experiment quickly with Kubeflow pipelines. K3ai is perfect for anything from edge devices to laptops.
Organization: kf5i
inference-server,ONNX Runtime Server: provides TCP and HTTP/HTTPS REST APIs for ONNX inference.
User: kibae
inference-server,Inference Server Implementation from Scratch for Machine Learning Models
User: leimao
Home Page: https://leimao.github.io/project/Simple-Inference-Server/
inference-server,Run your own production inference code with Sagemaker
User: liusy182
inference-server,Deploy DL/ML inference pipelines with minimal extra code.
Organization: notai-tech
inference-server,A REST API for Caffe using Docker and Go
Organization: nvidia
inference-server,A networked inference server for Whisper, so you don't have to keep waiting for the audio model to reload for the x-hundredth time.
User: pandruszkow
inference-server,An open-source computer vision framework to build and deploy apps in minutes
Organization: pipeless-ai
Home Page: https://pipeless.ai
inference-server,An example of using Redis + RedisAI for a microservice that predicts consumer loan probabilities using Redis as a feature and model store and RedisAI as an inference server.
Organization: redisventures
inference-server,A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
Organization: roboflow
Home Page: https://inference.roboflow.com
inference-server,Roboflow's inference server to analyze video streams. This project extracts insights from video frames at defined intervals and generates informative visualizations and CSV outputs.
Organization: roboflow
Home Page: https://github.com/roboflow/inference
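The "defined intervals" in the entry above come down to simple frame-index arithmetic: convert the sampling interval into a frame step and pick every Nth frame. A hypothetical helper illustrating that idea (`frames_to_sample` and its signature are this sketch's invention, not part of the Roboflow project):

```python
def frames_to_sample(fps, interval_s, total_frames):
    """Return the indices of frames to analyze when sampling a stream of
    `total_frames` frames, recorded at `fps`, once every `interval_s`
    seconds. Hypothetical helper for illustration only."""
    # At least 1 so a sub-frame interval still advances through the stream.
    step = max(1, round(fps * interval_s))
    return list(range(0, total_frames, step))


# e.g. a 30 fps stream sampled once per second hits frames 0, 30, 60, ...
```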
inference-server,A standalone inference server for trained Rubix ML estimators.
Organization: rubixml
Home Page: https://rubixml.com
inference-server,Serve PyTorch inference requests using batching with Redis for faster performance.
Organization: saber-labs
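The batching pattern described above is the usual dynamic micro-batching loop: accumulate incoming requests until the batch is full or a timeout expires, then run the model once on the whole batch. A self-contained sketch using a stdlib `queue.Queue` where the repo uses Redis (`collect_batch` and its parameters are illustrative, not the project's API):

```python
# Illustrative dynamic micro-batching. In the repository above the
# request queue lives in Redis; a stdlib queue.Queue stands in here so
# the sketch is self-contained.
import queue
import time


def collect_batch(q, max_batch=8, timeout=0.01):
    """Pull up to `max_batch` items from `q`, waiting at most `timeout`
    seconds in total. Returns whatever arrived within the window, so the
    model is invoked once per batch instead of once per request."""
    batch = []
    deadline = time.monotonic() + timeout
    while len(batch) < max_batch:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        try:
            batch.append(q.get(timeout=remaining))
        except queue.Empty:
            break
    return batch
```

The trade-off is latency for throughput: a longer timeout yields fuller batches (better GPU utilization) at the cost of each request waiting longer.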
inference-server,Client/Server system to perform distributed inference on high load systems.
User: stefanolusardi
inference-server,Benchmark for machine learning model online serving (LLM, embedding, Stable-Diffusion, Whisper)
Organization: tensorchord
inference-server,Modelz is a developer-first platform for prototyping and deploying machine learning models.
Organization: tensorchord
Home Page: https://docs.modelz.ai
inference-server,Effortlessly Deploy and Serve Large Language Models in the Cloud as an API Endpoint for Inference
User: thefaheem
inference-server,Python + Inference - Model Deployment library in Python. Simplest model inference server ever.
Organization: underneathall
Home Page: https://pinferencia.underneathall.app
inference-server,Orkhon: ML Inference Framework and Server Runtime
User: vertexclique
inference-server,
User: zhangjun
inference-server,TensorRT Server
User: zhangjun