H2O Eval Studio overview
H2O Eval Studio is a modular and extensible studio for Retrieval-Augmented Generation (RAG) and Large Language Model (LLM) evaluation. It aims to provide systematic assessment of RAG/LLM performance, reliability, security, fairness, and effectiveness across a variety of applications. Eval Studio provides a collection of RAG/LLM evaluators for use in RAG/LLM application development and operations.
Eval Studio homepage
The H2O Eval Studio homepage displays the most recent model hosts, tests, and evaluations. You can add models and tests from this page by clicking the Add Model and Add Test buttons.
Using the sidebar
Navigate through the following pages by clicking their icons in the sidebar:
- Home
- Evaluations
- Tests
- Model hosts
- Evaluators
From the sidebar, you can also quickly access the most recent evaluations or start a new one by clicking the New Evaluation button.
Send feedback about H2O Eval Studio to cloud-feedback@h2o.ai.