## List

`agents.evaluation_test_cases.list() -> EvaluationTestCaseListResponse`

**get** `/v2/gen-ai/evaluation_test_cases`

To list all evaluation test cases, send a GET request to `/v2/gen-ai/evaluation_test_cases`.

### Returns

- `class EvaluationTestCaseListResponse`
  - **evaluation\_test\_cases:** `Optional[List[APIEvaluationTestCase]]`
    - **archived\_at:** `Optional[datetime]`
    - **created\_at:** `Optional[datetime]`
    - **created\_by\_user\_email:** `Optional[str]`
    - **created\_by\_user\_id:** `Optional[str]`
    - **dataset:** `Optional[Dataset]`
      - **created\_at:** `Optional[datetime]` Time created at.
      - **dataset\_name:** `Optional[str]` Name of the dataset.
      - **dataset\_uuid:** `Optional[str]` UUID of the dataset.
      - **file\_size:** `Optional[str]` The size of the dataset's uploaded file, in bytes.
      - **has\_ground\_truth:** `Optional[bool]` Whether the dataset has a ground truth column.
      - **row\_count:** `Optional[int]` Number of rows in the dataset.
    - **dataset\_name:** `Optional[str]`
    - **dataset\_uuid:** `Optional[str]`
    - **description:** `Optional[str]`
    - **latest\_version\_number\_of\_runs:** `Optional[int]`
    - **metrics:** `Optional[List[APIEvaluationMetric]]`
      - **description:** `Optional[str]`
      - **inverted:** `Optional[bool]` If true, the metric is inverted, meaning that a lower value is better.
      - **metric\_name:** `Optional[str]`
      - **metric\_type:** `Optional[Literal["METRIC_TYPE_UNSPECIFIED", "METRIC_TYPE_GENERAL_QUALITY", "METRIC_TYPE_RAG_AND_TOOL"]]`
        - `"METRIC_TYPE_UNSPECIFIED"`
        - `"METRIC_TYPE_GENERAL_QUALITY"`
        - `"METRIC_TYPE_RAG_AND_TOOL"`
      - **metric\_uuid:** `Optional[str]`
      - **metric\_value\_type:** `Optional[Literal["METRIC_VALUE_TYPE_UNSPECIFIED", "METRIC_VALUE_TYPE_NUMBER", "METRIC_VALUE_TYPE_STRING", "METRIC_VALUE_TYPE_PERCENTAGE"]]`
        - `"METRIC_VALUE_TYPE_UNSPECIFIED"`
        - `"METRIC_VALUE_TYPE_NUMBER"`
        - `"METRIC_VALUE_TYPE_STRING"`
        - `"METRIC_VALUE_TYPE_PERCENTAGE"`
      - **range\_max:** `Optional[float]` The maximum value for the metric.
      - **range\_min:** `Optional[float]` The minimum value for the metric.
    - **name:** `Optional[str]`
    - **star\_metric:** `Optional[APIStarMetric]`
    - **test\_case\_uuid:** `Optional[str]`
    - **total\_runs:** `Optional[int]`
    - **updated\_at:** `Optional[datetime]`
    - **updated\_by\_user\_email:** `Optional[str]`
    - **updated\_by\_user\_id:** `Optional[str]`
    - **version:** `Optional[int]`

### Example

```python
from do_gradientai import GradientAI

client = GradientAI()

# List all evaluation test cases and print the raw list of results.
evaluation_test_cases = client.agents.evaluation_test_cases.list()
print(evaluation_test_cases.evaluation_test_cases)
```
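
Because every field in the response is `Optional`, and each test case may nest a `Dataset` and a list of `APIEvaluationMetric` objects, a minimal sketch of walking the full structure looks like the following. It assumes the client is configured through its usual environment credentials; all attribute names are taken from the schema above.

```python
from do_gradientai import GradientAI

client = GradientAI()

response = client.agents.evaluation_test_cases.list()

# Every field is Optional, so guard each level before dereferencing.
for test_case in response.evaluation_test_cases or []:
    print(f"{test_case.name} ({test_case.test_case_uuid}), total runs: {test_case.total_runs}")

    if test_case.dataset is not None:
        print(
            f"  dataset: {test_case.dataset.dataset_name}, "
            f"{test_case.dataset.row_count} rows, "
            f"ground truth: {test_case.dataset.has_ground_truth}"
        )

    for metric in test_case.metrics or []:
        # For inverted metrics, a lower value is better.
        direction = "lower is better" if metric.inverted else "higher is better"
        print(f"  metric: {metric.metric_name} [{metric.metric_type}] ({direction})")
```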