Google Cloud AI Platform V1 Client - Class HarmCategory (1.6.0)

Reference documentation and code samples for the Google Cloud AI Platform V1 Client class HarmCategory.

Harm categories that will block content.

Protobuf type google.cloud.aiplatform.v1.HarmCategory

Namespace

Google \ Cloud \ AIPlatform \ V1

Methods

static::name

Converts a numeric enum value into its constant name. Throws an \UnexpectedValueException if the value does not correspond to a known constant.

Parameter
Name Description
value mixed The numeric enum value to look up.

static::value

Converts a constant name into its numeric enum value. Throws an \UnexpectedValueException if the name does not correspond to a known constant.

Parameter
Name Description
name mixed The constant name to look up.
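A minimal sketch of the name/value conversion helpers, assuming the google/cloud-aiplatform package is installed via Composer:

```php
<?php
// Sketch: converting between HarmCategory constant names and numeric values.
// Assumes `composer require google/cloud-aiplatform` has been run.
require 'vendor/autoload.php';

use Google\Cloud\AIPlatform\V1\HarmCategory;

// name(): numeric value -> constant name.
echo HarmCategory::name(HarmCategory::HARM_CATEGORY_HATE_SPEECH), "\n";

// value(): constant name -> numeric value.
echo HarmCategory::value('HARM_CATEGORY_HATE_SPEECH'), "\n";

// Unknown inputs throw \UnexpectedValueException.
try {
    HarmCategory::name(999);
} catch (\UnexpectedValueException $e) {
    echo "Unknown enum value\n";
}
```

Both helpers are generated on every protobuf enum class, so the same pattern applies to other enums in this namespace.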

Constants

HARM_CATEGORY_UNSPECIFIED

Value: 0

The harm category is unspecified.

Generated from protobuf enum HARM_CATEGORY_UNSPECIFIED = 0;

HARM_CATEGORY_HATE_SPEECH

Value: 1

The harm category is hate speech.

Generated from protobuf enum HARM_CATEGORY_HATE_SPEECH = 1;

HARM_CATEGORY_DANGEROUS_CONTENT

Value: 2

The harm category is dangerous content.

Generated from protobuf enum HARM_CATEGORY_DANGEROUS_CONTENT = 2;

HARM_CATEGORY_HARASSMENT

Value: 3

The harm category is harassment.

Generated from protobuf enum HARM_CATEGORY_HARASSMENT = 3;

HARM_CATEGORY_SEXUALLY_EXPLICIT

Value: 4

The harm category is sexually explicit content.

Generated from protobuf enum HARM_CATEGORY_SEXUALLY_EXPLICIT = 4;
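A sketch of how these constants are typically used, pairing a HarmCategory with a blocking threshold on a SafetySetting message from the same library (the threshold value chosen here is only illustrative):

```php
<?php
// Sketch: configuring a safety setting for one harm category.
// Assumes `composer require google/cloud-aiplatform` has been run.
require 'vendor/autoload.php';

use Google\Cloud\AIPlatform\V1\HarmCategory;
use Google\Cloud\AIPlatform\V1\SafetySetting;
use Google\Cloud\AIPlatform\V1\SafetySetting\HarmBlockThreshold;

// Block hate speech at a low threshold; categories not listed keep defaults.
$setting = (new SafetySetting())
    ->setCategory(HarmCategory::HARM_CATEGORY_HATE_SPEECH)
    ->setThreshold(HarmBlockThreshold::BLOCK_LOW_AND_ABOVE);

// The stored category round-trips back to its constant name.
echo HarmCategory::name($setting->getCategory()), "\n";
```

A request that generates content would carry one such SafetySetting per category to override; HARM_CATEGORY_UNSPECIFIED is not a valid category to configure.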