glm-4-32b

Neural Network

Max. Response Length (in tokens): 4,096
Context Size (in tokens): 32,000
Prompt Cost (per 1M tokens): $0.24
Response Cost (per 1M tokens): $0.24
Prompt Image (per 1K tokens): $0
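
As a rough guide, the listed per-token prices can be turned into a per-request cost estimate. The sketch below only uses the prices shown above; the function name and the example token counts are illustrative assumptions, not part of any BotHub SDK.

# Rough per-request cost estimate for glm-4-32b, based on the listed
# $0.24 per 1M prompt tokens and $0.24 per 1M response tokens.
PROMPT_COST_PER_1M = 0.24    # USD per 1M prompt tokens
RESPONSE_COST_PER_1M = 0.24  # USD per 1M response tokens

def estimate_cost(prompt_tokens: int, response_tokens: int) -> float:
    """Return the estimated cost of one request in USD."""
    return (prompt_tokens * PROMPT_COST_PER_1M
            + response_tokens * RESPONSE_COST_PER_1M) / 1_000_000

# Illustrative example: a 10,000-token prompt and a 1,000-token response.
print(f"${estimate_cost(10_000, 1_000):.4f}")  # -> $0.0026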


How does glm-4-32b work?

BotHub gathers information...