How to get the whole model list?

by tastypear

I'm curious about which models the serverless Inference API supports. The official documentation doesn't seem to give a specific list.

I just tested meta-llama/Meta-Llama-3.1-405B-Instruct-FP8 and it's available.
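In case it helps, here is a minimal sketch of how one might list candidates programmatically by querying the public Hub API at https://huggingface.co/api/models. The `inference=warm` query parameter is an assumption on my part (meant to filter for models currently deployed on serverless inference), as are the exact field names in the response; treat it as a starting point rather than an official method.

```python
# Sketch: list models from the Hub API, assuming an `inference=warm` filter
# restricts results to models served by the serverless Inference API.
# Requires `pip install requests`; an HF token is optional but may help
# with gated models.
import os
import requests

token = os.environ.get("HF_TOKEN")  # optional
headers = {"Authorization": f"Bearer {token}"} if token else {}

params = {
    "inference": "warm",            # assumption: serverless-deployed models only
    "pipeline_tag": "text-generation",
    "limit": 50,                    # page size; results are paginated
}

resp = requests.get("https://huggingface.co/api/models",
                    params=params, headers=headers)
resp.raise_for_status()

for model in resp.json():
    print(model["id"])
```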
