Llama 3.3 70B

by Meta

High-performance multilingual LLM optimized for dialogue and instruction following.

Parameters
70B
Context Length
128K
Category
chat
Available Serverless

Run queries immediately, pay only for usage

$0.88 input | $0.88 output

Per 1M Tokens
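
As a rough guide to the pay-per-use pricing, a request's cost can be estimated from its token counts. The sketch below is illustrative only; the token figures in it are made up.

# Illustrative cost estimate at $0.88 per 1M tokens for both input and output
PRICE_PER_MILLION_USD = 0.88

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Approximate USD cost of a single request."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_MILLION_USD

# Example: a 100K-token prompt with a 10K-token reply costs roughly $0.097
print(f"${estimate_cost(100_000, 10_000):.3f}")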

About this model

Llama 3.3 70B is Meta's latest instruction-tuned language model, offering exceptional performance across a wide range of tasks. With 70 billion parameters and a 128K context window, it excels at complex reasoning, coding, multilingual tasks, and creative writing. The model has been fine-tuned using RLHF to be helpful, harmless, and honest.

Capabilities

  • Chat & Dialogue
  • Instruction Following
  • Code Generation
  • Multilingual
  • Reasoning
  • Function Calling

Use Cases

  • Customer Support Bots
  • Code Assistants
  • Content Generation
  • Data Analysis
  • Research Assistants

Model Details

Provider
Meta
Model ID
meta-llama/Llama-3.3-70B-Instruct
Parameters
70B
Context Length
128K tokens
Category
chat

API Usage

Use the DOS API to integrate Llama 3.3 70B into your applications. Our API is compatible with OpenAI's client libraries for easy migration.
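
Because the API follows the OpenAI chat completions format, the official OpenAI Python client can also be pointed at the DOS endpoint. The sketch below assumes the https://api.dos.ai/v1 base URL and the DOS_API_KEY environment variable shown in the cURL example further down.

import os
from openai import OpenAI

# Point the standard OpenAI client at the DOS endpoint
client = OpenAI(
    base_url="https://api.dos.ai/v1",
    api_key=os.environ["DOS_API_KEY"],
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)

print(response.choices[0].message.content)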

Model ID

meta-llama/Llama-3.3-70B-Instruct

Python

from dos import DOS

client = DOS()

response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

print(response.choices[0].message.content)
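
Function calling is listed among the model's capabilities, and the API mirrors the OpenAI chat completions format, so a tool-use request can be sketched as follows. The get_weather tool and its schema are illustrative, and the tools/tool_calls fields assume DOS follows the OpenAI conventions.

from dos import DOS

client = DOS()

# Describe a tool the model may call; the schema follows the OpenAI function-calling format
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "What's the weather in Paris today?"}],
    tools=tools,
)

# If the model decided to call the tool, the call details appear here
print(response.choices[0].message.tool_calls)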

cURL

curl https://api.dos.ai/v1/chat/completions \
  -H "Authorization: Bearer $DOS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "meta-llama/Llama-3.3-70B-Instruct",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'

Node.js

import DOS from 'dos-ai';

const client = new DOS();

const response = await client.chat.completions.create({
  model: "meta-llama/Llama-3.3-70B-Instruct",
  messages: [
    { role: "user", content: "Hello, how are you?" }
  ]
});

console.log(response.choices[0].message.content);