SDKs & Libraries
Official SDKs and libraries for integrating DeepSeek AI into your applications across multiple programming languages and platforms.
Overview
DeepSeek provides comprehensive SDKs for popular programming languages, making it easy to integrate AI capabilities into your applications:
- Official SDKs: Maintained by DeepSeek team
- Community Libraries: Community-maintained integrations
- Framework Integrations: Direct framework support
- Platform Tools: Platform-specific tools and extensions
Official SDKs
Python SDK
The most comprehensive SDK with full feature support.
Installation

```bash
pip install deepseek-ai
```

Quick Start

```python
from deepseek import DeepSeek

client = DeepSeek(api_key="your-api-key")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "user", "content": "Hello, world!"}
    ]
)

print(response.choices[0].message.content)
```
Advanced Features

```python
# Streaming responses
for chunk in client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
):
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

# Function calling
functions = [
    {
        "name": "get_weather",
        "description": "Get weather information",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"}
            }
        }
    }
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    functions=functions,
    function_call="auto"
)

# Vision capabilities
response = client.chat.completions.create(
    model="deepseek-vision",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {"type": "image_url", "image_url": {"url": "data:image/jpeg;base64,..."}}
            ]
        }
    ]
)
```
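The function-calling request above only declares that `get_weather` exists; your code still has to detect the model's function call, run the function, and send the result back for a final answer. The following is a minimal sketch continuing from the example above. It assumes the response exposes an OpenAI-style `message.function_call` with `name` and JSON-encoded `arguments` (not confirmed here), and `get_weather` is a hypothetical local implementation.

```python
import json

# Hypothetical local implementation of the declared function
def get_weather(location: str) -> str:
    return f"Sunny, 22°C in {location}"

message = response.choices[0].message
if getattr(message, "function_call", None):
    args = json.loads(message.function_call.arguments)
    result = get_weather(**args)

    # Return the function result so the model can produce a final answer
    follow_up = client.chat.completions.create(
        model="deepseek-chat",
        messages=[
            {"role": "user", "content": "What's the weather in Tokyo?"},
            {"role": "assistant", "content": None, "function_call": {
                "name": message.function_call.name,
                "arguments": message.function_call.arguments,
            }},
            {"role": "function", "name": "get_weather", "content": result},
        ],
    )
    print(follow_up.choices[0].message.content)
```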
Async Support

```python
import asyncio
from deepseek import AsyncDeepSeek

async def main():
    client = AsyncDeepSeek(api_key="your-api-key")
    response = await client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello!"}]
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```
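Where the async client pays off is in fanning out several requests at once. A minimal sketch, assuming `AsyncDeepSeek` (shown above) can serve concurrent calls from a single client instance:

```python
import asyncio
from deepseek import AsyncDeepSeek

async def ask(client: AsyncDeepSeek, question: str) -> str:
    response = await client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

async def main():
    client = AsyncDeepSeek(api_key="your-api-key")
    questions = ["What is Python?", "What is Go?", "What is Rust?"]
    # The three requests run concurrently instead of back to back
    answers = await asyncio.gather(*(ask(client, q) for q in questions))
    for question, answer in zip(questions, answers):
        print(f"{question}\n{answer}\n")

asyncio.run(main())
```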
JavaScript/TypeScript SDK
Full-featured SDK for Node.js and browser environments.
Installation

```bash
npm install deepseek-ai
# or
yarn add deepseek-ai
```

Node.js Usage

```javascript
import DeepSeek from 'deepseek-ai';

const client = new DeepSeek({
  apiKey: process.env.DEEPSEEK_API_KEY,
});

const response = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [
    { role: 'user', content: 'Hello, world!' }
  ],
});

console.log(response.choices[0].message.content);
```
TypeScript Support

```typescript
import DeepSeek, { ChatCompletion } from 'deepseek-ai';

const client = new DeepSeek({
  apiKey: process.env.DEEPSEEK_API_KEY,
});

const response: ChatCompletion = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [
    { role: 'user', content: 'Explain TypeScript benefits' }
  ],
  max_tokens: 150,
  temperature: 0.7,
});

console.log(response.choices[0].message.content);
```
Streaming in JavaScript

```javascript
const stream = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});

for await (const chunk of stream) {
  if (chunk.choices[0]?.delta?.content) {
    process.stdout.write(chunk.choices[0].delta.content);
  }
}
```
Browser Usage

```html
<!DOCTYPE html>
<html>
<head>
  <script src="https://cdn.jsdelivr.net/npm/deepseek-ai@latest/dist/browser.js"></script>
</head>
<body>
  <script>
    const client = new DeepSeek({
      apiKey: 'your-api-key',
      dangerouslyAllowBrowser: true
    });

    async function chat() {
      const response = await client.chat.completions.create({
        model: 'deepseek-chat',
        messages: [
          { role: 'user', content: 'Hello from browser!' }
        ]
      });
      console.log(response.choices[0].message.content);
    }

    chat();
  </script>
</body>
</html>
```

Note that any API key shipped to the browser is visible to users; as the `dangerouslyAllowBrowser` flag suggests, prefer proxying requests through your own backend in production.
Go SDK
High-performance SDK for Go applications.
Installation

```bash
go get github.com/deepseek-ai/deepseek-go
```

Basic Usage

```go
package main

import (
    "context"
    "fmt"
    "log"

    "github.com/deepseek-ai/deepseek-go"
)

func main() {
    client := deepseek.NewClient("your-api-key")

    resp, err := client.CreateChatCompletion(
        context.Background(),
        deepseek.ChatCompletionRequest{
            Model: "deepseek-chat",
            Messages: []deepseek.ChatCompletionMessage{
                {
                    Role:    "user",
                    Content: "Hello, Go!",
                },
            },
        },
    )
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(resp.Choices[0].Message.Content)
}
```
Streaming in Go

```go
// In addition to the imports above, this snippet needs "errors" and "io".
stream, err := client.CreateChatCompletionStream(
    context.Background(),
    deepseek.ChatCompletionRequest{
        Model: "deepseek-chat",
        Messages: []deepseek.ChatCompletionMessage{
            {Role: "user", Content: "Tell me about Go"},
        },
        Stream: true,
    },
)
if err != nil {
    log.Fatal(err)
}
defer stream.Close()

for {
    response, err := stream.Recv()
    if errors.Is(err, io.EOF) {
        break
    }
    if err != nil {
        log.Fatal(err)
    }
    fmt.Print(response.Choices[0].Delta.Content)
}
```
Java SDK
Enterprise-ready SDK for Java applications.
Installation (Maven)

```xml
<dependency>
    <groupId>com.deepseek</groupId>
    <artifactId>deepseek-java</artifactId>
    <version>1.0.0</version>
</dependency>
```

Installation (Gradle)

```gradle
implementation 'com.deepseek:deepseek-java:1.0.0'
```
Basic Usage

```java
import com.deepseek.DeepSeekClient;
import com.deepseek.model.ChatCompletion;
import com.deepseek.model.ChatMessage;

public class DeepSeekExample {
    public static void main(String[] args) {
        DeepSeekClient client = new DeepSeekClient("your-api-key");

        ChatCompletion response = client.createChatCompletion(
            ChatCompletion.builder()
                .model("deepseek-chat")
                .addMessage(ChatMessage.user("Hello, Java!"))
                .build()
        );

        System.out.println(response.getChoices().get(0).getMessage().getContent());
    }
}
```
Async Usage

```java
import java.util.concurrent.CompletableFuture;

CompletableFuture<ChatCompletion> future = client.createChatCompletionAsync(
    ChatCompletion.builder()
        .model("deepseek-chat")
        .addMessage(ChatMessage.user("Async hello!"))
        .build()
);

future.thenAccept(response -> {
    System.out.println(response.getChoices().get(0).getMessage().getContent());
});
```
C# SDK
Full-featured SDK for .NET applications.
Installation

```bash
dotnet add package DeepSeek.AI
```

Basic Usage

```csharp
using DeepSeek.AI;

var client = new DeepSeekClient("your-api-key");

var response = await client.Chat.CreateCompletionAsync(new ChatRequest
{
    Model = "deepseek-chat",
    Messages = new[]
    {
        new ChatMessage { Role = "user", Content = "Hello, C#!" }
    }
});

Console.WriteLine(response.Choices[0].Message.Content);
```
Streaming in C#

```csharp
await foreach (var chunk in client.Chat.CreateCompletionStreamAsync(new ChatRequest
{
    Model = "deepseek-chat",
    Messages = new[] { new ChatMessage { Role = "user", Content = "Tell me about .NET" } },
    Stream = true
}))
{
    if (chunk.Choices?[0]?.Delta?.Content != null)
    {
        Console.Write(chunk.Choices[0].Delta.Content);
    }
}
```
Ruby SDK
Elegant SDK for Ruby applications.
Installation

```bash
gem install deepseek-ai
```

Basic Usage

```ruby
require 'deepseek'

client = DeepSeek::Client.new(api_key: 'your-api-key')

response = client.chat.completions.create(
  model: 'deepseek-chat',
  messages: [
    { role: 'user', content: 'Hello, Ruby!' }
  ]
)

puts response.choices[0].message.content
```
Streaming in Ruby

```ruby
client.chat.completions.create(
  model: 'deepseek-chat',
  messages: [{ role: 'user', content: 'Tell me about Ruby' }],
  stream: true
) do |chunk|
  print chunk.choices[0].delta.content if chunk.choices[0].delta.content
end
```
PHP SDK
Comprehensive SDK for PHP applications.
Installation

```bash
composer require deepseek/deepseek-php
```

Basic Usage

```php
<?php

require_once 'vendor/autoload.php';

use DeepSeek\DeepSeekClient;

$client = new DeepSeekClient('your-api-key');

$response = $client->chat()->completions()->create([
    'model' => 'deepseek-chat',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello, PHP!']
    ]
]);

echo $response['choices'][0]['message']['content'];
```
Framework Integrations
React Integration
React Hook

```javascript
import { useDeepSeek } from 'deepseek-react';

function ChatComponent() {
  const { chat, loading, error } = useDeepSeek({
    apiKey: process.env.REACT_APP_DEEPSEEK_API_KEY
  });

  const handleSubmit = async (message) => {
    const response = await chat({
      model: 'deepseek-chat',
      messages: [{ role: 'user', content: message }]
    });
    return response.choices[0].message.content;
  };

  return (
    <div>
      {loading && <p>Thinking...</p>}
      {error && <p>Error: {error.message}</p>}
      {/* Your chat UI */}
    </div>
  );
}
```
Vue.js Integration
Vue Composable

```javascript
import { useDeepSeek } from 'deepseek-vue';

export default {
  setup() {
    const { chat, loading, error } = useDeepSeek({
      apiKey: process.env.VUE_APP_DEEPSEEK_API_KEY
    });

    const sendMessage = async (content) => {
      const response = await chat({
        model: 'deepseek-chat',
        messages: [{ role: 'user', content }]
      });
      return response.choices[0].message.content;
    };

    return {
      sendMessage,
      loading,
      error
    };
  }
};
```
Next.js Integration
API Route

```javascript
// pages/api/chat.js
import { DeepSeek } from 'deepseek-ai';

const client = new DeepSeek({
  apiKey: process.env.DEEPSEEK_API_KEY,
});

export default async function handler(req, res) {
  if (req.method !== 'POST') {
    return res.status(405).json({ error: 'Method not allowed' });
  }

  try {
    const { messages } = req.body;

    const response = await client.chat.completions.create({
      model: 'deepseek-chat',
      messages,
    });

    res.status(200).json(response);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
}
```
Express.js Integration
Express Middleware

```javascript
const express = require('express');
const { DeepSeek } = require('deepseek-ai');

const app = express();
const client = new DeepSeek({ apiKey: process.env.DEEPSEEK_API_KEY });

app.use(express.json());

app.post('/api/chat', async (req, res) => {
  try {
    const { messages } = req.body;

    const response = await client.chat.completions.create({
      model: 'deepseek-chat',
      messages,
    });

    res.json(response);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});
```
Django Integration
Django View

```python
import json

from django.conf import settings
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_http_methods

from deepseek import DeepSeek

client = DeepSeek(api_key=settings.DEEPSEEK_API_KEY)

@csrf_exempt
@require_http_methods(["POST"])
def chat_view(request):
    try:
        data = json.loads(request.body)
        messages = data.get('messages', [])

        response = client.chat.completions.create(
            model="deepseek-chat",
            messages=messages
        )

        return JsonResponse({
            'content': response.choices[0].message.content
        })
    except Exception as e:
        return JsonResponse({'error': str(e)}, status=500)
```
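To route requests to this view, add it to your URL configuration. A minimal sketch, assuming the view above lives in `chat/views.py` (the app and path names here are illustrative):

```python
# urls.py — illustrative wiring; adjust the import path to your project layout
from django.urls import path

from chat.views import chat_view

urlpatterns = [
    path("api/chat/", chat_view, name="chat"),
]
```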
Flask Integration
Flask Application

```python
import os

from flask import Flask, request, jsonify
from deepseek import DeepSeek

app = Flask(__name__)
client = DeepSeek(api_key=os.getenv('DEEPSEEK_API_KEY'))

@app.route('/api/chat', methods=['POST'])
def chat():
    try:
        data = request.get_json()
        messages = data.get('messages', [])

        response = client.chat.completions.create(
            model="deepseek-chat",
            messages=messages
        )

        return jsonify({
            'content': response.choices[0].message.content
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(debug=True)
```
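With the server running, any HTTP client can exercise the endpoint. A quick sketch using the `requests` library, assuming the Flask app above is running on its default local address:

```python
import requests

# Assumes the Flask app above is running locally on its default port (5000)
resp = requests.post(
    "http://127.0.0.1:5000/api/chat",
    json={"messages": [{"role": "user", "content": "Hello from a client!"}]},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["content"])
```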
Platform Tools
VS Code Extension
Installation
- Open VS Code
- Go to Extensions (Ctrl+Shift+X)
- Search for "DeepSeek AI"
- Click Install
Features
- Code Completion: AI-powered code suggestions
- Code Explanation: Understand complex code
- Bug Detection: Identify potential issues
- Code Generation: Generate code from comments
- Refactoring: Improve code structure
Usage

```javascript
// Type a comment and press Ctrl+Shift+P, then "DeepSeek: Generate Code"

// Create a function that calculates fibonacci numbers
// The extension will generate:
function fibonacci(n) {
  if (n <= 1) return n;
  return fibonacci(n - 1) + fibonacci(n - 2);
}
```
JetBrains Plugin
Installation
- Open your JetBrains IDE
- Go to File → Settings → Plugins
- Search for "DeepSeek AI"
- Install and restart
Features
- Smart Code Completion
- Code Review Assistant
- Documentation Generation
- Test Case Generation
- Code Optimization Suggestions
CLI Tool
Installation

```bash
npm install -g deepseek-cli
# or
pip install deepseek-cli
```

Usage

```bash
# Interactive chat
deepseek chat

# Code generation
deepseek code "create a REST API for user management"

# Code review
deepseek review ./src/main.js

# Documentation generation
deepseek docs ./src --output ./docs
```
Community Libraries
Unofficial SDKs
Rust

```bash
cargo add deepseek-rs
```

Swift

```swift
// Add to Package.swift (Swift Package Manager)
.package(url: "https://github.com/deepseek-ai/deepseek-swift", from: "1.0.0")
```

Kotlin

```gradle
implementation 'com.deepseek:deepseek-kotlin:1.0.0'
```
Community Tools
LangChain Integration

```python
from langchain.llms import DeepSeek

llm = DeepSeek(
    api_key="your-api-key",
    model="deepseek-chat"
)

response = llm("Explain machine learning")
print(response)
```
Streamlit Integration

```python
import streamlit as st
from deepseek import DeepSeek

client = DeepSeek(api_key=st.secrets["DEEPSEEK_API_KEY"])

st.title("DeepSeek Chat")

if prompt := st.chat_input("What's on your mind?"):
    with st.chat_message("user"):
        st.write(prompt)

    with st.chat_message("assistant"):
        response = client.chat.completions.create(
            model="deepseek-chat",
            messages=[{"role": "user", "content": prompt}]
        )
        st.write(response.choices[0].message.content)
```
SDK Comparison
Feature Matrix

| Feature | Python | JavaScript | Go | Java | C# | Ruby | PHP |
|---|---|---|---|---|---|---|---|
| Chat Completions | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Streaming | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
| Function Calling | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Vision API | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
| Audio API | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ | ❌ |
| Async Support | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
| Type Safety | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
Performance Benchmarks
Request Latency (ms)
- Python: 45ms average
- JavaScript: 42ms average
- Go: 38ms average
- Java: 41ms average
- C#: 43ms average
Memory Usage
- Python: 25MB baseline
- JavaScript: 18MB baseline
- Go: 12MB baseline
- Java: 35MB baseline
- C#: 28MB baseline
Best Practices
SDK Selection
Choose Based On:
- Language Ecosystem: Use your primary development language
- Performance Requirements: Go for high-performance applications
- Type Safety: TypeScript/Java/C# for large applications
- Rapid Prototyping: Python/JavaScript for quick development
Error Handling
Python Example

```python
from deepseek import DeepSeek, DeepSeekError

client = DeepSeek(api_key="your-api-key")

try:
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello!"}]
    )
except DeepSeekError as e:
    print(f"DeepSeek API error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")
```
Rate Limiting
Exponential Backoff

```python
import time
import random

from deepseek import RateLimitError  # assumes the SDK exposes RateLimitError alongside DeepSeekError

def make_request_with_backoff(client, request_func, max_retries=3):
    for attempt in range(max_retries):
        try:
            return request_func()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise
            wait_time = (2 ** attempt) + random.uniform(0, 1)
            time.sleep(wait_time)
```
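Usage is then a matter of wrapping the SDK call in a callable, for example with the client from the error-handling example above:

```python
# Wrap the SDK call in a lambda so it can be retried on rate-limit errors
response = make_request_with_backoff(
    client,
    lambda: client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello!"}],
    ),
)
print(response.choices[0].message.content)
```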
Security
API Key Management

```python
import os
from deepseek import DeepSeek

# Use environment variables
api_key = os.getenv('DEEPSEEK_API_KEY')
if not api_key:
    raise ValueError("DEEPSEEK_API_KEY environment variable is required")

client = DeepSeek(api_key=api_key)
```
Support & Resources
Documentation
- API Reference: /en/api-reference
- Getting Started: /en/getting-started
- Examples: https://github.com/deepseek-ai/examples
Community
- GitHub: https://github.com/deepseek-ai
- Discord: https://discord.gg/deepseek
- Stack Overflow: Tag your questions with `deepseek-ai`
Support
- Technical Support: support@deepseek.com
- SDK Issues: Report on respective GitHub repositories
- Feature Requests: https://feedback.deepseek.com
Ready to start building? Choose your preferred SDK and begin integrating DeepSeek AI into your applications today!