An unofficial, feature-rich API for Cerebra.ai with OpenAI compatibility and advanced tools.
Features • Installation • Usage • Server • OpenAI Compatibility • Tools • Benchmarks • Contributing • License
- Seamless integration with Cerebra.ai models
- OpenAI-compatible API endpoints
- Access to state-of-the-art language models
- Advanced tools for enhanced capabilities
- Built-in server with FastAPI
- Detailed usage metrics and quotas
- Secure cookie-based authentication
- Real-time streaming responses
- Python 3.7+
- Cookie-Editor browser extension
1. Clone the repository:

   ```bash
   git clone https://github.com/OE-LUCIFER/Cerebra.aiAPI.git
   cd Cerebra.aiAPI
   ```
2. Set up a virtual environment (optional but recommended):

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
   ```
3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```
4. Configure your cookies:
   - Visit the Cerebra.ai website
   - Use Cookie-Editor to export your cookies
   - Save them as `cookies.json` in the project root
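Before running the client, it can help to sanity-check the exported file. This is a hedged sketch: Cookie-Editor's JSON export is typically a list of cookie objects, each with at least `"name"` and `"value"` keys, but adjust the check if your export format differs.

```python
import json
from pathlib import Path

def validate_cookie_file(path: str) -> bool:
    """Return True if the file looks like a Cookie-Editor JSON export.

    Assumes the export is a non-empty list of objects, each carrying at
    least "name" and "value" keys; adapt if your format differs.
    """
    data = json.loads(Path(path).read_text())
    return (
        isinstance(data, list)
        and len(data) > 0
        and all(isinstance(c, dict) and "name" in c and "value" in c for c in data)
    )
```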
```python
from cerebra_ai_api import CerebrasWithCookie

client = CerebrasWithCookie(cookie_path='cookies.json')
response = client.ask("Explain the theory of relativity in simple terms.")
print(response)
```
```python
from cerebra_ai_api import ChatRequest  # assumed to be exported alongside CerebrasWithCookie

stream = client.generate_stream(
    ChatRequest(
        messages=[{"role": "user", "content": "Write a haiku about AI."}],
        model="llama3.1-70b",
        stream=True
    )
)
for chunk in stream:
    print(chunk['data'], end='', flush=True)
```
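If you want the complete response text in addition to live output, the chunks can be accumulated as they arrive. A minimal sketch, assuming each chunk is a dict with a `'data'` key holding a text fragment, as in the loop above:

```python
def collect_stream(stream) -> str:
    """Print chunks as they arrive and return the full response text.

    Assumes each chunk is a dict with a 'data' key holding a text
    fragment, matching the streaming loop shown above.
    """
    parts = []
    for chunk in stream:
        text = chunk['data']
        print(text, end='', flush=True)
        parts.append(text)
    return ''.join(parts)
```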
```python
response_8b = client.ask("Summarize the importance of quantum computing.", model="llama3.1-8b")
response_70b = client.ask("Summarize the importance of quantum computing.", model="llama3.1-70b")
print("8B Model:", response_8b)
print("70B Model:", response_70b)
```
```python
json_response = client.ask(
    "List the top 5 programming languages in 2023.",
    json_response=True
)
print(json_response)
```
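Since the reply arrives as a string, you will usually want to parse it. A defensive sketch (an assumption on my part, not part of the library): models sometimes wrap JSON in a Markdown code fence, so strip one if present before calling `json.loads`.

```python
import json

def parse_json_reply(reply: str):
    """Parse a model reply that is expected to be JSON.

    Strips a surrounding Markdown code fence if the model added one.
    Returns None if the text still fails to parse.
    """
    text = reply.strip()
    fence = "`" * 3
    if text.startswith(fence):
        # Drop the opening fence line (and optional language tag),
        # then the closing fence.
        text = text.split("\n", 1)[1] if "\n" in text else text
        text = text.rsplit(fence, 1)[0]
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return None
```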
```python
client = CerebrasWithCookie(cookie_path='cookies.json')
client.start_server(host="0.0.0.0", port=8000)
```
| Endpoint | Description |
|---|---|
| `/v1/chat/completions` | Chat completions API |
| `/v1/models` | List available models |
Access the interactive API documentation at http://localhost:8000/docs when the server is running.
Cerebra.aiAPI is designed as a drop-in replacement for the OpenAI Python library.
```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="dummy_key"
)

response = client.chat.completions.create(
    model="llama3.1-70b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain the concept of machine learning."}
    ]
)
print(response.choices[0].message.content)
```
Cerebra.aiAPI supports OpenAI-style tool (function) calling.

```python
import json

# `search` is a placeholder for your web-search backend
# (e.g. a wrapper around a search-engine API); adapt as needed.
def get_web_info(query: str, max_results: int = 5) -> str:
    results = search(query, num_results=max_results)
    return json.dumps([
        {"title": r.title, "link": r.url, "snippet": r.description}
        for r in results
    ])
```
```python
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_web_info",
            "description": "Search the web for current information",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string", "description": "The search query"},
                    "max_results": {"type": "integer", "description": "Maximum number of results"}
                },
                "required": ["query"]
            }
        }
    }
]
```
```python
response = client.chat.completions.create(
    model="llama3.1-70b",
    messages=[{"role": "user", "content": "What are the latest developments in AI?"}],
    tools=tools,
    tool_choice="auto"
)
print(response.choices[0].message.content)
```
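When the model decides to use a tool, the response carries tool calls that your code must execute before sending the results back. A hypothetical dispatcher sketch, written over plain dicts in the OpenAI tool-call shape (`{"function": {"name": ..., "arguments": <JSON string>}}`) so it is easy to adapt to `response.choices[0].message.tool_calls`:

```python
import json

def dispatch_tool_calls(tool_calls, registry):
    """Execute each tool call against a name -> function registry.

    `tool_calls` is a list of dicts shaped like OpenAI tool calls, where
    "arguments" is a JSON-encoded string of keyword arguments. Returns
    the list of tool results in call order.
    """
    results = []
    for call in tool_calls:
        name = call["function"]["name"]
        args = json.loads(call["function"]["arguments"])
        if name not in registry:
            results.append(f"Error: unknown tool {name!r}")
            continue
        results.append(registry[name](**args))
    return results
```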
More Tool Examples
```python
def calculate(expression: str) -> str:
    # Warning: eval() executes arbitrary Python, so never expose this
    # to untrusted input. Prefer a restricted arithmetic evaluator.
    try:
        return str(eval(expression))
    except Exception as e:
        return f"Error: {str(e)}"
```
```python
# Usage in tools list
{
    "type": "function",
    "function": {
        "name": "calculate",
        "description": "Perform mathematical calculations",
        "parameters": {
            "type": "object",
            "properties": {
                "expression": {"type": "string", "description": "The mathematical expression to evaluate"}
            },
            "required": ["expression"]
        }
    }
}
```
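Because `eval()` runs arbitrary code, a safer drop-in for the calculator tool is to walk the expression's AST and allow only arithmetic. A sketch (my own suggestion, not part of the library), limited to numbers and basic operators:

```python
import ast
import operator

# Whitelisted AST operator nodes mapped to their implementations.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Mod: operator.mod,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def safe_calculate(expression: str) -> str:
    """Evaluate a purely arithmetic expression without eval()."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("unsupported expression")
    try:
        return str(_eval(ast.parse(expression, mode="eval")))
    except Exception as e:
        return f"Error: {e}"
```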
```python
import requests

def get_weather(city: str) -> str:
    API_KEY = "your_openweathermap_api_key"
    url = f"http://api.openweathermap.org/data/2.5/weather?q={city}&appid={API_KEY}&units=metric"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    data = response.json()
    return f"The current temperature in {city} is {data['main']['temp']}°C with {data['weather'][0]['description']}."
```
```python
# Usage in tools list
{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather information for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "The name of the city"}
            },
            "required": ["city"]
        }
    }
}
```
We welcome contributions from the community! Here's how you can help:
1. Fork the repository
2. Clone your fork:

   ```bash
   git clone https://github.com/<your-username>/Cerebra.aiAPI.git
   ```

3. Create a new branch:

   ```bash
   git checkout -b feature-name
   ```

4. Make your changes and commit them:

   ```bash
   git commit -m 'Add some feature'
   ```

5. Push to the branch:

   ```bash
   git push origin feature-name
   ```

6. Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.