
Getting Started

Install the SDK, authenticate, and make your first calls.

Installation

pip install strongly

Requires Python 3.9+. In Strongly workspaces, the SDK is pre-installed and auto-configured.

Getting Your API Key

  1. Log into the Strongly platform
  2. Go to Profile > Security
  3. Under API Keys, click Create API Key
  4. Name your key and select the permissions you need
  5. Copy the key — it's only shown once

Keys look like sk-prod-a1b2c3d4...

Setting Up the Client

Pass your key directly

from strongly import Strongly

client = Strongly(api_key="sk-prod-...")

Environment variable

export STRONGLY_API_KEY=sk-prod-...

from strongly import Strongly

client = Strongly()  # picks up from environment

Config file

Create ~/.strongly/config:

[default]
api_key = sk-prod-...

from strongly import Strongly

client = Strongly()  # picks up from config file

Inside Strongly workspaces

No setup needed. The SDK auto-detects workspace credentials:

from strongly import Strongly

client = Strongly()  # just works

Client Configuration

client = Strongly(
    api_key="sk-prod-...",                  # API key (or use env var)
    base_url="https://<your-instance>",     # Your Strongly instance URL
    timeout=60.0,                           # Request timeout in seconds
    max_retries=3,                          # Retries on server errors
    default_headers={"X-Custom": "value"},  # Extra headers
)
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | str | None | API key. Auto-resolved if not provided |
| base_url | str | None | Base URL for your Strongly instance (e.g., "https://mycompany.strongly.ai") |
| timeout | float | 60.0 | Request timeout in seconds |
| max_retries | int | 3 | Number of retries on server errors |
| default_headers | dict | None | Extra headers sent with every request |
| http_client | httpx.Client | None | Custom httpx client instance |
| on_request | Callable | None | Hook called before each HTTP request |
| on_response | Callable | None | Hook called after each HTTP response |

Event Hooks

Monitor all HTTP traffic with callback hooks:

def log_request(method, url, **kwargs):
    print(f"→ {method} {url}")

def log_response(method, url, status_code, **kwargs):
    print(f"← {status_code} {method} {url}")

client = Strongly(
    on_request=log_request,
    on_response=log_response,
)

Context Manager

The client can be used as a context manager to ensure the HTTP connection is properly closed:

with Strongly() as client:
    apps = client.apps.list().to_list()
# connection is closed here

Verify Your Connection

me = client.auth.whoami()
print(f"Logged in as: {me.email}")
print(f"User ID: {me.user_id}")
print(f"Organization: {me.organization.id}")
print(f"Roles: {me.roles}")

Your First Steps

List Your Apps

for app in client.apps.list():
    print(f"{app.name}: {app.status}")

Create and Deploy an App

app = client.apps.create({
    "name": "my-api",
    "description": "Customer-facing API",
})

client.apps.deploy(app.id)

# Check status
import time

while True:
    status = client.apps.status(app.id)
    if status.ready_replicas > 0:
        print(f"App is live at {status.url}")
        break
    print(f"State: {status.state}...")
    time.sleep(5)

Run a Workflow

# Find an active workflow
for wf in client.workflows.list(status="active"):
print(f"{wf.name} (ID: {wf.id})")

# Execute it
result = client.workflows.execute("wf-abc123")
exec_id = result["executionId"]

# Track progress
progress = client.executions.progress(exec_id)
print(f"{progress.completed_nodes}/{progress.total_nodes} nodes complete")

Chat with an AI Model

response = client.ai.inference.chat_completion(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a Python expert."},
        {"role": "user", "content": "What's the best way to handle errors in Python?"},
    ],
)
print(response.choices[0].message.content)

Stream a Response

for chunk in client.ai.inference.chat_completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a haiku about debugging"}],
    stream=True,
):
    print(chunk.content, end="", flush=True)
print()

Working with Lists

When you call .list(), you get an iterator that fetches results in batches automatically:

# Iterate through all results
for addon in client.addons.list():
    print(addon.label)

# Control batch size
for addon in client.addons.list(limit=10):
    print(addon.label)

# Get everything as a plain Python list
all_addons = client.addons.list().to_list()

# Just grab the first result
first = client.addons.list().first()

# Check the total count
paginator = client.addons.list()
next(paginator)  # fetch first batch
print(f"Total: {paginator.total}")
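Conceptually, a batch-fetching iterator like this behaves as a generator that requests one page at a time and yields items as you consume them. A stdlib-only sketch of the idea (not the SDK's actual implementation; `fetch_page` stands in for the HTTP call):

```python
def paginate(fetch_page, limit=100):
    """Yield items one at a time, fetching batches of `limit` lazily."""
    offset = 0
    while True:
        batch = fetch_page(offset=offset, limit=limit)
        if not batch:
            return
        yield from batch
        if len(batch) < limit:
            return  # a short batch means we reached the end
        offset += len(batch)

# Stand-in for an HTTP page fetch over 250 records
records = list(range(250))
fetched_offsets = []

def fetch_page(offset, limit):
    fetched_offsets.append(offset)
    return records[offset:offset + limit]

items = list(paginate(fetch_page, limit=100))
# Only three requests were made: offsets 0, 100, 200
```

Because the generator is lazy, calling `.first()` or breaking out of the loop early means later pages are never fetched.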

Handling Errors

The SDK raises specific exceptions so you can handle each case:

from strongly import (
    Strongly,
    NotFoundError,
    AuthenticationError,
    PermissionDeniedError,
    RateLimitError,
    ValidationError,
)

client = Strongly()

try:
    app = client.apps.retrieve("nonexistent")
except NotFoundError as e:
    print(f"Not found: {e.message}")
except AuthenticationError:
    print("Check your credentials")
except PermissionDeniedError:
    print("Your key doesn't have permission for this")
except RateLimitError as e:
    print(f"Too many requests — wait {e.retry_after}s")
except ValidationError as e:
    print(f"Invalid input: {e.message}")
    for detail in e.details:
        print(f"  {detail}")
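When you do retry after a RateLimitError, honor its retry_after value rather than hammering the API. A minimal stdlib-only sketch of that pattern (the RateLimitError class here is a stand-in for the SDK's exception, not the real one):

```python
import time

class RateLimitError(Exception):
    """Stand-in for the SDK's RateLimitError."""
    def __init__(self, retry_after):
        super().__init__(f"retry after {retry_after}s")
        self.retry_after = retry_after

def call_with_retry(fn, max_attempts=3, sleep=time.sleep):
    """Call fn(), waiting retry_after seconds between rate-limited attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except RateLimitError as err:
            if attempt == max_attempts:
                raise  # out of attempts; surface the error
            sleep(err.retry_after)

# Simulate an endpoint that rate-limits twice, then succeeds
attempts = []
def flaky_call():
    attempts.append(1)
    if len(attempts) < 3:
        raise RateLimitError(retry_after=2)
    return "ok"

waits = []  # capture sleeps instead of actually waiting
result = call_with_retry(flaky_call, sleep=waits.append)
```

Note that the client already retries server errors up to max_retries; a wrapper like this is only useful for policies the built-in retry doesn't cover.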

Async / Await

For async code (FastAPI, asyncio, etc.), use AsyncStrongly:

import asyncio
from strongly import AsyncStrongly

async def main():
    async with AsyncStrongly() as client:
        # List things
        async for wf in client.workflows.list():
            print(wf.name)

        # Await individual operations
        response = await client.ai.inference.chat_completion(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": "Hello!"}],
        )
        print(response.choices[0].message.content)

asyncio.run(main())

Every resource and method available on Strongly has an identical async counterpart on AsyncStrongly.
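The main payoff of the async client is concurrency: independent requests can run at the same time via asyncio.gather instead of one after another. A stdlib-only sketch of the pattern (`fetch` stands in for real awaitable client calls such as `client.ai.inference.chat_completion(...)`):

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for an awaitable SDK call
    await asyncio.sleep(delay)
    return name

async def main():
    # All three requests are in flight at once, so total wall time
    # is roughly the slowest request, not the sum of all three.
    return await asyncio.gather(
        fetch("first", 0.03),
        fetch("second", 0.02),
        fetch("third", 0.01),
    )

results = asyncio.run(main())
# gather preserves argument order regardless of completion order
```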

Environment Variables

| Variable | Purpose |
|---|---|
| STRONGLY_API_KEY | Your API key |
| STRONGLY_API_URL | Override the base URL (for self-hosted instances) |
| STRONGLY_LOG | Set SDK log level (DEBUG, INFO, WARNING, ERROR) |

Next Steps