
---

inclusion: manual

---


Aurora DSQL Get Started Guide


Overview


This guide provides steps to help users get started with Aurora DSQL in their project. It walks through provisioning a DSQL cluster with IAM authentication and connecting the database to their code, informed by the context of their codebase.


Use Case


These guidelines apply when users say "Get started with DSQL" or similar phrases. The user's codebase may be mature (with existing database connections) or have little to no code - the guidelines should apply to both cases.


Agent Communication Style


Keep all responses succinct:


Examples:



---


Get Started with DSQL (Interactive Guide)


TRIGGER PHRASE: When the user says "Get started with DSQL", "Get started with Aurora DSQL", or similar phrases, provide an interactive onboarding experience by following these steps:


Before starting: Let the user know they can pause and resume anytime by saying "Continue with DSQL setup" if they need to come back later.


RESUME TRIGGER: If the user says "Continue with DSQL setup" or similar, check what's already configured (AWS credentials, clusters, MCP server, connection tested) and resume from where they left off. Ask which step they'd like to continue from, or analyze their setup to determine it automatically.
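The resume check above can be sketched in Python; the helper name and environment-variable choices here are illustrative, not part of the guide:

```python
import os
import shutil

def setup_status() -> dict:
    """Rough resume check: report which pieces of the DSQL setup already exist locally."""
    return {
        "aws_cli": shutil.which("aws") is not None,        # Step 1: AWS CLI installed
        "psql": shutil.which("psql") is not None,          # Step 1: PostgreSQL client
        "region_set": bool(os.environ.get("AWS_REGION") or os.environ.get("REGION")),
        "endpoint_set": bool(os.environ.get("CLUSTER_ENDPOINT")),  # Step 3 completed
    }

print(setup_status())
```

An agent can use the first False entry to pick the step to resume from.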


Step 1: Verify Prerequisites


Check AWS credentials:



aws sts get-caller-identity

If not configured:


Check PostgreSQL client:



psql --version

If missing OR version < 14:

DSQL requires SNI support, which is available in psql >= 14.


Step 2: Check for Existing Clusters


Set region (uses AWS_REGION or REGION if set, defaults to us-east-1):


REGION=${AWS_REGION:-${REGION:-us-east-1}}
echo $REGION

List clusters in the region:


aws dsql list-clusters --region $REGION

If they have NO clusters:


If they have ANY clusters:


Create cluster command (if needed):



aws dsql create-cluster --region $REGION --tags '{"Name":"my-dsql-cluster","created_by":""}'

Wait for ACTIVE status (takes ~60 seconds):



aws dsql get-cluster --identifier CLUSTER_ID --region $REGION

Step 3: Get Cluster Connection Details


Construct cluster endpoint:



CLUSTER_ID=""
CLUSTER_ENDPOINT="${CLUSTER_ID}.dsql.${REGION}.on.aws"
echo $CLUSTER_ENDPOINT
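The same region fallback and endpoint construction in Python, for projects that build the endpoint in code (helper names are illustrative):

```python
import os

def resolve_region() -> str:
    """Mirror the shell fallback: AWS_REGION, then REGION, then us-east-1."""
    return os.environ.get("AWS_REGION") or os.environ.get("REGION") or "us-east-1"

def dsql_endpoint(cluster_id: str, region: str) -> str:
    """Aurora DSQL endpoints follow the <cluster-id>.dsql.<region>.on.aws pattern."""
    return f"{cluster_id}.dsql.{region}.on.aws"

print(dsql_endpoint("aabbccddeeff", "us-east-1"))
# aabbccddeeff.dsql.us-east-1.on.aws
```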

Store endpoint for their project environment:


Step 4: Set Up MCP Server (Optional)


Would the user like to be guided through setting up the MCP server?


If so, follow the steps detailed in mcp-setup.md


MCP server provides:


Step 5: Test Connection


Generate authentication token and connect:



export PGPASSWORD=$(aws dsql generate-db-connect-admin-auth-token \
  --region $REGION \
  --hostname $CLUSTER_ENDPOINT \
  --expires-in 3600)

export PGSSLMODE=require
export PGAPPNAME="/"

psql --quiet -h $CLUSTER_ENDPOINT -U admin -d postgres

Verify with test query:



SELECT current_database(), version();

If connection fails:


Step 6: Understand the Project


First, check if this is an empty/new project:


If empty or near-empty project:


If established project:


ALWAYS reference ./development-guide.md before making schema changes


Step 7: Install Database Driver


Based on their language, install appropriate driver (some examples):


JavaScript/TypeScript:


npm install @aws-sdk/credential-providers @aws-sdk/dsql-signer pg tsx
npm install @aws/aurora-dsql-node-postgres-connector

Python:


pip install psycopg2-binary
pip install aurora-dsql-python-connector

Go:


go get github.com/jackc/pgx/v5

Rust:


cargo add sqlx --features postgres,runtime-tokio-native-tls
cargo add aws-sdk-dsql tokio --features full

For implementation patterns, reference ./dsql-examples.md and ./language.md
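As one example of the pattern, a minimal Python connection sketch, assuming boto3 and psycopg2-binary are installed and AWS credentials are configured. Imports are deferred so the helper can be defined without the packages present; the token call follows the AWS SDK's documented shape, but verify it against ./dsql-examples.md:

```python
def connect_to_dsql(cluster_endpoint: str, region: str):
    """Generate an IAM auth token and open an SSL connection as the admin role."""
    import boto3       # assumed: pip install boto3
    import psycopg2    # assumed: pip install psycopg2-binary

    client = boto3.client("dsql", region_name=region)
    # The token serves as the connection password; it is short-lived, so
    # generate a fresh one per new connection rather than caching it.
    token = client.generate_db_connect_admin_auth_token(cluster_endpoint, region)

    return psycopg2.connect(
        host=cluster_endpoint,
        user="admin",
        dbname="postgres",
        password=token,
        sslmode="require",  # DSQL requires SSL
    )
```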


Step 8: Schema Setup


Check for existing schema:


If existing schema found:


If no schema found, offer these options:

1. Create simple example table

2. Design custom schema together

3. Skip for now


If creating example table:


Use MCP server or psql to execute:



CREATE TABLE users (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email VARCHAR(255) UNIQUE NOT NULL,
  name VARCHAR(255),
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX ASYNC idx_users_email ON users(email);

For custom schema:


Step 9: Set Up Scoped Database Roles


Recommend creating scoped roles before application development begins.




-- As admin
CREATE ROLE app_user WITH LOGIN;
AWS IAM GRANT app_user TO 'arn:aws:iam:::role/';
GRANT USAGE ON SCHEMA public TO app_user;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO app_user;



CREATE SCHEMA users_schema;
GRANT USAGE ON SCHEMA users_schema TO app_user;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA users_schema TO app_user;
GRANT CREATE ON SCHEMA users_schema TO app_user;


Step 10: What's Next


Let them know you're ready to help with more:


"You're all set! Here are some things I can help with - feel free to ask about any of these (or anything else):



Important Notes:



MCP Server Workflow:


---


DSQL Best Practices


Critical Constraints


ALWAYS follow these rules:


1. Indexes: Use CREATE INDEX ASYNC - synchronous index creation is not supported

2. Serialization: Store arrays/JSON as TEXT (comma-separated or JSON.stringify)

3. Referential Integrity: Implement foreign key validation in application code

4. DDL Operations: Execute one DDL per transaction, no mixing with DML

5. Transaction Limits: Maximum 3,000 row modifications, 10 MiB data size per transaction

6. Token Refresh: Regenerate auth tokens before they expire (15 minutes by default)

7. SSL Required: Always set PGSSLMODE=require or sslmode=require
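Constraints 2 and 5 usually surface in application code. A small illustrative sketch (helper names are not from this guide) of serializing list values as TEXT and splitting bulk writes so each transaction stays under the row limit:

```python
import json

def serialize_tags(tags: list) -> str:
    """Constraint 2: store list/JSON values as TEXT (here, via JSON encoding)."""
    return json.dumps(tags)

def batches(rows: list, size: int = 2500):
    """Constraint 5: yield chunks sized below the per-transaction row limit,
    so each chunk can be committed in its own transaction."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

print(serialize_tags(["a", "b"]))              # ["a", "b"]
print(len(list(batches(list(range(7000))))))   # 3
```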


DSQL-Specific Features


Leverage Aurora DSQL capabilities:


1. Serverless: True scale-to-zero with consumption-based pricing

2. Distributed: Active-active writes across multiple regions

3. Strong Consistency: Immediate read-your-writes across all regions

4. IAM Authentication: No password management, automatic token rotation

5. PostgreSQL Compatible: Supports the 10 database drivers, 4 ORMs, and 3 adapters/dialects listed under Database Drivers


For detailed patterns, see ./development-guide.md


Additional Resources