This is a reference to all environment variables that can be used to configure a Lightdash deployment.
| Variable | Description |
| --- | --- |
| PGHOST | (Required) Hostname of the Postgres server that stores Lightdash data |
| PGPORT | (Required) Port of the Postgres server that stores Lightdash data |
| PGUSER | (Required) Username of the Postgres user that accesses the Postgres server storing Lightdash data |
| PGPASSWORD | (Required) Password for PGUSER |
| PGDATABASE | (Required) Name of the database inside the Postgres server that stores Lightdash data |
| PGCONNECTIONURI | Connection URI for the Postgres server, in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the individual PG variables above. |
| PGMAXCONNECTIONS | Maximum number of connections to the database |
| PGMINCONNECTIONS | Minimum number of connections to the database |
| LIGHTDASH_SECRET | (Required) Secret key used to secure various tokens in Lightdash. This must stay fixed between deployments; if the secret changes, you won't have access to Lightdash data. |
| SECURE_COOKIES | Only allow cookies to be stored over an HTTPS connection. Cookies are used to keep you logged in, so this is recommended to be set to true in production. (default=false) |
| COOKIES_MAX_AGE_HOURS | How many hours a user session lasts before the user is automatically signed out. For example, if set to 24, the user is automatically signed out after 24 hours of inactivity. |
| TRUST_PROXY | Tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. Useful if you use SECURE_COOKIES=true behind an HTTPS-terminating proxy that you trust. (default=false) |
| SITE_URL | Site URL where Lightdash is hosted. It should include the protocol, e.g. https://lightdash.mycompany.com (default=http://localhost:8080) |
| INTERNAL_LIGHTDASH_HOST | Internal Lightdash host that the headless browser sends requests to when your Lightdash instance is not accessible from the Internet. Must support HTTPS if SECURE_COOKIES=true (default=same as SITE_URL) |
| STATIC_IP | Server static IP so users can add it to their warehouse allow-list (default=http://localhost:8080) |
| LIGHTDASH_QUERY_MAX_LIMIT | Maximum number of rows a query may return (default=5000) |
| LIGHTDASH_QUERY_DEFAULT_LIMIT | Default number of rows to return in a query (default=500) |
| LIGHTDASH_QUERY_MAX_PAGE_SIZE | Maximum page size for paginated queries (default=2500) |
| SCHEDULER_ENABLED | Enables/disables the scheduler worker that triggers scheduled deliveries (default=true) |
| SCHEDULER_CONCURRENCY | How many scheduled delivery jobs can be processed concurrently (default=3) |
| SCHEDULER_JOB_TIMEOUT | After how many milliseconds a job times out so the scheduler worker can pick up other jobs (default=600000, 10 minutes) |
| SCHEDULER_SCREENSHOT_TIMEOUT | Timeout in milliseconds for taking screenshots |
| SCHEDULER_INCLUDE_TASKS | Comma-separated list of scheduler tasks to include |
| SCHEDULER_EXCLUDE_TASKS | Comma-separated list of scheduler tasks to exclude |
| LIGHTDASH_CSV_CELLS_LIMIT | Maximum number of cells in CSV file exports (default=100000) |
| LIGHTDASH_CHART_VERSION_HISTORY_DAYS_LIMIT | How far back chart version history goes, in days (default=3) |
| LIGHTDASH_PIVOT_TABLE_MAX_COLUMN_LIMIT | Maximum number of columns in a pivot table (default=60) |
| GROUPS_ENABLED | Enables/disables groups functionality (default=false) |
| CUSTOM_VISUALIZATIONS_ENABLED | Enables/disables custom chart functionality (default=false) |
| LIGHTDASH_MAX_PAYLOAD | Maximum HTTP request body size (default=5mb) |
| LIGHTDASH_LICENSE_KEY | License key for Lightdash Enterprise Edition. See Enterprise License Keys for details. |
| HEADLESS_BROWSER_HOST | Hostname for the headless browser |
| HEADLESS_BROWSER_PORT | Port for the headless browser (default=3001) |
| ALLOW_MULTIPLE_ORGS | If set to true, new users registering on Lightdash get their own organization, separate from others (default=false) |
| LIGHTDASH_MODE | Mode for Lightdash (default, demo, pr, etc.) (default=default) |
| DISABLE_PAT | Disables Personal Access Tokens (default=false) |
| PAT_ALLOWED_ORG_ROLES | Comma-separated list of organization roles allowed to use Personal Access Tokens (default=all roles) |
| PAT_MAX_EXPIRATION_TIME_IN_DAYS | Maximum expiration time in days for Personal Access Tokens |
| MAX_DOWNLOADS_AS_CODE | Maximum number of downloads as code (default=100) |
| EXTENDED_USAGE_ANALYTICS | Enables extended usage analytics (default=false) |
| USE_SECURE_BROWSER | Use secure WebSocket connections for the headless browser (default=false) |
| DISABLE_DASHBOARD_COMMENTS | Disables dashboard comments (default=false) |
| ORGANIZATION_WAREHOUSE_CREDENTIALS_ENABLED | Enables organization warehouse settings (default=false) |
Lightdash also accepts all standard Postgres environment variables.
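If you prefer PGCONNECTIONURI over the individual PG variables, special characters in the password must be percent-encoded or the URI will not parse. A minimal sketch of building an equivalent URI (the helper name is hypothetical, not part of Lightdash):

```python
from urllib.parse import quote


def pg_connection_uri(env: dict) -> str:
    """Build a PGCONNECTIONURI equivalent to the individual PG* variables.

    quote(..., safe="") percent-encodes characters like '@' and ':' that
    would otherwise break URI parsing.
    """
    user = quote(env["PGUSER"], safe="")
    password = quote(env["PGPASSWORD"], safe="")
    host = env["PGHOST"]
    port = env.get("PGPORT", "5432")  # Postgres default port, assumed
    db = env["PGDATABASE"]
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"


uri = pg_connection_uri({
    "PGUSER": "lightdash",
    "PGPASSWORD": "p@ss:word",  # '@' and ':' need encoding
    "PGHOST": "db.internal",
    "PGPORT": "5432",
    "PGDATABASE": "lightdash",
})
print(uri)  # postgresql://lightdash:p%40ss%3Aword@db.internal:5432/lightdash
```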

SMTP

This is a reference to all the SMTP environment variables that can be used to configure a Lightdash email client.
| Variable | Description |
| --- | --- |
| EMAIL_SMTP_HOST | (Required) Hostname of the email server |
| EMAIL_SMTP_PORT | Port of the email server (default=587) |
| EMAIL_SMTP_SECURE | Use a secure connection (default=true) |
| EMAIL_SMTP_USER | (Required) Auth user |
| EMAIL_SMTP_PASSWORD | Auth password [1] |
| EMAIL_SMTP_ACCESS_TOKEN | Auth access token for OAuth2 authentication [1] |
| EMAIL_SMTP_ALLOW_INVALID_CERT | Allow connection to a TLS server with a self-signed or invalid TLS certificate (default=false) |
| EMAIL_SMTP_SENDER_EMAIL | (Required) The email address that sends emails |
| EMAIL_SMTP_SENDER_NAME | The name of the email address that sends emails (default=Lightdash) |
[1] Either EMAIL_SMTP_PASSWORD or EMAIL_SMTP_ACCESS_TOKEN must be provided.
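The password/token constraint in footnote [1] can be checked up front. A validation sketch (hypothetical helper, not Lightdash source code):

```python
def check_smtp_auth(env: dict) -> str:
    """Return which SMTP auth mechanism is configured, mirroring footnote [1]:
    either EMAIL_SMTP_PASSWORD or EMAIL_SMTP_ACCESS_TOKEN must be set.
    """
    has_password = bool(env.get("EMAIL_SMTP_PASSWORD"))
    has_token = bool(env.get("EMAIL_SMTP_ACCESS_TOKEN"))
    if not (has_password or has_token):
        raise ValueError("Set EMAIL_SMTP_PASSWORD or EMAIL_SMTP_ACCESS_TOKEN")
    # If both are present, assume the OAuth2 token takes precedence
    return "oauth2" if has_token else "password"


print(check_smtp_auth({"EMAIL_SMTP_PASSWORD": "hunter2"}))  # password
```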

SSO

These variables enable you to control Single Sign On (SSO) functionality.
| Variable | Description |
| --- | --- |
| AUTH_DISABLE_PASSWORD_AUTHENTICATION | If "true", disables signing in with plain passwords (default=false) |
| AUTH_ENABLE_GROUP_SYNC | If "true", enables assigning SSO groups to Lightdash groups (default=false) |
| AUTH_ENABLE_OIDC_LINKING | Enables linking a new OIDC identity to an existing user if they already have another OIDC identity with the same email (default=false) |
| AUTH_ENABLE_OIDC_TO_EMAIL_LINKING | Enables linking an OIDC identity to an existing user by email. Required when using SCIM with SSO (default=false) |
| AUTH_GOOGLE_OAUTH2_CLIENT_ID | Required for Google SSO |
| AUTH_GOOGLE_OAUTH2_CLIENT_SECRET | Required for Google SSO |
| AUTH_OKTA_OAUTH_CLIENT_ID | Required for Okta SSO |
| AUTH_OKTA_OAUTH_CLIENT_SECRET | Required for Okta SSO |
| AUTH_OKTA_OAUTH_ISSUER | Required for Okta SSO |
| AUTH_OKTA_DOMAIN | Required for Okta SSO |
| AUTH_OKTA_AUTHORIZATION_SERVER_ID | Optional for Okta SSO with a custom authorization server |
| AUTH_OKTA_EXTRA_SCOPES | Optional extra scopes for Okta SSO (e.g. groups) without a custom authorization server |
| AUTH_ONE_LOGIN_OAUTH_CLIENT_ID | Required for One Login SSO |
| AUTH_ONE_LOGIN_OAUTH_CLIENT_SECRET | Required for One Login SSO |
| AUTH_ONE_LOGIN_OAUTH_ISSUER | Required for One Login SSO |
| AUTH_AZURE_AD_OAUTH_CLIENT_ID | Required for Azure AD |
| AUTH_AZURE_AD_OAUTH_CLIENT_SECRET | Required for Azure AD |
| AUTH_AZURE_AD_OAUTH_TENANT_ID | Required for Azure AD |
| AUTH_AZURE_AD_OIDC_METADATA_ENDPOINT | Optional for Azure AD |
| AUTH_AZURE_AD_X509_CERT_PATH | Optional for Azure AD |
| AUTH_AZURE_AD_X509_CERT | Optional for Azure AD |
| AUTH_AZURE_AD_PRIVATE_KEY_PATH | Optional for Azure AD |
| AUTH_AZURE_AD_PRIVATE_KEY | Optional for Azure AD |
| DATABRICKS_OAUTH_CLIENT_ID | Client ID for Databricks OAuth |
| DATABRICKS_OAUTH_CLIENT_SECRET | Client secret for Databricks OAuth (optional) |
| DATABRICKS_OAUTH_AUTHORIZATION_ENDPOINT | Authorization endpoint URL for Databricks OAuth |
| DATABRICKS_OAUTH_TOKEN_ENDPOINT | Token endpoint URL for Databricks OAuth |

S3

These variables allow you to configure S3 Object Storage, which is required to self-host Lightdash.
| Variable | Description |
| --- | --- |
| S3_ENDPOINT | (Required) S3 endpoint for storing results |
| S3_BUCKET | (Required) Name of the S3 bucket for storing files |
| S3_REGION | (Required) Region where the S3 bucket is located |
| S3_ACCESS_KEY | Access key for authenticating with the S3 bucket |
| S3_SECRET_KEY | Secret key for authenticating with the S3 bucket |
| S3_USE_CREDENTIALS_FROM | Configures the credential provider chain for AWS S3 authentication when no access key and secret are provided. Supports: env (environment variables), token_file (token file credentials), ini (initialization file credentials), ecs (container metadata credentials), ec2 (instance metadata credentials). Multiple values can be specified in order of preference |
| S3_EXPIRATION_TIME | Expiration time for scheduled delivery files (default=259200, 3 days) |
| S3_FORCE_PATH_STYLE | Force path-style addressing, needed for MinIO setups, e.g. http://your.s3.domain/BUCKET/KEY instead of http://BUCKET.your.s3.domain/KEY (default=false) |
| RESULTS_S3_BUCKET | Name of the S3 bucket used for storing query results (default=S3_BUCKET) |
| RESULTS_S3_REGION | Region where the S3 query storage bucket is located (default=S3_REGION) |
| RESULTS_S3_ACCESS_KEY | Access key for authenticating with the S3 query storage bucket (default=S3_ACCESS_KEY) |
| RESULTS_S3_SECRET_KEY | Secret key for authenticating with the S3 query storage bucket (default=S3_SECRET_KEY) |
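To make the S3_FORCE_PATH_STYLE difference concrete: path-style addressing puts the bucket in the URL path, while virtual-hosted style (the AWS default) puts it in the hostname. An illustrative sketch (the helper is hypothetical):

```python
def s3_object_url(endpoint: str, bucket: str, key: str,
                  force_path_style: bool) -> str:
    """Show the URL shape produced by each addressing mode."""
    scheme, host = endpoint.split("://", 1)
    if force_path_style:  # e.g. required for MinIO
        return f"{scheme}://{host}/{bucket}/{key}"
    return f"{scheme}://{bucket}.{host}/{key}"


print(s3_object_url("http://minio.local:9000", "lightdash", "results.csv", True))
# http://minio.local:9000/lightdash/results.csv
print(s3_object_url("https://s3.eu-west-1.amazonaws.com", "lightdash", "results.csv", False))
# https://lightdash.s3.eu-west-1.amazonaws.com/results.csv
```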

Cache

Note that you will need an Enterprise License Key for this functionality.
| Variable | Description |
| --- | --- |
| RESULTS_CACHE_ENABLED | Enables caching for chart results (default=false) |
| AUTOCOMPLETE_CACHE_ENABLED | Enables caching for filter autocomplete results (default=false) |
| CACHE_STALE_TIME_SECONDS | How long cached results remain valid before being considered stale (default=86400, 24h) |
These variables are deprecated; use the RESULTS_S3_* versions instead.
| Variable | Description |
| --- | --- |
| RESULTS_CACHE_S3_BUCKET | Deprecated - use RESULTS_S3_BUCKET (default=S3_BUCKET) |
| RESULTS_CACHE_S3_REGION | Deprecated - use RESULTS_S3_REGION (default=S3_REGION) |
| RESULTS_CACHE_S3_ACCESS_KEY | Deprecated - use RESULTS_S3_ACCESS_KEY (default=S3_ACCESS_KEY) |
| RESULTS_CACHE_S3_SECRET_KEY | Deprecated - use RESULTS_S3_SECRET_KEY (default=S3_SECRET_KEY) |
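The documented defaults form a fallback chain: a RESULTS_S3_* value wins, the deprecated RESULTS_CACHE_S3_* value is read otherwise, and both fall back to the base S3_* value. A sketch of that resolution order (hypothetical helper; the exact precedence between new and deprecated names is an assumption based on the defaults above):

```python
def resolve_results_s3(env: dict) -> dict:
    """Resolve effective results-storage settings from the env mapping."""
    def pick(*names):
        # Return the first variable that is set and non-empty.
        for name in names:
            if env.get(name):
                return env[name]
        return None

    return {
        "bucket": pick("RESULTS_S3_BUCKET", "RESULTS_CACHE_S3_BUCKET", "S3_BUCKET"),
        "region": pick("RESULTS_S3_REGION", "RESULTS_CACHE_S3_REGION", "S3_REGION"),
    }


print(resolve_results_s3({"S3_BUCKET": "base", "RESULTS_S3_BUCKET": "results"}))
# {'bucket': 'results', 'region': None}
```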

Logging

| Variable | Description |
| --- | --- |
| LIGHTDASH_LOG_LEVEL | The minimum level of log messages to show: DEBUG, AUDIT, HTTP, INFO, WARN, ERROR (default=INFO) |
| LIGHTDASH_LOG_FORMAT | The format of log messages: PLAIN, PRETTY, JSON (default=pretty) |
| LIGHTDASH_LOG_OUTPUTS | The outputs to send log messages to (default=console) |
| LIGHTDASH_LOG_CONSOLE_LEVEL | The minimum level of log messages to display on the console (default=LIGHTDASH_LOG_LEVEL) |
| LIGHTDASH_LOG_CONSOLE_FORMAT | The format of log messages on the console (default=LIGHTDASH_LOG_FORMAT) |
| LIGHTDASH_LOG_FILE_LEVEL | The minimum level of log messages to write to the log file (default=LIGHTDASH_LOG_LEVEL) |
| LIGHTDASH_LOG_FILE_FORMAT | The format of log messages in the log file (default=LIGHTDASH_LOG_FORMAT) |
| LIGHTDASH_LOG_FILE_PATH | The path to the log file. LIGHTDASH_LOG_OUTPUTS must include file to enable file output (default=./logs/all.log) |

Prometheus

| Variable | Description |
| --- | --- |
| LIGHTDASH_PROMETHEUS_ENABLED | Enables/disables the Prometheus metrics endpoint (default=false) |
| LIGHTDASH_PROMETHEUS_PORT | Port for the Prometheus metrics endpoint (default=9090) |
| LIGHTDASH_PROMETHEUS_PATH | Path for the Prometheus metrics endpoint (default=/metrics) |
| LIGHTDASH_PROMETHEUS_PREFIX | Prefix for metric names |
| LIGHTDASH_GC_DURATION_BUCKETS | Buckets for the duration histogram, in seconds (default=0.001, 0.01, 0.1, 1, 2, 5) |
| LIGHTDASH_EVENT_LOOP_MONITORING_PRECISION | Precision for event loop monitoring, in milliseconds. Must be greater than zero (default=10) |
| LIGHTDASH_PROMETHEUS_LABELS | Labels to add to all metrics. Must be valid JSON |
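Two of these variables carry structured values: the duration buckets are a comma-separated list of seconds, and the labels must be a JSON object. A parsing sketch showing the expected shapes (the helper and its behaviour are assumptions, not Lightdash source):

```python
import json


def parse_prometheus_settings(env: dict) -> dict:
    """Parse LIGHTDASH_GC_DURATION_BUCKETS and LIGHTDASH_PROMETHEUS_LABELS."""
    buckets_raw = env.get("LIGHTDASH_GC_DURATION_BUCKETS",
                          "0.001, 0.01, 0.1, 1, 2, 5")  # documented default
    buckets = [float(b.strip()) for b in buckets_raw.split(",")]

    labels = json.loads(env.get("LIGHTDASH_PROMETHEUS_LABELS", "{}"))
    if not isinstance(labels, dict):
        raise ValueError("LIGHTDASH_PROMETHEUS_LABELS must be a JSON object")
    return {"buckets": buckets, "labels": labels}


cfg = parse_prometheus_settings({"LIGHTDASH_PROMETHEUS_LABELS": '{"app": "lightdash"}'})
print(cfg)
# {'buckets': [0.001, 0.01, 0.1, 1.0, 2.0, 5.0], 'labels': {'app': 'lightdash'}}
```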

Security

| Variable | Description |
| --- | --- |
| LIGHTDASH_CSP_REPORT_ONLY | Enables Content Security Policy (CSP) report-only mode. Recommended to be set to false in production (default=true) |
| LIGHTDASH_CSP_ALLOWED_DOMAINS | Comma-separated list of domains that resources are allowed to be loaded from |
| LIGHTDASH_CSP_REPORT_URI | URI to send CSP violation reports to |
| LIGHTDASH_CORS_ENABLED | Enables Cross-Origin Resource Sharing (CORS) (default=false) |
| LIGHTDASH_CORS_ALLOWED_DOMAINS | Comma-separated list of domains that are allowed to make cross-origin requests |

Analytics & Event Tracking

| Variable | Description |
| --- | --- |
| RUDDERSTACK_WRITE_KEY | RudderStack key used to track events (by default, Lightdash's key is used) |
| RUDDERSTACK_DATA_PLANE_URL | RudderStack data plane URL to which events are tracked (by default, Lightdash's data plane is used) |
| RUDDERSTACK_ANALYTICS_DISABLED | Set to true to disable RudderStack analytics |
| POSTHOG_PROJECT_API_KEY | API key for PostHog (by default, Lightdash's key is used) |
| POSTHOG_FE_API_HOST | Hostname for PostHog's front-end API |
| POSTHOG_BE_API_HOST | Hostname for PostHog's back-end API |

AI Analyst

These variables enable you to configure the AI Analyst functionality. Note that you will need an Enterprise Licence Key for this functionality.
| Variable | Description |
| --- | --- |
| AI_COPILOT_ENABLED | Enables/disables AI Analyst functionality (default=false) |
| ASK_AI_BUTTON_ENABLED | Enables the "Ask AI" button in the interface for direct access to AI agents; when disabled, agents can be accessed from the /ai-agents route (default=false) |
| AI_EMBEDDING_ENABLED | Enables AI embedding functionality for verified-answer similarity matching (default=false) |
| AI_DEFAULT_PROVIDER | Default AI provider to use (openai, azure, anthropic, openrouter, bedrock) (default=openai) |
| AI_DEFAULT_EMBEDDING_PROVIDER | Default AI provider for embeddings (openai, bedrock, azure) (default=openai) |
| AI_COPILOT_DEBUG_LOGGING_ENABLED | Enables debug logging for AI Copilot (default=false) |
| AI_COPILOT_TELEMETRY_ENABLED | Enables telemetry for AI Copilot (default=false) |
| AI_COPILOT_REQUIRES_FEATURE_FLAG | Requires a feature flag to use AI Copilot (default=false) |
| AI_COPILOT_MAX_QUERY_LIMIT | Maximum number of rows returned in AI-generated queries (default=500) |
| AI_VERIFIED_ANSWER_SIMILARITY_THRESHOLD | Similarity threshold (0-1) for verified-answer matching (default=0.6) |
The AI Analyst supports multiple providers for flexibility. Choose one of the provider configurations below based on your preferred AI service. OpenAI integration is the recommended option as it is the most tested and stable implementation.

Minimum Required Setup

To enable AI Analyst, set AI_COPILOT_ENABLED=true and provide an API key for AI_DEFAULT_PROVIDER (e.g., OPENAI_API_KEY for OpenAI, ANTHROPIC_API_KEY for Anthropic).
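That minimum setup can be sanity-checked mechanically: each provider has its own key variable, so the key that matters is the one matching AI_DEFAULT_PROVIDER. A sketch (the helper is hypothetical; the variable names come from the tables in this section):

```python
# Which API-key variable each provider expects, per this section's tables.
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "azure": "AZURE_AI_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
    "bedrock": "BEDROCK_API_KEY",  # IAM credentials are an alternative
}


def check_ai_setup(env: dict) -> str:
    """Check the minimum AI Analyst setup; return the configured provider."""
    if env.get("AI_COPILOT_ENABLED") != "true":
        raise ValueError("Set AI_COPILOT_ENABLED=true")
    provider = env.get("AI_DEFAULT_PROVIDER", "openai")
    key_var = PROVIDER_KEY_VARS[provider]
    if not env.get(key_var):
        raise ValueError(f"{key_var} is required for provider {provider!r}")
    return provider


print(check_ai_setup({"AI_COPILOT_ENABLED": "true", "OPENAI_API_KEY": "sk-test"}))
# openai
```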

OpenAI Configuration

| Variable | Description |
| --- | --- |
| OPENAI_API_KEY | (Required when using OpenAI) API key for OpenAI |
| OPENAI_MODEL_NAME | OpenAI model name to use (default=gpt-4.1) |
| OPENAI_EMBEDDING_MODEL | OpenAI embedding model for verified answers (default=text-embedding-3-small) |
| OPENAI_BASE_URL | Optional base URL for an OpenAI-compatible API |
| OPENAI_AVAILABLE_MODELS | Comma-separated list of models available in the model picker (default=all supported models) |

Anthropic Configuration

| Variable | Description |
| --- | --- |
| ANTHROPIC_API_KEY | (Required when using Anthropic) API key for Anthropic |
| ANTHROPIC_MODEL_NAME | Anthropic model name to use (default=claude-sonnet-4-5) |
| ANTHROPIC_AVAILABLE_MODELS | Comma-separated list of models available in the model picker (default=all supported models) |

Azure AI Configuration

| Variable | Description |
| --- | --- |
| AZURE_AI_API_KEY | (Required when using Azure AI) API key for Azure AI |
| AZURE_AI_ENDPOINT | (Required when using Azure AI) Endpoint for Azure AI |
| AZURE_AI_API_VERSION | (Required when using Azure AI) API version for Azure AI |
| AZURE_AI_DEPLOYMENT_NAME | (Required when using Azure AI) Deployment name for Azure AI |
| AZURE_EMBEDDING_DEPLOYMENT_NAME | Deployment name for the Azure embedding model (default=text-embedding-3-small) |
| AZURE_USE_DEPLOYMENT_BASED_URLS | Use deployment-based URLs for Azure OpenAI API calls (default=true) |

OpenRouter Configuration

| Variable | Description |
| --- | --- |
| OPENROUTER_API_KEY | (Required when using OpenRouter) API key for OpenRouter |
| OPENROUTER_MODEL_NAME | OpenRouter model name to use (default=openai/gpt-4.1-2025-04-14) |
| OPENROUTER_SORT_ORDER | Provider sorting method (price, throughput, latency) (default=latency) |
| OPENROUTER_ALLOWED_PROVIDERS | Comma-separated list of allowed providers (anthropic, openai, google) (default=openai) |

AWS Bedrock Configuration

| Variable | Description |
| --- | --- |
| BEDROCK_API_KEY | (Required if not using IAM credentials) API key for Bedrock (alternative to IAM credentials) |
| BEDROCK_ACCESS_KEY_ID | (Required if not using an API key) AWS access key ID for Bedrock |
| BEDROCK_SECRET_ACCESS_KEY | (Required if using an access key ID) AWS secret access key for Bedrock |
| BEDROCK_SESSION_TOKEN | AWS session token (for temporary credentials) |
| BEDROCK_REGION | (Required) AWS region for Bedrock |
| BEDROCK_MODEL_NAME | Bedrock model name to use (default=claude-sonnet-4-5) |
| BEDROCK_EMBEDDING_MODEL | Bedrock embedding model for verified answers (default=cohere.embed-english-v3) |
| BEDROCK_AVAILABLE_MODELS | Comma-separated list of models available in the model picker (default=all supported models) |

Supported Models

OpenAI: gpt-5.1, gpt-4.1
Anthropic: claude-sonnet-4-5, claude-haiku-4-5, claude-sonnet-4
AWS Bedrock: claude-sonnet-4-5, claude-haiku-4-5, claude-sonnet-4
Exact model snapshots are assigned automatically (e.g., gpt-5.1 → gpt-5.1-2025-11-13).
For Bedrock, a region prefix is also added based on BEDROCK_REGION (e.g., claude-sonnet-4-5 → us.anthropic.claude-sonnet-4-5-20250929-v1:0).
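The Bedrock expansion above can be sketched as a two-step mapping: look up the pinned snapshot for the short model name, then prepend a prefix derived from BEDROCK_REGION. Both the snapshot table and the region-to-prefix rule below are illustrative assumptions, not Lightdash source:

```python
def bedrock_model_id(model_name: str, region: str) -> str:
    """Expand a short model name into a region-prefixed Bedrock model ID."""
    # Snapshot taken from the example in the text; other entries are assumed.
    snapshots = {
        "claude-sonnet-4-5": "anthropic.claude-sonnet-4-5-20250929-v1:0",
    }
    # Assumed rule: the prefix is the geography part of the region name.
    prefix = region.split("-", 1)[0]  # e.g. "us-east-1" -> "us"
    return f"{prefix}.{snapshots[model_name]}"


print(bedrock_model_id("claude-sonnet-4-5", "us-east-1"))
# us.anthropic.claude-sonnet-4-5-20250929-v1:0
```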

Slack Integration

These variables enable you to configure the Slack integration.
| Variable | Description |
| --- | --- |
| SLACK_SIGNING_SECRET | Required for Slack integration |
| SLACK_CLIENT_ID | Required for Slack integration |
| SLACK_CLIENT_SECRET | Required for Slack integration |
| SLACK_STATE_SECRET | Required for Slack integration (default=slack-state-secret) |
| SLACK_APP_TOKEN | App token for Slack |
| SLACK_PORT | Port for the Slack integration (default=4351) |
| SLACK_SOCKET_MODE | Enable socket mode for Slack (default=false) |
| SLACK_CHANNELS_CACHED_TIME | Time in milliseconds to cache Slack channels (default=600000, 10 minutes) |
| SLACK_SUPPORT_URL | URL for Slack support |

GitHub Integration

These variables enable you to configure the GitHub integration.
| Variable | Description |
| --- | --- |
| GITHUB_PRIVATE_KEY | (Required) GitHub private key for GitHub App authentication |
| GITHUB_APP_ID | (Required) GitHub application ID |
| GITHUB_CLIENT_ID | (Required) GitHub OAuth client ID |
| GITHUB_CLIENT_SECRET | (Required) GitHub OAuth client secret |
| GITHUB_APP_NAME | (Required) Name of the GitHub App |
| GITHUB_REDIRECT_DOMAIN | Domain for GitHub OAuth redirection |

Microsoft Teams Integration

These variables enable you to configure Microsoft Teams integration.
| Variable | Description |
| --- | --- |
| MICROSOFT_TEAMS_ENABLED | Enables Microsoft Teams integration (default=false) |

Google Cloud Platform

These variables enable you to configure Google Cloud Platform integration.
| Variable | Description |
| --- | --- |
| GOOGLE_CLOUD_PROJECT_ID | Google Cloud Platform project ID |
| GOOGLE_DRIVE_API_KEY | Google Drive API key |
| AUTH_GOOGLE_ENABLED | Enables Google authentication (default=false) |
| AUTH_ENABLE_GCLOUD_ADC | Enables Google Cloud Application Default Credentials (default=false) |

Embedding

Note that you will need an Enterprise Licence Key for this functionality.
| Variable | Description |
| --- | --- |
| EMBEDDING_ENABLED | Enables embedding functionality (default=false) |
| EMBED_ALLOW_ALL_DASHBOARDS_BY_DEFAULT | When creating new embeds, allow all dashboards by default (default=false) |
| EMBED_ALLOW_ALL_CHARTS_BY_DEFAULT | When creating new embeds, allow all charts by default (default=false) |
| LIGHTDASH_IFRAME_EMBEDDING_DOMAINS | Comma-separated list of domains that are allowed to embed Lightdash in an iframe |

Custom roles

Note that you will need an Enterprise Licence Key for this functionality.
| Variable | Description |
| --- | --- |
| CUSTOM_ROLES_ENABLED | Enables creation of custom organization roles with configurable permission scopes beyond the default Admin, Developer, Editor, and Viewer roles (default=false) |

Service account

Note that you will need an Enterprise Licence Key for this functionality.
| Variable | Description |
| --- | --- |
| SERVICE_ACCOUNT_ENABLED | Enables service account functionality (default=false) |

SCIM

Note that you will need an Enterprise Licence Key for this functionality.
| Variable | Description |
| --- | --- |
| SCIM_ENABLED | Enables SCIM (System for Cross-domain Identity Management) (default=false) |

Sentry

These variables enable you to configure Sentry for error tracking.
| Variable | Description |
| --- | --- |
| SENTRY_DSN | Sentry DSN for both frontend and backend |
| SENTRY_BE_DSN | Sentry DSN for the backend only |
| SENTRY_FE_DSN | Sentry DSN for the frontend only |
| SENTRY_BE_SECURITY_REPORT_URI | URI for Sentry backend security reports |
| SENTRY_TRACES_SAMPLE_RATE | Sample rate for Sentry traces (0.0 to 1.0) (default=0.1) |
| SENTRY_PROFILES_SAMPLE_RATE | Sample rate for Sentry profiles (0.0 to 1.0) (default=0.2) |
| SENTRY_ANR_ENABLED | Enables Sentry Application Not Responding (ANR) detection (default=false) |
| SENTRY_ANR_CAPTURE_STACKTRACE | Captures stacktraces for ANR events (default=false) |
| SENTRY_ANR_TIMEOUT | Timeout in milliseconds for ANR detection |

Intercom & Pylon

These variables enable you to configure Intercom and Pylon for customer support and feedback.
| Variable | Description |
| --- | --- |
| INTERCOM_APP_ID | Intercom application ID |
| INTERCOM_APP_BASE | Base URL for the Intercom API (default=https://api-iam.intercom.io) |
| PYLON_APP_ID | Pylon application ID |
| PYLON_IDENTITY_VERIFICATION_SECRET | Secret for verifying Pylon identities |

Kubernetes

These variables enable you to configure Kubernetes integration.
| Variable | Description |
| --- | --- |
| K8S_NODE_NAME | Name of the Kubernetes node |
| K8S_POD_NAME | Name of the Kubernetes pod |
| K8S_POD_NAMESPACE | Namespace of the Kubernetes pod |
| LIGHTDASH_CLOUD_INSTANCE | Identifier for the Lightdash Cloud instance |

Organization appearance

These variables allow you to customize the default appearance settings for your Lightdash instance's organizations. The color palette defined here is applied to all organizations in your instance, and no other palette can be chosen while these env vars are set.
| Variable | Description |
| --- | --- |
| OVERRIDE_COLOR_PALETTE_NAME | Name of the default color palette |
| OVERRIDE_COLOR_PALETTE_COLORS | Comma-separated list of hex color codes for the default color palette (must be exactly 20 colors) |
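Because the palette must contain exactly 20 hex codes, it is worth validating the value before deploying. A validation sketch (hypothetical helper; the accepted hex format is an assumption):

```python
import re


def parse_palette(env: dict) -> list:
    """Validate OVERRIDE_COLOR_PALETTE_COLORS: exactly 20 hex color codes."""
    colors = [c.strip() for c in env["OVERRIDE_COLOR_PALETTE_COLORS"].split(",")]
    if len(colors) != 20:
        raise ValueError(f"Expected 20 colors, got {len(colors)}")
    for color in colors:
        # Accept 6-digit hex, with or without a leading '#' (assumed format).
        if not re.fullmatch(r"#?[0-9a-fA-F]{6}", color):
            raise ValueError(f"Invalid hex color: {color!r}")
    return colors


palette = ",".join(["#1f77b4"] * 20)  # placeholder palette of 20 entries
print(len(parse_palette({"OVERRIDE_COLOR_PALETTE_COLORS": palette})))  # 20
```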

Initialize instance

When a new Lightdash instance is created, there are no organizations or projects yet. You can initialize an organization and a project using environment variables to simplify the deployment process.
Initialize instance is only available on Lightdash Enterprise plans. For more information on our plans, visit our pricing page.
Currently, only Databricks project types and GitHub dbt configuration are supported.
| Variable | Description |
| --- | --- |
| LD_SETUP_ADMIN_NAME | Name of the admin user for initial setup (default=Admin User) |
| LD_SETUP_ADMIN_EMAIL | (Required) Email of the admin user for initial setup |
| LD_SETUP_ORGANIZATION_EMAIL_DOMAIN | Comma-separated list of email domains for organization whitelisting |
| LD_SETUP_ORGANIZATION_DEFAULT_ROLE | Default role for new organization members (default=viewer) |
| LD_SETUP_ORGANIZATION_NAME | (Required) Name of the organization |
| LD_SETUP_ADMIN_API_KEY | (Required) API key for the admin user; must start with the ldpat_ prefix |
| LD_SETUP_API_KEY_EXPIRATION | Number of days until the API key expires (0 for no expiration) (default=30) |
| LD_SETUP_SERVICE_ACCOUNT_TOKEN | (Required) A pre-set token for the service account; must start with the ldsvc_ prefix |
| LD_SETUP_SERVICE_ACCOUNT_EXPIRATION | Number of days until the service account token expires (0 for no expiration) (default=30) |
| LD_SETUP_PROJECT_NAME | (Required) Name of the project |
| LD_SETUP_PROJECT_CATALOG | Catalog name for the Databricks project |
| LD_SETUP_PROJECT_SCHEMA | (Required) Schema/database name for the project |
| LD_SETUP_PROJECT_HOST | (Required) Hostname of the Databricks server |
| LD_SETUP_PROJECT_HTTP_PATH | (Required) HTTP path for the Databricks connection |
| LD_SETUP_PROJECT_PAT | (Required) Personal access token for Databricks |
| LD_SETUP_START_OF_WEEK | Day to use as the start of the week (default=SUNDAY) |
| LD_SETUP_PROJECT_COMPUTE | JSON string with Databricks compute configuration, e.g. {"name": "string", "httpPath": "string"} |
| LD_SETUP_DBT_VERSION | Version of dbt to use (e.g. v1.8) (default=latest) |
| LD_SETUP_GITHUB_PAT | (Required) GitHub personal access token |
| LD_SETUP_GITHUB_REPOSITORY | (Required) GitHub repository for the dbt project |
| LD_SETUP_GITHUB_BRANCH | (Required) GitHub branch for the dbt project |
| LD_SETUP_GITHUB_PATH | Subdirectory path within the GitHub repository (default=/) |
To log in as the admin user using SSO, you must also enable the following environment variable:
AUTH_ENABLE_OIDC_TO_EMAIL_LINKING=true
This allows you to link your SSO account with the provided email without using an invitation code.
This email is trusted: any user with an OIDC account using that email will have access to the admin user.
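Putting the required variables together, a minimal bootstrap configuration for a Databricks project with a GitHub dbt repository might look like the fragment below. Every value shown is a placeholder for illustration:

```shell
# Admin + organization (placeholder values throughout)
LD_SETUP_ADMIN_EMAIL=admin@example.com
LD_SETUP_ORGANIZATION_NAME="Acme Analytics"
LD_SETUP_ADMIN_API_KEY=ldpat_examplekey123456          # must start with ldpat_
LD_SETUP_SERVICE_ACCOUNT_TOKEN=ldsvc_exampletoken123   # must start with ldsvc_

# Databricks project
LD_SETUP_PROJECT_NAME="Acme Warehouse"
LD_SETUP_PROJECT_HOST=dbc-example.cloud.databricks.com
LD_SETUP_PROJECT_HTTP_PATH=/sql/1.0/warehouses/example
LD_SETUP_PROJECT_SCHEMA=analytics
LD_SETUP_PROJECT_PAT=dapi-exampletoken

# dbt project on GitHub
LD_SETUP_GITHUB_PAT=ghp_exampletoken
LD_SETUP_GITHUB_REPOSITORY=acme/dbt-project
LD_SETUP_GITHUB_BRANCH=main
```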

Update instance

On server start, Lightdash checks the following variables and updates some configuration of the organization or project. Having more than one organization or project can throw runtime errors and prevent the server from starting.
Update instance is only available on Lightdash Enterprise plans. For more information on our plans, visit our pricing page.
| Variable | Description |
| --- | --- |
| LD_SETUP_ADMIN_EMAIL | (Required if LD_SETUP_ADMIN_API_KEY is present) Email of the admin whose Personal Access Token should be updated |
| LD_SETUP_ADMIN_API_KEY | API key for the admin user; must start with the ldpat_ prefix |
| LD_SETUP_ORGANIZATION_EMAIL_DOMAIN | Comma-separated list of email domains for organization whitelisting |
| LD_SETUP_ORGANIZATION_DEFAULT_ROLE | Default role for new organization members (default=viewer) |
| LD_SETUP_PROJECT_HTTP_PATH | HTTP path for the Databricks connection |
| LD_SETUP_PROJECT_PAT | Personal access token for Databricks |
| LD_SETUP_DBT_VERSION | Version of dbt to use (e.g. v1.8) (default=latest) |
| LD_SETUP_GITHUB_PAT | GitHub personal access token |
| LD_SETUP_SERVICE_ACCOUNT_TOKEN | A pre-set token for the service account; must start with the ldsvc_ prefix |