The project includes integration tests located in src/test/java/com/databricks/jdbc/integration/fakeservice/tests.
These tests run against fake services that mimic production services like SQL_EXEC, SQL_GATEWAY, and THRIFT_SERVER.
The fake service can operate in three modes, controlled by the environment variable FAKE_SERVICE_TEST_MODE:
- RECORD: Records responses from production services and saves them to src/test/resources/
- REPLAY (default): Replays recorded responses without connecting to production services
- DRY: Connects to production services with the fake service acting as a pass-through proxy
The fake service can emulate different Databricks service types, controlled by the environment variable FAKE_SERVICE_TYPE:
- SQL_EXEC (default): Emulates the SQL Execution service (SQL warehouse compute)
- SQL_GATEWAY: Emulates the SQL Gateway service (SQL warehouse compute)
- THRIFT_SERVER: Emulates the Thrift Server service (All-purpose compute)
The appropriate client type is selected automatically based on FAKE_SERVICE_TYPE.
For RECORD and DRY modes, which connect to actual Databricks services, authentication is required via the environment variable DATABRICKS_TOKEN:
- DATABRICKS_TOKEN: A valid Databricks Personal Access Token (PAT) for the environment specified in the properties files
- The token is not required for REPLAY mode, which uses pre-recorded responses
- For security, always use environment variables rather than hardcoding tokens in test files
- The token should have permissions to access the SQL warehouses or clusters defined in the properties files
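The token rules above can be sketched in code. This is a hypothetical helper, not part of the driver: it reads DATABRICKS_TOKEN from the environment (never from a hardcoded string) and fails fast when RECORD or DRY mode is requested without one.

```java
// Hypothetical helper; names are illustrative and not part of the driver.
public final class TokenCheck {

  /**
   * Returns the PAT from the environment for modes that need one,
   * or null for REPLAY, which uses pre-recorded responses.
   */
  static String requireTokenFor(String mode) {
    boolean needsToken =
        mode.equalsIgnoreCase("record") || mode.equalsIgnoreCase("dry");
    String token = System.getenv("DATABRICKS_TOKEN");
    if (needsToken && (token == null || token.isEmpty())) {
      throw new IllegalStateException(
          "DATABRICKS_TOKEN must be set for " + mode + " mode");
    }
    return needsToken ? token : null;
  }

  public static void main(String[] args) {
    // REPLAY works without any token:
    System.out.println(requireTokenFor("replay")); // prints "null"
  }
}
```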
Each fake service type has its own properties file in the src/test/resources/ directory:
- SQL_EXEC: Uses sqlexecfakeservicetest.properties
- SQL_GATEWAY: Uses sqlgatewayfakeservicetest.properties
- THRIFT_SERVER: Uses thriftserverfakeservicetest.properties
The FAKE_SERVICE_TYPE environment variable determines which properties file is loaded. These files contain configuration values required for testing, including:
- host.databricks: The Databricks host URL
- host.cloudfetch: The cloud storage URL for fetching results
- httppath: The HTTP path for the compute resource
  - SQL warehouse paths (for SQL_EXEC and SQL_GATEWAY) use format: /sql/1.0/warehouses/&lt;warehouse-id&gt;
  - All-purpose compute paths (for THRIFT_SERVER) use format: /sql/protocolv1/o/&lt;org-id&gt;/&lt;cluster-id&gt;
- connschema: The default schema to connect to
- conncatalog: The default catalog to connect to
- testcatalog: The catalog to use for creating test tables
- testschema: The schema to use for creating test tables
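As an illustration, a SQL_EXEC properties file might look like the sketch below. Every value here is hypothetical; the real files live in src/test/resources/.

```properties
# Illustrative values only -- not the contents of the real properties files.
host.databricks=https://my-workspace.cloud.databricks.com
host.cloudfetch=https://my-results-bucket.s3.amazonaws.com
httppath=/sql/1.0/warehouses/1234567890abcdef
connschema=default
conncatalog=main
testcatalog=main
testschema=jdbc_tests
```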
Run connection tests using SQL_GATEWAY in REPLAY mode:
FAKE_SERVICE_TYPE=SQL_GATEWAY FAKE_SERVICE_TEST_MODE=replay mvn -Dtest=com.databricks.jdbc.integration.fakeservice.tests.ConnectionIntegrationTests test

Run all tests using THRIFT_SERVER in REPLAY mode:

FAKE_SERVICE_TYPE=THRIFT_SERVER FAKE_SERVICE_TEST_MODE=replay mvn -Dtest=*IntegrationTests test

Run execution tests using the default SQL_EXEC in RECORD mode:

DATABRICKS_TOKEN=<personal-access-token> FAKE_SERVICE_TEST_MODE=record mvn -Dtest=com.databricks.jdbc.integration.fakeservice.tests.ExecutionIntegrationTests test

For RECORD or DRY modes, you need to set a personal access token:

DATABRICKS_TOKEN=<personal-access-token> FAKE_SERVICE_TYPE=SQL_GATEWAY FAKE_SERVICE_TEST_MODE=record mvn -Dtest=*IntegrationTests test

- Classes ending with Test are unit tests
- Classes under com/databricks/jdbc/integration/e2e are the highest-fidelity end-to-end tests
- Classes under com/databricks/jdbc/integration/fakeservice/tests are fake service tests
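For orientation, the host.databricks and httppath values from the properties files combine into a Databricks JDBC connection URL of the form jdbc:databricks://&lt;host&gt;:443;httpPath=&lt;http-path&gt;. The sketch below shows that assembly with hypothetical values; it is an illustration of the URL shape, not code from the test suite.

```java
// Sketch only: maps properties-file values onto the Databricks JDBC URL shape.
// The host and warehouse ID below are hypothetical.
public final class JdbcUrlSketch {

  static String buildUrl(String host, String httpPath) {
    // Drop the scheme if present; the JDBC URL carries its own prefix.
    String bareHost = host.replaceFirst("^https?://", "");
    return "jdbc:databricks://" + bareHost + ":443;httpPath=" + httpPath;
  }

  public static void main(String[] args) {
    System.out.println(buildUrl(
        "https://my-workspace.cloud.databricks.com",
        "/sql/1.0/warehouses/1234567890abcdef"));
  }
}
```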