
Database Exception Leakage on Duplicate Entities #63

@Bodeayman

Description


When creating entities (e.g., via POST), the API correctly validates the allowed keys in a payload, but it does not check for duplicate unique keys before interacting with the database. When a payload contains a value that duplicates a unique key (such as name), the API attempts the SQL INSERT directly. Because the database schema enforces uniqueness (as defined in the SQL files), this raises an internal UniqueViolationError. Since this exception is unhandled (or poorly handled), the internal Postgres schema and constraint names leak back to the client, which can help unauthorized users learn about the internal database structure. In addition, the API returns a 400 Bad Request, whereas a 409 Conflict would be the appropriate status code and better for developer experience.
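One way to address this would be to catch the driver-level unique-violation error and map it to a sanitized 409 response. The sketch below is not the project's actual code: the `UniqueViolationError` class here is a stand-in for `asyncpg.exceptions.UniqueViolationError`, and `create_entity` / `insert_fn` are hypothetical names illustrating the shape of the fix.

```python
# Sketch (hypothetical names, not the project's code): translate a
# unique-violation error into a sanitized 409 payload instead of
# echoing str(exc), which contains constraint names and key values.

class UniqueViolationError(Exception):
    """Stand-in for asyncpg.exceptions.UniqueViolationError."""

def sanitized_conflict_response(exc: Exception) -> dict:
    # Deliberately ignore str(exc): it leaks internal schema details.
    return {
        "code": 409,
        "type": "error",
        "message": "An entity with the given unique field already exists.",
    }

def create_entity(insert_fn, payload: dict) -> dict:
    try:
        return insert_fn(payload)
    except UniqueViolationError as exc:
        return sanitized_conflict_response(exc)
```

The key point is that the client-facing message is a fixed string; the original exception can still be logged server-side for debugging.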

Affected Files

  • app/utils/utils.py
  • app/v1/endpoints/create/thing.py
  • app/v1/endpoints/create/sensor.py
  • app/v1/endpoints/create/location.py
  • app/v1/endpoints/create/observed_property.py
  • app/v1/endpoints/create/feature_of_interest.py
  • app/v1/endpoints/create/network.py
  • Equivalent UPDATE (PATCH) endpoints are also affected.

Affected Endpoints

POST/PATCH Endpoint     Unique Constraint Field
/Things                 name
/Sensors                name
/Locations              name
/ObservedProperties     name
/FeaturesOfInterest     name
/Networks               name

Steps to Reproduce

You will need to run the containers and send a POST request to create an entity that already exists.

Send this twice:

POST /v1.1/Locations
Content-Type: application/json

{
  "name": "location name 1",
  "description": "location 1",
  "encodingType": "application/vnd.geo+json",
  "location": {
    "type": "Point",
    "coordinates": [-117.05, 51.05]
  }
}

The server allows the payload through the initial router validation and attempts to construct the database query. Because the name already exists, it triggers an internal database exception that bubbles up to the client.
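The failure mode can be illustrated in miniature with the standard-library sqlite3 module (a stand-in only; the project uses Postgres, where the analogous exception is asyncpg's UniqueViolationError): the second identical INSERT raises a driver-level exception whose message names the violated constraint, and returning that message verbatim is exactly the leak described here.

```python
import sqlite3

# Minimal stand-in (sqlite3, not the project's Postgres stack): a
# duplicate INSERT against a UNIQUE column raises a driver exception
# whose message exposes the constraint/column involved.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE location (name TEXT UNIQUE)")
conn.execute("INSERT INTO location (name) VALUES ('location name 1')")

try:
    conn.execute("INSERT INTO location (name) VALUES ('location name 1')")
except sqlite3.IntegrityError as exc:
    leaked_message = str(exc)  # names the violated constraint/column
    print(leaked_message)
```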

Instead of a validation-style response, the API returns internal database details such as:

{
  "code": 400,
  "type": "error",
  "message": "duplicate key value violates unique constraint \"unique_location_name\"\nDETAIL:  Key (name)=(location name 1) already exists."
}

Additional Behavior

For PATCH requests on these entities, sending a payload with an existing unique name triggers the exact same unhandled database exception leakage.

Consequence / Destructive Potential

This behavior allows any client to map out internal database structures (such as constraint names and relational structure) by intentionally triggering duplicate key conflicts. It also wastes database connection pool resources, since each such request opens a transaction that fails deep in Postgres instead of being rejected early at the application layer.
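Failing earlier at the application layer could be sketched as a pre-check before the INSERT (shown with sqlite3 for self-containment; table and column names are hypothetical). Note this reduces load from obvious duplicates but cannot replace handling the constraint error, because two concurrent requests can still race past the check.

```python
import sqlite3

def name_exists(conn, table: str, name: str) -> bool:
    # `table` must come from trusted application code, never from the
    # client, since it is interpolated into the SQL; the user-supplied
    # `name` goes through a bound parameter.
    row = conn.execute(
        f"SELECT 1 FROM {table} WHERE name = ?", (name,)
    ).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE location (name TEXT UNIQUE)")
conn.execute("INSERT INTO location (name) VALUES ('location name 1')")

print(name_exists(conn, "location", "location name 1"))  # True
print(name_exists(conn, "location", "some other name"))  # False
```

Even with this pre-check, the unique constraint remains the source of truth, so the 409 mapping described above is still needed for the race window.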
