_New Mexico Bureau of Geology and Mineral Resources_
OcotilloAPI is a FastAPI-based backend service designed to manage geospatial sample location data across New Mexico. It supports research, field operations, and public data delivery for the Bureau of Geology and Mineral Resources.
---
Notes:

- All `Update` schema fields are optional and default to `None`
- All `Response` schema fields are defined as `<type>` if non-nullable and `<type> | None` if nullable
- All raised exceptions should use the `PydanticStyleException` as defined in `services/exceptions_helper.py`
- Errors handled by the database should be enumerated and handled in a `database_error_handler` in each router's file

---
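As an illustrative sketch of that convention (the real `PydanticStyleException` shape and handler signature live in `services/exceptions_helper.py` and may differ; the error mapping below is an assumption), a per-router database error handler might enumerate known failure modes like this:

```python
# Hypothetical sketch: the exception fields and error mapping are
# assumptions, not the actual services/exceptions_helper.py implementation.
from dataclasses import dataclass, field


@dataclass
class PydanticStyleException(Exception):
    """Error payload shaped like Pydantic's validation errors (assumed shape)."""
    loc: list = field(default_factory=list)
    msg: str = ""
    type: str = "database_error"

    def detail(self) -> dict:
        return {"loc": self.loc, "msg": self.msg, "type": self.type}


def database_error_handler(exc: Exception) -> PydanticStyleException:
    """Map known database failures to a PydanticStyleException (illustrative)."""
    message = str(exc).lower()
    if "unique constraint" in message:
        return PydanticStyleException(
            loc=["body"], msg="Record already exists", type="integrity_error")
    if "foreign key" in message:
        return PydanticStyleException(
            loc=["body"], msg="Related record not found", type="integrity_error")
    # Anything not enumerated above falls back to a generic database error.
    return PydanticStyleException(msg="Unexpected database error")
```

Each router's file would catch driver-level exceptions, pass them through its handler, and raise the mapped error.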
## 📦 Ocotillo CLI
The `oco` command exposes project automation and bulk data utilities. The bulk upload command parses and validates each row, creates the corresponding field events/samples/observations, and prints a JSON summary (matching the API response shape) so the workflow can be automated or scripted.
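As a rough sketch of that workflow (the row columns and the summary shape below are assumptions for illustration, not the actual `oco` implementation), the bulk upload step amounts to: parse each row, validate it, create records for the valid ones, and emit a JSON summary:

```python
# Hypothetical sketch of the bulk-upload flow; column names and the
# summary shape are assumptions, not the real oco CLI.
import csv
import io
import json

REQUIRED = ("sample_id", "latitude", "longitude")


def bulk_upload(csv_text: str) -> str:
    """Parse and validate rows, then return a JSON summary of the run."""
    created, errors = [], []
    # start=2: line 1 of the file is the CSV header row.
    for lineno, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        missing = [col for col in REQUIRED if not row.get(col)]
        if missing:
            errors.append({"line": lineno, "missing": missing})
            continue
        # In the real CLI this is where field events/samples/observations
        # would be created through the API layer.
        created.append(row["sample_id"])
    return json.dumps({"created": len(created), "errors": errors})


print(bulk_upload("sample_id,latitude,longitude\nS1,34.1,-106.9\nS2,,-106.8\n"))
# → {"created": 1, "errors": [{"line": 3, "missing": ["latitude"]}]}
```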
## 🧪 Testing
```bash
# Run unit tests
pytest

# Run Behave BDD specs
behave tests/features
```
> Tests require a local Postgres/PostGIS instance. Set `POSTGRES_*` values in `.env`, run migrations, and ensure the database is reachable before running the suites.
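For reference, a minimal `.env` for local testing might look like the following (only the `POSTGRES_*` prefix comes from this document; the specific variable names and all values are placeholders):

```bash
# Placeholder values; adjust to match your local Postgres/PostGIS instance.
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=ocotillo
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me
```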
## 🔄 Data Transfers
Legacy or staging datasets can be imported using the transfer utilities:
```bash
python -m transfers.transfer
```
Configure the `.env` file with the appropriate credentials before running transfers.