```rust
// Handler for POST /api/sessions/{session_id}/eeg_data/export
async fn export_eeg_data(
```
This is a massive function with too many responsibilities. I would recommend breaking it down into a bunch of helpers to improve readability and modularity.
```rust
) -> Result<impl IntoResponse, (StatusCode, String)> {
    info!("Received request to export EEG data for session {}", session_id);
```
```rust
// right now the only export format supported is CSV, so we just check for that
```
The prechecks can go in their own helper function. Make it more modular, since the logic for selecting a time frame (start and end time) is not export-specific. This will reduce code duplication:
```rust
// check for time range, else use defaults
// for end time, we default to the current time
// for start time, we default to the earliest timestamp for the session
let end_time = match request.options.end_time {
    Some(t) => t,
    None => Utc::now(),
};
let start_time = match request.options.start_time {
    Some(t) => t,
    None => {
        // we call the helper function in db.rs to get the earliest timestamp
        match shared_logic::db::get_earliest_eeg_timestamp(&app_state.db_client, session_id).await {
            Ok(Some(t)) => t,
            Ok(None) => return Err((StatusCode::NOT_FOUND, format!("No EEG data found for session {}", session_id))),
            Err(e) => return Err((StatusCode::INTERNAL_SERVER_ERROR, format!("Failed to get earliest EEG timestamp: {}", e))),
        }
    }
};
if start_time > end_time {
    return Err((StatusCode::BAD_REQUEST, "start_time cannot be after end_time".to_string()));
}
```
```rust
    Some(t) => t,
    None => {
        // we call the helper function in db.rs to get the earliest timestamp
        match shared_logic::db::get_earliest_eeg_timestamp(&app_state.db_client, session_id).await {
```
You can add `get_earliest_eeg_timestamp` to the existing `use shared_logic::db::{initialize_connection, DbClient};` so that you don't have to write out `shared_logic::db::get_earliest_eeg_timestamp` every time.
## Export and Import Pull Request

### What?
Added the ability to export and import EEG data to and from CSVs.

By doing so, `eeg_data` was also made session-based. Note that this alters the first message the frontend sends, as it must now include the `session_id`.

Also importantly, the migration for this calls `TRUNCATE` on the existing data in the table to empty it out, as the migration I wrote was written under the assumption that the tables hold no progress (since, at least as of the time of writing this code, it's dummy data).

### How?
#### Columns and Indexing
Added session ID as a column in the EEG data table, localizing it. Also added an index on session ID and time, since these will cover the majority of our EEG look-ups.

There was an existing index on time; I left it there for now, as it could still be useful if we ever plan on doing global, time-based queries (i.e. looking at EEG data across all sessions).
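A rough sketch of what that migration could look like (all table, column, and index names here are assumptions for illustration, not taken from the PR):

```sql
-- Illustrative sketch only: names are assumed, not from the actual migration.
TRUNCATE TABLE eeg_data;
ALTER TABLE eeg_data ADD COLUMN session_id UUID NOT NULL;
-- composite index to serve the common "this session, this time window" look-ups
CREATE INDEX idx_eeg_data_session_time ON eeg_data (session_id, time);
-- the pre-existing index on time alone is left in place for global queries
```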
#### Insert
Along with modifying the EEG table, I also had to modify the function that inserts into it. That meant modifying every place that calls it, so I also had to change `start_broadcast` in `bc.rs` and `main.rs` for the WebSocket server.

#### Export Format
The JSON request I designed takes a start and an end time, in the following format:
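The shape below is an assumption inferred from the handler's `request.options.start_time` / `request.options.end_time` accesses, with illustrative RFC3339 values; it is not necessarily the PR's exact schema:

```json
{
  "options": {
    "start_time": "2025-01-01T00:00:00Z",
    "end_time": "2025-01-02T00:00:00Z"
  }
}
```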
For now, I also made the start and end times optional: if an end time isn't given, we just assume they want data up until now; if a start time isn't given, we assume we're starting from the earliest timestamp in the database for the calling session.
#### Import Format
The first two assumptions concern the shape of the data rows and the existence of a header. Basically, we expect the CSVs to look like this:
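As a placeholder sketch of that header (the channel column names are hypothetical; only a leading RFC3339 timestamp column is implied by the surrounding text):

```csv
time,channel_1,channel_2,channel_3,channel_4
```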
Except with values, ideally.
We also assume the time is in RFC3339 format.
### Testing

#### Better
Populate a session ID with some `eeg_data`. One way to do this is to use the new import call, `BASE_URL/api/sessions/{session_id}/eeg_data/import`, with the body as the `eeg_data`, for example:

Now that we've populated the database somewhat, we can test export by fetching the same data. We can do so with a POST request to `BASE_URL/api/sessions/{session_id}/eeg_data/export`, with the body as follows:

Doing this should return a CSV in the body in Postman, and produce an actual CSV at the top level when running in the terminal/VSC. It should naturally match the values that you've passed in.
Additionally, we can also test the time-filtering capabilities of this function using the same example, but changing the body slightly to cut off certain values:
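For example, assuming the request nests the times under an `options` object (an assumption, not confirmed here), a narrowed window might look like:

```json
{
  "options": {
    "start_time": "2025-01-01T00:03:00Z",
    "end_time": "2025-01-01T00:05:00Z"
  }
}
```

Rows stamped before the new `start_time` should then be absent from the exported CSV.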
### Notes

- `session_id` and the timestamp, potentially worth coming back to discuss.