Adapt probtest workflow to LETKF outputs #83
Merged
45 commits
All commits are by cghielmini:

b3798e0 report and observations in check
b255f0e improved version of extended check
aac573d first version of including veri_data in fof-compare
a08b99b adapt probtest to ekf
357534d first complete version of this PR
d053bcf rebasing
43344d3 solve tests failing
e4faee0 integration first part of comments
17f49ac improve write_differences function
9560e94 solve pylint
33bf265 first draft new version fof-compare
704c0bd make fof-comare more similar to check
5c67751 log file for error
17e6256 correct path
8c276bb in progress
84fac35 adapt to ekf
c320c83 make code more efficient and clean
ec729f6 add log file for tolerance
ea37cfd write log tolerance
b9ab769 Merge branch 'main' into adapt_to_ekf
1c6d5da change way to write log tolerance
85b9c34 change text write_tolerance_log
60b187a clean create_tolerance_csv
b554bc8 clean log file creation
d8e1805 fof types names and two loggers
9166d42 differenciate better between detailed and normal logger
f22df50 solved tolerance problem
f488e44 allow multiple log files
c8b1d89 Merge remote-tracking branch 'origin/main' into adapt_to_ekf
082c9eb function for names
1641123 add description funcitons and polish
70f4b3b fix tests
3461953 ready for review
360e127 ready for review for real
a6fb9a7 correct pylint
535cfb6 first part of suggestions
5d4ec20 integration all suggestions
91a161e Update engine/fof_compare.py
6aaf681 add rules and temp directory
abbdc3c Merge branch 'main' into adapt_to_ekf
3123839 add test for fof-compare
96f0fe5 cleaning test
88d8f31 make fof_type mandatory and add help for file path 1 and 2
9428b08 clean
096eef7 clean commented lines
New test file (135 lines added):

```python
"""
This module contains test cases to validate the functionality
of fof-compare CLI commands.
"""

import logging
import os
from pathlib import Path

import pytest
from click.testing import CliRunner

from engine.fof_compare import fof_compare


@pytest.fixture(name="fof_datasets", scope="function")
def fixture_fof_datasets(fof_datasets_base, tmp_dir):
    """
    FOF datasets written to disk, returns file paths.
    """
    ds1, ds2, _, _ = fof_datasets_base
    ds3 = ds2.copy(deep=True)
    ds3["flags"] = (("d_body",), ds3["flags"].values * 1.55)

    ds1_file = os.path.join(tmp_dir, "fof1_SYNOP.nc")
    ds2_file = os.path.join(tmp_dir, "fof2_SYNOP.nc")
    ds3_file = os.path.join(tmp_dir, "fof3_SYNOP.nc")

    ds1.to_netcdf(ds1_file)
    ds2.to_netcdf(ds2_file)
    ds3.to_netcdf(ds3_file)

    yield ds1_file, ds2_file, ds3_file


def test_fof_compare_works(fof_datasets, tmp_dir, monkeypatch):
    """
    Test that fof-compare works and produces a log file.
    """
    df1, df2, _ = fof_datasets

    df1 = df1.replace("SYNOP", "{fof_type}")
    df2 = df2.replace("SYNOP", "{fof_type}")
    monkeypatch.chdir(tmp_dir)
    rules = ""
    runner = CliRunner()

    result = runner.invoke(
        fof_compare,
        [
            "--file1",
            df1,
            "--file2",
            df2,
            "--fof-types",
            "SYNOP",
            "--tolerance",
            "1e-12",
            "--rules",
            rules,
        ],
    )

    assert result.exit_code == 0

    log_file = Path(tmp_dir) / "error_fof1_SYNOP.log"

    assert log_file.exists()


def test_fof_compare_not_consistent(fof_datasets, tmp_dir, monkeypatch, caplog):
    """
    Test that if there are differences in the files, then fof-compare writes
    in the log file that the files are not consistent.
    """
    df1, _, df3 = fof_datasets
    df1 = df1.replace("SYNOP", "{fof_type}")
    df3 = df3.replace("SYNOP", "{fof_type}")
    monkeypatch.chdir(tmp_dir)

    rules = ""
    runner = CliRunner()
    with caplog.at_level(logging.INFO):
        runner.invoke(
            fof_compare,
            [
                "--file1",
                df1,
                "--file2",
                df3,
                "--fof-types",
                "SYNOP",
                "--tolerance",
                "5",
                "--rules",
                rules,
            ],
        )

    assert "Files are NOT consistent!" in caplog.text


def test_fof_compare_consistent(fof_datasets, tmp_dir, monkeypatch, caplog):
    """
    Test that if there are no differences in the files and the tolerance is big
    enough, then fof-compare writes in the log file that the files are consistent.
    """
    df1, df2, _ = fof_datasets
    df1 = df1.replace("SYNOP", "{fof_type}")
    df2 = df2.replace("SYNOP", "{fof_type}")
    monkeypatch.chdir(tmp_dir)

    rules = ""
    runner = CliRunner()
    with caplog.at_level(logging.INFO):
        runner.invoke(
            fof_compare,
            [
                "--file1",
                df1,
                "--file2",
                df2,
                "--fof-types",
                "SYNOP",
                "--tolerance",
                "5",
                "--rules",
                rules,
            ],
        )

    assert "Files are consistent!" in caplog.text
```
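The tests drive `fof_compare` in-process through click's `CliRunner` rather than spawning a subprocess, which keeps pytest fixtures like `monkeypatch` and `caplog` usable. A minimal self-contained sketch of the same pattern, using a hypothetical `compare` command in place of `fof_compare` (its options mirror the test invocation, but its body is invented for illustration):

```python
import click
from click.testing import CliRunner


@click.command()
@click.option("--file1", required=True, help="First file to compare.")
@click.option("--file2", required=True, help="Second file to compare.")
@click.option("--tolerance", type=float, default=1e-12)
def compare(file1, file2, tolerance):
    """Toy stand-in for fof_compare: reports consistency when the paths match."""
    if file1 == file2:
        click.echo("Files are consistent!")
    else:
        click.echo("Files are NOT consistent!")


# invoke() runs the command in-process and captures output and exit code.
runner = CliRunner()
result = runner.invoke(
    compare, ["--file1", "a.nc", "--file2", "a.nc", "--tolerance", "5"]
)
print(result.exit_code)       # 0: the command returned without error
print(result.output.strip())  # Files are consistent!
```

`invoke()` also swallows exceptions raised by the command (available via `result.exception`), which is why the real tests assert on `result.exit_code` and on captured log text instead of relying on a raised error.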