Commit 5c6dbfc

Merge branch 'maps-as-data:main' into fix_model_summary_reset_device_#542

2 parents: 2590da5 + 7c8805c

8 files changed: 46 additions & 69 deletions

.github/workflows/mr_ci.yml

Lines changed: 2 additions & 16 deletions
```diff
@@ -2,9 +2,8 @@
 name: Units Tests
 
 on:
-  pull_request:
-    branches:
-      - main
+  # Manual trigger only
+  workflow_dispatch:
 
 # Cancel existing tests on the same PR if a new commit is added to a pull request
 concurrency:
@@ -39,7 +38,6 @@ jobs:
       - name: Install dependencies
         run: |
           python -m pip install ".[dev]"
-          python -m pip install 'git+https://github.com/rwood-97/piffle.git@iiif_dataclasses'
           python -m pip install pytest-cov
 
       - name: Quality Assurance
@@ -52,15 +50,3 @@ jobs:
       - name: Test with pytest
         run: |
           python -m pytest ./tests --ignore=tests/test_text_spotting/
-
-      - name: Upload coverage to Codecov
-        uses: codecov/codecov-action@v4
-        with:
-          token: ${{ secrets.CODECOV_TOKEN }}
-          directory: ./coverage/reports/
-          env_vars: OS,PYTHON
-          fail_ci_if_error: false
-          files: ./coverage.xml,!./cache
-          flags: unittests
-          name: codecov-umbrella
```

.github/workflows/mr_ci_text_spotting.yml

Lines changed: 6 additions & 19 deletions
```diff
@@ -50,12 +50,10 @@ jobs:
           python -m pip install numpy==1.26.4 torch==2.2.2 torchvision==0.17.2 -f https://download.pytorch.org/whl/torch_stable.html
           python -m pip install ".[dev]"
           python -m pip install pytest-cov
-          python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
-          python -m pip install 'git+https://github.com/maps-as-data/DeepSolo.git'
-          python -m pip install 'git+https://github.com/maps-as-data/DPText-DETR.git'
-          python -m pip install 'git+https://github.com/maps-as-data/MapTextPipeline.git'
-          python -m pip install 'git+https://github.com/rwood-97/piffle.git@iiif_dataclasses'
-
+          python -m pip install --no-build-isolation 'git+https://github.com/facebookresearch/detectron2.git'
+          python -m pip install --no-build-isolation 'git+https://github.com/maps-as-data/DeepSolo.git'
+          python -m pip install --no-build-isolation 'git+https://github.com/maps-as-data/DPText-DETR.git'
+          python -m pip install --no-build-isolation 'git+https://github.com/maps-as-data/MapTextPipeline.git'
 
       - name: Clone DPText-DETR
         run: |
@@ -71,22 +69,11 @@ jobs:
       - name: Hugging Face CLI
         run: |
-          pip install -U "huggingface_hub[cli]"
+          pip install -U "huggingface-hub[cli]>=0.30.0,<0.34.0"
           huggingface-cli download rwood-97/DPText_DETR_ArT_R_50_poly art_final.pth --local-dir .
           huggingface-cli download rwood-97/DeepSolo_ic15_res50 ic15_res50_finetune_synth-tt-mlt-13-15-textocr.pth --local-dir .
           huggingface-cli download rwood-97/MapTextPipeline_rumsey rumsey-finetune.pth --local-dir .
 
       - name: Test with pytest
         run: |
-          python -m pytest --cov=./ --cov-report=xml ./tests
-
-      - name: Upload coverage to Codecov
-        uses: codecov/codecov-action@v5
-        with:
-          token: ${{ secrets.CODECOV_TOKEN }}
-          directory: ./coverage/reports/
-          env_vars: OS,PYTHON
-          fail_ci_if_error: false
-          files: ./coverage.xml,!./cache
-          flags: unittests
-          name: codecov-umbrella
+          python -m pytest ./tests
```
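The new Hugging Face pin above (`>=0.30.0,<0.34.0`) is a standard PEP 508 version specifier. As a minimal sketch (assuming the third-party `packaging` library, which pip itself uses, is available), you can check which versions such a specifier admits:

```python
# Sketch: evaluating the version specifier used in the workflow above.
# Assumes the third-party `packaging` library (pip install packaging).
from packaging.specifiers import SpecifierSet

spec = SpecifierSet(">=0.30.0,<0.34.0")

print("0.33.1" in spec)  # within range -> True
print("0.29.0" in spec)  # below the lower bound -> False
print("0.34.0" in spec)  # upper bound is exclusive -> False
```

Capping below 0.34.0 while allowing patch releases within the window is a common way to avoid surprise breakage from a new minor release of `huggingface-hub`.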

.github/workflows/mr_pip_ci.yml

Lines changed: 0 additions & 1 deletion
```diff
@@ -36,7 +36,6 @@ jobs:
       - name: Install dependencies
         run: |
           python -m pip install mapreader[dev]
-          python -m pip install 'git+https://github.com/rwood-97/piffle.git@iiif_dataclasses'
 
       - name: Quality Assurance
         run: |
```

CHANGELOG.md

Lines changed: 6 additions & 0 deletions
```diff
@@ -14,6 +14,12 @@ The following table shows which versions of MapReader are compatible with which
 ## Pre-release
 _Add new changes here_
 
+## [v1.8.2](https://github.com/Living-with-machines/MapReader/releases/tag/v1.8.2) (2025-12-19)
+
+### Added
+
+- Added `piffle` package as dependency ([#575](https://github.com/maps-as-data/MapReader/pull/575))
+
 ## [v1.8.1](https://github.com/Living-with-machines/MapReader/releases/tag/v1.8.1) (2025-08-11)
 
 ### Added
```

docs/source/using-mapreader/step-by-step-guide/1-download.rst

Lines changed: 0 additions & 6 deletions
```diff
@@ -526,12 +526,6 @@ For more information on IIIF, see their documentation `here <https://iiif.io/>`_
 
 MapReader accepts any IIIF manifest which is compliant with the IIIF Presentation API (version `2 <https://iiif.io/api/presentation/2.1/>`__ or `3 <https://iiif.io/api/presentation/3.0/>`__).
 
-First, install piffle using the command below:
-
-.. code-block:: python
-
-   pip install piffle@git+https://github.com/rwood-97/piffle.git@iiif_dataclasses
-
 
 IIIFDownloader
 ~~~~~~~~~~~~~~~
```

mapreader/annotate/annotator.py

Lines changed: 6 additions & 1 deletion
```diff
@@ -512,7 +512,12 @@ def check_eligibility(row):
         queue_df["eligible"] = queue_df.apply(check_eligibility, axis=1)
 
         if self._sortby is not None:
-            queue_df.sort_values(self._sortby, ascending=self._ascending, inplace=True)
+            queue_df.sort_values(
+                by=[self._sortby, "min_y"],
+                ascending=[self._ascending, True],
+                kind="mergesort",
+                inplace=True,
+            )
             queue_df = queue_df[queue_df.eligible]
         else:
             queue_df = queue_df[queue_df.eligible].sample(frac=1)  # shuffle
```
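The change above sorts the annotation queue by a secondary `min_y` key and forces pandas' mergesort, which (unlike the default quicksort) is stable, so rows with equal primary keys keep a deterministic order across runs. A minimal sketch with made-up data:

```python
# Sketch of the multi-key stable sort used in the annotator change above,
# with hypothetical example data (not MapReader's real columns beyond min_y).
import pandas as pd

queue_df = pd.DataFrame(
    {
        "label": ["b", "a", "a", "b"],  # primary sort key with ties
        "min_y": [10, 30, 20, 5],       # vertical-position tiebreaker
    }
)

queue_df.sort_values(
    by=["label", "min_y"],        # primary key, then tiebreaker
    ascending=[True, True],
    kind="mergesort",             # stable sort -> reproducible queue order
    inplace=True,
)

print(queue_df["min_y"].tolist())  # -> [20, 30, 5, 10]
```

With ties on `label`, the `min_y` tiebreaker plus a stable sort guarantees the same queue ordering on every run, which matters when annotators resume a session.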

setup.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -63,7 +63,7 @@
         "folium>=0.12,<1.0.0",
         "mapclassify>=2.0.0,<3.0.0",
         "xyzservices==2024.9.0",
-        # "piffle @ git+https://github.com/rwood-97/piffle.git@iiif_dataclasses",
+        "piffle>=0.7.0",
         "lxml",
     ],
     extras_require={
```

tests/test_classify/test_classifier.py

Lines changed: 25 additions & 25 deletions
```diff
@@ -161,31 +161,31 @@ def test_init_resnet18_timm(inputs):
     assert classifier.dataloaders == {}
 
 
-@pytest.mark.dependency(name="timm_models", scope="session")
-def test_init_models_timm(inputs):
-    annots, dataloaders = inputs
-    for model2test in [
-        ["resnest50d_4s2x40d", timm.models.ResNet],
-        ["resnest101e", timm.models.ResNet],
-        ["resnext101_32x8d.fb_swsl_ig1b_ft_in1k", timm.models.ResNet],
-        ["resnet152", timm.models.ResNet],
-        ["tf_efficientnet_b3.ns_jft_in1k", timm.models.EfficientNet],
-        ["swin_base_patch4_window7_224", timm.models.swin_transformer.SwinTransformer],
-        ["vit_base_patch16_224", timm.models.vision_transformer.VisionTransformer],
-    ]:  # these are models from 2021 paper
-        model, model_type = model2test
-        my_model = timm.create_model(
-            model, pretrained=True, num_classes=len(annots.labels_map)
-        )
-        assert isinstance(my_model, model_type)
-        classifier = ClassifierContainer(
-            my_model, labels_map=annots.labels_map, dataloaders=dataloaders
-        )
-        assert isinstance(classifier.model, model_type)
-        assert all(k in classifier.dataloaders.keys() for k in ["train", "test", "val"])
-        classifier = ClassifierContainer(my_model, labels_map=annots.labels_map)
-        assert isinstance(classifier.model, model_type)
-        assert classifier.dataloaders == {}
+# @pytest.mark.dependency(name="timm_models", scope="session")
+# def test_init_models_timm(inputs):
+#     annots, dataloaders = inputs
+#     for model2test in [
+#         ["resnest50d_4s2x40d", timm.models.ResNet],
+#         ["resnest101e", timm.models.ResNet],
+#         ["resnext101_32x8d.fb_swsl_ig1b_ft_in1k", timm.models.ResNet],
+#         ["resnet152", timm.models.ResNet],
+#         ["tf_efficientnet_b3.ns_jft_in1k", timm.models.EfficientNet],
+#         ["swin_base_patch4_window7_224", timm.models.swin_transformer.SwinTransformer],
+#         ["vit_base_patch16_224", timm.models.vision_transformer.VisionTransformer],
+#     ]:  # these are models from 2021 paper
+#         model, model_type = model2test
+#         my_model = timm.create_model(
+#             model, pretrained=True, num_classes=len(annots.labels_map)
+#         )
+#         assert isinstance(my_model, model_type)
+#         classifier = ClassifierContainer(
+#             my_model, labels_map=annots.labels_map, dataloaders=dataloaders
+#         )
+#         assert isinstance(classifier.model, model_type)
+#         assert all(k in classifier.dataloaders.keys() for k in ["train", "test", "val"])
+#         classifier = ClassifierContainer(my_model, labels_map=annots.labels_map)
+#         assert isinstance(classifier.model, model_type)
+#         assert classifier.dataloaders == {}
 
 
 # test loading object from pickle file
```
