Merged

50 commits
14fe9cd
Save image name instead of crs on each det
GwenaelleSa May 14, 2024
87ce33b
Correct formatting of metric table in assessment
GwenaelleSa May 14, 2024
741ae70
Always get nbr classes from class file
GwenaelleSa May 31, 2024
039b9f6
Set the category file in assessement as a param
GwenaelleSa Jun 10, 2024
deb74ce
Test docker
GwenaelleSa Jun 14, 2024
0d3fbf7
Merge branch 'master' into gs/code_improvement
GwenaelleSa Jun 14, 2024
99e8cff
Add the possiblity to use a pre-defined threshold in assessment
GwenaelleSa Sep 17, 2024
f7aa58a
Merge branch 'master' into gs/code_improvement
GwenaelleSa Oct 17, 2024
3279e50
Correct typos
GwenaelleSa Oct 17, 2024
ce72e21
Merge branch 'master' into gs/code_improvement
GwenaelleSa Oct 17, 2024
3b0d821
Include sonar cube feeback
GwenaelleSa Oct 17, 2024
ef5c026
Include sonar cube feeback part 2
GwenaelleSa Oct 17, 2024
164f0cc
Remove unnecessary year info from FOLDER connector
GwenaelleSa Nov 4, 2024
134e81c
Update with master
GwenaelleSa Nov 4, 2024
140ad74
Correct FP duplicates in generate tiles
cleherny Nov 5, 2024
f9c64c6
Correct error on year dtype in prepare data
cleherny Nov 5, 2024
a896da9
Restore change in FOLDER
cleherny Nov 6, 2024
c6df88b
Merge with branch ch/update_README
cleherny Nov 6, 2024
40788b4
Correct arg in generate tilesets
cleherny Nov 7, 2024
f346432
Add year range in prepare data
cleherny Nov 7, 2024
b5c62f7
Correct error aoi_tiles not defined
cleherny Nov 7, 2024
527e7ef
Update value in config trn MES
cleherny Nov 7, 2024
6413b05
Remove useless line in prep data MES
cleherny Nov 7, 2024
42db2ca
Remove useless space in prep data MES
cleherny Nov 7, 2024
41d2e78
Add exception to year in prep data MES
cleherny Nov 7, 2024
46403b6
Improve comment config_trne MES
cleherny Nov 7, 2024
454bbf4
Add info to get_img_to_folder doc string
cleherny Nov 7, 2024
b05db75
Add fct concat_sampled_tiles
cleherny Nov 7, 2024
ecb648d
Remove useless space
cleherny Nov 7, 2024
c600a90
Correct typo in doc string
cleherny Nov 7, 2024
b0153bd
Include CH's review
GwenaelleSa Nov 11, 2024
098d33f
Remove unnecessary lines
GwenaelleSa Nov 14, 2024
6b2ff6e
Remove float for year in docstring
cleherny Nov 14, 2024
eaa1dfb
Set year col as int in MES/prepare_data
cleherny Nov 18, 2024
f86237e
Correct wrong sign in remove_overlap fct
cleherny Nov 18, 2024
24d33a7
Add remove_overlap_det arg to config files
cleherny Nov 18, 2024
ab26c8d
Add yeat to img_metadata
cleherny Nov 18, 2024
37dc493
Add doc string to get_geotiff XYZ
cleherny Nov 18, 2024
c025c5e
Correct error in MES/config_trne
cleherny Nov 18, 2024
e498d67
Set remove_overlap to false
cleherny Nov 18, 2024
e443575
Set remove overlap False in MES/config_det
cleherny Nov 18, 2024
69ff311
Remove useless else condition in debug mode
cleherny Nov 20, 2024
9d6693e
Remove 'else None' for remove_overlap key
cleherny Nov 22, 2024
f49f120
Conrrect elif to if in generate_tileset debug mode
cleherny Nov 22, 2024
7ffd07f
Remove year_img key from metadata
cleherny Nov 22, 2024
38431a9
Merge branch 'gs/code_improvement' into ch/code_improvement
cleherny Nov 25, 2024
57b58d7
Merge pull request #39 from swiss-territorial-data-lab/ch/code_improv…
cleherny Nov 25, 2024
9ee9f5f
Aesthetic modification
GwenaelleSa Nov 26, 2024
b1f17ee
Debug COCO file production
GwenaelleSa Nov 26, 2024
d2c28a7
Deal with year in the metadata
GwenaelleSa Nov 27, 2024
2 changes: 0 additions & 2 deletions docker-compose.yml
@@ -1,5 +1,3 @@
version: "3"

services:
stdl-objdet:
build: .
1 change: 1 addition & 0 deletions examples/mineral-extract-sites-detection/config_det.yaml
@@ -51,6 +51,7 @@ make_detections.py:
enabled: True
epsilon: 2.0 # cf. https://rdp.readthedocs.io/en/latest/
score_lower_threshold: 0.3
remove_det_overlap: False # if several detections overlap (IoU > 0.5), only the one with the highest confidence score is retained

# 4-Filtering and merging detection polygons
filter_detections.py:
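The `remove_det_overlap` option added across the example configs keeps, among detections that overlap with IoU > 0.5, only the one with the highest confidence score. A minimal sketch of such a filter using shapely — the function name and data layout are illustrative, not the project's actual implementation:

```python
from shapely.geometry import box


def iou(geom_a, geom_b):
    """Intersection over union of two shapely geometries."""
    inter = geom_a.intersection(geom_b).area
    union = geom_a.union(geom_b).area
    return inter / union if union > 0 else 0.0


def remove_det_overlap(detections, iou_threshold=0.5):
    """Keep only the highest-scoring detection among overlapping ones.

    detections: list of (geometry, score) tuples.
    """
    # Visit detections from highest to lowest score: a detection is kept
    # only if it does not overlap (IoU > threshold) any already-kept one.
    ordered = sorted(detections, key=lambda d: d[1], reverse=True)
    kept = []
    for geom, score in ordered:
        if all(iou(geom, kept_geom) <= iou_threshold for kept_geom, _ in kept):
            kept.append((geom, score))
    return kept


dets = [
    (box(0, 0, 10, 10), 0.9),
    (box(1, 1, 11, 11), 0.8),    # IoU with the first box ≈ 0.68 -> dropped
    (box(20, 20, 30, 30), 0.7),  # disjoint -> kept
]
print(len(remove_det_overlap(dets)))  # 2
```

This is the usual greedy non-maximum-suppression scheme, here applied to georeferenced polygons instead of pixel boxes.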
41 changes: 16 additions & 25 deletions examples/mineral-extract-sites-detection/config_trne.yaml
@@ -3,43 +3,33 @@ prepare_data.py:
srs: EPSG:2056
datasets:
shapefile: ./data/labels/tlm-hr-trn-topo.shp # GT labels
# FP_shapefile: ./data/FP/FP_list.gpkg # FP labels
# empty_tiles_aoi: ./data/AoI/AoI_2020.shp # AOI in which additional empty tiles can be selected. Only one 'empty_tiles' option can be selected
# empty_tiles_year: multi-year # If "empty_tiles_aoi" selected then provide a year. Choice: (1) numeric (i.e. 2020), (2) multi-year (random year selection within years [1945-2023])
# empty_tiles_shp: .data/EPT/<SHPFILE> # Provided shpfile of selected empty tiles. Only one 'empty_tiles' option can be selected
# fp_shapefile: ./data/FP/<SHPFILE> # FP labels
# empty_tiles_aoi: ./data/AoI/<SHPFILE> # AOI in which additional empty tiles can be selected. Only one 'empty_tiles' option can be selected
# empty_tiles_year: 2020 # If "empty_tiles_aoi" selected then provide a year. Choice: (1) numeric (i.e. 2020), (2) [year1, year2] (random selection of a year within a given year range)
# empty_tiles_shp: ./data/empty_tiles/<SHPFILE> # Provided shapefile of selected empty tiles. Only one 'empty_tiles' option can be selected
output_folder: ./output/output_trne
zoom_level: 16

# Fetch of tiles and split into 3 datasets: train, test, validation
generate_tilesets.py:
debug_mode:
enable: False # sample of tiles
nb_tiles_max: 1000
nb_tiles_max: 5000
working_directory: output
datasets:
aoi_tiles: output_trne/tiles.geojson
ground_truth_labels: output_trne/labels.geojson
# FP_labels: output_trne/FP.geojson
# fp_labels:
# fp_shp: output_trne/FP.geojson
# frac_trn: 0.7 # fraction of fp tiles to add to the trn dataset, then the remaining tiles will be split in 2 and added to tst and val datasets
image_source:
# #############
# type: FOLDER
# location: ./data/images/SWISSIMAGE/
# srs: 3857
# # year: multi-year
# #############
# type: WMS # supported values: 1. MIL = Map Image Layer 2. WMS 3. XYZ 4. FOLDER
# location: https://wms.geo.admin.ch/service
# layers: ch.swisstopo.swissimage
# srs: "EPSG:2056"
# ############
type: XYZ # supported values: 1. MIL = Map Image Layer 2. WMS 3. XYZ 4. FOLDER
year: 2020 # supported values: 1. multi-year (tiles of different year), 2. <year> (i.e. 2020)
location: https://wmts.geo.admin.ch/1.0.0/ch.swisstopo.swissimage-product/default/{year}/3857/{z}/{x}/{y}.jpeg
# ############
empty_tiles: # add empty tiles to datasets
tiles_frac: 0.5 # fraction (relative to the number of tiles intersecting labels) of empty tiles to add
frac_trn: 0.75 # fraction of empty tiles to add to the trn dataset, then the remaining tiles will be split in 2 and added to tst and val datasets
keep_oth_tiles: False # keep tiles in oth dataset not intersecting oth labels
# empty_tiles: # add empty tiles to datasets
# tiles_frac: 0.5 # fraction (relative to the number of tiles intersecting labels) of empty tiles to add
# frac_trn: 0.7 # fraction of empty tiles to add to the trn dataset, then the remaining tiles will be split in 2 and added to tst and val datasets
# keep_oth_tiles: False # keep tiles in oth dataset not intersecting oth labels
output_folder: output_trne
tile_size: 256 # per side, in pixels
seed: 42
@@ -79,13 +69,13 @@ make_detections.py:
tst: COCO_tst.json
detectron2_config_file: ../../detectron2_config_dqry.yaml # path relative to the working_folder
model_weights:
pth_file: ./logs/model_0000999.pth # trained model minimising the validation loss curve, monitor the training process via tensorboard (tensorboard --logdir </logs>)
pth_file: ./logs/model_0002999.pth # trained model minimising the validation loss curve, monitor the training process via tensorboard (tensorboard --logdir </logs>)
image_metadata_json: img_metadata.json
rdp_simplification: # rdp = Ramer-Douglas-Peucker
enabled: true
enabled: True
epsilon: 2.0 # cf. https://rdp.readthedocs.io/en/latest/
score_lower_threshold: 0.05
remove_det_overlap: True
remove_det_overlap: False # if several detections overlap (IoU > 0.5), only the one with the highest confidence score is retained

# Evaluate the quality of the detections for the different datasets by calculating metrics
assess_detections.py:
@@ -94,6 +84,7 @@ assess_detections.py:
ground_truth_labels: labels.geojson
image_metadata_json: img_metadata.json
split_aoi_tiles: split_aoi_tiles.geojson # aoi = Area of Interest
categories: category_ids.json
detections:
trn: trn_detections_at_0dot05_threshold.gpkg
val: val_detections_at_0dot05_threshold.gpkg
42 changes: 28 additions & 14 deletions examples/mineral-extract-sites-detection/prepare_data.py
@@ -138,7 +138,7 @@ def bbox(bounds):
OUTPUT_DIR = cfg['output_folder']
SHPFILE = cfg['datasets']['shapefile']
CATEGORY = cfg['datasets']['category'] if 'category' in cfg['datasets'].keys() else False
FP_SHPFILE = cfg['datasets']['FP_shapefile'] if 'FP_shapefile' in cfg['datasets'].keys() else None
FP_SHPFILE = cfg['datasets']['fp_shapefile'] if 'fp_shapefile' in cfg['datasets'].keys() else None
EPT_YEAR = cfg['datasets']['empty_tiles_year'] if 'empty_tiles_year' in cfg['datasets'].keys() else None
if 'empty_tiles_aoi' in cfg['datasets'].keys() and 'empty_tiles_shp' in cfg['datasets'].keys():
logger.error("Choose between supplying an AoI shapefile ('empty_tiles_aoi') in which empty tiles will be selected, or a shapefile with selected empty tiles ('empty_tiles_shp')")
@@ -166,7 +166,11 @@ def bbox(bounds):
## Convert datasets shapefiles into geojson format
logger.info('Convert labels shapefile into GeoJSON format (EPSG:4326)...')
labels_gdf = gpd.read_file(SHPFILE)
labels_4326_gdf = labels_gdf.to_crs(epsg=4326)
if 'year' in labels_gdf.keys():
labels_gdf['year'] = labels_gdf.year.astype(int)
labels_4326_gdf = labels_gdf.to_crs(epsg=4326).drop_duplicates(subset=['geometry', 'year'])
else:
labels_4326_gdf = labels_gdf.to_crs(epsg=4326).drop_duplicates(subset=['geometry'])
labels_4326_gdf['CATEGORY'] = 'quarry'
labels_4326_gdf['SUPERCATEGORY'] = 'land usage'
gt_labels_4326_gdf = labels_4326_gdf.copy()
@@ -184,7 +188,11 @@ def bbox(bounds):
if FP_SHPFILE:
fp_labels_gdf = gpd.read_file(FP_SHPFILE)
assert_year(fp_labels_gdf, labels_gdf, 'FP', EPT_YEAR)
fp_labels_4326_gdf = fp_labels_gdf.to_crs(epsg=4326)
if 'year' in fp_labels_gdf.keys():
fp_labels_gdf['year'] = fp_labels_gdf.year.astype(int)
fp_labels_4326_gdf = fp_labels_gdf.to_crs(epsg=4326).drop_duplicates(subset=['geometry', 'year'])
else:
fp_labels_4326_gdf = fp_labels_gdf.to_crs(epsg=4326).drop_duplicates(subset=['geometry'])
fp_labels_4326_gdf['CATEGORY'] = 'quarry'
fp_labels_4326_gdf['SUPERCATEGORY'] = 'land usage'

@@ -233,7 +241,9 @@ def bbox(bounds):
if isinstance(EPT_YEAR, int):
empty_tiles_4326_aoi_gdf['year'] = int(EPT_YEAR)
else:
empty_tiles_4326_aoi_gdf['year'] = np.random.randint(low=1945, high=2023, size=(len(empty_tiles_4326_aoi_gdf),))
empty_tiles_4326_aoi_gdf['year'] = np.random.randint(low=EPT_YEAR[0], high=EPT_YEAR[1], size=(len(empty_tiles_4326_aoi_gdf)))
elif EPT_SHPFILE and EPT_YEAR:
logger.warning("No year column in the label shapefile. The provided empty tile year will be ignored.")
elif EPT == 'shp':
if EPT_YEAR:
logger.warning("A shapefile of selected empty tiles is provided. The year set for the empty tiles in the configuration file will be ignored")
@@ -242,8 +252,8 @@ def bbox(bounds):
aoi_bbox = None
aoi_bbox_contains = False

logger.info('Creating tiles for the Area of Interest (AoI)...')
logger.info("Creating tiles for the Area of Interest (AoI)...")

# Get tiles coordinates and shapes
tiles_4326_aoi_gdf = aoi_tiling(boundaries_df)

@@ -253,27 +263,31 @@ def bbox(bounds):
logger.info(f"- Number of tiles intersecting GT labels = {len(tiles_4326_gt_gdf)}")

if FP_SHPFILE:
tiles_fp_4326_gdf = gpd.sjoin(tiles_4326_aoi_gdf, fp_labels_4326_gdf, how='inner', predicate='intersects')
tiles_fp_4326_gdf.drop_duplicates('title', inplace=True)
logger.info(f"- Number of tiles intersecting FP labels = {len(tiles_fp_4326_gdf)}")
tiles_4326_fp_gdf = gpd.sjoin(tiles_4326_aoi_gdf, fp_labels_4326_gdf, how='inner', predicate='intersects')
tiles_4326_fp_gdf.drop_duplicates('title', inplace=True)
logger.info(f"- Number of tiles intersecting FP labels = {len(tiles_4326_fp_gdf)}")

if not EPT_SHPFILE or EPT_SHPFILE and aoi_bbox_contains == False:
# Keep only tiles intersecting labels
if FP_SHPFILE:
tiles_4326_aoi_gdf = pd.concat([tiles_4326_gt_gdf, tiles_fp_4326_gdf])
tiles_4326_aoi_gdf = pd.concat([tiles_4326_gt_gdf, tiles_4326_fp_gdf])
else:
tiles_4326_aoi_gdf = tiles_4326_gt_gdf.copy()

# Get all the tiles in one gdf
if EPT_SHPFILE and aoi_bbox.contains(labels_bbox) == False:
logger.info("- Add label tiles to empty AoI tiles")
if EPT_SHPFILE and aoi_bbox_contains == False:
logger.info("- Concatenate label tiles and empty AoI tiles")
tiles_4326_all_gdf = pd.concat([tiles_4326_aoi_gdf, empty_tiles_4326_aoi_gdf])
else:
tiles_4326_all_gdf = tiles_4326_aoi_gdf.copy()

# - Remove duplicated tiles
if nb_labels > 1:
tiles_4326_all_gdf.drop_duplicates(['title', 'year'] if 'year' in tiles_4326_all_gdf.keys() else 'title', inplace=True)
if 'year' in tiles_4326_all_gdf.keys():
tiles_4326_all_gdf['year'] = tiles_4326_all_gdf.year.astype(int)
tiles_4326_all_gdf.drop_duplicates(['title', 'year'], inplace=True)
else:
tiles_4326_all_gdf.drop_duplicates(['title'], inplace=True)

# - Remove useless columns, reset feature id and redefine it according to xyz format
logger.info('- Add tile IDs and reorganise data set')
@@ -289,7 +303,7 @@ def bbox(bounds):
tile_filepath = os.path.join(OUTPUT_DIR, tile_filename)
tiles_4326_all_gdf.to_file(tile_filepath, driver='GeoJSON')
written_files.append(tile_filepath)
logger.success(f"{DONE_MSG} A file was written: {tile_filepath}")
logger.success(f"{DONE_MSG} A file was written: {tile_filepath}")

print()
logger.info("The following files were written. Let's check them out!")
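In `prepare_data.py`, the hard-coded 1945–2023 span is replaced by the `[year1, year2]` range taken from `empty_tiles_year`, from which a year is drawn for each empty tile. A sketch of the sampling step with illustrative values — note that in NumPy both `np.random.randint` and `Generator.integers` exclude `high` by default, so `endpoint=True` (or `high=year2 + 1`) is needed if the upper year should be drawable:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # seeded for reproducibility

ept_year = [2015, 2020]  # [year1, year2] range, as in the config file
n_tiles = 8              # number of empty tiles to assign a year to

# endpoint=True makes the upper bound inclusive, so 2020 can be drawn too
years = rng.integers(low=ept_year[0], high=ept_year[1], size=n_tiles, endpoint=True)
print(years.min() >= 2015, years.max() <= 2020)  # True True
```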
17 changes: 16 additions & 1 deletion examples/road-surface-classification/README.md
@@ -12,16 +12,31 @@ It is made of the following assets:
- an excel file with the road parameters,
- a data preparation script (`prepare_data.py`) producing the files to be used as input to the `generate_tilesets.py` script.

After performing the installation described in the root folder of the project, the end-to-end workflow can be run by issuing the following list of commands, straight from this folder:
Installation can be carried out by following the instructions in the main readme file. When using docker, the container must be launched before running the workflow:

```bash
$ sudo chown -R 65534:65534 examples
$ docker compose run --rm -it stdl-objdet
```

The end-to-end workflow can be run by issuing the following list of commands:

```bash
$ cd examples/road-surface-classification
$ python prepare_data.py config_rs.yaml
$ stdl-objdet generate_tilesets config_rs.yaml
$ stdl-objdet train_model config_rs.yaml
$ stdl-objdet make_detections config_rs.yaml
$ stdl-objdet assess_detections config_rs.yaml
```

The docker container is exited and permissions restored with:

```bash
$ exit
$ sudo chmod -R a+w examples
```

This example is made up from a subset of the data used in the proj-roadsurf project. For more information about this project, you can consult [the associated repository](https://github.com/swiss-territorial-data-lab/proj-roadsurf) and [its full documentation](https://tech.stdl.ch/PROJ-ROADSURF/). <br>
The original project does not use this assessment script but has its own.

8 changes: 3 additions & 5 deletions examples/road-surface-classification/config_rs.yaml
@@ -31,10 +31,6 @@ generate_tilesets.py:
year: 2018
location: https://wmts.geo.admin.ch/1.0.0/ch.swisstopo.swissimage-product/default/{year}/3857/{z}/{x}/{y}.jpeg
srs: "EPSG:3857"
# empty_tiles: # add empty tiles to datasets
# tiles_frac: 0.5 # fraction (relative to the number of tiles intersecting labels) of empty tiles to add
# frac_trn: 0.75 # fraction of empty tiles to add to the trn dataset, then the remaining tiles will be split in 2 and added to tst and val datasets
# keep_oth_tiles: False # keep tiles in oth dataset not intersecting oth labels
output_folder: .
tile_size: 256 # per side, in pixels
overwrite: False
@@ -74,19 +70,21 @@ make_detections.py:
oth: COCO_oth.json
detectron2_config_file: ../detectron2_config_3bands.yaml # path relative to the working_folder
model_weights:
pth_file: logs/model_0010999.pth
pth_file: logs/model_0011499.pth
image_metadata_json: img_metadata.json
rdp_simplification: # rdp = Ramer-Douglas-Peucker
enabled: true
epsilon: 0.75 # cf. https://rdp.readthedocs.io/en/latest/
score_lower_threshold: 0.05
remove_det_overlap: False # if several detections overlap (IoU > 0.5), only the one with the highest confidence score is retained

assess_detections.py:
working_directory: outputs_RS
datasets:
ground_truth_labels: json_inputs/ground_truth_labels.geojson
other_labels: json_inputs/other_labels.geojson
split_aoi_tiles: split_aoi_tiles.geojson
categories: category_ids.json
detections:
trn: trn_detections_at_0dot05_threshold.gpkg
val: val_detections_at_0dot05_threshold.gpkg
6 changes: 2 additions & 4 deletions examples/swimming-pool-detection/GE/config_GE.yaml
@@ -19,10 +19,6 @@ generate_tilesets.py:
type: MIL # supported values: 1. MIL = Map Image Layer 2. WMS 3. XYZ 4. FOLDER
location: https://raster.sitg.ge.ch/arcgis/rest/services/ORTHOPHOTOS_2018_EPSG2056/MapServer
srs: "EPSG:3857"
# empty_tiles: # add empty tiles to datasets
# tiles_frac: 0.5 # fraction (relative to the number of tiles intersecting labels) of empty tiles to add
# frac_trn: 0.75 # fraction of empty tiles to add to the trn dataset, then the remaining tiles will be split in 2 and added to tst and val datasets
# keep_oth_tiles: True # keep tiles in oth dataset not intersecting oth labels
output_folder: output_GE
tile_size: 256 # per side, in pixels
overwrite: False
@@ -67,13 +63,15 @@ make_detections.py:
enabled: true
epsilon: 0.5 # cf. https://rdp.readthedocs.io/en/latest/
score_lower_threshold: 0.05
remove_det_overlap: False # if several detections overlap (IoU > 0.5), only the one with the highest confidence score is retained

assess_detections.py:
working_directory: output_GE
datasets:
ground_truth_labels: ground_truth_labels.geojson
other_labels: other_labels.geojson
split_aoi_tiles: split_aoi_tiles.geojson # aoi = Area of Interest
categories: category_ids.json
detections:
trn: trn_detections_at_0dot05_threshold.gpkg
val: val_detections_at_0dot05_threshold.gpkg
2 changes: 2 additions & 0 deletions examples/swimming-pool-detection/NE/config_NE.yaml
@@ -68,13 +68,15 @@ make_detections.py:
enabled: true
epsilon: 0.5 # cf. https://rdp.readthedocs.io/en/latest/
score_lower_threshold: 0.05
remove_det_overlap: False # if several detections overlap (IoU > 0.5), only the one with the highest confidence score is retained

assess_detections.py:
working_directory: output_NE
datasets:
ground_truth_labels: ground_truth_labels.geojson
other_labels: other_labels.geojson
split_aoi_tiles: split_aoi_tiles.geojson # aoi = Area of Interest
categories: category_ids.json
detections:
trn: trn_detections_at_0dot05_threshold.gpkg
val: val_detections_at_0dot05_threshold.gpkg
2 changes: 2 additions & 0 deletions helpers/FOLDER.py
@@ -66,6 +66,7 @@ def get_image_to_folder(basepath, filename, bbox, year, save_metadata=False, ove
basepath (path): path to the original image tile
filename (path): path to the image tile for the object detector
bbox (tuple): coordinates of the bounding box
year (int): year of the image tile
save_metadata (bool, optional): Whether to save the metadata in a json file. Defaults to False.
overwrite (bool, optional): Whether to overwrite the files already existing in the target folder or to skip them. Defaults to True.

@@ -103,6 +104,7 @@ def get_image_to_folder(basepath, filename, bbox, year, save_metadata=False, ove
# we can mimick ESRI MapImageLayer's metadata,
# at least the section that we need
image_metadata = {
**({'year': year} if year else {}),
"width": width,
"height": height,
"extent": {
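Both connectors now prepend a `year` entry to the image metadata only when one is set, via conditional dict unpacking: `**({'year': year} if year else {})` merges a one-entry dict when `year` is truthy and nothing otherwise. A standalone illustration of the pattern (the helper name is illustrative):

```python
def build_metadata(width, height, year=None):
    # The unpacked dict is empty when `year` is falsy, so the key is
    # simply absent instead of being present with a None value.
    return {
        **({'year': year} if year else {}),
        "width": width,
        "height": height,
    }


print(build_metadata(256, 256, year=2020))  # {'year': 2020, 'width': 256, 'height': 256}
print(build_metadata(256, 256))             # {'width': 256, 'height': 256}
```

Keeping the key absent rather than `None` means downstream consumers of the metadata JSON can simply test for the key's presence.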
17 changes: 15 additions & 2 deletions helpers/XYZ.py
@@ -43,8 +43,20 @@ def detect_img_format(url):


def get_geotiff(xyz_url, bbox, year, xyz, filename, save_metadata=False, overwrite=True):
"""
...
""" Download the image tile from the XYZ service and build its metadata

Args:
xyz_url (str): URL template of the XYZ tile service
bbox (tuple): coordinates of the bounding box
year (int): year of the image tile
xyz (tuple): x, y, z coordinates of the tile
filename (path): path to the image tile for the object detector
save_metadata (bool, optional): Whether to save the metadata in a json file. Defaults to False.
overwrite (bool, optional): Whether to overwrite the files already existing in the target folder or to skip them. Defaults to True.

Returns:
dictionary:
- key: name of the geotiff file
- value: image metadata
"""

if not filename.endswith('.tif'):
@@ -87,6 +99,7 @@ def get_geotiff(xyz_url, bbox, year, xyz, filename, save_metadata=False, overwri
# we can mimick ESRI MapImageLayer's metadata,
# at least the section that we need
image_metadata = {
**({'year': year} if year else {}),
"width": width,
"height": height,
"extent": {
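The XYZ connector fills a URL template containing `{year}`, `{z}`, `{x}` and `{y}` placeholders, as shown in the example configs. The substitution itself boils down to `str.format` — the helper name and tile coordinates below are illustrative:

```python
def format_xyz_url(template, year, xyz):
    """Fill the year and x/y/z tile coordinates into an XYZ URL template."""
    x, y, z = xyz
    return template.format(year=year, z=z, x=x, y=y)


template = ("https://wmts.geo.admin.ch/1.0.0/ch.swisstopo.swissimage-product/"
            "default/{year}/3857/{z}/{x}/{y}.jpeg")
url = format_xyz_url(template, 2020, (34123, 22871, 16))
print(url)
# ...ends with /default/2020/3857/16/34123/22871.jpeg
```

Because the year is a path segment of the swisstopo WMTS layout, a per-tile year (e.g. drawn for empty tiles) simply selects the matching mosaic vintage.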