
Commit 4dbd708

Add "Seed Phenotyping" to the project catalog (#675)
* Create entry for "Seed Phenotyping" in the project catalog
* Adding datasource for seed phenotyping
* Updating action to not run if from a fork

Co-authored-by: Isabel Fenton <[email protected]>
Co-authored-by: Isabel Fenton <[email protected]>
1 parent: 36ed9b2

5 files changed: +55 -1 lines changed

.github/workflows/frontend.yml
1 addition, 0 deletions

@@ -14,6 +14,7 @@ jobs:
           fetch-depth: 0

       - name: Update branch
+        if: github.event.pull_request.head.repo.full_name == github.repository
         run: |
           git config user.name "github-actions[bot]"
           git config user.email "41898282+github-actions[bot]@users.noreply.github.com"

src/scivision/catalog/data/datasources.json
26 additions, 0 deletions

@@ -373,6 +373,32 @@
         "marine-biology",
         "species-classification"
       ]
+    },
+    {
+      "tasks": [
+        "segmentation",
+        "object-detection"
+      ],
+      "labels_provided": true,
+      "domains": [
+        "plant-biology",
+        "agriculture",
+        "computer-vision"
+      ],
+      "institution": [
+        "The Alan Turing Institute",
+        "Aberystwyth University",
+        "National Plant Phenomics Centre",
+        "Rothamsted Research"
+      ],
+      "tags": [
+        "2D",
+        "3D",
+        "plant-phenotyping"
+      ],
+      "name": "Pixelflow Seed Demo Data",
+      "url": "https://zenodo.org/api/records/8355920/files-archive",
+      "description": "2D and 3D images and labels of oilseed rape (Brassica napus) seed pods for use with Pixelflow Seed Demo notebooks (https://github.com/scivision-gallery/pixelflow_seed_demo)"
     }
   ]
 }
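
The "url" in the new datasource entry is Zenodo's files-archive endpoint, which returns a single zip of every file in record 8355920. A minimal sketch of fetching and unpacking that archive is below; the use of the requests package, the timeout value, and the output directory name are illustrative assumptions, not part of this commit.

# Sketch: download and unpack the "Pixelflow Seed Demo Data" archive referenced above.
# Assumptions: the requests package is installed; the output directory name is arbitrary.
import io
import zipfile

import requests

ARCHIVE_URL = "https://zenodo.org/api/records/8355920/files-archive"  # from the datasource entry

response = requests.get(ARCHIVE_URL, timeout=600)
response.raise_for_status()

# Zenodo's files-archive endpoint serves one zip containing every file in the record.
with zipfile.ZipFile(io.BytesIO(response.content)) as archive:
    archive.extractall("pixelflow_seed_demo_data")
    print("\n".join(archive.namelist()))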

src/scivision/catalog/data/projects.json
28 additions, 1 deletion

@@ -105,7 +105,6 @@
       "header": "Coastal Vegetation Edge Detection",
       "description": "Edge detection of coastal vegetation from RGB satellite imagery",
       "page": "Recent advances in satellite imagery availability and spatial resolution are providing new opportunities for the rapid, cost-effective detection of a shoreline’s location and dynamics. [Rogers et al. (2021)](https://www.tandfonline.com/doi/abs/10.1080/01431161.2021.1897185?journalCode=tres20) advance coastal vegetation monitoring by developing `VEdge_detector`, a tool to extract the coastal vegetation line from remote-sensing imagery, training a very deep convolutional neural network (holistically nested edge detection) to predict sequential vegetation line locations on annual to decadal timescales. The `VEdge_Detector` model was trained using Planet 3 – 5 m spatial resolution imagery. It has also detected vegetation edges in Landsat and Copernicus Sentinel imagery, although performance is not guaranteed. The tool cannot detect the vegetation edge in aerial imagery.\n\n# Example notebook\nThere is a worked example of the VEdge_Detector model in action available at the [Scivision Gallery](https://github.com/scivision-gallery/coastalveg-edge-detection).\n\nIn this notebook, we demonstrate how scivision facilitates the discovery of the VEdge_detector model for differentiating between the coastal vegetation edge and other boundaries in remote sensing images. We pair the model with one of the matched data sources from the scivision data catalog, in this case a sample of satellite images (n=3) from different geographical areas (Suffolk, United Kingdom; Wilk auf Föhr, Germany; Varela, Guinea Bissau) provided within the VEdge model repository."
-
     },
     {
       "models": [
@@ -130,6 +129,34 @@
       "header": "Tree Crown Detection using detectreeRGB",
       "name": "treecrown-detectreeRGB",
       "page": "The delineation of individual trees in remote sensing images is a key task in forest analysis. As part of Sebastian Hickman's AI4ER MRes project, titled 'Detecting changes in tall tree height with machine learning, LiDAR, and RGB imagery', the authors propose the detectreeRGB model, an implementation of Mask R-CNN from [Detectron2](https://github.com/facebookresearch/detectron2) to perform tree crown delineation from RGB imagery.\n\nFurther details of the detectreeRGB model can be found in the [original source code repository](https://github.com/shmh40/detectreeRGB/).\n\n## Example notebook\nThere is a worked example of the detectreeRGB model in action available at the [Scivision Gallery](https://github.com/scivision-gallery/tree-crown-detection).\n\nIn this notebook, we demonstrate how scivision can assist in discovering a pretrained detectreeRGB model provided by Hickman et al. (2021), and then use it to delineate crowns from a sample drone RGB image dataset."
+    },
+    {
+      "models": [
+        "StarDist Seed"
+      ],
+      "datasources": [
+        "Pixelflow Seed Demo Data"
+      ],
+      "tasks": [
+        "object-detection",
+        "segmentation"
+      ],
+      "institution": [
+        "The Alan Turing Institute",
+        "Rothamsted Research",
+        "National Plant Phenomics Centre"
+      ],
+      "tags": [
+        "plant biology",
+        "plant-phenotyping",
+        "agriculture",
+        "2D",
+        "3D"
+      ],
+      "name": "Seed Phenotyping",
+      "header": "Automated Extraction of Seed Phenotype Data",
+      "description": "Automated Extraction of 2D and 3D Seed Phenotype Data using a fine-tuned StarDist model, Scivision, and Pixelflow",
+      "page": "A fine-tuned StarDist model was used to extract location, size and shape data for oilseed rape (*Brassica napus*) seeds detected and segmented in 2D light box and 3D X-ray computed tomography images, as described in ['Automated extraction of pod phenotype data from micro-computed tomography' - Corcoran et al. 2023](https://www.frontiersin.org/articles/10.3389/fpls.2023.1120182/full).\n\nThe fine-tuned StarDist model for automated detection and segmentation of seeds is available from the **Scivision model catalogue** under the name `StarDist Seed`.\n\nThe [Pixelflow](https://github.com/alan-turing-institute/pixelflow) tool was used to extract seed size and shape metrics from the outputs of the fine-tuned StarDist model. Jupyter notebooks demonstrating how to carry out this process for both 2D and 3D data are available from the [Scivision Gallery](https://github.com/scivision-gallery/pixelflow_seed_demo).\n\nExample 2D and 3D seed images and label masks used in these notebooks can be downloaded from [Zenodo](https://zenodo.org/record/8355920).\n\nR code used to run valve sorting in this notebook is available from the Scivision Gallery GitHub page; please see the file **seedpod_2D_valve_lowess_single.R**.\n"
     }
   ]
 }
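
The 2D workflow that the "Seed Phenotyping" page describes (detect and segment seeds with a fine-tuned StarDist model, then extract per-seed size and shape metrics) can be sketched roughly as follows. This is not the project's own code: the public "2D_versatile_fluo" StarDist weights stand in for the fine-tuned "StarDist Seed" model, skimage's regionprops_table stands in for Pixelflow, and "seed_pod_2d.tif" is a hypothetical file from the demo data.

# Rough sketch of the 2D pipeline described above, under stated assumptions:
# the public "2D_versatile_fluo" weights stand in for the fine-tuned
# "StarDist Seed" model, and skimage's regionprops_table stands in for Pixelflow.
import pandas as pd
from csbdeep.utils import normalize
from skimage.color import rgb2gray
from skimage.io import imread
from skimage.measure import regionprops_table
from stardist.models import StarDist2D

image = imread("seed_pod_2d.tif")  # hypothetical 2D light box image of seeds
if image.ndim == 3:
    image = rgb2gray(image)  # the 2D StarDist model used here expects one channel

# Detect and segment individual seeds; `labels` is an integer mask, one id per seed.
model = StarDist2D.from_pretrained("2D_versatile_fluo")
labels, _ = model.predict_instances(normalize(image))

# Extract per-seed location, size and shape metrics from the label mask.
metrics = pd.DataFrame(
    regionprops_table(
        labels,
        intensity_image=image,
        properties=("label", "centroid", "area", "eccentricity", "axis_major_length"),
    )
)
print(metrics.head())

In the project itself, the fine-tuned model is published through the Scivision model catalogue and the metrics are computed with Pixelflow, as the page text above describes; the notebooks in the pixelflow_seed_demo repository cover both the 2D and 3D cases.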
Binary files changed in this commit (8.57 KB): content not rendered.
