Update workflow user guide to use latest sciline interface #93

Open · wants to merge 2 commits into base: main
2 changes: 1 addition & 1 deletion docs/index.md
@@ -10,7 +10,7 @@
hidden:
---

-examples/index
+user-guide/index
api-reference/index
developer/index
about/index
2 changes: 1 addition & 1 deletion docs/examples/index.md → docs/user-guide/index.md
@@ -1,4 +1,4 @@
-# Examples
+# User Guide

```{toctree}
---
102 changes: 74 additions & 28 deletions docs/examples/workflow.ipynb → docs/user-guide/workflow.ipynb
@@ -23,21 +23,20 @@
"from ess.nmx.data import small_mcstas_3_sample\n",
"\n",
"from ess.nmx.types import *\n",
-"from ess.nmx.reduction import NMXData, NMXReducedData, merge_panels\n",
+"from ess.nmx.reduction import NMXReducedData, merge_panels\n",
"from ess.nmx.nexus import export_as_nexus\n",
"\n",
-"wf = McStasWorkflow()\n",
-"# Replace with the path to your own file\n",
-"wf[FilePath] = small_mcstas_3_sample()\n",
-"wf[MaximumProbability] = 10000\n",
-"wf[TimeBinSteps] = 50"
+"base_wf = McStasWorkflow() # Instantiate the base workflow\n",
+"base_wf[FilePath] = small_mcstas_3_sample() # Replace with the path to your own file\n",
+"base_wf[MaximumProbability] = 10_000\n",
+"base_wf[TimeBinSteps] = 50"
]
},
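As context for reviewers unfamiliar with sciline: a workflow like `McStasWorkflow()` resolves a requested type by calling the provider whose return annotation matches, recursively resolving each annotated parameter, and `wf[FilePath] = ...` registers a fixed parameter value. The following is a minimal, dependency-free sketch of that resolution idea with toy, hypothetical types and providers — not sciline's actual internals:

```python
from typing import NewType, get_type_hints

# Hypothetical toy types standing in for FilePath, MaximumProbability, etc.
FilePath = NewType("FilePath", str)
RawCounts = NewType("RawCounts", list)
ReducedData = NewType("ReducedData", int)


def load(path: FilePath) -> RawCounts:
    # Pretend to read events from `path`.
    return RawCounts([1, 2, 3])


def reduce_data(counts: RawCounts) -> ReducedData:
    return ReducedData(sum(counts))


def compute(target, providers, params):
    """Resolve `target` from a parameter, or by calling its provider
    after recursively resolving the provider's annotated inputs."""
    if target in params:
        return params[target]
    provider = providers[target]
    hints = get_type_hints(provider)
    hints.pop("return", None)
    kwargs = {name: compute(dep, providers, params) for name, dep in hints.items()}
    return provider(**kwargs)


# Index providers by their return type, mirroring how sciline wires a graph.
providers = {fn.__annotations__["return"]: fn for fn in (load, reduce_data)}
result = compute(ReducedData, providers, {FilePath: FilePath("run.h5")})
```

Requesting `ReducedData` pulls in `RawCounts`, which pulls in the `FilePath` parameter — the same dependency-driven evaluation the notebook relies on when it later computes `NMXReducedData`.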
{
"cell_type": "markdown",
"metadata": {},
"source": [
-"To see what the workflow can produce, display it:"
+"### All Types and Providers of the Workflow"
]
},
{
@@ -46,13 +45,31 @@
"metadata": {},
"outputs": [],
"source": [
-"wf"
+"base_wf"
]
},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Graph of the Workflow."
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": [
+"base_wf.visualize(NMXReducedData, graph_attr={\"rankdir\": \"LR\"})"
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Detector Index Mapping\n",
+"\n",
+"We want to reduce all three panels, so we map the relevant part of the workflow over a list of the three panels:"
+]
},
@@ -64,18 +81,15 @@
"source": [
"# DetectorIndex selects what detector panels to include in the run\n",
"# in this case we select all three panels.\n",
-"wf[NMXReducedData] = (\n",
-" wf[NMXReducedData]\n",
-" .map({DetectorIndex: sc.arange('panel', 3, unit=None)})\n",
-" .reduce(index=\"panel\", func=merge_panels)\n",
-")"
+"detector_panel_ids = {DetectorIndex: sc.arange('panel', 3, unit=None)}\n",
+"pipeline = base_wf.map(detector_panel_ids)"
]
},
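The `map` step above clones part of the task graph once per detector panel. A toy analogue of that pattern, with a hypothetical per-panel function standing in for the real `NMXReducedData` branch (this is not sciline's implementation, just the shape of the idea):

```python
# Toy analogue of Pipeline.map: run the same parameterised computation once
# per index value (here the "panel" index), producing one branch per panel.
def map_over(compute_one, index_values):
    """Return {index: result} with one independent branch per index value."""
    return {i: compute_one(i) for i in index_values}


def reduce_one_panel(panel_index):
    # Hypothetical per-panel reduction standing in for the mapped sub-workflow.
    return {"panel": panel_index, "counts": 100 * (panel_index + 1)}


mapped = map_over(reduce_one_panel, range(3))
```

Each branch carries its own `DetectorIndex`-like value, which is why the mapped graph in the next cell shows three parallel copies of the reduction chain.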
{
"cell_type": "markdown",
"metadata": {},
"source": [
-"## Build Workflow"
+"### Visualize Mapped Graph"
]
},
{
@@ -84,14 +98,20 @@
"metadata": {},
"outputs": [],
"source": [
-"wf.visualize(NMXReducedData, graph_attr={\"rankdir\": \"TD\"}, compact=True)"
+"import sciline as sl\n",
+"\n",
+"pipeline.visualize(\n",
+" sl.get_mapped_node_names(pipeline, NMXReducedData),\n",
+" graph_attr={\"rankdir\": \"LR\"},\n",
+" compact=True,\n",
+")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
-"## Compute Desired Types"
+"### Reduce Mapped Results"
]
},
{
@@ -100,12 +120,18 @@
"metadata": {},
"outputs": [],
"source": [
-"from cyclebane.graph import NodeName, IndexValues\n",
+"from typing import NewType\n",
"\n",
-"# Event data grouped by pixel id for each of the selected detectors\n",
-"targets = [NodeName(NMXData, IndexValues((\"panel\",), (i,))) for i in range(3)]\n",
-"dg = merge_panels(*wf.compute(targets).values())\n",
-"dg"
+"MergedNMXReducedData = NewType('MergedNMXReducedData', sc.DataGroup)\n",
+"graph = pipeline.reduce(func=merge_panels, name=MergedNMXReducedData)\n",
+"graph.get(MergedNMXReducedData).visualize(compact=True, graph_attr={\"rankdir\": \"LR\"})"
]
},
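The `reduce` step is the counterpart of `map`: it collects every mapped branch and feeds them to a single merge function, the way `merge_panels` combines the per-panel results above. A toy analogue under the same hypothetical names as before (not the real sciline/cyclebane machinery):

```python
# Toy analogue of Pipeline.reduce: combine all mapped branches with one
# function, mirroring how merge_panels merges per-panel reduced data.
def reduce_mapped(mapped, func):
    """Apply `func` to the results of every mapped branch."""
    return func(*mapped.values())


def merge(*panels):
    # Hypothetical merge: collect per-panel counts into one container.
    return {"counts": [p["counts"] for p in panels]}


branches = {i: {"panel": i, "counts": 10 * i} for i in range(3)}
merged = reduce_mapped(branches, merge)
```

The reduced node is then addressed by the new name (here the analogue of `MergedNMXReducedData`), which is why the notebook computes `graph.compute(MergedNMXReducedData)` rather than the original mapped type.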
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"## Compute Desired Types"
+]
+},
{
@@ -114,8 +140,7 @@
"metadata": {},
"outputs": [],
"source": [
-"# Data from all selected detectors binned by panel, pixel and timeslice\n",
-"binned_dg = wf.compute(NMXReducedData)\n",
+"binned_dg = graph.compute(MergedNMXReducedData)\n",
"binned_dg"
]
},
@@ -139,6 +164,27 @@
"export_as_nexus(binned_dg, \"test.nxs\")"
]
},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Merge Intermediate Mapped Nodes"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": [
+"from ess.nmx.reduction import NMXData\n",
+"\n",
+"dg = merge_panels(\n",
+" *pipeline.compute(sl.get_mapped_node_names(pipeline, NMXData)).values()\n",
+") # We need ``NMXData`` to use instrument view\n",
+"dg"
+]
+},
{
"cell_type": "markdown",
"metadata": {},
@@ -162,17 +208,17 @@
"source": [
"import scippneutron as scn\n",
"\n",
-"da = dg[\"weights\"]\n",
-"da.coords[\"position\"] = dg[\"position\"]\n",
+"da = binned_dg['counts'].sum('t')\n",
+"da.coords[\"position\"] = binned_dg[\"position\"]\n",
"# Plot one out of 100 pixels to reduce size of docs output\n",
-"view = scn.instrument_view(da[\"id\", ::100].hist(), pixel_size=0.0075)\n",
+"view = scn.instrument_view(da[\"id\", ::100], pixel_size=0.0075)\n",
"view"
]
}
],
"metadata": {
"kernelspec": {
-"display_name": "Python 3 (ipykernel)",
+"display_name": "nmx-dev-310",
"language": "python",
"name": "python3"
},
@@ -186,7 +232,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.10.12"
+"version": "3.10.13"
}
},
"nbformat": 4,