Merge pull request hed-standard#329 from hed-standard/develop
Minor updates to Jupyter notebooks
VisLab authored Oct 24, 2023
2 parents 84167cf + c590598 commit 579b55a
Showing 12 changed files with 373 additions and 238 deletions.
18 changes: 11 additions & 7 deletions docs/source/FileRemodelingTools.md
Original file line number Diff line number Diff line change
@@ -236,9 +236,9 @@ The programs use a standard command-line argument list for specifying input as s
````{table} Summary of command-line arguments for the remodeling programs.
| Script name | Arguments | Purpose |
| ----------- | -------- | ------- |
|*run_remodel_backup* | *data_dir*<br/>*-e -\\-extensions*<br/>*-f -\\-file-suffix*<br/>*-n -\\-backup-name*<br/>*-t -\\-task-names*<br/>*-v -\\-verbose*<br/>*-w -\\-work-dir*<br/>*-x -\\-exclude-dirs*| Create a backup of event files. |
|*run_remodel* | *data_dir*<br/>*model_path*<br/>*-b -\\-bids-format*<br/>*-e -\\-extensions*<br/>*-f -\\-file-suffix*<br/>*-i -\\-individual-summaries*<br/>*-j -\\-json-sidecar*<br/>*-n -\\-backup-name*<br/>*-nb -\\-no-backup*<br/>*-ns -\\-no-summaries*<br/>*-nu -\\-no-update*<br/>*-r -\\-hed-version*<br/>*-s -\\-save-formats*<br/>*-t -\\-task-names*<br/>*-v -\\-verbose*<br/>*-w -\\-work-dir*<br/>*-x -\\-exclude-dirs* | Restructure or summarize the event files. |
|*run_remodel_restore* | *data_dir*<br/>*-n -\\-backup-name*<br/>*-t -\\-task-names*<br/>*-v -\\-verbose*<br/>*-w -\\-work-dir*<br/> | Restore a backup of event files. |
|*run_remodel_backup* | *data_dir*<br/>*-bd -\\-backup-dir*<br/>*-bn -\\-backup-name*<br/>*-e -\\-extensions*<br/>*-f -\\-file-suffix*<br/>*-t -\\-task-names*<br/>*-v -\\-verbose*<br/>*-x -\\-exclude-dirs*| Create a backup of event files. |
|*run_remodel* | *data_dir*<br/>*model_path*<br/>*-b -\\-bids-format*<br/>*-bd -\\-backup-dir*<br/>*-bn -\\-backup-name*<br/>*-e -\\-extensions*<br/>*-f -\\-file-suffix*<br/>*-i -\\-individual-summaries*<br/>*-j -\\-json-sidecar*<br/>*-nb -\\-no-backup*<br/>*-ns -\\-no-summaries*<br/>*-nu -\\-no-update*<br/>*-r -\\-hed-version*<br/>*-s -\\-save-formats*<br/>*-t -\\-task-names*<br/>*-v -\\-verbose*<br/>*-w -\\-work-dir*<br/>*-x -\\-exclude-dirs* | Restructure or summarize the event files. |
|*run_remodel_restore* | *data_dir*<br/>*-bd -\\-backup-dir*<br/>*-bn -\\-backup-name*<br/>*-t -\\-task-names*<br/>*-v -\\-verbose*<br/> | Restore a backup of event files. |
````
All the scripts have a required argument, which is the full path of the dataset root (*data_dir*).
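The shared option lists above can be illustrated with a small argument parser. This is a hypothetical Python sketch mirroring the documented flags and defaults, not the parser actually used by the remodeling scripts; the paths passed in at the bottom are made up:

```python
import argparse

def make_remodel_parser() -> argparse.ArgumentParser:
    # Sketch of the shared run_remodel argument list documented above
    # (illustrative only -- not the real hedtools implementation).
    parser = argparse.ArgumentParser(description="run_remodel sketch")
    parser.add_argument("data_dir", help="full path of the dataset root")
    parser.add_argument("model_path", help="full path of the JSON remodel file")
    parser.add_argument("-b", "--bids-format", action="store_true",
                        help="dataset is in BIDS format with sidecars")
    parser.add_argument("-bd", "--backup-dir", default="",
                        help="directory holding the backups")
    parser.add_argument("-bn", "--backup-name", default="default_back",
                        help="name of the backup used for remodeling")
    parser.add_argument("-e", "--extensions", nargs="*", default=[".tsv"])
    parser.add_argument("-f", "--file-suffix", default="events")
    parser.add_argument("-nb", "--no-backup", action="store_true")
    parser.add_argument("-t", "--task-names", nargs="*", default=[])
    parser.add_argument("-x", "--exclude-dirs", nargs="*", default=[])
    return parser

# Hypothetical invocation: dataset root, remodel file, BIDS flag, excluded dirs.
args = make_remodel_parser().parse_args(
    ["/data/ds_root", "/data/remodel.json", "-b", "-x", "derivatives", "stimuli"])
print(args.bids_format, args.backup_name, args.exclude_dirs)
```

Note that both the short (`-bn`) and long (`--backup-name`) forms parse to the same destination, which is why the documentation lists them as interchangeable.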
@@ -278,6 +278,13 @@ Users are free to use either form.
`-b`, `--bids-format`
> If this flag is present, the dataset is in BIDS format with sidecars. Tabular files and their associated sidecars are located using BIDS naming.
`-bd`, `--backup-dir`
> The path to the directory holding the backups (default: `[data_root]/derivatives/remodel/backups`).
> Use the `-nb` option if you wish to omit the backup (in `run_remodel`).
`-bn`, `--backup-name`
> The name of the backup used for the remodeling (default: `default_back`).
`-e`, `--extensions`
> This option is followed by a list of file extension(s) of the data files to process.
> The default is `.tsv`. Comma-separated tabular files are not permitted.
@@ -298,9 +305,6 @@ Users are free to use either form.
> This option is followed by the full path of the JSON sidecar with HED annotations to be
> applied during the processing of HED-related remodeling operations.
`-n`, `--backup-name`
> The name of the backup used for the remodeling (default: `default_back`).
`-nb`, `--no-backup`
> If present, no backup is used; rather, operations are performed directly on the files.
@@ -346,7 +350,7 @@ Users are free to use either form.
> are printed to standard output.
`-w`, `--work-dir`
> The path to the remodeling work root directory --both for backups and summaries (default: `[data_root]/derivatives/remodel`).
> The path to the remodeling work root directory for summaries (default: `[data_root]/derivatives/remodel`).
> Use the `-nb` option if you wish to omit the backup (in `run_remodel`).
`-x`, `--exclude-dirs`
116 changes: 44 additions & 72 deletions hedcode/jupyter_notebooks/bids/find_event_combinations.ipynb
@@ -33,77 +33,46 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": 3,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" key_counts event_code cond_code event_type focus_modality \\\n",
"0 96 1 1 hear_word auditory \n",
"1 96 1 2 hear_word visual \n",
"2 288 1 3 hear_word auditory \n",
"3 96 2 1 look_word auditory \n",
"4 96 2 2 look_word visual \n",
"5 287 2 3 look_word visual \n",
"6 192 3 1 high_tone auditory \n",
"7 192 3 2 high_tone visual \n",
"8 192 4 1 light_bar auditory \n",
"9 192 4 2 light_bar visual \n",
"10 767 5 1 low_tone auditory \n",
"11 767 5 2 low_tone visual \n",
"12 766 6 1 dark_bar auditory \n",
"13 767 6 2 dark_bar visual \n",
"14 385 7 3 high_tone auditory \n",
"15 384 8 3 light_bar visual \n",
"16 191 9 3 high_tone visual \n",
"17 192 10 3 light_bar auditory \n",
"18 1531 11 3 low_tone auditory \n",
"19 1525 12 3 dark_bar visual \n",
"20 771 13 3 low_tone visual \n",
"21 774 14 3 dark_bar auditory \n",
"22 195 201 1 button_press auditory \n",
"23 191 201 2 button_press visual \n",
"24 390 201 3 button_press auditory \n",
"25 381 201 3 button_press visual \n",
"26 5 202 1 pause_recording auditory \n",
"27 6 202 2 pause_recording visual \n",
"28 6 202 3 pause_recording auditory \n",
"29 1 202 3 pause_recording nan \n",
"30 8 202 3 pause_recording visual \n",
"\n",
" attention_status task_role condition \n",
"0 unattended cue_auditory attend_auditory \n",
"1 unattended cue_visual attend_visual \n",
"2 attended cue_auditory shift_attention \n",
"3 unattended cue_auditory attend_auditory \n",
"4 unattended cue_visual attend_visual \n",
"5 attended cue_visual shift_attention \n",
"6 attended infrequent_stimulus attend_auditory \n",
"7 unattended infrequent_stimulus attend_visual \n",
"8 unattended infrequent_stimulus attend_auditory \n",
"9 attended infrequent_stimulus attend_visual \n",
"10 attended frequent_stimulus attend_auditory \n",
"11 unattended frequent_stimulus attend_visual \n",
"12 unattended frequent_stimulus attend_auditory \n",
"13 attended frequent_stimulus attend_visual \n",
"14 attended infrequent_stimulus shift_attention \n",
"15 attended infrequent_stimulus shift_attention \n",
"16 unattended infrequent_stimulus shift_attention \n",
"17 unattended infrequent_stimulus shift_attention \n",
"18 attended frequent_stimulus shift_attention \n",
"19 attended frequent_stimulus shift_attention \n",
"20 unattended frequent_stimulus shift_attention \n",
"21 unattended frequent_stimulus shift_attention \n",
"22 nan target_detected attend_auditory \n",
"23 nan target_detected attend_visual \n",
"24 nan target_detected shift_attention \n",
"25 nan target_detected shift_attention \n",
"26 nan nan attend_auditory \n",
"27 nan nan attend_visual \n",
"28 nan nan shift_attention \n",
"29 nan nan shift_attention \n",
"30 nan nan shift_attention \n"
"sub-002_task-FaceRecognition_events.tsv\n",
"sub-003_task-FaceRecognition_events.tsv\n",
"sub-004_task-FaceRecognition_events.tsv\n",
"sub-005_task-FaceRecognition_events.tsv\n",
"sub-006_task-FaceRecognition_events.tsv\n",
"sub-007_task-FaceRecognition_events.tsv\n",
"sub-008_task-FaceRecognition_events.tsv\n",
"sub-009_task-FaceRecognition_events.tsv\n",
"sub-010_task-FaceRecognition_events.tsv\n",
"sub-011_task-FaceRecognition_events.tsv\n",
"sub-012_task-FaceRecognition_events.tsv\n",
"sub-013_task-FaceRecognition_events.tsv\n",
"sub-014_task-FaceRecognition_events.tsv\n",
"sub-015_task-FaceRecognition_events.tsv\n",
"sub-016_task-FaceRecognition_events.tsv\n",
"sub-017_task-FaceRecognition_events.tsv\n",
"sub-018_task-FaceRecognition_events.tsv\n",
"sub-019_task-FaceRecognition_events.tsv\n",
"The total count of the keys is:31448\n",
" key_counts trial_type value\n",
"0 90 boundary 0\n",
"1 2700 famous_new 5\n",
"2 1313 famous_second_early 6\n",
"3 1291 famous_second_late 7\n",
"4 3532 left_nonsym 256\n",
"5 3381 left_sym 256\n",
"6 3616 right_nonsym 4096\n",
"7 4900 right_sym 4096\n",
"8 2700 scrambled_new 17\n",
"9 1271 scrambled_second_early 18\n",
"10 1334 scrambled_second_late 19\n",
"11 2700 unfamiliar_new 13\n",
"12 1304 unfamiliar_second_early 14\n",
"13 1316 unfamiliar_second_late 15\n"
]
}
],
@@ -114,22 +83,25 @@
"from hed.tools.util.io_util import get_file_list\n",
"\n",
"# Variables to set for the specific dataset\n",
"dataset_root_path = os.path.realpath('../../../datasets/eeg_ds002893s_hed_attention_shift')\n",
"data_root = 'T:/summaryTests/ds002718-download'\n",
"output_path = ''\n",
"exclude_dirs = ['stimuli']\n",
"exclude_dirs = ['stimuli', 'derivatives', 'code', 'sourcecode']\n",
"\n",
"# Construct the key map\n",
"key_columns = [ \"event_code\", \"cond_code\", \"event_type\", \"focus_modality\", \"attention_status\", \"task_role\", \"condition\"]\n",
"key_columns = [ \"trial_type\", \"value\"]\n",
"key_map = KeyMap(key_columns)\n",
"\n",
"# Construct the unique combinations\n",
"event_files = get_file_list(dataset_root_path, extensions=[\".tsv\"], name_suffix=\"_events\", exclude_dirs=exclude_dirs)\n",
"event_files = get_file_list(data_root, extensions=[\".tsv\"], name_suffix=\"_events\", exclude_dirs=exclude_dirs)\n",
"for event_file in event_files:\n",
" print(f\"{os.path.basename(event_file)}\")\n",
" df = get_new_dataframe(event_file)\n",
" key_map.update(df)\n",
"\n",
"key_map.resort()\n",
"template = key_map.make_template()\n",
"key_counts_sum = template['key_counts'].sum()\n",
"print(f\"The total count of the keys is:{key_counts_sum}\")\n",
"if output_path:\n",
" template.to_csv(output_path, sep='\\t', index=False, header=True)\n",
"else:\n",
Expand All @@ -138,8 +110,8 @@
"metadata": {
"collapsed": false,
"ExecuteTime": {
"end_time": "2023-09-03T19:43:55.968067600Z",
"start_time": "2023-09-03T19:43:48.336417200Z"
"end_time": "2023-10-24T20:08:40.958637400Z",
"start_time": "2023-10-24T20:08:24.603887900Z"
}
}
}
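The KeyMap logic in the notebook above boils down to counting unique combinations of the key columns across all of the `_events.tsv` files. A dependency-free sketch of that idea follows; the event rows and counts here are made up for illustration and are not the dataset's real values:

```python
from collections import Counter

# Toy event rows standing in for the concatenated _events.tsv files
# (hypothetical values, not the ds002718 data).
events = [
    {"trial_type": "famous_new", "value": 5},
    {"trial_type": "famous_new", "value": 5},
    {"trial_type": "boundary", "value": 0},
    {"trial_type": "scrambled_new", "value": 17},
]

key_columns = ["trial_type", "value"]

# Count each unique combination of the key columns,
# roughly what KeyMap.update does for each file.
combo_counts = Counter(tuple(row[col] for col in key_columns) for row in events)

# Emit rows sorted by key, mirroring key_map.resort() + make_template().
template = [{"key_counts": n, "trial_type": k[0], "value": k[1]}
            for k, n in sorted(combo_counts.items())]
total = sum(row["key_counts"] for row in template)
print(template)
print(f"The total count of the keys is: {total}")
```

In the real notebook the same counting is done with pandas DataFrames, but the output template has this shape: one row per unique key combination plus a `key_counts` column, whose sum is the total number of events processed.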
28 changes: 20 additions & 8 deletions hedcode/jupyter_notebooks/bids/validate_bids_dataset.ipynb
@@ -32,8 +32,18 @@
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"execution_count": 6,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using HEDTOOLS version: {'date': '2023-10-12T10:29:43-0500', 'dirty': True, 'error': None, 'full-revisionid': 'e9e63e39f6a99487c5e69ae96e653c4580045ec0', 'version': '0.3.0+164.ge9e63e3.dirty'}\n",
"HED Examples version: {'version': '0.2.0+173.g63d5080.dirty', 'full-revisionid': '63d50808c4c7e97bdfc89076cb19af9e13252057', 'dirty': True, 'error': None, 'date': '2023-10-23T11:09:31-0500'}\n",
"No HED validation errors\n"
]
}
],
"source": [
"from hed.errors import get_printable_issue_string\n",
"from hed.tools import BidsDataset\n",
@@ -45,10 +55,8 @@
"\n",
"## Set the dataset location and the check_for_warnings flag\n",
"check_for_warnings = False\n",
"#dataset_path = '../../../datasets/eeg_ds003645s_hed_column'\n",
"#dataset_path = '../../../datasets/eeg_ds004105s_hed'\n",
"dataset_path = 't:/summaryTests/ds004105-download'\n",
"outfile = 't:/ds004105_errors.txt'\n",
"dataset_path = 't:/summaryTests/ds002718-download'\n",
"outfile = 't:/ds002718_errors.txt'\n",
"## Validate the dataset\n",
"bids = BidsDataset(dataset_path)\n",
"issue_list = bids.validate(check_for_warnings=check_for_warnings)\n",
@@ -57,12 +65,16 @@
"else:\n",
" issue_str = \"No HED validation errors\"\n",
"print(issue_str)\n",
"if outfile:\n",
"if outfile and issue_list:\n",
" with open(outfile, 'w') as fp:\n",
" fp.write(issue_str)\n"
],
"metadata": {
"collapsed": false
"collapsed": false,
"ExecuteTime": {
"end_time": "2023-10-24T18:51:33.095713100Z",
"start_time": "2023-10-24T18:51:31.396158Z"
}
}
}
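The change to `if outfile and issue_list:` in this notebook means the error file is written only when there are actual issues, rather than always overwriting it with the no-errors message. That pattern can be sketched without the `hed` package; the issue dictionaries and their fields below are hypothetical:

```python
import os
import tempfile

def report_issues(issue_list, outfile=""):
    # Mimic the notebook's flow: build a printable string, print it,
    # and write it to outfile only when there are actual issues.
    if issue_list:
        issue_str = "\n".join(f"{i['code']}: {i['message']}" for i in issue_list)
    else:
        issue_str = "No HED validation errors"
    print(issue_str)
    if outfile and issue_list:
        with open(outfile, "w") as fp:
            fp.write(issue_str)
    return issue_str

out = os.path.join(tempfile.gettempdir(), "demo_errors.txt")
report_issues([], outfile=out)  # prints the no-errors message, writes nothing
msg = report_issues([{"code": "TAG_INVALID", "message": "bad tag"}], outfile=out)
```

The guard keeps a clean validation run from leaving behind a file whose only content is the success message.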
],