TobiasPitters committed
Commit aad6f22 · 1 Parent(s): de25271

Initial commit: BIDS neuroimaging viewer

Files changed (7)
  1. .gitattributes +1 -1
  2. Dockerfile +17 -0
  3. Makefile +5 -0
  4. README.md +57 -1
  5. data/train/train-00000-of-00010.parquet +3 -0
  6. main.py +484 -0
  7. requirements.txt +4 -0
.gitattributes CHANGED
@@ -15,7 +15,6 @@
  *.npz filter=lfs diff=lfs merge=lfs -text
  *.onnx filter=lfs diff=lfs merge=lfs -text
  *.ot filter=lfs diff=lfs merge=lfs -text
- *.parquet filter=lfs diff=lfs merge=lfs -text
  *.pb filter=lfs diff=lfs merge=lfs -text
  *.pickle filter=lfs diff=lfs merge=lfs -text
  *.pkl filter=lfs diff=lfs merge=lfs -text
@@ -33,3 +32,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ data/train/train-00000-of-00010.parquet filter=lfs diff=lfs merge=lfs -text
Dockerfile ADDED
@@ -0,0 +1,17 @@
+ FROM python:3.13-slim
+
+ WORKDIR /app
+
+ # Install git (needed for git+ URLs in requirements.txt)
+ RUN apt-get update && apt-get install -y git && rm -rf /var/lib/apt/lists/*
+
+ # Copy requirements first for better caching
+ COPY requirements.txt .
+ COPY data/ data/
+ RUN pip install --no-cache-dir -r requirements.txt
+
+ # Copy application files
+ COPY main.py .
+
+ EXPOSE 7680
+ CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "7680"]
Makefile ADDED
@@ -0,0 +1,5 @@
+ build_container:
+ 	docker build -t bids-neuroimaging .
+
+ run_container:
+ 	docker run --rm -p 7680:7680 bids-neuroimaging
README.md CHANGED
@@ -9,4 +9,60 @@ license: cc-by-4.0
  short_description: Visualize Aphasia Recovery Cohort (ARC) Dataset - Mini Samp
  ---
 
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # BIDS Neuroimaging Viewer
+
+ A FastAPI-based web application for visualizing neuroimaging data in BIDS format using NiiVue.
+
+ ## Installation
+
+ 1. Clone the repository:
+ ```bash
+ git clone <your-repo-url>
+ cd bids-neuroimaging
+ ```
+
+ 2. Create a virtual environment and install dependencies:
+ ```bash
+ python -m venv .venv
+ source .venv/bin/activate  # On Windows: .venv\Scripts\activate
+ pip install -r requirements.txt
+ ```
+
+ 3. Install the custom datasets fork with BIDS loader:
+ ```bash
+ pip install git+https://github.com/The-Obstacle-Is-The-Way/datasets.git@feat/bids-loader
+ ```
+
+ ## Running the Application
+
+ Start the FastAPI server with:
+
+ ```bash
+ fastapi dev main.py
+ ```
+
+ Or using uvicorn directly:
+
+ ```bash
+ uvicorn main:app --reload --host 0.0.0.0 --port 8000
+ ```
+
+ Then open your browser and navigate to:
+ - **Main viewer**: http://localhost:8000
+ - **Health check**: http://localhost:8000/health
+
+ ## Features
+
+ - Interactive 3D visualization of NIfTI files using NiiVue
+ - Multiple viewing modes: Multiplanar + 3D, and 3D Render Only
+ - Next button to iterate through dataset samples (implement the `/next` endpoint)
+
+ ## Development
+
+ The main application is in `main.py`. To implement the "Next" functionality:
+ 1. Modify the `/next` endpoint in `main.py` to load and return the next sample
+ 2. The endpoint should return a JSON with the base64-encoded NIfTI data
+
+ ## Configuration Reference
+
+ For Hugging Face Spaces deployment, check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
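
The README's Development section describes the `/next` contract that `main.py` (added below) implements. A minimal sketch of that contract, assuming an already-prepared `ds_iter` iterator and reusing the `status` / `data_url` / `metadata` field names from this commit:

```python
# Minimal sketch of the /next contract described in the README above.
# Field names match this commit's main.py; the empty ds_iter is a placeholder
# for the streaming-dataset iterator main.py builds, and the nifti column is
# assumed to be dict-shaped with raw bytes under "bytes".
import base64
from fastapi import FastAPI

app = FastAPI()
ds_iter = iter([])  # placeholder iterator

@app.get("/next")
async def next_sample():
    try:
        ex = next(ds_iter)
    except StopIteration:
        return {"status": "error", "message": "No more samples in dataset"}
    nifti_b64 = base64.b64encode(ex["nifti"]["bytes"]).decode("utf-8")
    return {
        "status": "success",
        "data_url": f"data:application/octet-stream;base64,{nifti_b64}",
        "metadata": {k: ex.get(k, "N/A")
                     for k in ("subject", "session", "datatype", "suffix", "task", "run")},
    }
```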
data/train/train-00000-of-00010.parquet ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:986b9d3b0614d4283a97301e8495ae5cf67968f28c6f85ce46e178fd5bfb1bd2
+ size 406159346
main.py ADDED
@@ -0,0 +1,484 @@
+ from fastapi import FastAPI
+ from fastapi.responses import HTMLResponse
+ from datasets import load_dataset
+ import base64
+ import asyncio
+ from contextlib import asynccontextmanager
+
+ ds_iter = None
+ first_sample = None
+ nifti_base_url = "data:application/octet-stream;base64,{}"
+ initial_metadata = {}
+ num_iterations = 0
+
+ async def load_dataset_async():
+     global ds_iter
+     dataset = load_dataset("TobiasPitters/ds004884-mini", streaming=True)
+     ds_iter = iter(dataset["train"])
+
+ @asynccontextmanager
+ async def lifespan(app: FastAPI):
+     global first_sample, initial_metadata
+
+     asyncio.create_task(load_dataset_async())
+
+     yield
+
+ app = FastAPI(title="BIDS Neuroimaging Viewer", lifespan=lifespan)
+
+ @app.get("/", response_class=HTMLResponse)
+ async def root():
+     """Main page with NiiVue viewer"""
+     html_content = f"""<!DOCTYPE html>
+ <html>
+ <head>
+     <meta charset="UTF-8">
+     <meta name="viewport" content="width=device-width, initial-scale=1.0">
+     <title>BIDS Neuroimaging Viewer</title>
+     <style>
+         * {{
+             margin: 0;
+             padding: 0;
+             box-sizing: border-box;
+         }}
+         body {{
+             font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
+             background: #1a1a1a;
+             color: #fff;
+         }}
+         .header {{
+             background: #2d2d2d;
+             padding: 20px;
+             border-bottom: 2px solid #3498db;
+         }}
+         .header h1 {{
+             margin: 0;
+             color: #3498db;
+         }}
+         .container {{
+             max-width: 1400px;
+             margin: 0 auto;
+             padding: 20px;
+         }}
+         .main-content {{
+             display: flex;
+             gap: 20px;
+         }}
+         .viewer-section {{
+             flex: 1;
+             min-width: 0;
+         }}
+         .metadata-section {{
+             width: 300px;
+             flex-shrink: 0;
+         }}
+         .metadata-card {{
+             background: #2d2d2d;
+             border-radius: 8px;
+             padding: 20px;
+             position: sticky;
+             top: 20px;
+         }}
+         .metadata-card h3 {{
+             margin: 0 0 15px 0;
+             color: #3498db;
+             font-size: 18px;
+         }}
+         .metadata-item {{
+             margin-bottom: 12px;
+             padding-bottom: 12px;
+             border-bottom: 1px solid #404040;
+         }}
+         .metadata-item:last-child {{
+             border-bottom: none;
+             margin-bottom: 0;
+             padding-bottom: 0;
+         }}
+         .metadata-label {{
+             font-size: 11px;
+             text-transform: uppercase;
+             color: #888;
+             letter-spacing: 0.5px;
+             margin-bottom: 4px;
+         }}
+         .metadata-value {{
+             font-size: 14px;
+             color: #fff;
+             word-break: break-word;
+         }}
+         .viewer-container {{
+             width: 100%;
+             height: 80vh;
+             min-height: 600px;
+             position: relative;
+             background: #000;
+             border-radius: 8px;
+             overflow: hidden;
+         }}
+         #niivue-canvas {{
+             width: 100%;
+             height: 100%;
+             display: block;
+         }}
+         #status {{
+             position: absolute;
+             top: 10px;
+             left: 10px;
+             background: rgba(0, 0, 0, 0.8);
+             color: #fff;
+             font: 14px monospace;
+             padding: 10px 15px;
+             border-radius: 5px;
+             z-index: 100;
+         }}
+         .info {{
+             margin-top: 20px;
+             padding: 15px;
+             background: #2d2d2d;
+             border-radius: 5px;
+         }}
+         .tabs {{
+             display: flex;
+             gap: 10px;
+             margin-bottom: 20px;
+             border-bottom: 2px solid #3498db;
+         }}
+         .tab {{
+             padding: 12px 24px;
+             background: #2d2d2d;
+             border: none;
+             color: #fff;
+             cursor: pointer;
+             font-size: 16px;
+             border-radius: 8px 8px 0 0;
+             transition: all 0.3s;
+         }}
+         .tab:hover {{
+             background: #3d3d3d;
+         }}
+         .tab.active {{
+             background: #3498db;
+             color: #fff;
+         }}
+         .controls {{
+             display: flex;
+             gap: 10px;
+             margin-top: 20px;
+         }}
+         .btn {{
+             padding: 12px 32px;
+             background: #3498db;
+             border: none;
+             color: #fff;
+             cursor: pointer;
+             font-size: 16px;
+             border-radius: 5px;
+             transition: all 0.3s;
+             font-weight: 600;
+         }}
+         .btn:hover {{
+             background: #2980b9;
+             transform: translateY(-2px);
+             box-shadow: 0 4px 8px rgba(52, 152, 219, 0.3);
+         }}
+         .btn:active {{
+             transform: translateY(0);
+         }}
+     </style>
+ </head>
+ <body>
+     <div class="header">
+         <h1>🧠 BIDS Neuroimaging Viewer</h1>
+         <p>Aphasia Recovery Cohort (ARC) Dataset - Mini Sample</p>
+     </div>
+
+     <div class="container">
+         <div class="main-content">
+             <div class="viewer-section">
+                 <div class="tabs">
+                     <button class="tab active" id="tab-multiplanar">Multiplanar + 3D</button>
+                     <button class="tab" id="tab-render3d">3D Render Only</button>
+                 </div>
+
+                 <div class="viewer-container">
+                     <canvas id="niivue-canvas"></canvas>
+                     <div id="status">Loading NiiVue...</div>
+                 </div>
+
+                 <div class="controls">
+                     <button class="btn" id="next-btn">Next</button>
+                 </div>
+
+                 <div class="info">
+                     <h3 id="info-title">Controls:</h3>
+                     <ul id="info-list">
+                         <li><strong>Slice views (3 panels):</strong> Click to move crosshair, drag to pan, scroll to zoom</li>
+                         <li><strong>3D view (bottom-right):</strong> Left-click drag to rotate, scroll to zoom</li>
+                     </ul>
+                 </div>
+             </div>
+
+             <div class="metadata-section">
+                 <div class="metadata-card">
+                     <h3>Sample Info</h3>
+                     <div id="metadata-content">
+                         <div class="metadata-item">
+                             <div class="metadata-label">Loading...</div>
+                         </div>
+                     </div>
+                 </div>
+             </div>
+         </div>
+     </div>
+
+     <script type="module">
+         let nv;
+
+         function updateMetadata(metadata) {{
+             const metadataContent = document.getElementById('metadata-content');
+             const fields = ['subject', 'session', 'datatype', 'suffix', 'task', 'run'];
+
+             metadataContent.innerHTML = fields.map(field => `
+                 <div class="metadata-item">
+                     <div class="metadata-label">${{field}}</div>
+                     <div class="metadata-value">${{metadata[field] || 'N/A'}}</div>
+                 </div>
+             `).join('');
+         }}
+
+         function switchView(mode, clickedElement) {{
+             if (!nv) return; // Not initialized yet
+
+             // Update tab styles
+             if (clickedElement) {{
+                 document.querySelectorAll('.tab').forEach(tab => tab.classList.remove('active'));
+                 clickedElement.classList.add('active');
+             }}
+
+             // Switch view mode
+             if (mode === 'multiplanar') {{
+                 nv.setSliceType(nv.sliceTypeMultiplanar);
+                 if (nv.setMultiplanarLayout) {{
+                     nv.setMultiplanarLayout(2);
+                 }}
+                 nv.opts.show3Dcrosshair = true;
+
+                 document.getElementById('info-list').innerHTML = `
+                     <li><strong>Slice views (3 panels):</strong> Click to move crosshair, drag to pan, scroll to zoom</li>
+                     <li><strong>3D view (bottom-right):</strong> Left-click drag to rotate, right-click drag to pan, scroll to zoom</li>
+                     <li><strong>Double click:</strong> Reset view</li>
+                 `;
+             }} else if (mode === 'render3d') {{
+                 nv.setSliceType(nv.sliceTypeRender);
+
+                 // Enable clip plane in 3D render
+                 nv.setClipPlane([0, 270, 0]); // Show clipping plane
+                 nv.opts.clipPlaneHotKey = 'c'; // Press 'c' to toggle clip plane
+
+                 document.getElementById('info-list').innerHTML = `
+                     <li><strong>Left click + drag:</strong> Rotate the 3D volume</li>
+                     <li><strong>Right click + drag:</strong> Pan/move</li>
+                     <li><strong>Mouse wheel:</strong> Adjust clip plane depth</li>
+                 `;
+             }}
+
+             nv.drawScene();
+         }}
+
+         // Setup tab event listeners
+         document.getElementById('tab-multiplanar').addEventListener('click', function() {{
+             switchView('multiplanar', this);
+         }});
+
+         document.getElementById('tab-render3d').addEventListener('click', function() {{
+             switchView('render3d', this);
+         }});
+
+         // Next button handler
+         document.getElementById('next-btn').addEventListener('click', async function() {{
+             if (!nv) {{
+                 alert('Viewer not initialized yet');
+                 return;
+             }}
+
+             const statusEl = document.getElementById('status');
+             try {{
+                 statusEl.style.display = 'block';
+                 statusEl.textContent = 'Loading next sample...';
+                 statusEl.style.background = 'rgba(0, 0, 0, 0.8)';
+
+                 const response = await fetch('/next');
+                 const data = await response.json();
+
+                 if (data.status === 'error') {{
+                     statusEl.textContent = 'Error: ' + data.message;
+                     statusEl.style.background = 'rgba(160, 0, 0, 0.8)';
+                     setTimeout(() => {{ statusEl.style.display = 'none'; }}, 3000);
+                     return;
+                 }}
+
+                 console.log('Loading next sample...');
+                 await nv.loadVolumes([{{
+                     url: data.data_url,
+                     name: 'volume.nii.gz'
+                 }}]);
+
+                 // Update metadata
+                 updateMetadata(data.metadata);
+
+                 nv.drawScene();
+                 statusEl.style.display = 'none';
+                 console.log('✓ Next sample loaded!');
+             }} catch (err) {{
+                 console.error('Error loading next sample:', err);
+                 statusEl.textContent = 'Error: ' + err.message;
+                 statusEl.style.background = 'rgba(160, 0, 0, 0.8)';
+                 setTimeout(() => {{ statusEl.style.display = 'none'; }}, 3000);
+             }}
+         }});
+
+         (async () => {{
+             const statusEl = document.getElementById('status');
+
+             try {{
+                 statusEl.textContent = 'Fetching NiiVue library...';
+
+                 const niivueModule = await import('https://unpkg.com/@niivue/[email protected]/dist/index.js');
+                 const Niivue = niivueModule.Niivue;
+
+                 statusEl.textContent = 'Initializing viewer...';
+                 nv = new Niivue({{
+                     logging: true,
+                     show3Dcrosshair: true,
+                     textHeight: 0.04
+                 }});
+
+                 await nv.attachTo('niivue-canvas');
+
+                 statusEl.textContent = 'Loading initial sample...';
+
+                 // Fetch initial sample and metadata
+                 const initResponse = await fetch('/initial');
+                 const initData = await initResponse.json();
+
+                 const volumes = [{{
+                     url: initData.data_url,
+                     name: 'volume.nii.gz'
+                 }}];
+
+                 await nv.loadVolumes(volumes);
+
+                 // Update metadata display
+                 updateMetadata(initData.metadata);
+
+                 // Start with multiplanar + 3D
+                 nv.setSliceType(nv.sliceTypeMultiplanar);
+                 if (nv.setMultiplanarLayout) {{
+                     nv.setMultiplanarLayout(2);
+                 }}
+                 nv.setRenderAzimuthElevation(120, 10);
+                 nv.opts.show3Dcrosshair = true;
+                 nv.opts.crosshairWidth = 2;
+
+                 statusEl.textContent = 'Rendering...';
+                 setTimeout(() => {{
+                     nv.updateGLVolume();
+                     nv.drawScene();
+                     statusEl.style.display = 'none';
+                     console.log('✓✓✓ Viewer ready!');
+                 }}, 300);
+
+             }} catch (err) {{
+                 console.error('ERROR:', err);
+                 console.error('Error message:', err.message);
+                 console.error('Error stack:', err.stack);
+                 statusEl.textContent = 'Error: ' + err.message;
+                 statusEl.style.background = 'rgba(160, 0, 0, 0.8)';
+             }}
+         }})();
+     </script>
+ </body>
+ </html>
+ """
+     return html_content
+
+ @app.get("/initial")
+ async def get_initial():
+     print("Loading dataset...")
+     # dataset = load_dataset("TobiasPitters/ds004884-mini", streaming=True)
+     global ds_iter
+     dataset = load_dataset("parquet", data_dir="./data/", streaming=True)
+     ds_iter = iter(dataset["train"])
+
+     first_sample = next(ds_iter)
+
+     if isinstance(first_sample['nifti'], dict):
+         nifti_bytes = first_sample['nifti']['bytes']
+     else:
+         nifti_bytes = first_sample['nifti'].to_bytes()
+     nifti_b64 = base64.b64encode(nifti_bytes).decode("utf-8")
+     data_url = nifti_base_url.format(nifti_b64)
+
+     initial_metadata = {
+         'subject': first_sample.get('subject', 'N/A'),
+         'session': first_sample.get('session', 'N/A'),
+         'datatype': first_sample.get('datatype', 'N/A'),
+         'suffix': first_sample.get('suffix', 'N/A'),
+         'task': first_sample.get('task', 'N/A'),
+         'run': first_sample.get('run', 'N/A'),
+         'path': first_sample.get('path', 'N/A')
+     }
+
+     return {
+         "status": "success",
+         "data_url": data_url,
+         "metadata": initial_metadata
+     }
+
+ async def load_dataset_iterator():
+     global num_iterations
+     dataset = load_dataset("TobiasPitters/ds004884-mini", streaming=True)
+     ds_iter_new = iter(dataset["train"])
+     for _ in range(num_iterations):
+         next(ds_iter_new)
+
+     global ds_iter
+     ds_iter = ds_iter_new
+
+ @app.get("/next")
+ async def next_sample():
+     """Load next sample from dataset"""
+     global ds_iter
+     global num_iterations
+     try:
+         ex = next(ds_iter)
+         if isinstance(ex['nifti'], dict):
+             nifti_bytes = ex['nifti']['bytes']
+         else:
+             nifti_bytes = ex['nifti'].to_bytes()
+         nifti_b64 = base64.b64encode(nifti_bytes).decode("utf-8")
+         new_data_url = nifti_base_url.format(nifti_b64)
+
+         metadata = {
+             'subject': ex.get('subject', 'N/A'),
+             'session': ex.get('session', 'N/A'),
+             'datatype': ex.get('datatype', 'N/A'),
+             'suffix': ex.get('suffix', 'N/A'),
+             'task': ex.get('task', 'N/A'),
+             'run': ex.get('run', 'N/A'),
+             'path': ex.get('path', 'N/A')
+         }
+
+         num_iterations += 1
+         return {
+             "status": "success",
+             "data_url": new_data_url,
+             "metadata": metadata
+         }
+     except StopIteration:
+         return {"status": "error", "message": "No more samples in dataset"}
+
+ @app.get("/health")
+ async def health():
+     """Health check endpoint"""
+     return {"status": "healthy", "dataset_loaded": True}
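
With the server running (for example `uvicorn main:app --port 8000`, as in the README above), the three endpoints can be exercised with a small standard-library client. This is an illustrative sketch, not part of the commit; the `sample.nii.gz` filename is an assumption based on the `volume.nii.gz` name the viewer uses.

```python
# Illustrative client for the /health, /initial and /next endpoints in main.py.
import base64
import json
from urllib.request import urlopen

BASE = "http://localhost:8000"  # adjust host/port to match how the server was started

# Health check
print(json.load(urlopen(f"{BASE}/health")))  # {"status": "healthy", "dataset_loaded": true}

# First sample: strip the data-URL prefix and decode the base64 NIfTI payload
sample = json.load(urlopen(f"{BASE}/initial"))
prefix = "data:application/octet-stream;base64,"
nifti_bytes = base64.b64decode(sample["data_url"][len(prefix):])
with open("sample.nii.gz", "wb") as f:  # extension assumed; see metadata["path"] for the BIDS path
    f.write(nifti_bytes)
print(sample["metadata"])

# Advance to the next sample (returns a "No more samples" error when exhausted)
print(json.load(urlopen(f"{BASE}/next")).get("metadata"))
```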
requirements.txt ADDED
@@ -0,0 +1,4 @@
+ fastapi
+ uvicorn[standard]
+ nibabel
+ git+https://github.com/The-Obstacle-Is-The-Way/datasets.git@feat/bids-loader