---
license: mit
---

<div align="center">

<h1>I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks</h1>

[![Paper](https://img.shields.io/badge/Arxiv-2511.08065-B31B1B.svg)](https://arxiv.org/abs/2511.08065)
[![AAAI 2026](https://img.shields.io/badge/AAAI%202026-Oral-4b44ce.svg)](https://aaai.org/)
[![Google Scholar](https://img.shields.io/badge/Google%20Scholar-Paper-4285F4?style=flat-square&logo=google-scholar&logoColor=white)](https://scholar.google.com/scholar?cluster=1814482600796011970)
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/I2E)

[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Paper-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/papers/2511.08065)
[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Models-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/Ruichen0424/I2E)
</div>

## πŸš€ Introduction

This repository contains the **I2E-Datasets** for the paper **"I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks"**, which has been accepted for **oral presentation at AAAI 2026**.

**I2E** is a pioneering framework that bridges the data-scarcity gap in neuromorphic computing. By simulating microsaccadic eye movements via highly parallelized convolutions, I2E converts static images into high-fidelity event streams in real time (>300x faster than prior methods).
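
For intuition, here is a minimal NumPy sketch of the general idea: small random pixel shifts stand in for microsaccades, and thresholded brightness differences between consecutive shifted frames become ON/OFF events. This is an illustration only, not the official I2E implementation; the function name, parameters, and thresholding scheme are all placeholders.

```python
import numpy as np

def image_to_events(img, steps=8, max_shift=2, threshold=0.1, seed=0):
    """Toy microsaccade-style image-to-event conversion (illustration only).

    img: 2D grayscale array in [0, 1].
    Returns an event tensor of shape (steps, 2, H, W) with ON/OFF channels.
    """
    rng = np.random.default_rng(seed)
    h, w = img.shape
    events = np.zeros((steps, 2, h, w), dtype=np.float32)
    prev = img
    for t in range(steps):
        # A small random shift stands in for one microsaccadic eye movement.
        dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
        shifted = np.roll(img, (dy, dx), axis=(0, 1))
        diff = shifted - prev
        events[t, 0] = diff > threshold    # ON events: brightness increased
        events[t, 1] = diff < -threshold   # OFF events: brightness decreased
        prev = shifted
    return events

# Example: convert a random 32x32 "image" into an (8, 2, 32, 32) event tensor.
events = image_to_events(np.random.rand(32, 32).astype(np.float32))
```

In I2E itself, this conversion is formulated as highly parallelized convolutions, which is what enables real-time operation; see the official GitHub repository for the actual implementation.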

### ✨ Key Highlights
* **SOTA Performance**: Achieves **60.50%** top-1 accuracy on event-based ImageNet.
* **Sim-to-Real Transfer**: Pre-training on I2E data yields **92.5%** accuracy on the real-world CIFAR10-DVS, setting a new benchmark.
* **Real-Time Conversion**: Enables on-the-fly data augmentation for deep SNN training.

## πŸ† Model Zoo & Results

We provide pre-trained models for **I2E-CIFAR** and **I2E-ImageNet**. You can download the `.pth` files directly from the [**Files and versions**](https://huggingface.co/Ruichen0424/I2E/tree/main) tab in the model repository, or fetch them programmatically (see the sketch after the method legend below).

[![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Models-FFD21E?style=flat-square&logo=huggingface&logoColor=black)](https://huggingface.co/Ruichen0424/I2E)

<table border="1">
  <tr>
    <th>Target Dataset</th>
    <th align="center">Architecture</th>
    <th align="center">Method</th>
    <th align="center">Top-1 Acc</th>
  </tr>
  <!-- CIFAR10-DVS -->
  <tr>
    <td rowspan="3" align="center" style="vertical-align: middle;"><strong>CIFAR10-DVS</strong><br>(Real)</td>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline</td>
    <td align="center" style="vertical-align: middle;">65.6%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Transfer-I</td>
    <td align="center" style="vertical-align: middle;">83.1%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Transfer-II (Sim-to-Real)</td>
    <td align="center" style="vertical-align: middle;"><strong>92.5%</strong></td>
  </tr>
  <!-- I2E-CIFAR10 -->
  <tr>
    <td rowspan="3" align="center" style="vertical-align: middle;"><strong>I2E-CIFAR10</strong></td>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline-I</td>
    <td align="center" style="vertical-align: middle;">85.07%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline-II</td>
    <td align="center" style="vertical-align: middle;">89.23%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Transfer-I</td>
    <td align="center" style="vertical-align: middle;"><strong>90.86%</strong></td>
  </tr>
  <!-- I2E-CIFAR100 -->
  <tr>
    <td rowspan="3" align="center" style="vertical-align: middle;"><strong>I2E-CIFAR100</strong></td>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline-I</td>
    <td align="center" style="vertical-align: middle;">51.32%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline-II</td>
    <td align="center" style="vertical-align: middle;">60.68%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Transfer-I</td>
    <td align="center" style="vertical-align: middle;"><strong>64.53%</strong></td>
  </tr>
  <!-- I2E-ImageNet -->
  <tr>
    <td rowspan="4" align="center" style="vertical-align: middle;"><strong>I2E-ImageNet</strong></td>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline-I</td>
    <td align="center" style="vertical-align: middle;">48.30%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Baseline-II</td>
    <td align="center" style="vertical-align: middle;">57.97%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet18</td>
    <td align="center" style="vertical-align: middle;">Transfer-I</td>
    <td align="center" style="vertical-align: middle;">59.28%</td>
  </tr>
  <tr>
    <td align="center" style="vertical-align: middle;">MS-ResNet34</td>
    <td align="center" style="vertical-align: middle;">Baseline-II</td>
    <td align="center" style="vertical-align: middle;"><strong>60.50%</strong></td>
  </tr>
</table>
117
+ > **Method Legend:**
118
+ > * **Baseline-I**: Training from scratch with minimal augmentation.
119
+ > * **Baseline-II**: Training from scratch with full augmentation.
120
+ > * **Transfer-I**: Fine-tuning from Static ImageNet (or I2E-ImageNet for CIFAR targets).
121
+ > * **Transfer-II**: Fine-tuning from I2E-CIFAR10.
122
+
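
To fetch a checkpoint programmatically, the `huggingface_hub` client can be used. The sketch below is illustrative: the checkpoint filename is a placeholder, so check the repository's file listing for the actual names.

```python
from huggingface_hub import hf_hub_download
import torch

# NOTE: the filename is hypothetical; see the "Files and versions" tab at
# https://huggingface.co/Ruichen0424/I2E for the real checkpoint names.
ckpt_path = hf_hub_download(
    repo_id="Ruichen0424/I2E",
    filename="msresnet18_i2e_cifar10.pth",  # placeholder filename
)
state_dict = torch.load(ckpt_path, map_location="cpu")  # load weights on CPU
```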

## πŸ‘οΈ Visualization

Below is a visualization of the I2E conversion process, illustrating the high-fidelity conversion from static RGB images to dynamic event streams.

More than 200 additional visualization comparisons can be found in [Visualization.md](./Visualization.md).

<table border="0" style="width: 100%">
  <tr>
    <td width="25%" align="center"><img src="./assets/original_1.jpg" alt="Original 1" style="width:100%"></td>
    <td width="25%" align="center"><img src="./assets/converted_1.gif" alt="Converted 1" style="width:100%"></td>
    <td width="25%" align="center"><img src="./assets/original_2.jpg" alt="Original 2" style="width:100%"></td>
    <td width="25%" align="center"><img src="./assets/converted_2.gif" alt="Converted 2" style="width:100%"></td>
  </tr>
  <tr>
    <td width="25%" align="center"><img src="./assets/original_3.jpg" alt="Original 3" style="width:100%"></td>
    <td width="25%" align="center"><img src="./assets/converted_3.gif" alt="Converted 3" style="width:100%"></td>
    <td width="25%" align="center"><img src="./assets/original_4.jpg" alt="Original 4" style="width:100%"></td>
    <td width="25%" align="center"><img src="./assets/converted_4.gif" alt="Converted 4" style="width:100%"></td>
  </tr>
</table>

## πŸ’» Usage

The generated I2E-Datasets are provided in the [**Files and versions**](https://huggingface.co/datasets/UESTC-BICS/I2E/tree/main) section, including I2E-CIFAR10, I2E-CIFAR100, and I2E-ImageNet. They can also be downloaded programmatically, as sketched below.
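
A minimal sketch using `huggingface_hub` (the `allow_patterns` filter is illustrative and assumes file names contain the dataset name; drop it to fetch everything):

```python
from huggingface_hub import snapshot_download

# Download the dataset files to the local Hugging Face cache and return the path.
local_dir = snapshot_download(
    repo_id="UESTC-BICS/I2E",
    repo_type="dataset",
    allow_patterns=["*CIFAR10*"],  # illustrative filter; remove to download all files
)
print(local_dir)
```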

For I2E-ImageNet, use the following command to reassemble the split archive into a single zip file:
```bash
cat ./I2E-ImageNet_split.part_* > ./I2E-ImageNet.zip
```
We also provide MD5 checksums for verification. Use the following command to check the downloaded files:
```bash
md5sum -c md5.txt
```

This repository hosts the **datasets only**. For the **I2E dataset generation code**, **training scripts**, and detailed usage instructions, please refer to our official GitHub repository; to generate the datasets (I2E-CIFAR10, I2E-CIFAR100, I2E-ImageNet) yourself with the I2E algorithm, follow the instructions in the GitHub README.

[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?logo=github)](https://github.com/Ruichen0424/I2E)

## πŸ“œ Citation

If you find this work or the models useful, please cite our AAAI 2026 paper:

```bibtex
@article{ma2025i2e,
  title={I2E: Real-Time Image-to-Event Conversion for High-Performance Spiking Neural Networks},
  author={Ma, Ruichen and Meng, Liwei and Qiao, Guanchao and Ning, Ning and Liu, Yang and Hu, Shaogang},
  journal={arXiv preprint arXiv:2511.08065},
  year={2025}
}
```