diff --git a/content/projects/dropout-reduces-underfitting.md b/content/projects/dropout-reduces-underfitting.md
index 7b7fc34..1770d6f 100644
--- a/content/projects/dropout-reduces-underfitting.md
+++ b/content/projects/dropout-reduces-underfitting.md
@@ -139,6 +139,11 @@ According to the paper, you should observe:
- Early Dropout: higher initial loss, followed by a sharp drop after `switch_epoch`, often reaching a lower minimum than Standard Dropout (reduced underfitting).
- Late Dropout: rapid rise in accuracy at the start (potential overfitting), then stabilized once dropout is activated (see the sketch below).
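+
+A minimal sketch of this schedule, assuming a PyTorch model; `set_dropout`, `drop_rate`, `switch_epoch`, and `mode` are illustrative names, not necessarily those used in this repository:
+
+```python
+import torch.nn as nn
+
+def set_dropout(model: nn.Module, p: float) -> None:
+    """Set the drop probability of every nn.Dropout layer in the model."""
+    for module in model.modules():
+        if isinstance(module, nn.Dropout):
+            module.p = p
+
+# Toy model and illustrative hyperparameters.
+model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Dropout(0.1), nn.Linear(32, 2))
+drop_rate, switch_epoch, num_epochs, mode = 0.1, 3, 10, "early"
+
+for epoch in range(num_epochs):
+    if mode == "early":   # dropout active only before switch_epoch
+        set_dropout(model, drop_rate if epoch < switch_epoch else 0.0)
+    else:                 # "late": dropout active only from switch_epoch onward
+        set_dropout(model, 0.0 if epoch < switch_epoch else drop_rate)
+    # ... run one training epoch on your data here ...
+```
+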
+## 📄 Detailed Report
+
+
+
## 📝 Authors
- [Arthur Danjou](https://github.com/ArthurDanjou)
@@ -148,9 +153,4 @@ According to the paper, you should observe:
- [Moritz Von Siemens](https://github.com/MoritzSiem)
M.Sc. Statistical and Financial Engineering (ISF) - Data Science Track at Université Paris-Dauphine PSL
-Based on the work of Liu, Z., et al. (2023). Dropout Reduces Underfitting.
-
-## 📄 Detailed Report
-
-
\ No newline at end of file
+Based on the work of Liu, Z., et al. (2023), *Dropout Reduces Underfitting*.
\ No newline at end of file