vault backup: 2024-10-28 09:05:04

Dane Sabo 2024-10-28 09:05:04 -04:00
parent 062c5c2ed4
commit 00f099b607
2 changed files with 5 additions and 2 deletions

@@ -11,6 +11,7 @@
- [x] Write about how to generate structured perturbations 📅 2024-10-17 ✅ 2024-10-16
- [ ] Write story for RA
- [ ] Reposition things on the timeline
- [ ] How does H_infty work?
# Milestones
- [x] Goals and Outcomes Finished 🆔 kwyu6a ⏳ 2024-10-02 📅 2024-10-04 ✅ 2024-10-02
- [*] State of the Art Finished 🆔 i9ybdy ⏳ 2024-10-09 📅 2024-10-11

@@ -32,5 +32,7 @@ Something to justify, why diffusion model as opposed to other generative AI
9. This noise is Gaussian noise
10. Over time, this forward process degrades the input until it is unrecognizable
11. This takes many iterations, but the number depends on the amount of noise added in each step
12. The Markov chain this creates is also Gaussian at every step, so the fully noised result is just some Gaussian distribution, usually mean 0 with standard deviation set by β (see the forward-process sketch after this list)
13. A reverse process that tries to remove the noise
14. But if we destroy the input, how can we do this?
15. Well, we train a neural network as a denoiser (see the training sketch after this list)
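
Items 8–12 describe the standard forward (noising) process. Below is a minimal sketch of that Markov chain, assuming a DDPM-style update `x_t = sqrt(1 - beta_t) * x_{t-1} + sqrt(beta_t) * eps` with a linear β schedule; the function and schedule names are illustrative, not taken from these notes.

```python
# Sketch (assumed DDPM-style forward process, not from the notes) of items 8-12:
# each step adds fresh Gaussian noise, every intermediate x_t stays Gaussian,
# and after enough steps the input is degraded to roughly N(0, I).
import numpy as np

def beta_schedule(num_steps: int, beta_min: float = 1e-4, beta_max: float = 0.02):
    """Linear noise schedule: how much Gaussian noise each step injects."""
    return np.linspace(beta_min, beta_max, num_steps)

def forward_diffuse(x0: np.ndarray, betas: np.ndarray, rng=np.random.default_rng(0)):
    """Run the forward Markov chain over all steps."""
    x = x0.copy()
    for beta in betas:
        eps = rng.standard_normal(x.shape)           # fresh Gaussian noise each step
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * eps
    return x                                         # ~ N(0, I) after enough steps

if __name__ == "__main__":
    x0 = np.ones(10_000)                             # stand-in "input" to be degraded
    xT = forward_diffuse(x0, beta_schedule(1000))
    print(f"mean={xT.mean():.3f}, std={xT.std():.3f}")  # prints roughly 0 and 1
```

Running this prints a mean near 0 and a standard deviation near 1, which matches item 12's point that the chain ends at a simple Gaussian.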
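For item 15, a hedged sketch of training a neural network as a denoiser: the network is asked to predict the Gaussian noise that was added at a random step (the usual DDPM objective). The tiny MLP, toy data, and timestep conditioning below are placeholders, not the architecture these notes will actually use.

```python
# Sketch of the denoiser training loop (assumed noise-prediction objective).
import torch
import torch.nn as nn

T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alpha_bars = torch.cumprod(1.0 - betas, dim=0)       # cumulative signal retention

denoiser = nn.Sequential(                            # toy stand-in for a real U-Net
    nn.Linear(2 + 1, 64), nn.ReLU(), nn.Linear(64, 2)
)
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

for step in range(200):
    x0 = torch.randn(128, 2)                         # placeholder "clean" data
    t = torch.randint(0, T, (128,))
    eps = torch.randn_like(x0)                       # the noise we try to recover
    ab = alpha_bars[t].unsqueeze(1)
    xt = ab.sqrt() * x0 + (1 - ab).sqrt() * eps      # noised sample at step t
    t_feat = (t.float() / T).unsqueeze(1)            # crude timestep conditioning
    pred = denoiser(torch.cat([xt, t_feat], dim=1))
    loss = ((pred - eps) ** 2).mean()                # learn to predict the added noise
    opt.zero_grad(); loss.backward(); opt.step()
```

This is how the reverse process of item 13 becomes learnable even though the forward process destroys the input: the network never has to reconstruct the input directly, only the noise added at each step.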