From 00f099b6070d605f5ff7bbedf623100cd606dafa Mon Sep 17 00:00:00 2001
From: Dane Sabo
Date: Mon, 28 Oct 2024 09:05:04 -0400
Subject: [PATCH] vault backup: 2024-10-28 09:05:04

---
 4 Qualifying Exam/1 Managing Stuff/0. QE To Do List.md | 1 +
 4 Qualifying Exam/2 Writing/3. QE Research Approach.md | 6 ++++--
 2 files changed, 5 insertions(+), 2 deletions(-)

diff --git a/4 Qualifying Exam/1 Managing Stuff/0. QE To Do List.md b/4 Qualifying Exam/1 Managing Stuff/0. QE To Do List.md
index b43da23b..5d24ea11 100644
--- a/4 Qualifying Exam/1 Managing Stuff/0. QE To Do List.md
+++ b/4 Qualifying Exam/1 Managing Stuff/0. QE To Do List.md
@@ -11,6 +11,7 @@
 - [x] Write about how to generate structured perturbations 📅 2024-10-17 ✅ 2024-10-16
 - [ ] Write story for RA
 - [ ] Reposition things on the timeline
+- [ ] How does H_infty work?
 # Milestones
 - [x] Goals and Outcomes Finished 🆔 kwyu6a ⏳ 2024-10-02 📅 2024-10-04 ✅ 2024-10-02
 - [*] State of the Art Finished 🆔 i9ybdy ⏳ 2024-10-09 📅 2024-10-11
diff --git a/4 Qualifying Exam/2 Writing/3. QE Research Approach.md b/4 Qualifying Exam/2 Writing/3. QE Research Approach.md
index 3d8d8143..8bcf0dc0 100644
--- a/4 Qualifying Exam/2 Writing/3. QE Research Approach.md
+++ b/4 Qualifying Exam/2 Writing/3. QE Research Approach.md
@@ -32,5 +32,7 @@ Something to justify, why diffusion model as opposed to other generative AI
 9. This noise is a gaussian noise
 10. over time this forward process degrades the input until it is unrecognizable
 11. This takes several several iterations, but is dependent on the amount of noise in each step
-12. 
-13. A reverse process that tries to remove the noise
\ No newline at end of file
+12. The Markov chain this creates is Gaussian at every step, so the fully noised result is itself a Gaussian distribution, usually with mean 0 and a standard deviation set by the noise parameter beta.
+13. A reverse process then tries to remove the noise.
+14. But if we destroy the input, how can we do this?
+15. We train a neural network as a denoiser.
\ No newline at end of file
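
The forward-process notes added in the second hunk (items 9-15) describe the standard DDPM-style noising chain. A minimal Python sketch of that chain follows for reference; the linear schedule, the names `beta_schedule` and `forward_step`, and the toy 1-D signal are illustrative assumptions, not anything specified in the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

def beta_schedule(T, beta_start=1e-4, beta_end=0.02):
    """Linear noise schedule (assumed): beta_t is the variance of the
    Gaussian noise injected at step t."""
    return np.linspace(beta_start, beta_end, T)

def forward_step(x_prev, beta_t, rng):
    """One Markov step of the forward process:
    q(x_t | x_{t-1}) = N(sqrt(1 - beta_t) * x_{t-1}, beta_t * I).
    Every step is Gaussian, so the chain stays Gaussian throughout."""
    noise = rng.standard_normal(x_prev.shape)
    return np.sqrt(1.0 - beta_t) * x_prev + np.sqrt(beta_t) * noise

T = 1000
betas = beta_schedule(T)
x = rng.standard_normal(8)            # toy 1-D "input" x_0 (assumed)
for beta_t in betas:                  # after many steps the input is
    x = forward_step(x, beta_t, rng)  # destroyed: x is ~ N(0, I)

# The reverse process (not shown) trains a neural network to predict the
# noise added at each step, so it can gradually denoise a pure-Gaussian
# sample back into data.
```

Running this drives the toy signal toward a standard normal regardless of where it started, which is exactly why the reverse direction needs a learned denoiser rather than a closed-form inverse.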