# Thoughts to Organize
* You have a great new view of matrices as linear transformations. How can you think about softmax and probability in that context? Think about them as FUNCTIONS, not just tools
* In some ways it is entirely natural that a *constraint* yields a *line* in the case of affine span and affine combinations
* 3b1b on the Lex Fridman podcast - GPT-3 conversation. Pattern recognition is part of it (think empiricism), but it is only needed in order to then create a final explanation. This is very similar to Deutsch: the final proof is a form of explanation, not simply the recognition of a pattern
* Linearity allows for reductionist thinking: breaking things down into their simplest parts and then recombining them to get a final answer. Pg. 280, Infinite Powers
* recent podcast learnings?
* Go through this study and incorporate findings https://scholarsbank.uoregon.edu/xmlui/bitstream/handle/1794/22314/slovic_110.pdf
* Derek Sivers: be useful, smart, and happy (podcast)
* Buridan's donkey https://sive.rs/donkey
* "Hell yeah or no", bike story, Derek Sivers
* Cognitive dissonance podcast - how does it fit in with all the other things I think about?
* Book list to redo and summarize - the Incerto, Deutsch, mental models
* Buffers, streams, etc
* High performance python (chapter 1, computer architecture)
* Nice is dead, because nice ends up with Gulags (Weinstein on the Lex Fridman podcast)
* "At the end of the day if you want to deliver, you have to ship" - Don't hide behind fear
* The forces of destruction which were once kinetic energy are now being stored as potential energy, and that potential energy is building - from the Lex Fridman podcast with Eric Weinstein
* Ego protection quote, mental models, pg. 27
* Andy Benoit quote, mental models, pg. 24
* Add Deutsch pg. 315 to Obsidian
* What is special about us is the details, not some abstract attribute! Wolfram on Fridman, 1130
* The Reality of Abstractions and Mathematics post
* My writing
* https://drive.google.com/drive/folders/1XssZ02kCwP4pz_n7IGFiWcknU7hxyYZS
* https://docs.google.com/document/d/18i1WvNw2zHxOyfzMYHYKUDmEoM5uXqy-o21pvPUjWYw/edit
* https://docs.google.com/document/d/1EgfcjD2KGqeO7GRiN_xB3T8nTgvODnt_6wj4tkVNklY/edit
* https://docs.google.com/document/d/1JSqEPjtpIpL1NFB26qVVIHxnqGHpqmPJMqDrmcKZ87U/edit
* https://docs.google.com/document/d/1cGITVvu2aNriKZEQpke23sk0wa0TBOrtN3SUhkDDlds/edit
* https://docs.google.com/document/d/1MmSA7QyaM2XBmOkF7Vhq1l8z4IKUOXtTA_XjHBekhn0/edit
* Skin in the game mathematics ideas (if your probability of ruin on each round is greater than 0, your long-run expected result is 0 - ruin is an absorbing barrier)
* How does the Gaussian work? Look at Taleb for base intuition: the small values (the 0s and 1s) tend to cancel each other out. The reason fat-tailed processes don't behave this way is that they lack this cancellation property.
* Discrete stochastic probability notes (search MIT6_262S11_chap01_with_notes)
* Fourier transform and Laplace transform applied to probability (moment generating functions...) - use your intuitive views of Fourier and Laplace!
* How does recursion exploit repetitive structure?
* Page 132, "The Science of Design" (The Sciences of the Artificial)
* Intelligence is error correction - Lex Fridman with Robert Breedlove
* categorize
* Whiteboard photos around June 8, 2020, good content related to norms (https://photos.google.com/search/whiteboards, https://photos.google.com/search/whiteboards/photo/AF1QipMvn5rXa2eeoH8BhtwKE5pyQmSZCSgHW2NHllJt)
* convexity https://photos.google.com/search/whiteboards/photo/AF1QipOjlgoQdqjCmSdifJaHVIKSmqRFDZth1kSZm6bW
* Strogatz with 3b1b convo: https://overcast.fm/+xdh5f_nxk/14:39
* You have to understand (and ideally love!!) the question. You have to love the question.
* how do neural networks provide compositionality?
* parameter space, https://en.m.wikipedia.org/wiki/Parameter_space, seems like a very useful thing to think about in all of ML. Think about the *structure* of parameter space!
* Dynamical system - https://en.m.wikipedia.org/wiki/Dynamical_system
* "Trajectories in phase space" - this is a cool concept. We aren't talking about trajectories in 3D space, but rather in the space of tuples! Think of the "space of card configurations"! See if Strogatz or Ellenberg mention this in their books
* Chapter 8, "The Geometry of Vector Spaces", Lay linear algebra text
* Group theory reading and learning
* 3b1b videos
* colah blog post
* math3ma articles
* Review the Sean Carroll big ideas notes and the Sean Carroll with Emily Riehl topology big idea notes. Also, T. Bradley - what is algebra, what is topology
* Topology and symmetry (pg. 2 of the Euler book) - is there a fundamental symmetry that topology follows? Under what actions does it leave things invariant? See Shape#
* What if you found a dataset you were particularly interested in, configured it via fabric, then ran all sorts of experiments trying to take advantage of its capabilities (set up a cluster on your System76 machine, perhaps): different graph techniques, semantic propagation, and so on
* Should I build my own Dask cluster on my System76 machine via Kubernetes and experiment with a problem I want to solve (say, distributed compute on a graph), as a pet project that would help enhance skills?
* Write a blog post on general geometry, shape, embeddings, etc. Basically, a nice summary of how these ideas in geometry can be reasoned about. This would mainly be for your own benefit. This could be a great combination with your goal of getting your blog up and running over thanksgiving.
* Algebraic objects - consider the inverse operation as being division. So matrices really are algebraic objects, but also transformations…? Think about this more!!! What does it mean to be an algebraic object?
* What is the relationship between autoencoders and PCA? See more [here](https://youtu.be/oJNHXPs0XDk?t=446).
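A quick sketch of the softmax-as-a-function idea above (my own illustration, not from any source): viewed as a function rather than a tool, softmax maps any real vector onto the interior of the probability simplex, and it is invariant under shifting its input by a constant.

```python
import numpy as np

def softmax(z):
    """Map a real vector to the probability simplex.

    Viewed as a function R^n -> simplex: softmax is shift-invariant
    (softmax(z + c) == softmax(z)), so subtracting max(z) changes
    nothing mathematically but keeps exp() numerically stable.
    """
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0]))
assert np.isclose(p.sum(), 1.0)                               # lands on the simplex
assert np.allclose(p, softmax(np.array([11.0, 12.0, 13.0])))  # shift invariance
```

The shift invariance is exactly why softmax outputs depend only on *differences* between logits, which connects back to thinking of matrices and softmax as composable functions rather than black-box tools.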
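A minimal sketch of the skin-in-the-game ruin idea above (my own illustration): if each round carries an independent, nonzero probability of ruin, the probability of surviving n rounds decays geometrically, so the long-run expected result is ruin no matter how attractive each individual round looks.

```python
def survival_probability(p_ruin: float, rounds: int) -> float:
    """Probability of surviving `rounds` independent rounds when each
    round carries ruin probability `p_ruin`: (1 - p_ruin)**rounds -> 0."""
    return (1 - p_ruin) ** rounds

# Even a tiny per-round ruin probability dominates over time:
print(survival_probability(0.01, 1000))  # ~4.3e-05
print(survival_probability(0.0, 1000))   # 1.0 - no absorbing barrier, no decay
```

This is the absorbing-barrier point: time averages and ensemble averages diverge once a state exists that you can enter but never leave.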
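A rough numerical sketch of the Gaussian-cancellation note above (my own illustration, with an arbitrary tail index): for thin-tailed draws, no single observation dominates the sum, while for a fat-tailed Pareto the largest observation can carry a large share of it - the cancellation property fails.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Thin-tailed: individual deviations wash out in the aggregate.
gauss = np.abs(rng.normal(size=n))
# Fat-tailed (Pareto, tail index 1.1, chosen for illustration):
# a single draw can rival the rest of the sample combined.
pareto = rng.pareto(1.1, size=n)

print("max/sum, gaussian:", gauss.max() / gauss.sum())
print("max/sum, pareto:  ", pareto.max() / pareto.sum())
```

The "max over sum" ratio is a crude but standard diagnostic: near zero means contributions cancel and averages stabilize; a large ratio means the tail drives everything, which is Taleb's point about why Gaussian intuitions fail for fat-tailed processes.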
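On the "how does recursion exploit repetitive structure?" question above, a small sketch (my own example): a recursive definition states one rule that applies at every scale of the problem, and memoization exploits the repetition by computing each subproblem only once.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # The same rule defines the problem at every scale; caching
    # collapses the exponential call tree to O(n) distinct calls.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025
```

Without the cache, the naive recursion re-derives the same subproblems exponentially many times; the repetitive structure is what makes the collapse possible.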
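On the autoencoder/PCA bullet above, a sketch of the standard connection (my own illustration of a well-known result, not from the linked video): the optimal *linear* rank-k autoencoder recovers the PCA subspace, and by Eckart-Young its reconstruction error equals the discarded variance.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated data
X = X - X.mean(axis=0)                                    # PCA assumes centered data

# PCA via SVD: the best rank-k linear "autoencoder" is
#   encode(x) = x @ V_k,   decode(z) = z @ V_k.T,
# where V_k spans the top-k principal directions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
V_k = Vt[:k].T
X_hat = (X @ V_k) @ V_k.T  # best rank-k reconstruction (Eckart-Young)

err = np.linalg.norm(X - X_hat) ** 2
assert np.isclose(err, (s[k:] ** 2).sum())  # residual = discarded variance
```

A nonlinear autoencoder generalizes this: with nonlinear encoder/decoder it can learn a curved low-dimensional manifold rather than the flat subspace PCA is restricted to.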
---
Date: 20211111
Links to:
Tags:
References:
* []()
### Docs
This page contains a list of thoughts and ideas that came to me during the week that I am not sure where to place. This list should be organized and cleared out weekly, or else it will no longer serve its purpose.
---
Date: 20211107
Links to: [Incoming Ideas MOC](Incoming%20Ideas%20MOC.md)
Tags:
References:
* []()