Data Analysis
- Learning = Representation + Evaluation + Optimization.
- It is generalisation that counts, not performance on the training data.
- Data alone is not enough; every learner needs some built-in assumptions.
- Overfitting shows up where you least expect it (see the sketch after this list).
- Intuition fails in high dimensions.
- Theoretical guarantees are looser than they look; treat them as guides, not certainties.
- Feature engineering is often what wins.
- More data usually beats a cleverer algorithm.
- Learn many models and combine them (ensembles).
- Simplicity doesn’t imply accuracy (always).
- Use Occam’s razor with care.
- Representable doesn’t mean learnable (always).
- Correlation doesn’t imply causation.
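A minimal sketch of the overfitting point above, using polynomial regression on noisy synthetic data (the target function, noise level, and degrees are illustrative assumptions): training error keeps falling as capacity grows, while held-out error eventually turns back up.

```python
# Overfitting sketch: higher-degree polynomials fit the training set better
# but generalise worse. Data-generating function and degrees are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.uniform(-1, 1, n)
    y = np.sin(3 * x) + rng.normal(0, 0.2, n)   # true signal + noise
    return x, y

x_train, y_train = make_data(30)
x_test, y_test = make_data(200)

for degree in (1, 3, 9, 15):
    coefs = np.polyfit(x_train, y_train, degree)               # fit on training set
    train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```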
Universal Approximation
A perceptron is the simplest form of neural network. Multi-layer perceptrons (MLPs) are called universal approximators: with at least one hidden layer, a non-linear activation, and enough hidden units, they can approximate any continuous function on a compact domain to arbitrary accuracy.
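As a rough illustration (not a formal statement of the theorem), the sketch below fits a one-hidden-layer tanh MLP to y = sin(x) with plain full-batch gradient descent; the width, learning rate, and target function are arbitrary choices.

```python
# Universal-approximation sketch: a single hidden layer of tanh units driven
# toward sin(x) by hand-written backpropagation. All hyperparameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

hidden = 32
W1 = rng.normal(0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 1.0, (hidden, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    h = np.tanh(x @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # network output
    err = pred - y
    loss = np.mean(err ** 2)

    # Backpropagation of the mean-squared error.
    grad_pred = 2 * err / len(x)
    grad_W2 = h.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    grad_W1 = x.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print(f"final MSE against sin(x): {loss:.4f}")
```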
The Curse of Dimensionality
The curse of dimensionality refers to various phenomena that arise when analyzing and organizing data in high-dimensional spaces that do not occur in low-dimensional settings such as the three-dimensional physical space of everyday experience.
In machine learning, this means the number of training samples needed to cover the input space grows exponentially with the number of dimensions.
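A small sketch of two facets of this, under illustrative assumptions (grid spacing and sample counts chosen arbitrarily): covering the unit hypercube at a fixed resolution needs exponentially many points, and distances between uniform random points concentrate as the dimension grows, which is part of why geometric intuition fails.

```python
# Curse-of-dimensionality sketch. Part 1: points needed to grid [0,1]^d at
# spacing 0.1 grow as 10**d. Part 2: nearest and farthest neighbours become
# nearly indistinguishable in high dimensions.
import numpy as np

rng = np.random.default_rng(0)

points_per_axis = 10  # spacing of 0.1 along each axis (illustrative)
for dim in (1, 2, 3, 5, 10):
    print(f"d={dim:2d}: grid points needed = {points_per_axis ** dim:,}")

n_points = 500
for dim in (2, 10, 100, 1000):
    pts = rng.uniform(0, 1, (n_points, dim))
    dists = np.linalg.norm(pts[1:] - pts[0], axis=1)   # distances from point 0
    print(f"d={dim:4d}: min/max distance ratio = {dists.min() / dists.max():.3f}")
```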
Graphs
Graph-level functions should be permutation-invariant and node-level functions permutation-equivariant, since node ordering carries no information. Graph theorists also point out that standard message-passing GNNs are at most as powerful as the Weisfeiler-Lehman (WL) graph isomorphism test at distinguishing graphs.
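A minimal sketch of that symmetry, using a generic sum-aggregation message-passing layer written in NumPy (the layer, weights, and graph are illustrative, not any particular library's API): permuting the nodes permutes the node embeddings in the same way, and a sum readout over nodes is unchanged.

```python
# Permutation symmetry sketch: a sum-aggregation message-passing layer is
# equivariant to node reordering; a sum readout is invariant to it.
import numpy as np

rng = np.random.default_rng(0)

def mp_layer(A, X, W):
    """One message-passing step: sum over neighbours (plus self), then transform."""
    return np.tanh((A + np.eye(len(A))) @ X @ W)

def readout(H):
    """Graph-level representation: sum over nodes, so node order is irrelevant."""
    return H.sum(axis=0)

# A small random undirected graph with 5 nodes and 3-dimensional features.
n, d = 5, 3
A = rng.integers(0, 2, (n, n)); A = np.triu(A, 1); A = A + A.T
X = rng.normal(size=(n, d))
W = rng.normal(size=(d, d))

# Apply a random permutation P to the node ordering.
perm = rng.permutation(n)
P = np.eye(n)[perm]

H = mp_layer(A, X, W)
H_perm = mp_layer(P @ A @ P.T, P @ X, W)

print("node-level equivariance:", np.allclose(P @ H, H_perm))                # True
print("graph-level invariance: ", np.allclose(readout(H), readout(H_perm)))  # True
```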