"LLE" (Machine Learning Method)
- Method for DimensionReduction, DimensionReduce, FeatureSpacePlot and FeatureSpacePlot3D.
- Reduce the dimension of data using a locally linear embedding.
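For instance, a reduction of numeric vectors to two dimensions with this method can be requested as in the following minimal sketch (the random data here is purely illustrative):

(* Sketch: reduce illustrative random 5D vectors to 2D with the "LLE" method *)
data = RandomReal[1, {200, 5}];
reduced = DimensionReduce[data, 2, Method -> "LLE"];
Dimensions[reduced]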
Details & Suboptions
- "LLE", which stands for locally linear embedding, is a nonlinear neighborhood-preserving dimensionality reduction method.
- "LLE" is able to learn nonlinear manifolds; however, it can fail if data has high-density variations and tends to collapse large portions of the data close together.
- The following plots (see FeatureSpacePlot) show two-dimensional embeddings learned by the "LLE" method applied to the benchmark datasets Fisher's Irises, MNIST and FashionMNIST:
- "LLE" seeks to find a low-dimensional embedding that locally preserves the intrinsic geometry (such as angles and relative distances) of the data. To do so, "LLE" first defines the neighborhood of each data point by its
nearest neighbors. Then, it computes the optimal "reconstruction weights" Wij by minimizing the reconstruction error (xj is a neighbor of xi): ∑ Ni=1xi-∑jWijxj 2, subject to the constraint ∑jWij=1 (this constraint enforces the invariance of Wij to rotations, rescalings and translations of the data). Once Wij are computed, the lower-dimensional embeddings
are computed by minimizing the embedding cost: ∑ Ni=1yi-∑jWijyj 2.
- "LLE" attempts to have the same reconstruction weights in the original and embedding space, which is why the local geometric properties of the data points are approximately preserved.
- The following suboption can be given:
"NeighborsNumber"    Automatic    number of nearest neighbors

Examples
Basic Examples (1)
Summary of the most common use cases
Create and visualize a "Swiss roll" dataset:

https://wolfram.com/xid/0673l568d-fqjqny

https://wolfram.com/xid/0673l568d-7bpljt
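The inputs behind the links above are not reproduced here; a Swiss roll of this kind could be generated roughly as follows (the sample size and noise level are arbitrary choices):

(* Sketch: generate a noisy Swiss roll and visualize it *)
n = 1000;
t = RandomReal[{3 Pi/2, 9 Pi/2}, n];    (* angle along the roll *)
h = RandomReal[{0, 20}, n];             (* height along the roll axis *)
swissRoll = Transpose[{t Cos[t], h, t Sin[t]}] + RandomReal[{-0.1, 0.1}, {n, 3}];
ListPointPlot3D[swissRoll, BoxRatios -> {1, 1, 1}]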

Train a nonlinear dimension reducer using "LLE" on the dataset to map to two-dimensional space:

https://wolfram.com/xid/0673l568d-4b74ci
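A call along these lines trains the reducer (assuming the swissRoll data from the sketch above):

(* Sketch: train an "LLE" reducer mapping the Swiss roll to 2D *)
reducer = DimensionReduction[swissRoll, 2, Method -> "LLE"]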

Find and visualize the data coordinates in the reduced space:

https://wolfram.com/xid/0673l568d-hxupzm
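Assuming the reducer and swissRoll from the sketches above, this step might look like:

(* Sketch: apply the trained reducer and plot the 2D coordinates *)
reduced = reducer[swissRoll];
ListPlot[reduced, AspectRatio -> 1]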

Visualize the dataset in the original space, with each point colored according to its reduced variable:

https://wolfram.com/xid/0673l568d-2lgnr6
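One way to color the original 3D points by their first reduced coordinate is sketched below (the color scheme is an arbitrary choice; swissRoll and reduced come from the sketches above):

(* Sketch: color each original point by its first reduced coordinate *)
colors = ColorData["Rainbow"] /@ Rescale[reduced[[All, 1]]];
Graphics3D[
 MapThread[{#2, PointSize[Medium], Point[#1]} &, {swissRoll, colors}],
 BoxRatios -> {1, 1, 1}]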

Scope (1)
Survey of the scope of standard use cases
Dataset Visualization (1)
Load the Fisher Iris dataset from ExampleData and perform a train/test split:

https://wolfram.com/xid/0673l568d-i0qojo
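The load-and-split step could be done roughly as follows (the 120-example training split is an arbitrary choice):

(* Sketch: load Fisher's Iris data as feature -> species rules and split it *)
data = ExampleData[{"MachineLearning", "FisherIris"}, "Data"];
{train, test} = TakeDrop[RandomSample[data], 120];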
Generate a reducer function using "LLE" with the features of each training example:

https://wolfram.com/xid/0673l568d-hw72bx
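Assuming the train set of feature -> species rules from the sketch above, the reducer can be trained on the features alone:

(* Sketch: train an "LLE" reducer on the training features (the keys of the rules) *)
irisReducer = DimensionReduction[Keys[train], 2, Method -> "LLE"]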

Group test examples by their species:

https://wolfram.com/xid/0673l568d-u41y3o
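A possible grouping, assuming the test rules from the earlier sketch:

(* Sketch: gather test features by their species label *)
grouped = GroupBy[test, Last -> First];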
Reduce the dimension of the features:

https://wolfram.com/xid/0673l568d-2geip6
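Assuming grouped and irisReducer from the sketches above:

(* Sketch: map each group of test features into the reduced space *)
reducedGroups = Map[irisReducer, grouped];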
Visualize the reduced dataset:

https://wolfram.com/xid/0673l568d-ck8dgi
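A simple plot of the grouped, reduced test data (assuming reducedGroups from the sketch above) might be:

(* Sketch: plot the reduced test data, one color per species *)
ListPlot[Values[reducedGroups], PlotLegends -> Keys[reducedGroups],
 PlotStyle -> PointSize[Medium], AspectRatio -> 1]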

Options (1)
Common values & functionality for each option
"NeighborsNumber" (1)
Generate a dataset of different head poses from 3D geometry data with random viewpoints:

https://wolfram.com/xid/0673l568d-vt62ci
Visualize different head poses:

https://wolfram.com/xid/0673l568d-4qtsfe

Reduce the dataset to a two-dimensional representation, specifying the "NeighborsNumber" used to build the neighborhood graph for the locally linear embedding:

https://wolfram.com/xid/0673l568d-g2w0io
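The head-pose images built behind the links above are not reproduced here; with a list of such images in hand (called headImages below, a placeholder name), the reduction with an explicit neighbor count would look roughly like this (20 is an arbitrary illustrative value):

(* Sketch: reduce the pose images to 2D with an explicit neighbor count *)
reducedPoses = DimensionReduce[headImages, 2, Method -> {"LLE", "NeighborsNumber" -> 20}];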
Visualize the original images in the reduced space, in which the up-down and front-side poses are disentangled:

https://wolfram.com/xid/0673l568d-u9gdn8
