Loss Landscapes in a Lens

[Image: loss landscape rendered on Spectacles]

This project was inspired by Visualizing the Loss Landscape of Neural Nets, a NeurIPS 2018 paper. The paper describes a method for visualizing the loss landscapes of neural networks: choose two random directions in parameter space, apply filter-wise normalization to them, and plot the loss around the trained weights along those directions. This project directly applies the paper by rendering these loss landscapes in real life using the Snap Spectacles.
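
Concretely, the surface being plotted is f(α, β) = L(θ* + αδ + βη), where θ* are the trained weights and δ, η are random Gaussian directions whose filters are rescaled to match the norms of the corresponding filters in θ*. A minimal PyTorch sketch of that filter normalization (an illustration of the paper's idea, not the repo's exact code):

```python
import torch

def filter_normalize(direction, weights):
    """Rescale each filter of a random direction so its norm matches
    the corresponding filter of the trained weights
    (filter normalization from Li et al., NeurIPS 2018)."""
    for d, w in zip(direction, weights):
        if d.dim() <= 1:
            d.fill_(0.0)  # the paper ignores biases and BatchNorm parameters
        else:
            for d_f, w_f in zip(d, w):  # first dim indexes filters
                d_f.mul_(w_f.norm() / (d_f.norm() + 1e-10))
    return direction

# Loss surface to plot: f(a, b) = L(theta + a*delta + b*eta), where
# delta and eta are Gaussian directions normalized as above, e.g.
# delta = filter_normalize([torch.randn_like(w) for w in weights], weights)
```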

With wearable tech and smart glasses like the Snap Spectacles, we can finally have an immersive AR experience to better visualize the shapes of these 3D surface plots and see finer-grained detail. The lens can be found here.

Usage

  1. To install dependencies, create a Python virtual environment and run:

```
pip install -r requirements.txt
```

  2. `loss-landscape/` corresponds to the loss-landscape repo, which provides the code to generate `.vtp` files for the 3D loss surface plots of a few models.

  3. Lens Studio only allows 3D object imports in the `.fbx`, `.obj`, and `.gltf` file formats, so we convert the `.vtp` files to `.obj` files using PyVista, a 3D plotting Python library (a minimal conversion sketch follows this list). The notebook can be found at `vtp2obj.ipynb`.

  4. `lens/` holds the Lens Studio project with the project file and assets. We import the loss surface meshes into Lens Studio as assets, arrange the camera, adjust the scaling and positioning of the mesh, enable surface detection, and apply a material to the mesh for the color gradient.
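
For step 3, the conversion itself is only a few lines. Below is a sketch of roughly what `vtp2obj.ipynb` might do, assuming PyVista is installed; the filenames are placeholders:

```python
import pyvista as pv

# Load the loss surface produced by the loss-landscape code
# ("resnet56_surface.vtp" is a placeholder filename).
mesh = pv.read("resnet56_surface.vtp")

# Lens Studio can't import .vtp, so export the mesh as .obj.
# PyVista writes OBJ from a Plotter scene; off_screen avoids opening a window.
plotter = pv.Plotter(off_screen=True)
plotter.add_mesh(mesh)
plotter.export_obj("resnet56_surface.obj")
```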

And with some syncing to the Spectacles, we have a working lens!

Future Work

Although we have working code to generate the 3D surface plots for a given model, we're hoping to find a way to automate the setup of the assets in Lens Studio.

We're also hoping to continue fine-tuning LLMs such as Llama 3 and diffusion models like FLUX.1-dev to see how the loss surfaces of newer models differ from those of the more traditional models (e.g., DenseNet, ResNet, and VGG) presented in the paper.

Acknowledgements

Code heavily borrowed from loss-landscape.
