Architecture: Nanoscope Visualizer

Method tracing visualizer built to handle millions of calls.

Overview

GitHub Repo

Background

Initial iterations of Nanoscope leveraged Chrome's Event Profiling Tool for visualization. However, that tool was unable to handle the large volume of data Nanoscope generates, so we were forced to filter out much of the data before loading it. Nanoscope Visualizer doesn't suffer from this limitation: it was built from the ground up to handle millions of events, and it is now the default visualization frontend for Nanoscope. It's still possible, however, to target the Chrome tracing tool with a command line option.

"Tiling"

In order to maintain a reasonable frame rate, we follow a principle similar to map tiling:

  • Decrease data resolution on zoom out
  • Increase data resolution on zoom in
  • Only display on-screen data

Using this strategy, we're able to limit the maximum number of on-screen vertices at any given time. The GIF below depicts the tiling in action:

The red box represents the screen viewport in relation to the model coordinate space; when the box shrinks, we are zooming in.
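
As a rough illustration of the strategy above, the sketch below picks a precomputed zoom level based on how much of the trace the viewport currently spans. The ZoomLevel and Viewport structures, and the heuristic of choosing the coarsest level whose minimum duration still fits within roughly one pixel of time, are assumptions for illustration, not the visualizer's actual types or logic.

from dataclasses import dataclass, field

@dataclass
class ZoomLevel:
    min_duration_ns: int                 # smallest event duration rendered at this level (assumed field)
    events_by_row: dict = field(default_factory=dict)

@dataclass
class Viewport:
    left_ns: int                         # left edge of the screen in trace time
    right_ns: int                        # right edge of the screen in trace time
    width_px: int                        # screen width in pixels

def pick_zoom_level(viewport, zoom_levels):
    """Choose the coarsest level whose minimum duration still resolves to
    at least about one pixel at the current viewport width (assumed heuristic)."""
    ns_per_px = (viewport.right_ns - viewport.left_ns) / viewport.width_px
    for level in sorted(zoom_levels, key=lambda z: z.min_duration_ns, reverse=True):
        if level.min_duration_ns <= ns_per_px:
            return level
    # Fully zoomed in: fall back to the finest level available.
    return min(zoom_levels, key=lambda z: z.min_duration_ns)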

Data Resolution

In the GIF above, you can see how the resolution of the data increases as we zoom in. We use a simple algorithm that relies on a minimum event duration at each zoom level: zooming in lets us see smaller and smaller events. Since the dataset and the available zoom levels are fixed, we can calculate the renderable data at each zoom level upfront:

Algorithm
for event in events:
  for zoomLevel in zoomLevels:
    if event.duration > zoomLevel.minDuration:
      zoomLevel.addEvent(event)
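
A minimal runnable sketch of this bucketing pass, assuming each event carries a duration and each zoom level carries a minimum renderable duration (the field and function names here are illustrative, not the visualizer's actual API):

from dataclasses import dataclass, field
from typing import List

@dataclass
class Event:
    start: int       # start time, e.g. in nanoseconds
    duration: int    # event duration in the same unit

@dataclass
class ZoomLevel:
    min_duration: int                          # smallest duration rendered at this level
    events: List[Event] = field(default_factory=list)

def bucket_by_zoom_level(events, zoom_levels):
    """Precompute, for every zoom level, the events large enough to render there."""
    for event in events:
        for level in zoom_levels:
            if event.duration > level.min_duration:
                level.events.append(event)

# Coarse levels keep only long events; the finest level keeps almost everything.
levels = [ZoomLevel(min_duration=d) for d in (1_000_000, 100_000, 1_000)]
bucket_by_zoom_level([Event(0, 500_000), Event(10, 2_000_000)], levels)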

Viewport Culling

Playing with data resolution alone wouldn't gain us much without also filtering out off-screen events. For example, the minimum event duration at our highest zoom levels is very small, so without any other restriction we'd be rendering all events in the dataset. To bound the number of events we attempt to render, we also cull everything outside the viewport. This is what lets us show very small events at our most zoomed-in levels: the viewport only covers a small portion of the dataset, as seen in the GIF above.

Algorithm
for row in currentZoomLevel.rows:
  leftIndex = binarySearch(row.events, viewport.left)
  rightIndex = binarySearch(row.events, viewport.right)
  for i in range(leftIndex, rightIndex + 1):
    render(row.events[i])
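
Below is a hedged, runnable sketch of the culling pass, using Python's bisect module for the two binary searches and assuming each row's events are kept sorted by start time. It also steps back one index on the left so an event that begins before the left edge but may extend into view isn't dropped; the names and that detail are illustrative rather than the visualizer's exact implementation.

from bisect import bisect_left, bisect_right
from collections import namedtuple

Event = namedtuple("Event", ["start", "duration", "name"])   # illustrative shape

def visible_slice(starts, viewport_left, viewport_right):
    """Index range of events that may overlap [viewport_left, viewport_right].

    starts -- the events' start times for one row, sorted ascending
    """
    # Step back one so an event that begins before the left edge but may
    # extend into view is kept (at most one extra event per row).
    left = max(bisect_left(starts, viewport_left) - 1, 0)
    # Everything starting past the right edge is definitely off screen.
    right = bisect_right(starts, viewport_right)
    return left, right

def visible_events(events, viewport_left, viewport_right):
    starts = [e.start for e in events]        # cached per row in practice
    left, right = visible_slice(starts, viewport_left, viewport_right)
    return events[left:right]

# Events "a" and "b" overlap the viewport [8, 30]; "c" is culled.
row = [Event(0, 12, "a"), Event(10, 3, "b"), Event(40, 2, "c")]
visible_events(row, 8, 30)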