diff --git a/doc/User_Guide/Annotating_Data.rst b/doc/User_Guide/Annotating_Data.rst new file mode 100644 index 0000000000..b9ca5c3f68 --- /dev/null +++ b/doc/User_Guide/Annotating_Data.rst @@ -0,0 +1,2 @@ +Annotating your Data +____________________ diff --git a/doc/User_Guide/Data_Pipelines.rst b/doc/User_Guide/Data_Pipelines.rst new file mode 100644 index 0000000000..6235855982 --- /dev/null +++ b/doc/User_Guide/Data_Pipelines.rst @@ -0,0 +1,2 @@ +Data Transformation Pipelines +_____________________________ diff --git a/doc/User_Guide/Dimensioned_Containers.rst b/doc/User_Guide/Dimensioned_Containers.rst new file mode 100644 index 0000000000..146f053e10 --- /dev/null +++ b/doc/User_Guide/Dimensioned_Containers.rst @@ -0,0 +1,2 @@ +Dimensioned Containers +______________________ diff --git a/doc/User_Guide/IPython_Magics.rst b/doc/User_Guide/IPython_Magics.rst new file mode 100644 index 0000000000..727092de95 --- /dev/null +++ b/doc/User_Guide/IPython_Magics.rst @@ -0,0 +1,2 @@ +IPython Magics +______________ diff --git a/doc/User_Guide/Plots_and_Renderers.rst b/doc/User_Guide/Plots_and_Renderers.rst new file mode 100644 index 0000000000..846bbb8ea5 --- /dev/null +++ b/doc/User_Guide/Plots_and_Renderers.rst @@ -0,0 +1,2 @@ +Working with Plot and Renderers +_______________________________ diff --git a/doc/User_Guide/Plotting_with_Matplotlib.rst b/doc/User_Guide/Plotting_with_Matplotlib.rst new file mode 100644 index 0000000000..6d0de34d70 --- /dev/null +++ b/doc/User_Guide/Plotting_with_Matplotlib.rst @@ -0,0 +1,2 @@ +Plotting with Matplotlib +________________________ diff --git a/doc/conf.py b/doc/conf.py index 9727bf1a5f..dfc26d484a 100644 --- a/doc/conf.py +++ b/doc/conf.py @@ -52,8 +52,8 @@ ), # Links 'LINKS': ( - ('Getting started', 'Getting_Started/index.html'), - ('User Guides', 'User_Guide/index.html'), + ('Getting started', 'getting_started/index.html'), + ('User Guide', 'user_guide/index.html'), ('Tutorials', 'Tutorials/index.html'), ('Gallery', 'Gallery/index.html'), ('Reference', 'Reference_Manual/index.html'), @@ -71,8 +71,8 @@ ), # Links for the docs sub navigation 'NAV_DOCS': ( - ('Getting started', 'Getting_Started/index'), - ('User Guides', 'User_Guide/index'), + ('Getting started', 'getting_started/index'), + ('User Guide', 'user_guide/index'), ('Tutorials', 'Tutorials/index'), ('Gallery', 'gallery/index'), ('Reference', 'Reference_Manual/index'), diff --git a/doc/getting_started/Customization.rst b/doc/getting_started/Customization.rst new file mode 100644 index 0000000000..c47261930b --- /dev/null +++ b/doc/getting_started/Customization.rst @@ -0,0 +1,5 @@ +Customization +_____________ + +.. notebook:: holoviews ../../examples/getting_started/2-Customization.ipynb + :offset: 1 diff --git a/doc/getting_started/Gridded_Datasets.rst b/doc/getting_started/Gridded_Datasets.rst new file mode 100644 index 0000000000..b062e7207e --- /dev/null +++ b/doc/getting_started/Gridded_Datasets.rst @@ -0,0 +1,5 @@ +Gridded Datasets +________________ + +.. notebook:: holoviews ../../examples/getting_started/4-Gridded_Datasets.ipynb + :offset: 1 diff --git a/doc/getting_started/Introduction.rst b/doc/getting_started/Introduction.rst new file mode 100644 index 0000000000..f1b4454e52 --- /dev/null +++ b/doc/getting_started/Introduction.rst @@ -0,0 +1,5 @@ +Introduction +____________ + +.. 
notebook:: holoviews ../../examples/getting_started/1-Introduction.ipynb + :offset: 1 diff --git a/doc/getting_started/Live_Data.rst b/doc/getting_started/Live_Data.rst new file mode 100644 index 0000000000..4b0d6711ca --- /dev/null +++ b/doc/getting_started/Live_Data.rst @@ -0,0 +1,6 @@ +Live Data +_________ + +.. notebook:: holoviews ../../examples/getting_started/5-Live_Data.ipynb + :skip_output: When run live, this cell's output should match the behavior of the GIF below + :offset: 1 diff --git a/doc/getting_started/Tabular_Datasets.rst b/doc/getting_started/Tabular_Datasets.rst new file mode 100644 index 0000000000..46758d5320 --- /dev/null +++ b/doc/getting_started/Tabular_Datasets.rst @@ -0,0 +1,5 @@ +Tabular Datasets +________________ + +.. notebook:: holoviews ../../examples/getting_started/3-Tabular_Datasets.ipynb + :offset: 1 diff --git a/doc/getting_started/index.rst b/doc/getting_started/index.rst new file mode 100644 index 0000000000..56fa7e9371 --- /dev/null +++ b/doc/getting_started/index.rst @@ -0,0 +1,21 @@ +Welcome to HoloViews! +_____________________ + +This 'Getting Started' guide aims to get you using HoloViews productively as quickly as possible. It is designed as an entry point for new users, introducing the core concepts you need to start working with your own data. If you want an overview of what HoloViews offers, we recommend reading this guide in order. For detailed documentation, please consult our `User Guide <../user_guide/index.html>`_, which we link to from the appropriate sections of this guide. + +* `Introduction `_ +* `Customization `_ +* `Tabular Datasets `_ +* `Gridded Datasets `_ +* `Live Data `_ + +.. toctree:: + :titlesonly: + :hidden: + :maxdepth: 2 + + Introduction + Customization + Tabular Datasets + Gridded Datasets + Live Data diff --git a/doc/index.rst b/doc/index.rst index 8a4b196388..d431bb7378 100644 --- a/doc/index.rst +++ b/doc/index.rst @@ -139,6 +139,8 @@ showing how to fix the bug or implement the feature! :maxdepth: 2 Home + Getting Started + User Guide Features Tutorials Examples diff --git a/doc/user_guide/Building_Composite_Objects.rst b/doc/user_guide/Building_Composite_Objects.rst new file mode 100644 index 0000000000..922808081b --- /dev/null +++ b/doc/user_guide/Building_Composite_Objects.rst @@ -0,0 +1,5 @@ +Building Composite Objects +__________________________ + +.. notebook:: holoviews ../../examples/user_guide/05-Building_Composite_Objects.ipynb + :offset: 1 diff --git a/doc/user_guide/Continuous_Coordinates.rst b/doc/user_guide/Continuous_Coordinates.rst new file mode 100644 index 0000000000..0d0206df49 --- /dev/null +++ b/doc/user_guide/Continuous_Coordinates.rst @@ -0,0 +1,5 @@ +Continuous Coordinates +______________________ + +.. notebook:: holoviews ../../examples/user_guide/Continuous_Coordinates.ipynb + :offset: 1 diff --git a/doc/user_guide/Custom_Interactivity.rst b/doc/user_guide/Custom_Interactivity.rst new file mode 100644 index 0000000000..eb712588d6 --- /dev/null +++ b/doc/user_guide/Custom_Interactivity.rst @@ -0,0 +1,5 @@ +Custom Interactivity +____________________ + +.. notebook:: holoviews ../../examples/user_guide/12-Custom_Interactivity.ipynb + :offset: 1 diff --git a/doc/user_guide/Customizing_Plots.rst b/doc/user_guide/Customizing_Plots.rst new file mode 100644 index 0000000000..f1f0ef949c --- /dev/null +++ b/doc/user_guide/Customizing_Plots.rst @@ -0,0 +1,5 @@ +Customizing Plots +_________________ + +.. 
notebook:: holoviews ../../examples/user_guide/03-Customizing_Plots.ipynb + :offset: 1 diff --git a/doc/user_guide/Deploying_Bokeh_Apps.rst b/doc/user_guide/Deploying_Bokeh_Apps.rst new file mode 100644 index 0000000000..46f9197b79 --- /dev/null +++ b/doc/user_guide/Deploying_Bokeh_Apps.rst @@ -0,0 +1,6 @@ +Deploying Bokeh Apps +____________________ + +.. notebook:: holoviews ../../examples/user_guide/Deploying_Bokeh_Apps.ipynb + :skip_execute: True + :offset: 1 diff --git a/doc/user_guide/Exporting_and_Archiving.rst b/doc/user_guide/Exporting_and_Archiving.rst new file mode 100644 index 0000000000..f20976da79 --- /dev/null +++ b/doc/user_guide/Exporting_and_Archiving.rst @@ -0,0 +1,5 @@ +Exporting and Archiving +_______________________ + +.. notebook:: holoviews ../../examples/user_guide/Exporting_and_Archiving.ipynb + :offset: 1 diff --git a/doc/user_guide/Gridded_Datasets.rst b/doc/user_guide/Gridded_Datasets.rst new file mode 100644 index 0000000000..05b4f1426d --- /dev/null +++ b/doc/user_guide/Gridded_Datasets.rst @@ -0,0 +1,5 @@ +Gridded Datasets +________________ + +.. notebook:: holoviews ../../examples/user_guide/08-Gridded_Datasets.ipynb + :offset: 1 diff --git a/doc/user_guide/Indexing_and_Selecting_Data.rst b/doc/user_guide/Indexing_and_Selecting_Data.rst new file mode 100644 index 0000000000..6f3ae69fd4 --- /dev/null +++ b/doc/user_guide/Indexing_and_Selecting_Data.rst @@ -0,0 +1,5 @@ +Indexing and Selecting Data +___________________________ + +.. notebook:: holoviews ../../examples/user_guide/09-Indexing_and_Selecting_Data.ipynb + :offset: 1 diff --git a/doc/user_guide/Large_Data.rst b/doc/user_guide/Large_Data.rst new file mode 100644 index 0000000000..63b998eee8 --- /dev/null +++ b/doc/user_guide/Large_Data.rst @@ -0,0 +1,5 @@ +Working with large data using datashader +________________________________________ + +.. notebook:: holoviews ../../examples/user_guide/14-Large_Data.ipynb + :offset: 1 diff --git a/doc/user_guide/Live_Data.rst b/doc/user_guide/Live_Data.rst new file mode 100644 index 0000000000..5a1057b223 --- /dev/null +++ b/doc/user_guide/Live_Data.rst @@ -0,0 +1,5 @@ +Live Data +_________ + +.. notebook:: holoviews ../../examples/user_guide/06-Live_Data.ipynb + :offset: 1 diff --git a/doc/user_guide/Plotting_with_Bokeh.rst b/doc/user_guide/Plotting_with_Bokeh.rst new file mode 100644 index 0000000000..1bd08c6425 --- /dev/null +++ b/doc/user_guide/Plotting_with_Bokeh.rst @@ -0,0 +1,5 @@ +Plotting with Bokeh +___________________ + +.. notebook:: holoviews ../../examples/user_guide/Plotting_with_Bokeh.ipynb + :offset: 1 diff --git a/doc/user_guide/Responding_to_Events.rst b/doc/user_guide/Responding_to_Events.rst new file mode 100644 index 0000000000..aa043ce68a --- /dev/null +++ b/doc/user_guide/Responding_to_Events.rst @@ -0,0 +1,5 @@ +Responding to Events +____________________ + +.. notebook:: holoviews ../../examples/user_guide/11-Responding_to_Events.ipynb + :offset: 1 diff --git a/doc/user_guide/Tabular_Datasets.rst b/doc/user_guide/Tabular_Datasets.rst new file mode 100644 index 0000000000..ebe6841bb0 --- /dev/null +++ b/doc/user_guide/Tabular_Datasets.rst @@ -0,0 +1,5 @@ +Tabular Datasets +________________ + +.. 
notebook:: holoviews ../../examples/user_guide/07-Tabular_Datasets.ipynb + :offset: 1 diff --git a/doc/user_guide/Transforming_Elements.rst b/doc/user_guide/Transforming_Elements.rst new file mode 100644 index 0000000000..d253918f0a --- /dev/null +++ b/doc/user_guide/Transforming_Elements.rst @@ -0,0 +1,5 @@ +Transforming Elements +_____________________ + +.. notebook:: holoviews ../../examples/user_guide/10-Transforming_Elements.ipynb + :offset: 1 diff --git a/doc/user_guide/index.rst b/doc/user_guide/index.rst new file mode 100644 index 0000000000..aa40090c06 --- /dev/null +++ b/doc/user_guide/index.rst @@ -0,0 +1,137 @@ +User Guide +__________ + + +Core guides +----------- + +These user guides provide detailed explanations of some of the core +concepts in HoloViews: + +* `Annotating your Data `_ + How to wrap your data in Elements and annotate it with additional + metadata to explore and visualize it effectively. + +* `Composition of Elements `_ + Composing your wrapped data into ``Overlay`` and ``Layout`` + collections with the ``+`` and ``*`` operators. + +* `Customizing Plots `_ + Applying plot, style and normalization options to control the look + and feel of your plots. + +* `Dimensioned Containers `_ + Declaring multi-dimensional containers to animate and facet your + data flexibly. Learn about the ``HoloMap``, ``NdOverlay``, ``GridSpace`` + and ``NdLayout`` types and how to use them effectively with the + corresponding ``.layout``, ``.overlay`` and ``.grid`` methods. + +* `Building Composite Objects `_ + How to build and work with complex composite objects. + +* `Live Data `_ + Introducing ``DynamicMap`` to lazily declare data and generate + complex interactive visualizations. + +* `Tabular Datasets `_ + Loading and wrapping tabular datasets in HoloViews using NumPy, + pandas and dask, and flexibly exploring the dataset using selection, + grouping and aggregation. + +* `Gridded Datasets `_ + Loading and wrapping gridded datasets in HoloViews using NumPy and + XArray to flexibly explore and visualize labelled n-dimensional + arrays. + +* `Indexing and Selecting Data `_ + Effectively indexing and selecting subsets of the data on the + different HoloViews data structures. + +* `Transforming Elements `_ + Applying and declaring ``Operations`` that transform your data, allowing + you to define the building blocks of a data analysis pipeline and + quickly explore and visualize the effect of different parameters on + your data. + +* `Responding to Events `_ + Effectively using ``Streams`` to dynamically control and drive your + visualizations by responding to user-defined events, such as those from + custom widgets, the command line or the notebook. + +* `Custom Interactivity `_ + Using linked ``Streams`` to respond to events generated by + interacting with a bokeh plot, e.g. by responding to mouse position, + mouse taps, selections or the current axis range. + +* `Data Transformation Pipelines `_ + Chaining different operations to build complex and lazy data + analysis pipelines, which can drive interactive plots in a notebook + or in a deployed dashboard. + +* `Working with large data `_ + Leveraging datashader support in HoloViews to effectively and + interactively explore and visualize millions or even billions of + data points. 
+ + +Supplementary guides +-------------------- + +These guides provide details about specific additional features in HoloViews: + +* `Plotting with Bokeh `_ + The basics of plotting with bokeh, including details about plot tools, + backend-specific styling options and working with bokeh models + more directly. + +* `Deploying Bokeh Apps `_ + Instructions on how to declare and deploy bokeh apps using HoloViews + in various scenarios, e.g. from scripts, from the command line and + within the notebook. + +* `Plotting with matplotlib `_ + The basics of plotting with matplotlib, highlighting core differences + in styling and in controlling the layout of matplotlib figures. + +* `Plotting with plotly `_ + The basics of plotting with plotly, focusing on 3D plotting, one of + the main strengths of plotly. + +* `Working with renderers and plots `_ + Using the HoloViews Renderer and Plot classes directly to access and + manipulate your visualizations. + +* `Exporting and Archiving `_ + Using HoloViews to archive both your data and visualizations from the + notebook. + +* `Continuous Coordinates `_ + Details on how continuous coordinates are handled in HoloViews, specifically + focusing on the difference between ``Image`` and other ``Raster`` types. + + +.. toctree:: + :titlesonly: + :hidden: + :maxdepth: 2 + + Annotating your Data + Composition of Elements + Customizing Plots + Dimensioned Containers + Building Composite Objects + Live Data + Tabular Datasets + Gridded Datasets + Indexing and Selecting Data + Transforming Elements + Responding to Events + Custom Interactivity + Working with large data + Plotting with Bokeh + Deploying Bokeh Apps + Plotting with matplotlib + Plotting with plotly + Working with Plot and Renderers + Exporting and Archiving + Continuous Coordinates diff --git a/examples/README.rst b/examples/README.rst index 031e5d3e06..a26a1a43c1 100644 --- a/examples/README.rst +++ b/examples/README.rst @@ -1,7 +1,7 @@ When working on a notebook in examples/ make sure to run ``python examples.py reference Example.ipynb`` to locate any references to that -notebook in the documenation. +notebook in the documentation. If there are any such references, please check that the content of those pages is still correct. 
diff --git a/examples/assets/diseases.csv.gz b/examples/assets/diseases.csv.gz new file mode 100644 index 0000000000..67e0950a5c Binary files /dev/null and b/examples/assets/diseases.csv.gz differ diff --git a/examples/assets/hourly_taxi_data.npz b/examples/assets/hourly_taxi_data.npz new file mode 100644 index 0000000000..1aba0683fe Binary files /dev/null and b/examples/assets/hourly_taxi_data.npz differ diff --git a/examples/assets/penguins.png b/examples/assets/penguins.png new file mode 120000 index 0000000000..23d0a156b8 --- /dev/null +++ b/examples/assets/penguins.png @@ -0,0 +1 @@ +../../examples/elements/assets/penguins.png \ No newline at end of file diff --git a/examples/assets/spike_train.csv.gz b/examples/assets/spike_train.csv.gz new file mode 100644 index 0000000000..8422f79322 Binary files /dev/null and b/examples/assets/spike_train.csv.gz differ diff --git a/examples/assets/station_info.csv b/examples/assets/station_info.csv new file mode 100644 index 0000000000..07ec08fe55 --- /dev/null +++ b/examples/assets/station_info.csv @@ -0,0 +1,61 @@ +name,lat,lon,opened,services,service_names,ridership +First Avenue,40.730953,-73.981628,1924,1,['L'],7.70211 +Second Avenue,40.723402,-73.989938,1936,1,['F'],5.84771 +Third Avenue,40.732849,-73.986122,1924,1,['L'],2.386533 +Fifth Avenue,40.753821,-73.981963,1920,6,"['7', 'E', 'M', 'N', 'R', 'W']",16.220605 +Sixth Avenue,40.737335,-73.996786,1924,1,['L'],16.121318 +Seventh Avenue,40.762862,-73.981637,1919,7,"['B', 'D', 'E', 'N', 'Q', 'R', 'W']",12.013107 +Eighth Street,40.730328,-73.992629,1917,4,"['N', 'Q', 'R', 'W']",5.894747 +14th Street,40.737826,-74.000201,1918,8,"['1', '2', '3', 'A', 'C', 'E', 'F', 'M']",30.885045 +14th Street–Union Square,40.734673,-73.989951,1904,8,"['4', '5', '6', 'L', 'N', 'Q', 'R', 'W']",35.320623 +18th Street,40.74104,-73.997871,1918,2,"['1', '2']",2.676304 +23rd Street,40.739864,-73.986599,1904,13,"['1', '2', '4', '6', 'A', 'C', 'E', 'F', 'M', 'N', 'Q', 'R', 'W']",38.264231 +28th Street,40.74307,-73.984264,1904,8,"['1', '2', '4', '6', 'N', 'Q', 'R', 'W']",15.962813 +33rd Street,40.746081,-73.982076,1904,2,"['4', '6']",9.701723 +34th Street,40.750373,-73.991057,1917,15,"['1', '2', '3', '7', 'A', 'B', 'C', 'D', 'E', 'F', 'M', 'N', 'Q', 'R', 'W']",54.456594 +42nd Street,40.757308,-73.989735,1932,7,"['A', 'B', 'C', 'D', 'E', 'F', 'M']",66.359208 +49th Street,40.759901,-73.984139,1919,4,"['N', 'Q', 'R', 'W']",8.029988 +51st Street,40.757107,-73.97192,1918,2,"['4', '6']",20.479923 +57th Street,40.764664,-73.980658,1968,1,['F'],4.720245 +59th Street,40.762526,-73.967967,1918,3,"['4', '5', '6']",25.566655 +59th Street–Columbus Circle,40.768247,-73.981929,1904,6,"['1', '2', 'A', 'B', 'C', 'D']",23.299666 +66th Street–Lincoln Center,40.77344,-73.982209,1904,2,"['1', '2']",7.790234 +68th Street–Hunter College,40.768141,-73.96387,1918,2,"['4', '6']",10.237854 +72nd Street,40.778453,-73.98197,1904,8,"['1', '2', '3', 'A', 'B', 'C', 'N', 'Q']",16.320597 +77th Street,40.77362,-73.959874,1918,2,"['4', '6']",12.738039 +81st Street–Museum of Natural History,40.781433,-73.972143,1932,3,"['A', 'B', 'C']",4.584041 +96th Street,40.785672,-73.95107,1904,10,"['1', '2', '3', '4', '6', 'A', 'B', 'C', 'N', 'Q']",24.38848 +103rd Street,40.7906,-73.947478,1904,6,"['1', '4', '6', 'A', 'B', 'C']",9.810965 +110th Street,40.79502,-73.94425,1904,4,"['2', '3', '4', '6']",4.209449 +116th Street,40.798629,-73.941617,1904,7,"['2', '3', '4', '6', 'A', 'B', 'C']",11.518755 +Astor Place,40.730054,-73.99107,1904,2,"['4', '6']",5.447655 +Bleecker 
Street,40.725915,-73.994659,1904,2,"['4', '6']",12.666868 +Bowery,40.72028,-73.993915,1913,2,"['J', 'Z']",1.18492 +Bowling Green,40.704817,-74.014065,1905,2,"['4', '5']",9.153462 +Broad Street,40.706476,-74.011056,1931,2,"['J', 'Z']",1.83478 +Broadway–Lafayette Street,40.725297,-73.996204,1936,4,"['B', 'D', 'F', 'M']",12.666868 +Brooklyn Bridge–City Hall,40.713065,-74.004131,1904,3,"['4', '5', '6']",10.481576 +Canal Street,40.718092,-73.999892,1904,13,"['1', '2', '4', '6', 'A', 'C', 'E', 'J', 'N', 'Q', 'R', 'W', 'Z']",22.936298 +Chambers Street,40.71419,-74.003199,1913,7,"['1', '2', '3', 'A', 'C', 'J', 'Z']",34.446431 +Christopher Street,40.733422,-74.002906,1918,2,"['1', '2']",3.363949 +City Hall,40.713282,-74.006978,1918,3,"['N', 'R', 'W']",1.828806 +Cortlandt Street,40.710668,-74.011029,1918,3,"['N', 'R', 'W']",2.713532 +Delancey Street,40.718611,-73.988114,1936,1,['F'],8.226975 +East Broadway,40.713715,-73.990173,1936,1,['F'],4.576662 +Essex Street,40.718315,-73.987437,1908,3,"['J', 'M', 'Z']",8.226975 +Franklin Street,40.719318,-74.006886,1918,2,"['1', '2']",1.731354 +Fulton Street,40.710374,-74.007582,1905,8,"['2', '3', '4', '5', 'A', 'C', 'J', 'Z']",23.315037 +Grand Central–42nd Street,40.751776,-73.976848,1904,5,"['4', '5', '6', '7', 'S']",46.737564 +Grand Street,40.711926,-73.94067,1967,2,"['B', 'D']",10.253621 +Houston Street,40.728251,-74.005367,1918,2,"['1', '2']",4.377409 +Lexington Avenue,40.76266,-73.967258,1920,7,"['E', 'F', 'M', 'N', 'Q', 'R', 'W']",21.407792 +Prince Street,40.724329,-73.997702,1917,4,"['N', 'Q', 'R', 'W']",5.386641 +Rector Street,40.707513,-74.013783,1918,4,"['1', 'N', 'R', 'W']",4.939783 +Roosevelt Island,40.759145,-73.95326,1989,1,['F'],1.966493 +South Ferry,40.702068,-74.013664,2009,1,['1'],8.750364 +Spring Street,40.722301,-73.997141,1904,5,"['4', '6', 'A', 'C', 'E']",7.586717 +Times Square–42nd Street,40.754672,-73.986754,1917,7,"['1', '2', '3', 'N', 'Q', 'R', 'W']",66.359208 +Wall Street,40.707557,-74.011862,1905,4,"['2', '3', '4', '5']",14.359598 +West Fourth Street,40.732338,-74.000495,1932,7,"['A', 'B', 'C', 'D', 'E', 'F', 'M']",14.147148 +Whitehall Street,40.703087,-74.012994,1918,3,"['N', 'R', 'W']",8.750364 +World Trade Center,40.712582,-74.009781,1932,1,['E'],16.910084 diff --git a/examples/assets/twophoton.npz b/examples/assets/twophoton.npz new file mode 100644 index 0000000000..cb09c72cbb Binary files /dev/null and b/examples/assets/twophoton.npz differ diff --git a/examples/apps/bokeh/crossfilter.py b/examples/gallery/apps/bokeh/crossfilter.py similarity index 100% rename from examples/apps/bokeh/crossfilter.py rename to examples/gallery/apps/bokeh/crossfilter.py diff --git a/examples/apps/bokeh/game_of_life.py b/examples/gallery/apps/bokeh/game_of_life.py similarity index 91% rename from examples/apps/bokeh/game_of_life.py rename to examples/gallery/apps/bokeh/game_of_life.py index ef6db87b14..8ccfd2b4c4 100644 --- a/examples/apps/bokeh/game_of_life.py +++ b/examples/gallery/apps/bokeh/game_of_life.py @@ -64,21 +64,19 @@ def update(pattern, counter, x, y): r, c = pattern.shape y, x = img.sheet2matrixidx(x,y) img.data[y:y+r,x:x+c] = pattern[::-1] - img.data = img.data.copy() - return img + return hv.Image(img) title = 'Game of Life - Tap to place pattern, Doubletap to clear' opts = { 'style': {'cmap': 'gray', 'toolbar': False, }, - 'plot' : {'height': 400, 'width': 800, title_format: '{label}', + 'plot' : {'height': 400, 'width': 800, 'title_format': '{label}', 'xaxis': None, 'yaxis': None} } -img = hv.Image(np.zeros((100, 200), 
dtype=np.uint8)).redim.range(z=(0, 1))(**opts) - +img = hv.Image(np.zeros((100, 200), dtype=np.uint8)) counter, tap = Counter(transient=True), Tap(transient=True) pattern_dim = hv.Dimension('Pattern', values=sorted(shapes.keys())) dmap = hv.DynamicMap(update, kdims=[pattern_dim], streams=[counter, tap]) -doc = renderer.app(dmap) +doc = renderer.server_doc(dmap.redim.range(z=(0, 1))(**opts)) dmap.periodic(0.05, None) doc.title = 'Game of Life' diff --git a/examples/apps/bokeh/gapminder.py b/examples/gallery/apps/bokeh/gapminder.py similarity index 95% rename from examples/apps/bokeh/gapminder.py rename to examples/gallery/apps/bokeh/gapminder.py index d5a3092501..b7198de6ac 100644 --- a/examples/apps/bokeh/gapminder.py +++ b/examples/gallery/apps/bokeh/gapminder.py @@ -37,8 +37,8 @@ gapminder_ds = ds.redim(**dimensions).to(hv.Points, kdims, vdims, 'Year') # Define annotations -text = ds.clone({yr: hv.Text(1.2, 25, str(int(yr)), fontsize=30) - for yr in gapminder_ds.keys()}) +text = gapminder_ds.clone({yr: hv.Text(1.2, 25, str(int(yr)), fontsize=30) + for yr in gapminder_ds.keys()}) # Define options opts = {'plot': dict(width=1000, height=600,tools=['hover'], size_index='Population', diff --git a/examples/apps/bokeh/mandelbrot.py b/examples/gallery/apps/bokeh/mandelbrot.py similarity index 100% rename from examples/apps/bokeh/mandelbrot.py rename to examples/gallery/apps/bokeh/mandelbrot.py diff --git a/examples/gallery/apps/bokeh/nytaxi_hover.py b/examples/gallery/apps/bokeh/nytaxi_hover.py new file mode 100644 index 0000000000..fe57529ea9 --- /dev/null +++ b/examples/gallery/apps/bokeh/nytaxi_hover.py @@ -0,0 +1,52 @@ +""" +Bokeh app example using datashader for rasterizing a large dataset and +geoviews for reprojecting coordinate systems. + +This example requires the 1.7GB nyc_taxi.csv dataset which you can +obtain by following the instructions for 'nyc_taxi' at: + + https://github.com/bokeh/datashader/blob/master/examples/README.md + +Once this CSV is placed in a data/ subfolder, you can run this app with: + + bokeh serve --show nytaxi_hover.py + +""" +import holoviews as hv +import geoviews as gv +import dask.dataframe as dd +import cartopy.crs as ccrs + +from holoviews.operation.datashader import datashade, aggregate + +hv.extension('bokeh') + +# Set plot and style options +hv.util.opts('Image [width=800 height=400 shared_axes=False logz=True] {+axiswise} ') +hv.util.opts("HLine VLine (color='white' line_width=1) Layout [shared_axes=False] ") +hv.util.opts("Curve [xaxis=None yaxis=None show_grid=False, show_frame=False] (color='orangered') {+framewise}") + +# Read the CSV file +df = dd.read_csv('data/nyc_taxi.csv',usecols=['pickup_x', 'pickup_y']) +df = df.persist() + +# Reproject points from Mercator to PlateCarree (latitude/longitude) +points = gv.Points(df, kdims=['pickup_x', 'pickup_y'], vdims=[], crs=ccrs.GOOGLE_MERCATOR) +projected = gv.operation.project_points(points, projection=ccrs.PlateCarree()) +projected = projected.redim(pickup_x='lon', pickup_y='lat') + +# Use datashader to rasterize and linked streams for interactivity +agg = aggregate(projected, link_inputs=True, x_sampling=0.0001, y_sampling=0.0001) +pointerx = hv.streams.PointerX(x=-74, source=projected) +pointery = hv.streams.PointerY(y=40.8, source=projected) +vline = hv.DynamicMap(lambda x: hv.VLine(x), streams=[pointerx]) +hline = hv.DynamicMap(lambda y: hv.HLine(y), streams=[pointery]) + +sampled = hv.util.Dynamic(agg, operation=lambda obj, x: obj.sample(lon=x), + streams=[pointerx], link_inputs=False) + +hvobj = 
((agg * hline * vline) << sampled.opts(plot={'Curve': dict(width=100)})) + +# Obtain Bokeh document and set the title +doc = hv.renderer('bokeh').server_doc(hvobj) +doc.title = 'NYC Taxi Crosshair' diff --git a/examples/apps/bokeh/player.py b/examples/gallery/apps/bokeh/player.py similarity index 100% rename from examples/apps/bokeh/player.py rename to examples/gallery/apps/bokeh/player.py diff --git a/examples/apps/bokeh/selection_stream.py b/examples/gallery/apps/bokeh/selection_stream.py similarity index 100% rename from examples/apps/bokeh/selection_stream.py rename to examples/gallery/apps/bokeh/selection_stream.py diff --git a/examples/demos/bokeh/area_chart.ipynb b/examples/gallery/demos/bokeh/area_chart.ipynb similarity index 78% rename from examples/demos/bokeh/area_chart.ipynb rename to examples/gallery/demos/bokeh/area_chart.ipynb index f6686392ce..bb3cf7d0c5 100644 --- a/examples/demos/bokeh/area_chart.ipynb +++ b/examples/gallery/demos/bokeh/area_chart.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -34,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# create some example data\n", @@ -60,9 +56,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "style = dict(fill_alpha=0.5)\n", @@ -73,23 +67,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/bachelors_degrees_by_gender.ipynb b/examples/gallery/demos/bokeh/bachelors_degrees_by_gender.ipynb similarity index 92% rename from examples/demos/bokeh/bachelors_degrees_by_gender.ipynb rename to examples/gallery/demos/bokeh/bachelors_degrees_by_gender.ipynb index fe39cd8932..a8c9ffe805 100644 --- a/examples/demos/bokeh/bachelors_degrees_by_gender.ipynb +++ b/examples/gallery/demos/bokeh/bachelors_degrees_by_gender.ipynb @@ -119,23 +119,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/bokeh/bars_economic.ipynb b/examples/gallery/demos/bokeh/bars_economic.ipynb new file mode 100644 index 0000000000..27fe12878c --- /dev/null +++ b/examples/gallery/demos/bokeh/bars_economic.ipynb @@ -0,0 +1,71 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Matplotlib - bars_economic](../matplotlib/bars_economic.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy 
as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('bokeh','matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Declaring data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', '\\t')\n", + "key_dimensions = [('year', 'Year'), ('country', 'Country')]\n", + "value_dimensions = [('unem', 'Unemployment'), ('capmob', 'Capital Mobility'),\n", + " ('gdp', 'GDP Growth'), ('trade', 'Trade')]\n", + "macro = hv.Table(macro_df, kdims=key_dimensions, vdims=value_dimensions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Bars [stack_index=1 xrotation=90 legend_cols=7 show_legend=False show_frame=False tools=['hover']]\n", + "%%opts Bars (color=Cycle('Category20'))\n", + "macro.to.bars([ 'Year', 'Country'], 'Trade', [])" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/bokeh/boxplot_chart.ipynb b/examples/gallery/demos/bokeh/boxplot_chart.ipynb similarity index 78% rename from examples/demos/bokeh/boxplot_chart.ipynb rename to examples/gallery/demos/bokeh/boxplot_chart.ipynb index a183a0fb7f..99190fd3fb 100644 --- a/examples/demos/bokeh/boxplot_chart.ipynb +++ b/examples/gallery/demos/bokeh/boxplot_chart.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.autompg import autompg as df\n", @@ -64,23 +62,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/dot_example.ipynb b/examples/gallery/demos/bokeh/dot_example.ipynb similarity index 82% rename from examples/demos/bokeh/dot_example.ipynb rename to examples/gallery/demos/bokeh/dot_example.ipynb index 861bd16a7f..2f63ac6eec 100644 --- a/examples/demos/bokeh/dot_example.ipynb +++ b/examples/gallery/demos/bokeh/dot_example.ipynb @@ -28,9 +28,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "factors = [\"a\", \"b\", \"c\", \"d\", \"e\", \"f\", \"g\", \"h\"]\n", @@ -68,22 +66,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/bokeh/dragon_curve.ipynb b/examples/gallery/demos/bokeh/dragon_curve.ipynb new file mode 100644 index 0000000000..e2df26032d --- /dev/null +++ b/examples/gallery/demos/bokeh/dragon_curve.ipynb @@ -0,0 +1,137 
@@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Dragon curve example from the [L-systems](../../topics/geometry/lsystems.ipynb) topic notebook in ``examples/topics/geometry``.\n", + "\n", + "Most examples work across multiple plotting backends, this example is also available for:\n", + "* [Matplotlib - dragon_curve](../matplotlib/dragon_curve.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "import numpy as np\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## L-system definition" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following class is a simplified version of the approach used in the [L-systems](../../topics/geometry/lsystems.ipynb) notebook, made specifically for plotting the [Dragon Curve](https://en.wikipedia.org/wiki/Dragon_curve)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class DragonCurve(object):\n", + " \"L-system agent that follows rules to generate the Dragon Curve\"\n", + " \n", + " initial ='FX'\n", + " productions = {'X':'X+YF+', 'Y':'-FX-Y'}\n", + " dragon_rules = {'F': lambda t,d,a: t.forward(d),\n", + " 'B': lambda t,d,a: t.back(d),\n", + " '+': lambda t,d,a: t.rotate(-a),\n", + " '-': lambda t,d,a: t.rotate(a),\n", + " 'X':lambda t,d,a: None,\n", + " 'Y':lambda t,d,a: None }\n", + " \n", + " def __init__(self, x=0,y=0, iterations=1):\n", + " self.heading = 0\n", + " self.distance = 5\n", + " self.angle = 90\n", + " self.x, self.y = x,y\n", + " self.trace = [(self.x, self.y)]\n", + " self.process(self.expand(iterations), self.distance, self.angle)\n", + " \n", + " def process(self, instructions, distance, angle):\n", + " for i in instructions: \n", + " self.dragon_rules[i](self, distance, angle)\n", + " \n", + " def expand(self, iterations):\n", + " \"Expand an initial symbol with the given production rules\"\n", + " expansion = self.initial\n", + " \n", + " for i in range(iterations):\n", + " intermediate = \"\"\n", + " for ch in expansion:\n", + " intermediate = intermediate + self.productions.get(ch,ch)\n", + " expansion = intermediate\n", + " return expansion\n", + "\n", + " def forward(self, distance):\n", + " self.x += np.cos(2*np.pi * self.heading/360.0)\n", + " self.y += np.sin(2*np.pi * self.heading/360.0)\n", + " self.trace.append((self.x,self.y))\n", + " \n", + " def rotate(self, angle):\n", + " self.heading += angle\n", + " \n", + " def back(self, distance):\n", + " self.heading += 180\n", + " self.forward(distance)\n", + " self.heading += 180\n", + " \n", + " @property\n", + " def path(self):\n", + " return hv.Path([self.trace])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Path {+framewise} [xaxis=None yaxis=None title_format=''] (color='black' line_width=1)\n", + "\n", + "def pad_extents(path):\n", + " \"Add 5% padding around the path\"\n", + " minx, maxx = path.range('x')\n", + " miny, maxy = path.range('y')\n", + " xpadding = ((maxx-minx) * 0.1)/2\n", + " ypadding = ((maxy-miny) * 0.1)/2\n", + " path.extents = (minx-xpadding, miny-ypadding, maxx+xpadding, maxy+ypadding)\n", + " return path\n", + " \n", + "hmap = hv.HoloMap(kdims=['Iteration'])\n", + "for i in range(7,17):\n", + " path = 
DragonCurve(-200, 0, i).path\n", + " hmap[i] = pad_extents(path)\n", + "hmap" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/gallery/demos/bokeh/dropdown_economic.ipynb b/examples/gallery/demos/bokeh/dropdown_economic.ipynb new file mode 100644 index 0000000000..3413b69f40 --- /dev/null +++ b/examples/gallery/demos/bokeh/dropdown_economic.ipynb @@ -0,0 +1,85 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Matplotlib - dropdown_economic](../matplotlib/dropdown_economic.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Declaring data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', '\\t')\n", + "key_dimensions = [('year', 'Year'), ('country', 'Country')]\n", + "value_dimensions = [('unem', 'Unemployment'), ('capmob', 'Capital Mobility'),\n", + " ('gdp', 'GDP Growth'), ('trade', 'Trade')]\n", + "macro = hv.Table(macro_df, kdims=key_dimensions, vdims=value_dimensions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Note:** The ``Arrow`` element is not current supported by the Bokeh plotting extension. This version uses ``VLines`` and ``Text`` instead of ``Arrow`` and will be updated to match matplotlib's approach in the next version of HoloViews." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Overlay [width=700 height=400 show_frame=False]\n", + "%%opts Curve (color='k') Scatter [color_index=2 size_index=2 scaling_factor=1.4] (cmap='Blues' line_color='k')\n", + "%%opts VLine (color='k' line_width=1)\n", + "%%opts Text (text_font_size='13px')\n", + "gdp_curves = macro.to.curve('Year', 'GDP Growth')\n", + "gdp_unem_scatter = macro.to.scatter('Year', ['GDP Growth', 'Unemployment'])\n", + "vlines = hv.VLine(1973)* hv.VLine(1975) * hv.VLine(1979) * hv.VLine(1981.9)\n", + "text = (hv.Text(1971.7,9, 'Oil Crisis') * hv.Text(1976.4, 9, 'Stagflation') \n", + " * hv.Text(1977.3, 7, 'Energy Crisis') * hv.Text(1985, 9, 'Early Eighties Recession'))\n", + "gdp_curves * gdp_unem_scatter * vlines * text" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/bokeh/histogram_example.ipynb b/examples/gallery/demos/bokeh/histogram_example.ipynb similarity index 92% rename from examples/demos/bokeh/histogram_example.ipynb rename to examples/gallery/demos/bokeh/histogram_example.ipynb index 600ea77b9d..6e8efdd5c6 100644 --- a/examples/demos/bokeh/histogram_example.ipynb +++ b/examples/gallery/demos/bokeh/histogram_example.ipynb @@ -129,23 +129,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/iris_example.ipynb b/examples/gallery/demos/bokeh/iris_example.ipynb similarity index 76% rename from examples/demos/bokeh/iris_example.ipynb rename to examples/gallery/demos/bokeh/iris_example.ipynb index e2156a057e..10645b7bf7 100644 --- a/examples/demos/bokeh/iris_example.ipynb +++ b/examples/gallery/demos/bokeh/iris_example.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.iris import flowers\n", @@ -60,23 +58,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/iris_splom_example.ipynb b/examples/gallery/demos/bokeh/iris_splom_example.ipynb similarity index 80% rename from examples/demos/bokeh/iris_splom_example.ipynb rename to examples/gallery/demos/bokeh/iris_splom_example.ipynb index cf00e50424..b7ff1e3a87 100644 --- a/examples/demos/bokeh/iris_splom_example.ipynb +++ b/examples/gallery/demos/bokeh/iris_splom_example.ipynb @@ -54,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "plot_opts = dict(tools=['hover', 'box_select'], bgcolor='#efe8e2')\n", @@ -67,23 +65,9 @@ } ], "metadata": { - 
"anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/legend_example.ipynb b/examples/gallery/demos/bokeh/legend_example.ipynb similarity index 81% rename from examples/demos/bokeh/legend_example.ipynb rename to examples/gallery/demos/bokeh/legend_example.ipynb index 5f9cb80343..d7c0336f29 100644 --- a/examples/demos/bokeh/legend_example.ipynb +++ b/examples/gallery/demos/bokeh/legend_example.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "x = np.linspace(0, 4*np.pi, 100)\n", @@ -70,23 +68,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/lesmis_example.ipynb b/examples/gallery/demos/bokeh/lesmis_example.ipynb similarity index 84% rename from examples/demos/bokeh/lesmis_example.ipynb rename to examples/gallery/demos/bokeh/lesmis_example.ipynb index 3bd72548fb..def790195a 100644 --- a/examples/demos/bokeh/lesmis_example.ipynb +++ b/examples/gallery/demos/bokeh/lesmis_example.ipynb @@ -10,9 +10,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -30,9 +28,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.les_mis import data\n", @@ -76,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "plot_opts = dict(height=800, width=800, xaxis='top', logz=True, xrotation=90,\n", @@ -95,9 +89,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "hv.Layout([c(plot=dict(width=300, height=300))\n", @@ -107,23 +99,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/lorenz_attractor_example.ipynb b/examples/gallery/demos/bokeh/lorenz_attractor_example.ipynb similarity index 79% rename from examples/demos/bokeh/lorenz_attractor_example.ipynb rename to examples/gallery/demos/bokeh/lorenz_attractor_example.ipynb index 1646c9ed90..b1fa88425e 100644 --- a/examples/demos/bokeh/lorenz_attractor_example.ipynb +++ 
b/examples/gallery/demos/bokeh/lorenz_attractor_example.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -34,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from scipy.integrate import odeint\n", @@ -76,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "lorenzian(style={'Path': dict(color=hv.Palette('Blues'))})" @@ -86,23 +80,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/bokeh/mandelbrot_section.ipynb b/examples/gallery/demos/bokeh/mandelbrot_section.ipynb new file mode 100644 index 0000000000..d53d85fd81 --- /dev/null +++ b/examples/gallery/demos/bokeh/mandelbrot_section.ipynb @@ -0,0 +1,82 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Matplotlib - mandelbrot section](../matplotlib/mandelbrot_section.ipynb)\n", + "\n", + "HoloViews demo that used to be showcased on the [holoviews.org" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Load the data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import io\n", + "try: from urllib2 import urlopen\n", + "except: from urllib.request import urlopen\n", + "\n", + "raw = urlopen('http://assets.holoviews.org/data/mandelbrot.npy').read()\n", + "array = np.load(io.BytesIO(raw)).astype(np.float32)[::4,::4]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [scaling_factor=50] Contours [show_legend=False] (color='w')\n", + "dots = np.linspace(-0.45, 0.45, 19)\n", + "fractal = hv.Image(array)\n", + "# First example on the old holoviews.org homepage was:\n", + "# ((fractal * hv.HLine(y=0)).hist() + fractal.sample(y=0))\n", + "layouts = {y: (fractal * hv.Points(fractal.sample([(i,y) for i in dots])) +\n", + " fractal.sample(y=y) +\n", + " hv.operation.threshold(fractal, level=np.percentile(fractal.sample(y=y)['z'], 90)) +\n", + " hv.operation.contours(fractal, levels=[np.percentile(fractal.sample(y=y)['z'], 60)]))\n", + " for y in np.linspace(-0.3, 0.3, 11)} # Half the frames of the bokeh version\n", + "\n", + "hv.HoloMap(layouts, kdims=['Y']).collate().cols(2)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/bokeh/measles_example.ipynb 
b/examples/gallery/demos/bokeh/measles_example.ipynb similarity index 85% rename from examples/demos/bokeh/measles_example.ipynb rename to examples/gallery/demos/bokeh/measles_example.ipynb index f2b4217236..43f60ff2ff 100644 --- a/examples/demos/bokeh/measles_example.ipynb +++ b/examples/gallery/demos/bokeh/measles_example.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "url = 'https://raw.githubusercontent.com/blmoore/blogR/master/data/measles_incidence.csv'\n", @@ -78,27 +76,9 @@ } ], "metadata": { - "hide_input": false, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" - }, - "widgets": { - "state": {}, - "version": "1.1.2" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/quiver_demo.ipynb b/examples/gallery/demos/bokeh/quiver_demo.ipynb similarity index 84% rename from examples/demos/bokeh/quiver_demo.ipynb rename to examples/gallery/demos/bokeh/quiver_demo.ipynb index 0f0ae3f5a2..0c02b0a612 100644 --- a/examples/demos/bokeh/quiver_demo.ipynb +++ b/examples/gallery/demos/bokeh/quiver_demo.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "xs, ys = np.arange(0, 2 * np.pi, .2), np.arange(0, 2 * np.pi, .2)\n", @@ -85,23 +83,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/bokeh/scatter_economic.ipynb b/examples/gallery/demos/bokeh/scatter_economic.ipynb new file mode 100644 index 0000000000..ce425a4252 --- /dev/null +++ b/examples/gallery/demos/bokeh/scatter_economic.ipynb @@ -0,0 +1,73 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Matplotlib - scatter_economic](../matplotlib/scatter_economic.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Declaring data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', '\\t')\n", + "key_dimensions = [('year', 'Year'), ('country', 'Country')]\n", + "value_dimensions = [('unem', 'Unemployment'), ('capmob', 'Capital Mobility'),\n", + " ('gdp', 'GDP Growth'), ('trade', 'Trade')]\n", + "macro = hv.Table(macro_df, kdims=key_dimensions, vdims=value_dimensions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + 
"cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter [width=700 height=400 scaling_method='width' scaling_factor=2 size_index=2 show_grid=True] \n", + "%%opts Scatter (color=Cycle('Category20') line_color='k')\n", + "%%opts NdOverlay [legend_position='left' show_frame=False]\n", + "gdp_unem_scatter = macro.to.scatter('Year', ['GDP Growth', 'Unemployment'])\n", + "gdp_unem_scatter.overlay('Country')" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/bokeh/square_limit.ipynb b/examples/gallery/demos/bokeh/square_limit.ipynb similarity index 90% rename from examples/demos/bokeh/square_limit.ipynb rename to examples/gallery/demos/bokeh/square_limit.ipynb index c10948a1a4..86d55199da 100644 --- a/examples/demos/bokeh/square_limit.ipynb +++ b/examples/gallery/demos/bokeh/square_limit.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Demo version of the topic notebook in [square_limit](../../topics/geometry/square_limit.ipynb) notebook in ``examples/topics/geometry``.\n", + "Demo version of the [square_limit](../../topics/geometry/square_limit.ipynb) topic notebook in ``examples/topics/geometry``.\n", "\n", "\n", "Most examples work across multiple plotting backends, this example is also available for:\n", @@ -34,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "spline=[(0.0,1.0),(0.08,0.98),(0.22,0.82),(0.29,0.72),(0.29,0.72),(0.3,0.64),(0.29,0.57),(0.3,0.5),\n", @@ -76,13 +74,13 @@ " den = float(n + m)\n", " t1 = Affine2D().scale(n / den, 1)\n", " t2 = Affine2D().scale(m / den, 1).translate(n / den, 0)\n", - " return T(spline1, t1) * T(spline2, t2)\n", + " return combine(T(spline1, t1) * T(spline2, t2))\n", "\n", "def above(spline1, spline2, n=1, m=1):\n", " den = float(n + m)\n", " t1 = Affine2D().scale(1, n / den).translate(0, m / den)\n", " t2 = Affine2D().scale(1, m / den)\n", - " return T(spline1, t1) * T(spline2, t2)\n", + " return combine(T(spline1, t1) * T(spline2, t2))\n", "\n", "def nonet(p, q, r, s, t, u, v, w, x):\n", " return above(beside(p, beside(q, r), 1, 2),\n", @@ -137,22 +135,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.1" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/step_chart.ipynb b/examples/gallery/demos/bokeh/step_chart.ipynb similarity index 83% rename from examples/demos/bokeh/step_chart.ipynb rename to examples/gallery/demos/bokeh/step_chart.ipynb index db4744d218..1331f479e3 100644 --- a/examples/demos/bokeh/step_chart.ipynb +++ b/examples/gallery/demos/bokeh/step_chart.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# build a dataset where multiple columns measure the same thing\n", @@ -73,23 +71,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - 
"name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/stocks_example.ipynb b/examples/gallery/demos/bokeh/stocks_example.ipynb similarity index 86% rename from examples/demos/bokeh/stocks_example.ipynb rename to examples/gallery/demos/bokeh/stocks_example.ipynb index 9ec35aa063..bc5375cd9b 100644 --- a/examples/demos/bokeh/stocks_example.ipynb +++ b/examples/gallery/demos/bokeh/stocks_example.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from holoviews.operation.timeseries import rolling\n", @@ -86,23 +84,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/texas_choropleth_example.ipynb b/examples/gallery/demos/bokeh/texas_choropleth_example.ipynb similarity index 81% rename from examples/demos/bokeh/texas_choropleth_example.ipynb rename to examples/gallery/demos/bokeh/texas_choropleth_example.ipynb index edbf096ede..56d7193650 100644 --- a/examples/demos/bokeh/texas_choropleth_example.ipynb +++ b/examples/gallery/demos/bokeh/texas_choropleth_example.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -34,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.us_counties import data as counties\n", @@ -68,9 +64,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "plot_opts = dict(logz=True, tools=['hover'], xaxis=None, yaxis=None,\n", @@ -82,23 +76,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/topographic_hillshading.ipynb b/examples/gallery/demos/bokeh/topographic_hillshading.ipynb similarity index 86% rename from examples/demos/bokeh/topographic_hillshading.ipynb rename to examples/gallery/demos/bokeh/topographic_hillshading.ipynb index d171770a83..dcc457b845 100644 --- a/examples/demos/bokeh/topographic_hillshading.ipynb +++ b/examples/gallery/demos/bokeh/topographic_hillshading.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -88,23 +86,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python 
[default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/us_unemployment.ipynb b/examples/gallery/demos/bokeh/us_unemployment.ipynb similarity index 78% rename from examples/demos/bokeh/us_unemployment.ipynb rename to examples/gallery/demos/bokeh/us_unemployment.ipynb index 96684633f9..c1de654cc7 100644 --- a/examples/demos/bokeh/us_unemployment.ipynb +++ b/examples/gallery/demos/bokeh/us_unemployment.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import pandas as pd\n", @@ -34,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.unemployment1948 import data\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from matplotlib.colors import ListedColormap\n", @@ -73,23 +67,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/bokeh/verhulst_mandelbrot.ipynb b/examples/gallery/demos/bokeh/verhulst_mandelbrot.ipynb similarity index 91% rename from examples/demos/bokeh/verhulst_mandelbrot.ipynb rename to examples/gallery/demos/bokeh/verhulst_mandelbrot.ipynb index a88d9fcfb0..2c77d43c24 100644 --- a/examples/demos/bokeh/verhulst_mandelbrot.ipynb +++ b/examples/gallery/demos/bokeh/verhulst_mandelbrot.ipynb @@ -40,9 +40,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# Area of the complex plane\n", @@ -111,22 +109,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.1" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/area_chart.ipynb b/examples/gallery/demos/matplotlib/area_chart.ipynb similarity index 78% rename from examples/demos/matplotlib/area_chart.ipynb rename to examples/gallery/demos/matplotlib/area_chart.ipynb index 121f0c3f46..00294b95d6 100644 --- a/examples/demos/matplotlib/area_chart.ipynb +++ b/examples/gallery/demos/matplotlib/area_chart.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -35,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# 
create some example data\n", @@ -61,9 +57,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "style = dict(alpha=0.5)\n", @@ -74,23 +68,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/bachelors_degrees_by_gender.ipynb b/examples/gallery/demos/matplotlib/bachelors_degrees_by_gender.ipynb similarity index 91% rename from examples/demos/matplotlib/bachelors_degrees_by_gender.ipynb rename to examples/gallery/demos/matplotlib/bachelors_degrees_by_gender.ipynb index b968c4e0d7..a27ba49902 100644 --- a/examples/demos/matplotlib/bachelors_degrees_by_gender.ipynb +++ b/examples/gallery/demos/matplotlib/bachelors_degrees_by_gender.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import pandas as pd\n", @@ -99,9 +97,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "# Define a callback to define a custom grid along the y-axis and disabling the (ugly) axis spines\n", @@ -122,23 +118,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/matplotlib/bars_economic.ipynb b/examples/gallery/demos/matplotlib/bars_economic.ipynb new file mode 100644 index 0000000000..c78c003b2e --- /dev/null +++ b/examples/gallery/demos/matplotlib/bars_economic.ipynb @@ -0,0 +1,71 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Bokeh - bars_economic](../bokeh/bars_economic.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Declaring data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', '\\t')\n", + "key_dimensions = [('year', 'Year'), ('country', 'Country')]\n", + "value_dimensions = [('unem', 'Unemployment'), ('capmob', 'Capital Mobility'),\n", + " ('gdp', 'GDP Growth'), ('trade', 'Trade')]\n", + "macro = hv.Table(macro_df, kdims=key_dimensions, vdims=value_dimensions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + 
"source": [ + "%%opts Bars [category_index=2 stack_index=1 group_index=0 xrotation=90 color_by=['stack']] (color=Cycle('tab20'))\n", + "%%opts Bars [legend_position='right' legend_cols=2]\n", + "macro.to.bars([ 'Year', 'Country'], 'Trade', [])" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/matplotlib/boxplot_chart.ipynb b/examples/gallery/demos/matplotlib/boxplot_chart.ipynb similarity index 78% rename from examples/demos/matplotlib/boxplot_chart.ipynb rename to examples/gallery/demos/matplotlib/boxplot_chart.ipynb index 3996f2a781..86b436e780 100644 --- a/examples/demos/matplotlib/boxplot_chart.ipynb +++ b/examples/gallery/demos/matplotlib/boxplot_chart.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.autompg import autompg as df\n", @@ -63,23 +61,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/matplotlib/dragon_curve.ipynb b/examples/gallery/demos/matplotlib/dragon_curve.ipynb new file mode 100644 index 0000000000..d3632c8df6 --- /dev/null +++ b/examples/gallery/demos/matplotlib/dragon_curve.ipynb @@ -0,0 +1,138 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Dragon curve example from the [L-systems](../../topics/geometry/lsystems.ipynb) topic notebook in ``examples/topics/geometry``.\n", + "\n", + "Most examples work across multiple plotting backends, this example is also available for:\n", + "* [Bokeh - dragon_curve](../bokeh/dragon_curve.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "import numpy as np\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## L-system definition" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following class is a simplified version of the approach used in the [L-systems](../../topics/geometry/lsystems.ipynb) notebook, made specifically for plotting the [Dragon Curve](https://en.wikipedia.org/wiki/Dragon_curve)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class DragonCurve(object):\n", + " \"L-system agent that follows rules to generate the Dragon Curve\"\n", + " \n", + " initial ='FX'\n", + " productions = {'X':'X+YF+', 'Y':'-FX-Y'}\n", + " dragon_rules = {'F': lambda t,d,a: t.forward(d),\n", + " 'B': lambda t,d,a: t.back(d),\n", + " '+': lambda t,d,a: t.rotate(-a),\n", + " '-': lambda t,d,a: t.rotate(a),\n", + " 'X':lambda t,d,a: None,\n", + " 'Y':lambda t,d,a: None }\n", + " \n", + " def __init__(self, x=0,y=0, iterations=1):\n", + " self.heading = 0\n", + " self.distance = 5\n", + " self.angle = 90\n", + " self.x, self.y = x,y\n", + " self.trace = [(self.x, self.y)]\n", + " self.process(self.expand(iterations), self.distance, self.angle)\n", + " \n", + " def process(self, instructions, distance, angle):\n", + " for i in instructions: \n", + " self.dragon_rules[i](self, distance, angle)\n", + " \n", + " def expand(self, iterations):\n", + " \"Expand an initial symbol with the given production rules\"\n", + " expansion = self.initial\n", + " \n", + " for i in range(iterations):\n", + " intermediate = \"\"\n", + " for ch in expansion:\n", + " intermediate = intermediate + self.productions.get(ch,ch)\n", + " expansion = intermediate\n", + " return expansion\n", + "\n", + " def forward(self, distance):\n", + " self.x += np.cos(2*np.pi * self.heading/360.0)\n", + " self.y += np.sin(2*np.pi * self.heading/360.0)\n", + " self.trace.append((self.x,self.y))\n", + " \n", + " def rotate(self, angle):\n", + " self.heading += angle\n", + " \n", + " def back(self, distance):\n", + " self.heading += 180\n", + " self.forward(distance)\n", + " self.heading += 180\n", + " \n", + " @property\n", + " def path(self):\n", + " return hv.Path([self.trace])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%output size=200\n", + "%%opts Path {+framewise} [xaxis=None yaxis=None title_format=''] (color='black' linewidth=1)\n", + "\n", + "def pad_extents(path):\n", + " \"Add 5% padding around the path\"\n", + " minx, maxx = path.range('x')\n", + " miny, maxy = path.range('y')\n", + " xpadding = ((maxx-minx) * 0.1)/2\n", + " ypadding = ((maxy-miny) * 0.1)/2\n", + " path.extents = (minx-xpadding, miny-ypadding, maxx+xpadding, maxy+ypadding)\n", + " return path\n", + " \n", + "hmap = hv.HoloMap(kdims=['Iteration'])\n", + "for i in range(7,17):\n", + " path = DragonCurve(-200, 0, i).path\n", + " hmap[i] = pad_extents(path)\n", + "hmap" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/gallery/demos/matplotlib/dropdown_economic.ipynb b/examples/gallery/demos/matplotlib/dropdown_economic.ipynb new file mode 100644 index 0000000000..1ad070251c --- /dev/null +++ b/examples/gallery/demos/matplotlib/dropdown_economic.ipynb @@ -0,0 +1,75 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Bokeh - dropdown_economic](../bokeh/dropdown_economic.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + 
"hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Declaring data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', '\\t')\n", + "key_dimensions = [('year', 'Year'), ('country', 'Country')]\n", + "value_dimensions = [('unem', 'Unemployment'), ('capmob', 'Capital Mobility'),\n", + " ('gdp', 'GDP Growth'), ('trade', 'Trade')]\n", + "macro = hv.Table(macro_df, kdims=key_dimensions, vdims=value_dimensions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve (color='k') Scatter [color_index=2 size_index=2 scaling_factor=1.4] (cmap='Blues' edgecolors='k')\n", + "%%opts Overlay [show_frame=True aspect=2, fig_size=250, show_frame=False]\n", + "gdp_curves = macro.to.curve('Year', 'GDP Growth')\n", + "gdp_unem_scatter = macro.to.scatter('Year', ['GDP Growth', 'Unemployment'])\n", + "annotations = hv.Arrow(1973, 8, 'Oil Crisis', 'v') * hv.Arrow(1975, 6, 'Stagflation', 'v') *\\\n", + "hv.Arrow(1979, 8, 'Energy Crisis', 'v') * hv.Arrow(1981.9, 5, 'Early Eighties\\n Recession', 'v')\n", + "gdp_curves * gdp_unem_scatter* annotations" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/matplotlib/histogram_example.ipynb b/examples/gallery/demos/matplotlib/histogram_example.ipynb similarity index 92% rename from examples/demos/matplotlib/histogram_example.ipynb rename to examples/gallery/demos/matplotlib/histogram_example.ipynb index df49c0e379..4f79ea782d 100644 --- a/examples/demos/matplotlib/histogram_example.ipynb +++ b/examples/gallery/demos/matplotlib/histogram_example.ipynb @@ -129,23 +129,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/iris_example.ipynb b/examples/gallery/demos/matplotlib/iris_example.ipynb similarity index 76% rename from examples/demos/matplotlib/iris_example.ipynb rename to examples/gallery/demos/matplotlib/iris_example.ipynb index a1dccdcf8f..c700682388 100644 --- a/examples/demos/matplotlib/iris_example.ipynb +++ b/examples/gallery/demos/matplotlib/iris_example.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.iris import flowers\n", @@ -61,23 +59,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git 
a/examples/demos/matplotlib/iris_splom_example.ipynb b/examples/gallery/demos/matplotlib/iris_splom_example.ipynb similarity index 79% rename from examples/demos/matplotlib/iris_splom_example.ipynb rename to examples/gallery/demos/matplotlib/iris_splom_example.ipynb index e934d42f83..c86673b2b8 100644 --- a/examples/demos/matplotlib/iris_splom_example.ipynb +++ b/examples/gallery/demos/matplotlib/iris_splom_example.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.iris import flowers\n", @@ -67,23 +65,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/legend_example.ipynb b/examples/gallery/demos/matplotlib/legend_example.ipynb similarity index 81% rename from examples/demos/matplotlib/legend_example.ipynb rename to examples/gallery/demos/matplotlib/legend_example.ipynb index 492f7f76fe..8b91dec824 100644 --- a/examples/demos/matplotlib/legend_example.ipynb +++ b/examples/gallery/demos/matplotlib/legend_example.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "x = np.linspace(0, 4*np.pi, 100)\n", @@ -70,23 +68,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/lorenz_attractor_example.ipynb b/examples/gallery/demos/matplotlib/lorenz_attractor_example.ipynb similarity index 81% rename from examples/demos/matplotlib/lorenz_attractor_example.ipynb rename to examples/gallery/demos/matplotlib/lorenz_attractor_example.ipynb index 419bdd7e16..9c65b059a7 100644 --- a/examples/demos/matplotlib/lorenz_attractor_example.ipynb +++ b/examples/gallery/demos/matplotlib/lorenz_attractor_example.ipynb @@ -31,10 +31,8 @@ }, { "cell_type": "code", - "execution_count": 4, - "metadata": { - "collapsed": true - }, + "execution_count": null, + "metadata": {}, "outputs": [], "source": [ "from scipy.integrate import odeint\n", @@ -82,23 +80,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/matplotlib/mandelbrot_section.ipynb b/examples/gallery/demos/matplotlib/mandelbrot_section.ipynb new file mode 100644 index 0000000000..30a8af72db --- /dev/null +++ 
b/examples/gallery/demos/matplotlib/mandelbrot_section.ipynb @@ -0,0 +1,82 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Bokeh - mandelbrot section](../bokeh/mandelbrot_section.ipynb)\n", + "\n", + "HoloViews demo that used to be showcased on the [holoviews.org homepage](http://holoviews.org)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load the data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import io\n", + "try: from urllib2 import urlopen\n", + "except: from urllib.request import urlopen\n", + "\n", + "raw = urlopen('http://assets.holoviews.org/data/mandelbrot.npy').read()\n", + "array = np.load(io.BytesIO(raw)).astype(np.float32)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [scaling_factor=50] Contours [show_legend=False] (color='w')\n", + "dots = np.linspace(-0.45, 0.45, 19)\n", + "fractal = hv.Image(array)\n", + "# First example on the old holoviews.org homepage was:\n", + "# ((fractal * hv.HLine(y=0)).hist() + fractal.sample(y=0))\n", + "layouts = {y: (fractal * hv.Points(fractal.sample([(i,y) for i in dots])) +\n", + " fractal.sample(y=y) +\n", + " hv.operation.threshold(fractal, level=np.percentile(fractal.sample(y=y)['z'], 90)) +\n", + " hv.operation.contours(fractal, levels=[np.percentile(fractal.sample(y=y)['z'], 60)]))\n", + " for y in np.linspace(-0.3, 0.3, 21)}\n", + "\n", + "hv.HoloMap(layouts, kdims=['Y']).collate().cols(2)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/matplotlib/measles_example.ipynb b/examples/gallery/demos/matplotlib/measles_example.ipynb similarity index 85% rename from examples/demos/matplotlib/measles_example.ipynb rename to examples/gallery/demos/matplotlib/measles_example.ipynb index d2c0a94687..164d6efa1b 100644 --- a/examples/demos/matplotlib/measles_example.ipynb +++ b/examples/gallery/demos/matplotlib/measles_example.ipynb @@ -34,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "url = 'https://raw.githubusercontent.com/blmoore/blogR/master/data/measles_incidence.csv'\n", @@ -81,28 +79,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "hide_input": false, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" - }, - "widgets": { - "state": {}, - "version": "1.1.2" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/polar_scatter_demo.ipynb b/examples/gallery/demos/matplotlib/polar_scatter_demo.ipynb similarity index 80% rename from examples/demos/matplotlib/polar_scatter_demo.ipynb rename to 
examples/gallery/demos/matplotlib/polar_scatter_demo.ipynb index c9ff752a3a..761a3aa29d 100644 --- a/examples/demos/matplotlib/polar_scatter_demo.ipynb +++ b/examples/gallery/demos/matplotlib/polar_scatter_demo.ipynb @@ -59,22 +59,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/quiver_demo.ipynb b/examples/gallery/demos/matplotlib/quiver_demo.ipynb similarity index 84% rename from examples/demos/matplotlib/quiver_demo.ipynb rename to examples/gallery/demos/matplotlib/quiver_demo.ipynb index 20058009bc..92dbdfc874 100644 --- a/examples/demos/matplotlib/quiver_demo.ipynb +++ b/examples/gallery/demos/matplotlib/quiver_demo.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "xs, ys = np.arange(0, 2 * np.pi, .2), np.arange(0, 2 * np.pi, .2)\n", @@ -85,23 +83,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/gallery/demos/matplotlib/scatter_economic.ipynb b/examples/gallery/demos/matplotlib/scatter_economic.ipynb new file mode 100644 index 0000000000..8b6a9f07e8 --- /dev/null +++ b/examples/gallery/demos/matplotlib/scatter_economic.ipynb @@ -0,0 +1,74 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most examples work across multiple plotting backends, this example is also available for:\n", + "\n", + "* [Bokeh - scatter_economic](../bokeh/scatter_economic.ipynb)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Declaring data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', '\\t')\n", + "key_dimensions = [('year', 'Year'), ('country', 'Country')]\n", + "value_dimensions = [('unem', 'Unemployment'), ('capmob', 'Capital Mobility'),\n", + " ('gdp', 'GDP Growth'), ('trade', 'Trade')]\n", + "macro = hv.Table(macro_df, kdims=key_dimensions, vdims=value_dimensions)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Plot" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%output dpi=100\n", + "%%opts Scatter [scaling_method='width' scaling_factor=2 size_index=2 show_grid=True] \n", + "%%opts Scatter (color=Cycle('tab20') edgecolors='k')\n", + "%%opts NdOverlay [legend_position='right' aspect=2, fig_size=250, show_frame=False]\n", + "gdp_unem_scatter = 
macro.to.scatter('Year', ['GDP Growth', 'Unemployment'])\n", + "gdp_unem_scatter.overlay('Country')" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/demos/matplotlib/square_limit.ipynb b/examples/gallery/demos/matplotlib/square_limit.ipynb similarity index 90% rename from examples/demos/matplotlib/square_limit.ipynb rename to examples/gallery/demos/matplotlib/square_limit.ipynb index 2a09de4d1b..eda54976be 100644 --- a/examples/demos/matplotlib/square_limit.ipynb +++ b/examples/gallery/demos/matplotlib/square_limit.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Demo version of the topic notebook in [square_limit](../../topics/geometry/square_limit.ipynb) notebook in ``examples/topics/geometry``.\n", + "Demo version of the [square_limit](../../topics/geometry/square_limit.ipynb) topic notebook in ``examples/topics/geometry``.\n", "\n", "Most examples work across multiple plotting backends, this example is also available for:\n", "* [Bokeh - square_limit](../bokeh/square_limit.ipynb)" @@ -74,13 +74,13 @@ " den = float(n + m)\n", " t1 = Affine2D().scale(n / den, 1)\n", " t2 = Affine2D().scale(m / den, 1).translate(n / den, 0)\n", - " return T(spline1, t1) * T(spline2, t2)\n", + " return combine(T(spline1, t1) * T(spline2, t2))\n", "\n", "def above(spline1, spline2, n=1, m=1):\n", " den = float(n + m)\n", " t1 = Affine2D().scale(1, n / den).translate(0, m / den)\n", " t2 = Affine2D().scale(1, m / den)\n", - " return T(spline1, t1) * T(spline2, t2)\n", + " return combine(T(spline1, t1) * T(spline2, t2))\n", "\n", "def nonet(p, q, r, s, t, u, v, w, x):\n", " return above(beside(p, beside(q, r), 1, 2),\n", @@ -125,8 +125,8 @@ "metadata": {}, "outputs": [], "source": [ - "%%opts Spline [xaxis=None yaxis=None aspect='equal' bgcolor='white']\n", - "%%output size=300\n", + "%%opts Spline [xaxis=None yaxis=None aspect='equal' bgcolor='white'] (linewidth=0.8)\n", + "%%output size=250\n", "\n", "fish = hv.Spline((spline, [1,4,4,4]*34)) # Cubic splines\n", "smallfish = flip(rot45(fish))\n", @@ -137,22 +137,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.1" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/step_chart.ipynb b/examples/gallery/demos/matplotlib/step_chart.ipynb similarity index 79% rename from examples/demos/matplotlib/step_chart.ipynb rename to examples/gallery/demos/matplotlib/step_chart.ipynb index 1fd6915a8d..8bef1b6b69 100644 --- a/examples/demos/matplotlib/step_chart.ipynb +++ b/examples/gallery/demos/matplotlib/step_chart.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -34,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# build a dataset where multiple columns measure the same thing\n", @@ -61,9 +57,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "plot = 
dict(interpolation='steps-mid')\n", @@ -77,23 +71,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/stocks_example.ipynb b/examples/gallery/demos/matplotlib/stocks_example.ipynb similarity index 86% rename from examples/demos/matplotlib/stocks_example.ipynb rename to examples/gallery/demos/matplotlib/stocks_example.ipynb index abc6b6df0c..11dc6cb0a2 100644 --- a/examples/demos/matplotlib/stocks_example.ipynb +++ b/examples/gallery/demos/matplotlib/stocks_example.ipynb @@ -34,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from holoviews.operation.timeseries import rolling\n", @@ -87,23 +85,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/surface_3d.ipynb b/examples/gallery/demos/matplotlib/surface_3d.ipynb similarity index 72% rename from examples/demos/matplotlib/surface_3d.ipynb rename to examples/gallery/demos/matplotlib/surface_3d.ipynb index 1bc9109f37..0bace1c6df 100644 --- a/examples/demos/matplotlib/surface_3d.ipynb +++ b/examples/gallery/demos/matplotlib/surface_3d.ipynb @@ -14,9 +14,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -34,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# Make data.\n", @@ -59,9 +55,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "surface(plot=dict(colorbar=True))" @@ -69,23 +63,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/texas_choropleth_example.ipynb b/examples/gallery/demos/matplotlib/texas_choropleth_example.ipynb similarity index 85% rename from examples/demos/matplotlib/texas_choropleth_example.ipynb rename to examples/gallery/demos/matplotlib/texas_choropleth_example.ipynb index c6ad71e968..8fb4e9d2c7 100644 --- a/examples/demos/matplotlib/texas_choropleth_example.ipynb +++ b/examples/gallery/demos/matplotlib/texas_choropleth_example.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + 
"metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.us_counties import data as counties\n", @@ -79,23 +77,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/topographic_hillshading.ipynb b/examples/gallery/demos/matplotlib/topographic_hillshading.ipynb similarity index 87% rename from examples/demos/matplotlib/topographic_hillshading.ipynb rename to examples/gallery/demos/matplotlib/topographic_hillshading.ipynb index e45e8135c3..1895e113e6 100644 --- a/examples/demos/matplotlib/topographic_hillshading.ipynb +++ b/examples/gallery/demos/matplotlib/topographic_hillshading.ipynb @@ -33,9 +33,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -89,23 +87,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/trisurf3d_demo.ipynb b/examples/gallery/demos/matplotlib/trisurf3d_demo.ipynb similarity index 80% rename from examples/demos/matplotlib/trisurf3d_demo.ipynb rename to examples/gallery/demos/matplotlib/trisurf3d_demo.ipynb index 3923b45274..9406c371f9 100644 --- a/examples/demos/matplotlib/trisurf3d_demo.ipynb +++ b/examples/gallery/demos/matplotlib/trisurf3d_demo.ipynb @@ -21,9 +21,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -42,9 +40,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "n_radii = 8\n", @@ -79,9 +75,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "trisurface" @@ -89,23 +83,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/us_unemployment.ipynb b/examples/gallery/demos/matplotlib/us_unemployment.ipynb similarity index 81% rename from examples/demos/matplotlib/us_unemployment.ipynb rename to examples/gallery/demos/matplotlib/us_unemployment.ipynb index b49ba3b57f..3348194914 100644 --- a/examples/demos/matplotlib/us_unemployment.ipynb +++ b/examples/gallery/demos/matplotlib/us_unemployment.ipynb @@ -32,9 +32,7 @@ { "cell_type": "code", "execution_count": null, - 
"metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from bokeh.sampledata.unemployment1948 import data\n", @@ -69,23 +67,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/matplotlib/verhulst_mandelbrot.ipynb b/examples/gallery/demos/matplotlib/verhulst_mandelbrot.ipynb similarity index 91% rename from examples/demos/matplotlib/verhulst_mandelbrot.ipynb rename to examples/gallery/demos/matplotlib/verhulst_mandelbrot.ipynb index 0e624f6344..b3170b116a 100644 --- a/examples/demos/matplotlib/verhulst_mandelbrot.ipynb +++ b/examples/gallery/demos/matplotlib/verhulst_mandelbrot.ipynb @@ -40,9 +40,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# Area of the complex plane\n", @@ -111,22 +109,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.1" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/plotly/surface_3d.ipynb b/examples/gallery/demos/plotly/surface_3d.ipynb similarity index 81% rename from examples/demos/plotly/surface_3d.ipynb rename to examples/gallery/demos/plotly/surface_3d.ipynb index 310c185c74..32caad44c2 100644 --- a/examples/demos/plotly/surface_3d.ipynb +++ b/examples/gallery/demos/plotly/surface_3d.ipynb @@ -72,23 +72,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/demos/plotly/trisurf3d_demo.ipynb b/examples/gallery/demos/plotly/trisurf3d_demo.ipynb similarity index 84% rename from examples/demos/plotly/trisurf3d_demo.ipynb rename to examples/gallery/demos/plotly/trisurf3d_demo.ipynb index c286925ed2..45d013a6e1 100644 --- a/examples/demos/plotly/trisurf3d_demo.ipynb +++ b/examples/gallery/demos/plotly/trisurf3d_demo.ipynb @@ -41,9 +41,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "n_radii = 8\n", @@ -87,23 +85,9 @@ } ], "metadata": { - "anaconda-cloud": {}, - "kernelspec": { - "display_name": "Python [default]", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git 
a/examples/getting_started/1-Introduction.ipynb b/examples/getting_started/1-Introduction.ipynb new file mode 100644 index 0000000000..63b411128a --- /dev/null +++ b/examples/getting_started/1-Introduction.ipynb @@ -0,0 +1,371 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Why HoloViews?\n", + "\n", + "HoloViews is an [open-source](https://github.com/ioam/holoviews/blob/master/LICENSE.txt) Python 2 and 3 library for data analysis and visualization. Python already has excellent tools like numpy, pandas, and xarray for data processing, and bokeh and matplotlib for plotting, so why yet another library?\n", + "\n", + "**HoloViews helps you understand your data better, by letting you work seamlessly with both the data *and* its graphical representation.**\n", + "\n", + "HoloViews focuses on bundling your data together with the appropriate metadata to support both analysis and visualization, making your raw data *and* its visualization equally accessible at all times. This process can be unfamiliar to those used to traditional data-processing and plotting tools, and this getting-started guide is meant to demonstrate how it all works at a high level. More detailed information about each topic is then provided in the [User Guide](../user_guide/).\n", + "\n", + "With HoloViews, instead of building a plot using direct calls to a plotting library, you first describe your data with a small amount of crucial semantic information required to make it visualizable, then you specify additional metadata as needed to determine more detailed aspects of your visualization. This approach provides immediate, automatic visualization that can be effortlessly requested at any time as your data evolves, rendered automatically by one of the supported plotting libraries (such as Bokeh or Matplotlib). \n", + "\n", + "\n", + "## Tabulated data: subway stations\n", + "\n", + "To illustrate how this process works, we will demonstrate some of the key features of HoloViews using a collection of datasets related to transportation in New York City. First let's run some imports to make [numpy](http://numpy.org) and [pandas](http://pandas.pydata.org) accessible for loading the data. Here we start with a table of subway station information loaded from a CSV file with pandas:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is the standard way to make the numpy and pandas libraries available in the namespace. We recommend always importing HoloViews as ``hv`` and if you haven't already installed HoloViews, check out our [installation page].\n", + "\n", + "Note that after importing HoloViews as ``hv`` we run ``hv.extension('bokeh')`` to load the bokeh plotting extension, allowing us to generate visualizations with [Bokeh](http://bokeh.pydata.org/). 
In the next section we will see how you can use other plotting libraries such as [matplotlib](http://matplotlib.org) and even how you can mix and match between them.\n", + "\n", + "Now let's load our subway data using pandas:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "station_info = pd.read_csv('../assets/station_info.csv')\n", + "station_info.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that this table contains the subway station name, its latitude and longitude, the year it was opened, the number of services available from the station and their names, and finally the yearly ridership (in millions for 2015)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## ``Elements`` of visualization\n", + "\n", + "We can immediately visualize some of the data in this table as a scatter plot. Let's view how ridership varies with the number of services offered at each station:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "scatter = hv.Scatter(station_info, kdims=['services'], vdims=['ridership'])\n", + "scatter" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we passed our dataframe to [``hv.Scatter``](../reference/elements/elements/bokeh/Scatter.ipynb) to create an *object* called `scatter`, which is independent of any plotting library. `scatter` is a simple wrapper around our dataframe that knows that the 'services' column is the independent variable, normally plotted along the x-axis, and that the 'ridership' column is a dependent variable, plotted on the y-axis. These are our *dimensions*, which we will describe in more detail a little later.\n", + "\n", + "Given that we have the handle ``scatter`` on our ``Scatter`` object, we can show that it is indeed an object and not a plot by printing it:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(scatter)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The bokeh plot above is simply the rich, visual representation of ``scatter`` which is plotted automatically by HoloViews and displayed automatically in the [Jupyter notebook](https://jupyter.org/). Although HoloViews itself is independent of notebooks, this convenience makes working with HoloViews easiest in the notebook environment." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Compositional ``Layouts``\n", + "\n", + "The class [``Scatter``](../reference/elements/elements/bokeh/Scatter.ipynb) is a subclass of ``Element``. As shown in our [element gallery], Elements are the simplest viewable components in HoloViews. Now that we have a handle on ``scatter``, we can demonstrate the compositionality of these objects:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout = scatter + hv.Histogram(np.histogram(station_info['opened'], bins=24), kdims=['opened'])\n", + "layout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In a single line using the ``+`` operator, we created a new, compositional object called a ``Layout`` built from our scatter visualization and a ``Histogram`` that shows how many subway stations opened in Manhattan since 1900. Note that once again, all the plotting is happening behind the scenes. 
The ``layout`` is not a plot, it's a new object that exists independently of any given plotting system:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(layout)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Array data: taxi dropoffs\n", + "\n", + "So far we have visualized data in a [pandas ``DataFrame``](http://pandas.pydata.org/) but ``HoloViews`` is as agnostic to data formats as it is to plotting libraries; see [XXXX] for more information. This means we can work with array data as easily as we can work with tabular data. To demonstrate this, here are some [numpy arrays](http://www.numpy.org/) relating to taxi dropoff locations in New York City:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "taxi_dropoffs = {hour:arr for hour, arr in np.load('../assets/hourly_taxi_data.npz').items()}\n", + "#print('Hours: {hours}'.format(hours=', '.join(taxi_dropoffs.keys())))\n", + "print('Taxi data contains {num} arrays (one per hour).\\nDescription of the first array:\\n'.format(num=len(taxi_dropoffs)))\n", + "np.info(taxi_dropoffs['0'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, this dataset contains 24 arrays (one for each hour of the day) of taxi dropoff locations (by latitude and longitude), aggregated over one month in 2015. The array shown above contains the accumulated dropoffs for the first hour of the day." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Compositional ``Overlays``\n", + "\n", + "Once again, we can easily visualize this data with HoloViews by passing our array to [``hv.Image``](../reference/elements/elements/bokeh/Image.ipynb) to create the ``image`` object. This object has the spatial extent of the data declared as the ``bounds``, in terms of the corresponding range of latitudes and longitudes." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "bounds = (-74.05, 40.70, -73.90, 40.80)\n", + "image = hv.Image(taxi_dropoffs['0'], bounds=bounds, kdims=['lon','lat'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloViews supports ``numpy``, ``xarray``, ``iris``, and ``dask`` arrays when working with array data (see [Gridded Datasets](../user_guide/Gridded_Datasets.ipynb)). We can also compose elements containing array data with those containing tabular data. To illustrate, let's pass our tabular station data to a [``Points``](../reference/elements/elements/bokeh/Points.ipynb) element which is used to mark positions in two-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "points = hv.Points(station_info, kdims=['lon','lat'])\n", + "image + image * points" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "On the left, we have the visual representation of the ``image`` object we declared. Using ``+`` we put it into a ``Layout`` together with a new compositional object created with the ``*`` operator called an ``Overlay``. 
This particular overlay displays the station positions on top of our image, which works correctly as both elements contain data that exist in the same space, namely New York City.\n", + "\n", + "This overlay on the right lets us see the location of all the subway stations in relation to our midnight taxi dropoffs. Of course, HoloViews allows you to visually express more of the available information with our points. For instance, you could represent the ridership of each subway station by point color or point size. For more information see [Customizing Plots](../user_guide/Customizing_Plots.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Effortlessly exploring data\n", + "\n", + "You can keep composing data structures together until there are more dimensions than can fit simultaneously on your screen. For instance, you can visualize a dictionary of [``Images``](../reference/elements/elements/bokeh/Image.ipynb) (one for every hour of the day) by declaring a ``HoloMap``: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dictionary = {int(hour):hv.Image(arr, bounds=bounds, kdims=['lon','lat']) \n", + " for hour, arr in taxi_dropoffs.items()}\n", + "hv.HoloMap(dictionary, kdims=['Hour'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is yet another object which is rendered by the HoloViews plotting system with Bokeh behind the scenes:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "holomap = hv.HoloMap(dictionary, kdims=['Hour'])\n", + "print(holomap)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As this ``HoloMap`` is a container for our ``Image`` elements, we can use the methods it offers to return new containers. For instance, in the next cell we select three different hours of the morning from the ``HoloMap`` and display them as a ``Layout``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "holomap.select(Hour={3,6,9}).layout()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here the ``select`` method picks values from the specified 'Hour' dimension. We have seen dimensions used in the ``kdims`` and ``vdims`` arguments when declaring our elements (``scatter``, ``histogram`` and ``image`` above) and when declaring the ``HoloMap`` of ``Image``s above. These are the *key dimensions* (indexing dimensions, or ``kdims``) and *value dimensions* (resulting data, or ``vdims``) used to express the space in which our data lives.\n", + "\n", + "Note how the ``Image`` elements from which the holomap is constructed are declared using ``kdims=['lon','lat']``, which describes the fact that New York City is being viewed in terms of longitude and latitude. This semantic information is automatically mapped to our visualization by the HoloViews plotting system, which sets the x-axis and y-axis labels accordingly. In the case of the ``HoloMap`` we used ``kdims=['Hour']`` to declare that the interactive slider ranges over the hours of the day." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Data as visualization\n", + "\n", + "Holomaps are able to compose with elements and other holomaps into overlays and layouts just as easily as you compose two elements together. 
Here is one such composition where we select a range of longitudes and latitudes from our [``Points``](../reference/elements/elements/bokeh/Points.ipynb) before we overlay them:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Image [xrotation=90] Points (color='deepskyblue' marker='v' size=6)\n", + "hotspot = points.select(lon=(-73.99, -73.96), lat=(40.75,40.765))\n", + "composition = holomap * hotspot\n", + "composition" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The line starting with ``%%opts``, used to specify the visual style, is part of the HoloViews options system described in the next 'Getting started' section, which also describes how to achieve the same effect with standard Python syntax.\n", + "\n", + "In the cell above we created and styled a composite object within a few short lines of code. Furthermore, this composite object relates tabular and array data and is immediately presented in a way that can be explored interactively. This way of working enables highly productive exploration, allowing new insights to be gained easily. For instance, after exploring with the slider we notice a hotspot of taxi dropoffs at 7am which we can select as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "composition.select(Hour=7)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now see that the slice of subway locations was chosen in relation to the hotspot in taxi dropoffs around 7am. This area of Manhattan just south of Central Park contains many popular tourist attractions, including Times Square, and we can infer that tourists often take short taxi rides from the subway stations into this area.\n", + "\n", + "At this point it may appear that HoloViews is about easily generating explorative, interactive visualizations *from* your data. In fact, as we have been building these visualizations we have actually been working *with* our data, as we can show by examining the ``.data`` attribute of our sliced subway locations:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hotspot.data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that slicing the HoloViews [``Points``](../reference/elements/elements/bokeh/Points.ipynb) object in the visualization sliced the underlying data, with the structure of the table left intact. We can see that the Times Square 42nd Street station is indeed one of the subway stations surrounding our taxi dropoff hotspot. This seamless interplay and exchange between the raw data and easy-to-generate visualizations of it is crucial to how HoloViews helps you understand your data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Onwards\n", + "\n", + "The next getting-started section shows how to do [Customization](2-Customization.ipynb) of the visual appearance of your data, allowing you to highlight the most important features and change the look and feel. 
Other related topics for deeper study:\n", + " \n", + "* The above plots did not require any special geographic-data support, but when working with larger areas of the Earth's surface (for which curvature becomes significant) or when overlaying data with geographic features, the separate [GeoViews](http://geo.holoviews.org) library provides convenient geo-specific extensions to HoloViews.\n", + "* The taxi array data was derived from a very large tabular dataset and rasterized using [datashader](http://https://github.com/bokeh/datashader), an optional add-on to HoloViews and Bokeh that makes it feasible to work with very large datasets in a web browser." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/getting_started/2-Customization.ipynb b/examples/getting_started/2-Customization.ipynb new file mode 100644 index 0000000000..4edc432a43 --- /dev/null +++ b/examples/getting_started/2-Customization.ipynb @@ -0,0 +1,272 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Customizing visual appearance" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "\n", + "HoloViews elements like the `Scatter` points illustrated in the [Introduction](1-Introduction.ipynb) contain two types of information:\n", + "\n", + "- **Your data**, in as close to its original form as possible, so that it can be analyzed and accessed as you see fit.\n", + "- **Metadata specifying what your data *is***, which allows HoloViews to construct a visual representation for it.\n", + "\n", + "What elements do *not* contain is:\n", + "\n", + "- The endless details that one might want to tweak about the visual representation, such as line widths, colors, fonts, and spacing.\n", + "\n", + "HoloViews is designed to let you work naturally with the meaningful features of your data, while making it simple to adjust the display details separately using the Options system. Among many other benefits, this separation of *content* from *presentation* simplifies your data analysis workflow, and makes it independent of any particular plotting backend." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualizing neural spike trains\n", + "\n", + "To illustrate how the options system works, we will use a dataset containing [\"spike\"](https://en.wikipedia.org/wiki/Action_potential) (neural firing) events extracted from the recorded electrical activity of a [neuron](https://en.wikipedia.org/wiki/Neuron). We will be visualizing the first trial of this [publicly accessible neural recording](http://www.neuralsignal.org/data/04/nsa2004.4/433l019). First, we import pandas and holoviews and load our data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import holoviews as hv\n", + "spike_train = pd.read_csv('../assets/spike_train.csv.gz')\n", + "spike_train.head(n=3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This dataset contains the spike times (in milliseconds) for each detected spike event in this five-second recording, along with a spiking frequency (in Hertz, averaging over a rolling 200 millisecond window). 
We will now declare ``Curve`` and ``Spike`` elements using this data and combine them into a ``Layout``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve = hv.Curve(spike_train, kdims=['milliseconds'], vdims=['Hertz'], group='Firing Rate')\n", + "spikes = hv.Spikes(spike_train.sample(300), kdims=['milliseconds'], vdims=[], group='Spike Train')\n", + "curve + spikes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the representation for this object is purely textual; so far we have not yet loaded any plotting system for HoloViews, and so all you can see is a description of the data stored in the elements. \n", + "\n", + "To be able to see a visual representation and adjust its appearance, we'll need to load a plotting system, and here let's load two so they can be compared:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.extension('bokeh', 'matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Even though we can happily create, analyze, and manipulate HoloViews objects without using any plotting backend, this line is normally executed just after importing HoloViews so that objects can have a rich graphical representation rather than the very-limited textual representation shown above. Putting 'bokeh' first in this list makes visualizations default to using [Bokeh](http://bokeh.pydata.org), but including [matplotlib](http://matplotlib.org) as well means that backend can be selected for any particular plot as shown below." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Default appearance\n", + "\n", + "With the extension loaded, let's look at the default appearance as rendered with Bokeh:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve + spikes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, we can immediately appreciate more about this dataset than we could from the textual representation. The curve plot, in particular, conveys clearly that the firing rate varies quite a bit over this 5-second interval. However, the spikes plot is much more difficult to interpret, because the plot is nearly solid black even though we already downsampled from 700 spikes to 300 spikes when we declared the element. \n", + "\n", + "One thing we can do is enable one of Bokeh's zoom tools and zoom in until individual spikes are clearly visible. Even then, though, it's difficult to relate the spiking and firing-rate representations to each other. Maybe we can do better by adjusting the display options away from their default settings?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Customization\n", + "\n", + "Let's see what we can achieve when we do decide to customize the appearance:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%output size=150\n", + "%%opts Spikes [height=100 width=600 yaxis=None] (color='grey' line_width=0.25)\n", + "%%opts Curve [height=100 width=600 xaxis=None show_grid=False tools=['hover']]\n", + "%%opts Curve (color='red' line_width=1.5)\n", + "curve = hv.Curve(spike_train, kdims=['milliseconds'], vdims=['Hertz'])\n", + "spikes = hv.Spikes(spike_train, kdims=['milliseconds'], vdims=[])\n", + "(curve+spikes).cols(1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Much better! It's the same underlying data, but now we can clearly see both the individual spike events and how they affect the moving average. You can also see how the moving average trails the actual spiking, due to how the window function was defined.\n", + "\n", + "A detailed breakdown of this exact customization is given in the [User Guide](../user_guide/03-Customizing_Plots.ipynb), but we can use this example to understand a number of important concepts:\n", + "\n", + "* The option system is based around keyword settings.\n", + "* You can customize the output format using the ``%%output`` *cell magic* and the element appearance with the ``%%opts`` *cell magic*.\n", + "* These *cell magics* affect the display output of the Jupyter cell where they are located. For use outside of the Jupyter notebook, consult the [User Guide](../user_guide/03-Customizing_Plots.ipynb) for equivalent Python-compatible syntax.\n", + "* The layout container has a ``cols`` method to specify the number of columns in the layout.\n", + "\n", + "While the ``%%output`` cell magic accepts a simple list of keywords, we see some special syntax used in the ``%%opts`` magic:\n", + "\n", + "* The element type is specified, followed by special groups of keywords.\n", + "* The keywords in square brackets ``[...]`` are ***plot options*** that instruct HoloViews how to build that type of plot.\n", + "* The keywords in parentheses ``(...)`` are ***style options*** that are passed directly to the plotting library when rendering that type of plot.\n", + "\n", + "The corresponding [User Guide](../user_guide/03-Customizing_Plots.ipynb) entry explains the keywords used in detail, but a quick summary is that we have elongated the ``Curve`` and ``Spikes`` elements and toggled various axes with the ***plot options***. We have also specified the color and line widths of the [Bokeh glyphs](http://bokeh.pydata.org/en/latest/docs/user_guide/plotting.html) with the ***style options***.\n", + "\n", + "As you can see, these tools allow significant customization of how our elements appear. HoloViews offers many other tools for setting options either locally or globally, including the ``%output`` and ``%opts`` *line magics*, the ``.opts`` method on all HoloViews objects and the ``hv.output`` and ``hv.opts`` utilities. All these tools, how they work and details of the opts syntax can be found in the [User Guide](../user_guide/03-Customizing_Plots.ipynb)." 
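As a brief illustration of those Python-level tools, here is a sketch of how the same plot/style split can be written without magics. The nested ``{'plot': ..., 'style': ...}`` spelling is an assumption about the ``.opts`` signature in HoloViews releases of this era; consult the [User Guide](../user_guide/03-Customizing_Plots.ipynb) for the exact syntax supported by your version.

```python
# Sketch only: mirror the %%opts settings above in plain Python.
# The [...] plot options map onto the 'plot' group and the (...) style
# options onto the 'style' group; the exact .opts signature varies between
# HoloViews versions, so treat this spelling as illustrative, not canonical.
styled = (curve + spikes).cols(1).opts({
    'Curve':  {'plot':  dict(height=100, width=600, xaxis=None),
               'style': dict(color='red', line_width=1.5)},
    'Spikes': {'plot':  dict(height=100, width=600, yaxis=None),
               'style': dict(color='grey', line_width=0.25)}})
styled
```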
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Switching to matplotlib\n", + "\n", + "Now let's switch our backend to [matplotlib](http://matplotlib.org/) to show the same elements as rendered with different customizations, in a different output format (SVG), with a completely different plotting library:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%output size=200 backend='matplotlib' fig='svg'\n", + "%%opts Layout [sublabel_format='' vspace=0.1]\n", + "%%opts Spikes [aspect=6 yaxis='bare'] (color='red' linewidth=0.25 )\n", + "%%opts Curve [aspect=6 xaxis=None show_grid=False] (color='blue' linewidth=2 linestyle='dashed')\n", + "(hv.Curve(spike_train, kdims=['milliseconds'], vdims=['Hertz'])\n", + " + hv.Spikes(spike_train, kdims=['milliseconds'], vdims=[])).cols(1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we use the same tools with a different plotting extension. Naturally, a few changes needed to be made:\n", + "\n", + "* A few of the plotting options are different because of differences in how the plotting backends work. For instance, matplotlib uses ``aspect`` instead of setting ``width`` and ``height``. In some cases, but not all, HoloViews can smooth over such differences to make it simpler to switch backends.\n", + "* The Bokeh hover tool is not supported by the matplotlib backend, as you might expect, nor are there any other interactive controls.\n", + "* Some style options have different names; for instance, the Bokeh ``line_width`` option is called ``linewidth`` in matplotlib.\n", + "* Containers like Layouts have plot options, but no style options, because they are processed by HoloViews itself. Here we adjust the gap betwen the plots using ``vspace``.\n", + "\n", + "Note that you can even write options that work across multiple backends, as HoloViews will ignore keywords that are not applicable to the current backend (as long as they are valid for *some* loaded backend). See the [User Guide](../user_guide/03-Customizing_Plots.ipynb) for more details." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Persistent styles\n", + "\n", + "Let's switch back to the default (Bokeh) plotting extension for this notebook and apply the ``.select`` operation illustrated in the Introduction, to the ``spikes`` object we made earlier:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%output size=150\n", + "spikes.select(milliseconds=(2000,4000))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note how HoloViews remembered the Bokeh-specific styles we previously applied to the `spikes` object! This feature allows us to style objects once and then keep that styling as we work, without having to repeat the styles every time we work with that object. You can learn more about the output line magic and the exact semantics of the opts magic in the [User Guide](../user_guide/03-Customizing_Plots.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setting axis labels\n", + "\n", + "If you look closely, the example above might worry you. First we defined our ``Spikes`` element with ``kdims=['milliseconds']``, which we then used as a keyword argument in ``select`` above. This is also the string used as the axis label. 
Does this mean we are limited to Python literals for axis labels, if we want to use the corresponding dimension with ``select``?\n", + "\n", + "Luckily, there is no limitation involved. Dimensions specified as string are often convenient, but behind the scenes, HoloViews always uses a much richer ``Dimensions`` object which you can pass to the ``kdims`` and ``vdims`` explicitly (see the [User Guide](../user_guide/01-Annotating_Data.ipynb) for more information). One of the things each ``Dimension`` object supports is a long, descriptive ``label``, which complements the short programmer-friendly name.\n", + "\n", + "We can set the dimension labels on our existing ``spikes`` object as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "spikes= spikes.redim.label(milliseconds='Time in milliseconds (10⁻³ seconds)')\n", + "curve = curve.redim.label(Hertz='Frequency (Hz)')\n", + "(curve + spikes).select(milliseconds=(2000,4000)).cols(1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, we can set long descriptive labels on our dimensions (including unicode) while still making use of the short dimension name in methods such as ``select``. \n", + "\n", + "Now that you know how to set up and customize basic visualizations, the next [Getting-Started sections](../3-Tabular_Datasets.ipynb) show how to work with various common types of data in HoloViews." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/getting_started/3-Tabular_Datasets.ipynb b/examples/getting_started/3-Tabular_Datasets.ipynb new file mode 100644 index 0000000000..5c3ca5b481 --- /dev/null +++ b/examples/getting_started/3-Tabular_Datasets.ipynb @@ -0,0 +1,227 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Tabular Datasets" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we have already discovered, Elements are simple wrappers around your data that provide a semantically meaningful representation. HoloViews can work with a wide variety of data types, but many of them can be categorized as either:\n", + "\n", + " * **Tabular:** Tables of flat columns, or\n", + " * **Gridded:** Array-like data on 2-dimensional or N-dimensional grids\n", + " \n", + "These two general data types are explained in detail in the [Tabular Data](../user_guide/07-Tabular_Datasets.ipynb) and [Gridded Data](../user_guide/08-Gridded_Datasets.ipynb) user guides, including all the many supported formats (including Python dictionaries of NumPy arrays, pandas ``DataFrames``, dask ``DataFrames``, and xarray ``DataArrays`` and ``Datasets``). \n", + "\n", + "In this Getting-Started guide we provide a quick overview and introduction to two of the most flexible and powerful formats: columnar **pandas** DataFrames (in this section), and gridded **xarray** Datasets (in the next section)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tabular\n", + "\n", + "Tabular data (also called columnar data) is one of the most common, general, and versatile data formats, corresponding to how data is laid out in a spreadsheet. There are many different ways to put data into a tabular format, but for interactive analysis having [**tidy data**](http://www.jeannicholashould.com/tidy-data-in-python.html) provides flexibility and simplicity. 
For tidy data, the **columns** of the table represent **variables** or **dimensions** and the **rows** represent **observations**. The best way to understand this format is to look at such a dataset:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('bokeh', 'matplotlib')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "diseases = pd.read_csv('../assets/diseases.csv.gz')\n", + "diseases.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This particular dataset was the subject of an excellent piece of visual journalism in the [Wall Street Journal](http://graphics.wsj.com/infectious-diseases-and-vaccines/#b02g20t20w15). The WSJ data details the incidence of various diseases over time, and was downloaded from the [University of Pittsburgh's Project Tycho](http://www.tycho.pitt.edu/). We can see we have 5 data columns, which each correspond either to independent variables that specify a particular measurement ('Year', 'Week', 'State'), or observed/dependent variables reporting what was then actually measured (the 'measles' or 'pertussis' incidence). \n", + "\n", + "Knowing the distinction between those two types of variables is crucial for doing visualizations, but unfortunately the tabular format does not declare this information. Plotting 'Week' against 'State' would not be meaningful, whereas 'measles' for each 'State' (averaging or summing across the other dimensions) would be fine, and there's no way to deduce those constraints from the tabular format. Accordingly, we will first make a HoloViews object called a ``Dataset`` that declares the independent variables (called key dimeansions or **kdims** in HoloViews) and dependent variables (called value dimensions or **vdims**) that you want to work with:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "vdims = [('measles', 'Measles Incidence'), ('pertussis', 'Pertussis Incidence')]\n", + "ds = hv.Dataset(diseases, kdims=['Year', 'State'], vdims=vdims)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we've used an optional tuple-based syntax **``(name,label)``** to specify a more meaningful description for the ``vdims``, while using the original short descriptions for the ``kdims``. We haven't yet specified what to do with the ``Week`` dimension, but we are only interested in yearly averages, so let's just tell HoloViews to average over all remaining dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ds = ds.aggregate(function=np.mean)\n", + "ds" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "(We'll cover aggregations like ``np.mean`` in detail later, but here the important bit is simply that the ``Week`` dimension can now be ignored.)\n", + "\n", + "The ``repr`` shows us both the ``kdims`` (in square brackets) and the ``vdims`` (in parentheses) of the ``Dataset``. Because it can hold arbitrary combinations of dimensions, a ``Dataset`` is *not* immediately visualizable. 
There's no single clear mapping from these four dimensions onto a two-dimensional page, hence the textual representation shown above.\n", + "\n", + "To make this data visualizable, we'll need to provide a bit more metadata, by selecting one of the large library of [Elements](../reference/elements/) that can help answer the questions we want to ask about the data. Perhaps the most obvious representation of this dataset is as a Curve displaying the incidence for each year, for each state. We could pull out individual columns one by one from the original dataset, but now that we have declared information about the dimensions, the cleanest approach is to map the dimensions of our ``Dataset`` onto the dimensions of an Element using ``.to``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve [width=600 height=250] {+framewise}\n", + "(ds.to(hv.Curve, 'Year', 'measles') + ds.to(hv.Curve, 'Year', 'pertussis')).cols(1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we specified two ``Curve`` elements showing measles and pertussis incidence respectively (the vdims), per year (the kdim), and laid them out in a vertical column. You'll notice that even though we specified only the short name for the value dimensions, the plot shows the longer names (\"Measles Incidence\", \"Pertussis Incidence\") that we declared on the ``Dataset``.\n", + "\n", + "You'll also notice that we automatically received a dropdown menu to select which ``State`` to view. Each ``Curve`` ignores unused value dimensions, because additional measurements don't affect each other, but HoloViews has to do *something* with every key dimension for every such plot. If the ``State`` (or any other key dimension) isn't somehow plotted or aggregated over, then HoloViews has to leave choosing a value for it to the user, hence the selection widget. Other options for what to do with extra dimensions or just extra data ranges are illustrated below." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Selecting\n", + "\n", + "One of the most common thing we might want to do is to select only a subset of the data. The ``select`` method makes this extremely easy, letting you select a single value, a list of values supplied as a list, or a range of values supplied as a tuple. Here we will use ``select`` to display the measles incidence in four states over one decade. After applying the selection, we use the ``.to`` method as shown earlier, now displaying the data as ``Bars`` indexed by 'Year' and 'State' key dimensions and displaying the 'Measles Incidence' value dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Bars [width=800 height=400 tools=['hover'] group_index=1 legend_position='top_left']\n", + "states = ['New York', 'New Jersey', 'California', 'Texas']\n", + "ds.select(State=states, Year=(1980, 1990)).to(hv.Bars, ['Year', 'State'], 'measles').sort()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Faceting\n", + "\n", + "Above we already saw what happens to key dimensions that we didn't explicitly assign to the Element using the ``.to`` method: they are grouped over, popping up a set of widgets so the user can select the values to show at any one time. 
However, using widgets is not always the most effective way to view the data, and a ``Dataset`` lets you specify other alternatives using the ``.overlay``, ``.grid`` and ``.layout`` methods. For instance, we can lay out each state separately using ``.grid``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve [width=200] (color='indianred')\n", + "grouped = ds.select(State=states, Year=(1930, 2005)).to(hv.Curve, 'Year', 'measles')\n", + "grouped.grid('State')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Or we can take the same grouped object and ``.overlay`` the individual curves instead of laying them out in a grid:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve [width=600] (color=Cycle(values=['indianred', 'slateblue', 'lightseagreen', 'coral']))\n", + "grouped.overlay('State')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These faceting methods even compose together, meaning that if we had more key dimensions we could ``.overlay`` one dimension, ``.grid`` another and have a widget for any other remaining key dimensions." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Aggregating\n", + "\n", + "Instead of selecting a subset of the data, another common operation supported by HoloViews is computing aggregates. When we first loaded this dataset, we aggregated over the 'Week' column to compute the mean incidence for every year, thereby reducing our data significantly. The ``aggregate`` method is therefore very useful to compute statistics from our data.\n", + "\n", + "A simple example using our dataset is to compute the mean and standard deviation of the Measles Incidence by ``'Year'``. We can express this simply by passing the key dimensions to aggregate over (in this case just the 'Year') along with a function and optional ``spreadfn`` to compute the statistics we want. The ``spread_fn`` will append the name of the function to the dimension name so we can reference the computed value separately. Once we have computed the aggregate, we can simply cast it to a ``Curve`` and ``ErrorBars``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve [width=600]\n", + "agg = ds.aggregate('Year', function=np.mean, spreadfn=np.std)\n", + "(hv.Curve(agg) * hv.ErrorBars(agg,vdims=['measles', 'measles_std'])).redim.range(measles=(0, None))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this way we can summarize a multi-dimensional dataset as something that can be visualized directly, while allowing us to compute arbitrary statistics along a dimension.\n", + "\n", + "## Other data\n", + "\n", + "If you want to know more about working with tabular data, particularly when using datatypes other than pandas, have a look at the [user guide](../user_guide/07-Tabular_Datasets.ipynb). The different interfaces allow you to work with everything from simple NumPy arrays to out-of-core dataframes using dask. Dask dataframes scale to visualizations of billions of rows, when using [datashader](https://anaconda.org/jbednar/holoviews_datashader/notebook) with HoloViews to aggregate the data as needed." 
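To make the point about other datatypes concrete, here is a small sketch (with made-up column values) showing that the same ``Dataset`` constructor also accepts a plain dictionary of NumPy arrays, one entry per column, in place of a pandas ``DataFrame``:

```python
# Illustrative only: a tiny tidy table built from NumPy arrays rather than a
# DataFrame. The values here are invented purely to demonstrate the interface.
years = np.arange(1930, 1940)
toy_table = {'Year': years, 'measles': np.random.rand(len(years)) * 100}
hv.Dataset(toy_table, kdims=['Year'], vdims=['measles'])
```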
+ ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/getting_started/4-Gridded_Datasets.ipynb b/examples/getting_started/4-Gridded_Datasets.ipynb new file mode 100644 index 0000000000..ad265d083a --- /dev/null +++ b/examples/getting_started/4-Gridded_Datasets.ipynb @@ -0,0 +1,273 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Gridded Datasets" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "hv.extension('bokeh', 'matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the [previous guide](3-Tabular_Datasets.ipynb) we discovered how to work with tabular datasets. Although tabular datasets are extremely common, many other datasets are best represented by regularly sampled n-dimensional arrays (such as images, volumetric data, or higher dimensional parameter spaces). On a 2D screen and using traditional plotting libraries, it is often difficult to visualize such parameter spaces quickly and succinctly, but HoloViews lets you quickly slice and dice such a dataset to explore the data and answer questions about it easily." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Gridded\n", + "\n", + "Gridded datasets usually represent observations of some continuous variable across multiple dimensions---a monochrome image representing luminance values across a 2D surface, volumetric 3D data, an RGB image sequence over time, or any other multi-dimensional parameter space. This type of data is particularly common in research areas that make use of spatial imaging or modeling, such as climatology, biology, and astronomy but can also be used to represent any arbitrary data that varies over multiple dimensions.\n", + "\n", + "In HoloViews terminology the dimensions the data varies over are the so called key dimensions (**kdims**), which define the coordinates of the underlying array. The actual value arrays are described by the value dimensions (**vdims**). Libraries like ``xarray`` or ``iris`` allow you to store the coordinates with the array, but here we will declare the coordinate arrays ourselves so we can get a better understanding of how the gridded data interfaces work. We will therefore start by loading a very simple 3D array:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "data = np.load('../assets/twophoton.npz')\n", + "calcium_array = data['Calcium']\n", + "calcium_array.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This particular NumPy dataset contains data from a 2-photon calcium imaging experiment, which provides an indirect measure of neural activity encoded via changes in fluorescent light intensity. The 3D array represents the activity of a 2D imaging plane over time, forming a sequence of images with a shape of (62, 111) over 50 time steps. Just as we did in the [Tabular Dataset](../4-Tabular_Datasets.ipynb) getting-started guide we start by wrapping our data in a HoloViews ``Dataset``. However, for HoloViews to understand the raw NumPy array we need to pass coordinates for each of the dimensions (or axes) of the data. 
For simplicity, here we will simply use integer coordinates for the ``'Time'``, ``'x'`` and ``'y'`` dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ds = hv.Dataset((np.arange(50), np.arange(111), np.arange(62), calcium_array),\n", + " kdims=['Time', 'x', 'y'], vdims=['Fluorescence'])\n", + "ds" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we should be used to by now the ``Dataset`` repr shows us the dimensions of the data. If we inspect the ``.data`` attribute we can see that by default HoloViews will store this data as a simple dictionary of our key dimension coordinates and value dimension arrays:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "type(ds.data), list(ds.data.keys())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Other datatypes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Instead of defining the coordinates manually, we recommend using [xarray](http://xarray.pydata.org/en/stable/), which will flexibly work with labeled n-dimensional arrays. We can even make a clone of our dataset and set the datatype to xarray to convert to an ``xarray.Dataset``, which is the recommended format for gridded data in HoloViews:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ds.clone(datatype=['xarray']).data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To see more details on working with different datatypes have a look at the [user guide](../user_guide/08-Gridded_Datasets.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Viewing the data\n", + "\n", + "Perhaps the most natural representation of this dataset is as an Image displaying the fluorescence at each point in time. Using the ``.to`` interface, we can map the dimensions of our ``Dataset`` onto the dimensions of an Element. To display an image, we will pick the ``Image`` element and specify the ``'x'`` and ``'y'`` as the key dimensions. Since we only have one value dimension, we won't have to declare it explicitly:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Image (cmap='viridis')\n", + "ds.to(hv.Image, ['x', 'y']).hist()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The slider widget allows you to scrub through the images for each time, and you can also play the frames as an animation in forward or reverse by pressing the ``P`` and ``R`` keys (respectively) after clicking on the slider. \n", + "\n", + "Once you have selected an individual plot, you can interact with it by zooming (which does not happen to give additional detail with this particular downsampled dataset), or by selecting the ``Box select`` tool in the plot toolbar and drawing a Fluorescence range on the Histogram to control the color mapping range. \n", + "\n", + "When using ``.to`` or ``.groupby`` on larger datasets with many key dimensions or many distinct key-dimension values, you can use the ``dynamic=True`` flag, letting you explore the parameter space dynamically (for more detail have a look at the [Live Data](../5-Live_Data.ipynb) and [Pipeline] sections)." 
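As a minimal sketch of that ``dynamic=True`` flag (reusing the same ``ds`` declared above), the ``.to`` call below returns a ``DynamicMap`` that renders each time step on demand rather than precomputing and embedding all 50 frames:

```python
# Same conversion as above, but lazy: frames are computed as the slider moves.
ds.to(hv.Image, ['x', 'y'], dynamic=True)
```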
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Selecting\n", + "\n", + "Often when working with multi-dimensional datasets, we are only interested in small regions of the parameter space. For instance, when working with neural imaging data like this, it is very common to focus on regions of interest (ROIs) within the larger image. Here we will fetch some bounding boxes from the data we loaded earlier. ROIs are often more complex polygons but for simplicity's sake we will use simple rectangular ROIs specified as the left, bottom, right and top coordinate of a bounding box:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ROIs = data['ROIs']\n", + "roi_bounds = hv.NdOverlay({i: hv.Bounds(tuple(roi)) for i, roi in enumerate(ROIs)})\n", + "print(ROIs.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we have 147 ROIs representing bounding boxes around 147 identified neurons in our data. To display them we have wrapped the data in ``Bounds`` elements, which we can overlay on top of our animation. Additionally we will create some ``Text`` elements to label each ROI. Finally we will use the regular Python indexing semantics to select along the Time dimension, which is the first key dimension and can therefore simply be specified like ``ds[21]``. Just like the ``select`` method, indexing like this indexes and slices by value, not the index (which are one and the same here):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Image [width=400 height=400 xaxis=None yaxis=None] \n", + "%%opts Bounds (color='white') Text (text_color='white' text_font_size='8pt')\n", + "\n", + "opts = dict(halign='left', valign='bottom')\n", + "roi_text = hv.NdOverlay({i: hv.Text(roi[0], roi[1], str(i), **opts) for i, roi in enumerate(ROIs)})\n", + "(ds[21].to(hv.Image, ['x', 'y']) * roi_bounds * roi_text).relabel('Time: 21')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can use these bounding boxes to select some data, since they simply represent coordinates. Looking at ROI #60 for example, we can see the neuron activate quite strongly in the middle of our animation. Using the ``select`` method, we can select the x and y-coordinates of our ROI and the rough time period when we saw the neuron respond:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "x0, y0, x1, y1 = ROIs[60]\n", + "roi = ds.select(x=(x0, x1), y=(y0, y1), time=(250, 280)).relabel('ROI #60')\n", + "roi.to(hv.Image, ['x', 'y'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Faceting" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Even though we have selected a very small region of the data, there is still quite a lot of data there. We can use the ``faceting`` methods to display the data in different ways. Since we have only a few pixels in our dataset now, we can for example plot how the fluorescence changes at each pixel in our ROI over time. We simply use the ``.to`` interface to display the data as ``Curve`` types, with time as the key dimension. If you recall from [Tabular Data](3-Tabular_Data.ipynb), the ``.to`` method will group by any remaining key dimensions (in this case ``'x'`` and ``'y'``) to display sliders. 
Here we will instead facet the ``Curve`` elements using ``.grid``, allowing us to see the evolution of the fluorescence signal over time and space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts GridSpace [shared_xaxis=True shared_yaxis=True]\n", + "roi.to(hv.Curve, 'Time').grid()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The above cell and the previous cell show the same data, but visualized in very different ways depending on how the data was mapped onto the screen." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Aggregating\n", + "\n", + "Instead of generating a Curve for each pixel individually, we may instead want to average the data across x and y to get an aggregated estimate of that neuron's activity. For that purpose we can use the aggregate method to get the average signal within the ROI window. Using the ``spreadfn`` we can also compute the standard deviation between pixels, which helps us understand how variable the signal is across that window (to let us know what we have covered up when aggregating). We will display the mean and standard deviation data as a overlay of a ``Spread`` and ``Curve`` Element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Overlay [show_legend=False width=600]\n", + "agg = roi.aggregate('Time', np.mean, spreadfn=np.std)\n", + "hv.Spread(agg) * hv.Curve(agg)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Of course, we could combine all of these approaches and aggregate each ROI, faceting the entire dataset by ROI to show how the activity of the various neurons differs.\n", + "\n", + "As you can see, HoloViews makes it simple for you to select and display data from a large gridded dataset, allowing you to focus on whatever aspects of the data are important to answer a given question. The final getting-started section covers how you can provide [Live Data](05-Live_Data.ipynb) visualizations to let users dynamically choose what to display interactively." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/getting_started/5-Live_Data.ipynb b/examples/getting_started/5-Live_Data.ipynb new file mode 100644 index 0000000000..24971ee135 --- /dev/null +++ b/examples/getting_started/5-Live_Data.ipynb @@ -0,0 +1,269 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Live Data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The 'Getting Started' guide has up until this point demonstrated how HoloViews objects can wrap your data and be given a rich, useful representation. All of the visualizations assumed that the data was already available in memory so that it could be used to construct the appropriate object, and all of the resulting visualizations can be viewed in static HTML pages, no longer requiring Python when users interact with them.\n", + "\n", + "In many important scenarios, the assumption that the data is immediately available in memory does not hold. The data of interest may exist on some remote server, making it unavailable locally until it is fetched. In other situations, the data may exist on the local disk, but be too large to fit into memory. 
Perhaps the data doesn't even exist yet: it may be the result of a computation yet to be performed, or the outcome of some live process whose measurements have not yet been made.\n", + "\n", + "All these scenarios are examples of *live data* that can be made available to HoloViews using the appropriate Python process. In this section, we will see how HoloViews allows you to build visualizations that update dynamically as new data becomes available and that can respond to live user interaction.\n", + "\n", + "

Note: To work with live data, you need a live Python server, not a static web site, which is why the outputs shown below are GIF animations. If you run this notebook yourself, you will be able to try out your own interactions and compare them to the displayed GIF animations.

" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A computational process\n", + "\n", + "Let us start by importing NumPy and HoloViews and setting some suitable defaults for the ``Curve`` element we will be using (disabling axes and grid lines):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "import numpy as np\n", + "hv.extension('bokeh')\n", + "%opts Curve [show_grid=False xaxis=None yaxis=None]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are many possible examples of live data, including financial data feeds, real-time scientific measurements, and sophisticated numerical simulations. Here we will consider the path traced by two very simple equations:\n", + "\n", + "$$x_{n+1} = \\sin(ay_n) + c \\cos(ax_n)$$\n", + "$$y_{n+1} = \\sin(bx_n) + d \\cos(by_n)$$\n", + "\n", + "These equations define the 'Clifford Attractor' described in the book \"Chaos In Wonderland\" by [Cliff Pickover](https://en.wikipedia.org/wiki/Clifford_A._Pickover). Now let's write a simple Python function to iterate these two equations starting from position ``(x0,y0)``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def clifford(a,b,c,d,x0,y0):\n", + " xn,yn = x0,y0\n", + " coords = [(x0,y0)]\n", + " for i in range(10000):\n", + " x_n1 = np.sin(a*yn) + c*np.cos(a*xn)\n", + " y_n1 = np.sin(b*xn) + d*np.cos(b*yn)\n", + " xn,yn = x_n1,y_n1\n", + " coords.append((xn,yn))\n", + " return coords" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If we run this function now, we'll get a list of 10000 tuples, which won't be very informative.\n", + "\n", + "The ``Curve`` element accepts the output of our ``clifford`` function, making it trivial to define a function that when called gives us a visualization:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def clifford_attractor(a,b,c,d):\n", + " return hv.Curve(clifford(a,b,c,d,x0=0,y0=0))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can then view the output for some combination of values for ``a,b,c,d``, starting from the origin:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve (line_width=0.03 color='red')\n", + "clifford_attractor(a =-1.5, b=1.5, c=1, d=0.75 )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This HoloViews element gives us a snapshot for the four chosen values, but what we really would like to do is to interact with the four-dimensional parameter space directly, even though that parameter space is too large to compute all possible combinations feasibly." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Live parameter exploration\n", + "\n", + "To dynamically explore these parameters, we can start by declaring a ``DynamicMap``, passing in our function instead of the dictionary of ``Image`` elements we saw in the [Introduction](1-Introduction.ipynb). 
We declare the four arguments of our function as ``kdims``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap = hv.DynamicMap(clifford_attractor, kdims=['a','b','c','d'])\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see from the error message, HoloViews does not yet have the information needed to give us a visualization--it has no way to guess any value to use for the 'a','b','c', and 'd' dimensions. Since we know what suitable values look like, we can easily specify appropriate ranges using the ``redim`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve (line_width=0.03 color='green')\n", + "# When run live, this cell's output should match the behavior of the GIF below\n", + "dmap.redim.range(a=(-1.5,-1),b=(1.5,2),c=(1,1.2),d=(0.75,0.8))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These ranges supplied with ``redim.range`` are semantic specifications of suitable values for each of the parameters and they are used to define suitable ranges for the interactive sliders above. Note how the HoloViews options system described in the [Customization section](2-Customization.ipynb) continues to work with the ``DynamicMap``." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Live interaction\n", + "\n", + "The live visualizations above are indistinguishable from standard HoloViews visualization, apart from the speed and memory usage. With a live Python server and the Bokeh backend, HoloViews can also be used for building highly customized forms of live interactivity using ``DynamicMap`` and the *streams system*. A HoloViews stream is simply a parameter of a corresponding stream class configured to track some variable that reflects a user interaction. For instance, let's write a function that accepts an initial ``x`` and ``y`` value and computes a more complex version of the above plot, showing the ``x``,``y`` point as a dot along with a line segment indicating the first step taken when computing the attractor and some text indicating the starting point:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def interactive_clifford(a,b,c,d,x=0,y=0):\n", + " coords = clifford(a,b,c,d,x0=x,y0=y)\n", + " return (hv.Curve(coords) * hv.Points(coords[0]) * hv.Curve(coords[:2], group='Init')\n", + " * hv.Text(-0.75,1.35, 'x:{x:.2f} y:{y:.2f}'.format(x=coords[0][0],y=coords[0][1])))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All we have done is create an ``Overlay`` as described in the [Introduction](1-Introduction.ipynb) containing our Clifford attractor curve and a few other HoloViews elements parameterized accordingly, including ``Points`` and the ``Text`` annotation. Now by passing this function to ``DynamicMap`` and also passing in a `PointerXY` stream that grabs the x,y locations of the mouse (in data space), we have an explorable visualization you can interact with directly. 
The plot now shows the ttractor (in blue) and the starting point and first step (in red), with the starting point following the mouse position:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.streams import PointerXY" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve (line_width=0.03 color='blue') Points (color='red' size=10) Curve.Init (color='red' line_width=2)\n", + "# When run live, this cell's output should match the behavior of the GIF below\n", + "dmap = hv.DynamicMap(interactive_clifford, kdims=['a','b','c','d'], streams=[PointerXY(x=0,y=0)])\n", + "dmap.redim.range(a=(-1.4,-1),b=(1.6,1.8),c=(1,1.5),d=(0.7,0.8))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "By exploring with the mouse, see if you can find the fixed-point location (where the next step maps you to the same position) located at ``x=0.18,y=0.65`` with parameters ``a=1.4, b=1.6, c=1`` and ``d=0.7``.\n", + "\n", + "To learn more about the streams system please consult the [user guide](../user_guide/06-Live_Data.ipynb) and check out our [Streams gallery](../reference/index.html#streams)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tradeoffs using live data\n", + "\n", + "``DynamicMap`` and ``Streams`` allow specification of exciting, dynamic visualizations, allowing you to build full-featured interactive applications and simulations with very little code (particularly when combined with a declarative widget library like [ParamNB](https://github.com/ioam/paramnb) or [ParamBokeh](https://github.com/ioam/parambokeh)). The way these dynamic visualizations work is that HoloViews runs JavaScript in your browser, which then communicates with a running Python server process that may be running in the Jupyter notebook server or in the [Bokeh server](http://bokeh.pydata.org/en/latest/docs/user_guide/server.html). This Python process may be running locally on your machine or on some remote internet or local-network server. Regardless of where it is running, this Python process executes the callback you supply to ``DynamicMap``, allowing HoloViews to update your visualization whenever the parameters change.\n", + "\n", + "This architecture is powerful and fully general, as you can always make static content in memory into dynamic output generated by a function (see the [User Guide](../user_guide/06-Live_Data.ipynb) to learn more). 
Using live data is not always recommended, however, because using purely static content also has some important advantages:\n", + "\n", + "### Reasons to use live data\n", + "\n", + "* Your data is inherently coming from a live source and your visualization needs to reflect this in real time.\n", + "* You wish to explore a large parameter space and statically sampling this space adequately is prohibitive in memory or computation time.\n", + "* Your data is too big to fit in memory and you only need to explore a portion of it that you can stream in from disk.\n", + "* You want an open-ended visualization that keeps updating indefinitely.\n", + "\n", + "### Reasons to use static data\n", + "\n", + "* You wish to archive or record your visualization in such a way that they exist independently of code execution in a potentially changing codebase.\n", + "* You wish to share visualizations in a static HTML file that does not require running a live server (e.g a file that can be e-mailed and immediately viewed or placed on an HTML server).\n", + "\n", + "The general recommendation is to visualize your data with ``HoloMap`` (as in the introduction to this guide) when you have a small amount of data (typically a few megabytes) that can be quickly computed and can reasonably be embedded into an HTML file. Otherwise, you can use ``DynamicMap`` that you can sample from to generate a ``HoloMap`` from when you wish to share your results (see the [user guide](../user_guide/06-Live_Data.ipynb) for more information on how to turn your ``DynamicMap`` objects into ``HoloMap``s).\n", + "\n", + "Now that you have explored the basic capabilities of HoloViews, you should try it out on your own data, guided by the [user guide](../user_guide/) and following examples in the component [reference gallery](../reference/) and other demos in the [gallery](../gallery/)." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/bokeh/GridSpace.ipynb b/examples/reference/containers/bokeh/GridSpace.ipynb new file mode 100644 index 0000000000..3ac74b832d --- /dev/null +++ b/examples/reference/containers/bokeh/GridSpace.ipynb @@ -0,0 +1,116 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
GridSpace Container
\n", + "
Dependencies
Bokeh
\n", + "
Backends
[Bokeh](./GridSpace.ipynb)
[Matplotlib](../matplotlib/GridSpace.ipynb)
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A ``GridSpace`` is a two-dimensional dictionary of HoloViews objects presented onscreen as a grid. In one sense, due to the restriction on it's dimensionality, a ``GridSpace`` may be considered a special-case of [``HoloMap``](./HoloMap.ipynb). In another sense, ``GridSpace`` may be seen as more general as a ``GridSpace`` can hold a ``HoloMap`` but the converse is not permitted; see the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``GridSpace`` holds two-dimensional dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``sine_curve`` function below, we can declare a two-dimensional dictionary of ``Curve`` elements, where the keys are 2-tuples corresponding to (phase, frequency) values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "phases = [0, np.pi/2, np.pi, 3*np.pi/2]\n", + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in phases for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now pass this dictionary of curves to ``GridSpace``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "gridspace = hv.GridSpace(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "gridspace" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``GridSpace`` is similar to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than the difference in the visual semantics, whereby ``GridSpaces`` display their contents together in a two-dimensional grid, ``GridSpaces`` are very similar to ``HoloMap``s (see the [``HoloMap``](./HoloMap.ipynb) notebook for more information).\n", + "\n", + "One way to demonstrate the similarity of these two containers is to cast our ``gridspace`` object to ``HoloMap`` and back to a ``GridSpace``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%output size=50\n", + "hmap = hv.HoloMap(gridspace)\n", + "hmap + hv.GridSpace(hmap)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/bokeh/HoloMap.ipynb b/examples/reference/containers/bokeh/HoloMap.ipynb new file mode 100644 index 0000000000..4f2175faa3 --- /dev/null +++ b/examples/reference/containers/bokeh/HoloMap.ipynb @@ -0,0 +1,234 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
HoloMap Container
\n", + "
Dependencies
Bokeh
\n", + "
Backends
Bokeh
Matplotlib
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A HoloMap is an explorable multi-dimensional dictionary of HoloViews objects. A ``HoloMap`` cannot contain ``Layouts``, ``NdLayouts``, ``GridSpaces`` or other ``HoloMaps`` or ``DyamicMap`` but can contain any other HoloViews object. See the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` holds dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As a ``HoloMap`` is a dictionary of elements, let us now create a dictionary of sine curves:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "\n", + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "curve_dict = {f:sine_curve(0,f) for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have a dictionary where the frequency is the key and the corresponding curve element is the value. We can now turn this dictionary into a ``HoloMap`` by declaring the keys as corresponding to the frequency key dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(curve_dict, kdims=['frequency'])\n", + "hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` is multi-dimensional" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By using tuple keys and making sure each position in the tuple is assigned a corresponding ``kdim``, ``HoloMaps`` allow exploration of a multi-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "phases = [0, np.pi/2, np.pi, 3*np.pi/2]\n", + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in phases for f in frequencies}\n", + "hmap = hv.HoloMap(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` supports dictionary-like behavior" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloMaps support a number of features similar to regular dictionaries, including **assignment**:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(kdims=['phase', 'frequency'])\n", + "for (phase, freq) in [(0,0.5), (0.5,0.5), (0.5,1), (0,1)]:\n", + " hmap[(phase, freq)] = sine_curve(phase,freq)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Key membership predicate**:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "(0, 0.5) in hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**The ``get`` method:**:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap.get((0,0.5))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` supports 
multi-dimensional indexing and slicing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One difference with regular dictionaries, is that ``HoloMaps`` support multi-dimensional indexing:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap[0,1] + hmap[0,:]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "See the [User Guide] for more information on selecting, slicing and indexing." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` is ordered" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One difference with regular Python dictionaries is that they are *ordered*, which can be observed by inspecting the ``.data`` attribute:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap.data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that internally, ``HoloMaps`` uses [``OrderedDict``](https://docs.python.org/3.6/library/collections.html#collections.OrderedDict) where the keys are sorted by default. You can set ``sort=False`` and then either supply an ordered list of (key, value) tuples, an ``OrderedDict`` or insert items in a chosen order.\n", + "\n", + "That said, there is generally very-little reason to ever use ``sort=False`` as regular Python dictionaries do not have a well-defined key ordering and ``HoloViews`` sliders work regardless of the ordering used. The only reason to set the ordering is if you wish to iterate over a ``HoloMap`` using the ``items``, ``keys``, ``values`` methods or use the iterator interface." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/bokeh/Layout.ipynb b/examples/reference/containers/bokeh/Layout.ipynb new file mode 100644 index 0000000000..764510e4da --- /dev/null +++ b/examples/reference/containers/bokeh/Layout.ipynb @@ -0,0 +1,218 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Layout Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Bokeh
Matplotlib
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A Layout is a collection of HoloViews objects that are related in some way, to be displayed side-by-side. Like ``Overlay`` and unlike other containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdLayout``](./NdLayout.ipynb) a ``Layout`` is *not* dictionary like: it holds potentially heterogeneous types without any dimensioned keys.\n", + "\n", + "\n", + "A ``Layout`` cannot contain ``NdLayouts`` but can otherwise contain *any* other HoloViews object. See [Building Composite Objects](05-Building_Composite_Objects.ipynb) for more details on how to compose containers. It is best to learn about ``Layout`` and [``Overlay``](./Overlay.ipynb) together as they are very closely related objects that share many core concepts." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Layout`` is a heterogeneous collection" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build a ``Layout`` between any two HoloViews objects (which can have different types) using the ``+`` operator:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve = hv.Curve((xvals, [np.sin(x) for x in xvals]))\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)))\n", + "curve + scatter" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this example, we have a ``Layout`` composed of a ``Curve`` element and a ``Scatter`` element. The one restriction on what you can put in a ``Layout`` is that you cannot combine an [``NdLayout``](./NdLayout.ipynb) with a regular ``Layout``." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Building ``Layout`` from a list" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build Layout's of any size by passing a list of objects to the ``Layout`` constructor as shown below:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_list = [ hv.Curve((xvals, [np.sin(f*x) for x in xvals])) for f in [0.5, 0.75]]\n", + "scatter_list = [hv.Scatter((xvals[::5], f*np.linspace(0,1,20))) for f in [-0.5, 0.5]]\n", + "layout = hv.Layout(curve_list + scatter_list).cols(2)\n", + "layout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note the use of the ``.cols`` method to specify the number of columns, wrapping to the next row in scanline order (top-to-bottom and left-to-right)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A ``Layout`` has two-level attribute access" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Layout`` and ``Overlay`` are fundamentally tree-structures holding arbitrary heterogenous HoloViews objects and are quite different from the dictionary-like dimensioned containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdLayout``](./NdLayout.ipynb).\n", + "\n", + "All HoloViews objects have string ``group`` and ``label`` parameters, resulting in a 2-level attribute access on ``Layout`` objects. 
First let us see how to index the above example where ``group`` and ``label`` was not specified for any of the elements:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout2 = layout.Curve.I + layout.Scatter.II\n", + "layout2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we create a second layout by indexing two elements from our earlier ``layout`` object. We see that the first level is the ``group`` string (which defaults to the element class name) followed by the label, which wasn't set and is therefore mapped to a roman numeral (I,II,III etc).\n", + "\n", + "As no group and label was specified, our new ``layout`` will once again have ``Curve.I`` for the curve but as there is only one scatter element, it will have ``Scatter.II`` to index the scatter." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tab-completion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Layout`` and ``Overlay`` are designed to be easy to explore and inspect with tab completion. Try running:\n", + "\n", + "```python\n", + "layout.[tab]\n", + "```\n", + "\n", + "And you should see the first levels of indexing (``Curve`` and ``Scatter``) conveniently listed at the top. If this is not the case, you may need to enable improved tab-completion as described in [Configuring HoloViews](../../../user_guide/Configuring_HoloViews.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using ``group`` and ``label``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's return to the first simple example, although this time we will set a group and label; see the [Annotating Data](../../../user_guide/01-Annotating_Data.ipynb) for more information:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve = hv.Curve((xvals, [np.sin(x) for x in xvals]), group='Sinusoid', label='Example')\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)), group='Linear Points', label='Demo')\n", + "layout3 = curve + scatter\n", + "layout3" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now see how group and label affect access, in addition to being used for setting the titles shown above and for allowing plot customization (see \n", + "[Customizing Plots](../../../user_guide/03-Customizing_Plots.ipynb) for more information):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout3.Linear_Points.Demo + layout3.Sinusoid.Example" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have used the semantic group and label names to access the elements and build a new re-ordered layout." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/bokeh/NdLayout.ipynb b/examples/reference/containers/bokeh/NdLayout.ipynb new file mode 100644 index 0000000000..6d699ea302 --- /dev/null +++ b/examples/reference/containers/bokeh/NdLayout.ipynb @@ -0,0 +1,146 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
NdLayout Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Bokeh
Matplotlib
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An ``NdLayout`` is a multi-dimensional dictionary of HoloViews elements presented side-by-side like a ``Layout``. An ``NdLayout`` can be considered as a special-case of ``HoloMap`` that can hold any one type of HoloViews container or element as long as it isn't another ``NdLayout`` or ``Layout``. Unlike a regular ``Layout`` that can be built with the ``+`` operator, the items in an ``NdOverlay`` container have corresponding keys and must all have the same type. See the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdLayout`` holds dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``sine_curve`` function below, we can declare a dictionary of ``Curve`` elements, where the keys correspond to the frequency values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "\n", + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "curve_dict = {f:sine_curve(0,f) for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have a dictionary where the frequency is the key and the corresponding curve element is the value. We can now turn this dictionary into an ``NdLayout`` by declaring the keys as corresponding to the frequency key dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "NdLayout = hv.NdLayout(curve_dict, kdims=['frequency'])\n", + "NdLayout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdLayout`` is multi-dimensional" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By using tuple keys and making sure each position in the tuple is assigned a corresponding ``kdim``, ``NdLayouts`` allow visualization of a multi-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in [0, np.pi/2] for f in [0.5, 0.75]}\n", + "NdLayout = hv.NdLayout(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "NdLayout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdLayout`` is similar to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than the difference in the visual semantics, whereby ``NdLayout`` displays its contents overlaid, ``NdLayout`` are very similar to ``HoloMap`` (see the [``HoloMap``](./HoloMap.ipynb) notebook for more information).\n", + "\n", + "One way to demonstrate the similarity of these two containers is to cast our ``NdLayout`` object to ``HoloMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.HoloMap(NdLayout)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We could now cast this ``HoloMap`` back to an ``NdLayout``. 
Unlike the other container examples such as [``GridSpace``](./GridSpace.ipynb) and [``NdOverlay``](./NdOverlay.ipynb), we cannot display this reconstituted ``NdLayout`` next to the ``HoloMap`` above using ``+``, because a ``Layout`` cannot hold an ``NdLayout``, just as an ``NdLayout`` cannot hold a ``Layout``." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/bokeh/NdOverlay.ipynb b/examples/reference/containers/bokeh/NdOverlay.ipynb new file mode 100644 index 0000000000..442ec33a89 --- /dev/null +++ b/examples/reference/containers/bokeh/NdOverlay.ipynb @@ -0,0 +1,147 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
NdOverlay Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Bokeh
Matplotlib
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An ``NdOverlay`` is a multi-dimensional dictionary of HoloViews elements presented overlayed in the same space. An ``NdOverlay`` can be considered as a special-case of ``HoloMap`` that can only hold a single type of element at a time. Unlike a regular ``Overlay`` that can be built with the ``*`` operator, the items in an ``NdOverlay`` container have corresponding keys and must all have the same type. See the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdOverlay`` holds dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``sine_curve`` function below, we can declare a dictionary of ``Curve`` elements, where the keys correspond to the frequency values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "\n", + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "curve_dict = {f:sine_curve(0,f) for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have a dictionary where the frequency is the key and the corresponding curve element is the value. We can now turn this dictionary into an ``NdOverlay`` by declaring the keys as corresponding to the frequency key dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ndoverlay = hv.NdOverlay(curve_dict, kdims=['frequency'])\n", + "ndoverlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the ``NdOverlay`` is displayed with a legend using colors defined by the ``Curve`` color cycle. For more information on using ``Cycle`` to define color cycling, see the [User Guide]." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdOverlay`` is multi-dimensional" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By using tuple keys and making sure each position in the tuple is assigned a corresponding ``kdim``, ``NdOverlays`` allow visualization of a multi-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in [0, np.pi/2] for f in [0.5, 0.75]}\n", + "ndoverlay = hv.NdOverlay(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "ndoverlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdOverlay`` is similar to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than the difference in the visual semantics, whereby ``NdOverlay`` displays its contents overlaid, ``NdOverlay`` are very similar to ``HoloMap`` (see the [``HoloMap``](./HoloMap.ipynb) notebook for more information).\n", + "\n", + "One way to demonstrate the similarity of these two containers is to cast our ``ndoverlay`` object to ``HoloMap`` and back to an ``NdOverlay``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(ndoverlay)\n", + "hmap + hv.NdOverlay(hmap)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/bokeh/Overlay.ipynb b/examples/reference/containers/bokeh/Overlay.ipynb new file mode 100644 index 0000000000..c1755ae61f --- /dev/null +++ b/examples/reference/containers/bokeh/Overlay.ipynb @@ -0,0 +1,219 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Overlay Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Bokeh
Matplotlib
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A Overlay is a collection of HoloViews objects that are related in some way, to be displayed simultanously, overlaid in the space space. Like ``Layout`` and unlike other containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdOverlay``](./NdOverlay.ipynb) a ``Overlay`` is *not* dictionary like: it holds potentially heterogeneous types without any dimensioned keys.\n", + "\n", + "\n", + "A ``Overlay`` cannot contain any other container type other than ``NdOverlay`` but can contain any HoloViews elements. See [Building Composite Objects](05-Building_Composite_Objects.ipynb) for more details on how to compose containers. It is best to learn about ``Overlay`` and [``Layout``](./Layout.ipynb) together as they are very closely related objects that share many core concepts." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Overlay`` is a heterogeneous collection" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build a ``Overlay`` between any two HoloViews objects (which can have different types) using the ``*`` operator:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve = hv.Curve((xvals, [np.sin(x) for x in xvals]))\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)))\n", + "curve * scatter" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this example, we have a ``Overlay`` composed of a ``Curve`` element and a ``Scatter`` element." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Building ``Overlay`` from a list" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build Overlay's of any size by passing a list of objects to the ``Overlay`` constructor as shown below:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_list = [ hv.Curve((xvals, [np.sin(f*x) for x in xvals])) for f in [0.5, 0.75]]\n", + "scatter_list = [hv.Scatter((xvals[::5], f*np.linspace(0,1,20))) for f in [-0.5, 0.5]]\n", + "overlay = hv.Overlay(curve_list + scatter_list)\n", + "overlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that these curve and scatter elements are using the default color cycle when overlaid; see [Customizing Plots](../../../user_guide/03-Customizing_Plots.ipynb) for more information on cycles." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A ``Overlay`` has two-level attribute access" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Overlay`` and ``Layout`` are fundamentally tree-structures holding arbitrary heterogenous HoloViews objects and are quite different from the dictionary-like dimensioned containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdOverlay``](./NdOverlay.ipynb).\n", + "\n", + "All HoloViews objects have string ``group`` and ``label`` parameters, resulting in a 2-level attribute access on ``Overlay`` objects. 
First let us see how to index the above example where ``group`` and ``label`` was not specified for any of the elements:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "overlay2 = overlay.Curve.I * overlay.Scatter.II\n", + "overlay2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we create a second ``Overlay`` by indexing two elements from our earlier ``overlay`` object. We see that the first level is the ``group`` string (which defaults to the element class name) followed by the label, which wasn't set and is therefore mapped to a roman numeral (I,II,III etc).\n", + "\n", + "As no group and label was specified, our new ``Overlay`` will once again have ``Curve.I`` for the curve but as there is only one scatter element, it will have ``Scatter.II`` to index the scatter." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tab-completion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Overlay`` and ``Layout`` are designed to be easy to explore and inspect with tab completion. Try running:\n", + "\n", + "```python\n", + "overlay.[tab]\n", + "```\n", + "\n", + "And you should see the first levels of indexing (``Curve`` and ``Scatter``) conveniently listed at the top. If this is not the case, you may need to enable improved tab-completion as described in [Configuring HoloViews](../../../user_guide/Configuring_HoloViews.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using ``group`` and ``label``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's return to the first simple example, although this time we will set a ``group`` and ``label``; see the [Annotating Data](../../../user_guide/01-Annotating_Data.ipynb) for more information:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve1 = hv.Curve((xvals, [np.sin(x) for x in xvals]), group='Sinusoid', label='Low Frequency')\n", + "curve2 = hv.Curve((xvals, [np.sin(2*x) for x in xvals]), group='Sinusoid', label='High Frequency')\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)), group='Linear Points', label='Demo')\n", + "overlay3 = curve1 * curve2 * scatter\n", + "overlay3" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now see how group and label affect access, in addition to being used for setting the legend shown above and for allowing plot customization (see \n", + "[Customizing Plots](../../../user_guide/03-Customizing_Plots.ipynb) for more information):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "overlay3.Linear_Points.Demo * overlay3.Sinusoid.High_Frequency * overlay3.Sinusoid.Low_Frequency" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have used the semantic group and label names to access the elements and build a new re-ordered ``Overlay`` which we can observe by the switched z-ordering and legend colors used for the two sinusoidal curves." 
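The same re-ordering can equally be expressed by passing the accessed elements to the ``Overlay`` constructor, which can be clearer when the desired order is computed programmatically (a sketch reusing the ``overlay3`` object above):

```python
# Rebuild the overlay with an explicit element order instead of chaining *
hv.Overlay([overlay3.Linear_Points.Demo,
            overlay3.Sinusoid.High_Frequency,
            overlay3.Sinusoid.Low_Frequency])
```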
+ ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/matplotlib/GridSpace.ipynb b/examples/reference/containers/matplotlib/GridSpace.ipynb new file mode 100644 index 0000000000..54fc2e7a27 --- /dev/null +++ b/examples/reference/containers/matplotlib/GridSpace.ipynb @@ -0,0 +1,115 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
GridSpace Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Matplotlib
Bokeh
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A ``GridSpace`` is a two-dimensional dictionary of HoloViews objects presented onscreen as a grid. In one sense, due to the restriction on it's dimensionality, a ``GridSpace`` may be considered a special-case of [``HoloMap``](./HoloMap.ipynb). In another sense, ``GridSpace`` may be seen as more general as a ``GridSpace`` can hold a ``HoloMap`` but the converse is not permitted; see the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``GridSpace`` holds two-dimensional dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``sine_curve`` function below, we can declare a two-dimensional dictionary of ``Curve`` elements, where the keys are 2-tuples corresponding to (phase, frequency) values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "phases = [0, np.pi/2, np.pi, 3*np.pi/2]\n", + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in phases for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now pass this dictionary of curves to ``GridSpace``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "gridspace = hv.GridSpace(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "gridspace" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``GridSpace`` is similar to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than the difference in the visual semantics, whereby ``GridSpaces`` display their contents together in a two-dimensional grid, ``GridSpaces`` are very similar to ``HoloMap``s (see the [``HoloMap``](./HoloMap.ipynb) notebook for more information).\n", + "\n", + "One way to demonstrate the similarity of these two containers is to cast our ``gridspace`` object to ``HoloMap`` and back to a ``GridSpace``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(gridspace)\n", + "hmap + hv.GridSpace(hmap)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/matplotlib/HoloMap.ipynb b/examples/reference/containers/matplotlib/HoloMap.ipynb new file mode 100644 index 0000000000..67cd320fb0 --- /dev/null +++ b/examples/reference/containers/matplotlib/HoloMap.ipynb @@ -0,0 +1,234 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
HoloMap Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Matplotlib
Bokeh
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A HoloMap is an explorable multi-dimensional dictionary of HoloViews objects. A ``HoloMap`` cannot contain ``Layouts``, ``NdLayouts``, ``GridSpaces`` or other ``HoloMaps`` or ``DyamicMap`` but can contain any other HoloViews object. See the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` holds dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As a ``HoloMap`` is a dictionary of elements, let us now create a dictionary of sine curves:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "\n", + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "curve_dict = {f:sine_curve(0,f) for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have a dictionary where the frequency is the key and the corresponding curve element is the value. We can now turn this dictionary into a ``HoloMap`` by declaring the keys as corresponding to the frequency key dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(curve_dict, kdims=['frequency'])\n", + "hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` is multi-dimensional" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By using tuple keys and making sure each position in the tuple is assigned a corresponding ``kdim``, ``HoloMaps`` allow exploration of a multi-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "phases = [0, np.pi/2, np.pi, 3*np.pi/2]\n", + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in phases for f in frequencies}\n", + "hmap = hv.HoloMap(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` supports dictionary-like behavior" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloMaps support a number of features similar to regular dictionaries, including **assignment**:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(kdims=['phase', 'frequency'])\n", + "for (phase, freq) in [(0,0.5), (0.5,0.5), (0.5,1), (0,1)]:\n", + " hmap[(phase, freq)] = sine_curve(phase,freq)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Key membership predicate**:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "(0, 0.5) in hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**The ``get`` method:**:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap.get((0,0.5))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` supports 
multi-dimensional indexing and slicing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One difference with regular dictionaries, is that ``HoloMaps`` support multi-dimensional indexing:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap[0,1] + hmap[0,:]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "See the [User Guide] for more information on selecting, slicing and indexing." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` is ordered" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One difference with regular Python dictionaries is that they are *ordered*, which can be observed by inspecting the ``.data`` attribute:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap.data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that internally, ``HoloMaps`` uses [``OrderedDict``](https://docs.python.org/3.6/library/collections.html#collections.OrderedDict) where the keys are sorted by default. You can set ``sort=False`` and then either supply an ordered list of (key, value) tuples, an ``OrderedDict`` or insert items in a chosen order.\n", + "\n", + "That said, there is generally very-little reason to ever use ``sort=False`` as regular Python dictionaries do not have a well-defined key ordering and ``HoloViews`` sliders work regardless of the ordering used. The only reason to set the ordering is if you wish to iterate over a ``HoloMap`` using the ``items``, ``keys``, ``values`` methods or use the iterator interface." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/matplotlib/Layout.ipynb b/examples/reference/containers/matplotlib/Layout.ipynb new file mode 100644 index 0000000000..6e428257f3 --- /dev/null +++ b/examples/reference/containers/matplotlib/Layout.ipynb @@ -0,0 +1,218 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Layout Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Matplotlib
Bokeh
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A Layout is a collection of HoloViews objects that are related in some way, to be displayed side-by-side. Like ``Overlay`` and unlike other containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdLayout``](./NdLayout.ipynb) a ``Layout`` is *not* dictionary like: it holds potentially heterogeneous types without any dimensioned keys.\n", + "\n", + "\n", + "A ``Layout`` cannot contain ``NdLayouts`` but can otherwise contain *any* other HoloViews object. See [Building Composite Objects](05-Building_Composite_Objects.ipynb) for more details on how to compose containers. It is best to learn about ``Layout`` and [``Overlay``](./Overlay.ipynb) together as they are very closely related objects that share many core concepts." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Layout`` is a heterogeneous collection" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build a ``Layout`` between any two HoloViews objects (which can have different types) using the ``+`` operator:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve = hv.Curve((xvals, [np.sin(x) for x in xvals]))\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)))\n", + "curve + scatter" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this example, we have a ``Layout`` composed of a ``Curve`` element and a ``Scatter`` element. The one restriction on what you can put in a ``Layout`` is that you cannot combine an [``NdLayout``](./NdLayout.ipynb) with a regular ``Layout``." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Building ``Layout`` from a list" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build Layout's of any size by passing a list of objects to the ``Layout`` constructor as shown below:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_list = [ hv.Curve((xvals, [np.sin(f*x) for x in xvals])) for f in [0.5, 0.75]]\n", + "scatter_list = [hv.Scatter((xvals[::5], f*np.linspace(0,1,20))) for f in [-0.5, 0.5]]\n", + "layout = hv.Layout(curve_list + scatter_list).cols(2)\n", + "layout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note the use of the ``.cols`` method to specify the number of columns, wrapping to the next row in scanline order (top-to-bottom and left-to-right)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A ``Layout`` has two-level attribute access" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Layout`` and ``Overlay`` are fundamentally tree-structures holding arbitrary heterogenous HoloViews objects and are quite different from the dictionary-like dimensioned containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdLayout``](./NdLayout.ipynb).\n", + "\n", + "All HoloViews objects have string ``group`` and ``label`` parameters, resulting in a 2-level attribute access on ``Layout`` objects. 
First let us see how to index the above example where ``group`` and ``label`` was not specified for any of the elements:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout2 = layout.Curve.I + layout.Scatter.II\n", + "layout2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we create a second layout by indexing two elements from our earlier ``layout`` object. We see that the first level is the ``group`` string (which defaults to the element class name) followed by the label, which wasn't set and is therefore mapped to a roman numeral (I,II,III etc).\n", + "\n", + "As no group and label was specified, our new ``layout`` will once again have ``Curve.I`` for the curve but as there is only one scatter element, it will have ``Scatter.II`` to index the scatter." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tab-completion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Layout`` and ``Overlay`` are designed to be easy to explore and inspect with tab completion. Try running:\n", + "\n", + "```python\n", + "layout.[tab]\n", + "```\n", + "\n", + "And you should see the first levels of indexing (``Curve`` and ``Scatter``) conveniently listed at the top. If this is not the case, you may need to enable improved tab-completion as described in [Configuring HoloViews](../../../user_guide/Configuring_HoloViews.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using ``group`` and ``label``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's return to the first simple example, although this time we will set a group and label; see the [Annotating Data](../../../user_guide/01-Annotating_Data.ipynb) for more information:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve = hv.Curve((xvals, [np.sin(x) for x in xvals]), group='Sinusoid', label='Example')\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)), group='Linear Points', label='Demo')\n", + "layout3 = curve + scatter\n", + "layout3" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now see how group and label affect access, in addition to being used for setting the titles shown above and for allowing plot customization (see \n", + "[Customizing Plots](../../../user_guide/03-Customizing_Plots.ipynb) for more information):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout3.Linear_Points.Demo + layout3.Sinusoid.Example" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have used the semantic group and label names to access the elements and build a new re-ordered layout." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/matplotlib/NdLayout.ipynb b/examples/reference/containers/matplotlib/NdLayout.ipynb new file mode 100644 index 0000000000..2c8718637f --- /dev/null +++ b/examples/reference/containers/matplotlib/NdLayout.ipynb @@ -0,0 +1,146 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
NdLayout Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Matplotlib
Bokeh
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An ``NdLayout`` is a multi-dimensional dictionary of HoloViews elements presented side-by-side like a ``Layout``. An ``NdLayout`` can be considered as a special-case of ``HoloMap`` that can hold any one type of HoloViews container or element as long as it isn't another ``NdLayout`` or ``Layout``. Unlike a regular ``Layout`` that can be built with the ``+`` operator, the items in an ``NdOverlay`` container have corresponding keys and must all have the same type. See the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdLayout`` holds dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``sine_curve`` function below, we can declare a dictionary of ``Curve`` elements, where the keys correspond to the frequency values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "\n", + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "curve_dict = {f:sine_curve(0,f) for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have a dictionary where the frequency is the key and the corresponding curve element is the value. We can now turn this dictionary into an ``NdLayout`` by declaring the keys as corresponding to the frequency key dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "NdLayout = hv.NdLayout(curve_dict, kdims=['frequency'])\n", + "NdLayout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdLayout`` is multi-dimensional" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By using tuple keys and making sure each position in the tuple is assigned a corresponding ``kdim``, ``NdLayouts`` allow visualization of a multi-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in [0, np.pi/2] for f in [0.5, 0.75]}\n", + "NdLayout = hv.NdLayout(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "NdLayout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdLayout`` is similar to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than the difference in the visual semantics, whereby ``NdLayout`` displays its contents overlaid, ``NdLayout`` are very similar to ``HoloMap`` (see the [``HoloMap``](./HoloMap.ipynb) notebook for more information).\n", + "\n", + "One way to demonstrate the similarity of these two containers is to cast our ``NdLayout`` object to ``HoloMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.HoloMap(NdLayout)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We could now cast this ``HoloMap`` back to an ``NdLayout``. 
Unlike the other container examples such as [``GridSpace``](./GridSpace.ipynb) and [``NdOverlay``](./NdOverlay.ipynb), we cannot display this reconstituted ``NdLayout`` next to the ``HoloMap`` above using ``+``, because a ``Layout`` cannot hold an ``NdLayout``, just as an ``NdLayout`` cannot hold a ``Layout``." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/matplotlib/NdOverlay.ipynb b/examples/reference/containers/matplotlib/NdOverlay.ipynb new file mode 100644 index 0000000000..348e3f698f --- /dev/null +++ b/examples/reference/containers/matplotlib/NdOverlay.ipynb @@ -0,0 +1,147 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
NdOverlay Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Matplotlib
Bokeh
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An ``NdOverlay`` is a multi-dimensional dictionary of HoloViews elements presented overlayed in the same space. An ``NdOverlay`` can be considered as a special-case of ``HoloMap`` that can only hold a single type of element at a time. Unlike a regular ``Overlay`` that can be built with the ``*`` operator, the items in an ``NdOverlay`` container have corresponding keys and must all have the same type. See the [User Guide] for details on how to compose containers." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdOverlay`` holds dictionaries" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``sine_curve`` function below, we can declare a dictionary of ``Curve`` elements, where the keys correspond to the frequency values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequencies = [0.5, 0.75, 1.0, 1.25]\n", + "\n", + "def sine_curve(phase, freq):\n", + " xvals = [0.1* i for i in range(100)]\n", + " return hv.Curve((xvals, [np.sin(phase+freq*x) for x in xvals]))\n", + "\n", + "curve_dict = {f:sine_curve(0,f) for f in frequencies}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have a dictionary where the frequency is the key and the corresponding curve element is the value. We can now turn this dictionary into an ``NdOverlay`` by declaring the keys as corresponding to the frequency key dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ndoverlay = hv.NdOverlay(curve_dict, kdims=['frequency'])\n", + "ndoverlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the ``NdOverlay`` is displayed with a legend using colors defined by the ``Curve`` color cycle. For more information on using ``Cycle`` to define color cycling, see the [User Guide]." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdOverlay`` is multi-dimensional" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By using tuple keys and making sure each position in the tuple is assigned a corresponding ``kdim``, ``NdOverlays`` allow visualization of a multi-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_dict_2D = {(p,f):sine_curve(p,f) for p in [0, np.pi/2] for f in [0.5, 0.75]}\n", + "ndoverlay = hv.NdOverlay(curve_dict_2D, kdims=['phase', 'frequency'])\n", + "ndoverlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``NdOverlay`` is similar to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than the difference in the visual semantics, whereby ``NdOverlay`` displays its contents overlaid, ``NdOverlay`` are very similar to ``HoloMap`` (see the [``HoloMap``](./HoloMap.ipynb) notebook for more information).\n", + "\n", + "One way to demonstrate the similarity of these two containers is to cast our ``ndoverlay`` object to ``HoloMap`` and back to an ``NdOverlay``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap(ndoverlay)\n", + "hmap + hv.NdOverlay(hmap)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/containers/matplotlib/Overlay.ipynb b/examples/reference/containers/matplotlib/Overlay.ipynb new file mode 100644 index 0000000000..3c89a6a6d2 --- /dev/null +++ b/examples/reference/containers/matplotlib/Overlay.ipynb @@ -0,0 +1,219 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Overlay Container
\n", + "
Dependencies
Matplotlib
\n", + "
Backends
Matplotlib
Bokeh
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A Overlay is a collection of HoloViews objects that are related in some way, to be displayed simultanously, overlaid in the space space. Like ``Layout`` and unlike other containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdOverlay``](./NdOverlay.ipynb) a ``Overlay`` is *not* dictionary like: it holds potentially heterogeneous types without any dimensioned keys.\n", + "\n", + "\n", + "A ``Overlay`` cannot contain any other container type other than ``NdOverlay`` but can contain any HoloViews elements. See [Building Composite Objects](05-Building_Composite_Objects.ipynb) for more details on how to compose containers. It is best to learn about ``Overlay`` and [``Layout``](./Layout.ipynb) together as they are very closely related objects that share many core concepts." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Overlay`` is a heterogeneous collection" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build a ``Overlay`` between any two HoloViews objects (which can have different types) using the ``*`` operator:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve = hv.Curve((xvals, [np.sin(x) for x in xvals]))\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)))\n", + "curve * scatter" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this example, we have a ``Overlay`` composed of a ``Curve`` element and a ``Scatter`` element." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Building ``Overlay`` from a list" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can build Overlay's of any size by passing a list of objects to the ``Overlay`` constructor as shown below:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_list = [ hv.Curve((xvals, [np.sin(f*x) for x in xvals])) for f in [0.5, 0.75]]\n", + "scatter_list = [hv.Scatter((xvals[::5], f*np.linspace(0,1,20))) for f in [-0.5, 0.5]]\n", + "overlay = hv.Overlay(curve_list + scatter_list)\n", + "overlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that these curve and scatter elements are using the default color cycle when overlaid; see [Customizing Plots](../../../user_guide/03-Customizing_Plots.ipynb) for more information on cycles." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A ``Overlay`` has two-level attribute access" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Overlay`` and ``Layout`` are fundamentally tree-structures holding arbitrary heterogenous HoloViews objects and are quite different from the dictionary-like dimensioned containers such as [``HoloMap``](./HoloMap.ipynb) , [``GridSpace``](./GridSpace.ipynb) and [``NdOverlay``](./NdOverlay.ipynb).\n", + "\n", + "All HoloViews objects have string ``group`` and ``label`` parameters, resulting in a 2-level attribute access on ``Overlay`` objects. 
First let us see how to index the above example where ``group`` and ``label`` was not specified for any of the elements:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "overlay2 = overlay.Curve.I * overlay.Scatter.II\n", + "overlay2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we create a second ``Overlay`` by indexing two elements from our earlier ``overlay`` object. We see that the first level is the ``group`` string (which defaults to the element class name) followed by the label, which wasn't set and is therefore mapped to a roman numeral (I,II,III etc).\n", + "\n", + "As no group and label was specified, our new ``Overlay`` will once again have ``Curve.I`` for the curve but as there is only one scatter element, it will have ``Scatter.II`` to index the scatter." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tab-completion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Overlay`` and ``Layout`` are designed to be easy to explore and inspect with tab completion. Try running:\n", + "\n", + "```python\n", + "overlay.[tab]\n", + "```\n", + "\n", + "And you should see the first levels of indexing (``Curve`` and ``Scatter``) conveniently listed at the top. If this is not the case, you may need to enable improved tab-completion as described in [Configuring HoloViews](../../../user_guide/Configuring_HoloViews.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using ``group`` and ``label``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's return to the first simple example, although this time we will set a ``group`` and ``label``; see the [Annotating Data](../../../user_guide/01-Annotating_Data.ipynb) for more information:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = [0.1* i for i in range(100)]\n", + "curve1 = hv.Curve((xvals, [np.sin(x) for x in xvals]), group='Sinusoid', label='Low Frequency')\n", + "curve2 = hv.Curve((xvals, [np.sin(2*x) for x in xvals]), group='Sinusoid', label='High Frequency')\n", + "scatter = hv.Scatter((xvals[::5], np.linspace(0,1,20)), group='Linear Points', label='Demo')\n", + "overlay3 = curve1 * curve2 * scatter\n", + "overlay3" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now see how group and label affect access, in addition to being used for setting the legend shown above and for allowing plot customization (see \n", + "[Customizing Plots](../../../user_guide/03-Customizing_Plots.ipynb) for more information):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "overlay3.Linear_Points.Demo * overlay3.Sinusoid.High_Frequency * overlay3.Sinusoid.Low_Frequency" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have used the semantic group and label names to access the elements and build a new re-ordered ``Overlay`` which we can observe by the switched z-ordering and legend colors used for the two sinusoidal curves." 
+ ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/reference/elements/assets/penguins.png b/examples/reference/elements/assets/penguins.png new file mode 100644 index 0000000000..d288f9352a Binary files /dev/null and b/examples/reference/elements/assets/penguins.png differ diff --git a/examples/elements/bokeh/Area.ipynb b/examples/reference/elements/bokeh/Area.ipynb similarity index 84% rename from examples/elements/bokeh/Area.ipynb rename to examples/reference/elements/bokeh/Area.ipynb index 2dda4b69f1..c2a6eca7a4 100644 --- a/examples/elements/bokeh/Area.ipynb +++ b/examples/reference/elements/bokeh/Area.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Area Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Area.ipynb)
[Matplotlib](../matplotlib/Area.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -47,9 +45,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "xs = np.linspace(0, np.pi*4, 40)\n", @@ -68,9 +64,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "X = np.linspace(0,3,200)\n", @@ -94,9 +88,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "values = np.random.rand(5, 20)\n", @@ -108,22 +100,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Bars.ipynb b/examples/reference/elements/bokeh/Bars.ipynb similarity index 81% rename from examples/elements/bokeh/Bars.ipynb rename to examples/reference/elements/bokeh/Bars.ipynb index 2d3be161a2..40812e075d 100644 --- a/examples/elements/bokeh/Bars.ipynb +++ b/examples/reference/elements/bokeh/Bars.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Bars Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Bars.ipynb)
[Matplotlib](../matplotlib/Bars.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "data = [('one',8),('two', 10), ('three', 16), ('four', 8), ('five', 4), ('six', 1)]\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "bars[['one', 'two', 'three']] + bars[['four', 'five', 'six']]" @@ -76,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Bars.Grouped [group_index='Group'] Bars.Stacked [stack_index='Group']\n", @@ -93,22 +85,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Bounds.ipynb b/examples/reference/elements/bokeh/Bounds.ipynb similarity index 72% rename from examples/elements/bokeh/Bounds.ipynb rename to examples/reference/elements/bokeh/Bounds.ipynb index 4d9d7329da..e685b49b9b 100644 --- a/examples/elements/bokeh/Bounds.ipynb +++ b/examples/reference/elements/bokeh/Bounds.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Bounds Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Bounds.ipynb)
[Matplotlib](../matplotlib/Bounds.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,13 +34,11 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Bounds (color='orange' line_width=6)\n", - "penguins = hv.RGB.load_image('../../../doc/assets/penguins.png')\n", + "penguins = hv.RGB.load_image('../assets/penguins.png')\n", "penguins * hv.Bounds((-0.15, -0.4, 0.2, 0))" ] }, @@ -56,9 +52,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "penguins * penguins[-0.15:0.2, -0.4:0, 'G'] * hv.Bounds((-0.15, -0.4, 0.2, 0))" @@ -66,22 +60,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Box.ipynb b/examples/reference/elements/bokeh/Box.ipynb similarity index 78% rename from examples/elements/bokeh/Box.ipynb rename to examples/reference/elements/bokeh/Box.ipynb index bbac360702..8e90fff126 100644 --- a/examples/elements/bokeh/Box.ipynb +++ b/examples/reference/elements/bokeh/Box.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Box Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Box.ipynb)
[Matplotlib](../matplotlib/Box.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Box (line_width=5 color='red') Image (cmap='gray')\n", @@ -60,9 +56,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Box (line_width=5 color='purple') Image (cmap='gray')\n", @@ -73,22 +67,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/BoxWhisker.ipynb b/examples/reference/elements/bokeh/BoxWhisker.ipynb similarity index 78% rename from examples/elements/bokeh/BoxWhisker.ipynb rename to examples/reference/elements/bokeh/BoxWhisker.ipynb index fe590dcc3b..b3a0d0c03e 100644 --- a/examples/elements/bokeh/BoxWhisker.ipynb +++ b/examples/reference/elements/bokeh/BoxWhisker.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
BoxWhisker Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./BoxWhisker.ipynb)
[Matplotlib](../matplotlib/BoxWhisker.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -45,9 +43,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "hv.BoxWhisker(np.random.randn(1000), vdims=['Value'])" @@ -63,9 +59,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts BoxWhisker [width=600 height=400 show_legend=False] (whisker_color='gray' color='white')\n", @@ -76,22 +70,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Contours.ipynb b/examples/reference/elements/bokeh/Contours.ipynb similarity index 81% rename from examples/elements/bokeh/Contours.ipynb rename to examples/reference/elements/bokeh/Contours.ipynb index 2b71cd4eea..63a4325903 100644 --- a/examples/elements/bokeh/Contours.ipynb +++ b/examples/reference/elements/bokeh/Contours.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Contours Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Contours.ipynb)
[Matplotlib](../matplotlib/Contours.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Contours (cmap='viridis')\n", @@ -61,9 +57,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Contours [show_legend=False colorbar=True width=325] (cmap='fire')\n", @@ -76,22 +70,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Curve.ipynb b/examples/reference/elements/bokeh/Curve.ipynb similarity index 79% rename from examples/elements/bokeh/Curve.ipynb rename to examples/reference/elements/bokeh/Curve.ipynb index cb962e6b0c..84965e1401 100644 --- a/examples/elements/bokeh/Curve.ipynb +++ b/examples/reference/elements/bokeh/Curve.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Curve Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Curve.ipynb)
[Matplotlib](../matplotlib/Curve.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -45,9 +43,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "points = [(0.1*i, np.sin(0.1*i)) for i in range(100)]\n", @@ -71,9 +67,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve [width=600] NdOverlay [legend_position='right']\n", @@ -83,22 +77,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Ellipse.ipynb b/examples/reference/elements/bokeh/Ellipse.ipynb similarity index 79% rename from examples/elements/bokeh/Ellipse.ipynb rename to examples/reference/elements/bokeh/Ellipse.ipynb index 6c20b403e3..bffd57468d 100644 --- a/examples/elements/bokeh/Ellipse.ipynb +++ b/examples/reference/elements/bokeh/Ellipse.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Ellipse Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Ellipse.ipynb)
[Matplotlib](../matplotlib/Ellipse.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Ellipse (line_width=6)\n", @@ -62,9 +58,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Ellipse (line_width=6)\n", @@ -74,22 +68,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/ErrorBars.ipynb b/examples/reference/elements/bokeh/ErrorBars.ipynb similarity index 81% rename from examples/elements/bokeh/ErrorBars.ipynb rename to examples/reference/elements/bokeh/ErrorBars.ipynb index df7f1a5df4..ec11a520f9 100644 --- a/examples/elements/bokeh/ErrorBars.ipynb +++ b/examples/reference/elements/bokeh/ErrorBars.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
ErrorBars Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./ErrorBars.ipynb)
[Matplotlib](../matplotlib/ErrorBars.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -47,9 +45,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(7)\n", @@ -74,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "errors = [(0.1*i, np.sin(0.1*i), np.random.rand()/2, np.random.rand()/4) for i in np.linspace(0, 100, 11)]\n", @@ -85,22 +79,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/HLine.ipynb b/examples/reference/elements/bokeh/HLine.ipynb similarity index 69% rename from examples/elements/bokeh/HLine.ipynb rename to examples/reference/elements/bokeh/HLine.ipynb index 5ee72429f9..8618733842 100644 --- a/examples/elements/bokeh/HLine.ipynb +++ b/examples/reference/elements/bokeh/HLine.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
HLine Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./HLine.ipynb)
[Matplotlib](../matplotlib/HLine.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts HLine (color='blue' line_width=6) Points (color='#D3D3D3')\n", @@ -49,22 +45,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/HSV.ipynb b/examples/reference/elements/bokeh/HSV.ipynb similarity index 79% rename from examples/elements/bokeh/HSV.ipynb rename to examples/reference/elements/bokeh/HSV.ipynb index da65d91448..17a03b760c 100644 --- a/examples/elements/bokeh/HSV.ipynb +++ b/examples/reference/elements/bokeh/HSV.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
HSV Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./HSV.ipynb)
[Matplotlib](../matplotlib/HSV.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "x,y = np.mgrid[-50:51, -50:51] * 0.1\n", @@ -62,9 +58,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%opts Image (cmap='gray')\n", @@ -81,9 +75,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "print(hsv.rgb)\n", @@ -92,22 +84,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/HeatMap.ipynb b/examples/reference/elements/bokeh/HeatMap.ipynb similarity index 81% rename from examples/elements/bokeh/HeatMap.ipynb rename to examples/reference/elements/bokeh/HeatMap.ipynb index 2d3be73bd3..443dbc706b 100644 --- a/examples/elements/bokeh/HeatMap.ipynb +++ b/examples/reference/elements/bokeh/HeatMap.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
HeatMap Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./HeatMap.ipynb)
[Matplotlib](../matplotlib/HeatMap.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "data = [(chr(65+i), chr(97+j), i*j) for i in range(5) for j in range(5) if i!=j]\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "heatmap = hv.HeatMap([(0, 0, 1), (0, 0, 10), (1, 0, 2), (1, 1, 3)])\n", @@ -78,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts HeatMap [tools=['hover'] colorbar=True width=325 toolbar='above']\n", @@ -90,22 +82,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Histogram.ipynb b/examples/reference/elements/bokeh/Histogram.ipynb similarity index 82% rename from examples/elements/bokeh/Histogram.ipynb rename to examples/reference/elements/bokeh/Histogram.ipynb index b054ba3e84..24b6602ca8 100644 --- a/examples/elements/bokeh/Histogram.ipynb +++ b/examples/reference/elements/bokeh/Histogram.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Histogram Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Histogram.ipynb)
[Matplotlib](../matplotlib/Histogram.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(1)\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "xs = np.linspace(0, np.pi*2)\n", @@ -79,9 +73,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Histogram (alpha=0.3)\n", @@ -96,22 +88,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Image.ipynb b/examples/reference/elements/bokeh/Image.ipynb similarity index 81% rename from examples/elements/bokeh/Image.ipynb rename to examples/reference/elements/bokeh/Image.ipynb index f669a292d3..361f853f58 100644 --- a/examples/elements/bokeh/Image.ipynb +++ b/examples/reference/elements/bokeh/Image.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Image Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Image.ipynb)
[Matplotlib](../matplotlib/Image.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "ls = np.linspace(0, 10, 200)\n", @@ -59,9 +55,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "img + img[-0.5:0.5, -0.5:0.5]" @@ -77,9 +71,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='black' marker='x' size=20)\n", @@ -98,9 +90,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "img.sample(x=0) + img.reduce(x=np.mean)" @@ -108,22 +98,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/ItemTable.ipynb b/examples/reference/elements/bokeh/ItemTable.ipynb similarity index 72% rename from examples/elements/bokeh/ItemTable.ipynb rename to examples/reference/elements/bokeh/ItemTable.ipynb index 50d7cc8161..9770e00692 100644 --- a/examples/elements/bokeh/ItemTable.ipynb +++ b/examples/reference/elements/bokeh/ItemTable.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
ItemTable Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./ItemTable.ipynb)
[Matplotlib](../matplotlib/ItemTable.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts ItemTable [height=60]\n", @@ -47,22 +43,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Path.ipynb b/examples/reference/elements/bokeh/Path.ipynb similarity index 83% rename from examples/elements/bokeh/Path.ipynb rename to examples/reference/elements/bokeh/Path.ipynb index 2677a11977..09e602e693 100644 --- a/examples/elements/bokeh/Path.ipynb +++ b/examples/reference/elements/bokeh/Path.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Path Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Path.ipynb)
[Matplotlib](../matplotlib/Path.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -37,9 +35,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Path (color='black' line_width=4)\n", @@ -63,9 +59,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Path (line_width=4)\n", @@ -74,7 +68,7 @@ "\n", "adultR = [(0.25, 0.45), (0.35,0.35), (0.25, 0.25), (0.15, 0.35), (0.25, 0.45)]\n", "adultL = [(-0.3, 0.4), (-0.3, 0.3), (-0.2, 0.3), (-0.2, 0.4),(-0.3, 0.4)]\n", - "scene = hv.RGB.load_image('../../../doc/assets/penguins.png')\n", + "scene = hv.RGB.load_image('../assets/penguins.png')\n", "\n", "scene * hv.Path([adultL, adultR, baby]) * hv.Path([baby])" ] @@ -89,9 +83,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Path [width=600]\n", @@ -102,22 +94,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Points.ipynb b/examples/reference/elements/bokeh/Points.ipynb similarity index 86% rename from examples/elements/bokeh/Points.ipynb rename to examples/reference/elements/bokeh/Points.ipynb index d94b4e31f8..38de018a58 100644 --- a/examples/elements/bokeh/Points.ipynb +++ b/examples/reference/elements/bokeh/Points.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Points Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Points.ipynb)
[Matplotlib](../matplotlib/Points.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='k' marker='+' size=10)\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='k' marker='+' size=10)\n", @@ -76,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points [color_index=2 size_index=3 scaling_factor=50]\n", @@ -110,22 +102,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Polygons.ipynb b/examples/reference/elements/bokeh/Polygons.ipynb similarity index 79% rename from examples/elements/bokeh/Polygons.ipynb rename to examples/reference/elements/bokeh/Polygons.ipynb index 94bdf7265f..de77358255 100644 --- a/examples/elements/bokeh/Polygons.ipynb +++ b/examples/reference/elements/bokeh/Polygons.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Polygons Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Polygons.ipynb)
[Matplotlib](../matplotlib/Polygons.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Polygons (cmap='hot' line_color='black' line_width=2)\n", @@ -61,9 +57,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "def rectangle(x=0, y=0, width=1, height=1):\n", @@ -76,22 +70,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/QuadMesh.ipynb b/examples/reference/elements/bokeh/QuadMesh.ipynb similarity index 84% rename from examples/elements/bokeh/QuadMesh.ipynb rename to examples/reference/elements/bokeh/QuadMesh.ipynb index d5b32b5335..633fcb37b4 100644 --- a/examples/elements/bokeh/QuadMesh.ipynb +++ b/examples/reference/elements/bokeh/QuadMesh.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
QuadMesh Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./QuadMesh.ipynb)
[Matplotlib](../matplotlib/QuadMesh.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "n = 8 # Number of bins in each direction\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts QuadMesh [tools=['hover'] xticks=[10, 100,1000]] QuadMesh.LogScale [logx=True]\n", @@ -79,9 +73,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "qmesh = hv.QuadMesh((xs, ys, zs))\n", @@ -98,9 +90,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts QuadMesh [tools=['hover']] (alpha=0 hover_line_alpha=1 hover_line_color='black')\n", @@ -117,22 +107,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/RGB.ipynb b/examples/reference/elements/bokeh/RGB.ipynb similarity index 78% rename from examples/elements/bokeh/RGB.ipynb rename to examples/reference/elements/bokeh/RGB.ipynb index d2e2d7752d..d1d4476ec3 100644 --- a/examples/elements/bokeh/RGB.ipynb +++ b/examples/reference/elements/bokeh/RGB.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
RGB Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./RGB.ipynb)
[Matplotlib](../matplotlib/RGB.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,12 +34,10 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ - "hv.RGB.load_image('../../../doc/assets/penguins.png')" + "hv.RGB.load_image('../assets/penguins.png')" ] }, { @@ -54,9 +50,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "x,y = np.mgrid[-50:51, -50:51] * 0.1\n", @@ -78,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Image (cmap='gray')\n", @@ -97,9 +89,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Image (cmap='gray')\n", @@ -113,22 +103,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Raster.ipynb b/examples/reference/elements/bokeh/Raster.ipynb similarity index 75% rename from examples/elements/bokeh/Raster.ipynb rename to examples/reference/elements/bokeh/Raster.ipynb index e5f774d72d..e368d5af3c 100644 --- a/examples/elements/bokeh/Raster.ipynb +++ b/examples/reference/elements/bokeh/Raster.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Raster Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Raster.ipynb)
[Matplotlib](../matplotlib/Raster.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "xvals = np.linspace(0,4,202)\n", @@ -55,22 +51,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Scatter.ipynb b/examples/reference/elements/bokeh/Scatter.ipynb similarity index 87% rename from examples/elements/bokeh/Scatter.ipynb rename to examples/reference/elements/bokeh/Scatter.ipynb index 018671a319..aa7ff4deae 100644 --- a/examples/elements/bokeh/Scatter.ipynb +++ b/examples/reference/elements/bokeh/Scatter.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Scatter Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Scatter.ipynb)
[Matplotlib](../matplotlib/Scatter.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Scatter (color='k' marker='s' size=10)\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Scatter (color='k' marker='s' size=10)\n", @@ -76,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Scatter [color_index=2 size_index=3 scaling_factor=50]\n", @@ -114,22 +106,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Spikes.ipynb b/examples/reference/elements/bokeh/Spikes.ipynb similarity index 83% rename from examples/elements/bokeh/Spikes.ipynb rename to examples/reference/elements/bokeh/Spikes.ipynb index 98d97ba291..535cefaf61 100644 --- a/examples/elements/bokeh/Spikes.ipynb +++ b/examples/reference/elements/bokeh/Spikes.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Spikes Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Spikes.ipynb)
[Matplotlib](../matplotlib/Spikes.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes (line_alpha=0.4) [spike_length=0.1]\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='red')\n", @@ -78,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes (cmap='Reds')\n", @@ -97,9 +89,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes [spike_length=0.1] NdOverlay [show_legend=False]\n", @@ -117,9 +107,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes (line_alpha=0.2)\n", @@ -129,22 +117,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Spline.ipynb b/examples/reference/elements/bokeh/Spline.ipynb similarity index 73% rename from examples/elements/bokeh/Spline.ipynb rename to examples/reference/elements/bokeh/Spline.ipynb index 4dfc41a373..3cf17e82ac 100644 --- a/examples/elements/bokeh/Spline.ipynb +++ b/examples/reference/elements/bokeh/Spline.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Spline Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Spline.ipynb)
[Matplotlib](../matplotlib/Spline.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve (color='#D3D3D3') Spline (line_width=6 color='green')\n", @@ -55,22 +51,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Spread.ipynb b/examples/reference/elements/bokeh/Spread.ipynb similarity index 82% rename from examples/elements/bokeh/Spread.ipynb rename to examples/reference/elements/bokeh/Spread.ipynb index d9f10fed28..54aedeced1 100644 --- a/examples/elements/bokeh/Spread.ipynb +++ b/examples/reference/elements/bokeh/Spread.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Spread Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Spread.ipynb)
[Matplotlib](../matplotlib/Spread.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -48,9 +46,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(42)\n", @@ -76,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spread (fill_color='indianred' fill_alpha=1)\n", @@ -89,22 +83,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Table.ipynb b/examples/reference/elements/bokeh/Table.ipynb similarity index 81% rename from examples/elements/bokeh/Table.ipynb rename to examples/reference/elements/bokeh/Table.ipynb index b9a32fc62d..a8e18a6151 100644 --- a/examples/elements/bokeh/Table.ipynb +++ b/examples/reference/elements/bokeh/Table.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Table Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Table.ipynb)
[Matplotlib](../matplotlib/Table.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "gender = ['M','M', 'M','F']\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Table [height=140]\n", @@ -77,9 +71,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Table [height=140]\n", @@ -96,9 +88,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Table [height=100]\n", @@ -115,9 +105,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "table.select(Gender='M').to.curve(kdims=[\"Age\"], vdims=[\"Weight\"])" @@ -132,22 +120,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/Text.ipynb b/examples/reference/elements/bokeh/Text.ipynb similarity index 66% rename from examples/elements/bokeh/Text.ipynb rename to examples/reference/elements/bokeh/Text.ipynb index 7ddb63f8a4..8b58036135 100644 --- a/examples/elements/bokeh/Text.ipynb +++ b/examples/reference/elements/bokeh/Text.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Text Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./Text.ipynb)
[Matplotlib](../matplotlib/Text.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve (color='#D3D3D3')\n", @@ -48,22 +44,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/VLine.ipynb b/examples/reference/elements/bokeh/VLine.ipynb similarity index 68% rename from examples/elements/bokeh/VLine.ipynb rename to examples/reference/elements/bokeh/VLine.ipynb index 492b578778..19338f023b 100644 --- a/examples/elements/bokeh/VLine.ipynb +++ b/examples/reference/elements/bokeh/VLine.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
VLine Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./VLine.ipynb)
[Matplotlib](../matplotlib/VLine.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VLine (color='red' line_width=6) Curve (color='#D3D3D3')\n", @@ -49,22 +45,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/bokeh/VectorField.ipynb b/examples/reference/elements/bokeh/VectorField.ipynb similarity index 85% rename from examples/elements/bokeh/VectorField.ipynb rename to examples/reference/elements/bokeh/VectorField.ipynb index 1e278a9eff..46528fad49 100644 --- a/examples/elements/bokeh/VectorField.ipynb +++ b/examples/reference/elements/bokeh/VectorField.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
VectorField Element
\n", "
Dependencies
Bokeh
\n", - "
Backends
[Bokeh](./VectorField.ipynb)
[Matplotlib](../matplotlib/VectorField.ipynb)
\n", + "
Backends
Bokeh
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [size_index=3]\n", @@ -62,9 +58,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [size_index=3] VectorField.A [color_index=2] VectorField.M [color_index=3]\n", @@ -81,9 +75,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [color_index=2 size_index=3 rescale_lengths=False] (scale=4)\n", @@ -101,9 +93,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "xs, ys = np.arange(0, 2 * np.pi, .2), np.arange(0, 2 * np.pi, .2)\n", @@ -126,9 +116,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [width=500 color_index=3 size_index=3 pivot='tip'] (cmap='fire' scale=0.8) Points (color='black' size=1)\n", @@ -137,22 +125,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Area.ipynb b/examples/reference/elements/matplotlib/Area.ipynb similarity index 84% rename from examples/elements/matplotlib/Area.ipynb rename to examples/reference/elements/matplotlib/Area.ipynb index 9c56f1943b..0e5a31f705 100644 --- a/examples/elements/matplotlib/Area.ipynb +++ b/examples/reference/elements/matplotlib/Area.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Area Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Area.ipynb)
[Bokeh](../bokeh/Area.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -47,9 +45,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "xs = np.linspace(0, np.pi*4, 40)\n", @@ -68,9 +64,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "X = np.linspace(0,3,200)\n", @@ -94,9 +88,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "values = np.random.rand(5, 20)\n", @@ -108,22 +100,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Arrow.ipynb b/examples/reference/elements/matplotlib/Arrow.ipynb similarity index 69% rename from examples/elements/matplotlib/Arrow.ipynb rename to examples/reference/elements/matplotlib/Arrow.ipynb index 25f25cb78b..ba320c8a76 100644 --- a/examples/elements/matplotlib/Arrow.ipynb +++ b/examples/reference/elements/matplotlib/Arrow.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Arrow
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Arrow.ipynb)
\n", + "
Backends
Matplotlib
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve (color='#D3D3D3')\n", @@ -49,22 +45,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Bars.ipynb b/examples/reference/elements/matplotlib/Bars.ipynb similarity index 86% rename from examples/elements/matplotlib/Bars.ipynb rename to examples/reference/elements/matplotlib/Bars.ipynb index 5731148e51..33e7cad5ed 100644 --- a/examples/elements/matplotlib/Bars.ipynb +++ b/examples/reference/elements/matplotlib/Bars.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Bars Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Bars.ipynb)
[Bokeh](../bokeh/Bars.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -86,22 +86,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Bounds.ipynb b/examples/reference/elements/matplotlib/Bounds.ipynb similarity index 72% rename from examples/elements/matplotlib/Bounds.ipynb rename to examples/reference/elements/matplotlib/Bounds.ipynb index cd0e4af774..db7d70f90c 100644 --- a/examples/elements/matplotlib/Bounds.ipynb +++ b/examples/reference/elements/matplotlib/Bounds.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Bounds Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Bounds.ipynb)
[Bokeh](../bokeh/Bounds.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,13 +34,11 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Bounds (color='orange' linewidth=6)\n", - "penguins = hv.RGB.load_image('../../../doc/assets/penguins.png')\n", + "penguins = hv.RGB.load_image('../assets/penguins.png')\n", "penguins * hv.Bounds((-0.15, -0.4, 0.2, 0))" ] }, @@ -56,9 +52,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "penguins * penguins[-0.15:0.2, -0.4:0, 'G'] * hv.Bounds((-0.15, -0.4, 0.2, 0))" @@ -66,22 +60,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Box.ipynb b/examples/reference/elements/matplotlib/Box.ipynb similarity index 79% rename from examples/elements/matplotlib/Box.ipynb rename to examples/reference/elements/matplotlib/Box.ipynb index 96713da2ff..c9d54f9423 100644 --- a/examples/elements/matplotlib/Box.ipynb +++ b/examples/reference/elements/matplotlib/Box.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Box Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Box.ipynb)
[Bokeh](../bokeh/Box.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Box (linewidth=5 color='red') Image (cmap='gray')\n", @@ -60,9 +56,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Box (linewidth=5 color='purple') Image (cmap='gray')\n", @@ -73,22 +67,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/BoxWhisker.ipynb b/examples/reference/elements/matplotlib/BoxWhisker.ipynb similarity index 79% rename from examples/elements/matplotlib/BoxWhisker.ipynb rename to examples/reference/elements/matplotlib/BoxWhisker.ipynb index f2bca46fe4..e45fd4f629 100644 --- a/examples/elements/matplotlib/BoxWhisker.ipynb +++ b/examples/reference/elements/matplotlib/BoxWhisker.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
BoxWhisker Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./BoxWhisker.ipynb)
[Bokeh](../bokeh/BoxWhisker.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -45,9 +43,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "hv.BoxWhisker(np.random.randn(1000), vdims=['Value'])" @@ -63,9 +59,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts BoxWhisker [width=600 height=400 show_legend=False] (whisker_color='gray' color='white')\n", @@ -76,22 +70,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Contours.ipynb b/examples/reference/elements/matplotlib/Contours.ipynb similarity index 81% rename from examples/elements/matplotlib/Contours.ipynb rename to examples/reference/elements/matplotlib/Contours.ipynb index b7730686bc..b79a9ce508 100644 --- a/examples/elements/matplotlib/Contours.ipynb +++ b/examples/reference/elements/matplotlib/Contours.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Contours Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Contours.ipynb)
[Bokeh](../bokeh/Contours.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Contours (cmap='viridis')\n", @@ -61,9 +57,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Contours [show_legend=False colorbar=True width=325] (cmap='fire')\n", @@ -76,22 +70,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Curve.ipynb b/examples/reference/elements/matplotlib/Curve.ipynb similarity index 80% rename from examples/elements/matplotlib/Curve.ipynb rename to examples/reference/elements/matplotlib/Curve.ipynb index cd6892410f..2c739eed22 100644 --- a/examples/elements/matplotlib/Curve.ipynb +++ b/examples/reference/elements/matplotlib/Curve.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Curve Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Curve.ipynb)
[Bokeh](../bokeh/Curve.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -45,9 +43,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "points = [(0.1*i, np.sin(0.1*i)) for i in range(100)]\n", @@ -71,9 +67,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts NdOverlay [legend_position='right']\n", @@ -83,22 +77,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Ellipse.ipynb b/examples/reference/elements/matplotlib/Ellipse.ipynb similarity index 79% rename from examples/elements/matplotlib/Ellipse.ipynb rename to examples/reference/elements/matplotlib/Ellipse.ipynb index 15defcc78c..c6bc2bb608 100644 --- a/examples/elements/matplotlib/Ellipse.ipynb +++ b/examples/reference/elements/matplotlib/Ellipse.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Ellipse Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Ellipse.ipynb)
[Bokeh](../bokeh/Ellipse.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Ellipse (linewidth=6)\n", @@ -62,9 +58,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Ellipse (linewidth=6)\n", @@ -74,22 +68,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/ErrorBars.ipynb b/examples/reference/elements/matplotlib/ErrorBars.ipynb similarity index 82% rename from examples/elements/matplotlib/ErrorBars.ipynb rename to examples/reference/elements/matplotlib/ErrorBars.ipynb index 67b660dafc..ba11db8b7d 100644 --- a/examples/elements/matplotlib/ErrorBars.ipynb +++ b/examples/reference/elements/matplotlib/ErrorBars.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
ErrorBars Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./ErrorBars.ipynb)
[Bokeh](../bokeh/ErrorBars.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -47,9 +45,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(7)\n", @@ -74,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "errors = [(0.1*i, np.sin(0.1*i), np.random.rand()/2, np.random.rand()/4) for i in np.linspace(0, 100, 11)]\n", @@ -85,22 +79,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/HLine.ipynb b/examples/reference/elements/matplotlib/HLine.ipynb similarity index 69% rename from examples/elements/matplotlib/HLine.ipynb rename to examples/reference/elements/matplotlib/HLine.ipynb index 905f2dd4d3..8d6a7ca1bc 100644 --- a/examples/elements/matplotlib/HLine.ipynb +++ b/examples/reference/elements/matplotlib/HLine.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
HLine Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./HLine.ipynb)
[Bokeh](../bokeh/HLine.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts HLine (color='blue' linewidth=6) Points (color='#D3D3D3')\n", @@ -49,22 +45,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/HSV.ipynb b/examples/reference/elements/matplotlib/HSV.ipynb similarity index 80% rename from examples/elements/matplotlib/HSV.ipynb rename to examples/reference/elements/matplotlib/HSV.ipynb index 32f6a1abd6..386064b07a 100644 --- a/examples/elements/matplotlib/HSV.ipynb +++ b/examples/reference/elements/matplotlib/HSV.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
HSV Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./HSV.ipynb)
[Bokeh](../bokeh/HSV.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "x,y = np.mgrid[-50:51, -50:51] * 0.1\n", @@ -62,9 +58,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%opts Image (cmap='gray')\n", @@ -81,9 +75,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "print(hsv.rgb)\n", @@ -92,22 +84,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/HeatMap.ipynb b/examples/reference/elements/matplotlib/HeatMap.ipynb similarity index 81% rename from examples/elements/matplotlib/HeatMap.ipynb rename to examples/reference/elements/matplotlib/HeatMap.ipynb index 1a84b65efc..5302c7ff40 100644 --- a/examples/elements/matplotlib/HeatMap.ipynb +++ b/examples/reference/elements/matplotlib/HeatMap.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
HeatMap Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./HeatMap.ipynb)
[Bokeh](../bokeh/HeatMap.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "data = [(chr(65+i), chr(97+j), i*j) for i in range(5) for j in range(5) if i!=j]\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "heatmap = hv.HeatMap([(0, 0, 1), (0, 0, 10), (1, 0, 2), (1, 1, 3)])\n", @@ -78,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts HeatMap [ colorbar=True fig_size=250]\n", @@ -90,22 +82,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Histogram.ipynb b/examples/reference/elements/matplotlib/Histogram.ipynb similarity index 82% rename from examples/elements/matplotlib/Histogram.ipynb rename to examples/reference/elements/matplotlib/Histogram.ipynb index 4d5b63e598..3f3b2088a4 100644 --- a/examples/elements/matplotlib/Histogram.ipynb +++ b/examples/reference/elements/matplotlib/Histogram.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Histogram Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Histogram.ipynb)
[Bokeh](../bokeh/Histogram.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(1)\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "xs = np.linspace(0, np.pi*2)\n", @@ -79,9 +73,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Histogram (alpha=0.3)\n", @@ -96,22 +88,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Image.ipynb b/examples/reference/elements/matplotlib/Image.ipynb similarity index 82% rename from examples/elements/matplotlib/Image.ipynb rename to examples/reference/elements/matplotlib/Image.ipynb index b02c29ccad..1c7e0befed 100644 --- a/examples/elements/matplotlib/Image.ipynb +++ b/examples/reference/elements/matplotlib/Image.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Image Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Image.ipynb)
[Bokeh](../bokeh/Image.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "ls = np.linspace(0, 10, 200)\n", @@ -59,9 +55,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "img + img[-0.5:0.5, -0.5:0.5]" @@ -77,9 +71,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='black' marker='x' size=20)\n", @@ -98,9 +90,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "img.sample(x=0) + img.reduce(x=np.mean)" @@ -108,22 +98,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/ItemTable.ipynb b/examples/reference/elements/matplotlib/ItemTable.ipynb similarity index 73% rename from examples/elements/matplotlib/ItemTable.ipynb rename to examples/reference/elements/matplotlib/ItemTable.ipynb index 9d9544c6b4..fee8ba6de2 100644 --- a/examples/elements/matplotlib/ItemTable.ipynb +++ b/examples/reference/elements/matplotlib/ItemTable.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
ItemTable Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./ItemTable.ipynb)
[Bokeh](../bokeh/ItemTable.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "hv.ItemTable([('Age', 10), ('Weight',15), ('Height','0.8 meters')])" @@ -46,22 +42,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Path.ipynb b/examples/reference/elements/matplotlib/Path.ipynb similarity index 83% rename from examples/elements/matplotlib/Path.ipynb rename to examples/reference/elements/matplotlib/Path.ipynb index d2bed428a3..7867475cd1 100644 --- a/examples/elements/matplotlib/Path.ipynb +++ b/examples/reference/elements/matplotlib/Path.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Path Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Path.ipynb)
[Bokeh](../bokeh/Path.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -37,9 +35,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Path (color='black' linewidth=4)\n", @@ -63,9 +59,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Path (linewidth=4)\n", @@ -74,7 +68,7 @@ "\n", "adultR = [(0.25, 0.45), (0.35,0.35), (0.25, 0.25), (0.15, 0.35), (0.25, 0.45)]\n", "adultL = [(-0.3, 0.4), (-0.3, 0.3), (-0.2, 0.3), (-0.2, 0.4),(-0.3, 0.4)]\n", - "scene = hv.RGB.load_image('../../../doc/assets/penguins.png')\n", + "scene = hv.RGB.load_image('../assets/penguins.png')\n", "\n", "scene * hv.Path([adultL, adultR, baby]) * hv.Path([baby])" ] @@ -89,9 +83,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Path [fig_size=400 aspect=3]\n", @@ -102,22 +94,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Points.ipynb b/examples/reference/elements/matplotlib/Points.ipynb similarity index 87% rename from examples/elements/matplotlib/Points.ipynb rename to examples/reference/elements/matplotlib/Points.ipynb index b003dddad2..ebecc2dfff 100644 --- a/examples/elements/matplotlib/Points.ipynb +++ b/examples/reference/elements/matplotlib/Points.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Points Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Points.ipynb)
[Bokeh](../bokeh/Points.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='k' marker='+' size=10)\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='k' marker='+' size=10)\n", @@ -76,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points [color_index=2 size_index=3 scaling_factor=50]\n", @@ -110,22 +102,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Polygons.ipynb b/examples/reference/elements/matplotlib/Polygons.ipynb similarity index 80% rename from examples/elements/matplotlib/Polygons.ipynb rename to examples/reference/elements/matplotlib/Polygons.ipynb index 8105094bab..aadea11b8d 100644 --- a/examples/elements/matplotlib/Polygons.ipynb +++ b/examples/reference/elements/matplotlib/Polygons.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Polygons Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Polygons.ipynb)
[Bokeh](../bokeh/Polygons.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Polygons (cmap='hot' edgecolor='black' linewidth=2)\n", @@ -61,9 +57,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "def rectangle(x=0, y=0, width=1, height=1):\n", @@ -76,22 +70,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/QuadMesh.ipynb b/examples/reference/elements/matplotlib/QuadMesh.ipynb similarity index 82% rename from examples/elements/matplotlib/QuadMesh.ipynb rename to examples/reference/elements/matplotlib/QuadMesh.ipynb index 14b5992a91..f68a09346f 100644 --- a/examples/elements/matplotlib/QuadMesh.ipynb +++ b/examples/reference/elements/matplotlib/QuadMesh.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
QuadMesh Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./QuadMesh.ipynb)
[Bokeh](../bokeh/QuadMesh.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "n = 8 # Number of bins in each direction\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts QuadMesh [xticks=[10, 100,1000]] QuadMesh.LogScale [logx=True]\n", @@ -79,9 +73,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "qmesh = hv.QuadMesh((xs, ys, zs))\n", @@ -97,22 +89,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/RGB.ipynb b/examples/reference/elements/matplotlib/RGB.ipynb similarity index 79% rename from examples/elements/matplotlib/RGB.ipynb rename to examples/reference/elements/matplotlib/RGB.ipynb index 6a7dd933af..dda362f9be 100644 --- a/examples/elements/matplotlib/RGB.ipynb +++ b/examples/reference/elements/matplotlib/RGB.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
RGB Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./RGB.ipynb)
[Bokeh](../bokeh/RGB.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,12 +34,10 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ - "hv.RGB.load_image('../../../doc/assets/penguins.png')" + "hv.RGB.load_image('../assets/penguins.png')" ] }, { @@ -54,9 +50,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "x,y = np.mgrid[-50:51, -50:51] * 0.1\n", @@ -78,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Image (cmap='gray')\n", @@ -97,9 +89,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Image (cmap='gray')\n", @@ -113,22 +103,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Raster.ipynb b/examples/reference/elements/matplotlib/Raster.ipynb similarity index 75% rename from examples/elements/matplotlib/Raster.ipynb rename to examples/reference/elements/matplotlib/Raster.ipynb index c0f4a6ac31..2cd625fbce 100644 --- a/examples/elements/matplotlib/Raster.ipynb +++ b/examples/reference/elements/matplotlib/Raster.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Raster Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Raster.ipynb)
[Bokeh](../bokeh/Raster.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "xvals = np.linspace(0,4,202)\n", @@ -55,22 +51,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Scatter.ipynb b/examples/reference/elements/matplotlib/Scatter.ipynb similarity index 87% rename from examples/elements/matplotlib/Scatter.ipynb rename to examples/reference/elements/matplotlib/Scatter.ipynb index 288ce79be2..74f6a5f88c 100644 --- a/examples/elements/matplotlib/Scatter.ipynb +++ b/examples/reference/elements/matplotlib/Scatter.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Scatter Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Scatter.ipynb)
[Bokeh](../bokeh/Scatter.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Scatter (color='k' marker='s' size=10)\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Scatter (color='k' marker='s' size=10)\n", @@ -76,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Scatter [color_index=2 size_index=3 scaling_factor=50]\n", @@ -114,22 +106,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Scatter3D.ipynb b/examples/reference/elements/matplotlib/Scatter3D.ipynb similarity index 80% rename from examples/elements/matplotlib/Scatter3D.ipynb rename to examples/reference/elements/matplotlib/Scatter3D.ipynb index 3bdec7d49a..bd81158011 100644 --- a/examples/elements/matplotlib/Scatter3D.ipynb +++ b/examples/reference/elements/matplotlib/Scatter3D.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Scatter3D Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](../matplotlib/Image.ipynb)
\n", + "
Backends
Matplotlib
\n", "\n", "
\n", "" @@ -22,7 +22,7 @@ "source": [ "import numpy as np\n", "import holoviews as hv\n", - "hv.notebook_extension('matplotlib')" + "hv.extension('matplotlib')" ] }, { @@ -63,22 +63,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Spikes.ipynb b/examples/reference/elements/matplotlib/Spikes.ipynb similarity index 84% rename from examples/elements/matplotlib/Spikes.ipynb rename to examples/reference/elements/matplotlib/Spikes.ipynb index d0e91bd222..19a4d13cd0 100644 --- a/examples/elements/matplotlib/Spikes.ipynb +++ b/examples/reference/elements/matplotlib/Spikes.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Spikes Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Spikes.ipynb)
[Bokeh](../bokeh/Spikes.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -38,9 +36,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes (alpha=0.4) [spike_length=0.1]\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points (color='red')\n", @@ -78,9 +72,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes (cmap='Reds')\n", @@ -97,9 +89,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes [spike_length=0.1] NdOverlay [show_legend=False]\n", @@ -117,9 +107,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spikes (alpha=0.2)\n", @@ -129,22 +117,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Spline.ipynb b/examples/reference/elements/matplotlib/Spline.ipynb similarity index 74% rename from examples/elements/matplotlib/Spline.ipynb rename to examples/reference/elements/matplotlib/Spline.ipynb index 8a11aacd9b..710460daa5 100644 --- a/examples/elements/matplotlib/Spline.ipynb +++ b/examples/reference/elements/matplotlib/Spline.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Spline Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Spline.ipynb)
[Bokeh](../bokeh/Spline.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve (color='#D3D3D3') Spline (linewidth=6 edgecolor='green')\n", @@ -55,22 +51,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Spread.ipynb b/examples/reference/elements/matplotlib/Spread.ipynb similarity index 82% rename from examples/elements/matplotlib/Spread.ipynb rename to examples/reference/elements/matplotlib/Spread.ipynb index fd7f80825a..f5c7544cf3 100644 --- a/examples/elements/matplotlib/Spread.ipynb +++ b/examples/reference/elements/matplotlib/Spread.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Spread Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Spread.ipynb)
[Bokeh](../bokeh/Spread.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -47,9 +45,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(42)\n", @@ -75,9 +71,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Spread (facecolor='indianred' alpha=1)\n", @@ -88,22 +82,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Surface.ipynb b/examples/reference/elements/matplotlib/Surface.ipynb similarity index 83% rename from examples/elements/matplotlib/Surface.ipynb rename to examples/reference/elements/matplotlib/Surface.ipynb index c0d696be37..73dc3ac87f 100644 --- a/examples/elements/matplotlib/Surface.ipynb +++ b/examples/reference/elements/matplotlib/Surface.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Surface Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](../matplotlib/Surface.ipynb)
\n", + "
Backends
Matplotlib
\n", "\n", "
\n", "" @@ -22,7 +22,7 @@ "source": [ "import numpy as np\n", "import holoviews as hv\n", - "hv.notebook_extension('matplotlib')" + "hv.extension('matplotlib')" ] }, { @@ -69,22 +69,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Table.ipynb b/examples/reference/elements/matplotlib/Table.ipynb similarity index 81% rename from examples/elements/matplotlib/Table.ipynb rename to examples/reference/elements/matplotlib/Table.ipynb index 2377305e57..fe7848665b 100644 --- a/examples/elements/matplotlib/Table.ipynb +++ b/examples/reference/elements/matplotlib/Table.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Table Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Table.ipynb)
[Bokeh](../bokeh/Table.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "gender = ['M','M', 'M','F']\n", @@ -57,9 +53,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "hv.Table({'Gender':gender, 'Age':age, 'Weight':weight, 'Height':height},\n", @@ -76,9 +70,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "table = hv.Table((gender, age, weight, height), kdims = ['Gender', 'Age'], vdims=['Weight', 'Height'])\n", @@ -95,9 +87,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "table.select(Gender='M') + table.select(Gender='M', Age=10)" @@ -113,9 +103,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "table.select(Gender='M').to.curve(kdims=[\"Age\"], vdims=[\"Weight\"])" @@ -130,22 +118,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Text.ipynb b/examples/reference/elements/matplotlib/Text.ipynb similarity index 67% rename from examples/elements/matplotlib/Text.ipynb rename to examples/reference/elements/matplotlib/Text.ipynb index b988bae866..f97edeff6c 100644 --- a/examples/elements/matplotlib/Text.ipynb +++ b/examples/reference/elements/matplotlib/Text.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Text Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./Text.ipynb)
[Bokeh](../bokeh/Text.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve (color='#D3D3D3')\n", @@ -48,22 +44,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/Trisurface.ipynb b/examples/reference/elements/matplotlib/Trisurface.ipynb similarity index 77% rename from examples/elements/matplotlib/Trisurface.ipynb rename to examples/reference/elements/matplotlib/Trisurface.ipynb index b39060d8b3..e28fc5e8a0 100644 --- a/examples/elements/matplotlib/Trisurface.ipynb +++ b/examples/reference/elements/matplotlib/Trisurface.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
Trisurface Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](../matplotlib/Trisurface.ipynb)
\n", + "
Backends
Matplotlib
\n", "\n", "
\n", "" @@ -17,14 +17,12 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import holoviews as hv\n", - "hv.notebook_extension('matplotlib')" + "hv.extension('matplotlib')" ] }, { @@ -37,9 +35,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Trisurface [azimuth=30 elevation=30 fig_size=200]\n", @@ -58,9 +54,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Trisurface [fig_size=200 colorbar=True] (cmap='fire')\n", @@ -83,22 +77,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python [conda env:science]", - "language": "python", - "name": "conda-env-science-py" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/VLine.ipynb b/examples/reference/elements/matplotlib/VLine.ipynb similarity index 69% rename from examples/elements/matplotlib/VLine.ipynb rename to examples/reference/elements/matplotlib/VLine.ipynb index 687a96a197..1586b2ec22 100644 --- a/examples/elements/matplotlib/VLine.ipynb +++ b/examples/reference/elements/matplotlib/VLine.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
VLine Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./VLine.ipynb)
[Bokeh](../bokeh/VLine.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VLine (color='red' linewidth=6) Curve (color='#D3D3D3')\n", @@ -49,22 +45,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/elements/matplotlib/VectorField.ipynb b/examples/reference/elements/matplotlib/VectorField.ipynb similarity index 86% rename from examples/elements/matplotlib/VectorField.ipynb rename to examples/reference/elements/matplotlib/VectorField.ipynb index 62c6d63b99..5c668b6bac 100644 --- a/examples/elements/matplotlib/VectorField.ipynb +++ b/examples/reference/elements/matplotlib/VectorField.ipynb @@ -8,7 +8,7 @@ "
\n", "
Title
VectorField Element
\n", "
Dependencies
Matplotlib
\n", - "
Backends
[Matplotlib](./VectorField.ipynb)
[Bokeh](../bokeh/VectorField.ipynb)
\n", + "
Backends
Matplotlib
Bokeh
\n", "
\n", "" ] @@ -16,9 +16,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -36,9 +34,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [size_index=3]\n", @@ -62,9 +58,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [size_index=3] VectorField.A [color_index=2] VectorField.M [color_index=3]\n", @@ -81,9 +75,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [color_index=2 size_index=3 rescale_lengths=False] (scale=4)\n", @@ -101,9 +93,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "xs, ys = np.arange(0, 2 * np.pi, .2), np.arange(0, 2 * np.pi, .2)\n", @@ -126,9 +116,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": false - }, + "metadata": {}, "outputs": [], "source": [ "%%opts VectorField [color_index=3 size_index=3 pivot='tip'] (cmap='fire' scale=0.8) Points (color='black' s=1)\n", @@ -137,22 +125,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 2", - "language": "python", - "name": "python2" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.13" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/reference/elements/plotly/BoxWhiskers.ipynb b/examples/reference/elements/plotly/BoxWhiskers.ipynb new file mode 100644 index 0000000000..be51185f38 --- /dev/null +++ b/examples/reference/elements/plotly/BoxWhiskers.ipynb @@ -0,0 +1,80 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
BoxWhisker Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A ``BoxWhisker`` Element is a quick way of visually summarizing one or more groups of numerical data through their quartiles. \n", + "\n", + "The data of a ``BoxWhisker`` Element may have any number of key dimensions representing the grouping of the value dimension and a single value dimensions representing the distribution of values within each group. See the [Columnar Data Tutorial](../Tutorials/Columnar_Data.ipynb) for supported data formats, which include arrays, pandas dataframes and dictionaries of arrays." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Without any groups a BoxWhisker Element represents a single distribution of values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.BoxWhisker(np.random.randn(1000), vdims=['Value'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By supplying key dimensions we can compare our distributions across multiple variables." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts BoxWhisker [width=600 height=400 show_legend=False]\n", + "groups = [chr(65+g) for g in np.random.randint(0, 3, 200)]\n", + "hv.BoxWhisker((groups, np.random.randint(0, 5, 200), np.random.randn(200)),\n", + " kdims=['Group', 'Category'], vdims=['Value']).sort()" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Curve.ipynb b/examples/reference/elements/plotly/Curve.ipynb new file mode 100644 index 0000000000..3a3f04e523 --- /dev/null +++ b/examples/reference/elements/plotly/Curve.ipynb @@ -0,0 +1,86 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Curve Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Curve`` Elements are used to display quantitative values over a continuous interval or time span. They accept tabular data with one key dimension representing the samples along the x-axis and one value dimension of the height of the curve at for each sample. See the [Columnar Data Tutorial](../Tutorials/Columnar_Data.ipynb) for supported data formats, which include arrays, pandas dataframes and dictionaries of arrays." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Simple Curve\n", + "\n", + "A ``Curve`` is a set of values provided for some set of keys from a [continuously indexable 1D coordinate system](Continuous_Coordinates.ipynb), where the plotted values will be connected up because they are assumed to be samples from a continuous relation." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "points = [(0.1*i, np.sin(0.1*i)) for i in range(100)]\n", + "hv.Curve(points)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Interpolation" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``Curve`` also supports the ``interpolation`` plot option to determine whether to linearly interpolate the curve values or to draw discrete steps:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.NdOverlay({interp: hv.Curve(points[::8])(plot=dict(interpolation=interp))\n", + " for interp in ['linear', 'steps-mid', 'steps-pre', 'steps-post']})" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Distribution.ipynb b/examples/reference/elements/plotly/Distribution.ipynb new file mode 100644 index 0000000000..7ad57b4dca --- /dev/null +++ b/examples/reference/elements/plotly/Distribution.ipynb @@ -0,0 +1,70 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Distribution Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A ``Distribution`` Element is a quick way of visualize the distribution of some data visualizing it as a a histogram or kernel density estimate. Unlike the ``Histogram`` Element ``Distribution`` wraps the raw data rather than representing the already binned data.\n", + "\n", + "Here we will wrap a simple numpy array containing 1000 samples of a normal distribution." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Distribution(np.random.randn(1000))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Distribution`` Elements like all other Elements can be overlaid allowing us to compare two distributions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Distribution(np.random.randn(1000), label='#1') * hv.Distribution(np.random.randn(1000)+2, label='#2')" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/ErrorBars.ipynb b/examples/reference/elements/plotly/ErrorBars.ipynb new file mode 100644 index 0000000000..d41a9853ff --- /dev/null +++ b/examples/reference/elements/plotly/ErrorBars.ipynb @@ -0,0 +1,89 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
ErrorBars Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``ErrorBars`` provide a visual indicator for the variability of the plotted data on a graph. They are usually applied on top of other plots such as scatter, curve or bar plots to indicate the variability in each sample. \n", + "\n", + "``ErrorBars`` may be used to represent symmetric error or assymetric error. An ``ErrorBars`` Element must have one key dimensions representing the samples along the x-axis and two or three value dimensions representing the value of the sample and positive and negative error values associated with that sample. See the [Columnar Data Tutorial](../Tutorials/Columnar_Data.ipynb) for supported data formats, which include arrays, pandas dataframes and dictionaries of arrays." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Symmetric error\n", + "\n", + "By default the ``ErrorBars`` Element accepts x- and y-coordinates along with a symmetric error value:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "np.random.seed(7)\n", + "errors = [(0.1*i, np.sin(0.1*i), np.random.rand()/2) for i in np.linspace(0, 100, 11)]\n", + "hv.Curve(errors) * hv.ErrorBars(errors)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Assymetric error" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``ErrorBars`` is a set of x-/y-coordinates with associated error values. Error values may be either symmetric or asymmetric, and thus can be supplied as an Nx3 or Nx4 array (or any of the alternative constructors Chart Elements allow)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "errors = [(0.1*i, np.sin(0.1*i), np.random.rand()/2, np.random.rand()/4) for i in np.linspace(0, 100, 11)]\n", + "hv.Curve(errors) * hv.ErrorBars(errors, vdims=['y', 'yerrneg', 'yerrpos'])" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/HeatMap.ipynb b/examples/reference/elements/plotly/HeatMap.ipynb new file mode 100644 index 0000000000..cb69c56710 --- /dev/null +++ b/examples/reference/elements/plotly/HeatMap.ipynb @@ -0,0 +1,93 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
HeatMap Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``HeatMap`` visualises tabular data indexed by two key dimensions as a grid of colored values. This allows spotting correlations in multivariate data and provides a high-level overview of how the two variables are plotted.\n", + "\n", + "The data for a ``HeatMap`` may be supplied as 2D tabular data with one or more associated value dimensions. The first value dimension will be colormapped, but further value dimensions may be revealed using the hover tool." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts HeatMap (cmap='RdBu_r')\n", + "data = [(chr(65+i), chr(97+j), i*j) for i in range(5) for j in range(5) if i!=j]\n", + "hv.HeatMap(data)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is important to note that the data should be aggregated before plotting as the ``HeatMap`` cannot display multiple values for one coordinate and will simply use the first value it finds for each combination of x- and y-coordinates." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap = hv.HeatMap([(0, 0, 0), (0, 0, 10), (1, 0, 2), (1, 1, 3)])\n", + "heatmap + heatmap.aggregate(function=np.max).opts(plot=dict(colorbar=True))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the above example shows before aggregating the second value for the (0, 0) is ignored unless we aggregate the data first.\n", + "\n", + "To reveal the values of a ``HeatMap`` we can enable a ``colorbar`` and if you wish to have interactive hover information, you can use the hover tool in the [Bokeh backend](../bokeh/HeatMap.ipynb):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts HeatMap [colorbar=True]\n", + "hv.HeatMap((np.random.randint(0, 10, 100), np.random.randint(0, 10, 100),\n", + " np.random.randn(100), np.random.randn(100)), vdims=['z', 'z2']).redim.range(z=(-2, 2)).sort()" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Image.ipynb b/examples/reference/elements/plotly/Image.ipynb new file mode 100644 index 0000000000..5519bb69b9 --- /dev/null +++ b/examples/reference/elements/plotly/Image.ipynb @@ -0,0 +1,108 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Image Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Like ``Raster``, a HoloViews ``Image`` allows you to view 2D arrays using an arbitrary color map. Unlike ``Raster``, an ``Image`` is associated with a [2D coordinate system in continuous space](Continuous_Coordinates.ipynb), which is appropriate for values sampled from some underlying continuous distribution (as in a photograph or other measurements from locations in real space)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ls = np.linspace(0, 10, 200)\n", + "xx, yy = np.meshgrid(ls, ls)\n", + "\n", + "bounds=(-1,-1,1,1) # Coordinate system: (left, bottom, top, right)\n", + "img = hv.Image(np.sin(xx)*np.cos(yy), bounds=bounds)\n", + "img" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Slicing, sampling, etc. on an ``Image`` all operate in this continuous space, whereas the corresponding operations on a ``Raster`` work on the raw array coordinates." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img + img[-0.5:0.5, -0.5:0.5]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice how, because our declared coordinate system is continuous, we can slice with any floating-point value we choose. The appropriate range of the samples in the input numpy array will always be displayed, whether or not there are samples at those specific floating-point values. This also allows us to index by a floating value, since the ``Image`` is defined as a continuous space it will snap to the closest coordinate, to inspect the closest coordinate we can use the ``closest`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points (color='black' symbol='x' )\n", + "closest = img.closest((0.1,0.1))\n", + "print('The value at position %s is %s' % (closest, img[0.1, 0.1]))\n", + "img * hv.Points([img.closest((0.1,0.1))])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also easily take cross-sections of the Image by using the sample method or collapse a dimension using the ``reduce`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img.sample(x=0) + img.reduce(x=np.mean)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/ItemTable.ipynb b/examples/reference/elements/plotly/ItemTable.ipynb new file mode 100644 index 0000000000..670b91ce49 --- /dev/null +++ b/examples/reference/elements/plotly/ItemTable.ipynb @@ -0,0 +1,52 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
ItemTable Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An ``ItemTable`` is an ordered collection of key, value pairs. It can be used to directly visualize items in a tabular format where the items may be supplied as an ``OrderedDict`` or a list of (key,value) pairs. A standard Python dictionary can be easily visualized using a call to the ``.items()`` method, though the entries in such a dictionary are not kept in any particular order, and so you may wish to sort them before display. One typical usage for an ``ItemTable`` is to list parameter values or measurements associated with an adjacent ``Element``." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.ItemTable([('Age', 10), ('Weight',15), ('Height','0.8 meters')])" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Points.ipynb b/examples/reference/elements/plotly/Points.ipynb new file mode 100644 index 0000000000..a849a48c8b --- /dev/null +++ b/examples/reference/elements/plotly/Points.ipynb @@ -0,0 +1,112 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Points Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``Points`` element visualizes as markers placed in a space of two independent variables, traditionally denoted *x* and *y*. In HoloViews, the names ``'x'`` and ``'y'`` are used as the default ``key_dimensions`` of the element. We can see this from the default axis labels when visualizing a simple ``Points`` element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points (color='black' symbol='x')\n", + "np.random.seed(12)\n", + "coords = np.random.rand(50,2)\n", + "hv.Points(coords)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here both the random *x* values and random *y* values are *both* considered to be the 'data' with no dependency between them (compare this to how [``Scatter``](./Scatter.ipynb) elements are defined). You can think of ``Points`` as simply marking positions in some two-dimensional space that can be sliced by specifying a 2D region-of-interest:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points (color='black' symbol='x' size=10)\n", + "hv.Points(coords) + hv.Points(coords)[0.6:0.8,0.2:0.5]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Although the simplest ``Points`` element simply mark positions in a two-dimensional space without any associated value this doesn't mean value dimensions aren't supported. Here is an example with two additional quantities for each point, declared as the ``value_dimension``s *z* and α visualized as the color and size of the dots, respectively:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [color_index=2]\n", + "np.random.seed(10)\n", + "data = np.random.rand(100,4)\n", + "\n", + "points = hv.Points(data, vdims=['z', 'size'])\n", + "points + points[0.3:0.7, 0.3:0.7].hist()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the right subplot, the ``hist`` method is used to show the distribution of samples along the first value dimension we added (*z*).\n", + "\n", + "\n", + "The marker shape specified above can be any supported by [matplotlib](http://matplotlib.org/api/markers_api.html), e.g. ``s``, ``d``, or ``o``; the other options select the color and size of the marker. For convenience with the [bokeh backend](Bokeh_Backend), the matplotlib marker options are supported using a compatibility function in HoloViews." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Note**: Although the ``Scatter`` element is superficially similar to the [``Points``](./Points.ipynb) element (they can generate plots that look identical), the two element types are semantically quite different. The fundamental difference is that [``Points``](./Points.ipynb) are used to visualize data where the *y* variable is *dependent*. 
This semantic difference also explains why the histogram generated by the ``hist`` call above visualizes the distribution of a different dimension than it does for [``Scatter``](./Scatter.ipynb).\n", + "\n", + "This difference means that ``Points`` naturally combine with elements that express independent variables in two-dimensional space, for instance [``Raster``](./Raster.ipynb) types such as [``Image``](./Image.ipynb). Similarly, ``Scatter`` expresses a dependent relationship in two dimensions and combines naturally with ``Chart`` types such as [``Curve``](./Curve.ipynb)." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Raster.ipynb b/examples/reference/elements/plotly/Raster.ipynb new file mode 100644 index 0000000000..dab41a3a6d --- /dev/null +++ b/examples/reference/elements/plotly/Raster.ipynb @@ -0,0 +1,61 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Raster Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A ``Raster`` is the base class for image-like elements (namely [``Image``](./Image.ipynb), [``RGB``](./RGB.ipynb) and [``HSV``](./HSV.ipynb)), but may be used directly to visualize 2D arrays using a color map:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = np.linspace(0,4,202)\n", + "ys,xs = np.meshgrid(xvals, -xvals[::-1])\n", + "hv.Raster(np.sin(((ys)**3)*xs))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + " The coordinate system of a ``Raster`` is the raw indexes of the underlying array, with integer values always starting from (0,0) in the top left, with default extents corresponding to the shape of the array. For a similar element used to visualize arrays but defined in a continuous Cartesian coordinate system, use the [``Image``](./Image.ipynb) element." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Scatter.ipynb b/examples/reference/elements/plotly/Scatter.ipynb new file mode 100644 index 0000000000..c4e337354d --- /dev/null +++ b/examples/reference/elements/plotly/Scatter.ipynb @@ -0,0 +1,116 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Scatter Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``Scatter`` element visualizes as markers placed in a space of one independent variable, traditionally denoted as *x*, against a dependent variable, traditonally denoted as *y*. In HoloViews, the name ``'x'`` is the default dimension name used in the ``key_dimensions`` and ``'y'`` is the default dimension name used in the ``value_dimensions``. We can see this from the default axis labels when visualizing a simple ``Scatter`` element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter (color='k' symbol='s' size=10)\n", + "np.random.seed(42)\n", + "coords = [(i, np.random.random()) for i in range(20)]\n", + "hv.Scatter(coords)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here the random *y* values are considered to be the 'data' whereas the x positions express where those values are located (compare this to how [``Points``](./Points.ipynb) elements are defined). In this sense, ``Scatter`` can be thought of as a [``Curve``](./Curve.ipynb) without any lines connecting the samples and you can use slicing to view the *y* values corresponding to a chosen *x* range:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter (color='k' symbol='x' size=10)\n", + "hv.Scatter(coords)[0:12] + hv.Scatter(coords)[12:20]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A ``Scatter`` element must always have at least one value dimension but that doesn't mean additional value dimensions aren't supported. Here is an example with two additional quantities for each point, declared as the ``value_dimension``s *z* and α visualized as the color and size of the dots, respectively:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter [color_index=2]\n", + "np.random.seed(10)\n", + "data = np.random.rand(100,4)\n", + "\n", + "scatter = hv.Scatter(data, vdims=['y', 'z', 'size'])\n", + "scatter + scatter[0.3:0.7, 0.3:0.7].hist()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the right subplot, the ``hist`` method is used to show the distribution of samples along our first value dimension, (*y*)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The marker shape specified above can be any supported by [matplotlib](http://matplotlib.org/api/markers_api.html), e.g. ``s``, ``d``, or ``o``; the other options select the color and size of the marker. For convenience with the [bokeh backend](Bokeh_Backend), the matplotlib marker options are supported using a compatibility function in HoloViews." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Note**: Although the ``Scatter`` element is superficially similar to the [``Points``](./Points.ipynb) element (they can generate plots that look identical), the two element types are semantically quite different: ``Points`` are used to visualize data where the *y* variable is *dependent*. 
This semantic difference also explains why the histogram generated by the ``hist`` call above visualizes the distribution of a different dimension than it does for [``Points``](./Points.ipynb).\n", + "\n", + "This difference means that ``Scatter`` naturally combines with elements that express dependent variables in two-dimensional space, such as the ``Chart`` types, for instance [``Curve``](./Curve.ipynb). Similarly, ``Points`` expresses an independent relationship in two dimensions and combines naturally with [``Raster``](./Raster.ipynb) types such as [``Image``](./Image.ipynb)." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Scatter3D.ipynb b/examples/reference/elements/plotly/Scatter3D.ipynb new file mode 100644 index 0000000000..efbf321557 --- /dev/null +++ b/examples/reference/elements/plotly/Scatter3D.ipynb @@ -0,0 +1,72 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Scatter3D Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.notebook_extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Scatter3D`` represents three-dimensional coordinates which may be colormapped or scaled in size according to a value. They are therefore very similar to [``Points``](Points.ipynb) and [``Scatter``](Scatter.ipynb) types but have one additional coordinate dimension. Like other 3D elements the camera angle can be controlled using ``azimuth``, ``elevation`` and ``distance`` plot options:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter3D [width=500 height=500 camera_zoom=20 color_index=2] (size=5 cmap='fire')\n", + "y,x = np.mgrid[-5:5, -5:5] * 0.1\n", + "heights = np.sin(x**2+y**2)\n", + "hv.Scatter3D(zip(x.flat,y.flat,heights.flat))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Just like all regular 2D elements, ``Scatter3D`` types can be overlaid and will follow the default color cycle: \n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter3D [width=500 height=500] (symbol='x' size=2)\n", + "hv.Scatter3D(np.random.randn(100,4), vdims=['Size']) * hv.Scatter3D(np.random.randn(100,4)+2, vdims=['Size'])" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Surface.ipynb b/examples/reference/elements/plotly/Surface.ipynb new file mode 100644 index 0000000000..c033d76811 --- /dev/null +++ b/examples/reference/elements/plotly/Surface.ipynb @@ -0,0 +1,79 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Surface Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Matplotlib
Plotly
\n", + "\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.notebook_extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Surface`` is used for a set of gridded points whose associated value dimension represents samples from a continuous surface. ``Surface`` is equivalent to an ``Image`` type and supports all the same data formats, including simply NumPy arrays with associated ``bounds`` and other gridded data formats such as xarray.\n", + "\n", + "Rendering a large can often be quite expensive, using ``rstride`` and ``cstride`` we can draw a coarser surface. We can also control the ``azimuth``, ``elevation`` and ``distance`` as plot options to control the camera angle:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Surface [width=500 height=500] (cmap='plasma')\n", + "hv.Surface(np.sin(np.linspace(0,100*np.pi*2,10000)).reshape(100,100))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to a simple surface plots, the matplotlib surface plot also supports other related ``plot_type`` modes including ``'wireframe'`` and ``'contour'`` plots:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Surface [width=500 height=500] (cmap='fire')\n", + "xs = np.arange(-4, 4, 0.25)\n", + "ys = np.arange(-4, 4, 0.25)\n", + "X, Y = np.meshgrid(xs, ys)\n", + "R = np.sqrt(X**2 + Y**2)\n", + "Z = np.sin(R)\n", + "surface = hv.Surface((xs, ys, Z))\n", + "surface" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Table.ipynb b/examples/reference/elements/plotly/Table.ipynb new file mode 100644 index 0000000000..0fd6eb0850 --- /dev/null +++ b/examples/reference/elements/plotly/Table.ipynb @@ -0,0 +1,128 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Table Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Bokeh
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A table is more general than an [``ItemTable``](./ItemTable.ioynb), as it allows multi-dimensional keys and multidimensional values. Let's say we have the following data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "gender = ['M','M', 'M','F']\n", + "age = [10,16,13,12]\n", + "weight = [15,18,16,10]\n", + "height = [0.8,0.6,0.7,0.8]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can construct a ``Table`` using a dictionary format (identical in format as that accepted by the [pandas](http://pandas.pydata.org/) ``DataFrame``):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Table({'Gender':gender, 'Age':age, 'Weight':weight, 'Height':height},\n", + " kdims = ['Gender', 'Age'], vdims=['Weight', 'Height'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Or we can declare the same table by dimension position, with key dimensions followed by value dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "table = hv.Table((gender, age, weight, height), kdims = ['Gender', 'Age'], vdims=['Weight', 'Height'])\n", + "table" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that you can use the ``select`` method using tables by the key dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "table.select(Gender='M') + table.select(Gender='M', Age=10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``Table`` is used as a common data structure that may be converted to any other HoloViews data structure via the ``to`` utility available on the object. Here we use this utility to show the weight of the males in our datset by age:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "table.select(Gender='M').to.curve(kdims=[\"Age\"], vdims=[\"Weight\"])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For more extended usage of table conversion see the [Columnar Data](Columnnar_Data.ipynb) and [Pandas Conversion](Pandas_Conversion.ipynb) Tutorials." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/reference/elements/plotly/Trisurface.ipynb b/examples/reference/elements/plotly/Trisurface.ipynb new file mode 100644 index 0000000000..e81e627bf5 --- /dev/null +++ b/examples/reference/elements/plotly/Trisurface.ipynb @@ -0,0 +1,86 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Trisurface Element
\n", + "
Dependencies
Plotly
\n", + "
Backends
Matplotlib
Plotly
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.notebook_extension('plotly')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``Trisurface`` Element renders any collection of 3D points as a surface by applying [Delaunay triangulation](https://en.wikipedia.org/wiki/Delaunay_triangulation). It is therefore useful for plotting an arbitrary collection of datapoints as a 3D surface. Like other 3D elements it supports ``azimuth``, ``elevation`` and ``distance`` plot options to control the camera position:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Trisurface [width=500 height=500]\n", + "y,x = np.mgrid[-5:5, -5:5] * 0.1\n", + "heights = np.sin(x**2+y**2)\n", + "hv.Trisurface((x.flat,y.flat,heights.flat))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Like all other colormapped plots we can easily add a ``colorbar`` and control the ``cmap`` of the plot:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Trisurface [width=500 height=500 colorbar=True] (cmap='fire')\n", + "\n", + "u=np.linspace(0,2*np.pi, 24)\n", + "v=np.linspace(-1,1, 8)\n", + "u,v=np.meshgrid(u,v)\n", + "u=u.flatten()\n", + "v=v.flatten()\n", + "\n", + "#evaluate the parameterization at the flattened u and v\n", + "tp=1+0.5*v*np.cos(u/2.)\n", + "x=tp*np.cos(u)\n", + "y=tp*np.sin(u)\n", + "z=0.5*v*np.sin(u/2.)\n", + "\n", + "surface = hv.Trisurface((x, y, z), label='Moebius band')\n", + "surface" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/streams/bokeh/bounds_selection.ipynb b/examples/reference/streams/bokeh/bounds_selection.ipynb similarity index 88% rename from examples/streams/bokeh/bounds_selection.ipynb rename to examples/reference/streams/bokeh/bounds_selection.ipynb index 1eff1d0243..aac9049c9a 100644 --- a/examples/streams/bokeh/bounds_selection.ipynb +++ b/examples/reference/streams/bokeh/bounds_selection.ipynb @@ -65,22 +65,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/boundsx_selection.ipynb b/examples/reference/streams/bokeh/boundsx_selection.ipynb similarity index 78% rename from examples/streams/bokeh/boundsx_selection.ipynb rename to examples/reference/streams/bokeh/boundsx_selection.ipynb index e5e3904723..9a7e958b34 100644 --- a/examples/streams/bokeh/boundsx_selection.ipynb +++ b/examples/reference/streams/bokeh/boundsx_selection.ipynb @@ -17,11 +17,10 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ + "import pandas as pd\n", "import numpy as np\n", "import holoviews as hv\n", "from holoviews import streams\n", @@ -31,9 +30,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Curve[tools=['xbox_select']]\n", @@ -53,31 +50,16 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "
" ] } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.5.2" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/boundsy_selection.ipynb b/examples/reference/streams/bokeh/boundsy_selection.ipynb similarity index 86% rename from examples/streams/bokeh/boundsy_selection.ipynb rename to examples/reference/streams/bokeh/boundsy_selection.ipynb index cda2a49fd9..ba23b752b2 100644 --- a/examples/streams/bokeh/boundsy_selection.ipynb +++ b/examples/reference/streams/bokeh/boundsy_selection.ipynb @@ -61,22 +61,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.5.2" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/reference/streams/bokeh/curve_selection.ipynb b/examples/reference/streams/bokeh/curve_selection.ipynb new file mode 100644 index 0000000000..24157ed30b --- /dev/null +++ b/examples/reference/streams/bokeh/curve_selection.ipynb @@ -0,0 +1,53 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Curve selection streams example
\n", + "
Description
A linked streams example demonstrating how to use the Selection1D stream to access Curves selected using a tap tool on one plot and mirror them on another plot.
\n", + "
Backends
Bokeh
\n", + "
Tags
streams, linked, selection, interactive
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "from holoviews import streams\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts NdOverlay [legend_limit=0] Curve [tools=['tap']] (line_width=10)\n", + "ndoverlay = hv.NdOverlay({i: hv.Curve(np.arange(10)*i) for i in range(5)})\n", + "\n", + "selection = streams.Selection1D(source=ndoverlay)\n", + "dmap = hv.DynamicMap(lambda index: ndoverlay[index] if index else ndoverlay.clone(),\n", + " kdims=[], streams=[selection])\n", + "ndoverlay + dmap" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/streams/bokeh/heatMap_tap.ipynb b/examples/reference/streams/bokeh/heatmap_tap.ipynb similarity index 67% rename from examples/streams/bokeh/heatMap_tap.ipynb rename to examples/reference/streams/bokeh/heatmap_tap.ipynb index cd6fb37cc8..7cfe9f21d5 100644 --- a/examples/streams/bokeh/heatMap_tap.ipynb +++ b/examples/reference/streams/bokeh/heatmap_tap.ipynb @@ -24,7 +24,7 @@ "import numpy as np\n", "import holoviews as hv\n", "from holoviews import streams\n", - "hv.extension('bokeh')" + "hv.extension('bokeh', width=90)" ] }, { @@ -33,25 +33,24 @@ "metadata": {}, "outputs": [], "source": [ - "%opts HeatMap [width=600 height=500 logz=True tools=['hover'] xrotation=90] (cmap='fire') \n", - "%opts Histogram [width=375 height=500] (line_color='white' fill_color='grey') {+framewise}\n", + "%opts HeatMap [width=700 height=500 logz=True fontsize={'xticks': '6pt'}, tools=['hover'] xrotation=90] (cmap='RdBu_r') \n", + "%opts Curve [width=375 height=500 yaxis='right'] (line_color='black') {+framewise}\n", "\n", "# Declare dataset\n", - "df = pd.read_csv('../../assets/disease.csv.gz')\n", - "dataset = hv.Dataset(df, vdims=['Measles Incidence'])\n", + "df = pd.read_csv('http://assets.holoviews.org/data/diseases.csv.gz', compression='gzip')\n", + "dataset = hv.Dataset(df, vdims=[('measles','Measles Incidence')])\n", "\n", "# Declare HeatMap\n", "heatmap = hv.HeatMap(dataset.aggregate(['Year', 'State'], np.mean),\n", - " label='Measles Incidence')\n", + " label='Measles Incidence').select(Year=(1928, 2002))\n", "\n", "# Declare Tap stream with heatmap as source and initial values\n", "posxy = hv.streams.Tap(source=heatmap, x=1951, y='New York')\n", "\n", "# Define function to compute histogram based on tap location\n", "def tap_histogram(x, y):\n", - " hist = hv.operation.histogram(dataset.select(State=y, Year=int(x)), normed=False)\n", - " label = 'Year: %s, State: %s' % (x, y)\n", - " return hist.relabel(group='Histogram', label=label)\n", + " return hv.Curve(dataset.select(State=y, Year=int(x)), kdims=['Week'],\n", + " label='Year: %s, State: %s' % (x, y))\n", "\n", "heatmap + hv.DynamicMap(tap_histogram, kdims=[], streams=[posxy])" ] @@ -60,22 +59,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/linked_pointer_crosssection.ipynb b/examples/reference/streams/bokeh/linked_pointer_crosssection.ipynb similarity index 85% rename from examples/streams/bokeh/linked_pointer_crosssection.ipynb rename to examples/reference/streams/bokeh/linked_pointer_crosssection.ipynb index bc47ed93f8..e11ccd9787 100644 --- a/examples/streams/bokeh/linked_pointer_crosssection.ipynb +++ b/examples/reference/streams/bokeh/linked_pointer_crosssection.ipynb @@ -55,22 +55,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/multiple_selection.ipynb b/examples/reference/streams/bokeh/multiple_selection.ipynb similarity index 83% rename from examples/streams/bokeh/multiple_selection.ipynb rename to examples/reference/streams/bokeh/multiple_selection.ipynb index 4fda718a1a..44ff9e2137 100644 --- a/examples/streams/bokeh/multiple_selection.ipynb +++ b/examples/reference/streams/bokeh/multiple_selection.ipynb @@ -17,9 +17,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -31,9 +29,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": true - }, + "metadata": {}, "outputs": [], "source": [ "%%opts Points [tools=['box_select', 'lasso_select', 'tap']]\n", @@ -58,22 +54,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/point_selection1D.ipynb b/examples/reference/streams/bokeh/point_selection1D.ipynb similarity index 86% rename from examples/streams/bokeh/point_selection1D.ipynb rename to examples/reference/streams/bokeh/point_selection1D.ipynb index 97858e869a..1c387d0110 100644 --- a/examples/streams/bokeh/point_selection1D.ipynb +++ b/examples/reference/streams/bokeh/point_selection1D.ipynb @@ -57,22 +57,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/pointer_crosshair.ipynb b/examples/reference/streams/bokeh/pointer_crosshair.ipynb similarity index 82% rename from examples/streams/bokeh/pointer_crosshair.ipynb rename to examples/reference/streams/bokeh/pointer_crosshair.ipynb index 02e1339600..4f33b2f9a7 100644 --- a/examples/streams/bokeh/pointer_crosshair.ipynb +++ b/examples/reference/streams/bokeh/pointer_crosshair.ipynb @@ -17,9 +17,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", @@ -53,22 +51,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/streams/bokeh/range_histogram.ipynb b/examples/reference/streams/bokeh/range_histogram.ipynb similarity index 84% rename from examples/streams/bokeh/range_histogram.ipynb rename to examples/reference/streams/bokeh/range_histogram.ipynb index d41cb1f4b7..9a32efc0a5 100644 --- a/examples/streams/bokeh/range_histogram.ipynb +++ b/examples/reference/streams/bokeh/range_histogram.ipynb @@ -54,22 +54,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
" + "
" ] } ], "metadata": { "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 2 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.11" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/reference/streams/bokeh/regression_tap.ipynb b/examples/reference/streams/bokeh/regression_tap.ipynb new file mode 100644 index 0000000000..05e72d0554 --- /dev/null +++ b/examples/reference/streams/bokeh/regression_tap.ipynb @@ -0,0 +1,88 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + "
\n", + "
Title
Regression selection
\n", + "
Description
A linked streams example demonstrating how to use the Selection1D stream to tap on a datapoint and reveal a regression plot. Highlights how custom interactivity can be used to reveal more information about a dataset.
\n", + "
Backends
Bokeh
\n", + "
Tags
streams, linked, tap selection
\n", + "
\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "from holoviews.streams import Selection1D\n", + "from scipy import stats\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter [color_index=2 tools=['tap', 'hover'] width=600] {+framewise} (marker='triangle' cmap='Set1' size=10)\n", + "%%opts Overlay [toolbar='above' legend_position='right'] Curve (line_color='black') {+framewise}\n", + "\n", + "def gen_samples(N, corr=0.8):\n", + " xx = np.array([-0.51, 51.2])\n", + " yy = np.array([0.33, 51.6])\n", + " means = [xx.mean(), yy.mean()] \n", + " stds = [xx.std() / 3, yy.std() / 3]\n", + " covs = [[stds[0]**2 , stds[0]*stds[1]*corr], \n", + " [stds[0]*stds[1]*corr, stds[1]**2]] \n", + "\n", + " return np.random.multivariate_normal(means, covs, N)\n", + "\n", + "data = [('Week %d' % (i%10), np.random.rand(), chr(65+np.random.randint(5)), i) for i in range(100)]\n", + "sample_data = hv.NdOverlay({i: hv.Points(gen_samples(np.random.randint(1000, 5000), r2))\n", + " for _, r2, _, i in data})\n", + "points = hv.Scatter(data, kdims=['Date', 'r2'], vdims=['block', 'id']).redim.range(r2=(0., 1))\n", + "stream = Selection1D(source=points)\n", + "empty = (hv.Points(np.random.rand(0, 2)) * hv.Curve(np.random.rand(0, 2))).relabel('No selection')\n", + "\n", + "def regression(index):\n", + " if not index:\n", + " return empty\n", + " scatter = sample_data[index[0]]\n", + " xs, ys = scatter['x'], scatter['y']\n", + " slope, intercep, rval, pval, std = stats.linregress(xs, ys)\n", + " xs = np.linspace(*scatter.range(0)+(2,))\n", + " reg = slope*xs+intercep\n", + " return (scatter * hv.Curve((xs, reg))).relabel('r2: %.3f' % slope)\n", + "\n", + "reg = hv.DynamicMap(regression, kdims=[], streams=[stream])\n", + "\n", + "average = hv.Curve(points, kdims=['Date'], vdims=['r2']).aggregate(function=np.mean)\n", + "points * average + reg" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/topics/geometry/lsystems.ipynb b/examples/topics/geometry/lsystems.ipynb new file mode 100644 index 0000000000..52cc03aaae --- /dev/null +++ b/examples/topics/geometry/lsystems.ipynb @@ -0,0 +1,535 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# L-Systems\n", + "\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A [Lindenmayer system](https://en.wikipedia.org/wiki/L-system) or L-system is a mathematical system that can be used to describe growth process such as the growth of plants. Formally, it is a symbol expansion system whereby [rewrite rules](https://en.wikipedia.org/wiki/Rewriting) are applies iteratively to generate a longer string of symbols starting from a simple initial state. In this notebook, we will see how various types of fractal, including plant-like ones can be generated with L-systems and visualized with HoloViews." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "import numpy as np\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This notebook makes extensive use of the ``Path`` element and we will want to keep equal aspects and suppress the axes:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Path {+framewise +axiswise} [xaxis=None, yaxis=None show_title=False] (color='black')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Some simple patterns" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this notebook, we will be drawing paths relative to an agent, in the spirit of [turtle graphics](https://en.wikipedia.org/wiki/Turtle_graphics). 
For this we define a simple agent class that has a ``path`` property to show us the path travelled from the point of initialization:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class SimpleAgent(object):\n", + " \n", + " def __init__(self, x=0,y=0, heading=0):\n", + " self.x, self.y = x,y\n", + " self.heading = heading\n", + " self.trace = [(self.x, self.y)]\n", + " \n", + " def forward(self, distance):\n", + " self.x += np.cos(2*np.pi * self.heading/360.0)\n", + " self.y += np.sin(2*np.pi * self.heading/360.0)\n", + " self.trace.append((self.x,self.y))\n", + " \n", + " def rotate(self, angle):\n", + " self.heading += angle\n", + " \n", + " def back(self, distance):\n", + " self.heading += 180\n", + " self.forward(distance)\n", + " self.heading += 180\n", + " \n", + " @property\n", + " def path(self):\n", + " return hv.Path([self.trace])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now test our ``SimpleAgent`` by drawing some spirographs:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def pattern(angle= 5):\n", + " agent = SimpleAgent()\n", + " for i in range(360//angle):\n", + " for i in range(4):\n", + " agent.forward(1)\n", + " agent.rotate(90)\n", + " agent.rotate(angle)\n", + " return agent\n", + " \n", + "(pattern(20).path + pattern(10).path + pattern(5).path\n", + " + pattern(5).path * pattern(10).path * pattern(20).path).cols(2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "We can also draw some pretty rose patterns, adapted from [these equations](http://www.mathcats.com/gallery/fiverosedetails.html):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def roses(l,n,k):\n", + " agent = SimpleAgent()\n", + " n * 10\n", + " x = (2.0 * k -n) / (2.0 * n)\n", + " for i in range(360*n):\n", + " agent.forward(l)\n", + " agent.rotate(i + x)\n", + " return agent\n", + "\n", + "roses(5, 7, 3).path + roses(5, 12, 5).path" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Following rules" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now want to the capabilites of our agent with the ability to read instructions, telling it which path to follow. Let's define the meaning of the following symbols:\n", + "\n", + "**F**: Move forward by a pre-specified distance.
\n", + "**B**: Move backwards by a pre-specified distance.
\n", + "**+**: Rotate anti-clockwise by a pre-specified angle.
\n", + "**-**: Rotate clockwise by a pre-specified angle.
\n", + "\n", + "Here is an agent class that can read strings of such symbols to draw the corresponding pattern:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class Agent(SimpleAgent):\n", + " \"An upgraded agent that can follow some rules\"\n", + " \n", + " default_rules = {'F': lambda t,d,a: t.forward(d),\n", + " 'B': lambda t,d,a: t.back(d),\n", + " '+': lambda t,d,a: t.rotate(-a),\n", + " '-': lambda t,d,a: t.rotate(a)}\n", + " \n", + " def __init__(self, x=0,y=0, instructions=None, heading=0, \n", + " distance=5, angle=60, rules=default_rules):\n", + " super(Agent,self).__init__(x,y, heading)\n", + " self.distance = distance\n", + " self.angle = angle\n", + " self.rules = rules\n", + " if instructions: self.process(instructions, self.distance, self.angle)\n", + " \n", + " def process(self, instructions, distance, angle):\n", + " for i in instructions: \n", + " self.rules[i](self, distance, angle)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Defining L-Systems" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "L-systems are defined with a [rewrite system](https://en.wikipedia.org/wiki/Rewriting), making use of a set of [production rules](https://en.wikipedia.org/wiki/Production_(computer_science). What this means is that L-systems can generate instructions for our agent to follow, and therefore generate paths.\n", + "\n", + "Now we define the ``expand_rules`` function which can process some expansion rules to repeatedly substitute an initial set of symbols with new symbols:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def expand_rules(initial, iterations, productions):\n", + " \"Expand an initial symbol with the given production rules\"\n", + " expansion = initial\n", + " for i in range(iterations):\n", + " intermediate = \"\"\n", + " for ch in expansion:\n", + " intermediate = intermediate + productions.get(ch,ch)\n", + " expansion = intermediate\n", + " return expansion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Koch curve and snowflake" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To demonstrate ``expand_rules``, let's define two different rules:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "koch_curve = {'F':'F+F-F-F+F'} # Replace 'F' with 'F+F-F-F+F'\n", + "koch_snowflake = {'F':'F-F++F-F'} # Replace 'F' with 'F-F++F-F'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here are the first three steps using the first rule:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for i in range(3):\n", + " print('%d: %s' % (i, expand_rules('F', i, koch_curve)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that these are instructions our agent can follow!" 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Path {+axiswise} (color=Cycle())\n", + "k1 = Agent(-200, 0, expand_rules('F', 4, koch_curve), angle=90).path\n", + "k2 = Agent(-200, 0, expand_rules('F', 4, koch_snowflake)).path\n", + "k1 + k2 + (k1 * k2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This shows two variants of the [Koch snowflake](https://en.wikipedia.org/wiki/Koch_snowflake) where ``koch_curve`` is a variant that uses right angles." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sierpinski triangle" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following example introduces a mutual relationship between two symbols, 'A' and 'B', instead of just the single symbol 'F' used above:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sierpinski_triangle = {'A':'B-A-B', 'B':'A+B+A'}\n", + "for i in range(3):\n", + " print('%d: %s' % (i, expand_rules('A', i,sierpinski_triangle)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once again we can use these instructions to draw an interesting shape although we also need to define what these symbols mean to our agent:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Path (color='green')\n", + "sierpinski_rules = {'A': lambda t,d,a: t.forward(d),\n", + " 'B': lambda t,d,a: t.forward(d),\n", + " '+': lambda t,d,a: t.rotate(-a),\n", + " '-': lambda t,d,a: t.rotate(a)}\n", + "\n", + "instructions = expand_rules('A', 9,sierpinski_triangle)\n", + "Agent(x=-200, y=0, rules=sierpinski_rules, instructions=instructions, angle=60).path" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that with our L-system expansion in terms of 'A' and 'B', we have defined the famous [Sierpinski_triangle](https://en.wikipedia.org/wiki/Sierpinski_triangle) fractal." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### The Dragon curve" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now for another famous fractal:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dragon_curve = {'X':'X+YF+', 'Y':'-FX-Y'}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now have two new symbols 'X' and 'Y' which we need to define in addition to 'F', '+' and '-' which we used before:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dragon_rules = dict(Agent.default_rules, X=lambda t,d,a: None, Y=lambda t,d,a: None)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that 'X' and 'Y' don't actual do anything directly! These symbols are important in the expansion process but have no meaning to the agent. 
This time, let's use a ``HoloMap`` to view the expansion:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Path {+framewise}\n", + "\n", + "def pad_extents(path):\n", + " \"Add 5% padding around the path\"\n", + " minx, maxx = path.range('x')\n", + " miny, maxy = path.range('y')\n", + " xpadding = ((maxx-minx) * 0.1)/2\n", + " ypadding = ((maxy-miny) * 0.1)/2\n", + " path.extents = (minx-xpadding, miny-ypadding, maxx+xpadding, maxy+ypadding)\n", + " return path\n", + " \n", + "hmap = hv.HoloMap(kdims=['Iteration'])\n", + "for i in range(7,17):\n", + " path = Agent(-200, 0, expand_rules('FX', i, dragon_curve), rules=dragon_rules, angle=90).path\n", + " hmap[i] = pad_extents(path)\n", + "hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This fractal is known as the [Dragon Curve](https://en.wikipedia.org/wiki/Dragon_curve)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Plant fractals" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have seen how to generate various fractals with L-systems, but we have not yet seen the plant-like fractals that L-systems are most famous for. This is because we can't draw a realistic plant with a single unbroken line: we need to be able to draw some part of the plant then jump back to an earlier state.\n", + "\n", + "This can be achieved by adding two new actions to our agent: ``push`` to record the current state of the agent and ``pop`` to pop back to the state of the last push:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class AgentWithState(Agent):\n", + " \"Stateful agent that can follow instructions\"\n", + " \n", + " def __init__(self, x,y, instructions, **kwargs):\n", + " super(AgentWithState, self).__init__(x=x,y=y, instructions=None, **kwargs)\n", + " self.traces = []\n", + " self.state = []\n", + " self.process(instructions, self.distance, self.angle)\n", + " \n", + " def push(self):\n", + " self.traces.append(self.trace[:])\n", + " self.state.append((self.heading, self.x, self.y))\n", + " \n", + " def pop(self):\n", + " self.traces.append(self.trace[:])\n", + " [self.heading, self.x, self.y] = self.state.pop()\n", + " self.trace = [(self.x, self.y)]\n", + " \n", + " @property\n", + " def path(self):\n", + " traces = self.traces + [self.trace]\n", + " return hv.Path(traces)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's look at the first three expansions of a new ruleset we will use to generate a plant-like fractal:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plant_fractal = {'X':'F-[[X]+X]+F[+FX]-X', 'F':'FF'}\n", + "for i in range(3):\n", + " print('%d: %s' % (i, expand_rules('X', i, plant_fractal)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The new symbols '[' and ']' correspond to the new push and pop state actions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plant_rules = dict(Agent.default_rules, X=lambda t,d,a: None, \n", + " **{'[': lambda t,d,a: t.push(), ']': lambda t,d,a: t.pop()})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now generate a nice plant-like fractal:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": 
[], + "source": [ + "%%opts Path {+framewise} (color='g' line_width=1)\n", + "hmap = hv.HoloMap(kdims=['Iteration'])\n", + "for i in range(7):\n", + " instructions = expand_rules('X', i, plant_fractal)\n", + " if i > 2:\n", + " hmap[i] = AgentWithState(-200, 0, instructions, heading=90, rules=plant_rules, angle=25).path\n", + "hmap" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/topics/geometry/square_limit.ipynb b/examples/topics/geometry/square_limit.ipynb index c4224141c6..7c747abd51 100644 --- a/examples/topics/geometry/square_limit.ipynb +++ b/examples/topics/geometry/square_limit.ipynb @@ -43,7 +43,7 @@ "metadata": {}, "outputs": [], "source": [ - "%opts Spline [xaxis=None yaxis=None aspect='equal' bgcolor='white']" + "%opts Spline [xaxis=None yaxis=None aspect='equal' bgcolor='white'] (linewidth=0.8)" ] }, { @@ -91,9 +91,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "from matplotlib.path import Path\n", @@ -157,13 +155,13 @@ " den = n + m\n", " t1 = Affine2D().scale(n / den, 1)\n", " t2 = Affine2D().scale(m / den, 1).translate(n / den, 0)\n", - " return T(spline1, t1) * T(spline2, t2)\n", + " return combine(T(spline1, t1) * T(spline2, t2))\n", "\n", "def above(spline1, spline2, n=1, m=1):\n", " den = n + m\n", " t1 = Affine2D().scale(1, n / den).translate(0, m / den)\n", " t2 = Affine2D().scale(1, m / den)\n", - " return T(spline1, t1) * T(spline2, t2)\n", + " return combine(T(spline1, t1) * T(spline2, t2))\n", "\n", "beside(fish, fish)* unitsquare + above(fish,fish) * unitsquare" ] @@ -272,7 +270,7 @@ "metadata": {}, "outputs": [], "source": [ - "%%output size=300\n", + "%%output size=250\n", "def squarelimit(n):\n", " return nonet(corner(n), side(n), rot(rot(rot(corner(n)))),\n", " rot(side(n)), u, rot(rot(rot(side(n)))), \n", @@ -282,22 +280,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.1" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/topics/simulation/boids.ipynb b/examples/topics/simulation/boids.ipynb index 15c6b97123..401aacf40d 100644 --- a/examples/topics/simulation/boids.ipynb +++ b/examples/topics/simulation/boids.ipynb @@ -52,9 +52,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "def radarray(N):\n", @@ -80,9 +78,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "def count(mask, n): \n", @@ -108,9 +104,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "class Boids(BoidState):\n", @@ -164,9 +158,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "%opts VectorField [xaxis=None yaxis=None] (scale=0.08)\n", @@ -183,9 +175,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "boids = 
Boids(500)" @@ -246,9 +236,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "dmap.periodic(0.01, timeout=60, block=True) # Run the simulation for 60 seconds" @@ -256,22 +244,9 @@ } ], "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.1" + "pygments_lexer": "ipython3" } }, "nbformat": 4, diff --git a/examples/user_guide/01-Annotating_Data.ipynb b/examples/user_guide/01-Annotating_Data.ipynb new file mode 100644 index 0000000000..6ad627b8f9 --- /dev/null +++ b/examples/user_guide/01-Annotating_Data.ipynb @@ -0,0 +1,234 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "import holoviews.util\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Annotating your Data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One of the fundamental concepts in HoloViews introduced in the 'Getting Started' guide [Introduction] is that of annotating data with key, semantic metadata. This user guide documents the two main types annotation (1) dimensions used to specify the abstract space in which the data resides (2) the element group/label system used to organize and select data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Specifying dimensionality\n", + "\n", + "Simple datastructure such as dataframes, arrays, lists or dictionaries cannot be given a suitable visual representation without some associated semantic context. Fundamentally, HoloViews lets you specify this context by first selecting a suitable element type from the [gallery] and by then specifying the corresponding *dimensions*.\n", + "\n", + "Here is a very simple example, showing a ``Curve`` element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = range(-10,11)\n", + "ys = [-el*el for el in xs]\n", + "curve = hv.Curve((xs, ys))\n", + "curve" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All elements (except a small number of annotation elements) have two types of dimensions: *key dimensions* (``kdims``) and *value dimensions* (``vdims``). The *key dimensions* are the dimensions you can index *by* to get the values corresponding to the *value* dimensions. You can learn more about indexing data in the [Indexing and Selecting Data](./09-Indexing_and_Selecting_Data.ipynb) user guide.\n", + "\n", + "Different elements have different numbers of required key dimensions and value dimensions. For instance, a ``Curve`` always has one key dimension and a value dimension. 
As we did not explicitly specify anything regarding dimensions when declaring the curve above, the ``kdims`` and ``vidms`` use their default names 'x' and 'y':" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\"Object 'curve' has kdims {kdims} and vdims {vdims} \".format(kdims=curve.kdims, vdims=curve.vdims)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The easiest way to specify dimensions other than the defaults is as strings, which sets the dimension names:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trajectory = hv.Curve((xs, ys), kdims=['distance'], vdims=['height'])\n", + "trajectory" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\"Object 'trajectory' has kdims {kdims} and vdims {vdims} \".format(kdims=trajectory.kdims, vdims=trajectory.vdims)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see our strings have been 'promoted' to dimension objects describing the space in which our trajectory data resides: the ``kdims`` and ``vdims`` *always* contain instances of the ``Dimension`` class which will be described in a later section. Note that naming our dimensions has given the corresponding visual representation appropriate axis labels." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Dimension parameters\n", + "\n", + "``Dimension`` objects have a number of parameters used to describe the space in which the data resides. Only two of these are considered *core* parameters that identify the identity of the dimension object, the rest consist of auxilliary metadata. Here are the descriptions of the most important ones:\n", + "\n", + "\n", + "
\n", + "
\n", + "
``name``
(core) A concise name for the dimension that should be useable as a Python keyword
\n", + "
``label``
(core) A longer description of the dimension (can contain unicode)
\n", + "
``range``
The minumum and maximum allowable values for the dimension.
\n", + "
``soft_range``
Suggested minumum and maximum values, used to specify a useful portion of the range.
\n", + "
``step``
If specified, the step parameter suggests an appropriate sampling of a continuous range
\n", + "
``unit``
If specified, the name of the unit associated with the dimension.
\n", + "
``values``
Explicit list of allowed dimension values
\n", + "
\n", + "\n", + "\n", + "For the full list of parameters, you can call ``hv.help(hv.Dimension)``.\n", + "\n", + "Note that you can also use a ``(name, label)`` tuple instead of just a string name if you want to specify both ``name`` and ``label`` without building an explicit ``Dimension`` object which can also be used in the ``kdims`` and ``vdims``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "distance = hv.Dimension('distance', label='Horizontal distance', unit='m')\n", + "height = hv.Dimension(('height','Height above sea-level'), unit='m')\n", + "\n", + "wo_unit = hv.Curve((xs, ys), \n", + " kdims=[('distance','Horizontal distance')], \n", + " vdims=[('height','Height above sea-level')])\n", + "with_unit = hv.Curve((xs, ys), kdims=[distance], vdims=[height])\n", + "\n", + "wo_unit + with_unit" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setting properties with redim" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Declaring dimension objects with appropriate parameters can be verbose if you only want a set a few specific parameters. You can often avoid declaring explicit dimension objects using the ``redim`` method which returns a *clone* of the element: the same data, wrapped in a new instance of the same element type with the new dimension settings.\n", + "\n", + "Let's use ``redim`` to swap out the 'height' dimension for the 'altitude' dimension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "renamed_height = trajectory.redim(height='altitude')\n", + "renamed_height" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``redim`` 'method' is actually a utility that can be used to set any of the dimension parameters:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "renamed_height.redim.label(altitude='Altitude above sea-level', distance='Horizontal distance')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This pattern can be used to set any of the parameters listed above (unit, range, values etc) by specifying the dimension name and the new value for the parameter." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Dimension formatters\n", + "\n", + "* Show example of formatters.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Organizing your elements\n", + "\n", + "* Intro paragraph to group/label system.\n", + "\n", + "### Element group and label\n", + "\n", + "* Declare two elements with group and label.\n", + "* Use them with + and point forward to next guide (composition).\n", + "* Use them with * and customize with opts and point forward to composition/customizing plots.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "* Mention mapping from dims to labels, explain that it is used in methods such as select as keywords (can link to getting started introduction).\n" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/user_guide/02-Composing_Elements.ipynb b/examples/user_guide/02-Composing_Elements.ipynb new file mode 100644 index 0000000000..aaa550e0d3 --- /dev/null +++ b/examples/user_guide/02-Composing_Elements.ipynb @@ -0,0 +1,77 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Composing Elements\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Composing collections of data\n", + "\n", + "* Composition is about putting heterogeneous data into a collection with both a visual representation and a way to group and access your data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Composition operators (+ and *)\n", + "\n", + "* Introduce * and + (maybe <<) as ways of quickly composing Elements into Overlays and Layouts" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Accessing and naming your data\n", + "\n", + "* Accessing data on collections and giving them names by defining group, label\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Working with Overlays\n", + "\n", + "* Overview of Overlay API: creating from list, using .get to get layer\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Working with Layouts\n", + "\n", + "* Overview of Layout API: construct from list, introduce .cols\n" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/user_guide/03-Customizing_Plots.ipynb b/examples/user_guide/03-Customizing_Plots.ipynb new file mode 100644 index 0000000000..c5726aba44 --- /dev/null +++ b/examples/user_guide/03-Customizing_Plots.ipynb @@ -0,0 +1,195 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Customizing Plots\n", + "\n", + "* Intro paragraph. Separation of presentation/content. Option trees.\n", + "* Renderers control output. 
utilities to control settings described shortly.\n", + "\n", + "HoloViews is designed to be both highly customizable, allowing you to control how your visualizations appear, but also to enforce a strong separation between your data (with any semantically associated metadata, like type, dimension names, and description) and all options related purely to visualization. This separation allows HoloViews objects to be generated easily by external programs, without giving them a dependency on any plotting or windowing libraries. It also helps make it completely clear which parts of your code deal with the actual data, and which are just about displaying it nicely, which becomes very important for complex visualizations that become more complicated than your data itself.\n", + "\n", + "To achieve this separation, HoloViews stores visualization options independently from your data, and applies the options only when rendering the data to a file on disk, a GUI window, or an IPython notebook cell.\n", + "\n", + "This tutorial gives an overview of the different types of options available, how to find out more about them, and how to set them in both regular Python and using the IPython magic interface that is shown elsewhere in the tutorials.\n", + "\n", + "Notes:\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Visual options and styling\n", + "\n", + "* Intro paragraph explaining 3 types of option.\n", + " + plot options\n", + " + style options\n", + " + normalization options\n", + "\n", + "HoloViews provides three categories of visualization options that can be set by the user. In this section we will first describe the different kinds of options, then later sections show you how to list the supported options of each type for a given HoloViews object or class, and how to change them in Python or IPython.\n", + "\n", + "#### ``style`` options:\n", + "\n", + "``style`` options are passed directly to the underlying rendering backend that actually draws the plots, allowing you to control the details of how it behaves. The default backend is matplotlib, but there are other backends either using matplotlib's options (e.g. ``mpld3``), or their own sets of options (e.g. [``bokeh``](Bokeh_Backend) ).\n", + "\n", + "For whichever backend has been selected, HoloViews can tell you which options are supported, but you will need to see the plotting library's own documentation (e.g. [matplotlib](http://matplotlib.org/contents.html), [bokeh](http://bokeh.pydata.org)) for the details of their use.\n", + "\n", + "HoloViews has been designed to be easily extensible to additional backends in the future, such as [Plotly](https://github.com/ioam/holoviews/pull/398), Cairo, VTK, or D3.js, and if one of those backends were selected then the supported style options would differ.\n", + "\n", + "#### ``plot`` options:\n", + "\n", + "Each of the various HoloViews plotting classes declares various [Parameters](http://ioam.github.io/param) that control how HoloViews builds the visualization for that type of object, such as plot sizes and labels. HoloViews uses these options internally; they are not simply passed to the underlying backend. HoloViews documents these options fully in its online help and in the [Reference Manual](http://holoviews.org/Reference_Manual). 
These options may vary for different backends in some cases, depending on the support available both in that library and in the HoloViews interface to it, but we try to keep any options that are meaningful for a variety of backends the same for all of them.\n", + "\n", + "#### ``norm`` options:\n", + "\n", + "``norm`` options are a special type of plot option that are applied orthogonally to the above two types, to control normalization. Normalization refers to adjusting the properties of one plot relative to those of another. For instance, two images normalized together would appear with relative brightness levels, with the brightest image using the full range black to white, while the other image is scaled proportionally. Two images normalized independently would both cover the full range from black to white. Similarly, two axis ranges normalized together will expand to fit the largest range of either axis, while those normalized separately would cover different ranges.\n", + "\n", + "There are currently only two ``norm`` options supported, ``axiswise`` and ``framewise``, but they can be applied to any of the various object types in HoloViews to specify a huge range of different normalization options.\n", + "\n", + "For a given category or group of HoloViews objects, if ``axiswise`` is True, normalization will be computed independently for all items in that category that have their own axes, such as different ``Image`` plots or ``Curve`` plots. If ``axiswise`` is False, all such objects are normalized together.\n", + "\n", + "For a given category or group of HoloViews objects, if ``framewise`` is True, normalization of any ``HoloMap`` objects included is done independently per frame rendered -- each frame will appear as it would if it were extracted from the ``HoloMap`` and plotted separately. If ``framewise`` is False (the default), all frames in a given ``HoloMap`` are normalized together, so that you can see strength differences over the course of the animation.\n", + "\n", + "As described below, these options can be controlled precisely and in any combination to make sure that HoloViews displays the data of most interest, ignoring irrelevant differences and highlighting important ones." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Option specification syntax\n", + "\n", + "* Intro to idea of two formats and group/label specificity.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Dictionary format" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### String format" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### %opts\n", + "\n", + "* %opts as a way of specifying things globally.\n", + "* Simple example\n", + "\n", + "The ``%opts`` \"line\" magic (with one ``%``) works just the same as the ``%%opts`` \"cell\" magic, but it changes the global default options for all future cells, allowing you to choose a new default colormap, line width, etc.\n", + "\n", + "Apart from its brevity, a big benefit of using the IPython magic syntax ``%%opts`` or ``%opts`` is that it is fully tab-completable. Each of the options that is currently available will be listed if you press ```` when you are ready to write it, which makes it much easier to find the right parameter. 
Of course, you will still need to consult the full ``holoviews.help`` documentation (described above) to see the type, allowable values, and documentation for each option, but the tab completion should at least get you started and is great for helping you remember the list of options and see which options are available.\n", + "\n", + "You can even use the succinct IPython-style specification directly in your Python code if you wish, but it requires the external [pyparsing](https://pypi.python.org/pypi/pyparsing) library (which is already available if you are using matplotlib):" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### %%opts\n", + "\n", + "* Simple example with group/label. Show that 1. styling local to that cell display 2. changes persist.\n", + "\n", + "The ``%%opts`` magic works like the pure-Python option for associating options with an object, except that it works on the item in the IPython cell, and it affects the item directly rather than making a copy or applying only in scope. Specifically, it assigns a new ID number to the object returned from this cell, and makes a new ``OptionTree`` containing the options for that ID number.\n", + "\n", + "If the same ``layout`` object is used later in the notebook, even within a complicated container object, it will retain the options set on it.\n", + "\n", + "The options accepted are just the same as for the Python version, but specified more succinctly:\n", + "\n", + "``%%opts`` *target-specification* ``style(``*styleoption*``=``*val* ...``) plot[``*plotoption*``=``*val* ...``] norm{+``*normoption* ``-``*normoption*...``}``\n", + "\n", + "Here *key* lets you specify the object type (e.g. ``Image``), and optionally its ``group`` (e.g. ``Image.Function``) or even both ``group`` and ``label`` (e.g. ``Image.Function.Sine``), if you want to control options very precisely. There is also an even further abbreviated syntax, because the special bracket types alone are enough to indicate which category of option is specified:\n", + "\n", + "``%%opts`` *target-specification* ``(``*styleoption*``=``*val* ...``) [``*plotoption*``=``*val* ...``] {+``*normoption* ``-``*normoption* ...``}``\n", + "\n", + "Here parentheses indicate style options, square brackets indicate plot options, and curly brackets indicate norm options (with ``+axiswise`` and ``+framewise`` indicating True for those values, and ``-axiswise`` and ``-framewise`` indicating False). Additional *target-specification*s and associated options of each type for that *target-specification* can be supplied at the end of this line. This ultra-concise syntax is used throughout the other tutorials, because it helps minimize the code needed to specify the plotting options, and helps make it very clear that these options are handled separately from the actual data.\n", + "\n", + "Here we demonstrate the concise syntax by customizing the style and plot options of the ``Curve`` in the layout:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### hv.opts\n", + "\n", + "* string format or dict format.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Option trees\n", + "\n", + "* All these mechanisms affect option trees which can be used directly typically at the library level.\n", + "* Explain inheritance" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Customizing output\n", + "\n", + "Plot sizes, backends, output formats and saving to file. 
There are multiple ways to control these things.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### hv.extension" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### %output\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### %%output\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### hv.output\n" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/04-Dimensioned_Containers.ipynb b/examples/user_guide/04-Dimensioned_Containers.ipynb new file mode 100644 index 0000000000..1cea05c919 --- /dev/null +++ b/examples/user_guide/04-Dimensioned_Containers.ipynb @@ -0,0 +1,70 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Dimensioned Containers" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Exploring parameter spaces\n", + "\n", + "* Discuss exploring multi-dimensional parameter spaces and faceting of data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Declaring n-dimensional collections\n", + "\n", + "* HoloMaps: Introduce HoloMap as general nd-dimensional space (also mention DynamicMap)\n", + "* Explain keys and key dimensions and using select and indexing\n", + "* Explain semantics of the widgets and point forward to streams+custom widgets" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Visual grouping\n", + "\n", + "* Introduce NdOverlay, NdLayout, GridSpace by casting between the containers\n", + "* Example faceting using 2D HoloMap\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Grouping and faceting\n", + "\n", + "* Explain ``.overlay``, ``.layout``, ``.grid`` on HoloMap to facet multi-dimensional space\n", + "* Example using 4D HoloMap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Grouping and faceting\n", + "\n", + "* Explain ``.overlay``, ``.layout``, ``.grid`` on HoloMap to facet multi-dimensional space\n", + "* Example using 4D HoloMap" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/user_guide/05-Building_Composite_Objects.ipynb b/examples/user_guide/05-Building_Composite_Objects.ipynb new file mode 100644 index 0000000000..eef4c179f9 --- /dev/null +++ b/examples/user_guide/05-Building_Composite_Objects.ipynb @@ -0,0 +1,441 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Building Composite Objects" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The [Containers](Containers.ipynb) tutorial shows examples of each of the container types in HoloViews, and it is useful to look at the description of each type there, as you work through this tutorial. \n", + "\n", + "This tutorial shows you how to combine the various container types, in order to build data structures that can contain all of the data that you want to visualize or analyze, in an extremely flexible way. For instance, you may have a large set of measurements of different types of data (numerical, image, textual notations, etc.) from different experiments done on different days, with various different parameter values associated with each one. 
HoloViews can store all of this data together, which will allow you to select just the right bit of data \"on the fly\" for any particular analysis or visualization, by indexing, slicing, selecting, and sampling in this data structure." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Nesting hierarchy " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To illustrate the full functionality provided, we will create an example of the maximally nested object structure currently possible with HoloViews:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')\n", + "np.random.seed(10)\n", + "\n", + "def sine_curve(phase, freq, amp, power, samples=102):\n", + " xvals = [0.1* i for i in range(samples)]\n", + " return [(x, amp*np.sin(phase+freq*x)**power) for x in xvals]\n", + "\n", + "phases = [0, np.pi/2, np.pi, 3*np.pi/2]\n", + "powers = [1,2,3]\n", + "amplitudes = [0.5,0.75, 1.0]\n", + "frequencies = [0.5, 0.75, 1.0, 1.25, 1.5, 1.75]\n", + "\n", + "\n", + "gridspace = hv.GridSpace(kdims=['Amplitude', 'Power'], group='Parameters', label='Sines')\n", + "\n", + "for power in powers:\n", + " for amplitude in amplitudes:\n", + " holomap = hv.HoloMap(kdims=['Frequency'])\n", + " for frequency in frequencies:\n", + " sines = {phase : hv.Curve(sine_curve(phase, frequency, amplitude, power))\n", + " for phase in phases}\n", + " ndoverlay = hv.NdOverlay(sines , kdims=['Phase']).relabel(group='Phases',\n", + " label='Sines', depth=1)\n", + " overlay = ndoverlay * hv.Points([(i,0) for i in range(0,10)], group='Markers', label='Dots')\n", + " holomap[frequency] = overlay\n", + " gridspace[amplitude, power] = holomap\n", + "\n", + "penguins = hv.RGB.load_image('../datasets/penguins.png').relabel(group=\"Family\", label=\"Penguin\")\n", + "\n", + "layout = gridspace + penguins" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This code produces what looks like a relatively simple animation of two side-by-side figures, but is actually a deeply nested data structure:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The structure of this object can be seen using ``print()``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(layout)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To help us understand this structure, here is a schematic for us to refer to as we unpack this object, level by level:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Everything that is *displayable* in HoloViews has this same basic structure, although any of the levels can be omitted in simpler cases, and many different Element types (not containers) can be substituted for any other. \n", + "\n", + "Since HoloViews 1.3.0, you are allowed to build data-structures that violate this hierarchy (e.g., you can put ``Layout`` objects into ``HoloMaps``) but the resulting object cannot be displayed. Instead, you will be prompted with a message to call the ``collate`` method. Using the ``collate`` method will allow you to generate the appropriate object that correctly obeys the hierarchy shown above, so that it can be displayed.\n", + "\n", + "As shown in the diagram, there are three different types of container involved:\n", + "\n", + "- Basic Element: elementary HoloViews object containing raw data in an external format like Numpy or pandas.\n", + "- Homogenous container (UniformNdMapping): collections of Elements or other HoloViews components that are all the same type. These are indexed using array-style key access with values sorted along some dimension(s), e.g. ``[0.50]`` or ``[\"a\",7.6]``.\n", + "- Heterogenous container (AttrTree): collections of data of different types, e.g. different types of Element. These are accessed by categories using attributes, e.g. ``.Parameters.Sines``, which does not assume any ordering of a dimension.\n", + "\n", + "We will now go through each of the containers of these different types, at each level." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Layout`` Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Above, we have already viewed the highest level of our data structure as a Layout. Here is the representation of entire Layout object, which reflects all the levels shown in the diagram:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(layout)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the examples below, we will unpack this data structure using attribute access (explained in the [Introductory tutorial](Introduction.ipynb)) as well as indexing and slicing (explained in the [Sampling Data tutorial](Sampling_Data.ipynb)).\n", + "\n", + "### ``GridSpace`` Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Elements within a ``Layout``, such as the ``GridSpace`` in this example, are reached via attribute access:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout.Parameters.Sines" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``HoloMap`` Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This ``GridSpace`` consists of nine ``HoloMap``s arranged in a two-dimensional space. Let's now select one of these ``HoloMap`` objects, by indexing to retrieve the one at [Amplitude,Power] ``[0.5,1.0]``, i.e. the lowest amplitude and power:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout.Parameters.Sines[0.5, 1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As shown in the schematic above, a ``HoloMap`` contains many elements with associated keys. 
In this example, these keys are indexed with a dimension ``Frequency``, which is why the ``Frequency`` varies when you play the animation here." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Overlay`` Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The printed representation showed us that the ``HoloMap`` is composed of ``Overlay`` objects, six in this case (giving six frames to the animation above). Let us access one of these elements, i.e. one frame of the animation above, by indexing to retrieve an ``Overlay`` associated with the key with a ``Frequency`` of *1.0*:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout.Parameters.Sines[0.5, 1][1.0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### NdOverlay Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the representation shows, the ``Overlay`` contains a ``Points`` object and an ``NdOverlay`` object. We can access either one of these using the attribute access supported by ``Overlay``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "(layout.Parameters.Sines[0.5, 1][1].Phases.Sines +\n", + " layout.Parameters.Sines[0.5, 1][1].Markers.Dots)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### ``Curve`` Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``NdOverlay`` is so named because it is an overlay of items indexed by dimensions, unlike the regular attribute-access overlay types. In this case it is indexed by ``Phase``, with four values. If we index to select one of these values, we will get an individual ``Curve``, e.g. the one with zero phase:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "l=layout.Parameters.Sines[0.5, 1][1].Phases.Sines[0.0]\n", + "l" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(l)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Data Level" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "At this point, we have reached the end of the HoloViews objects; below this object is only the raw data as a Numpy array:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "type(layout.Parameters.Sines[0.5, 1][1].Phases.Sines[0.0].data)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Actually, HoloViews will let you go even further down, accessing data inside the Numpy array using the continuous (floating-point) coordinate systems declared in HoloViews. E.g. 
here we can ask for a single datapoint, such as the value at x=5.2:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout.Parameters.Sines[0.5, 1][1].Phases.Sines[0.0][5.2]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indexing into 1D Elements like Curve and higher-dimensional but regularly gridded Elements like Image, Surface, and HeatMap will return the nearest defined value (i.e., the results \"snap\" to the nearest data item):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout.Parameters.Sines[0.5, 1][1].Phases.Sines[0.0][5.23], layout.Parameters.Sines[0.5, 1][1].Phases.Sines[0.0][5.27]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For other Element types, such as Points, snapping is not supported and thus indexing down into the .data array will be less useful, because it will only succeed for a perfect floating-point match on the key dimensions. In those cases, you can still use all of the access methods provided by the numpy array itself, via ``.data``, e.g. ``.data[52]``, but note that such native operations force you to use the native indexing scheme of the array, i.e. integer access starting at zero, not the more convenient and semantically meaningful [continuous coordinate systems](Continuous_Coordinates.ipynb) we provide through HoloViews." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Indexing using ``.select``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The curve displayed immediately above shows the final, deepest Element access possible in HoloViews for this object:\n", + "\n", + "```python\n", + "layout.Parameters.Sines[0.5, 1][1].Phases.Sines[0.0]\n", + "```\n", + "This is the curve with an amplitude of *0.5*, raised to a power of *1.0* with frequency of *1.0* and *0* phase. These are all the numbers, in order, used in the access shown above.\n", + "\n", + "The ``.select`` method is a more explicit way to use key access, with both of these equivalent to each other:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "o1 = layout.Parameters.Sines.select(Amplitude=0.5, Power=1.0).select(Frequency=1.0)\n", + "o2 = layout.Parameters.Sines.select(Amplitude=0.5, Power=1.0, Frequency=1.0)\n", + "o1 + o2" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The second form demonstrates HoloViews' **deep indexing** feature, which allows indexes to cross nested container boundaries. The above is as far as we can index before reaching a heterogeneous type (the ``Overlay``), where we need to use attribute access. Here is the more explicit method of indexing down to a curve, using ``.select`` to specify dimensions by name instead of bracket-based indexing by position:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "layout.Parameters.Sines.select(Amplitude=0.5,Power=1.0, \n", + " Frequency=1.0).Phases.Sines.select(Phase=0.0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Traversing the hierarchy\n", + "\n", + "* Show map/traverse etc to go through the hierarchy." 
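As a sketch of what this section will cover, the ``traverse`` and ``map`` methods walk the whole nested structure, matching objects against type (or group/label) specs; the exact results depend on the ``layout`` object built above:

```python
# Collect every Curve element anywhere in the nested structure
curves = layout.traverse(lambda x: x, specs=[hv.Curve])
print(len(curves))

# Apply a function to each matching object, returning a transformed clone
relabelled = layout.map(lambda c: c.relabel(group='Waves'), specs=[hv.Curve])
```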
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, HoloViews lets you compose objects of heterogenous types, and objects covering many different numerical or other dimensions, laying them out spatially, temporally, or overlaid. The resulting data structures are complex, but they are composed of simple elements with well-defined interactions, making it feasible to express nearly any relationship that will characterize your data. In practice, you will probably not need this many levels, but given this complete example, you should be able to construct an appropriate organization for whatever type of data that you do want to organize or visualize. \n", + "\n", + "As emphasized above, it is not possible to combine these objects in other orderings. Of course, any ``Element`` can be substituted for any other, which doesn't change the structure. But you cannot e.g. display an ``Overlay`` or ``HoloMap`` of ``Layout`` objects. Confusingly, the objects may *act* as if you have these arrangements. For instance, a ``Layout`` of ``HoloMap`` objects will be animated, like ``HoloMap`` objects, but only because of the extra dimension(s) provided by the enclosed ``HoloMap`` objects, not because the ``Layout`` itself has data along those dimensions. Similarly, you cannot have a ``Layout`` of ``Layout`` objects, even though it looks like you can. E.g. the ``+`` operator on two ``Layout`` objects will not create a ``Layout`` of ``Layout`` objects; it just creates a new ``Layout`` object containing the data from both of the other objects. Similarly for the ``Overlay`` of ``Overlay`` objects using ``*``; only a single combined ``Overlay`` is returned.\n", + "\n", + "If you are confused about how all of this works in practice, you can use the examples in the tutorials to guide you, especially the [Exploring Data](Exploring_Data.ipynb) tutorial. " + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/06-Live_Data.ipynb b/examples/user_guide/06-Live_Data.ipynb new file mode 100644 index 0000000000..29e59ffadc --- /dev/null +++ b/examples/user_guide/06-Live_Data.ipynb @@ -0,0 +1,672 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Live Data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The [Containers](Containers.ipynb) Tutorial introduced the [HoloMap](Containers.ipynb#HoloMap), a core HoloViews data structure that allows easy exploration of parameter spaces. The essence of a HoloMap is that it contains a collection of [Elements](Elements.ipynb) (e.g. Images and Curves) that you can easily select and visualize.\n", + "\n", + "HoloMaps hold fully constructed Elements at specifically sampled points in a multidimensional space. Although HoloMaps are useful for exploring high-dimensional parameter spaces, they can very quickly consume huge amounts of memory to store all these Elements. For instance, a hundred samples along four orthogonal dimensions would need a HoloMap containing a hundred *million* Elements, each of which could be a substantial object that takes time to create and costs memory to store. 
Thus ``HoloMaps`` have some clear limitations:\n", + "\n", + "* HoloMaps may require the generation of millions of Elements before the first element can be viewed.\n", + "* HoloMaps can easily exhaust all the memory available to Python.\n", + "* HoloMaps can even more easily exhaust all the memory in the browser when displayed.\n", + "* Static export of a notebook containing HoloMaps can result in impractically large HTML files.\n", + "\n", + "The ``DynamicMap`` addresses these issues by computing and displaying elements dynamically, allowing exploration of much larger datasets:\n", + "\n", + "* DynamicMaps generate elements on the fly, allowing the process of exploration to begin immediately.\n", + "* DynamicMaps do not require fixed sampling, allowing exploration of parameters with arbitrary resolution.\n", + "* DynamicMaps are lazy in the sense they only compute only as much data as the user wishes to explore.\n", + "\n", + "Of course, these advantages come with some limitations:\n", + "\n", + "* DynamicMaps require a live notebook server and cannot be fully exported to static HTML.\n", + "* DynamicMaps store only a portion of the underlying data, in the form of an Element cache and their output is dependent on the particular version of the executed code. \n", + "* DynamicMaps (and particularly their element caches) are typically stateful (with values that depend on patterns of user interaction), which can make them more difficult to reason about.\n", + "\n", + "In addition to the different computational requirements of ``DynamicMaps``, they can be used to build sophisticated, interactive vizualisations that cannot be achieved using only ``HoloMaps``. This notebook demonstrates some basic examples and the [Streams](Streams.ipynb) notebook follows on by introducing the streams system. The [Linked Streams](Linked_Streams.ipynb) tutorial shows how you can directly interact with your plots when using the Bokeh backend.\n", + "\n", + "When DynamicMap was introduced in version 1.6, it support multiple different 'modes' which have now been deprecated. This notebook demonstrates the simpler, more flexible and more powerful DynamicMap introduced in version 1.7. Users who have been using the previous version of DynamicMap should be unaffected as backwards compatibility has been preserved for the most common cases.\n", + "\n", + "All this will make much more sense once we've tried out some ``DynamicMaps`` and showed how they work, so let's create one!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
To visualize and use a DynamicMap you need to be running a live Jupyter server. This tutorial assumes that it will be run in a live notebook environment.\n",
+ "\n",
+ "When viewed statically, DynamicMaps will only show the first available Element, and will thus not have any slider widgets, making it difficult to follow the descriptions below.\n",
+ "\n",
+ "It's also best to run this notebook one cell at a time, not via \"Run All\", so that subsequent cells can reflect your dynamic interaction with widgets in previous cells.
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## ``DynamicMap`` " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's start by importing HoloViews and loading the extension:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "import numpy as np\n", + "hv.extension()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will now create ``DynamicMap`` similar to the ``HoloMap`` introduced in the [Containers Tutorial](Containers.ipynb#HoloMap). The ``HoloMap`` in that tutorial consisted of ``Image`` elements defined by a function returning NumPy arrays called ``sine_array``. Here we will define a ``waves_image`` function that returns an array pattern parameterized by arbitrary ``alpha`` and ``beta`` parameters inside a HoloViews [Image](Elements.ipynb#Image) element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = np.linspace(-4,0,202)\n", + "yvals = np.linspace(4,0,202)\n", + "xs,ys = np.meshgrid(xvals, yvals)\n", + "\n", + "def waves_image(alpha, beta):\n", + " return hv.Image(np.sin(((ys/alpha)**alpha+beta)*xs))\n", + "\n", + "waves_image(0,0) + waves_image(0,4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can demonstrate the possibilities for exploration enabled by the simplest declaration of a ``DynamicMap``." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Basic ``DynamicMap`` declaration" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A simple ``DynamicMap`` declaration looks identical to that needed to declare a ``HoloMap``. Instead supplying some initial data, we will supply the ``waves_image`` function instead with key dimensions simply declaring the arguments of that function:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap = hv.DynamicMap(waves_image, kdims=['alpha', 'beta'])\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This object is created instantly, but because it doesn't generate any `hv.Image` objects initially it only shows the printed representation of this object along with some information about how to display it. We will refer to a ``DynamicMap`` that doesn't have enough information to display itself as 'unbounded'.\n", + "\n", + "The textual representation of all ``DynamicMaps`` look similar, differing only in the listed dimensions until they have been evaluated at least once." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Explicit indexing\n", + "\n", + "Unlike a corresponding ``HoloMap`` declaration, this simple unbounded ``DynamicMap`` cannot yet visualize itself. To view it, we can follow the advice in the warning message. First we will explicitly index into our ``DynamicMap`` in the same way you would access a key on a ``HoloMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap[0,1] + dmap.select(alpha=1, beta=2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the declared kdims are specifying the arguments *by position* as they do not match the argument names of the ``waves_image`` function. 
If you *do* match the argument names *exactly*, you can map a kdim position to any argument position of the callable. For instance, the declaration ``kdims=['freq', 'phase']`` would index first by frequency, then phase without mixing up the arguments to ``waves_image`` when indexing." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Setting dimension ranges\n", + "\n", + "The second suggestion proposed by the warning was to supply dimension ranges using the ``redim.range`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.redim.range(alpha=(0,5.0), beta=(1,5.0))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here each `hv.Image` object visualizing a particular sine ring pattern with the given parameters is created dynamically, whenever the slider is set to that value. Any value in the allowable range can be requested by dragging the sliders or by tweaking the values using the left and right arrow keys.\n", + "\n", + "Of course, we didn't have to use the ``redim.range`` method and we could have simply declared the ranges right away using explicit ``hv.Dimension`` objects. This would allow us to declare other dimension properties such as the step size used by the sliders: by default each slider can select around a thousand distinct values along its range but you can specify your own step value via the dimension ``step`` parameter. If you use integers in your range declarations, integer stepping will be assumed with a step size of one.\n", + "\n", + "It is important to note that whenever the ``redim`` method is used, a new ``DynamicMap`` is returned with the updated dimensions. In other words, the original ``dmap`` remains unbounded with default dimension objects.\n", + "\n", + "\n", + "#### Setting dimension values\n", + "\n", + "This ``DynamicMap`` above allows exploration of *any* phase and frequency within the declared range unlike an equivalent ``HoloMap`` which would have to be composed of a finite set of samples. We can achieve a similar discrete sampling using ``DynamicMap`` by setting the ``values`` parameter on the dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.redim.values(alpha=[0,1,2], beta=[0.1, 1.0, 2.5])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The sliders now snap to the specified dimension values and if you are running this tutorial in a live notebook, the above cell should look like the ``HoloMap`` in the [Containers Tutorial](Containers.ipynb#HoloMap). ``DynamicMap`` is in fact a subclass of ``HoloMap`` with some crucial differences:\n", + "\n", + "* You can now pick as many values of **alpha** or **beta** as allowed by the slider.\n", + "* What you see in the cell above will not be exported in any HTML snapshot of the notebook\n", + "\n", + "\n", + "We will now explore how ``DynamicMaps`` relate to ``HoloMaps`` including conversion operations between the two types. As we will see, there are other ways to display a ``DynamicMap`` without using explict indexing or redim." 
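As mentioned above, instead of calling ``redim`` afterwards we could have declared these properties up front using explicit ``Dimension`` objects. Here is a minimal sketch (the step values are arbitrary choices for illustration):

```python
# Declare ranges and slider step sizes directly on the key dimensions
alpha_dim = hv.Dimension('alpha', range=(0, 5.0), step=0.5)
beta_dim = hv.Dimension('beta', range=(1, 5.0), step=0.5)
hv.DynamicMap(waves_image, kdims=[alpha_dim, beta_dim])
```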
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Interaction with ``HoloMap``s" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To explore the relationship between ``DynamicMap`` and ``HoloMap``, let's declare another callable to draw some shapes we will use in a new ``DynamicMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def shapes(N, radius=0.5): # Positional keyword arguments are fine\n", + " paths = [hv.Path([[(radius*np.sin(a), radius*np.cos(a)) \n", + " for a in np.linspace(-np.pi, np.pi, n+2)]], \n", + " extents=(-1,-1,1,1)) \n", + " for n in range(N,N+3)]\n", + " return hv.Overlay(paths)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Sampling ``DynamicMap`` from a ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When combining a ``HoloMap`` with a ``DynamicMap``, it would be very awkward to have to match the declared dimension ``values`` of the DynamicMap with the keys of the ``HoloMap``. Fortunately you don't have to:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Path (linewidth=1.5)\n", + "holomap = hv.HoloMap({(N,r):shapes(N, r) for N in [3,4,5] for r in [0.5,0.75]}, kdims=['N', 'radius'])\n", + "dmap = hv.DynamicMap(shapes, kdims=['N','radius'])\n", + "holomap + dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we declared a ``DynamicMap`` without using ``redim``, but we can view its output because it is presented alongside a ``HoloMap`` which defines the available keys. This convenience is subject to three particular restrictions:\n", + "\n", + "\n", + "* You cannot display a layout consisting of unbounded ``DynamicMaps`` only, because at least one HoloMap is needed to define the samples.\n", + "* The HoloMaps provide the necessary information required to sample the DynamicMap. \n", + "\n", + "Note that there is one way ``DynamicMap`` is less restricted than ``HoloMap``: you can freely combine bounded ``DynamicMaps`` together in a ``Layout``, even if they don't share key dimensions." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Also notice that the ``%%opts`` cell magic allows you to style DynamicMaps can be styled in exactly the same way as HoloMaps. We will now use the ``%opts`` line magic to set the linewidths of all ``Path`` elements in the rest of the notebook:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Path (linewidth=1.5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Converting from ``DynamicMap`` to ``HoloMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Above we mentioned that ``DynamicMap`` is an instance of ``HoloMap``. Does this mean it has a ``.data`` attribute?" 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dtype = type(dmap.data).__name__\n", + "length = len(dmap.data)\n", + "print(\"DynamicMap 'dmap' has an {dtype} .data attribute of length {length}\".format(dtype=dtype, length=length))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is exactly the same sort of ``.data`` as the equivalent ``HoloMap``, except that its values will vary according to how much you explored the parameter space of ``dmap`` using the sliders above. In a ``HoloMap``, ``.data`` contains a defined sampling along the different dimensions, whereas in a ``DynamicMap``, the ``.data`` is simply the *cache*.\n", + "\n", + "The cache serves two purposes:\n", + "\n", + "* Avoids recomputation of an element should we revisit a particular point in the parameter space. This works well for categorical or integer dimensions, but doesn't help much when using continuous sliders for real-valued dimensions.\n", + "* Records the space that has been explored with the ``DynamicMap`` for any later conversion to a ``HoloMap`` up to the allowed cache size.\n", + "\n", + "We can always convert *any* ``DynamicMap`` directly to a ``HoloMap`` as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.HoloMap(dmap)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is in fact equivalent to declaring a HoloMap with the same parameters (dimensions, etc.) using ``dmap.data`` as input, but is more convenient. Note that the slider positions reflect those we sampled from the ``HoloMap`` in the previous section.\n", + "\n", + "Although creating a HoloMap this way is easy, the result is poorly controlled, as the keys in the DynamicMap cache are usually defined by how you moved the sliders around. If you instead want a particular set of samples, you can easily specify one by using the same key-selection semantics as for a ``HoloMap`` to define exactly which elements are to be sampled and put into the cache:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.HoloMap(dmap[{(2,0.5), (2,1.0), (3,0.5), (3,1.0)}])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we index the ``dmap`` with specified keys to return a *new* DynamicMap with those keys in its cache, which we then cast to a ``HoloMap``. This allows us to export specific contents of ``DynamicMap`` to static HTML which will display the data at the sampled slider positions.\n", + "\n", + "The key selection above happens to define a Cartesian product, which is one of the most common ways to sample across dimensions. Because the list of such dimension values can quickly get very large when enumerated as above, we provide a way to specify a Cartesian product directly, which also works with ``HoloMaps``.
Here is an equivalent way of defining the same set of four points in that two-dimensional space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "samples = hv.HoloMap(dmap[{2,3},{0.5,1.0}])\n", + "samples" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "samples.data.keys()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The default cache size of 500 Elements is relatively high so that interactive exploration will work smoothly, but you can reduce it using the ``cache_size`` parameter if you find you are running into issues with memory consumption. A bounded ``DynamicMap`` with ``cache_size=1`` requires the least memory, but will recompute a new Element every time the sliders are moved, making it less responsive." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Converting from ``HoloMap`` to ``DynamicMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We have now seen how to convert from a ``DynamicMap`` to a ``HoloMap`` for the purposes of static export, but why would you ever want to do the inverse?\n", + "\n", + "Although having a ``HoloMap`` to start with means it will not save you memory, converting to a ``DynamicMap`` does mean that the rendering process can be deferred until a new slider value requests an update. You can achieve this conversion using the ``Dynamic`` utility as demonstrated here by applying it to the previously defined ``HoloMap`` called ``samples``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.util import Dynamic\n", + "dynamic = Dynamic(samples)\n", + "print('After apply Dynamic, the type is a {dtype}'.format(dtype=type(dynamic).__name__))\n", + "dynamic" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this particular example, there is no real need to use ``Dynamic`` as each frame renders quickly enough. For visualizations that are slow to render, using ``Dynamic`` can result in more responsive visualizations. \n", + "\n", + "The ``Dynamic`` utility is very versatile and is discussed in more detail in the [Dynamic Operations](Dynamic_Operations.ipynb) tutorial." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Slicing ``DynamicMaps``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we have seen we can either declare dimension ranges directly in the kdims or use the ``redim.range`` convenience method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap = hv.DynamicMap(shapes, kdims=['N','radius']).redim.range(N=(2,20), radius=(0.5,1.0))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The declared dimension ranges define the absolute limits allowed for exploration in this continuous, bounded DynamicMap . That said, you can use the soft_range parameter to view subregions within that range. Setting the soft_range parameter on dimensions can be done conveniently using slicing:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sliced = dmap[4:8, :]\n", + "sliced" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "Notice that N is now restricted to the range 4:8. 
Open slices are used to release any ``soft_range`` values, which resets the limits back to those defined by the full range:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sliced[:, 0.8:1.0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``[:]`` slice leaves the soft_range values alone and can be used as a convenient way to clone a ``DynamicMap``. Note that mixing slices with any other object type is not supported. In other words, once you use a single slice, you can only use slices in that indexing operation." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using groupby to discretize a DynamicMap\n", + "\n", + "A DynamicMap also makes it easy to partially or completely discretize a function to evaluate in a complex plot. By grouping over specific dimensions that define a fixed sampling via the Dimension values parameter, the DynamicMap can be viewed as a ``GridSpace``, ``NdLayout``, or ``NdOverlay``. If a dimension specifies only a continuous range it can't be grouped over, but it may still be explored using the widgets. This means we can plot partial or completely discretized views of a parameter space easily.\n", + "\n", + "#### Partially discretize\n", + "\n", + "The implementation for all the groupby operations uses the ``.groupby`` method internally, but we also provide three higher-level convenience methods to group dimensions into an ``NdOverlay`` (``.overlay``), ``GridSpace`` (``.grid``), or ``NdLayout`` (``.layout``).\n", + "\n", + "Here we will evaluate a simple sine function with three dimensions, the phase, frequency, and amplitude. We assign the frequency and amplitude discrete samples, while defining a continuous range for the phase:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.linspace(0, 2*np.pi)\n", + "\n", + "def sin(ph, f, amp):\n", + " return hv.Curve((xs, np.sin(xs*f+ph)*amp))\n", + "\n", + "kdims=[hv.Dimension('phase', range=(0, np.pi)),\n", + " hv.Dimension('frequency', values=[0.1, 1, 2, 5, 10]),\n", + " hv.Dimension('amplitude', values=[0.5, 5, 10])]\n", + "\n", + "waves_dmap = hv.DynamicMap(sin, kdims=kdims)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next we define the amplitude dimension to be overlaid and the frequency dimension to be gridded:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts GridSpace [show_legend=True fig_size=200]\n", + "waves_dmap.overlay('amplitude').grid('frequency')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, instead of having three sliders (one per dimension), we've now laid out the frequency dimension as a discrete set of values in a grid, and the amplitude dimension as a discrete set of values in an overlay, leaving one slider for the remaining dimension (phase). This approach can help you visualize a large, multi-dimensional space efficiently, with full control over how each dimension is made visible.\n", + "\n", + "\n", + "#### Fully discretize\n", + "\n", + "Given a continuous function defined over a space, we could sample it manually, but here we'll look at an example of evaluating it using the groupby method. Let's look at a spiral function with a frequency and first- and second-order phase terms. 
Then we define the dimension values for all the parameters and declare the DynamicMap:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Path (linewidth=1 color=Palette('Blues'))\n", + "\n", + "def spiral_equation(f, ph, ph2):\n", + " r = np.arange(0, 1, 0.005)\n", + " xs, ys = (r * fn(f*np.pi*np.sin(r+ph)+ph2) for fn in (np.cos, np.sin))\n", + " return hv.Path((xs, ys))\n", + "\n", + "spiral_dmap = hv.DynamicMap(spiral_equation, kdims=['f','ph','ph2']).\\\n", + " redim.values(f=np.linspace(1, 10, 10),\n", + " ph=np.linspace(0, np.pi, 10),\n", + " ph2=np.linspace(0, np.pi, 4))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can make use of the ``.groupby`` method to group over the frequency and phase dimensions, which we will display as part of a GridSpace by setting the ``container_type``. This leaves the second phase variable, which we assign to an NdOverlay by setting the ``group_type``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts GridSpace [xaxis=None yaxis=None] Path [bgcolor='w' xaxis=None yaxis=None]\n", + "spiral_dmap.groupby(['f', 'ph'], group_type=hv.NdOverlay, container_type=hv.GridSpace)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This grid shows a range of frequencies `f` on the x axis, a range of the first phase variable `ph` on the `y` axis, and a range of different `ph2` phases as overlays within each location in the grid. As you can see, these techniques can help you visualize multidimensional parameter spaces compactly and conveniently.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## DynamicMaps and normalization" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By default, a ``HoloMap`` normalizes the display of elements according the minimum and maximum values found across the ``HoloMap``. This automatic behavior is not possible in a ``DynamicMap``, where arbitrary new elements are being generated on the fly. Consider the following examples where the arrays contained within the returned ``Image`` objects are scaled with time:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Image {+axiswise}\n", + "ls = np.linspace(0, 10, 200)\n", + "xx, yy = np.meshgrid(ls, ls)\n", + "\n", + "def cells(time):\n", + " return hv.Image(time*np.sin(xx+time)*np.cos(yy+time), vdims=['Intensity'])\n", + "\n", + "dmap = hv.DynamicMap(cells, kdims=['time']).redim.range(time=(1,20))\n", + "dmap + dmap.redim.range(Intensity=(0,10))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here we use ``+axiswise`` to see the behavior of the two cases independently. We see in **A** that when only the time dimension is given a range, no automatic normalization occurs (unlike a ``HoloMap``). In **B** we see that normalization is applied, but only when the value dimension ('Intensity') range has been specified. \n", + "\n", + "In other words, ``DynamicMaps`` cannot support automatic normalization across their elements, but do support the same explicit normalization behavior as ``HoloMaps``. Values that are generated outside this range are simply clipped according the usual semantics of explicit value dimension ranges. 
\n", + "\n", + "Note that we can always have the option of casting a ``DynamicMap`` to a ``HoloMap`` in order to automatically normalize across the cached values, without needing explicit value dimension ranges." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using DynamicMaps in your code\n", + "\n", + "As you can see, ``DynamicMaps`` let you use HoloViews with a very wide range of dynamic data formats and sources, making it simple to visualize ongoing processes or very large data spaces. \n", + "\n", + "Given unlimited computational resources, the functionality covered in this tutorial would match that offered by ``HoloMap`` but with fewer normalization options. ``DynamicMap`` actually enables a vast range of new possibilities for dynamic, interactive visualizations as covered in the [Streams](Streams.ipynb) tutorial. Following on from that, the [Linked Streams](Linked_Streams.ipynb) tutorial shows how you can directly interact with your plots when using the Bokeh backend." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/07-Tabular_Datasets.ipynb b/examples/user_guide/07-Tabular_Datasets.ipynb new file mode 100644 index 0000000000..e9b8250a1e --- /dev/null +++ b/examples/user_guide/07-Tabular_Datasets.ipynb @@ -0,0 +1,589 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Tabular Datasets" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this Tutorial we will explore how to work with columnar data in HoloViews. Columnar data has a fixed list of column headings, with values stored in an arbitrarily long list of rows. Spreadsheets, relational databases, CSV files, and many other typical data sources fit naturally into this format. HoloViews defines an extensible system of interfaces to load, manipulate, and visualize this kind of data, as well as allowing conversion of any of the non-columnar data types into columnar data for analysis or data interchange.\n", + "\n", + "By default HoloViews will use one of three storage formats for columnar data:\n", + "\n", + "* A pure Python dictionary containing each column.\n", + "* A purely NumPy-based format for numeric data.\n", + "* Pandas DataFrames\n", + "* Dask DataFrames" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv\n", + "from IPython.display import HTML\n", + "hv.extension()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Simple Dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Usually when working with data we have one or more independent variables, taking the form of categories, labels, discrete sample coordinates, or bins. These variables are what we refer to as key dimensions (or ``kdims`` for short) in HoloViews. The observer or dependent variables, on the other hand, are referred to as value dimensions (``vdims``), and are ordinarily measured or calculated given the independent variables. The simplest useful form of a Dataset object is therefore a column 'x' and a column 'y' corresponding to the key dimensions and value dimensions respectively. 
An obvious visual representation of this data is a Table:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = range(10)\n", + "ys = np.exp(xs)\n", + "\n", + "table = hv.Table((xs, ys), kdims=['x'], vdims=['y'])\n", + "table" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However, this data has many more meaningful visual representations, and therefore the first important concept is that Dataset objects are interchangeable as long as their dimensionality allows it, meaning that you can easily create the different objects from the same data (and cast between the objects once created):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Scatter(table) + hv.Curve(table) + hv.Bars(table)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Each of these three plots uses the same data, but represents a different assumption about the semantic meaning of that data -- the Scatter plot is appropriate if that data consists of independent samples, the Curve plot is appropriate for samples chosen from an underlying smooth function, and the Bars plot is appropriate for independent categories of data. Since all these plots have the same dimensionality, they can easily be converted to each other, but there is normally only one of these representations that is semantically appropriate for the underlying data. For this particular data, the semantically appropriate choice is Curve, since the *y* values are samples from the continuous function ``exp``.\n", + "\n", + "As a guide to which Elements can be converted to each other, those of the same dimensionality here should be interchangeable, because of the underlying similarity of their columnar representation:\n", + "\n", + "* 0D: BoxWhisker, Spikes, Distribution*, \n", + "* 1D: Scatter, Curve, ErrorBars, Spread, Bars, BoxWhisker, Regression*\n", + "* 2D: Points, HeatMap, Bars, BoxWhisker, Bivariate*\n", + "* 3D: Scatter3D, Trisurface, VectorField, BoxWhisker, Bars\n", + "\n", + "\\* - requires Seaborn\n", + "\n", + "This categorization is based only on the ``kdims``, which define the space in which the data has been sampled or defined. An Element can also have any number of value dimensions (``vdims``), which may be mapped onto various attributes of a plot such as the color, size, and orientation of the plotted items. For a reference of how to use these various Element types, see the [Elements Tutorial](Elements.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Data types and Constructors\n", + "\n", + "As discussed above, Dataset provide an extensible interface to store and operate on data in different formats. All interfaces support a number of standard constructors." 
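, + "\n", + "For instance (a minimal illustrative sketch using the ``xs`` and ``ys`` arrays from above, not a cell from the original notebook), the same pattern of declaring ``kdims`` and ``vdims`` applies to the generic ``hv.Dataset`` type, which can then be cast to whichever Element is semantically appropriate:\n", + "\n", + "```python\n", + "ds = hv.Dataset((xs, ys), kdims=['x'], vdims=['y'])\n", + "hv.Curve(ds)  # cast the shared data to a Curve (or Scatter, Bars, ...)\n", + "```"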
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Storage formats" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Dataset types can be constructed using one of three supported formats, (a) a dictionary of columns, (b) an NxD array with N rows and D columns, or (c) pandas dataframes:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(hv.Scatter({'x': xs, 'y': ys}) +\n", + " hv.Scatter(np.column_stack([xs, ys])) +\n", + " hv.Scatter(pd.DataFrame({'x': xs, 'y': ys})))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Literals\n", + "\n", + "In addition to the main storage formats, Dataset Elements support construction from three Python literal formats: (a) An iterator of y-values, (b) a tuple of columns, and (c) an iterator of row tuples." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(hv.Scatter(ys) + hv.Scatter((xs, ys)) + hv.Scatter(zip(xs, ys)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For these inputs, the data will need to be copied to a new data structure, having one of the three storage formats above. By default Dataset will try to construct a simple array, falling back to either pandas dataframes (if available) or the dictionary-based format if the data is not purely numeric. Additionally, the interfaces will try to maintain the provided data's type, so numpy arrays and pandas DataFrames will therefore always be parsed by the array and dataframe interfaces first respectively." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "df = pd.DataFrame({'x': xs, 'y': ys, 'z': ys*2})\n", + "print(type(hv.Scatter(df).data))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Dataset will attempt to parse the supplied data, falling back to each consecutive interface if the previous could not interpret the data. The default list of fallbacks and simultaneously the list of allowed datatypes is:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Dataset.datatype" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note these include grid based datatypes, which are not covered in this tutorial. To select a particular storage format explicitly, supply one or more allowed datatypes:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(type(hv.Scatter((xs, ys), datatype=['array']).data))\n", + "print(type(hv.Scatter((xs, ys), datatype=['dictionary']).data))\n", + "print(type(hv.Scatter((xs, ys), datatype=['dataframe']).data))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Sharing Data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since the formats with labelled columns do not require any specific order, each Element can effectively become a view into a single set of data. By specifying different key and value dimensions, many Elements can show different values, while sharing the same underlying data source." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "overlay = hv.Scatter(df, kdims='x', vdims='y') * hv.Scatter(df, kdims='x', vdims='z')\n", + "overlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can quickly confirm that the data is actually shared:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "overlay.Scatter.I.data is overlay.Scatter.II.data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For columnar data, this approach is much more efficient than creating copies of the data for each Element, and allows for some advanced features like linked brushing in the [Bokeh backend](Bokeh_Backend.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Converting to raw data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Column types make it easy to export the data to the three basic formats: arrays, dataframes, and a dictionary of columns.\n", + "\n", + "###### Array" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "table.array()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "###### Pandas DataFrame" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "HTML(table.dframe().head().to_html())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "###### Dataset dictionary" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "table.columns()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Creating tabular data from Elements using the .table and .dframe methods\n", + "\n", + "If you have data in some other HoloViews element and would like to use the columnar data features, you can easily tabularize any of the core Element types into a ``Table`` Element, using the ``.table()`` method. Similarly, the ``.dframe()`` method will convert an Element into a pandas DataFrame. These methods are very useful if you want to then transform the data into a different Element type, or to perform different types of analysis.\n", + "\n", + "## Tabularizing simple Elements\n", + "\n", + "For a simple example, we can create a ``Curve`` of an exponential function and convert it to a ``Table`` with the ``.table`` method, with the same result as creating the Table directly from the data as done earlier on this Tutorial:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.arange(10)\n", + "curve = hv.Curve(zip(xs, np.exp(xs)))\n", + "curve * hv.Scatter(zip(xs, curve)) + curve.table()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similarly, we can get a pandas dataframe of the Curve using ``curve.dframe()``. 
Here we wrap that call as raw HTML to allow automated testing of this notebook, but just calling ``curve.dframe()`` would give the same result visually:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "HTML(curve.dframe().to_html())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Although 2D image-like objects are *not* inherently well suited to a flat columnar representation, serializing them by converting to tabular data is a good way to reveal the differences between Image and Raster elements. Rasters are a very simple type of element, using array-like integer indexing of rows and columns from their top-left corner as in computer graphics applications. Conversely, Image elements are a higher-level abstraction that provides a general-purpose continuous Cartesian coordinate system, with x and y increasing to the right and upwards as in mathematical applications, and each point interpreted as a sample representing the pixel in which it is located (and thus centered within that pixel). Given the same data, the ``.table()`` representation will show how the data is being interpreted (and accessed) differently in the two cases (as explained in detail in the [Continuous Coordinates Tutorial](Continuous_Coordinates.ipynb)):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points (s=200) [size_index=None]\n", + "extents = (-1.6,-2.7,2.0,3)\n", + "np.random.seed(42)\n", + "mat = np.random.rand(3, 3)\n", + "\n", + "img = hv.Image(mat, bounds=extents)\n", + "raster = hv.Raster(mat)\n", + "\n", + "img * hv.Points(img) + img.table() + \\\n", + "raster * hv.Points(raster) + raster.table()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tabularizing space containers\n", + "\n", + "Even deeply nested objects can be deconstructed in this way, serializing them to make it easier to get your raw data out of a collection of specialized Element types. Let's say we want to make multiple observations of a noisy signal. We can collect the data into a HoloMap to visualize it and then call ``.table()`` to get a columnar object where we can perform operations or transform it to other Element types. Deconstructing nested data in this way only works if the data is homogenous. In practical terms, the requirement is that your data structure contains Elements (of any types) in these Container types: NdLayout, GridSpace, HoloMap, and NdOverlay, with all dimensions consistent throughout (so that they can all fit into the same set of columns).\n", + "\n", + "Let's now go back to the Image example. We will now collect a number of observations of some noisy data into a HoloMap and display it:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "obs_hmap = hv.HoloMap({i: hv.Image(np.random.randn(10, 10), bounds=(0,0,3,3))\n", + " for i in range(3)}, key_dimensions=['Observation'])\n", + "obs_hmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can serialize this data just as before, where this time we get a four-column (4D) table. The key dimensions of both the HoloMap and the Images, as well as the z-values of each Image, are all merged into a single table. We can visualize the samples we have collected by converting it to a Scatter3D object." 
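, + "\n", + "As a quick sanity check (an illustrative aside, not a cell from the original notebook), the merged table should carry the ``Observation``, ``x`` and ``y`` key dimensions plus the ``z`` value dimension:\n", + "\n", + "```python\n", + "merged = obs_hmap.table()\n", + "print(merged.kdims, merged.vdims)  # expect [Observation, x, y] and [z]\n", + "```"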
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Layout [fig_size=150] Scatter3D [color_index=3 size_index=None] (cmap='hot' edgecolor='k' s=50)\n", + "obs_hmap.table().to.scatter3d() + obs_hmap.table()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here the `z` dimension is shown by color, as in the original images, and the other three dimensions determine where the datapoint is shown in 3D. This way of deconstructing will work for any data structure that satisfies the conditions described above, no matter how nested. If we vary the amount of noise while continuing to performing multiple observations, we can create an ``NdLayout`` of HoloMaps, one for each level of noise, and animated by the observation number." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from itertools import product\n", + "extents = (0,0,3,3)\n", + "error_hmap = hv.HoloMap({(i, j): hv.Image(j*np.random.randn(3, 3), bounds=extents)\n", + " for i, j in product(range(3), np.linspace(0, 1, 3))},\n", + " key_dimensions=['Observation', 'noise'])\n", + "noise_layout = error_hmap.layout('noise')\n", + "noise_layout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And again, we can easily convert the object to a ``Table``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Table [fig_size=150]\n", + "noise_layout.table()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Applying operations to the data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Sorting by columns" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once data is in columnar form, it is simple to apply a variety of operations. For instance, Dataset can be sorted by their dimensions using the ``.sort()`` method. By default, this method will sort by the key dimensions, but any other dimension(s) can be supplied to specify sorting along any other dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "bars = hv.Bars((['C', 'A', 'B', 'D'], [2, 7, 3, 4]))\n", + "bars + bars.sort() + bars.sort(['y'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Working with categorical or grouped data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Data is often grouped in various ways, and the Dataset interface provides various means to easily compare between groups and apply statistical aggregates. We'll start by generating some synthetic data with two groups along the x-axis and 4 groups along the y axis." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "n = np.arange(1000)\n", + "xs = np.repeat(range(2), 500)\n", + "ys = n%4\n", + "zs = np.random.randn(1000)\n", + "table = hv.Table((xs, ys, zs), kdims=['x', 'y'], vdims=['z'])\n", + "table" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since there are repeat observations of the same x- and y-values, we have to reduce the data before we display it or else use a datatype that supports plotting distributions in this way. 
The ``BoxWhisker`` type allows doing exactly that:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts BoxWhisker [aspect=2 fig_size=200 bgcolor='w']\n", + "hv.BoxWhisker(table)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Aggregating/Reducing dimensions" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most types require the data to be non-duplicated before being displayed. For this purpose, HoloViews makes it easy to ``aggregate`` and ``reduce`` the data. These two operations are simple inverses of each other--aggregate computes a statistic for each group in the supplied dimensions, while reduce combines all the groups except the supplied dimensions. Supplying only a function and no dimensions will simply aggregate or reduce all available key dimensions." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Bars [show_legend=False] {+axiswise}\n", + "hv.Bars(table).aggregate(function=np.mean) + hv.Bars(table).reduce(x=np.mean)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "(**A**) aggregates over both the x and y dimension, computing the mean for each x/y group, while (**B**) reduces the x dimension leaving just the mean for each group along y." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Collapsing multiple Dataset Elements" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When multiple observations are broken out into a HoloMap they can easily be combined using the ``collapse`` method. Here we create a number of Curves with increasingly larger y-values. By collapsing them with a ``function`` and a ``spreadfn`` we can compute the mean curve with a confidence interval. We then simply cast the collapsed ``Curve`` to a ``Spread`` and ``Curve`` Element to visualize them." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap({i: hv.Curve(np.arange(10)*i) for i in range(10)})\n", + "collapsed = hmap.collapse(function=np.mean, spreadfn=np.std)\n", + "hv.Spread(collapsed) * hv.Curve(collapsed) + collapsed.table()" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/08-Gridded_Datasets.ipynb b/examples/user_guide/08-Gridded_Datasets.ipynb new file mode 100644 index 0000000000..ec9788d905 --- /dev/null +++ b/examples/user_guide/08-Gridded_Datasets.ipynb @@ -0,0 +1,417 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Gridded Datasets" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import xarray as xr\n", + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('matplotlib', 'bokeh')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Scatter3D [size_index=None color_index=3] (cmap='fire')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the [Columnar Data Tutorial](Columnar_Data.ipynb) we covered how to work with columnar data in HoloViews. 
Apart from tabular or column based data there is another data format that is particularly common in the science and engineering contexts, namely multi-dimensional arrays. The gridded data interfaces allow interfacing with grid-based datasets directly.\n", + "\n", + "Grid-based datasets have two types of dimensions:\n", + "\n", + "* they have coordinate or key dimensions, which describe the sampling of each dimension in the value arrays\n", + "* they have value dimensions which describe the quantity of the multi-dimensional value arrays\n", + "\n", + "\n", + "## Declaring gridded data\n", + "\n", + "All Elements that support a ColumnInterface also support the GridInterface. The simplest example of a multi-dimensional (or more precisely 2D) gridded dataset is an image, which has implicit or explicit x-coordinates, y-coordinates and an array representing the values for each combination of these coordinates. Let us start by declaring an Image with explicit x- and y-coordinates:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img = hv.Image((range(10), range(5), np.random.rand(5, 10)), datatype=['grid'])\n", + "img" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the above example we defined that there would be 10 samples along the x-axis, 5 samples along the y-axis and then defined a random ``5x10`` array, matching those dimensions. This follows the NumPy (row, column) indexing convention. When passing a tuple HoloViews will use the first gridded data interface, which stores the coordinates and value arrays as a dictionary mapping the dimension name to a NumPy array representing the data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img.data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However HoloViews also ships with interfaces for ``xarray`` and ``iris``, two common libraries for working with multi-dimensional datasets:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xr_img = img.clone(datatype=['xarray'])\n", + "arr_img = img.clone(datatype=['image'])\n", + "iris_img = img.clone(datatype=['cube'])\n", + "\n", + "print(type(xr_img.data))\n", + "print(type(iris_img.data))\n", + "print(type(arr_img.data))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the case of an Image HoloViews also has a simple image representation which stores the data as a single array and converts the x- and y-coordinates to a set of bounds:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(\"Array type: %s with bounds %s\" % (type(arr_img.data), arr_img.bounds))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To summarize the constructor accepts a number of formats where the value arrays should always match the shape of the coordinate arrays:\n", + "\n", + " 1. A simple np.ndarray along with (l, b, r, t) bounds\n", + " 2. A tuple of the coordinate and value arrays\n", + " 3. A dictionary of the coordinate and value arrays indexed by their dimension names\n", + " 3. XArray DataArray or XArray Dataset\n", + " 4. 
An Iris cube" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Working with a multi-dimensional dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A gridded Dataset may have as many dimensions as desired, however individual Element types only support data of a certain dimensionality. Therefore we usually declare a ``Dataset`` to hold our multi-dimensional data and take it from there." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dataset3d = hv.Dataset((range(3), range(5), range(7), np.random.randn(7, 5, 3)),\n", + " kdims=['x', 'y', 'z'], vdims=['Value'])\n", + "dataset3d" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is because even a 3D multi-dimensional array represents volumetric data which we can only easily display if it only contains a few samples. In this simple case we can get an overview of what this data looks like by casting it to a ``Scatter3D`` Element (which will help us visualize the operations we are applying to the data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Scatter3D(dataset3d)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Indexing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to explore the dataset we therefore often want to define a lower dimensional slice into the array and then convert the dataset:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dataset3d.select(x=1).to(hv.Image, ['y', 'z']) + hv.Scatter3D(dataset3d.select(x=1))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Groupby\n", + "\n", + "Another common method to apply to our data is to facet or animate the data is ``groupby`` operations. HoloViews provides a convient interface to apply ``groupby`` operations and select which dimensions to visualize. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "(dataset3d.to(hv.Image, ['y', 'z'], 'Value', ['x']) +\n", + "hv.HoloMap({x: hv.Scatter3D(dataset3d.select(x=x)) for x in range(3)}, kdims=['x']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Aggregating" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Another common operation is to aggregate the data with a function thereby reducing a dimension. You can either ``aggregate`` the data by passing the dimensions to aggregate or ``reduce`` a specific dimension. Both have the same function:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Image(dataset3d.aggregate(['x', 'y'], np.mean)) + hv.Image(dataset3d.reduce(z=np.mean))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By aggregating the data we can reduce it to any number of dimensions we want. We can for example compute the spread of values for each z-coordinate and plot it using a ``Spread`` and ``Curve`` Element. 
We simply aggregate by that dimension and pass the aggregation functions we want to apply:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.Spread(dataset3d.aggregate('z', np.mean, np.std)) * hv.Curve(dataset3d.aggregate('z', np.mean))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is also possible to generate lower-dimensional views into the dataset which can be useful to summarize the statistics of the data along a particular dimension. A simple example is a box-whisker of the ``Value`` for each x-coordinate. Using the ``.to`` conversion interface we declare that we want a ``BoxWhisker`` Element indexed by the ``x`` dimension showing the ``Value`` dimension. Additionally we have to ensure to set ``groupby`` to an empty list because by default the interface will group over any remaining dimension." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dataset3d.to(hv.BoxWhisker, 'x', 'Value', groupby=[])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similarly we can generate a ``Distribution`` Element showing the ``Value`` dimension, group by the 'x' dimension and then overlay the distributions, giving us another statistical summary of the data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dataset3d.to(hv.Distribution, [], 'Value', groupby='x').overlay()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Categorical dimensions" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The key dimensions of the multi-dimensional arrays do not have to represent continuous values, we can display datasets with categorical variables as a ``HeatMap`` Element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap = hv.HeatMap((['A', 'B', 'C'], ['a', 'b', 'c', 'd', 'e'], np.random.rand(5, 3)))\n", + "heatmap + heatmap.table()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# API" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Accessing the data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to be able to work with data in different formats it defines a general interface to access the data. 
The dimension_values method allows returning underlying arrays.\n", + "\n", + "#### Key dimensions (coordinates)\n", + "\n", + "By default ``dimension_values`` will return the expanded columnar format of the data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap.dimension_values('x')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To access just the unique coordinates along a dimension simply supply the ``expanded=False`` keyword:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap.dimension_values('x', expanded=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally we can also get a non-flattened, expanded coordinate array returning a coordinate array of the same shape as the value arrays" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap.dimension_values('x', flat=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Value dimensions" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When accessing a value dimension the method will also return a flat view of the data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap.dimension_values('z')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can pass the ``flat=False`` argument to access the multi-dimensional array:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "heatmap.dimension_values('z', flat=False)" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/09-Indexing_and_Selecting_Data.ipynb b/examples/user_guide/09-Indexing_and_Selecting_Data.ipynb new file mode 100644 index 0000000000..6910acbc4e --- /dev/null +++ b/examples/user_guide/09-Indexing_and_Selecting_Data.ipynb @@ -0,0 +1,541 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Indexing and Selecting data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As explained in the [Composing Data](Composing_Data.ipynb) and [Containers](Containers.ipynb) tutorials, HoloViews allows you to build up hierarchical containers that express the natural relationships between your data items, in whatever multidimensional space best characterizes your application domain. Once your data is in such containers, individual visualizations are then made by choosing subregions of this multidimensional space, either smaller numeric ranges (as in cropping of photographic images), or lower-dimensional subsets (as in selecting frames from a movie, or a specific movie from a large library), or both (as in selecting a cropped version of a frame from a specific movie from a large library). 
\n", + "\n", + "In this tutorial, we show how to specify such selections, using four different (but related) operations that can act on an element ``e``:\n", + "\n", + "| Operation | Example syntax | Description |\n", + "|:---------------|:----------------:|:-------------|\n", + "| **indexing** | e[5.5], e[3,5.5] | Selecting a single data value, returning one actual numerical value from the existing data\n", + "| **slice** | e[3:5.5], e[3:5.5,0:1] | Selecting a contiguous portion from an Element, returning the same type of Element\n", + "| **sample** | e.sample(y=5.5),
e.sample((3,3)) | Selecting one or more regularly spaced data values, returning a new type of Element\n", + "| **select** | e.select(y=5.5),
e.select(y=(3,5.5)) | More verbose notation covering all supporting slice and index operations by dimension name.\n", + "| **iloc** | e[2, :],
e[2:5, :] | Indexes and slices by row and column tabular index supporting integer indexes, slices, lists and boolean indices.\n", + "\n", + "These operations are all concerned with selecting some subset of your data values, without combining across data values (e.g. averaging) or otherwise transforming your actual data. In the [Tabular Data](Tabular_Datasets.ipynb) tutorial we will look at other operations on the data that reduce, summarize, or transform the data in other ways, rather than selections as covered here.\n", + "\n", + "We'll be going through each operation in detail and provide a visual illustration to help make the semantics of each operation clear. This Tutorial assumes that you are familiar with continuous and discrete coordinate systems, so please review our [Continuous Coordinates Tutorial](Continuous_Coordinates.ipynb) if you have not done so already." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Indexing and slicing Elements" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the [Exploring Data Tutorial](Exploring_Data.ipynb) we saw examples of how to select individual elements embedded in a multi-dimensional space. We also briefly introduced \"deep slicing\" of the ``RGB`` elements to select a subregion of the images. The [Continuous Coordinates Tutorial](Continuous_Coordinates.ipynb) covered slicing and indexing in Elements representing continuous coordinate coordinate systems such as ``Image`` types. Here we'll be going through each operation in full detail, providing a visual illustration to help make the semantics of each operation clear.\n", + "\n", + "How the Element may be indexed depends on the key dimensions (or ``kdims``) of the Element. It is thus important to consider the nature and dimensionality of your data when choosing the Element type for it." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension()\n", + "%opts Layout [fig_size=125] Points [size_index=None] (s=50) Scatter3D [size_index=None]\n", + "%opts Bounds (linewidth=2 color='k') {+axiswise} Text (fontsize=16 color='k') Image (cmap='Reds')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 1D Elements: Slicing and indexing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Certain Chart elements support both single-dimensional indexing and slicing: ``Scatter``, ``Curve``, ``Histogram``, and ``ErrorBars``. Here we'll look at how we can easily slice a ``Histogram`` to select a subregion of it:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "np.random.seed(42)\n", + "edges, data = np.histogram(np.random.randn(100))\n", + "hist = hv.Histogram(edges, data)\n", + "subregion = hist[0:1]\n", + "hist * subregion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The two bins in a different color show the selected region, overlaid on top of the full histogram. We can also access the value for a specific bin in the ``Histogram``. A continuous-valued index that falls inside a particular bin will return the corresponding value or frequency." 
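, + "\n", + "(As a brief illustrative aside, not in the original notebook: the same selections can also be written by dimension name using the ``select`` method listed in the table above.)\n", + "\n", + "```python\n", + "# Equivalent to the slice hist[0:1], but written by dimension name\n", + "hist.select(x=(0, 1))\n", + "```"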
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hist[0.25], hist[0.5], hist[0.55]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can slice a ``Curve`` the same way:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.linspace(0, np.pi*2, 21)\n", + "curve = hv.Curve((xs, np.sin(xs)))\n", + "subregion = curve[np.pi/2:np.pi*1.5]\n", + "curve * subregion * hv.Scatter(curve)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here again the region in a different color is the specified subregion, and we've also marked each discrete point with a dot using the ``Scatter`` ``Element``. As before we can also get the value for a specific sample point; whatever x-index is provided will snap to the closest sample point and return the dependent value:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve[4.05], curve[4.1], curve[4.17], curve[4.3]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is important to note that an index (or a list of indices, as for the 2D and 3D cases below) will always return the raw indexed (dependent) value, i.e. a number. A slice (indicated with `:`), on the other hand, will retain the Element type even in cases where the plot might not be useful, such as having only a single value, two values, or no value at all in that range:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve[4:4.5]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2D and 3D Elements: slicing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For data defined in a 2D space, there are 2D equivalents of the 1D Curve and Scatter types. A ``Points``, for example, can be thought of as a number of points in a 2D space." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "r = np.arange(0, 1, 0.005)\n", + "xs, ys = (r * fn(85*np.pi*r) for fn in (np.cos, np.sin))\n", + "paths = hv.Points((xs, ys))\n", + "paths + paths[0:1, 0:1]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However, indexing is not supported in this space, because there could be many possible points near a given set of coordinates, and finding the nearest one would require a search across potentially incommensurable dimensions, which is poorly defined and difficult to support.\n", + "\n", + "Slicing in 3D works much like slicing in 2D, but indexing is not supported for the same reason as in 2D:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.linspace(0, np.pi*8, 201)\n", + "scatter = hv.Scatter3D((xs, np.sin(xs), np.cos(xs)))\n", + "scatter + scatter[5:10, :, 0:]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2D Raster and Image: slicing and indexing\n", + "\n", + "Raster and the various other image-like objects (Images, RGB, HSV, etc.) 
can all sliced and indexed, as can Surface, because they all have an underlying regular grid of key dimension values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Image (cmap='Blues') Bounds (color='red')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "np.random.seed(0)\n", + "extents = (0, 0, 10, 10)\n", + "img = hv.Image(np.random.rand(10, 10), bounds=extents)\n", + "img_slice = img[1:9,4:5]\n", + "box = hv.Bounds((1,4,9,5))\n", + "img*box + img_slice" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img[4.2,4.2], img[4.3,4.2], img[5.0,4.2]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Tabular indexing and slicing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "While most indexing in HoloViews works by selecting the values along a dimension it is also frequently useful to index and slice using integer row and column indices. For this purpose most HoloViews objects have a ``.iloc`` indexing interface (mirroring the [pandas](http://pandas.pydata.org/pandas-docs/stable/indexing.html#different-choices-for-indexing) API), which supports all the usual indexing semantics including:\n", + "\n", + "* An integer e.g. 5\n", + "\n", + "* A list or array of integers [4, 3, 0]\n", + "\n", + "* A slice object with ints 1:7\n", + "\n", + "* A boolean array" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Indexing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this way we can for example select the x- and y-values in the 8th row of our ``Curve``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.linspace(0, np.pi*2, 21)\n", + "curve = hv.Curve((xs, np.sin(xs)))\n", + "print('x: %s, y: %s' % (curve.iloc[8, 0], curve.iloc[8, 1]))\n", + "curve * hv.Scatter(curve.iloc[8])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Slicing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Alternatively we can select every second sample between indices 5 and 16 of a ``Curve``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve + curve.iloc[5:16:2]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Lists of integers and boolean indices" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally we may also pass a list of the integer samples to select or use boolean indices. This mode of indexing can be very useful for randomly sampling an Element or picking a specific set of rows or (columns):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve.iloc[[0, 5, 10, 15, 20]] + curve.iloc[xs>3]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sampling\n", + "\n", + "Sampling is essentially a process of indexing an Element at multiple index locations, and collecting the results. Thus any Element that can be indexed can also be sampled. Compared to regular indexing, sampling is different in that multiple indices may be supplied at the same time. 
Also, indexing will only return the value at that location, whereas the return type from a sampling operation is another ``Element`` type, usually either a ``Table`` or a ``Curve``, to allow both key and value dimensions to be returned.\n", + "\n", + "### Sampling Elements\n", + "\n", + "Sampling can use either an explicit list of samples, or or by passing the samples for each dimension keyword arguments.\n", + "\n", + "We'll start by taking a single sample of an Image object, to make it clear how sampling and indexing are similar operations yet different in their results:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img_coords = hv.Points(img.table(), extents=extents)\n", + "labeled_img = img * img_coords * hv.Points([img.closest([(5.1,4.9)])])(style=dict(color='r'))\n", + "img + labeled_img + img.sample([(5.1,4.9)])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img[5.1,4.9]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, the output of the indexing operation is the value (0.1965823616800535) from the location closest to the specified , whereas ``.sample()`` returns a Table that lists both the coordinates *and* the value, and slicing (in previous section) returns an Element of the same type, not a Table.\n", + "\n", + "\n", + "Next we can try sampling along only one Dimension on our 2D Image, leaving us with a 1D Element (in this case a ``Curve``):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sampled = img.sample(y=5)\n", + "labeled_img = img * img_coords * hv.Points(zip(sampled['x'], [img.closest(y=5)]*10))\n", + "img + labeled_img + sampled" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Sampling works on any regularly sampled Element type. For example, we can select multiple samples along the x-axis of a Curve." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.arange(10)\n", + "samples = [2, 4, 6, 8]\n", + "curve = hv.Curve(zip(xs, np.sin(xs)))\n", + "curve_samples = hv.Scatter(zip(xs, [0] * 10)) * hv.Scatter(zip(samples, [0]*len(samples))) \n", + "curve + curve_samples + curve.sample(samples)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Sampling HoloMaps\n", + "\n", + "Sampling is often useful when you have more data than you wish to visualize or analyze at one time. First, let's create a HoloMap containing a number of observations of some noisy data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "obs_hmap = hv.HoloMap({i: hv.Image(np.random.randn(10, 10), bounds=extents)\n", + " for i in range(3)}, key_dimensions=['Observation'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloMaps also provide additional functionality to perform regular sampling on your data. In this case we'll take 3x3 subsamples of each of the Images." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sample_style = dict(edgecolors='k', alpha=1)\n", + "all_samples = obs_hmap.table().to.scatter3d()(style=dict(alpha=0.15))\n", + "sampled = obs_hmap.sample((3,3))\n", + "subsamples = sampled.to.scatter3d()(style=sample_style)\n", + "all_samples * subsamples + sampled" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By supplying bounds in as a (left, bottom, right, top) tuple we can also sample a subregion of our images:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sampled = obs_hmap.sample((3,3), bounds=(2,5,5,10))\n", + "subsamples = sampled.to.scatter3d()(style=sample_style)\n", + "all_samples * subsamples + sampled" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since this kind of sampling is only well supported for continuous coordinate systems, we can only apply this kind of sampling to Image types for now.\n", + "\n", + "### Sampling Charts\n", + "\n", + "Sampling Chart-type Elements like Curve, Scatter, Histogram is only supported by providing an explicit list of samples, since those Elements have no underlying regular grid." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xs = np.arange(10)\n", + "extents = (0, 0, 2, 10)\n", + "curve = hv.HoloMap({(i) : hv.Curve(zip(xs, np.sin(xs)*i))\n", + " for i in np.linspace(0.5, 1.5, 3)},\n", + " key_dimensions=['Observation'])\n", + "all_samples = curve.table().to.points()\n", + "sampled = curve.sample([0, 2, 4, 6, 8])\n", + "sampling = all_samples * sampled.to.points(extents=extents)(style=dict(color='r'))\n", + "sampling + sampled" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Alternatively, you can always deconstruct your data into a Table (see the [Columnar Data](Columnar_Data.ipynb) tutorial) and perform ``select`` operations instead. This is also the easiest way to sample ``NdElement`` types like Bars. Individual samples should be supplied as a set, while ranges can be specified as a two-tuple." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sampled = curve.table().select(Observation=(0, 1.1), x={0, 2, 4, 6, 8})\n", + "sampling = all_samples * sampled.to.points(extents=extents)(style=dict(color='r'))\n", + "sampling + sampled" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These tools should help you index, slice, sample, and select your data with ease. The [Columnar Data](Columnar_Data.ipynb) tutorial) explains how to do other types of operations, such as averaging and other reduction operations." 
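+ "\n", + "As a quick recap, the access patterns covered in this guide can be summarized in a minimal sketch (the short variable names are just for illustration):\n", + "\n", + "```python\n", + "xs = np.linspace(0, np.pi*2, 21)\n", + "c = hv.Curve((xs, np.sin(xs)))\n", + "c[4.17] # indexing: snaps to the nearest sample and returns the raw y-value\n", + "c[4:4.5] # slicing: returns a new Curve restricted to that x-range\n", + "c.sample([2, 4]) # sampling: returns a Table of the sampled coordinates and values\n", + "c.iloc[8] # tabular indexing: selects the row at integer position 8\n", + "```"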
+ ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/10-Transforming_Elements.ipynb b/examples/user_guide/10-Transforming_Elements.ipynb new file mode 100644 index 0000000000..007a5f5592 --- /dev/null +++ b/examples/user_guide/10-Transforming_Elements.ipynb @@ -0,0 +1,331 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Applying Transformations" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import param\n", + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh', 'matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloViews objects provide a convenient way of wrapping your data along with some metadata for exploration and visualization. For the simplest visualizations, you can simply declare a small collection of elements which can then be composed or placed in an appropriate container. As soon as the task becomes more complex, it is natural to write a function or a class to output HoloViews objects.\n", + "\n", + "In this tutorial, the concept of ``Operations`` is introduced, which helps structure such code, making it possible to write general code that can process HoloViews objects. This enables powerful new ways of specifying HoloViews objects computed from existing data, allowing flexible data processing pipelines to be constructed. Examples of such operations are ``histogram``, ``rolling``, ``datashade`` or ``decimate``, which apply some computation on certain types of Element and return a new Element with the transformed data.\n", + "\n", + "In this tutorial we will discover how operations work, how to control their parameters and how to chain them. The [Dynamic_Operations](Dynamic_Operations.ipynb) tutorial extends what we have learned to demonstrate how operations can be applied lazily by using the ``dynamic`` flag, letting us define deferred processing pipelines that can drive highly complex visualizations and dashboards.\n", + "\n", + "\n", + "## Operations are parameterized\n", + "\n", + "Operations in HoloViews are subclasses of ``Operation``, which transform one Element or ``Overlay`` of Elements by returning a new Element that may be a transformation of the original. All operations are parameterized using the [param](https://ioam.github.io/param/) library which allows easy validation and documentation of the operation arguments. In particular, operations are instances of ``param.ParameterizedFunction``, which allows operations to be used in the same way as normal python functions.\n", + "\n", + "This approach has several advantages, one of which is that we can manipulate the parameters of an operation at several different levels: at the class-level, at the instance-level or when calling it. Another advantage is that parameterizing operations allows them to be inspected just like any other HoloViews object using ``hv.help``. 
We will now do this for the ``histogram`` operation:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.operation import histogram\n", + "hv.help(histogram)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Applying operations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Above we can see a listing of all the parameters of the operation, with their defaults, the expected types and detailed docstrings for each one. The ``histogram`` operation can be applied to any Element and will by default generate a histogram for the first value dimension defined on the object it is applied to. As a simple example we can create a ``BoxWhisker`` Element containing samples from a normal distribution, and then apply the ``histogram`` operation to those samples in two ways: 1) by creating an instance on which we will change the ``num_bins`` and 2) by passing ``bin_range`` directly when calling the operation:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "boxw = hv.BoxWhisker(np.random.randn(10000))\n", + "histop_instance = histogram.instance(num_bins=50)\n", + "\n", + "boxw + histop_instance(boxw).relabel('num_bins=50') + histogram(boxw, bin_range=(0, 3)).relabel('bin_range=(0, 3)')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that these two ways of using operations give us convenient control over how the parameters are applied. An instance allows us to persist some defaults which will be used in all subsequent calls, while passing keyword arguments to the operation applies the parameters for just that particular call.\n", + "\n", + "The third way to manipulate parameters is to set them at the class level. If we always want to use ``num_bins=30`` instead of the default of ``num_bins=20`` shown in the help output above, we can simply set ``histogram.num_bins=30``. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Operations on containers" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "``Operations`` in HoloViews are applied to individual elements, which means that when you apply an operation to a container object (such as ``NdLayout``, ``GridSpace`` and ``HoloMap``) the operation is applied once per element. For an operation to work, all the elements must be of the same type, which means the operation is effectively mapped over all the contained elements. As a simple example we can define a HoloMap of ``BoxWhisker`` Elements by varying the width of the distribution via the ``Sigma`` value and then apply the histogram operation to it:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "holomap = hv.HoloMap({(i*0.1+0.1): hv.BoxWhisker(np.random.randn(10000)*(i*0.1+0.1)) for i in range(5)},\n", + "                     kdims=['Sigma'])\n", + "holomap + histogram(holomap)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, the operation has generated a ``Histogram`` for each value of ``Sigma`` in the ``HoloMap``. In this way we can apply the operation to the entire parameter space defined by a ``HoloMap``, ``GridSpace`` or ``NdLayout``."
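+ "\n", + "To recap the three levels at which operation parameters can be set, here is a minimal sketch (it only uses the ``num_bins`` and ``bin_range`` parameters discussed above; the variable names are purely illustrative):\n", + "\n", + "```python\n", + "from holoviews.operation import histogram\n", + "\n", + "# Class level: changes the default for every subsequent call\n", + "histogram.num_bins = 30\n", + "\n", + "# Instance level: defaults persist only on this instance\n", + "hist50 = histogram.instance(num_bins=50)\n", + "\n", + "# Call level: applies to this call alone\n", + "out = hist50(holomap, bin_range=(-3, 3))\n", + "```"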
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Combining operations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since operations take a HoloViews object as input and return another HoloViews object we can very easily chain and combine multiple operations to perform complex analyses quickly and easily, while instantly visualizing the output.\n", + "\n", + "In this example we'll work with a timeseries, so we'll define a small function to generate a random, noisy timeseries:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.operation import timeseries\n", + "\n", + "def time_series(T = 1, N = 100, mu = 0.1, sigma = 0.1, S0 = 20): \n", + " \"\"\"Parameterized noisy time series\"\"\"\n", + " dt = float(T)/N\n", + " t = np.linspace(0, T, N)\n", + " W = np.random.standard_normal(size = N) \n", + " W = np.cumsum(W)*np.sqrt(dt) # standard brownian motion\n", + " X = (mu-0.5*sigma**2)*t + sigma*W \n", + " S = S0*np.exp(X) # geometric brownian motion\n", + " return S\n", + "\n", + "curve = hv.Curve(time_series(N=1000))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we will start applying some operations to this data. HoloViews ships with two ready-to-use timeseries operations: the ``rolling`` operation, which applies a function over a rolling window, and a ``rolling_outlier_std`` operation that computes outlier points in a timeseries by excluding points less than ``sigma`` standard deviation removed from the rolling mean:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter [width=600] (color='black')\n", + "smoothed = curve * timeseries.rolling(curve) * timeseries.rolling_outlier_std(curve)\n", + "smoothed" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the next section we will define a custom operation that will compose with the ``smoothed`` operation output above to form a short operation pipeline." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Defining custom operations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now define our own custom ``Operation`` which as you may recall can process either elements and overlays. This means we can define a simple operation that takes our ``smoothed`` overlay and computes a difference between the raw and smoothed curves that it contains. Such a subtraction will give us the residual between the smoothed and unsmoothed ``Curve`` elements, removing long-term trends and leaving the short-term variation.\n", + "\n", + "Defining an operation is very simple. An ``Operation`` subclass should define a ``_process`` method, which simply accepts an ``element`` argument. Optionally we can also define parameters on the operation, which we can access using the ``self.p`` attribute on the operation. In this case we define a ``String`` parameter, which specifies the name of the subtracted value dimension on the returned Element." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.operation import Operation\n", + "\n", + "class residual(Operation):\n", + "    \"\"\"\n", + "    Subtracts two curves from one another.\n", + "    \"\"\"\n", + "    \n", + "    label = param.String(default='Residual', doc=\"\"\"\n", + "        Defines the label of the returned Element.\"\"\")\n", + "    \n", + "    def _process(self, element, key=None):\n", + "        # Get first and second Element in overlay\n", + "        el1, el2 = element.get(0), element.get(1)\n", + "        \n", + "        # Get x-values and y-values of curves\n", + "        xvals = el1.dimension_values(0)\n", + "        yvals = el1.dimension_values(1)\n", + "        yvals2 = el2.dimension_values(1)\n", + "        \n", + "        # Return new Element with subtracted y-values\n", + "        # and new label\n", + "        return el1.clone((xvals, yvals-yvals2),\n", + "                         vdims=[self.p.label])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Having defined the residual operation, let's try it out right away by applying it to our original and smoothed ``Curve``. We'll place the two objects on top of each other so they can share an x-axis and we can compare them directly:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve [width=600] Overlay [xaxis=None]\n", + "(smoothed + residual(smoothed)).cols(1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this view we can immediately see that only a very small residual is left when applying this level of smoothing. However we have only tried one particular ``rolling_window`` value, the default value of ``10``. To assess how this parameter affects the residual we can evaluate the operation over a number of different parameter settings, as we will now see in the next section." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Evaluating operation parameters" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When applying an operation there are often various parameters to vary. Using traditional plotting approaches it's often difficult to evaluate them interactively to get a detailed understanding of what they do. Here we will apply the ``rolling`` operation with varying ``rolling_window`` widths and ``window_type``s across a ``HoloMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "rolled = hv.HoloMap({(w, str(wt)): timeseries.rolling(curve, rolling_window=w, window_type=wt)\n", + "                     for w in [10, 25, 50, 100, 200] for wt in [None, 'hamming', 'triang']},\n", + "                    kdims=['Window', 'Window Type'])\n", + "rolled" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This visualization is already useful as we can compare various parameter values by moving the slider and trying different window options. However, since we can chain operations, we can also easily compute the residual and view the two together. \n", + "\n", + "To do this we simply overlay the ``HoloMap`` of smoothed curves on top of the original curve and pass it to our new ``residual`` function. 
Then we can combine the smoothed view with the original and see how the smoothing and residual curves vary across parameter values:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve [width=600] Overlay [legend_position='top_left']\n", + "(curve(style=dict(color='black')) * rolled + residual(curve * rolled)).cols(1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using a few additional lines we have now evaluated the operation over a number of different parameters values, allowing us to process the data with different smoothing parameters. In addition, by interacting with this visualization we can gain a better understanding of the operation parameters as well as gain insights into the structure of the underlying data.\n", + "\n", + "## Benefits of using ``Operation``\n", + "\n", + "Now that we have seen some operations in action we can get some appreciation of what makes them useful. When working with data interactively you often end up applying a lot of ad-hoc data transforms, which provides maximum flexibility but is neither reproducible nor maintainable. Operations allow you to encapsulate analysis code using a well defined interface that is well suited for building complex analysis pipelines:\n", + "\n", + "1. ``Operation`` parameters are well defined by declaring parameters on the class. These parameters can be easily documented and automatically carry out validation on the types and ranges of the inputs. These parameters are documented using ``hv.help``.\n", + "\n", + "2. Both inputs and outputs of an operation are instantly visualizable, because the data **is** the visualization. This means you're not constantly context switching between data processing and visualization --- visualization comes for free as you build your data processing pipeline.\n", + "\n", + "3. Operations understand HoloViews datastructures and can be immediately applied to any appropriate collection of elements, allowing you to evaluate the operation with permutations of parameter values. This flexibility makes it easy to assess the effect of operation parameters and their effect on your data.\n", + "\n", + "4. As we will discover in the [Dynamic Operation Tutorial](Dynamic_Operations.ipynb), operations can be applied lazily to build up complex deferred data-processing pipelines, which can aid your data exploration and drive interactive visualizations and dashboards.\n", + "\n", + "## Other types of operation\n", + "\n", + "As we have seen ``Operation`` is defined at the level of processing HoloViews elements or overlays of elements. In some situations, you may want to compute a new HoloViews datastructure from a number of elements contained in a structure other than an overlay, such as a HoloMap or a Layout. \n", + "\n", + "One such pattern is an operation that accepts and returns a ``HoloMap`` where each of the output element depends on all the data in the input ``HoloMap``. For situations such as these, subclassing ``Operation`` is not appropriate and we recommend defining your own function. These custom operation types won't automatically gain support for lazy pipelines as described in the [Dynamic Operation Tutorial](Dynamic_Operations.ipynb) and how these custom operations are pipelined is left as a design decision for the user. Note that as long as these functions return simple elements or containers, their output can be used by subclasses of ``Operation`` as normal. 
\n", + "\n", + "What we *do* recommend is that you subclass from ``param.ParameterizedFunction`` so that you can declare well-documented and validated parameters, add a description of your operation with a class level docstring and gain automatic documentation support via ``hv.help``." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/user_guide/11-Responding_to_Events.ipynb b/examples/user_guide/11-Responding_to_Events.ipynb new file mode 100644 index 0000000000..c14fc953ba --- /dev/null +++ b/examples/user_guide/11-Responding_to_Events.ipynb @@ -0,0 +1,825 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Responding to Events" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "import numpy as np\n", + "hv.extension()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the [DynamicMap tutorial](Dynamic_Map.ipynb) we saw how ``DynamicMap`` allows you to explore high dimensional data using the widgets in the same style as ``HoloMaps``. Although suitable for unbounded exploration of large parameter spaces, the ``DynamicMaps`` described in that notebook support exactly the same mode of interaction as ``HoloMaps``. In particular, the key dimensions are used to specify a set of widgets that when manipulated apply the appopriate indexing to invoke the user-supplied callable.\n", + "\n", + "In this tutorial we will explore the HoloViews streams system that allows *any* sort of value to be supplied from *anywhere*. This system opens a huge set of new possible visualization types, including continuously updating plots that reflect live data as well as dynamic visualizations that can be interacted with directly, as described in the [Linked Streams](Linked_Streams.ipynb) tutorial.\n", + "\n", + "
To visualize and use a DynamicMap, you need to be running a live Jupyter server.
This tutorial assumes that it will be run in a live notebook environment.
\n", + "When viewed statically, DynamicMaps will only show the first available Element,
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Styles and plot options used in this tutorial\n", + "%opts Ellipse [bgcolor='w'] (color='b')\n", + "%opts Image (cmap='viridis')\n", + "%opts VLine (color='r' linewidth=2) HLine (color='r' linewidth=1)\n", + "%opts Path [show_grid=False bgcolor='w'] (color='k' linestyle='-.')\n", + "%opts Area (hatch='\\\\' facecolor='cornsilk' linewidth=2 edgecolor='k')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A simple ``DynamicMap``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before introducing streams, let us declare a simple ``DynamicMap`` of the sort discussed in the [DynamicMap tutorial](Dynamic_Map.ipynb). This example consists of an ``Curve`` element showing a [Lissajous curve](https://en.wikipedia.org/wiki/Lissajous_curve) with ``VLine`` and ``HLine`` annotation to form a crosshair:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lin = np.linspace(-np.pi,np.pi,300)\n", + "\n", + "def lissajous(t, a,b, delta):\n", + " return (np.sin(a * t + delta), np.sin(b * t))\n", + "\n", + "def lissajous_curve(t, a=3,b=5, delta=np.pi/2):\n", + " (x,y) = lissajous(t,a,b,delta)\n", + " return hv.Path(lissajous(lin,a,b,delta)) * hv.VLine(x) * hv.HLine(y)\n", + "\n", + "hv.DynamicMap(lissajous_curve, kdims=['t']).redim.range(t=(-3.,3.))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the declared key dimension (``kdims``) has turned into slider widgets that let us move the crosshair along the curve. Now let's see how to position the crosshair using streams." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Introducing streams\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The core concept behind a stream is simple: it is a parameter that can change over time that automatically refreshes code depending on those parameter values. \n", + "\n", + "Like all objects in HoloViews, these parameters are declared using [param](https://ioam.github.io/param) and you can define streams as a parameterized subclass of the ``holoviews.streams.Stream``. A more convenient way is to use the ``Stream.define`` classmethod:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.streams import Stream, param\n", + "Time = Stream.define('Time', t=0.0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This results in a ``Time`` class with a numeric ``t`` parameter that defaults to zero. As this object is parameterized, we can use ``hv.help`` to view it's parameters:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.help(Time)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This parameter is a ``param.Number`` as we supplied a float, if we have supplied an integer it would have been a ``param.Integer``. 
Notice that there is no docstring in the help output above but we can add one by explicit defining the parameter as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Time = Stream.define('Time', t=param.Number(default=0.0, doc='A time parameter'))\n", + "hv.help(Time)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we have defined this ``Time`` stream class, we can make of an instance of it and looks at its parameters:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "time_dflt = Time()\n", + "print('This Time instance has parameter t={t}'.format(t=time_dflt.t))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As with all parameterized classes, we can instantiate our parameters with a suitable value of our choice instead of relying on defaults." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "time = Time(t=np.pi/4)\n", + "print('This Time instance has parameter t={t}'.format(t=time.t))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For more information on defining ``Stream`` classes this way, use ``hv.help(Stream.define)``." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Simple streams example" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now supply this streams object to a ``DynamicMap`` using the same ``lissajous_curve`` callback above by adding it to the ``streams`` list:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap = hv.DynamicMap(lissajous_curve, streams=[time])\n", + "dmap + lissajous_curve(t=np.pi/4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Immediately we see that the crosshair position of the ``DynamicMap`` reflects the ``t`` parameter values we set on the ``Time`` stream. This means that the ``t`` parameter was supplied as the argument to the ``lissajous_curve`` callback. As we now have no key dimensions, there is no widgets for the ``t`` dimensions.\n", + "\n", + "Although we have what looks like a static plot, it is in fact dynamic and can be updated in place at any time. To see this, we can call the ``event`` method on our ``DynamicMap``:\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.event( t=0.2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Running this cell will have updated the crosshair from its original position where $t=\\frac{\\pi}{4}$ to a new position where ``t=0.2``. Try running the cell above with different values of ``t`` and watch the plot update!\n", + "\n", + "This ``event`` method is the recommended way of updating the stream parameters on a ``DynamicMap`` but if you have a handle on the relevant stream instance, you can also call the ``event`` method on that:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "time.event(t=-0.2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Running the cell above also move the crosshair to a new position. 
As there are no key dimensions, there is only a single valid (empty) key that can be accessed with ``dmap[()]`` or ``dmap.select()`` making ``event`` the only way to explore new parameters.\n", + "\n", + "We will examine the ``event`` method and the machinery that powers streams in more detail later in the tutorial after we have looked at more examples of how streams are used in practice." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Working with multiple streams" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The previous example showed a curve parameterized by a single dimension ``t``. Often you will have multiple stream parameters you would like to declare as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ls = np.linspace(0, 10, 200)\n", + "xx, yy = np.meshgrid(ls, ls)\n", + "\n", + "XY = Stream.define('XY',x=0.0,y=0.0)\n", + "\n", + "def marker(x,y):\n", + " return hv.Image(np.sin(xx)*np.cos(yy)) * hv.VLine(x) * hv.HLine(y)\n", + "\n", + "dmap = hv.DynamicMap(marker, streams=[XY()])\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can update both ``x`` and ``y`` by passing multiple keywords to the ``event`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.event(x=-0.2, y=0.1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the definition above behaves the same as this definition where we define separate ``X`` and ``Y`` stream classes:\n", + "\n", + "```python\n", + "X = Stream.define('X',x=0.0)\n", + "Y = Stream.define('Y',y=0.0)\n", + "hv.DynamicMap(crosshairs, streams=[X(),Y()])\n", + "```\n", + "\n", + "The reason why you might want to list multiple streams instead of always defining a single stream containing all the required stream parameters will be made clear in the [Linked Streams](Linked_Streams.ipynb) tutorial." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Combining streams and key dimensions\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All the ``DynamicMap`` examples above can't be indexed with anything other than ``dmap[()]`` or ``dmap.select()`` as none of them had any key dimensions. 
This was to focus exclusively on the streams system at the start of the tutorial and not because you can't combine key dimensions and streams:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%opts Curve (linestyle='-')\n", + "xs = np.linspace(-3, 3, 400)\n", + "\n", + "def function(xs, time):\n", + " \"Some time varying function\"\n", + " return np.exp(np.sin(xs+np.pi/time))\n", + "\n", + "def integral(limit, time):\n", + " curve = hv.Curve((xs, function(xs, time)))[limit:]\n", + " area = hv.Area((xs, function(xs, time)))[:limit]\n", + " summed = area.dimension_values('y').sum() * 0.015 # Numeric approximation\n", + " return (area * curve * hv.VLine(limit) * hv.Text(limit + 0.5, 2.0, '%.2f' % summed))\n", + "\n", + "Time = Stream.define('Time', time=1.0)\n", + "dmap=hv.DynamicMap(integral, kdims=['limit'], streams=[Time()]).redim.range(limit=(-3,2))\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this example, you can drag the slider to see a numeric approximation to the integral on the left side on the ``VLine``.\n", + "\n", + "As ``'limit'`` is declared as a key dimension, it is given a normal HoloViews slider. As we have also defined a ``time`` stream, we can update the displayed curve for any time value:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.event(time=8)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We now see how to control the ``time`` argument of the integral function by triggered an event with a new time value, and how to control the ``limit`` argument by moving a slider. Controlling ``limit`` with a slider this way is valid but also a little unintuitive: what if you could control ``limit`` just by hovering over the plot?\n", + "\n", + "In the [Linked Streams](Linked_Streams.ipynb) tutorial, we will see how we can do exactly this by switching to the bokeh backend and using the linked streams system." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Matching names to arguments" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that in the example above, the key dimension names and the stream parameter names match the arguments to the callable. This *must* be true for stream parameters but this isn't a requirement of key dimensions: if you replace the word 'radius' with 'size' in the example above after ``XY`` is defined, the example still works. \n", + "\n", + "Here are the rules regarding the callback argument names:\n", + "\n", + "* If your key dimensions and stream parameters match the callable argument names, the definition is valid.\n", + "* If your callable accepts mandatory positional arguments and their number matches the number of key dimensions, the names don't need to match and these arguments will be passed key dimensions values.\n", + "\n", + "As stream parameters always need to match the argument names, there is a method to allow them to be easily renamed. Let's say you imported a stream class as shown in [Custom_Interactivity](12-Custom_Interactivity.ipynb) or for this example, reuse the existing ``XY`` stream class. 
You can then use the ``rename`` method allowing the following definition:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def integral2(lim, t): \n", + " 'Same as integral with different argument names'\n", + " return integral(lim, t)\n", + "\n", + "dmap = hv.DynamicMap(integral2, kdims=['limit'], streams=[Time().rename(time='t')]).redim.range(limit=(-3.,3.))\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Occasionally, it is useful to suppress some of the stream parameters of a stream class, especially when using the *linked streams* described in [Custom_Interactivity](12-Custom_Interactivity.ipynb). To do this you can rename the stream parameter to ``None`` so that you no longer need to worry about it being passed as an argument to the callable. To re-enable a stream parameter, it is sufficient to either give the stream parameter it's original string name or a new string name." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Overlapping stream and key dimensions" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the above example above, the stream parameters do not overlap with the declared key dimension. What happens if we add 'time' to the declared key dimensions?\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap=hv.DynamicMap(integral, kdims=['time','limit'], streams=[Time()]).redim.range(limit=(-3.,3.))\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First you might notice that the 'time' value is now shown in the title but that there is no corresponding time slider as its value is supplied by the stream.\n", + "\n", + "The 'time'parameters is now what is called 'dimensioned streams' which renables indexing of these dimensions:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap[1,0] + dmap.select(time=3,limit=1.5) + dmap[None,1.5]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In **A**, we supply our own values for the 'time and 'limit' parameters. This doesn't change the values of the 'time' parameters on the stream itself but it does allow us to see what would happen when the time value is one. Note the use of ``None`` in **C** as a way of leaving an explicit value unspecified, allowing the current stream value to be used.\n", + "\n", + "This is one good reason to use dimensioned streams - it restores access to convenient indexing and selecting operation as a way of exploring your visualizations. The other reason it is useful is that if you keep all your parameters dimensioned, it re-enables the ``DynamicMap`` cache described in the [DynamicMap tutorial](Dynamic_Map.ipynb), allowing you to record your interaction with streams and allowing you to cast to ``HoloMap`` for export:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.reset() # Reset the cache, we don't want the values from the cell above\n", + "# TODO: redim the limit dimension to a default of 0\n", + "dmap.event(time=1)\n", + "dmap.event(time=1.5)\n", + "dmap.event(time=2)\n", + "hv.HoloMap(dmap)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One use of this would be to have a simulator drive a visualization forward using ``event`` in a loop. 
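For example, a hypothetical driver loop for the ``dmap`` defined above might look like this:\n", + "\n", + "```python\n", + "for step in range(10):\n", + "    dmap.event(time=step*0.5)  # each call updates the displayed plot in place\n", + "```\n", + "\n", + "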
You could then stop your simulation and retain the a recent history of the output as long as the allowed ``DynamicMap`` cache." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Generators and argument-free callables" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to callables, Python supports [generators](https://docs.python.org/3/glossary.html#term-generator) that can be defined with the ``yield`` keyword. Calling a function that uses yield returns a [generator iterator](https://docs.python.org/3/glossary.html#term-generator-iterator) object that accepts no arguments but returns new values when iterated or when ``next()`` is applied to it.\n", + "\n", + "HoloViews supports Python generators for completeness and [generator expressions](https://docs.python.org/3/glossary.html#term-generator-expression) can be a convenient way to define code inline instead of using lambda functions. As generators expressions don't accept arguments and can get 'exhausted' ***we recommend using callables with ``DynamicMap``*** - exposing the relevant arguments also exposes control over your visualization.\n", + "\n", + "Callables that have arguments, unlike generators allow you to re-visit portions of your parameter space instead of always being forced in one direction via calls to ``next()``. With this caveat in mind, here is an example of a generator and the corresponding generator iterator that returns a ``BoxWhisker`` element:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def sample_distributions(samples=10, tol=0.04):\n", + " np.random.seed(42)\n", + " while True:\n", + " gauss1 = np.random.normal(size=samples)\n", + " gauss2 = np.random.normal(size=samples)\n", + " data = (['A']*samples + ['B']*samples, np.hstack([gauss1, gauss2]))\n", + " yield hv.BoxWhisker(data, kdims=['Group'], vdims=['Value'])\n", + " samples+=1\n", + " \n", + "sample_generator = sample_distributions()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This returns two box whiskers representing samples from two Gaussian distributions of 10 samples. Iterating over this generator simply resamples from these distributions using an additional sample each time.\n", + "\n", + "As with a callable, we can pass our generator iterator to ``DynamicMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.DynamicMap(sample_generator)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Without using streams, we now have a problem as there is no way to trigger the generator to view the next distribution in the sequence. We can solve this by defining a stream with no parameters:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap = hv.DynamicMap(sample_generator, streams=[Stream.define('Next')()])\n", + "dmap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Stream event update loops" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we can simply use ``event()`` to drive the generator forward and update the plot, showing how the two Gaussian distributions converge as the number of samples increase." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for i in range(40):\n", + " dmap.event()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that there is a better way to run loops that drive ``dmap.event()`` which supports a ``period`` (in seconds) between updates and a ``timeout`` argument (also in seconds):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.periodic(0.1, 1000, timeout=3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this generator example, ``event`` does not require any arguments but you can set the ``param_fn`` argument to a callable that takes an iteration counter and returns a dictionary for setting the stream parameters. In addition you can use ``block=False`` to avoid blocking the notebook using a threaded loop. This can be very useful although there can have two downsides 1. all running visualizations using non-blocking updates will be competing for computing resources 2. if you override a variable that the thread is actively using, there can be issues with maintaining consistent state in the notebook.\n", + "\n", + "Generally, the ``periodic`` utility is recommended for all such event update loops and it will be used instead of explicit loops in the rest of the tutorials involving streams.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Using ``next()``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The approach shown above of using an empty stream works in an exactly analogous fashion for callables that take no arguments. In both cases, the ``DynamicMap`` ``next()`` method is enabled:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.HoloMap({i:next(dmap) for i in range(10)}, kdims=['Iteration'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Next steps" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The streams system allows you to update plots in place making it possible to build live visualizations that update in response to incoming live data or any other type of event. As we have seen in this tutorial, you can use streams together with key dimensions to add additional interactivity to your plots while retaining the familiar widgets.\n", + "\n", + "This tutorial used examples that work with either the matplotlib or bokeh backends. In the [Linked Streams](Linked_Streams.ipynb) tutorial, you will see how you can directly interact with dynamic visualizations when using the bokeh backend." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## [Advanced] How streams work\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This optional section is not necessary for users who simply want to use the streams system, but it does describes how streams actually work in more detail.\n", + "\n", + "A stream class is one that inherits from ``Stream`` that typically defines some new parameters. 
We have already seen one convenient way of defining a stream class:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "defineXY = Stream.define('defineXY', x=0.0, y=0.0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is equivalent to the following definition which would be more appropriate in library code or for complex stream class requiring lots of parameters that need to be documented:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "class XY(Stream):\n", + " x = param.Number(default=0.0, constant=True, doc='An X position.')\n", + " y = param.Number(default=0.0, constant=True, doc='A Y position.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we have already seen, we can make an instance of ``XY`` with some initial values for ``x`` and ``y``." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xy = XY(x=2,y=3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However, trying to modify these parameters directly will result in an exception as they have been declared constant (e.g ``xy.x=4`` will throw an error). This is because there are two allowed ways of modifying these parameters, the simplest one being ``update``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xy.update(x=4,y=50)\n", + "xy.rename(x='xpos', y='ypos').contents" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This shows how you can update the parameters and also shows the correct way to view the stream parameter values via the ``contents`` property as this will apply any necessary renaming.\n", + "\n", + "So far, using ``update`` has done nothing but forced us to access parameter a certain way. What makes streams work are the side-effects you can trigger when changing a value via the ``event`` method. The relevant side-effect is to invoke callables called 'subscribers'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Subscribers" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Without defining any subscribes, the ``event`` method is identical to ``update``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xy = XY()\n", + "xy.event(x=4,y=50)\n", + "xy.contents" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's add a subscriber:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def subscriber(xpos,ypos):\n", + " print('The subscriber received xpos={xpos} and ypos={ypos}'.format(xpos=xpos,ypos=ypos))\n", + "\n", + "xy = XY().rename(x='xpos', y='ypos')\n", + "xy.add_subscriber(subscriber)\n", + "xy.event(x=4,y=50)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we can see, now when you call ``event``, our subscriber is called with the updated parameter values, renamed as appropriate. The ``event`` method accepts the original parameter names and the subscriber receives the new values after any renaming is applied. 
You can add as many subscribers as you want and you can clear them using the ``clear`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xy.clear()\n", + "xy.event(x=0,y=0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When you define a ``DynamicMap`` using streams, the HoloViews plotting system install the necessary callbacks as subscibers to update the plot when the stream parameters change. The above example clears all subscribers (it is equivalent to ``clear('all')``. To clear only the subscribers you define yourself use ``clear('user')`` and to clear any subscribers installed by the HoloViews plotting system use ``clear('internal')``.\n", + "\n", + "When using linked streams as described in the [Linked Streams](Linked_Streams.ipynb) tutorial, the plotting system recognizes the stream class and registers the necessary machinery with Bokeh to update the stream values based on direct interaction with the plot." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/12-Custom_Interactivity.ipynb b/examples/user_guide/12-Custom_Interactivity.ipynb new file mode 100644 index 0000000000..9e6d2e0b60 --- /dev/null +++ b/examples/user_guide/12-Custom_Interactivity.ipynb @@ -0,0 +1,412 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Custom Interactivity" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import param\n", + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh', 'matplotlib')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In previous notebooks we discovered how the [DynamicMap](Dynamic_Map.ipynb) class allows us to declare objects in a lazy way to enable exploratory analysis of large parameter spaces. In the [Streams](Streams.ipynb) tutorial we learned how to interactively push updates to existing plots by declaring Streams on a DynamicMap. In this tutorial we will extend the idea to so called *linked* Streams, which allows complex interactions to be declared by specifying which events should be exposed when a plot is interacted with. By passing information about live interactions to a simple Python based callback, you will be able to build richer, even more interactive visualizations that enable seemless data exploration.\n", + "\n", + "Some of the possibilities this opens up include:\n", + "\n", + "* Dynamically aggregating datasets of billions of datapoints depending on the plot axis ranges using the [datashader](https://anaconda.org/jbednar/holoviews_datashader/notebook) library.\n", + "* Responding to ``Tap`` and ``DoubleTap`` events to reveal more information in subplots.\n", + "* Computing statistics in response to selections applied with box- and lasso-select tools.\n", + "\n", + "Currently only the bokeh backend for HoloViews supports the linked streams system but the principles used should extend to any backend can define callbacks that fire when a user zooms or pans or interacts with a plot.\n", + "\n", + "
To use and visualize DynamicMap or Stream objects, you need to be running a live Jupyter server.
This tutorial assumes that it will be run in a live notebook environment.
\n", + "When viewed statically, DynamicMaps will only show the first available Element,
\n", + "\n", + "## Available Linked Streams\n", + "\n", + "There are a huge number of ways one might want to interact with a plot. The HoloViews streams module aims to expose many of the most common interactions you might want want to employ, while also supporting extensibility via custom linked Streams. \n", + "\n", + "Here is the full list of linked Stream that are all descendents of the ``LinkedStream`` baseclass:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews import streams\n", + "listing = ', '.join(sorted([str(s.name) for s in param.descendents(streams.LinkedStream)]))\n", + "print('The linked stream classes supported by HoloViews are:\\n\\n{listing}'.format(listing=listing))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, most of these events are about specific interactions with a plot such as the current axis ranges (the ``RangeX``, ``RangeY`` and ``RangeXY`` streams), the mouse pointer position (the ``PointerX``, ``PointerY`` and ``PointerXY`` streams), click or tap positions (``Tap``, ``DoubleTap``). Additionally there a streams to access plotting selections made using box- and lasso-select tools (``Selection1D``), the plot size (``PlotSize``) and the ``Bounds`` of a selection. \n", + "\n", + "Each of these linked Stream types has a corresponding backend specific ``Callback``, which defines which plot attributes or events to link the stream to and triggers events on the ``Stream`` in response to changes on the plot. Defining custom ``Stream`` and ``Callback`` types will be covered in the [Stream Callback Tutorial](Stream_Callbacks.ipynb)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Linking streams to plots\n", + "\n", + "At the end of the [Streams](./Streams.ipynb) tutorial we discovered that streams have ``subscribers``, which allow defining user defined callbacks on events, but also allows HoloViews to install subscribers that allow plots to respond to Stream updates. Linked streams add another concept on top of ``subscribers``, namely the Stream ``source``.\n", + "\n", + "The source of a linked stream defines which plot element to receive events from. Any plot containing the ``source`` object will be attached to the corresponding linked stream and will send event values in response to the appropriate interactions.\n", + "\n", + "Let's start with a simple example. We will declare one of the linked Streams from above, the ``PointerXY`` stream. This stream sends the current mouse position in plot axes coordinates, which may be continuous or categorical. The first thing to note is that we haven't specified a ``source`` which means it uses the default value of ``None``." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pointer = streams.PointerXY()\n", + "print(pointer.source)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before continuing, we can check the stream parameters that are made available to user callbacks from a given stream instance by looking at its contents:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('The %s stream has contents %r' % (pointer, pointer.contents))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Automatic linking" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A stream instance is automatically linked to the first ``DynamicMap`` we pass it to, which we can confirm by inspecting the stream's ``source`` attribute after supplying it to a ``DynamicMap``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pointer_dmap = hv.DynamicMap(lambda x, y: hv.Points([(x, y)]), streams=[pointer])\n", + "print(pointer.source is pointer_dmap)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``DynamicMap`` we defined above simply defines returns a ``Points`` object composed of a single point that marks the current ``x`` and ``y`` position supplied by our ``PointerXY`` stream. The stream is linked whenever this ``DynamicMap`` object is displayed as it is the stream source:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pointer_dmap(style={\"Points\": dict(size=10)})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you hover over the plot canvas above you can see that the point tracks the current mouse position. We can also inspect the last cursor position by examining the stream contents:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pointer.contents" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the [Streams](Streams.ipynb) tutorial, we introduced an integral example that would work more intuitively with linked streams. 
Here it is again with the ``limit`` value controlled by the ``PointerX`` linked stream:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Area (color='#fff8dc' line_width=2) Curve (color='black') VLine (color='red')\n", + "xs = np.linspace(-3, 3, 400)\n", + "\n", + "def function(xs, time):\n", + " \"Some time varying function\"\n", + " return np.exp(np.sin(xs+np.pi/time))\n", + "\n", + "def integral(limit, time):\n", + " limit = -3 if limit is None else np.clip(limit,-3,3)\n", + " curve = hv.Curve((xs, function(xs, time)))[limit:]\n", + " area = hv.Area((xs, function(xs, time)))[:limit]\n", + " summed = area.dimension_values('y').sum() * 0.015 # Numeric approximation\n", + " return (area * curve * hv.VLine(limit) * hv.Text(limit + 0.8, 2.0, '%.2f' % summed))\n", + "\n", + "hv.DynamicMap(integral, streams=[streams.Stream.define('Time', time=1.0)(), \n", + " streams.PointerX().rename(x='limit')])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We only needed to import and use the ``PointerX`` stream and rename the ``x`` parameter that tracks the cursor position to 'limit' so that it maps to the corresponding argument. Otherwise, the example only required bokeh-specific style options to match the matplotlib example as closely as possible." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Explicit linking\n", + "\n", + "In the example above, we took advantage of the fact that a ``DynamicMap`` automatically becomes the stream source if a source isn't explicitly specified. If we want to link the stream instance to a different object we can specify our source explicitly. Here we will create a 2D ``Image`` of sine gratings, which is then declared as the ``source`` of the ``PointerXY`` stream. This pointer stream is then used to generate a single point that tracks the cursor when hovering over the image:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "xvals = np.linspace(0,4,202)\n", + "ys,xs = np.meshgrid(xvals, -xvals[::-1])\n", + "img = hv.Image(np.sin(((ys)**3)*xs))\n", + "\n", + "pointer = streams.PointerXY(x=0,y=0, source=img)\n", + "pointer_dmap = hv.DynamicMap(lambda x, y: hv.Points([(x, y)]), streams=[pointer])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now if we display a ``Layout`` consisting of the ``Image`` acting as the source together with the ``DynamicMap``, the point shown on the right tracks the cursor position when hovering over the image on the left:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img + pointer_dmap(style={\"Points\": dict(size=10)})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This will even work across different cells. If we use this particular stream instance in another ``DynamicMap`` and display it, this new visualization will also be supplied with the cursor position when hovering over the image. \n", + "\n", + "To illustrate this, we will now use the pointer ``x`` and ``y`` position to generate cross-sections of the image at the cursor position on the ``Image``, making use of the ``Image.sample`` method. 
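Sampling a single coordinate returns a ``Curve``, as in this minimal sketch (using the ``img`` defined above):\n", + "\n", + "```python\n", + "row = img.sample(y=0)  # hv.Curve of the image values along the row nearest y=0\n", + "col = img.sample(x=0)  # hv.Curve of the image values along the column nearest x=0\n", + "```\n", + "\n", + "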
Note the use of ``np.clip`` to make sure the cross-section is well defined when the cursor goes out of bounds:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Curve {+framewise}\n", + "hv.DynamicMap(lambda x, y: img.sample(y=np.clip(y,-.49,.49)), streams=[pointer]) +\\\n", + "hv.DynamicMap(lambda x, y: img.sample(x=np.clip(x,-.49,.49)), streams=[pointer])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now when you hover over the ``Image`` above, you will see the cross-sections update while the point position to the right of the ``Image`` simultaneously updates." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Unlinking objects" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Sometimes we just want to display an object designated as a source without linking it to the stream. If the object is not a ``DynamicMap``, like the ``Image`` we designated as a ``source`` above, we can make a copy of the object using the ``clone`` method. We can do the same with a ``DynamicMap``, though we just need to supply ``link_inputs=False`` as an extra argument.\n", + "\n", + "Here we will create a ``DynamicMap`` that draws a cross-hair at the cursor position:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pointer = streams.PointerXY(x=0, y=0)\n", + "cross_dmap = hv.DynamicMap(lambda x, y: (hv.VLine(x) * hv.HLine(y)), streams=[pointer])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we will add two copies of the ``cross_dmap`` into a Layout, but the subplot on the right will not link the inputs. Try hovering over the two subplots and observe what happens:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "cross_dmap + cross_dmap.clone(link_inputs=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice how hovering over the left plot updates the crosshair position on both subplots, while hovering over the right subplot has no effect." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Transient linked streams" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the basic [Streams](Streams.ipynb) tutorial we saw that stream parameters can be updated and those values are then passed to the callback. This model works well for many different types of streams that have well-defined values at all times.\n", + "\n", + "This approach is not suitable for certain events, which only have a well-defined value at a particular point in time. For instance, when you hover your mouse over a plot, the hover position always has a well-defined value but the click position is only defined when a click occurs (if it occurs).\n", + "\n", + "The latter case is what is called a 'transient' stream. These streams are supplied new values only when they occur and fall back to a default value at all other times. This default value is typically ``None`` to indicate that the event is not occurring and therefore has no data.\n", + "\n", + "\n", + "Transient streams are particularly useful when you are subscribed to multiple streams, some of which are only occasionally triggered. 
Good examples are the ``Tap`` and ``DoubleTap`` streams; while you sometimes just want to know the last tapped position, we can only tell the two events apart if their values are ``None`` when not active. \n", + "\n", + "We'll start by declaring a ``Tap`` and a ``DoubleTap`` stream as ``transient``. Since both streams supply 'x' and 'y' parameters, we will rename the ``DoubleTap`` parameters to 'x2' and 'y2'." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "tap = streams.Tap(transient=True)\n", + "double_tap = streams.DoubleTap(rename={'x': 'x2', 'y': 'y2'}, transient=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next we define a list of taps we can append to, along with a function that accumulates the tap and double-tap coordinates together with the number of taps, returning a ``Points`` Element of the tap positions." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "taps = []\n", + "\n", + "def record_taps(x, y, x2, y2):\n", + " if None not in [x,y]:\n", + " taps.append((x, y, 1))\n", + " elif None not in [x2, y2]:\n", + " taps.append((x2, y2, 2))\n", + " return hv.Points(taps, vdims=['Taps'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally we can create a ``DynamicMap`` from our callback and attach the streams. We also apply some styling so the points are colored depending on the number of taps." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [color_index='Taps' tools=['hover']] (size=10 cmap='Set1')\n", + "hv.DynamicMap(record_taps, streams=[tap, double_tap])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now try single- and double-tapping within the plot area; each time you tap, a new point is appended to the list and displayed. Single taps show up in red and double taps show up in grey. We can also inspect the list of taps directly:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "taps" + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/14-Large_Data.ipynb b/examples/user_guide/14-Large_Data.ipynb new file mode 100644 index 0000000000..0744b75df8 --- /dev/null +++ b/examples/user_guide/14-Large_Data.ipynb @@ -0,0 +1,368 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Working with large data using datashader" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The various plotting backends supported by HoloViews (such as Matplotlib and Bokeh) each have limitations on the amount of data that is practical to work with, for a variety of reasons. For instance, Bokeh mirrors your data directly into an HTML page viewable in your browser, which can cause problems when data sizes approach the limited memory available for each web page in your browser.\n", + "\n", + "Luckily, a visualization of even the largest dataset will be constrained by the resolution of your display device, and so one approach to handling such data is to pre-render or rasterize the data into a fixed-size array or image before sending it to the backend. 
The [Datashader package](https://github.com/bokeh/datashader) provides a high-performance big-data rasterization pipeline that works seamlessly with HoloViews to support datasets that are orders of magnitude larger than those supported natively by the plotting backends." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "import datashader as ds\n", + "from holoviews.operation.datashader import aggregate, shade, datashade, dynspread\n", + "from holoviews.operation import decimate\n", + "hv.extension('bokeh')\n", + "decimate.max_samples=1000\n", + "dynspread.max_px=20\n", + "dynspread.threshold=0.5\n", + "\n", + "def random_walk(n, f=5000):\n", + " \"\"\"Random walk in a 2D space, smoothed with a filter of length f\"\"\"\n", + " xs = np.convolve(np.random.normal(0, 0.1, size=n), np.ones(f)/f).cumsum()\n", + " ys = np.convolve(np.random.normal(0, 0.1, size=n), np.ones(f)/f).cumsum()\n", + " xs += 0.1*np.sin(0.1*np.array(range(n-1+f))) # add wobble on x axis\n", + " xs += np.random.normal(0, 0.005, size=n-1+f) # add measurement noise\n", + " ys += np.random.normal(0, 0.005, size=n-1+f)\n", + " return np.column_stack([xs, ys])\n", + "\n", + "def random_cov():\n", + " \"\"\"Random covariance for use in generating 2D Gaussian distributions\"\"\"\n", + " A = np.random.randn(2,2)\n", + " return np.dot(A, A.T)\n", + "\n", + "def time_series(T = 1, N = 100, mu = 0.1, sigma = 0.1, S0 = 20): \n", + " \"\"\"Parameterized noisy time series\"\"\"\n", + " dt = float(T)/N\n", + " t = np.linspace(0, T, N)\n", + " W = np.random.standard_normal(size = N) \n", + " W = np.cumsum(W)*np.sqrt(dt) # standard brownian motion\n", + " X = (mu-0.5*sigma**2)*t + sigma*W \n", + " S = S0*np.exp(X) # geometric brownian motion\n", + " return S" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Supported ``Elements``\n", + "\n", + "As HoloViews elements are fundamentally data containers, not visualizations you can very quickly declare elements such as ``Points`` or ``Path`` containing datasets that may be as large as the available memory. For instance, this means you can immediately specify a datastructure that you can work with until you try to visualize it directly with either the matplotlib or bokeh plotting extensions as the rendering process may now be prohibitively expensive.\n", + "\n", + "Let's start with a simple example we can visualize as normal:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "np.random.seed(1)\n", + "points = hv.Points(np.random.multivariate_normal((0,0), [[0.1, 0.1], [0.1, 1.0]], (1000,)),label=\"Points\")\n", + "paths = hv.Path([random_walk(2000,30)], label=\"Paths\")\n", + "\n", + "points + paths" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These browser-based plots are fully interactive, as you can see if you select the Wheel Zoom or Box Zoom tools and use your scroll wheel or click and drag. \n", + "\n", + "Because all of the data in these plots gets transferred directly into the web browser, the interactive functionality will be available even on a static export of this figure as a web page. 
Note that even though the visualization above is not computationally expensive, even with just 1000 points as in the scatterplot above, the plot already suffers from [overplotting](https://anaconda.org/jbednar/plotting_pitfalls), with later points obscuring previously plotted points. With much larger datasets, these issues will quickly make it impossible to see the true structure of the data. \n", + "\n", + "\n", + "# Datashader operations\n", + "\n", + "If we tried to visualize the same two elements below which are just larger versions of the same data above, the plots would be nearly unusable even if the browser did not crash:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "np.random.seed(1)\n", + "points = hv.Points(np.random.multivariate_normal((0,0), [[0.1, 0.1], [0.1, 1.0]], (1000000,)),label=\"Points\")\n", + "paths = hv.Path([0.15*random_walk(100000) for i in range(10)],label=\"Paths\")\n", + "\n", + "#points + paths ## Danger! Browsers can't handle 1 million points!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Luckily, because elements are just containers for data and associated metadata, not plots, HoloViews can generate entirely different types of visualizations from the same data structure when appropriate. For instance, in the plot on the left below you can see the result of applying a `decimate()` operation acting on the `points` object, which will automatically downsample this million-point dataset to at most 1000 points at any time as you zoom in or out:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "decimate(points) + datashade(points) + datashade(paths)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Decimating a plot in this way can be useful, but it discards most of the data, yet still suffers from overplotting. If you have Datashader installed, you can instead use the `datashade()` operation to create a dynamic Datashader-based Bokeh plot. (Here `datashade()` is just a convenient shortcut for the two main steps using in datashader, i.e., `shade(aggregate())`, which can also be invoked separately.) The middle plot above shows the result of using `datashade()` to create a dynamic Datashader-based plot out of an Element with arbitrarily large data. In the Datashader version, a new image is regenerated automatically on every zoom or pan event, revealing all the data available at that zoom level and avoiding issues with overplotting.\n", + "\n", + "These two Datashader-based plots are similar to the native Bokeh plots above, but instead of making a static Bokeh plot that embeds points or line segments directly into the browser, HoloViews sets up a Bokeh plot with dynamic callbacks that render the data as an RGB image using Datashader instead. The dynamic re-rendering provides an interactive user experience even though the data itself is never provided directly into the browser. Of course, because the full data is not in the browser, a static export of this page (e.g. on anaconda.org) will only show the initially rendered version, and will not update with new images when zooming as it will when there is a live Python process available.\n", + "\n", + "Though you can no longer have a completely interactive exported file, with the Datashader version on a live server you can now change 1000000 to 10000000 or more to see how well your machine will handle larger datasets. 
It will get a bit slower, but if you have enough memory, it should still be very usable, and should never crash your browser as transferring the whole dataset into your browser would. If you don't have enough memory, you can instead set up a [Dask](http://dask.pydata.org) dataframe as shown in other Datashader examples, which will provide out of core and/or distributed processing to handle even the largest datasets." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Spreading\n", + "\n", + "\n", + "\n", + "The Datashader examples above treat points and lines as infinitesimal in width, such that a given point or small bit of line segment appears in at most one pixel. This approach ensures that the overall distribution of the points will be mathematically well founded -- each pixel will scale in value directly by the number of points that fall into it, or by the lines that cross it.\n", + "\n", + "However, many monitors are sufficiently high resolution that the resulting point or line can be difficult to see---a single pixel may not actually be visible on its own, and the color of it is likely to be very difficult to make out. To compensate for this issue, HoloViews provides access to Datashader's image-based \"spreading\", which makes isolated pixels \"spread\" into adjacent ones for visibility. Because the amount of spreading that's useful depends on how close the datapoints are to each other on screen, the most useful such function is `dynspread`, which spreads up to a maximum sized as long as it does not exceed a specified fraction of adjacency between pixels. You can compare the results in the two plots below after zooming in:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "datashade(points) + dynspread(datashade(points))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Both plots show the same data, and look identical when zoomed out, but when zoomed in enough you should be able to see the individual data points on the right while the ones on the left are barely visible. The dynspread parameters typically need some hand tuning, as the only purpose of such spreading is to make things visible on a particular monitor for a particular observer; the underlying mathematical operations in Datashader do not normally need parameters to be adjusted.\n", + "\n", + "The same operation works similarly for line segments:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "datashade(paths) + dynspread(datashade(paths))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Multidimensional plots\n", + "\n", + "The above plots show two dimensions of data plotted along *x* and *y*, but Datashader operations can be used with additional dimensions as well. For instance, an extra dimension (here called `k`), can be treated as a category label and used to colorize the points or lines. 
Compared to a standard scatterplot that would suffer from overplotting, here the result will be merged mathematically by Datashader, completely avoiding any overplotting issues except local ones due to spreading:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts RGB [width=400] {+axiswise}\n", + "\n", + "np.random.seed(3)\n", + "kdims=['d1','d2']\n", + "num_ks=8\n", + "\n", + "def rand_gauss2d():\n", + " return 100*np.random.multivariate_normal(np.random.randn(2), random_cov(), (100000,))\n", + "\n", + "gaussians = {i: hv.Points(rand_gauss2d(), kdims=kdims) for i in range(num_ks)}\n", + "lines = {i: hv.Curve(time_series(N=10000, S0=200+np.random.rand())) for i in range(num_ks)}\n", + "\n", + "gaussspread = dynspread(datashade(hv.NdOverlay(gaussians, kdims=['k']), aggregator=ds.count_cat('k')))\n", + "linespread = dynspread(datashade(hv.NdOverlay(lines, kdims=['k']), aggregator=ds.count_cat('k')))\n", + "\n", + "gaussspread + linespread" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Because Bokeh only ever sees an image, providing legends and keys has to be done separately, though we are working to make this process more seamless. For now, you can show a legend by adding a suitable collection of labeled points:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts RGB [width=600]\n", + "\n", + "from datashader.colors import Sets1to3 # default datashade() and shade() color cycle\n", + "color_key = list(enumerate(Sets1to3[0:num_ks]))\n", + "color_points = hv.NdOverlay({k: hv.Points([0,0], label=str(k))(style=dict(color=v)) for k, v in color_key})\n", + "\n", + "color_points * gaussspread" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `hv.NdOverlay` data structure merges all values along that dimension into the same image, (optionally) coloring each point or line to keep the values visibly different. If you prefer to keep the values completely separate so that every image contains only one value along the `k` dimension, you can put the data into an `hv.HoloMap`, which lets you index to choose a value for that dimension (or for multiple such dimensions). If you have not indexed into the dimension(s) to choose one location in the multidimensional space, HoloViews will automatically generate slider widgets to allow you to choose such a value interactively:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts RGB [width=300] {+axiswise}\n", + "\n", + "datashade(hv.HoloMap(gaussians, kdims=['k'])) + datashade(hv.HoloMap(lines, kdims=['k']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can thus very naturally explore even very large multidimensional datasets. Note that the static exported version (e.g. 
on anaconda.org) will only show a single frame, rather than the entire set of frames visible with a live Python server.\n", + "\n", + "\n", + "## Working with time series\n", + "\n", + "Although Datashader does not natively [support datetime(64)](https://github.com/bokeh/datashader/issues/270) types for its dimensions, we can convert to an integer representation and apply a custom formatter using HoloViews and Bokeh:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from bokeh.models import DatetimeTickFormatter\n", + "def apply_formatter(plot, element):\n", + " plot.handles['xaxis'].formatter = DatetimeTickFormatter()\n", + " \n", + "import pandas as pd\n", + "drange = pd.date_range(start=\"2014-01-01\", end=\"2016-01-01\", freq='1D') # or '1min'\n", + "dates = drange.values.astype('int64')/10**6 # Convert dates to ints" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts RGB [finalize_hooks=[apply_formatter] width=800]\n", + "\n", + "curve = hv.Curve((dates, time_series(N=len(dates), sigma = 1)))\n", + "datashade(curve, cmap=[\"blue\"])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloViews also supplies some operations that are useful in combination with Datashader timeseries. For instance, you can compute a rolling mean of the results and then show a subset of outlier points, which will then support hover, selection, and other interactive Bokeh features:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Overlay [finalize_hooks=[apply_formatter] width=800] \n", + "%%opts Scatter [tools=['hover', 'box_select']] (line_color=\"black\" fill_color=\"red\" size=10)\n", + "from holoviews.operation.timeseries import rolling, rolling_outlier_std\n", + "smoothed = rolling(curve, rolling_window=50)\n", + "outliers = rolling_outlier_std(curve, rolling_window=50, sigma=2)\n", + "\n", + "datashade(curve, cmap=[\"blue\"]) * dynspread(datashade(smoothed, cmap=[\"red\"]),max_px=1) * outliers" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "rolling.function" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the above plot will look blocky in a static export (such as on anaconda.org), because the exported version is generated without taking the size of the actual plot (using default height and width for Datashader) into account, whereas the live notebook automatically regenerates the plot to match the visible area on the page.\n", + "\n", + "# Hover info\n", + "\n", + "As you can see, converting the data to an image using Datashader makes it feasible to work with even very large datasets interactively. One unfortunate side effect is that the original datapoints and line segments can no longer be used to support \"tooltips\" or \"hover\" information directly; that data simply is not present at the browser level, and so the browser cannot unambiguously report information about any specific datapoint. Luckily, you can still provide hover information that reports properties of a subset of the data in a separate layer (as above), or you can provide information for a spatial region of the plot rather than for specific datapoints. 
For instance, in some small rectangle you can provide statistics such as the mean, count, standard deviation, etc:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts QuadMesh [tools=['hover']] (alpha=0 hover_alpha=0.2)\n", + "from holoviews.streams import RangeXY\n", + "\n", + "fixed_hover = datashade(points, width=400, height=400) * \\\n", + " hv.QuadMesh(aggregate(points, width=10, height=10, dynamic=False))\n", + "\n", + "dynamic_hover = datashade(points, width=400, height=400) * \\\n", + " hv.util.Dynamic(aggregate(points, width=10, height=10, streams=[RangeXY]), operation=hv.QuadMesh)\n", + "\n", + "fixed_hover + dynamic_hover" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the above examples, the plot on the left provides hover information at a fixed spatial scale, while the one on the right reports on an area that scales with the zoom level so that arbitrarily small regions of data space can be examined, which is generally more useful.\n", + "\n", + "As you can see, HoloViews exposes the functionality of Datashader, connecting this powerful rasterization pipeline with Bokeh output so you can visualize data of any size in a browser using just a few lines of code. Because Datashader-based HoloViews plots are just a few extra steps added on to regular HoloViews plots, they support all of the same features as regular HoloViews objects, and can freely be laid out, overlaid, and nested together with them." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/Configuring_HoloViews.ipynb b/examples/user_guide/Configuring_HoloViews.ipynb new file mode 100644 index 0000000000..679da3eadd --- /dev/null +++ b/examples/user_guide/Configuring_HoloViews.ipynb @@ -0,0 +1,116 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Configuring HoloViews\n", + "\n", + "HoloViews offers several types of configuration." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## ``hv.config`` settings\n", + "\n", + "HoloViews has a top-level ``hv.config`` object with flags to control various backwards-compatibility options:\n", + "\n", + "* ``style_17`` : Enables the styling used before HoloViews 1.7\n", + "* ``warn_options_call``: Warn when using the to-be-deprecated ``__call__`` syntax for specifying options, instead of the recommended ``.opts`` method.\n", + "\n", + "It is recommended you set ``warn_options_call`` to ``True`` in your holoviews.rc (see section below).\n", + "\n", + "To set the configuration, you can use:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "hv.config(style_17=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This should be done as early as possible, as some configuration options must be set before the corresponding plotting extensions are imported. 
For this reason, the following way of setting configuration options is recommended:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "hv.extension(config=dict(style_17=True))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Jupyter notebook configuration\n", + "\n", + "If the notebook server reports that the IOPub data rate has been exceeded when displaying large plots, you can raise the limit in jupyter_notebook_config.py:\n", + "\n", + "\n", + "```\n", + "c = get_config()\n", + "c.NotebookApp.iopub_data_rate_limit=100000000\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Improved tab-completion\n", + "\n", + "Both ``Layout`` and ``Overlay`` are designed around convenient tab-completion, with the expectation of upper-case names being listed first. In recent versions of IPython there has been a regression whereby the tab-completion is no longer case-sensitive. This can be fixed with:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv\n", + "hv.extension(case_sensitive_completion=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## The holoviews.rc file\n", + "\n", + "HoloViews supports an rc file that is searched for in the following places: ``\"~/.holoviews.rc\"``, ``\"~/.config/holoviews/holoviews.rc\"`` and in the parent directory of the top-level ``__init__.py`` file (useful for developers working out of the HoloViews git repo). A different location for the rc file can be specified via the ``HOLOVIEWSRC`` environment variable.\n", + "\n", + "This rc file is executed right after HoloViews imports. For instance, you can use an rc file containing:\n", + "\n", + "```\n", + "import holoviews as hv\n", + "hv.config(warn_options_call=True)\n", + "hv.extension.case_sensitive_completion=True\n", + "```\n", + "\n", + "This enables the case-sensitive tab-completion described in the previous section by default." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/user_guide/Continuous_Coordinates.ipynb b/examples/user_guide/Continuous_Coordinates.ipynb new file mode 100644 index 0000000000..71f4582dfd --- /dev/null +++ b/examples/user_guide/Continuous_Coordinates.ipynb @@ -0,0 +1,393 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Continuous Coordinates" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloViews is designed to work with scientific and engineering data, which is often in the form of discrete samples from an underlying continuous system. Imaging data is one clear example: measurements taken at a regular interval over a grid covering a two-dimensional area. Although the measurements are discrete, they approximate a continuous distribution, and HoloViews provides extensive support for working naturally with data of this type." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2D Continuous spaces" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this tutorial we will show the support provided by HoloViews for working with two-dimensional regularly sampled grid data like images, and then in subsequent sections discuss how HoloViews supports one-dimensional, higher-dimensional, and irregularly sampled data with continuous coordinates."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension()\n", + "\n", + "np.set_printoptions(precision=2, linewidth=80)\n", + "%opts HeatMap (cmap=\"hot\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First, let's consider: \n", + "\n", + "|||\n", + "|:--------------:|:----------------|\n", + "| **``f(x,y)``** | a simple function that accepts a location in a 2D plane specified in millimeters (mm) |\n", + "| **``region``** | a 1mm×1mm square region of this 2D plane, centered at the origin, and |\n", + "| **``coords``** | a function returning a square (s×s) grid of (x,y) coordinates regularly sampling the region in the given bounds, at the centers of each grid cell |\n", + "||||\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def f(x,y): \n", + " return x+y/3.1\n", + " \n", + "region=(-0.5,-0.5,0.5,0.5)\n", + "\n", + "def coords(bounds,samples):\n", + " l,b,r,t=bounds\n", + " hc=0.5/samples\n", + " return np.meshgrid(np.linspace(l+hc,r-hc,samples),\n", + " np.linspace(b+hc,t-hc,samples))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's build a Numpy array regularly sampling this function at a density of 5 samples per mm: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "f5=f(*coords(region,5))\n", + "f5" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can visualize this array (and thus the function ``f``) either using a ``Raster``, which uses the array's own integer-based coordinate system (which we will call \"array\" coordinates), or an ``Image``, which uses a continuous coordinate system, or as a ``HeatMap`` labelling each value explicitly:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "r5 = hv.Raster(f5, label=\"R5\")\n", + "i5 = hv.Image( f5, label=\"I5\", bounds=region)\n", + "h5 = hv.HeatMap([(x, y, f5[4-y,x]) for x in range(0,5) for y in range(0,5)], label=\"H5\")\n", + "r5+i5+h5" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Both the ``Raster`` and ``Image`` ``Element`` types accept the same input data and show the same arrangement of colors, but a visualization of the ``Raster`` type reveals the underlying raw array indexing, while the ``Image`` type has been labelled with the coordinate system from which we know the data has been sampled. All ``Image`` operations work with this continuous coordinate system instead, while the corresponding operations on a ``Raster`` use raw array indexing.\n", + "\n", + "For instance, all five of these indexing operations refer to the same element of the underlying Numpy array, i.e. 
the second item in the first row:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\"r5[0,1]=%0.2f r5.data[0,1]=%0.2f i5[-0.2,0.4]=%0.2f i5[-0.24,0.37]=%0.2f i5.data[0,1]=%0.2f\" % \\\n", + "(r5[1,0], r5.data[0,1], i5[-0.2,0.4], i5[-0.24,0.37], i5.data[0,1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that the ``Raster`` and the underlying ``.data`` elements both use Numpy's raw integer indexing, while the ``Image`` uses floating-point values that are then mapped onto the appropriate array element.\n", + "\n", + "This diagram should help show the relationships between the ``Raster`` coordinate system in the plot (which ranges from 0 at the top edge to 5 at the bottom), the underlying raw Numpy integer array indexes (labelling each dot in the **Array coordinates** figure), and the underlying **Continuous coordinates**:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "
Array coordinates
Continuous coordinates
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Importantly, although we used a 5×5 array in this example, we could substitute a much larger array with the same continuous coordinate system if we wished, without having to change any of our continuous indexes -- they will still point to the correct location in the continuous space:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "f10=f(*coords(region,10))\n", + "f10" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "r10 = hv.Raster(f10, label=\"R10\")\n", + "i10 = hv.Image(f10, label=\"I10\", bounds=region)\n", + "r10+i10" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The image now has higher resolution, but still visualizes the same underlying continuous function, now evaluated at 100 grid positions instead of 25:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "
Array coordinates
Continuous coordinates
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indexing the exact same coordinates as above now gets very different results:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\"r10[0,1]=%0.2f r10.data[0,1]=%0.2f i10[-0.2,0.4]=%0.2f i10[-0.24,0.37]=%0.2f i10.data[0,1]=%0.2f\" % \\\n", + "(r10[1,0], r10.data[0,1], i10[-0.2,0.4], i10[-0.24,0.37], i10.data[0,1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The array-based indexes used by ``Raster`` and the Numpy array in ``.data`` still return the second item in the first row of the array, but this array element now corresponds to location (-0.35,0.4) in the continuous function, and so the value is different. These indexes thus do *not* refer to the same location in continuous space as they did for the other array density, because raw Numpy-based indexing is *not* independent of density or resolution.\n", + "\n", + "Luckily, the two continuous coordinates still return very similar values to what they did before, since they always return the value of the array element corresponding to the closest location in continuous space. They now return elements just above and to the right, or just below and to the left, of the earlier location, because the array now has a higher resolution with elements centered at different locations. \n", + "\n", + "Indexing in continuous coordinates always returns the value closest to the requested value, given the available resolution. Note that in the case of coordinates truly on the boundary between array elements (as for -0.2,0.4), the bounds of each array cell are taken as right exclusive and upper exclusive, and so (-0.2,0.4) returns array index (3,0). " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Slicing in 2D" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to indexing (looking up a value), slicing (selecting a region) works as expected in continuous space (see the [Sampling Data](Sampling_Data) tutorial for more explanation). For instance, we can ask for a slice from (-0.275,-0.0125) to (0.025,0.2885) in continuous coordinates:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sl10=i10[-0.275:0.025,-0.0125:0.2885]\n", + "sl10.data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sl10" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This slice has selected those array elements whose centers are contained within the specified continuous space. To do this, the continuous coordinates are first converted by HoloViews into the floating-point range (5.125,2.250) (2.125,5.250) of array coordinates, and all those elements whose centers are in that range are selected:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "
Array coordinates
Continuous coordinates
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Slicing also works for ``Raster`` elements, but it results in an object that always reflects the contents of the underlying Numpy array (i.e., always with the upper left corner labelled 0,0):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "r5[0:3,1:3] + r5[0:3,1:2]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Hopefully these examples make it clear that if you are using data that is sampled from some underlying continuous system, you should use the continuous coordinates offered by HoloViews objects like ``Image`` so that your programs can be independent of the resolution or sampling density of that data, and so that your axes and indexes can be expressed naturally, using the actual units of the underlying continuous space. The data will still be stored in the same Numpy array, but now you can treat it consistently like the approximation to continuous values that it is." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 1D and nD Continuous coordinates" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All of the above examples use the common case for visualizations of a two-dimensional regularly gridded continuous space, which is implemented in ``holoviews.core.sheetcoords.SheetCoordinateSystem``. \n", + "\n", + "Similar continuous coordinates and slicing are also supported for ``Chart`` elements, such as ``Curve``s, but using a single index and allowing arbitrary irregular spacing, implemented in ``holoviews.elements.chart.Chart``. \n", + "\n", + "They also work the same for the n-dimensional coordinates and slicing supported by the [container](Containers) types ``HoloMap``, ``NdLayout``, and ``NdOverlay``, implemented in ``holoviews.core.dimension.Dimensioned`` and again allowing arbitrary irregular spacing. \n", + "\n", + "Together, these powerful continuous-coordinate indexing and slicing operations allow you to work naturally and simply in the full *n*-dimensional space that characterizes your data and parameter values." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The above examples focus on indexing and slicing, but as described in the [Sampling Data](Sampling_Data) tutorial there is another related operation supported for continuous spaces, called sampling. Sampling is similar to indexing and slicing, in that all of them can reduce the dimensionality of your data, but sampling is implemented in a general way that applies for any of the 1D, 2D, or nD datatypes. For instance, if we take our 10×10 array from above, we can ask for the value at a given location, which will come back as a ``Table``, i.e. 
a dictionary with one (key,value) pair:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "e10=i10.sample(x=-0.275, y=0.2885)\n", + "e10" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similarly, if we ask for the value at a given *y* location in continuous space, we will get a ``Curve``: the array row closest to that *y* value in the ``Image``'s 2D array, returned as an array of *x* values and the corresponding *z* values from the image:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "r10=i10.sample(y=0.2885)\n", + "r10" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The same sampling syntax can be used on HoloViews objects with any number of continuous-coordinate dimensions, in each case returning a HoloViews object of the correct dimensionality. This support for working in continuous spaces makes it much more natural to work with HoloViews objects than directly with the underlying raw Numpy arrays, but the raw data always remains available when needed." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/Deploying_Bokeh_Apps.ipynb b/examples/user_guide/Deploying_Bokeh_Apps.ipynb new file mode 100644 index 0000000000..ef8dc83812 --- /dev/null +++ b/examples/user_guide/Deploying_Bokeh_Apps.ipynb @@ -0,0 +1,643 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Deploying Bokeh Apps" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Purpose\n", + "\n", + "HoloViews is an incredibly convenient way of working interactively and exploratorily within a notebook or command-line context; however, once you have implemented a polished interactive dashboard or some other complex interactive visualization, you will often want to deploy it outside the notebook. The [bokeh server](http://bokeh.pydata.org/en/latest/docs/user_guide/server.html) provides a very convenient way of deploying HoloViews plots and interactive dashboards in a scalable and flexible manner. The bokeh server supports all the usual interactions that HoloViews lets you define and more, including:\n", + "\n", + "* responding to plot events and tool interactions via [Linked Streams](Linked_Streams.ipynb)\n", + "* generating and interacting with plots via the usual widgets that HoloViews supports for HoloMap and DynamicMap objects.\n", + "* using periodic and timeout events to drive plot updates\n", + "* combining HoloViews plots with custom bokeh plots to quickly write highly customized apps.\n", + "\n", + "## Overview\n", + "\n", + "In this guide we will cover how we can deploy a bokeh app from a HoloViews plot in a number of different ways:\n", + "\n", + "1. Inline from within the Jupyter notebook\n", + "\n", + "2. Starting a server interactively and opening it in a new browser window.\n", + "\n", + "3. From a standalone script file\n", + "\n", + "4. 
Combining HoloViews and Bokeh models to create a more customized app\n", + "\n", + "If you have read a bit about HoloViews you will know that HoloViews objects are not themselves plots, instead they contain sufficient data and metadata allowing them to be rendered automatically in a notebook context. In other words when a HoloViews object is evaluated a backend specific ``Renderer`` converts the HoloViews object into bokeh models, a matplotlib figure or a plotly graph. This intermediate representation is then rendered as an image or as HTML with associated Javascript, which is what ends up being displayed.\n", + "\n", + "## The workflow\n", + "\n", + "The most convenient way to work with HoloViews is to iteratively improve a visualization in the notebook. Once you have developed a visualization or dashboard that you would like to deploy you can use the ``BokehRenderer`` to save the visualization or deploy it as a bokeh server app. \n", + "\n", + "Here we will create a small interactive plot, using [Linked Streams](Linked_Streams.html), which mirrors the points selected using box- and lasso-select tools in a second plot and computes some statistics:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [tools=['box_select', 'lasso_select']]\n", + "\n", + "# Declare some points\n", + "points = hv.Points(np.random.randn(1000,2 ))\n", + "\n", + "# Declare points as source of selection stream\n", + "selection = hv.streams.Selection1D(source=points)\n", + "\n", + "# Write function that uses the selection indices to slice points and compute stats\n", + "def selected_info(index):\n", + " arr = points.array()[index]\n", + " if index:\n", + " label = 'Mean x, y: %.3f, %.3f' % tuple(arr.mean(axis=0))\n", + " else:\n", + " label = 'No selection'\n", + " return points.clone(arr, label=label)(style=dict(color='red'))\n", + "\n", + "# Combine points and DynamicMap\n", + "layout = points + hv.DynamicMap(selected_info, streams=[selection])\n", + "layout" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Working with the BokehRenderer\n", + "\n", + "When working with bokeh server or wanting to manipulate a backend specific plot object you will have to use a HoloViews ``Renderer`` directly to convert the HoloViews object into the backend specific representation. Therefore we will start by getting a hold of a ``BokehRenderer``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "renderer = hv.renderer('bokeh')\n", + "print(renderer)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "```python\n", + "BokehRenderer()\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All ``Renderer`` classes in HoloViews are so called ParameterizedFunctions, they provide both classmethods and instance methods to render an object. 
You can easily create a new ``Renderer`` instance using the ``.instance`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "renderer = renderer.instance(mode='server')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Renderers can also have different modes, in this case we will instantiate the renderer in ``'server'`` mode, which tells the Renderer to render the HoloViews object to a format that can easily be deployed as a server app. Before going into more detail about deploying server apps we will quickly remind ourselves how the renderer turns HoloViews objects into bokeh models." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Figures" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The BokehRenderer converts the HoloViews object to a HoloViews ``Plot``, which holds the bokeh models that will be rendered to screen. As a very simple example we can convert a HoloViews ``Image`` to a HoloViews plot:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hvplot = renderer.get_plot(layout)\n", + "print(hvplot)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "```\n", + "\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the ``state`` attribute on the HoloViews plot we can access the bokeh ``Column`` model, which we can then work with directly." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hvplot.state" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Column**(\tid = '5a8b7949-decd-4a96-b1f8-8f77ec90e5bf', …)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the background this is how HoloViews converts any HoloViews object into bokeh models, which can then be converted to embeddable or standalone HTML and be rendered in the browser. This conversion is usually done in the background using the ``figure_data`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "html = renderer.figure_data(hvplot)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Bokeh Documents" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In bokeh the [``Document``](http://bokeh.pydata.org/en/latest/docs/reference/document.html) is the basic unit at which Bokeh models (such as plots, layouts and widgets) are held and serialized. The serialized JSON representation is then sent to BokehJS on the client side browser. When in ``'server'`` mode the BokehRenderer will automatically return a server Document:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "renderer(layout)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "```\n", + "(,\n", + " {'file-ext': 'html', 'mime_type': u'text/html'})\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also easily use the ``server_doc`` method to get a bokeh ``Document``, which does not require you to make an instance in 'server' mode." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "doc = renderer.server_doc(layout)\n", + "doc.title = 'HoloViews App'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Deploying with ``bokeh serve``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Deployment from a script with bokeh serve is one of the most common ways to deploy a bokeh app. Any ``.py`` or ``.ipynb`` file that attaches a plot to bokeh's ``curdoc`` can be deployed using ``bokeh serve``. The easiest way to do this is using the ``BokehRenderer.server_doc`` method, which accepts any HoloViews object, generates the appropriate bokeh models and then attaches them to ``curdoc``. See below for a full standalone script:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "```python\n", + "import numpy as np\n", + "import holoviews as hv\n", + "import holoviews.plotting.bokeh\n", + "\n", + "renderer = hv.renderer('bokeh')\n", + "\n", + "points = hv.Points(np.random.randn(1000,2 ))(plot=dict(tools=['box_select', 'lasso_select']))\n", + "selection = hv.streams.Selection1D(source=points)\n", + "\n", + "def selected_info(index):\n", + " arr = points.array()[index]\n", + " if index:\n", + " label = 'Mean x, y: %.3f, %.3f' % tuple(arr.mean(axis=0))\n", + " else:\n", + " label = 'No selection'\n", + " return points.clone(arr, label=label)(style=dict(color='red'))\n", + "\n", + "layout = points + hv.DynamicMap(selected_info, streams=[selection])\n", + "\n", + "doc = renderer.server_doc(layout)\n", + "doc.title = 'HoloViews App'\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In just a few steps, i.e. by attaching our plot to a Document with ``renderer.server_doc``, we have gone from an interactive plot, which we can iteratively refine in the notebook, to a deployable bokeh app. Note that we can also deploy an app directly from a notebook. By adding ``BokehRenderer.server_doc(holoviews_object)`` to the end of the notebook any regular ``.ipynb`` file can be made into a valid bokeh app, which can be served with ``bokeh serve example.ipynb``.\n", + "\n", + "In addition to starting a server from a script we can also start up a server interactively, so let's do a quick deep dive into bokeh ``Application`` and ``Server`` objects and how we can work with them from within HoloViews." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Bokeh Applications and Server" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A bokeh ``Application`` encapsulates a Document and allows it to be deployed on a bokeh server. The ``BokehRenderer.app`` method provides an easy way to create an ``Application`` and either display it immediately in a notebook or manually include it in a server app.\n", + "\n", + "To let us try this out we'll define a slightly simpler plot to deploy as a server app. We'll define a ``DynamicMap`` of a sine ``Curve`` varying by frequency, phase and amplitude."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def sine(frequency, phase, amplitude):\n", + " xs = np.linspace(0, np.pi*4)\n", + " return hv.Curve((xs, np.sin(frequency*xs+phase)*amplitude))(plot=dict(width=800))\n", + "\n", + "ranges = dict(frequency=(1, 5), phase=(-np.pi, np.pi), amplitude=(-2, 2), y=(-2, 2))\n", + "dmap = hv.DynamicMap(sine, kdims=['frequency', 'phase', 'amplitude']).redim.range(**ranges)\n", + "\n", + "app = renderer.app(dmap)\n", + "print(app)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "```\n", + "\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once we have a bokeh Application we can manually create a ``Server`` instance to deploy it. To start a ``Server`` instance we simply define a mapping between the URL paths and apps that we want to deploy. Additionally we define a port (defining ``port=0`` will use any open port), and the ``IOLoop``. To get an ``IOLoop`` we simply use ``IOLoop.current()``, which will already be running if working from within a notebook but will give you a new ``IOLoop`` outside a notebook context." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from tornado.ioloop import IOLoop\n", + "from bokeh.server.server import Server\n", + "\n", + "loop = IOLoop.current()\n", + "server = Server({'/': app}, port=0, loop=loop)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next we can define a callback on the IOLoop that will open the server app in a new browser window and actually start the app (and if outside the notebook the IOLoop):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def show_callback():\n", + " server.show('/')\n", + "loop.add_callback(show_callback)\n", + "server.start()\n", + "# Outside the notebook ioloop needs to be started\n", + "# loop.start() " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After running the cell above you should have noticed a new browser window popping up displaying our plot. 
Once you are done playing with it you can stop it with:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "server.stop()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``BokehRenderer.app`` method allows us to do the same thing automatically (but less flexibly) using the ``show=True`` and ``new_window=True`` arguments:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "server = renderer.app(dmap, show=True, new_window=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We will once again stop this Server before continuing:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "server.stop()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Inlining apps in the notebook" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Instead of displaying our app in a new browser window and manually creating a ``Server`` instance, we can also display an app inline in the notebook simply by supplying ``show=True`` (without ``new_window=True``) to the ``BokehRenderer.app`` method. The server app will be killed whenever you rerun or delete the cell that contains the output. Additionally, if your Jupyter Notebook server is not running on the default address or port (``localhost:8888``), supply the websocket origin, which should match the first part of the URL of your notebook:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "renderer.app(dmap, show=True, websocket_origin='localhost:8888')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Periodic callbacks\n", + "\n", + "One of the most important features of deploying apps is the ability to attach asynchronous, periodic callbacks, which update the plot. The simplest way of achieving this is to attach a ``Counter`` stream to the plot; the counter is incremented on each callback. As a simple demo we'll compute a phase offset from the counter value, animating the sine wave:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def sine(counter):\n", + "    phase = counter*0.1%np.pi*2\n", + "    xs = np.linspace(0, np.pi*4)\n", + "    return hv.Curve((xs, np.sin(xs+phase)))(plot=dict(width=800))\n", + "\n", + "dmap = hv.DynamicMap(sine, streams=[hv.streams.Counter()])\n", + "\n", + "app = renderer.app(dmap, show=True, websocket_origin='localhost:8888')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once we have created the app we can start a periodic callback with the ``periodic`` method on the ``DynamicMap``. The first argument to the method is the period (in seconds) and the second argument is the number of executions to trigger (we can set this value to ``None`` to set up an indefinite callback). As soon as we start this callback you should see the Curve above become animated.",
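+ "\n\n",
+ "Before starting the periodic callback it can be handy to check that the stream is wired up correctly. The counter can also be advanced manually; a small hedged sketch using the ``dmap`` defined above:\n",
+ "\n",
+ "```python\n",
+ "# Push a single counter increment through the stream; the DynamicMap\n",
+ "# should redraw exactly as it would on one periodic tick.\n",
+ "dmap.event(counter=1)\n",
+ "```"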
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dmap.periodic(0.1, 100)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Combining HoloViews and Bokeh Plots/Widgets" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "While HoloViews provides very convenient ways of creating an app, it is not as fully featured as bokeh itself. Therefore we often want to extend a HoloViews-based app with bokeh plots and widgets created directly using the bokeh API. Using the ``BokehRenderer`` we can easily convert a HoloViews object into a bokeh model, which we can combine with other bokeh models as desired.\n", + "\n", + "To see what this looks like we will use the sine example again, but this time connect a [Stream](Stream.ipynb) to a manually created bokeh slider widget and play button. To display this in the notebook we will reuse what we learned about creating a ``Server`` instance using a ``FunctionHandler``; you can of course run this in a script by calling the ``modify_doc`` function with the ``Document`` returned by the bokeh ``curdoc()`` function." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "\n", + "from bokeh.application.handlers import FunctionHandler\n", + "from bokeh.application import Application\n", + "from bokeh.io import show\n", + "from bokeh.layouts import layout\n", + "from bokeh.models import Slider, Button\n", + "\n", + "renderer = hv.renderer('bokeh').instance(mode='server')\n", + "\n", + "# Create the holoviews app again\n", + "def sine(phase):\n", + "    xs = np.linspace(0, np.pi*4)\n", + "    return hv.Curve((xs, np.sin(xs+phase)))(plot=dict(width=800))\n", + "\n", + "stream = hv.streams.Stream.define('Phase', phase=0.)()\n", + "dmap = hv.DynamicMap(sine, streams=[stream])\n", + "\n", + "# Define valid function for FunctionHandler;\n", + "# when deploying as a script, simply attach to curdoc\n", + "def modify_doc(doc):\n", + "    # Create HoloViews plot and attach the document\n", + "    hvplot = renderer.get_plot(dmap, doc)\n", + "\n", + "    # Create a slider and play button\n", + "    def animate_update():\n", + "        phase = slider.value + 0.2\n", + "        if phase > end:\n", + "            phase = start\n", + "        slider.value = phase\n", + "\n", + "    def slider_update(attrname, old, new):\n", + "        # Notify the HoloViews stream of the slider update\n", + "        stream.event(phase=new)\n", + "\n", + "    start, end = 0, np.pi*2\n", + "    slider = Slider(start=start, end=end, value=start, step=0.2, title=\"Phase\")\n", + "    slider.on_change('value', slider_update)\n", + "\n", + "    def animate():\n", + "        if button.label == '► Play':\n", + "            button.label = '❚❚ Pause'\n", + "            doc.add_periodic_callback(animate_update, 50)\n", + "        else:\n", + "            button.label = '► Play'\n", + "            doc.remove_periodic_callback(animate_update)\n", + "    button = Button(label='► Play', width=60)\n", + "    button.on_click(animate)\n", + "\n", + "    # Combine the holoviews plot and widgets in a layout\n", + "    plot = layout([\n", + "        [hvplot.state],\n", + "        [slider, button]], sizing_mode='fixed')\n", + "\n", + "    doc.add_root(plot)\n", + "    return doc\n", + "\n", + "# To display in the notebook\n", + "handler = FunctionHandler(modify_doc)\n", + "app = Application(handler)\n", + "show(app, notebook_url='localhost:8888')\n", + "\n", + "# To display in a script\n", + "# doc = modify_doc(curdoc())" + ] + }, + { + "cell_type": 
"markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you had trouble following the last example, you will noticed how much verbose things can get when we drop down to the bokeh API. The ability to customize the plot comes at the cost of additional complexity. However when we need it the additional flexibility of composing plots manually is there." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/user_guide/Exporting_and_Archiving.ipynb b/examples/user_guide/Exporting_and_Archiving.ipynb new file mode 100644 index 0000000000..30bfcf5909 --- /dev/null +++ b/examples/user_guide/Exporting_and_Archiving.ipynb @@ -0,0 +1,490 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Exporting and Archiving" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Most of the other tutorials show you how to use HoloViews for interactive, exploratory visualization of your data, while the [Options](Options.ipynb) tutorial shows how to use HoloViews completely non-interactively, generating and rendering images directly to disk. In this notebook, we show how HoloViews works together with the IPython/Jupyter Notebook to establish a fully interactive yet *also* fully reproducible scientific or engineering workflow for generating reports or publications. That is, as you interactively explore your data and build visualizations in the notebook, you can automatically generate and export them as figures that will feed directly into your papers or web pages, along with records of how those figures were generated and even storing the actual data involved so that it can be re-analyzed later. \n", + "\n", + "## Reproducible research\n", + "\n", + "To understand why this capability is important, let's consider the process by which scientific results are typically generated and published without HoloViews. Scientists and engineers use a wide variety of data-analysis tools, ranging from GUI-based programs like Excel spreadsheets, mixed GUI/command-line programs like Matlab, or purely scriptable tools like matplotlib or bokeh. The process by which figures are created in any of these tools typically involves copying data from its original source, selecting it, transforming it, choosing portions of it to put into a figure, choosing the various plot options for a subfigure, combining different subfigures into a complete figure, generating a publishable figure file with the full figure, and then inserting that into a report or publication. \n", + "\n", + "If using GUI tools, often the final figure is the only record of that process, and even just a few weeks or months later a researcher will often be completely unable to say precisely how a given figure was generated. Moreover, this process needs to be repeated whenever new data is collected, which is an error-prone and time-consuming process. The lack of records is a serious problem for building on past work and revisiting the assumptions involved, which greatly slows progress both for individual researchers and for the field as a whole. Graphical environments for capturing and replaying a user's GUI-based workflow have been developed, but these have greatly restricted the process of exploration, because they only support a few of the many analyses required, and thus they have rarely been successful in practice. 
With GUI tools it is also very difficult to \"curate\" the sequence of steps involved, i.e., eliminating dead ends, speculative work, and unnecessary steps, with a goal of showing the clear path from incoming data to a final figure.\n", + "\n", + "In principle, using scriptable or command-line tools offers the promise of capturing the steps involved, in a form that can be curated. In practice, however, the situation is often no better than with GUI tools, because the data is typically taken through many manual steps that culminate in a published figure, and without a laboriously manually created record of what steps are involved, the provenance of a given figure remains unknown. Where reproducible workflows are created in this way, they tend to be \"after the fact\", as an explicit exercise to accompany a publication, and thus (a) they are rarely done, and (b) they are very difficult to do if any of the steps were not recorded originally. \n", + "\n", + "An IPython/Jupyter notebook helps significantly to make the scriptable-tools approach viable, by recording both code and the resulting output, and can thus in principle act as a record for establishing the full provenance of a figure. But because typical plotting libraries require so much plotting-specific code before any plot is visible, the notebook quickly becomes unreadable. To make notebooks readable, researchers then typically move the plotting code for a specific figure to some external file, which then drifts out of sync with the notebook so that the notebook no longer acts as a record of the link between the original data and the resulting figure. \n", + "\n", + "HoloViews provides the final missing piece in this approach, by allowing researchers to work directly with their data interactively in a notebook, using small amounts of code that focus on the data and analyses rather than plotting code, yet showing the results directly alongside the specification for generating them. This tutorial will describe how to use a Jupyter notebook with HoloViews to export your results in a way that preserves the information about how those results were generated, providing a clear chain of provenance and making reproducible research practical at last." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "from holoviews.operation import contours\n", + "hv.extension()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Exporting specific files" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "During interactive exploration in the IPython Notebook, your results are always visible within the notebook itself, but you can explicitly request that any IPython cell is also exported to an external file on disk:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%output filename=\"penguin_plot\" fig=\"png\" holomap=\"gif\"\n", + "penguins = hv.RGB.load_image('../datasets/penguins.png')\n", + "penguins" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This mechanism can be used to provide a clear link between the steps for generating the figure and the file on disk. 
You can now load the exported plot back into HoloViews, if you like, though the result would be a bit confusing due to the additional set of axes applied to the new plot:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.RGB.load_image('penguin_plot.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``fig=\"png\"`` part of the ``%%output`` magic above specified that the file should be saved in PNG format, which is useful for posting on web pages or editing in raster-based graphics programs. It also specified that if the object contained a ``HoloMap`` (which this particular one does not), it would be saved in GIF format, which supports animation. Because of the need for animation, objects containing a ``HoloMap`` are handled specially, as animation is not supported by the common PNG or SVG formats.\n", + "\n", + "For a publication, you will usually want to select SVG format, using ``fig=\"svg\"``, because this vector format preserves the full resolution of all text and drawing elements. SVG files can be used in some document preparation programs directly (e.g. [LibreOffice](http://www.libreoffice.org/)), and can easily be converted using e.g. [Inkscape](https://inkscape.org) to PDF for use with PDFLaTeX or to EMF for use with Microsoft Word. They can also be edited using Inkscape or other vector drawing programs to move graphical elements around, add arbitrary text, etc., if you need to make final tweaks before using the figures in a document. You can also embed them within other SVG figures in such a drawing program, e.g. by creating a larger figure as a template that automatically incorporates multiple SVG files you have exported separately." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Exporting notebooks" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``%%output`` magic is useful when you want specific plots saved into specific files. Often, however, a notebook will contain an entire suite of results contained in multiple different cells, and manually specifying these cells and their filenames is error-prone, with a high likelihood of accidentally creating multiple files with the same name or using different names in different notebooks for the same objects.\n", + "\n", + "To make the exporting process easier for large numbers of outputs, as well as more predictable, HoloViews also offers a powerful automatic notebook exporting facility, creating an archive of all your results. Automatic export is very useful in the common case of having a notebook that contains a series of figures to be used in a report or publication, particularly if you are repeatedly re-running the notebook as you finalize your results, and want the full set of current outputs to be available to an external document preparation system.",
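+ "\n\n",
+ "Before turning to the automatic archive, note that specific files can also be exported programmatically, without the ``%%output`` magic, by working with a renderer directly. A small hedged sketch using the public renderer API (the filename is arbitrary):\n",
+ "\n",
+ "```python\n",
+ "# Save a single object to SVG via the matplotlib renderer;\n",
+ "# the file extension is added automatically from the chosen format.\n",
+ "svg_renderer = hv.renderer('matplotlib').instance(fig='svg')\n",
+ "svg_renderer.save(penguins, 'penguin_plot_svg')\n",
+ "```"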
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To turn on automatic adding of your files to the export archive, run ``hv.archive.auto()``:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.auto()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This object's behavior can be customized extensively; try pressing shift-[tab] twice within the parentheses for a list of options, which are described more fully below.\n", + "\n", + "By default, the output will go into a directory with the same name as your notebook, and the names for each object will be generated from the groups and labels used by HoloViews. Objects that contain HoloMaps are not exported by default, since those are usually rendered as animations that are not suitable for inclusion in publications, but you can change it to ``.auto(holomap='gif')`` if you want those as well. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Adding files to an archive" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To see how the auto-exporting works, let's define a few HoloViews objects:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "penguins[:,:,'R'].relabel(\"Red\") + penguins[:,:,'G'].relabel(\"Green\") + penguins[:,:,'B'].relabel(\"Blue\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "penguins * hv.Arrow(0.15, 0.3, 'Penguin', '>')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Contours (linewidth=1.3) Image (cmap=\"gray\")\n", + "cs = contours(penguins[:,:,'R'], levels=[0.10,0.80])\n", + "cs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now list what has been captured, along with the names that have been generated:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.contents()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here each object has resulted in two files, one in SVG format and one in Python \"pickle\" format (which appears as a ``zip`` file with extension ``.hvz`` in the listing). We'll ignore the pickle files for now, focusing on the SVG images.\n", + "\n", + "The name generation code for these files is heavily customizable, but by default it consists of a list of dimension values and objects:\n", + "\n", + " ``{dimension},{dimension},...{group}-{label},{group}-{label},...``. \n", + "\n", + "The ``{dimension}`` shows what dimension values are included anywhere in this object, if it contains any high-level ``Dimensioned`` objects like ``HoloMap``, ``NdOverlay``, and ``GridLayout``. In the last SVG image in the contents list above, which is for the ``contours`` object, there is one dimension ``Levels``, and the name shows that dimension values included in this object range from 0.1 to 0.8 (as is visible in the contours specification above.) 
Of course, nearly all HoloViews objects have dimensions, such as ``x`` and ``y`` in this case, but those dimensions are not used in the filenames because they are explicitly shown in the plots; only the top-level dimensions are used (those that determine which plot this is, not those that are shown in the plot itself).\n", + "\n", + "The ``{group}-{label}`` information lists the names HoloViews uses for default titles and for attribute access for the various objects that make up a given displayed object. E.g. the first SVG image in the list is a ``Layout`` of the three given ``Image`` objects, and the second one is an ``Overlay`` of an ``RGB`` object and an ``Arrow`` object. This information usually helps distinguish one plot from another, because they will typically be plots of objects that have different labels. \n", + "\n", + "If the generated names are not unique, a numerical suffix will be added to make them unique. A maximum filename length is enforced, which can be set via ``hv.archive.max_filename``.\n", + "\n", + "If you prefer a fixed-width filename, you can use a hash for each name instead (or in addition), where ``:.8`` specifies how many characters to keep from the hash:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.filename_formatter=\"{SHA:.8}\"\n", + "cs" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.contents()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that the newest files added have the shorter, fixed-width format, though the names are no longer meaningful. If the ``filename_formatter`` had been set from the start, all filenames would have been of this type, which has both practical advantages (short names, all the same length) and disadvantages (no semantic clue about the contents)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Generated indexes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to the files that were added to the archive for each of the cell outputs above, the archive exporter also adds an ``index.html`` file with a static copy of the notebook, with each cell labelled with the filename used to save it. This HTML file acts as a definitive index to your results, showing how they were generated and where they were exported on disk. \n", + "\n", + "The exporter will also add a cleared, runnable copy of the notebook ``index.ipynb`` (with output deleted), so that you can later regenerate all of the output, with changes if necessary. \n", + "\n", + "The exported archive will thus be a complete set of your results, along with a record of how they were generated, plus a recipe for regenerating them -- i.e., fully reproducible research! This HTML file and .ipynb file can then be submitted as supplemental materials for a paper, allowing any reader to build on your results, or they can just be kept privately so that future collaborators can start where this research left off.",
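+ "\n\n",
+ "Both of the naming controls mentioned above (``max_filename`` and ``filename_formatter``) can also be configured up front, before any figures are generated. A small hedged sketch (the values here are arbitrary):\n",
+ "\n",
+ "```python\n",
+ "hv.archive.auto()\n",
+ "hv.archive.max_filename = 60                # truncate very long generated names\n",
+ "hv.archive.filename_formatter = \"{SHA:.8}\"  # fixed-width, hash-based names\n",
+ "```"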
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Adding your own data to the archive" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Of course, your results may depend on a lot of external packages, libraries, code files, and so on, which will not automatically be included or listed in the exported archive.\n", + "\n", + "Luckily, the archive support is very general, and you can add any object to it that you want to be exported along with your output. For instance, you can store arbitrary metadata of your choosing, such as version control information, here as a JSON-format text file: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import json\n", + "hv.archive.add(filename='metadata.json',\n", + "               data=json.dumps({'repository':'git@github.com:ioam/holoviews.git',\n", + "                                'commit':'437e8d69'}), info={'mime_type':'text/json'})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The new file can now be seen in the contents listing:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.contents()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can get a more direct list of filenames using the ``listing`` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "listing = hv.archive.listing()\n", + "listing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this way, you should be able to automatically generate output files, with customizable filenames, storing any data or metadata you like along with them so that you can keep track of all the important information for reproducing these results later." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Controlling the behavior of ``hv.archive``" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ``hv.archive`` object provides numerous parameters that can be changed. You can e.g.:\n", + "\n", + "- output the whole directory to a single compressed ZIP or tar archive file (e.g. ``hv.archive.set_param(pack=True, archive_format='zip')`` or ``archive_format='tar'``)\n", + "\n", + "- generate a new directory or archive every time the notebook is run (``hv.archive.uniq_name=True``); otherwise the old output directory is erased each time \n", + "\n", + "- choose your own name for the output directory or archive (e.g. ``hv.archive.export_name=\"{timestamp}\"``)\n", + "\n", + "- change the format of the optional timestamp (e.g. to retain snapshots hourly, ``hv.archive.set_param(export_name=\"{timestamp}\", timestamp_format=\"%Y_%m_%d-%H\")``)\n", + "\n", + "- select PNG output at a specified rendering resolution (e.g. ``hv.archive.exporters=[hv.Store.renderers['matplotlib'].instance(size=50, fig='png', dpi=144)]``)\n", + "\n", + "These options and any others listed above can all be set in the ``hv.archive.auto()`` call at the start, for convenience and to ensure that they apply to all of the files that are added.",
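+ "\n\n",
+ "For example, a single call combining several of the options above might look like the following hedged sketch (the specific values are arbitrary):\n",
+ "\n",
+ "```python\n",
+ "hv.archive.auto(export_name=\"{timestamp}\",       # name the output by timestamp\n",
+ "                timestamp_format=\"%Y_%m_%d-%H\",  # keep at most one snapshot per hour\n",
+ "                uniq_name=True)                  # new directory each run instead of erasing the old one\n",
+ "```"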
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Writing the archive to disk" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To actually write the files you have stored in the archive to disk, you need to call ``export()`` after any cell that might contain computation-intensive code. Usually it's best to do so as the last or nearly last cell in your notebook, though here we do it earlier because we wanted to show how to use the exported files." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.export()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Shortly after the ``export()`` command has been executed, the output should be available as a directory on disk, by default in the same directory as the notebook file, named with the name of the notebook: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.getcwd()\n", + "if os.path.exists(hv.archive.notebook_name):\n", + "    print('\\n'.join(sorted(os.listdir(hv.archive.notebook_name))))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For technical reasons to do with how the IPython Notebook interacts with JavaScript, if you use the IPython command ``Run all``, the ``hv.archive.export()`` command is not actually executed when the cell with that call is encountered during the run. Instead, the ``export()`` is queued until after the final cell in the notebook has been executed. This asynchronous execution has several awkward but not serious consequences:\n", + "\n", + "- It is not possible for the ``export()`` cell to show whether any errors were encountered during exporting, because these will not occur until after the notebook has completed processing. To see any errors, you can run ``hv.archive.last_export_status()`` separately, *after* the ``Run all`` has completed. E.g. just press shift-[Enter] in the following cell, which will tell you whether the previous export was successful.\n", + "\n", + "- If you use ``Run all``, the directory listing ``os.listdir()`` above will show the results from the *previous* time this notebook was run, since it executes before the export. Again, you can use shift-[Enter] to update the data once complete.\n", + "\n", + "- The ``Export name:`` in the output of ``hv.archive.export()`` will not always show the actual name of the directory or archive that will be created. In particular, it may say ``{notebook}``, which when saving will actually expand to the name of your IPython Notebook." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.archive.last_export_status()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Accessing your saved data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By default, HoloViews saves not only your rendered plots (PNG, SVG, etc.), but also the actual HoloViews objects that the plots visualize, which contain all your actual data. The objects are stored in compressed Python pickle files (``.hvz``), which are visible in the directory listings above but have been ignored until now. The plots are what you need for writing a document, but the raw data is a crucial record to keep as well. 
For instance, you now can load in the HoloViews object, and manipulate it just as you could when it was originally defined. E.g. we can re-load our ``Levels`` ``Overlay`` file, which has the contours overlaid on top of the image, and easily pull out the underlying ``Image`` object:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "from holoviews.core.io import Unpickler\n", + "c, a = None,None\n", + "hvz_file = [f for f in listing if f.endswith('hvz')][0]\n", + "path = os.path.join(hv.archive.notebook_name, hvz_file)\n", + "\n", + "if os.path.isfile(path):\n", + " print('Unpickling {filename}'.format(filename=hvz_file))\n", + " obj = Unpickler.load(open(path,\"rb\"))\n", + " print(obj)\n", + "else:\n", + " print('Could not find file {path}'.format(path=path))\n", + " print('Current directory is {cwd}'.format(cwd=os.getcwd()))\n", + " print('Containing files and directories: {listing}'.format(listing=os.listdir(os.getcwd())))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Given the ``Image``, you can also access the underlying array data, because HoloViews objects are simply containers for your data and associated metadata. This means that years from now, as long as you can still run HoloViews, you can now easily re-load and explore your data, plotting it entirely different ways or running different analyses, even if you no longer have any of the original code you used to generate the data. All you need is HoloViews, which is permanently archived on GitHub and is fully open source and thus should always remain available. Because the data is stored conveniently in the archive alongside the figure that was published, you can see immediately which file corresponds to the data underlying any given plot in your paper, and immediately start working with the data, rather than laboriously trying to reconstruct the data from a saved figure.\n", + "\n", + "If you do not want the pickle files, you can of course turn them off if you prefer, by changing ``hv.archive.auto()`` to:\n", + "\n", + "```python\n", + "hv.archive.auto(exporters=[hv.Store.renderers['matplotlib'].instance(holomap=None)])\n", + "```\n", + "\n", + "Here, the exporters list has been updated to include the usual default exporters *without* the `Pickler` exporter that would usually be included." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using HoloViews to do reproducible research" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The export options from HoloViews help you establish a feasible workflow for doing reproducible research: starting from interactive exploration, either export specific files with ``%%output``, or enable ``hv.archive.auto()``, which will store a copy of your notebook and its output ready for inclusion in a document but retaining the complete recipe for reproducing the results later. \n", + "\n", + "HoloViews also works very well with the [Lancet](http://ioam.github.io/lancet) tool for exploring large parameter spaces, and Lancet provides an interface to HoloViews that makes Lancet output directly available for use in HoloViews. Lancet, when used with IPython Notebook and HoloViews, makes it feasible to work with large numbers of computation-intensive processes that generate heterogeneous data that needs to be collated, analyzed, and visualized. 
For more background and a suggested workflow, see our [2013 paper on using Lancet](http://dx.doi.org/10.3389/fninf.2013.00044) with IPython Notebook. Because that paper was written before the release of HoloViews, it does not discuss how HoloViews helps in this process, but that aspect is covered in our [2015 paper on using HoloViews for reproducible research](http://conference.scipy.org/proceedings/scipy2015/pdfs/jean-luc_stevens.pdf)." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/examples/user_guide/Plotting_with_Bokeh.ipynb b/examples/user_guide/Plotting_with_Bokeh.ipynb new file mode 100644 index 0000000000..aa532d1957 --- /dev/null +++ b/examples/user_guide/Plotting_with_Bokeh.ipynb @@ -0,0 +1,504 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Plotting with Bokeh" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + " This tutorial contains a lot of bokeh plots, which may take a little while to load and render.\n", + "
\n", + "\n", + "One of the major design principles of HoloViews is that the declaration of data is completely independent from the plotting implementation. This means that the visualization of HoloViews data structures can be performed by different plotting backends. As part of the 1.4 release of HoloViews, a [Bokeh](http://bokeh.pydata.org) backend was added in addition to the default ``matplotlib`` backend. Bokeh provides a powerful platform to generate interactive plots using HTML5 canvas and WebGL, and is ideally suited towards interactive exploration of data.\n", + "\n", + "By combining the ease of generating interactive, high-dimensional visualizations with the interactive widgets and fast rendering provided by Bokeh, HoloViews becomes even more powerful.\n", + "\n", + "This tutorial will cover some basic options on how to style and change various plot attributes and explore some of the more advanced features like interactive tools, linked axes, and brushing." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As usual, the first thing we do is initialize the HoloViews notebook extension, but we now specify the backend specifically." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "import holoviews as hv" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We could instead leave the default backend as ``'matplotlib'``, and then switch only some specific cells to use bokeh using a cell magic:\n", + "\n", + "```python\n", + "%%output backend='bokeh'\n", + "obj\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Element Style options\n", + "\n", + "Most Bokeh Elements support a mixture of the following fill, line, and text style customization options:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from holoviews.plotting.bokeh.element import (line_properties, fill_properties,\n", + " text_properties)\n", + "print(\"\"\"\n", + " Line properties: %s\\n\n", + " Fill properties: %s\\n\n", + " Text properties: %s\n", + " \"\"\" % (line_properties, fill_properties, text_properties))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's an example of HoloViews Elements using a Bokeh backend, with bokeh's style options:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curve_opts = dict(line_width=10, line_color='indianred', line_dash='dotted', line_alpha=0.5)\n", + "point_opts = dict(fill_color='#00AA00', fill_alpha=0.5, line_width=1, line_color='black', size=5)\n", + "text_opts = dict(text_align='center', text_baseline='middle', text_color='gray', text_font='Arial')\n", + "xs = np.linspace(0, np.pi*4, 100)\n", + "data = (xs, np.sin(xs))\n", + "\n", + "(hv.Curve(data)(style=curve_opts) +\n", + " hv.Points(data)(style=point_opts) +\n", + " hv.Text(6, 0, 'Here is some text')(style=text_opts))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that because the first two plots use the same underlying data, they become linked, such that zooming or panning one of the plots makes the corresponding change on the other." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Sizing Elements" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Sizing and aspect of Elements in bokeh are always computed in absolute pixels. To change the size or aspect of an Element set the ``width`` and ``height`` plot options." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points.A [width=300 height=300] Points.B [width=600 height=300]\n", + "hv.Points(data, group='A') + hv.Points(data, group='B')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Controlling axes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Bokeh provides a variety of options to control the axes. Here we provide a quick overview of linked plots for the same data displayed differently by applying log axes, disabling axes, rotating ticks, specifying the number of ticks, and supplying an explicit list of ticks." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "points = hv.Points(np.exp(xs))\n", + "axes_opts = [('Plain', {}),\n", + "             ('Log', {'logy': True}),\n", + "             ('None', {'yaxis': None}),\n", + "             ('Rotate', {'xrotation': 90}),\n", + "             ('N Ticks', {'xticks': 3}),\n", + "             ('List Ticks', {'xticks': [0, 100, 300, 500]})]\n", + "\n", + "hv.Layout([points.relabel(group=group)(plot=opts)\n", + "           for group, opts in axes_opts]).cols(3).display('all')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Datetime axes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Both the matplotlib and the bokeh backends allow plotting datetime data, if you ensure the dates array is of a datetime dtype. Note also that the legends are interactive and can be toggled by clicking on their entries. Additionally, the style of an unselected curve can be controlled by setting the ``muted_alpha`` and ``muted_color`` style options." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Overlay [width=600 legend_position='top_left'] Curve (muted_alpha=0.5 muted_color='black')\n", + "try:\n", + "    import bokeh.sampledata.stocks\n", + "except:\n", + "    import bokeh.sampledata\n", + "    bokeh.sampledata.download()\n", + "\n", + "from bokeh.sampledata.stocks import GOOG, AAPL\n", + "goog_dates = np.array(GOOG['date'], dtype=np.datetime64)\n", + "aapl_dates = np.array(AAPL['date'], dtype=np.datetime64)\n", + "hv.Curve((goog_dates, GOOG['adj_close']), kdims=['Date'], vdims=['Stock Index'], label='Google') *\\\n", + "hv.Curve((aapl_dates, AAPL['adj_close']), kdims=['Date'], vdims=['Stock Index'], label='Apple')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Categorical axes\n", + "\n", + "A number of Elements also support categorical (i.e. string) dimension values; these include the ``HeatMap``, ``Points``, ``Scatter``, ``Curve``, ``ErrorBars`` and ``Text`` types.\n", + "\n", + "Here we create a set of points indexed by ascending alphabetical x- and y-coordinates, with values obtained by multiplying the integer indexes of the two coordinates. We then overlay a ``HeatMap`` of the points with the points themselves, enabling the hover tool for both and scaling the point size by the 'z' values."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [size_index='z' tools=['hover']] HeatMap [toolbar='above' tools=['hover']]\n", + "points = hv.Points([(chr(i+65), chr(j+65), i*j) for i in range(10) for j in range(10)], vdims=['z'])\n", + "hv.HeatMap(points) * points" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the example above both axes are categorical because a HeatMap by definition represents 2D categorical coordinates (unlike Image and Raster types). Other Element types will automatically infer a categorical dimension if the coordinates along that dimension include string types.\n", + "\n", + "Here we will generate random samples indexed by categories from 'A' to 'E' using the Scatter Element and overlay them.\n", + "Secondly we compute the mean and standard deviation for each category and finally we overlay these two elements with a curve representing the mean value and a text element specifying the global mean. All these Elements respect the categorical index, providing us a view of the distribution of values in each category:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Overlay [show_legend=False height=400 width=600] ErrorBars (line_width=5) Scatter(alpha=0.2 size=6)\n", + "\n", + "overlay = hv.NdOverlay({group: hv.Scatter(([group]*100, np.random.randn(100)*(i+1)+i))\n", + " for i, group in enumerate(['A', 'B', 'C', 'D', 'E'])})\n", + "\n", + "errorbars = hv.ErrorBars([(k, el.reduce(function=np.mean), el.reduce(function=np.std))\n", + " for k, el in overlay.items()])\n", + "\n", + "global_mean = hv.Text('A', 12, 'Global mean: %.3f' % overlay.dimension_values('y').mean())\n", + "\n", + "errorbars * overlay * hv.Curve(errorbars) * global_mean" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Containers" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Tabs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using bokeh, both ``(Nd)Overlay`` and ``(Nd)Layout`` types may be displayed inside a ``tabs`` widget. This may be enabled via a plot option ``tabs``, and may even be nested inside a Layout." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Overlay [tabs=True] Image [width=400 height=400]\n", + "x,y = np.mgrid[-50:51, -50:51] * 0.1\n", + "\n", + "img = hv.Image(np.sin(x**2+y**2), bounds=(-1,-1,1,1))\n", + "img.relabel('Low') * img.clone(img.data*2).relabel('High')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Another reason to use ``tabs`` is that some Layout combinations may not be able to be displayed directly using HoloViews. For example, it is not currently possible to display a ``GridSpace`` as part of a ``Layout`` in any backend, and this combination will automatically switch to a ``tab`` representation for the bokeh backend." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Marginals\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The Bokeh backend also supports marginal plots to generate adjoined plots. The most convenient way to build an AdjointLayout is with the ``.hist()`` method." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "points = hv.Points(np.random.randn(500,2))\n", + "points.hist(num_bins=51, dimension=['x','y'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When the histogram represents a quantity that is mapped to a value dimension with a corresponding colormap, it will automatically share the colormap, making it useful as a colorbar for that dimension as well as a histogram." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "img.hist(num_bins=100, dimension=['x', 'y'], weight_dimension='z', mean_weighted=True) +\\\n", + "img.hist(dimension='z')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## HoloMaps" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "HoloMaps work in bokeh just as in other backends." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hmap = hv.HoloMap({phase: img.clone(np.sin(x**2+y**2+phase))\n", + " for phase in np.linspace(0, np.pi*2, 6)}, kdims=['Phase'])\n", + "hmap.hist(num_bins=100, dimension=['x', 'y'], weight_dimension='z', mean_weighted=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tools: Interactive widgets" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Hover tools" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some Elements allow revealing additional data by hovering over the data. To enable the hover tool, simply supply ``'hover'`` as a list to the ``tools`` plot option." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [tools=['hover']] (size=5) HeatMap [tools=['hover']] Histogram [tools=['hover']] Layout [shared_axes=False]\n", + "error = np.random.rand(100, 3)\n", + "heatmap_data = {(chr(65+i),chr(97+j)):i*j for i in range(5) for j in range(5) if i!=j}\n", + "data = [np.random.normal() for i in range(10000)]\n", + "hist = np.histogram(data, 20)\n", + "hv.Points(error) + hv.HeatMap(heatmap_data).sort() + hv.Histogram(hist)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Selection tools" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Bokeh provides a number of tools for selecting data points including ``tap``, ``box_select``, ``lasso_select`` and ``poly_select``. To distinguish between selected and unselected data points we can also control the color and alpha of the ``selection`` and ``nonselection`` points. You can try out any of these selection tools and see how the plot is affected:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Points [tools=['box_select', 'lasso_select', 'tap']] (size=10 nonselection_color='red' color='blue')\n", + "hv.Points(error)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Selection widget with shared axes and linked brushing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When dealing with complex multi-variate data it is often useful to explore interactions between variables across plots. 
HoloViews will automatically link the data sources of plots in a Layout if they draw from the same data, allowing for both linked axes and brushing.\n", + "\n", + "We'll see what this looks like in practice using a small dataset of macro-economic data:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "macro_df = pd.read_csv('http://assets.holoviews.org/macro.csv', sep='\\t')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By creating two ``Points`` Elements, which both draw their data from the same pandas DataFrame, the two plots become automatically linked. This linking behavior can be toggled with the ``shared_datasource`` plot option on a ``Layout`` or ``GridSpace``. You can try selecting data in one plot, and see how the corresponding data (those on the same rows of the DataFrame, even if the plots show different data) will be highlighted in each." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter [tools=['box_select', 'lasso_select']] Layout [shared_axes=True shared_datasource=True]\n", + "hv.Scatter(macro_df, kdims=['year'], vdims=['gdp']) +\\\n", + "hv.Scatter(macro_df, kdims=['gdp'], vdims=['unem'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A gridmatrix is a clear use case for linked plotting. This operation plots any combination of numeric variables against each other, in a grid, and selecting datapoints in any plot will highlight them in all of them. Such linking can thus reveal how values in a particular range (e.g. very large outliers along one dimension) relate to each of the other dimensions." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%opts Scatter [tools=['box_select', 'lasso_select', 'hover'] border=0] Histogram {+axiswise}\n", + "table = hv.Table(macro_df, kdims=['year', 'country'])\n", + "matrix = hv.operation.gridmatrix(table.groupby('country'))\n", + "matrix.select(country=['West Germany', 'United Kingdom', 'United States'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The [Bokeh Elements](Bokeh_Elements.ipynb) tutorial shows examples of all the Elements supported for Bokeh, in a format that can be compared with the default matplotlib [Elements](Elements.ipynb) tutorial." + ] + } + ], + "metadata": { + "language_info": { + "name": "python", + "pygments_lexer": "ipython3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +}