Implement variable inspector for kernels #443

Open
selimrbd opened this issue Jul 18, 2016 · 96 comments

@selimrbd

Hi,

When doing interactive data analysis, it is useful to quickly get the name and a quick summary (type/size) of in-memory variables.

I was wondering if a block for displaying this kind of information could be an interesting feature for JupyterLab, such as the one in RStudio:

[Screenshot: RStudio's environment pane listing in-memory variables]

@blink1073
Member

Hi @SelimRABOUDI, this feature would be a most welcome contribution.

@sccolbert
Contributor

👍 Would be a great project for a community-developed extension.

@ellisonbg
Contributor

Great idea and +1 to someone from the community implementing this plugin. A design question: would you want to know the ENV vars for kernels or the main notebook server, or both?

@zertrin

zertrin commented Jul 19, 2016

I'm afraid there is a misunderstanding between the title of this issue and the description.

OP seems to be asking more or less for the "Workspace Variables" block as seen in Matlab or similar programs, and not for the environment variables, which are a completely different thing.

Quoting him:

get the name/quick summary (type/ size) of in-memory variables

I suggest changing the title of the issue to accurately reflect the description.

@selimrbd
Author

selimrbd commented Jul 19, 2016

@zertrin : indeed I've used the wrong vocabulary. Correcting the title.

A positive side-effect of my mistake is that the idea of an "ENV vars" block could also be relevant.

I'll look into how I can do a first implementation of the "in-memory vars" block (if you think it still makes sense?).

@selimrbd selimrbd changed the title Feature idea: "Environment Variables" block Feature idea: "In-memory Variables" block Jul 19, 2016
@danielballan
Contributor

I heard a couple of people at SciPy express interest in a feature like this. Could it be done with a thin wrapper around IPython's %who magic? Is there a way to plug into cell execution such that %who would be re-run and the widget updated every time a cell is executed?
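A minimal sketch of the kernel-side piece, assuming an IPython kernel: %who_ls returns the user-namespace variable names as a list, and the post_run_cell event fires after every cell execution, so an inspector could refresh itself from that hook (refresh_inspector is an illustrative name, not an existing API).

# Runs inside an IPython/ipykernel session.
from IPython import get_ipython

ip = get_ipython()

def refresh_inspector(*args):
    # %who_ls returns the list of user-defined names in the kernel.
    names = ip.run_line_magic("who_ls", "")
    summary = {name: type(ip.user_ns[name]).__name__ for name in names}
    # A real inspector would push this summary to the frontend;
    # printing it is just for the sketch.
    print(summary)

# Re-run the summary after every cell execution.
ip.events.register("post_run_cell", refresh_inspector)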

@blink1073
Member

If not, we can add one ;).

@blink1073
Member

If you have the kernel, you can connect to iopubMessage, filter on execute_result messages, and take appropriate action.
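The suggestion above targets the JupyterLab frontend API (the kernel connection's iopubMessage signal). As a rough stand-in, the same filtering on execute_result messages can be sketched from Python with jupyter_client, assuming a local python3 kernelspec:

from queue import Empty
from jupyter_client.manager import start_new_kernel

# Start a kernel plus a blocking client (a stand-in for the frontend's kernel connection).
km, kc = start_new_kernel(kernel_name="python3")
try:
    msg_id = kc.execute("x = 41 + 1\nx")
    while True:
        try:
            msg = kc.get_iopub_msg(timeout=5)
        except Empty:
            break
        if msg["parent_header"].get("msg_id") != msg_id:
            continue  # ignore messages from other executions
        if msg["msg_type"] == "execute_result":
            # This is where an inspector would refresh its variable view.
            print(msg["content"]["data"]["text/plain"])
        elif msg["msg_type"] == "status" and msg["content"]["execution_state"] == "idle":
            break
finally:
    kc.stop_channels()
    km.shutdown_kernel()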

@jasongrout
Contributor

(brainstorming here...)

Maybe we should have the concept of "kernel widgets" which are widgets that interact with a single kernel. You could open such a widget from a tab listing all running kernels, or maybe there would be a way to open the widgets from the kernel indication in a document. This variable explorer would be an example of a kernel widget. Likely a kernel widget could depend on the type of kernel, or it could be available for all kernels.

On the other hand, if this variable explorer was a kernel widget, what would we do if the notebook we launched it from started a different kernel? Would we want the variable explorer to automatically know to switch to the new kernel? I think that's probably what the user would expect, so the notebook that launched the kernel widget should keep a handle on it so it could change the kernel.

@svenefftinge
Contributor

Here again the notion of an "active" kernel could be useful, so the user doesn't need to select the kernel for such a widget. (mentioned in #450 (comment))

@danielballan
Contributor

I usually have more than one kernel running at a time, and I think I'd have trouble keeping track of which one was "active." I like @jasongrout's idea that kernel widgets like a "variable explorer" could be launched from a notebook and retain references in both directions. Other kernel widgets (like an HDF5 viewer) might close over a private kernel used only by that widget.

@aggFTW
Contributor

aggFTW commented Jul 26, 2016

Love the idea of a kernel widget.

In our case, we'd love to develop a widget that knows the state of its kernel and displays information to the user based on it. In this case, the state is user variables, but in the case of a Spark kernel, it might be the connection string that the kernel is using to connect to Spark, as well as general information regarding this Spark instance.

If a user has multiple notebook tabs open, I believe good UX would be to have the widget change contents appropriately as the user selects multiple notebook tabs. One way of achieving this might be that kernel widgets are part of the notebooks JupyterLab panel, and can be minimized/maximized/docked/undocked in the context of the notebook panel.

@ellisonbg ellisonbg changed the title Feature idea: "In-memory Variables" block Implement variable inspector for kernels Aug 25, 2016
@ellisonbg ellisonbg added this to the Backlog milestone Aug 25, 2016
@SynapticSage

This feature would make JupyterLab a closer Pythonic analogue of Matlab's IDE

@OldGuyInTheClub

OldGuyInTheClub commented Jun 8, 2017

Yes, please. With knobs on. This is absolutely essential for scientific programming, as is a visual debugger with breakpoints.

@Gerenuk

Gerenuk commented Jun 8, 2017

At the very least it should somehow show the keys and column names of dicts and tables. Looking up those identifiers happens very often. Maybe in a tooltip or an expandable view.
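A minimal sketch of the kind of summary being asked for, assuming pandas is available (quick_summary is an illustrative name):

import pandas as pd

def quick_summary(value):
    # Report the type plus the identifiers a user most often needs to look up.
    info = {"type": type(value).__name__}
    if isinstance(value, dict):
        info["keys"] = list(value.keys())
    elif isinstance(value, pd.DataFrame):
        info["columns"] = list(value.columns)
        info["shape"] = value.shape
    elif hasattr(value, "shape"):  # e.g. numpy arrays
        info["shape"] = tuple(value.shape)
    return info

quick_summary({"price": 9.99, "qty": 3})
# {'type': 'dict', 'keys': ['price', 'qty']}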

@vidartf
Member

vidartf commented Jun 8, 2017

Spyder already has this for its IPython consoles (Spyder aims to be similar to the Matlab IDE), and it is licensed under MIT. Even if the language/framework used is different, it could serve as a source of inspiration.

@OldGuyInTheClub

Spyder is OK, but a friend recently showed me RStudio and it was very impressive: very close to Matlab's UI, and it now has a notebook capability. I don't think it fully supports other kernels, though. It is good enough that I am thinking about learning R.

@astrojuanlu

This is absolutely essential for scientific programming as is a visual debugger with breakpoints.

Is there a separate issue/feature request for a visual debugger? I guess it would have to be developed as an extension. Perhaps this Python 3.7 addition will be of great help:

https://www.python.org/dev/peps/pep-0553/

@blink1073
Member

There isn't an issue for a debugger, please feel free to open one, @Juanlu001.

@astrojuanlu

Done! #3049

@hershelm

This feature is one of my favorites in RStudio, and I would love to see it in JupyterLab. It would be especially nice if combined with the ability to click dataframes and explore them (similar to View() in RStudio).

@vidartf
Member

vidartf commented May 1, 2019

Considering that such an extension exists, I guess I should modify that to "anybody who's developing such an extension".

@thbak

thbak commented Dec 22, 2019

I have installed the extension and it works perfectly - any chance it could support the Matlab kernel too?

@nielsenrechia

nielsenrechia commented Dec 9, 2020

Hello Guys,

I'm trying to use the variable inspector with the R kernel. It shows me the environment variables perfectly. My question is about opening data frames: is that possible with the R kernel? In my case, it only works with Python, not with R.

Is there some way to open an R dataframe?

Thanks,

@M00NSH0T

M00NSH0T commented Jan 14, 2021

It seems that the author of the variableinspector extension may not be actively maintaining it, so it's not compatible with JupyterLab 3 (see here).

I'd love to see this functionality rolled into core JupyterLab, as has been done with the debugger. As it is, I'm sticking with JupyterLab 2 for now just because I really love the variableinspector extension.

@jasongrout
Contributor

FYI, if I recall correctly, a variable inspector is on the roadmap for the debugger extension.

@jasongrout
Contributor

Updating that extension may be as simple as bumping the dependencies at https://github.com/lckr/jupyterlab-variableInspector/blob/50c522c613c6ce59fa19f88e4c7db11fa4b1dd56/package.json#L34-L40

@giswqs

giswqs commented Mar 3, 2021

Is it possible to return all defined variables as a list programmatically without using the magic command (%who) or the variable inspector? I need the list of variable names to be passed into a function. Thanks.

@jasongrout
Contributor

Is it possible to return all defined variables as a list programmatically without using the magic command (%who) or the variable inspector? I need the list of variable names to be passed into a function. Thanks.

Sounds like you may want the Python globals() function?

@giswqs

giswqs commented Mar 3, 2021

Is it possible to return all defined variables as a list programmatically without using the magic command (%who) or the variable inspector? I need the list of variable names to be passed into a function. Thanks.

Sounds like you may want the Python globals() function?

The list of variable names defined in the notebook is to be passed into a function defined in a Python script. globals(), locals(), and dir() won't work in this case, as they only list the functions and variables defined in the Python script, not the ones defined in the notebook. I want to use the list of variable names in an ipywidgets Dropdown list within an ipyleaflet Map object.

@jasongrout
Contributor

I meant to call globals() in the notebook and pass the resulting dictionary into your function imported from your python script.

@giswqs

giswqs commented Mar 3, 2021

I meant to call globals() in the notebook and pass the resulting dictionary into your function imported from your python script.

For example, the function below is defined in a module of a Python package. When calling this function (from a *.py file) within a Jupyter notebook, globals() actually gets all the variables/functions/classes defined in that module, not the ones defined in the notebook.

def list_vars():
    # globals() here is the *module's* namespace, not the notebook's,
    # which is exactly the problem described above.
    result = []
    for var in globals():
        if not var.startswith("_"):
            result.append(var)
    return result

@jasongrout
Contributor

I think we are probably talking past each other, since it sounds like both of us understand how things work. My thought is that you could use globals() in a notebook cell (getting things defined in the notebook), and pass the result into a function.
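A sketch of the pattern being described: the notebook supplies its own namespace, and the packaged helper only filters it (mymodule is a hypothetical module name).

# In the package, e.g. mymodule.py (hypothetical name):
def list_vars(namespace):
    """Return user-defined names from the supplied namespace dict."""
    return [name for name in namespace if not name.startswith("_")]

# In a notebook cell:
# from mymodule import list_vars
# list_vars(globals())  # globals() evaluated here is the notebook's namespace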

@Atcold

Atcold commented Nov 2, 2022

Are there any updates on this?

@anthonytec2
Contributor

From my understanding this was released recently: https://github.com/jupyterlab/jupyterlab/blob/f0790b3ffd187b7dc923156b8ae227855abecefb/packages/inspector/README.md. There is now a little bug icon on the side panel of JupyterLab, which is where the variable inspector lives.

@jakirkham

Yeah, I thought this was done a while ago (unless I'm misunderstanding the ask):

https://blog.jupyter.org/a-visual-debugger-for-jupyter-914e61716559

@Atcold

Atcold commented Nov 3, 2022

That requires using some niche kernel, where matplotlib and other libraries don't work.
I was just hoping one could glance at an interactive output of %whos.

@astrojuanlu

The blog post is outdated: ipykernel, the mainstream kernel, has supported the debugging protocol since 6.0.0, released in June 2021: https://github.com/ipython/ipykernel/releases/tag/6.0.0, https://jupyterlab.readthedocs.io/en/stable/user/debugger.html#requirements


@jakirkham

Sounds like this issue should be closed then?

@Atcold

Atcold commented Nov 18, 2022

I don't think so.
I just tried using the debugger for showing some ndarrays and tensors, and it's basically unusable.

import numpy
import torch
a = 3
b = 7
c = numpy.random.rand(4)
d = torch.rand(4)

[Screenshots: JupyterLab debugger variable pane showing the variables above]
To be more explicit, for an ndarray and a tensor the shape should be reported, perhaps the dtype.

Using MATLAB (for comparison) we have

a = 3
b = 7
c = rand([1, 4])
d = rand(4)

[Screenshot: MATLAB workspace pane showing the variables above]
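For the ndarray/tensor case above, a minimal sketch of how an inspector could report shape and dtype, assuming numpy and torch as in the earlier snippet (describe is an illustrative name):

import numpy
import torch

def describe(value):
    # Arrays and tensors both expose .shape and .dtype.
    if hasattr(value, "shape") and hasattr(value, "dtype"):
        return f"{type(value).__name__}  shape={tuple(value.shape)}  dtype={value.dtype}"
    return repr(value)

c = numpy.random.rand(4)
d = torch.rand(4)
print(describe(c))  # ndarray  shape=(4,)  dtype=float64
print(describe(d))  # Tensor  shape=(4,)  dtype=torch.float32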

@jakirkham

That sounds like a new issue, given this one was about implementing the feature (as opposed to usability bugs with the feature).

@Atcold

Atcold commented Nov 18, 2022

Fine with me.
Let's close this and I'll open the new one.

@krassowski
Member

Just to point out, since JupyterLab 3.3 you can hover over the list and click the loop icon (or right-click to be offered an option to render the variable), as in:

[Animation: rendering a variable from the debugger's variable list]

Currently it uses the default MIME type but we should be able to do better (like render a table for tabular pandas/numpy data even if currently we show just the default HTML preview).

@Atcold Did you open a new issue? We have a triage process which means that comments on old issues may not receive the same attention as new issues (and definitely we cannot tag a meta-issue for any specific release).

@jakirkham I personally would not consider this one closed as it required a debugger-enabled kernel.

@Atcold

Atcold commented Dec 6, 2022

@krassowski: done.
