
Expose metrics to JS #4654

Merged · 1 commit merged into apache:master from DPTOOLS-393_expose_metrics on Apr 3, 2018
Conversation

betodealmeida (Member):
@vylc requested the ability to expose the metrics to the JS advanced tooltip:

[Screenshot: metrics exposed to the JS advanced tooltip]
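To illustrate the feature, the kind of snippet this enables in the advanced tooltip box might look like the following (a hedged sketch: the shape of the hovered object and the metric field name are illustrative assumptions, not taken from this PR):

```js
// User JS typed into the advanced tooltip editor. sandboxedEval turns the
// expression into a function, and the chart calls it on hover.
// `d.my_metric` is an illustrative name for one of the exposed metrics.
d => `<div><strong>my_metric:</strong> ${d.my_metric}</div>`
```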

I also added a try/catch block around the sandbox eval, so that the query still runs when the JS is broken and the exception is shown:

[Screenshot: the JS exception rendered in the tooltip]
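For context, the wrapped eval roughly looks like the sketch below, assembled around the diff lines quoted further down; everything outside those lines is an assumption about the surrounding sandbox.js:

```js
import vm from 'vm';

// Evaluate user-supplied code in a sandboxed context and return the result,
// which callers expect to be a function.
export default function sandboxedEval(code, context, opts) {
  const sandbox = Object.assign({}, context);
  const resultKey = 'SANDBOX_RESULT';
  const codeToEval = `${resultKey} = ${code}`;
  try {
    vm.runInNewContext(codeToEval, sandbox, opts);
    return sandbox[resultKey];
  } catch (error) {
    // On broken user JS, still return a function so the query keeps running
    // and the error can be rendered where the result would have been.
    return () => error;
  }
}
```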

codecov-io (bot):
Codecov Report

Merging #4654 into master will decrease coverage by <.01%.
The diff coverage is 75%.


@@            Coverage Diff             @@
##           master    #4654      +/-   ##
==========================================
- Coverage   71.25%   71.25%   -0.01%     
==========================================
  Files         190      190              
  Lines       14916    14918       +2     
  Branches     1102     1102              
==========================================
+ Hits        10629    10630       +1     
- Misses       4284     4285       +1     
  Partials        3        3
Impacted Files                                    Coverage Δ
superset/viz.py                                   78.46% <ø> (ø) ⬆️
superset/assets/javascripts/modules/sandbox.js    93.75% <75%> (-6.25%) ⬇️

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data

superset/assets/javascripts/modules/sandbox.js:

```js
try {
  vm.runInNewContext(codeToEval, sandbox, opts);
  return sandbox[resultKey];
} catch (error) {
  // return a callable so the error surfaces where the result would render
  return () => error;
}
```
Reviewer (Member):

Why return a function that returns an error?

betodealmeida (Member, Author):

The function sandboxedEval returns a function (either to mutate the data or to generate the tooltip). I'm being consistent here: returning a function ensures that the error is displayed in the tooltip (see the second screenshot).
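In other words, call sites always invoke the returned value, so the error path keeps the same shape as the success path (the caller names below are illustrative):

```js
// Whether the user JS compiled or threw, the result is callable:
const tooltipGenerator = sandboxedEval(userJsTooltip); // fn, or () => error

// The chart invokes it the same way in both cases; on failure the Error
// comes back as the tooltip content instead of aborting the query.
const tooltipContent = tooltipGenerator(hoveredObject);
```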

@mistercrunch merged commit ab7ba20 into apache:master on Apr 3, 2018
@mistercrunch deleted the DPTOOLS-393_expose_metrics branch on Apr 3, 2018 at 00:48
@mistercrunch added the 🏷️ bot and 🚢 0.25.0 labels on Feb 27, 2024