
Saved scripted fields #1537

Closed · 5 tasks done
rashidkpc opened this issue Oct 6, 2014 · 12 comments

@rashidkpc (Contributor)

Now that Elasticsearch scripting is safe, we could abstract out script fields and let people provide computed fields, say from a combination of several fields, or with math done on numeric fields.
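
For illustration, a minimal sketch of the kind of request Elasticsearch's scripting support would be used for here — a `script_fields` entry that does math across two numeric fields. The `logs` index and the `bytes_in`/`bytes_out` field names are placeholders, and Groovy was the dynamic script language at the time:

```
GET /logs/_search
{
  "query": { "match_all": {} },
  "script_fields": {
    "total_bytes": {
      "script": "doc['bytes_in'].value + doc['bytes_out'].value",
      "lang": "groovy"
    }
  }
}
```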

@elvarb commented Oct 7, 2014

Would it be possible to script based on "external" data?

The use case I can think of is if you have an "amount" field and a "currency" field, being able to look up what is needed to convert the amount into some specific currency.

@rashidkpc (Contributor, Author)

It would not be possible to script based on external data unless that data was available to Elasticsearch; this would use Elasticsearch's scripting support.

@rashidkpc (Contributor, Author)

This could also be used to combine time fields, for example if you had startTime and endTime and wanted to see the duration.
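
A rough sketch of that duration calculation, assuming `startTime` and `endTime` are date fields (their doc values are epoch milliseconds, so the difference is a duration in ms) and Groovy as the script language:

```
GET /logs/_search
{
  "script_fields": {
    "duration_ms": {
      "script": "doc['endTime'].value - doc['startTime'].value",
      "lang": "groovy"
    }
  }
}
```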

@elvarb commented Oct 8, 2014

Nice use case for the start and end time; I've wanted this for a long time. I had thought about using Logstash's ruby filter to do the calculation there. If we have scripted fields like this, would we be able to sort by those fields or query against them (for example, sort by duration, or show only durations over x)?

For the currency calculation, would it be possible to use a percolator instead of Elasticsearch scripting, and then use Kibana's scripted fields to multiply one document field by a percolator field?

That way you would update the percolator daily with new currency data, and the scripted fields would automatically convert any currency in a field into the same target currency.

rashidkpc mentioned this issue Oct 8, 2014
@rashidkpc (Contributor, Author)

Other uses for this include running a terms facet over a time field to show, say, which hour of the day is most popular, or which day of the week, over several weeks.
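
As a hedged sketch of what that looks like as a raw request — a terms aggregation keyed on a script rather than a stored field, assuming a `@timestamp` date field and Groovy scripting (`dayOfWeek` works the same way for the day-of-week case):

```
GET /logs/_search
{
  "size": 0,
  "aggs": {
    "popular_hours": {
      "terms": {
        "script": "doc['@timestamp'].date.hourOfDay",
        "lang": "groovy"
      }
    }
  }
}
```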

@andersmartinsson

Will it be possible to make a scripted field that calculates the margin from, for example, a cost field and a sales price field, and use that value in a histogram? We are using Kibana for sales figures, and so far I have not been able to show the margin, just the sales figures. Being able to do this would give us a much better view of the profit margin over time.
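
For illustration only, the underlying request for a margin-over-time chart might look something like the sketch below — a date histogram with an average over a script. The `sales` index, the `cost`, `sales_price`, and `@timestamp` field names are assumptions, and a zero `sales_price` would need guarding against:

```
GET /sales/_search
{
  "size": 0,
  "aggs": {
    "per_day": {
      "date_histogram": { "field": "@timestamp", "interval": "day" },
      "aggs": {
        "avg_margin": {
          "avg": {
            "script": "(doc['sales_price'].value - doc['cost'].value) / doc['sales_price'].value",
            "lang": "groovy"
          }
        }
      }
    }
  }
}
```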

@ellkay-joshe

When you say abstract out script fields, do you mean that any aspect of scripting we want to use will have to be explicitly supported in Kibana? One of the things I've wanted for a while is to use a script in Kibana to pull the day out of a timestamp in ES. Would I be selecting a field and then selecting "Day" or something, or would I be able to just write a script expression to pull out the value I want?

Edit: oh wow, I totally missed the comment above that covers exactly my case.

@jayswan commented Oct 16, 2014

See this thread for additional discussion of Kibana and ES scripting:

https://groups.google.com/forum/#!topic/logstash-users/crgmYEkXbGI

Specifically, I'd like to have a Kibana-based method to extract data from arbitrary fields and (ideally) get stats on them, as in a terms aggregation. My use case is this: some common log sources (e.g., Cisco routers and switches) have log formats that aren't consistent enough to parse their details in advance. Frequently, I want to extract a field using a regex or field position, then do the equivalent of a terms aggregation on it.

Today, I find the appropriate query in Kibana, paste it into curl, extract the message field using jq, then use Unix toolchain commands (awk, grep, wc, etc.) to summarize the data manually.

This request is similar to the search-time field extraction feature in Splunk.
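
For illustration, a rough sketch of that extract-then-aggregate step done in a single request — a terms aggregation over a Groovy script that splits the raw message and keys on one token by position. This assumes a not_analyzed `message.raw` field (as the default Logstash index template provides); the field name and token index are placeholders:

```
GET /logs/_search
{
  "size": 0,
  "aggs": {
    "fifth_token": {
      "terms": {
        "script": "def parts = doc['message.raw'].value.tokenize(' '); parts.size() > 5 ? parts[5] : '_short_'",
        "lang": "groovy"
      }
    }
  }
}
```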

@w33ble commented Dec 8, 2014 (Contributor)

All tasks complete and merged into master for release in Beta 3.

w33ble closed this as completed Dec 8, 2014
@tomryanx

Could I get a pointer on how to implement this: "Other uses for this include running a terms facet over a time field to show, say, which hour of the day is most popular, or which day of the week, over several weeks."

@w33ble commented Jun 30, 2015 (Contributor)

@tomryanx Unfortunately, that functionality relied on dynamic Groovy scripting, which has since been disabled for security reasons. You could still expose those scripts via indexed scripting or file-based scripting. If you'd like specific pointers, please post the question on our discussion boards at https://discuss.elastic.co/c/kibana
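
For anyone landing here later, a hedged sketch of the indexed-script route on Elasticsearch 1.x: store the script once under an id, then reference it from an aggregation with `script_id`. The script id, index, and field names here are placeholders:

```
PUT /_scripts/groovy/hour_of_day
{
  "script": "doc['@timestamp'].date.hourOfDay"
}

GET /logs/_search
{
  "size": 0,
  "aggs": {
    "popular_hours": {
      "terms": {
        "script_id": "hour_of_day",
        "lang": "groovy"
      }
    }
  }
}
```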

@GrahamHannington

#561 was closed in favor of this issue. Similar to the requirements described by other users, I would like to split charts by day of week (to compare the number of transactions at the same time each day) or by month of year. Guess I'll go looking for some Groovy scripts.
