
[DOCS] Updates add data content #81093

Merged · 13 commits · Oct 30, 2020
123 changes: 45 additions & 78 deletions docs/setup/connect-to-elasticsearch.asciidoc
@@ -1,34 +1,57 @@
[[connect-to-elasticsearch]]
== Add data to {kib}

Contributor comment: Similar to the comment from Nathan, should we simply have "Add data" here to avoid confusion?


To start working with your data in {kib}, you can:
The fastest way to add data to {kib} is to use one of the built-in options,
available from the home page. You can:

* Upload a CSV, JSON, or log file with the File Data Visualizer.
* Add data from popular apps and services
* Add data using an Elastic Agent
Contributor comment: Not exactly clear how line 7 is different from line 8 (Elastic Agent is the new way of adding data from popular apps and services, no?)

Contributor comment: I'd also indicate here that this is beta.

* Upload a file

* Upload geospatial data with the GeoJSON Upload feature.
[role="screenshot"]
image::images/add-data-home.png[Built-in options for adding data to Kibana]

* Index logs, metrics, events, or application data by setting up a Beats module.
If you're not ready to use your own data, you can add a <<get-started, sample data set>>
to give {kib} a test drive.

* Connect {kib} with existing {es} indices.
[float]
[[add-data-tutorial-kibana]]
=== Add data tutorials

If you're not ready to use your own data, you can add a <<get-started, sample data set>>
to see all that you can do in {kib}.
These tutorials guide you through installing and configuring a
Beats data shipper to collect metrics, logs, security events, and application data and send it to {es}.
You can then use the pre-built dashboards to explore and analyze the data.

If a tutorial doesn’t exist for your data, go to the {beats-ref}/beats-reference.html[Beats overview]
to learn about other data shippers in the Beats family.

[role="screenshot"]
image::images/add-data-tutorials.png[Add Data tutorials]


Contributor comment: Might be worth stating explicitly somewhere that Elastic Agent is an alternative to installing and managing Beats individually. I think there will be some confusion for users because Elastic Agent installs and manages Beats under the hood. We really want that to be an implementation detail that most users don't need to know about, though.

[discrete]
[[add-data-fleet-kibana]]
=== Add an Elastic Agent

beta[]

Fleet is a single, unified way to collect logs and metrics from systems
Contributor comment: This description blurs the lines between Fleet and agent. Agent is how you collect logs/metrics. Fleet is the UI you use to add and manage agents. You might want to look at the Fleet overview (though honestly I want to rework that because it's a bit disjointed). https://www.elastic.co/guide/en/ingest-management/7.10/fleet-overview.html

and services across your organization. You don't have to install multiple
Contributor comment: "You don't have to" makes it sound like this is optional.

Maybe: "Instead of installing multiple agents on each host you want to monitor, you install a single agent that manages data collection, making it easier to deploy policies across your infrastructure."

(Kind of wordy and I'm not sure users will understand what a policy is at this point, but hopefully you see my point.)

Beats and other agents, making it easier to deploy policies across your infrastructure.
For more information, see

[role="screenshot"]
image::images/add-data-fleet.png[Add data using Fleet]

[float]
[[upload-data-kibana]]
=== Upload a CSV, JSON, or log file
=== Upload a file

experimental[]

To visualize data in a CSV, JSON, or log file, you can upload it using the File
Data Visualizer. On the home page, click *Import a CSV, NDJSON, or log file*, and
then drag your file into the File Data Visualizer. Alternatively, you can open
it by navigating to *Machine Learning* from the side navigation and selecting
*Data Visualizer*.

[role="screenshot"]
image::images/data-viz-homepage.jpg[File Data Visualizer on the home page]

Data Visualizer. On the home page, click *Upload a file*, and
then drag your file into the *File Data Visualizer*.
You can upload a file up to 100 MB. This value is configurable up to 1 GB in
<<kibana-ml-settings, Advanced Settings>>.

@@ -41,67 +64,11 @@ the uploaded file and to suggest ingest pipelines and mappings for your data.
NOTE: This feature is not intended for use as part of a
repeated production process, but rather for the initial exploration of your data.
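If you just want a file to try the uploader with, a minimal sketch like the following writes a small NDJSON log sample you can drag into the File Data Visualizer. The file name and field names are arbitrary examples, not anything {kib} requires.

[source,python]
----
import json
import random
from datetime import datetime, timedelta, timezone

# Write a small NDJSON file (one JSON object per line) to try the uploader.
# The file name and field names are illustrative placeholders.
start = datetime.now(timezone.utc) - timedelta(hours=1)
with open("sample_logs.ndjson", "w") as f:
    for i in range(500):
        doc = {
            "@timestamp": (start + timedelta(seconds=7 * i)).isoformat(),
            "status": random.choice([200, 200, 200, 404, 500]),
            "response_ms": round(random.uniform(5, 250), 1),
            "message": f"GET /app/page-{i % 10}",
        }
        f.write(json.dumps(doc) + "\n")
----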

[float]
[[upload-geoipdata-kibana]]
=== Upload geospatial data

To visualize geospatial data in a point or shape file, you can upload it using the <<import-geospatial-data, GeoJSON Upload>>
feature in Maps, and then use that data as a layer in a map.
The data is also available for use in the broader Kibana ecosystem, for example,
in visualizations and Canvas workpads.
With GeoJSON Upload, you can upload a file up to 50 MB.
To visualize geospatial data in a point or shape file, use the <<import-geospatial-data, GeoJSON Upload>>
feature in *Maps*.
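For a quick test file, a minimal sketch like the following writes a small GeoJSON FeatureCollection with two point features that you can then upload through GeoJSON Upload in *Maps*. The file name and property names are made up for illustration; GeoJSON coordinates are ordered [longitude, latitude].

[source,python]
----
import json

# A minimal GeoJSON FeatureCollection with two point features.
# Property names are arbitrary examples.
feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-122.3321, 47.6062]},
            "properties": {"name": "Seattle"},
        },
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-0.1276, 51.5072]},
            "properties": {"name": "London"},
        },
    ],
}

with open("sample_points.geojson", "w") as f:
    json.dump(feature_collection, f, indent=2)
----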

[float]
[[add-data-tutorial-kibana]]
=== Index metrics, log, security, and application data

The built-in data tutorials can help you quickly get up and running with
metrics data, log analytics, security events, and application data.
These tutorials walk you through installing and configuring a
Beats data shipper to periodically collect and send data to {es}.
You can then use the pre-built dashboards to explore and analyze the data.
[discrete]
=== Or, load your data yourself

You access the tutorials from the home page.
If a tutorial doesn’t exist for your data, go to the {beats-ref}/beats-reference.html[Beats overview]
to learn about other data shippers in the Beats family.

[role="screenshot"]
image::images/add-data-tutorials.png[Add Data tutorials]


[float]
[[connect-to-es]]
=== Connect with {es} indices

To visualize data in existing {es} indices, you must
create an index pattern that matches the names of the indices that you want to explore.
When you add data with the File Data Visualizer, GeoJSON Upload feature,
or built-in tutorial, an index pattern is created for you.

. Go to *Stack Management*, and then click *Index Patterns*.

. Click *Create index pattern*.

. Specify an index pattern that matches the name of one or more of your Elasticsearch indices.
+
For example, an index pattern can point to your Apache data from yesterday,
`filebeat-apache-4-3-2022`, or any index that matches the pattern, `filebeat-*`.
Using a wildcard is the more popular approach.


. Click *Next Step*, and then select the index field that contains the timestamp you want to use to perform time-based
comparisons.
+
Kibana reads the index mapping and lists all fields that contain a timestamp. If your
index doesn't have time-based data, choose *I don't want to use the time filter*.
+
You must select a time field to use global time filters on your dashboards.

. Click *Create index pattern*.
+
{kib} is now configured to access your {es} indices.
You’ll see a list of fields configured for the matching index.
You can designate your index pattern as the default by clicking the star icon on this page.
+
When searching in *Discover* and creating visualizations, you choose a pattern
from the index pattern menu to specify the {es} indices that contain the data you want to explore.
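If you prefer to script this step instead of clicking through the UI, a rough sketch against the Kibana saved objects HTTP API might look like the following. The URL, credentials, pattern title, and time field are placeholders, and the exact API shape can vary by Kibana version, so treat this as an illustration rather than a reference.

[source,python]
----
import requests

# Placeholders: adjust the Kibana URL, credentials, and pattern for your deployment.
KIBANA_URL = "http://localhost:5601"
AUTH = ("elastic", "changeme")

resp = requests.post(
    f"{KIBANA_URL}/api/saved_objects/index-pattern",
    auth=AUTH,
    headers={"kbn-xsrf": "true"},  # required by Kibana's HTTP APIs
    json={
        "attributes": {
            "title": "filebeat-*",          # indices the pattern should match
            "timeFieldName": "@timestamp",  # time field for time-based comparisons
        }
    },
)
resp.raise_for_status()
print("Created index pattern:", resp.json()["id"])
----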
You can also index your data into Elasticsearch with REST APIs or client libraries.
After you add your data, you'll need to create an <<index-patterns,index pattern>>.
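For example, a minimal sketch with the Python {es} client, assuming a recent (8.x) client, a local cluster on the default port, and a placeholder index name `my-index`:

[source,python]
----
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

# Placeholder connection details; adjust for your deployment.
es = Elasticsearch("http://localhost:9200")

# Index a single document; the index is created on first write if it doesn't exist.
# Older 7.x clients pass the document via body= instead of document=.
es.index(
    index="my-index",
    document={
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "message": "hello from the Python client",
        "status": 200,
    },
)
----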
Binary file added docs/setup/images/add-data-fleet.png
Binary file added docs/setup/images/add-data-home-page.png
Binary file added docs/setup/images/add-data-home.png
29 changes: 10 additions & 19 deletions docs/user/introduction.asciidoc
@@ -31,25 +31,16 @@ and processes the data, with {kib} sitting on top.

From the home page, {kib} provides these options for adding data:

* Import data using the
https://www.elastic.co/blog/importing-csv-and-log-data-into-elasticsearch-with-file-data-visualizer[File Data visualizer].
* Set up a data flow to Elasticsearch using our built-in tutorials.
If a tutorial doesn’t exist for your data, go to the
{beats-ref}/beats-reference.html[Beats overview] to learn about other data shippers
in the {beats} family.
* <<add-sample-data, Add a sample data set>> and take {kib} for a test drive without loading data yourself.
* Index your data into Elasticsearch with {ref}/getting-started-index.html[REST APIs]
or https://www.elastic.co/guide/en/elasticsearch/client/index.html[client libraries].
+
[role="screenshot"]
image::images/intro-data-tutorial.png[Ways to get data in from the home page]
* Set up a data flow using popular apps and services
* Collect data using Elastic Agent
* Upload data using the
https://www.elastic.co/blog/importing-csv-and-log-data-into-elasticsearch-with-file-data-visualizer[File Data visualizer]

If you're not ready to use your own data, you can add a <<get-started, sample data set>>
to give {kib} a test drive.

{kib} uses an
<<index-patterns, index pattern>> to tell it which {es} indices to explore.
If you upload a file, run a built-in tutorial, or add sample data, you get an index pattern for free,
and are good to start exploring. If you load your own data, you can create
an index pattern in <<management, Stack Management>>.
[role="screenshot"]
image::setup/images/add-data-home.png[Ways to get data in from the home page]

[float]
[[explore-and-query]]
@@ -94,7 +85,7 @@ and dynamic client-side styling.

* <<tsvb, TSVB>> allows you to combine
an infinite number of aggregations to display complex data.
With TSVB, you can analyze multiple index patterns and customize
With TSVB, you can customize
every aspect of your visualization. Choose your own date format and color
gradients, and easily switch your data view between time series, metric,
top N, gauge, and markdown.
@@ -124,7 +115,7 @@ dashboards in one space, but full access to all of Kibana’s features in another.
=== Manage all things Elastic Stack

<<management, Stack Management>> provides guided processes for managing all
things Elastic Stack &mdash; indices, clusters, licenses, UI settings, index patterns,
things Elastic Stack &mdash; indices, clusters, licenses, UI settings,
and more. Want to update your {es} indices? Set user roles and privileges?
Turn on dark mode? Kibana has UIs for all that.
