Merge pull request #334 from quixio/dev
Docs Release 2024-05-004
tbedford authored May 14, 2024
2 parents c195af1 + 52cd1f5 commit 58387f7
Showing 24 changed files with 146 additions and 128 deletions.
6 changes: 6 additions & 0 deletions WRITING-STYLE.md
@@ -344,6 +344,12 @@ When inserting example code in the text:
* There should *not* be a space before the colon.
* Place a blank line after the colon and before the code block.

## Substitution prompts

When substitutions are required in code blocks, such as when the reader must supply their own API key, enclose the text prompt in angle brackets (a less-than sign and a greater-than sign). For example: `<your_API_key>`.

Except for proper nouns and acronyms, use lower case.
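As a sketch of this convention in practice (the command, key, and URL below are made-up placeholder values, not real Quix ones), the reader performs the substitution before running the code:

```python
# Illustrative only: "pat-abc123" and the URL are hypothetical values.
template = "curl -H 'Authorization: Bearer <your_API_key>' <your_endpoint_URL>"

# The reader substitutes their own values for each <placeholder>:
command = (
    template
    .replace("<your_API_key>", "pat-abc123")
    .replace("<your_endpoint_URL>", "https://example.com/data")
)
print(command)
# → curl -H 'Authorization: Bearer pat-abc123' https://example.com/data
```

Note that `API` and `URL` keep their upper case inside the placeholders, per the acronym rule above.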

## Acronyms

Define acronyms on first use. On subsequent use in a topic you do not need to redefine the acronym, unless you feel it would provide clarity.
4 changes: 4 additions & 0 deletions docs/apis/query-api/overview.md
@@ -7,6 +7,10 @@ description: The Query API enables you to fetch persisted data stored in the Qui

The Query API enables you to fetch persisted data stored in the Quix platform. You can use it for exploring the platform, prototyping applications, or working with stored data in any language with HTTP capabilities.

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

!!! note

The Query API is primarily designed for **testing purposes only**. For production storage of data, Quix recommends using one of the numerous [connectors](../../connectors/index.md) to persist data in the database technology of your choice.
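Because the Query API is plain HTTP, any language with an HTTP client can call it. The following hedged sketch shows the general shape of such a call — the base URL, endpoint path, and payload field names here are assumptions for illustration, not the documented contract:

```python
import json
from urllib import request

# Hypothetical workspace URL -- substitute your own values.
BASE_URL = "https://telemetry-query-<your_workspace_id>.platform.quix.io"

def build_query(topic: str, parameters: list, start: int, end: int) -> dict:
    """Assemble an example query payload (field names are assumptions)."""
    return {"topic": topic, "parameters": parameters, "from": start, "to": end}

def fetch(token: str, payload: dict) -> bytes:
    """POST the query with a bearer token (a standard HTTP auth pattern)."""
    req = request.Request(
        f"{BASE_URL}/parameters/data",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",  # <your_PAT_token>
            "Content-Type": "application/json",
        },
    )
    with request.urlopen(req) as resp:  # performs a network call
        return resp.read()
```

Here `build_query` only assembles the request body; `fetch` would need a reachable deployment and a valid token to succeed.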
4 changes: 4 additions & 0 deletions docs/cli/cli-reference.md
@@ -7,6 +7,10 @@ description: The Quix Command-Line Interface reference guide.

This is the reference guide for the Quix CLI.

!!! warning "Quix CLI is in development"

Quix CLI is currently in beta and under active development. This documentation may not be completely up to date, as the CLI is updated frequently: new commands are added, some commands are removed, and command syntax changes. Use the Quix CLI's built-in help for the very latest information.

## Installation

Read the latest [install guide](https://github.com/quixio/quix-cli?tab=readme-ov-file#installation-of-quix-cli){target=_blank}.
34 changes: 20 additions & 14 deletions docs/cli/cli-template.md
@@ -6,23 +6,29 @@ In this tutorial you'll step through copying a template project into your Quix a

The tutorial assumes you have:

* Docker Desktop (and a docker CLI)
* Quix CLI
* Quix Cloud account
* [Docker Desktop](https://www.docker.com/products/docker-desktop/){target=_blank} (and the Docker CLI) installed.
* [Quix CLI](https://github.com/quixio/quix-cli){target=_blank} installed.
* A free [Quix Cloud](https://portal.platform.quix.io/self-sign-up){target=_blank} account. Make sure you are signed up and logged in.

## Step 1: Copy the template

1. Navigate to the templates page and locate the Hello Quix template.
To copy the project template into your Quix Cloud account:

2. Select the Hello Quix panel and click the `Clone this project` button.
1. Navigate to the [templates page](https://quix.io/templates){target=_blank} and locate the Hello Quix template.

3. Use the single-click option to create your project in Quix.
2. Click the Hello Quix panel and then click the `Clone this project` button.

You are taken to Quix Cloud.

3. In Quix Cloud, in the `Import project` dialog, use the single-click option to create your project.

4. Navigate into the project environment.

## Step 2: Get your Gitea Git credentials
## Step 2: Get your Gitea credentials

1. Click on your profile picture, and select `Manage Git credentials`.
To obtain your Gitea credentials:

1. Click on your profile picture in Quix Cloud, and select `Manage Git credentials`.

2. Make a note of your Gitea username.

@@ -42,21 +48,21 @@ You'll now clone the project repo so you can work on it locally.

1. Create a directory for your project (for example `mkdir hello-quix`), and change into it.

2. Clone the project:
2. Clone the project, pasting in the Gitea URL you obtained in Step 2:

    ```
    git clone https://gitea.platform.quix.io/your-org/hello-quix-clone.git .
    ```

    Use your organization name. You'll be prompted for your Gitea username and password.

You now have a local clone of your project.
## Step 4: Run the project locally

There are various ways you can run your code locally. In this template you can run the entire pipeline locally using Docker.

1. To run the pipeline locally:
1. To run the pipeline locally, in your terminal enter the following command:

    ```
    docker compose up --build
    ```
8 changes: 7 additions & 1 deletion docs/cli/overview.md
@@ -1,12 +1,18 @@
---
title: Quix CLI
description: The Quix Command-Line Interface.
description: The Quix Command-Line Interface. A powerful command-line companion for developing locally, and deploying to local brokers, hosted brokers, or Quix Cloud.
---

# Quix Command-Line Interface (CLI)

The [Quix CLI](https://github.com/quixio/quix-cli){target=_blank} is a powerful command-line companion for seamlessly managing and interacting with the features of Quix Cloud. While Quix offers a robust UI for a user-friendly experience, the CLI empowers you with efficiency and flexibility, enabling you to streamline your workflow, and take control from the command line.

!!! warning "Quix CLI is in development"

Quix CLI is currently in beta and under active development. This documentation may not be completely up to date, as the CLI is updated frequently: new commands are added, some commands are removed, and command syntax changes. Use the Quix CLI's built-in help for the very latest information.

Key features include:

* Effortless control: manage all aspects of your Quix organization directly from the command line.

* Script automation: integrate Quix operations into your scripts for automated workflows and enhanced productivity.
8 changes: 6 additions & 2 deletions docs/create/create-environment.md
@@ -116,6 +116,10 @@ These options determine the following:

### Persisted storage

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

Persisted storage is when you enable persistence on a topic:

![Topic persistence](../images/create-environment/topic-persistence.png){width=80%}
@@ -131,10 +135,10 @@ When this option is selected, data in the topic is persisted to a Quix database
Services that experience improved performance when selecting the "High performance" option include the following:

* GitService - this is the service that synchronizes your Quix environment with the project's Git repository.
* [Replay Service](../manage/replay.md) - enables replay of persisted data into a topic.
* [Replay Service](../manage/replay.md) - enables replay of persisted data into a topic. **Note:** this feature is only available to legacy customers.
* [Streaming Reader](../apis/streaming-reader-api/overview.md) - service that enables a client to subscribe to a Quix topic.
* [Streaming Writer](../apis/streaming-writer-api/overview.md) - service that enables a client to publish to a Quix topic.
* [Query API](../apis/query-api/overview.md) - query data persisted in the Quix database. **Note:** this feature is only available to legacy customers.

Generally, if you notice sluggish performance in one of these services, it may mean that, for the volume and frequency of data you are processing, you need the High performance option.

35 changes: 7 additions & 28 deletions docs/develop/integrate-data/external-source.md
@@ -2,34 +2,13 @@

One simple way to write data into a Quix topic is to use the prebuilt connector called `External source`.

To use the `External source` connector, step through the following procedure:

1. In the UI click on `Code Samples` in the left-hand sidebar.

2. Search for `External source`.

3. Click `Add external source`.

4. Select the output topic that you want to publish data to.

5. Give your source a name.
To add an external source:

1. Go to the pipeline view.
2. Click `+ New` in the top right corner of the view.
3. Select `External source`.
4. In `Output` select the topic you are going to publish to, or add a new topic with `+ Add new`.
5. In `Name` type any suitable name, such as "Laptop CPU Load".
6. Click `Add external source`.

7. In the Pipeline view click the newly created source and the following is displayed:

![External source options](../../images/external-source-options.png){width=80%}

8. For this example, select `HTTP API - JavaScript`. Code is generated for you that uses the Streaming Writer API (HTTP interface).

9. Click the `Copy code` button to copy the code to your clipboard.

You can now paste the code into your JavaScript code, for example, your web browser client code. The code writes data into the Quix topic that you configured.

As you can see there are other options such as generating Curl code that can be run in your shell to also write data into Quix.

The code samples generated are meant to provide you with a starting point from which you can build your own solutions. They provide a convenient way to see how the API works.

## Next steps

Further information can be found in the [Streaming Writer API](../../apis/streaming-writer-api/overview.md) documentation.
The external source now appears in the pipeline view as a visual cue indicating the nature of the source generating the data for this topic.
4 changes: 4 additions & 0 deletions docs/develop/integrate-data/jupyter-nb.md
@@ -2,6 +2,10 @@

In this documentation, you learn how to use Jupyter Notebook to analyze data persisted in Quix.

!!! danger "Legacy features"

Some of the features on this page are not available to new users, including those related to the Quix Data Explorer and the topic Persistence feature. However, legacy users may still have access to these facilities.

## Why this is important

Although Quix is a real-time platform, to build real-time in-memory models and data processing pipelines, you need to understand data first. To help with that, Quix offers the option to persist data in topics. This data can be accessed using the [Query API](../../apis/query-api/overview.md). This helps make data discovery and analysis easier.
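In a notebook, data fetched this way is typically flattened into a pandas DataFrame for analysis. A minimal sketch — the response shape below is an assumption for illustration, not the Query API's documented format:

```python
import pandas as pd

# Hypothetical response shape: timestamps plus named parameter arrays.
response = {
    "timestamps": [1700000000000, 1700000001000, 1700000002000],  # ms epochs
    "numericValues": {"Speed": [12.0, 12.5, 13.1]},
}

# Flatten into one DataFrame column per parameter.
df = pd.DataFrame({
    "timestamp": pd.to_datetime(response["timestamps"], unit="ms"),
    **response["numericValues"],
})
print(df)
```

From here, standard notebook tooling (plots, aggregations, model training) applies directly to `df`.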
Binary file removed docs/images/external-source-options.png
4 changes: 4 additions & 0 deletions docs/kb/glossary.md
@@ -168,6 +168,10 @@ A project contains one or more [environments](#environment), so typically you cr

## Query API

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

The [Query API](../apis/query-api/overview.md) is used to query persisted data. It is most commonly used for dashboards, analytics, and training ML models, and is also useful for retrieving historical data when running an ML model, or from an external application. This API is primarily used for testing and debugging purposes.

## Quix UI
8 changes: 4 additions & 4 deletions docs/kb/what-is-quix.md
@@ -32,7 +32,7 @@ Briefly, here's how you would build a Python stream processing pipeline with Qui

Quix is designed to remove as much complexity as possible from the process of creating, deploying, and monitoring your streaming data pipelines.

Quix leverages industry-standard technologies, such as Kafka to provide the core functionality for data streaming, Kubernetes for scaling your deployments, InfluxDB and MongoDB for data persistence, Git for revision control, and Python as the main language for programming your solutions.
Quix leverages industry-standard technologies, such as Kafka to provide the core functionality for data streaming, Kubernetes for scaling your deployments, Git for revision control, and Python as the main language for programming your solutions.

The following sections take a look at the key components of creating your streaming data solutions:

@@ -144,9 +144,9 @@ Quix provides numerous standard [connectors](../connectors/index.md) for both so

Quix provides several APIs to help you work with streaming data. These include:

* [**Stream Writer API**](../apis/streaming-writer-api/overview.md): enables you to send any data to a Kafka topic in Quix using HTTP. This API handles encryption, serialization, and conversion to the Quix Streams format, ensuring efficiency and performance of down-stream processing regardless of the data source.
* [**Stream Reader API**](../apis/streaming-reader-api/overview.md): enables you to push live data from a Quix topic to your application, ensuring low latency by avoiding any disk operations.
* [**Query API**](../apis/query-api/overview.md): enables you to query persisted data streams. This is provided primarily for testing purposes.
* [**Streaming Writer API**](../apis/streaming-writer-api/overview.md): enables you to send any data to a Kafka topic in Quix using HTTP. This API handles encryption, serialization, and conversion to the Quix Streams format, ensuring efficiency and performance of downstream processing regardless of the data source.
* [**Streaming Reader API**](../apis/streaming-reader-api/overview.md): enables you to push live data from a Quix topic to your application, ensuring low latency by avoiding any disk operations.
* [**Query API**](../apis/query-api/overview.md): enables you to query persisted data streams. This is provided primarily for testing purposes. **Note:** available to legacy customers only.
* [**Portal API**](../apis/portal-api/overview.md): enables you to automate Quix tasks such as creating environments, topics, and deployments.

### Quix Streams
22 changes: 21 additions & 1 deletion docs/manage/overview.md
@@ -25,13 +25,21 @@ There is also a tab with messages view. This is described in a later section.

## Data explorer

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

When your pipeline is running, and the applications are generating data on topics, you can use the Data Explorer to view data in real time.

You can select the topic you want to view data on, and then the stream within that topic, as well as the specific parameters and events you are interested in. These can be displayed in waveform, table, or message view. The following screenshot illustrates the waveform view:

![Data explorer](../images/manage/data-explorer.png)

Table view enables you to view the parameter data in a tabular format, and messages view enables you to view the raw message data.

!!! tip

You can access similar functionality by clicking `Topics` in the left-hand sidebar, and then clicking on the topic whose messages you want to view.

## Message viewer

@@ -43,6 +51,10 @@ The data explorer also has a message viewer tab, as can be seen in the [data exp

## Persistence

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

While data in a Kafka topic is retained according to the topic retention time configured when you create a new topic:

![New topic](../images/manage/new-topic.png)
@@ -59,12 +71,20 @@ The replay service is used to play back persisted data into a topic.

## Replay service

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

The replay service enables you to play persisted data back into a topic.

You can read more about the [replay service](./replay.md) in the docs.

## Query API

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

The Query API enables you to programmatically retrieve persisted data from the database.

You can read more about the [Query API](../apis/query-api/overview.md) in the docs.
4 changes: 4 additions & 0 deletions docs/manage/replay.md
@@ -1,5 +1,9 @@
# How to replay data

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

Quix features a **replay service**. This service enables you to replay persisted data into a topic, as if it were live data. This is very useful for the following use cases:

* Testing and debugging connectors and transforms
8 changes: 8 additions & 0 deletions docs/manage/testing-data-store.md
@@ -1,5 +1,9 @@
# Data store for testing

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

Quix provides a data store for testing and debugging purposes.

While [topics](../kb/glossary.md#topic) do provide a configurable retention time, persisting data into a database provides advantages - for example, you can perform powerful queries to retrieve historical data. This data can be retrieved and displayed using the Data Explorer, or retrieved using the [Query API](../apis/query-api/overview.md).
@@ -12,6 +16,10 @@ Quix provides a very simple way to persist data in a topic. Simply locate the to

## Replay service

!!! danger "Legacy feature"

This feature is not available to new users. However, legacy users may still have access to this functionality.

When data has been persisted, you have the option to not only query and display it, but replay it into your pipeline. This can be very useful for testing and debugging pipelines using historical data.

See how to [use the Quix replay service](../manage/replay.md).
17 changes: 17 additions & 0 deletions docs/manage/troubleshooting.md
@@ -2,6 +2,23 @@

This section contains solutions, fixes, hints and tips to help you solve the most common issues encountered when using Quix.

## Kafka disconnections

Sometimes you may see Kafka disconnection messages such as the following:

```
[2024-04-30 14:51:46,791] [INFO] : FAIL [rdkafka#producer-4] [thrd:sasl_ssl://kafka-k5.quix.io:9093/5]: sasl_ssl://kafka-k5.quix.io:9093/5: Disconnected (after 154139ms in state UP, 1 identical error(s) suppressed)
[2024-04-30 14:51:46,791] [INFO] : FAIL [rdkafka#producer-4] [thrd:sasl_ssl://kafka-k5.quix.io:9093/5]: sasl_ssl://kafka-k5.quix.io:9093/5: Disconnected (after 154139ms in state UP, 1 identical error(s) suppressed)
[2024-04-30 14:51:46,791] [ERROR] : Kafka producer error: sasl_ssl://kafka-k5.quix.io:9093/5: Disconnected (after 154139ms in state UP, 1 identical error(s) suppressed) code="-195"
[2024-04-30 14:51:46,791] [ERROR] : Kafka producer error: sasl_ssl://kafka-k5.quix.io:9093/5: Disconnected (after 154139ms in state UP, 1 identical error(s) suppressed) code="-195"
```

This happens because idle connections are occasionally reaped, and nodes are sometimes restarted to apply security fixes and so on. Kafka is highly available by design: your topics are replicated (twice, in the case of a Quix-managed broker), so your service automatically fails over to another node while the connection to the restarted node recovers.

In the underlying Kafka library, disconnects are often reported even when there is no message to be delivered, which can be confusing for producers. The connection is re-established in the background, but the default log level does not record this, so the logs may make the connection appear inactive when in fact it is still functioning.

Quix aims to restart nodes as infrequently as possible, but it does happen occasionally. When it does, nodes are restarted in a rolling manner so that at least one replica is always available for your streaming needs.
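If these benign messages clutter your logs, librdkafka-based clients (such as confluent-kafka-python) expose properties to quiet them. The settings below are a sketch of that approach, and the broker address is a placeholder:

```python
# Producer settings sketch (librdkafka-style property names).
producer_conf = {
    "bootstrap.servers": "<your_broker_address>",  # placeholder
    "security.protocol": "SASL_SSL",
    # Don't log the expected closes of reaped idle connections:
    "log.connection.close": False,
    # Keep retrying through rolling node restarts:
    "message.send.max.retries": 5,
    "retry.backoff.ms": 500,
}

def on_delivery(err, msg):
    """Delivery callback: only genuine failures arrive as a non-None `err`."""
    if err is not None:
        print(f"Delivery failed: {err}")
```

You would pass `producer_conf` to your producer's constructor and `on_delivery` to each produce call; consult your client library's documentation for exact property support.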

## Kafka message too large errors

Sometimes you may receive Kafka "message too large" errors if your messages are larger than 1 MB. You would receive the following error message: