From 8823a91241c695520188469f9880f90cd188df24 Mon Sep 17 00:00:00 2001
From: Joel Hamill <11722533+joel-hamill@users.noreply.github.com>
Date: Fri, 27 Jan 2023 11:23:34 -0800
Subject: [PATCH] DOCS-21490 - Remove nodejs tutorial (#1186)
* DOCS-21490 - Remove nodejs tutorial
* remove ksqldb datagen
* fix removed file
* Make titles better
* fixup titles
---
clients/docs/clojure.rst | 4 +-
clients/docs/groovy.rst | 4 +-
clients/docs/kafka-connect-datagen.rst | 4 +-
clients/docs/kafkacat.rst | 4 +-
clients/docs/kotlin.rst | 4 +-
clients/docs/ksql-datagen.rst | 217 -------------------------
clients/docs/nodejs.rst | 130 ---------------
clients/docs/ruby.rst | 4 +-
clients/docs/rust.rst | 4 +-
clients/docs/scala.rst | 4 +-
10 files changed, 16 insertions(+), 363 deletions(-)
delete mode 100644 clients/docs/ksql-datagen.rst
delete mode 100644 clients/docs/nodejs.rst
diff --git a/clients/docs/clojure.rst b/clients/docs/clojure.rst
index ff5426633..fea7146cd 100644
--- a/clients/docs/clojure.rst
+++ b/clients/docs/clojure.rst
@@ -1,7 +1,7 @@
.. _client-examples-clojure:
-Clojure
-=======
+Create an Apache Kafka Client App for Clojure
+=============================================
In this tutorial, you will run a Clojure client application that produces
messages to and consumes messages from an |ak-tm| cluster.
diff --git a/clients/docs/groovy.rst b/clients/docs/groovy.rst
index 216b6f353..6152b67fc 100644
--- a/clients/docs/groovy.rst
+++ b/clients/docs/groovy.rst
@@ -1,7 +1,7 @@
.. _client-examples-groovy:
-Groovy
-======
+Create an Apache Kafka Client App for Groovy
+============================================
In this tutorial, you will run a Groovy client application that produces
messages to and consumes messages from an |ak-tm| cluster.
diff --git a/clients/docs/kafka-connect-datagen.rst b/clients/docs/kafka-connect-datagen.rst
index 1e4799323..95be969ae 100644
--- a/clients/docs/kafka-connect-datagen.rst
+++ b/clients/docs/kafka-connect-datagen.rst
@@ -1,7 +1,7 @@
.. _client-examples-kafka-connect-datagen:
-Kafka Connect Datagen
-=====================
+Create an Apache Kafka Client App for |kconnect-long| Datagen
+=============================================================
In this tutorial, you will run a |kconnect-long| Datagen source connector using
`Kafka Connect Datagen
diff --git a/clients/docs/kafkacat.rst b/clients/docs/kafkacat.rst
index d03b31c09..1e2374e00 100644
--- a/clients/docs/kafkacat.rst
+++ b/clients/docs/kafkacat.rst
@@ -1,7 +1,7 @@
.. _client-examples-kafkacat:
-kafkacat
-========
+Create an Apache Kafka Client App for kafkacat
+==============================================
In this tutorial, you will run a |kcat| client application that produces
messages to and consumes messages from an |ak-tm| cluster.
diff --git a/clients/docs/kotlin.rst b/clients/docs/kotlin.rst
index a55d74b8f..903e6428a 100644
--- a/clients/docs/kotlin.rst
+++ b/clients/docs/kotlin.rst
@@ -1,7 +1,7 @@
.. _client-examples-kotlin:
-Kotlin
-======
+Create an Apache Kafka Client App for Kotlin
+============================================
In this tutorial, you will run a Kotlin client application that produces
messages to and consumes messages from an |ak-tm| cluster.
diff --git a/clients/docs/ksql-datagen.rst b/clients/docs/ksql-datagen.rst
deleted file mode 100644
index 309429a50..000000000
--- a/clients/docs/ksql-datagen.rst
+++ /dev/null
@@ -1,217 +0,0 @@
-.. _client-examples-ksql-datagen:
-
-KSQL Datagen
-============
-
-In this tutorial, you will run a KSQL Datagen client application using the
-KSQL Datagen command-line tool that produces messages to and consumes
-messages from an |ak-tm| cluster.
-
-.. note::
-
- Use KSQL Datagen for development purposes only. It isn't suitable for a
- production environment.
-
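-The Docker setup in this tutorial ultimately runs the ``ksql-datagen``
-command-line tool. For orientation, a representative invocation looks
-similar to the following (the ``orders`` quickstart and argument values
-here are illustrative):
-
-.. code-block:: bash
-
- ksql-datagen quickstart=orders format=json topic=test1 maxInterval=1000
-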
-.. include:: includes/client-example-overview.rst
-
-
-
-Prerequisites
--------------
-
-Client
-~~~~~~
-
-- Docker
-
-- `Download <https://www.confluent.io/download/>`__ |cp| |release|
-
-Kafka Cluster
-~~~~~~~~~~~~~
-
-.. include:: includes/client-example-prerequisites.rst
-
-
-Setup
------
-
-#. .. include:: includes/clients-checkout.rst
-
-#. Change directory to the example for KSQL Datagen.
-
- .. code-block:: bash
-
- cd clients/cloud/ksql-datagen/
-
-#. .. include:: includes/client-example-create-file-java.rst
-
-
-Basic Producer and Consumer
----------------------------
-
-.. include:: includes/producer-consumer-description.rst
-
-
-Produce Records
-~~~~~~~~~~~~~~~
-
-#. Create the |ak| topic.
-
- .. code-block:: bash
-
- kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1` --command-config $HOME/.confluent/java.config --topic test1 --create --replication-factor 3 --partitions 6
-
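- You can confirm the topic was created by listing topics with the same
- connection settings:
-
- .. code-block:: bash
-
- kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1` --command-config $HOME/.confluent/java.config --list
-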
-#. Generate a file of ``ENV`` variables used by Docker to set the bootstrap
- servers and security configuration.
-
- .. code-block:: bash
-
- ../../../ccloud/ccloud-generate-cp-configs.sh $HOME/.confluent/java.config
-
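- Optionally, inspect the generated variables before sourcing them:
-
- .. code-block:: bash
-
- cat ./delta_configs/env.delta
-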
-#. Source the generated file of ``ENV`` variables.
-
- .. code-block:: bash
-
- source ./delta_configs/env.delta
-
-#. Start Docker by running the following command:
-
- .. code-block:: bash
-
- docker-compose up -d
-
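-#. Optionally, verify that the containers are up before continuing:
-
- .. code-block:: bash
-
- docker-compose ps
-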
-#. View the :devx-examples:`ksql-datagen code|clients/cloud/ksql-datagen/start-docker.sh`.
-
-
-Consume Records
-~~~~~~~~~~~~~~~
-
-#. Consume from topic ``test1`` by doing the following:
-
- - Referencing a properties file
-
- .. code-block:: bash
-
- docker-compose exec connect bash -c 'kafka-console-consumer --topic test1 --bootstrap-server $CONNECT_BOOTSTRAP_SERVERS --consumer.config /tmp/ak-tools-ccloud.delta --max-messages 5'
-
- - Referencing individual properties
-
- .. code-block:: bash
-
- docker-compose exec connect bash -c 'kafka-console-consumer --topic test1 --bootstrap-server $CONNECT_BOOTSTRAP_SERVERS --consumer-property sasl.mechanism=PLAIN --consumer-property security.protocol=SASL_SSL --consumer-property sasl.jaas.config="$SASL_JAAS_CONFIG_PROPERTY_FORMAT" --max-messages 5'
-
- You should see messages similar to the following:
-
- .. code-block:: text
-
- {"ordertime":1489322485717,"orderid":15,"itemid":"Item_352","orderunits":9.703502112840228,"address":{"city":"City_48","state":"State_21","zipcode":32731}}
-
-#. When you are done, press ``CTRL-C``.
-
-#. View the :devx-examples:`consumer code|clients/cloud/ksql-datagen/start-docker.sh`.
-
-
-Avro and Confluent Cloud Schema Registry
-----------------------------------------
-
-.. include:: includes/schema-registry-scenario-explain.rst
-
-#. .. include:: includes/client-example-schema-registry-1.rst
-
-#. .. include:: includes/client-example-vpc.rst
-
-#. .. include:: includes/schema-registry-java.rst
-
-#. .. include:: includes/client-example-schema-registry-2-java.rst
-
-
-Produce Avro Records
-~~~~~~~~~~~~~~~~~~~~
-
-#. Create the topic in |ccloud|.
-
- .. code-block:: bash
-
- kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1` --command-config $HOME/.confluent/java.config --topic test2 --create --replication-factor 3 --partitions 6
-
-#. Generate a file of ``ENV`` variables used by Docker to set the bootstrap
- servers and security configuration.
-
- .. code-block:: bash
-
- ../../../ccloud/ccloud-generate-cp-configs.sh $HOME/.confluent/java.config
-
-#. Source the generated file of ``ENV`` variables.
-
- .. code-block:: bash
-
- source ./delta_configs/env.delta
-
-#. Start Docker by running the following command:
-
- .. code-block:: bash
-
- docker-compose up -d
-
-#. View the :devx-examples:`ksql-datagen Avro code|clients/cloud/ksql-datagen/start-docker-avro.sh`.
-
-
-Consume Avro Records
-~~~~~~~~~~~~~~~~~~~~
-
-#. Consume from topic ``test2`` by doing the following:
-
- - Referencing a properties file
-
- .. code-block:: bash
-
- docker-compose exec connect bash -c 'kafka-avro-console-consumer --topic test2 --bootstrap-server $CONNECT_BOOTSTRAP_SERVERS --consumer.config /tmp/ak-tools-ccloud.delta --property basic.auth.credentials.source=$CONNECT_VALUE_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE --property schema.registry.basic.auth.user.info=$CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO --property schema.registry.url=$CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL --max-messages 5'
-
- - Referencing individual properties
-
- .. code-block:: bash
-
- docker-compose exec connect bash -c 'kafka-avro-console-consumer --topic test2 --bootstrap-server $CONNECT_BOOTSTRAP_SERVERS --consumer-property sasl.mechanism=PLAIN --consumer-property security.protocol=SASL_SSL --consumer-property sasl.jaas.config="$SASL_JAAS_CONFIG_PROPERTY_FORMAT" --property basic.auth.credentials.source=$CONNECT_VALUE_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE --property schema.registry.basic.auth.user.info=$CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO --property schema.registry.url=$CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL --max-messages 5'
-
- You should see messages similar to the following:
-
- .. code-block:: text
-
- {"ordertime":{"long":1494153923330},"orderid":{"int":25},"itemid":{"string":"Item_441"},"orderunits":{"double":0.9910185646928878},"address":{"io.confluent.ksql.avro_schemas.KsqlDataSourceSchema_address":{"city":{"string":"City_61"},"state":{"string":"State_41"},"zipcode":{"long":60468}}}}
-
-
-#. When you are done, press ``CTRL-C``.
-
-#. View the :devx-examples:`consumer Avro code|clients/cloud/ksql-datagen/start-docker-avro.sh`.
-
-
-Confluent Cloud Schema Registry
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-#. View the schema information registered in |sr-ccloud|. In the following
- command, substitute values for ``<SR API KEY>``, ``<SR API SECRET>``, and
- ``<SR ENDPOINT>``.
-
- .. code-block:: text
-
- curl -u <SR API KEY>:<SR API SECRET> https://<SR ENDPOINT>/subjects
-
-#. Verify the subject ``test2-value`` exists.
-
- .. code-block:: text
-
- ["test2-value"]
-
-#. View the schema information for subject ``test2-value``. In the following
- command, substitute values for ``<SR API KEY>``, ``<SR API SECRET>``, and
- ``<SR ENDPOINT>``.
-
- .. code-block:: text
-
- curl -u <SR API KEY>:<SR API SECRET> https://<SR ENDPOINT>/subjects/test2-value/versions/1
-
-#. Verify the schema information for subject ``test2-value``.
-
- .. code-block:: text
-
- {"subject":"test2-value","version":1,"id":100001,"schema":"{\"type\":\"record\",\"name\":\"KsqlDataSourceSchema\",\"namespace\":\"io.confluent.ksql.avro_schemas\",\"fields\":[{\"name\":\"ordertime\",\"type\":[\"null\",\"long\"],\"default\":null},{\"name\":\"orderid\",\"type\":[\"null\",\"int\"],\"default\":null},{\"name\":\"itemid\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"orderunits\",\"type\":[\"null\",\"double\"],\"default\":null},{\"name\":\"address\",\"type\":[\"null\",{\"type\":\"record\",\"name\":\"KsqlDataSourceSchema_address\",\"fields\":[{\"name\":\"city\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"state\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"zipcode\",\"type\":[\"null\",\"long\"],\"default\":null}]}],\"default\":null}]}"}
diff --git a/clients/docs/nodejs.rst b/clients/docs/nodejs.rst
deleted file mode 100644
index a78420745..000000000
--- a/clients/docs/nodejs.rst
+++ /dev/null
@@ -1,130 +0,0 @@
-.. _client-examples-nodejs:
-
-Node.js
-=======
-
-In this tutorial, you will run a Node.js client application that produces
-messages to and consumes messages from an |ak-tm| cluster.
-
-.. include:: includes/client-example-overview.rst
-
-
-Prerequisites
--------------
-
-Client
-~~~~~~
-
-- `Node.js <https://nodejs.org>`__ version 8.6 or higher installed on
- your local machine.
-
-- Users of macOS 10.13 (High Sierra) and later should read `node-rdkafka’s
- additional configuration instructions related to OpenSSL
- <https://github.com/Blizzard/node-rdkafka>`__
- before running ``npm install`` (see the example after this list).
-
-- `OpenSSL <https://www.openssl.org>`__ version 1.0.2.
-
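-On macOS, this typically means pointing the ``node-rdkafka`` build at a
-non-system OpenSSL before installing. A minimal sketch, assuming OpenSSL is
-installed via Homebrew at ``/usr/local/opt/openssl``:
-
-.. code-block:: bash
-
- CPPFLAGS=-I/usr/local/opt/openssl/include \
- LDFLAGS=-L/usr/local/opt/openssl/lib \
- npm install
-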
-Kafka Cluster
-~~~~~~~~~~~~~
-
-.. include:: includes/client-example-prerequisites.rst
-
-
-Setup
------
-
-#. .. include:: includes/clients-checkout.rst
-
-#. Change directory to the example for Node.js.
-
- .. code-block:: bash
-
- cd clients/cloud/nodejs/
-
-#. .. include:: includes/client-example-create-file-librdkafka.rst
-
-
-Basic Producer and Consumer
----------------------------
-
-.. include:: includes/producer-consumer-description.rst
-
-
-Produce Records
-~~~~~~~~~~~~~~~
-
-#. Install npm dependencies.
-
- .. code-block:: bash
-
- npm install
-
-#. Run the producer, passing in arguments for:
-
- - the local file with configuration parameters to connect to your |ak|
- cluster
- - the topic name
-
- .. code-block:: bash
-
- node producer.js -f $HOME/.confluent/librdkafka.config -t test1
-
-#. Verify the producer sent all the messages. You should see:
-
- .. code-block:: text
-
- Created topic test1
- Producing record alice {"count":0}
- Producing record alice {"count":1}
- Producing record alice {"count":2}
- Producing record alice {"count":3}
- Producing record alice {"count":4}
- Producing record alice {"count":5}
- Producing record alice {"count":6}
- Producing record alice {"count":7}
- Producing record alice {"count":8}
- Producing record alice {"count":9}
- Successfully produced record to topic "test1" partition 0 {"count":0}
- Successfully produced record to topic "test1" partition 0 {"count":1}
- Successfully produced record to topic "test1" partition 0 {"count":2}
- Successfully produced record to topic "test1" partition 0 {"count":3}
- Successfully produced record to topic "test1" partition 0 {"count":4}
- Successfully produced record to topic "test1" partition 0 {"count":5}
- Successfully produced record to topic "test1" partition 0 {"count":6}
- Successfully produced record to topic "test1" partition 0 {"count":7}
- Successfully produced record to topic "test1" partition 0 {"count":8}
- Successfully produced record to topic "test1" partition 0 {"count":9}
-
-#. View the :devx-examples:`producer code|clients/cloud/nodejs/producer.js`.
-
-
-Consume Records
-~~~~~~~~~~~~~~~
-
-#. Run the consumer, passing in arguments for:
-
- - the local file with configuration parameters to connect to your |ak| cluster
- - the topic name you used earlier
-
- .. code-block:: bash
-
- node consumer.js -f $HOME/.confluent/librdkafka.config -t test1
-
-#. Verify the consumer received all the messages:
-
- .. code-block:: text
-
- Consuming messages from test1
- Consumed record with key alice and value {"count":0} of partition 0 @ offset 0. Updated total count to 1
- Consumed record with key alice and value {"count":1} of partition 0 @ offset 1. Updated total count to 2
- Consumed record with key alice and value {"count":2} of partition 0 @ offset 2. Updated total count to 3
- Consumed record with key alice and value {"count":3} of partition 0 @ offset 3. Updated total count to 4
- Consumed record with key alice and value {"count":4} of partition 0 @ offset 4. Updated total count to 5
- Consumed record with key alice and value {"count":5} of partition 0 @ offset 5. Updated total count to 6
- Consumed record with key alice and value {"count":6} of partition 0 @ offset 6. Updated total count to 7
- Consumed record with key alice and value {"count":7} of partition 0 @ offset 7. Updated total count to 8
- Consumed record with key alice and value {"count":8} of partition 0 @ offset 8. Updated total count to 9
- Consumed record with key alice and value {"count":9} of partition 0 @ offset 9. Updated total count to 10
-
-#. View the :devx-examples:`consumer code|clients/cloud/nodejs/consumer.js`.
diff --git a/clients/docs/ruby.rst b/clients/docs/ruby.rst
index 8484a94ae..131956d74 100644
--- a/clients/docs/ruby.rst
+++ b/clients/docs/ruby.rst
@@ -1,7 +1,7 @@
.. _client-examples-ruby:
-Ruby
-====
+Create an Apache Kafka Client App for Ruby
+==========================================
In this tutorial, you will run a Ruby client application using the `ZenDesk Ruby
Client for Apache Kafka <https://github.com/zendesk/ruby-kafka>`__ that produces
diff --git a/clients/docs/rust.rst b/clients/docs/rust.rst
index 0f5bc4e9c..83d4619c3 100644
--- a/clients/docs/rust.rst
+++ b/clients/docs/rust.rst
@@ -1,7 +1,7 @@
.. _client-examples-rust:
-Rust
-====
+Create an Apache Kafka Client App for Rust
+==========================================
In this tutorial, you will run a Rust client application that produces messages
to and consumes messages from an |ak-tm| cluster.
diff --git a/clients/docs/scala.rst b/clients/docs/scala.rst
index dd91995c6..1cf4abffa 100644
--- a/clients/docs/scala.rst
+++ b/clients/docs/scala.rst
@@ -1,7 +1,7 @@
.. _client-examples-scala:
-Scala
-======
+Create an Apache Kafka Client App for Scala
+===========================================
In this tutorial, you will run a Scala client application that produces messages
to and consumes messages from an |ak-tm| cluster.