support of custom labels #57
base: master
Conversation
- update dependencies
- new setting: metrics-name-to-lowercase
- add support for dynamic labels via SparkConf.set("spark.SPARK_METRIC_LABELS", "key=value")
# Conflicts:
#   build.sbt
#   src/main/scala/com/banzaicloud/spark/metrics/DropwizardExports.scala
#   src/main/scala/com/banzaicloud/spark/metrics/DropwizardExportsWithMetricNameTransform.scala
#   src/main/scala/com/banzaicloud/spark/metrics/sink/PrometheusSink.scala
 * 2. property labels
 * @return
 */
private def collectLabels(sparkConf: SparkConf): Map[String, String] = {
Suggested change:
- private def collectLabels(sparkConf: SparkConf): Map[String, String] = {
+ private def collectLabels(sparkConf: SparkConf): Try[Map[String, String]] = {
Like parseLabel, this may throw an exception.
  configLabels ++ sparkConfLabels
}

private def collectLabelsFromSparkConf(sparkConf: SparkConf): Map[String, String] = {
Suggested change:
- private def collectLabelsFromSparkConf(sparkConf: SparkConf): Map[String, String] = {
+ private def collectLabelsFromSparkConf(sparkConf: SparkConf): Try[Map[String, String]] = {
Like parseLabel, this may throw an exception.
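The reviewer's suggestion above is to make the throwing path explicit by returning a Try. A minimal sketch of what that could look like, decoupled from SparkConf so it is self-contained; parseLabel here is an assumed helper with made-up error handling, not the PR's actual implementation:

```scala
import scala.util.Try

// Assumed helper: split one "key=value" pair, throwing on malformed input.
def parseLabel(kv: String): (String, String) = kv.split("=", 2) match {
  case Array(k, v) => k -> v
  case _           => throw new IllegalArgumentException(s"Malformed label: $kv")
}

// Reviewer's idea: wrap the whole parse in Try so a malformed pair
// surfaces as a Failure instead of an exception escaping the sink.
def collectLabels(kvs: String): Try[Map[String, String]] =
  Try {
    if (kvs.isEmpty) Map.empty[String, String]
    else kvs.split(",").map(parseLabel).toMap
  }
```

With this shape, "a=1,b=2" yields a Success with both labels, while a pair with no "=" yields a Failure the caller can log or ignore.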
@@ -4,3 +4,5 @@ resolvers += "sonatype-releases" at "https://oss.sonatype.org/content/repositori

 addSbtPlugin("com.jsuereth" % "sbt-pgp" % "2.0.0-M2")
 addSbtPlugin("org.xerial.sbt" % "sbt-sonatype" % "2.3")
+addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
Is this plugin required by something in this PR? If not, I think it should be removed or moved to its own PR.
if (kvs.isEmpty) {
  return Map.empty
}
kvs.split(",").map(parseLabel).toMap
if (kvs.isEmpty) Map.empty else kvs.split(",").map(parseLabel).toMap
would be more Scala-esque
import io.prometheus.client.dropwizard.DropwizardExports
import io.prometheus.jmx.JmxCollector

import scala.collection.JavaConverters._
import scala.util.matching.Regex

object NameDecorator {
- case class Replace(Regex: Regex, replacement: String)
+ case class Replace(Regex: Regex, replacement: String, toLowerCase: Boolean = false)
I really wouldn't put this here: coherence is low, and the semantics are unintuitive (or at least ambiguous). A clearer implementation would be to create a case object ToLowerCase and handle it in the NameDecorator trait. The best solution would be to refactor name decoration to be composable from different name decorators (replacer, case changer, etc.).
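The composable design the reviewer proposes could be sketched roughly as below. The trait and case class names are illustrative only and do not match the PR's actual types:

```scala
import scala.util.matching.Regex

// One small interface; each decorator does exactly one transformation.
trait NameDecorator {
  def decorate(name: String): String
}

// Regex-based replacer, analogous to the existing Replace case class.
case class Replace(regex: Regex, replacement: String) extends NameDecorator {
  def decorate(name: String): String = regex.replaceAllIn(name, replacement)
}

// The reviewer's suggested case object for lower-casing.
case object ToLowerCase extends NameDecorator {
  def decorate(name: String): String = name.toLowerCase
}

// Decorators compose by folding left to right over the name.
case class Composed(decorators: Seq[NameDecorator]) extends NameDecorator {
  def decorate(name: String): String =
    decorators.foldLeft(name)((n, d) => d.decorate(n))
}
```

For example, `Composed(Seq(Replace("\\.".r, "_"), ToLowerCase))` first replaces dots with underscores and then lower-cases the result, and adding a new transformation means adding one new decorator rather than another flag on Replace.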
yegor seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. Have you signed the CLA already but the status is still pending? Let us recheck it.
What's in this PR?
How to use:
then metrics would have these labels.
Our use case: we run daily pipelines, and it's convenient to distinguish metrics by job type, for example by processed topic, processed date, etc.
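Per the commit message in this PR, labels are supplied as a comma-separated key=value string under spark.SPARK_METRIC_LABELS. A sketch of that usage for the daily-pipeline case above; the label names job_type and topic are made up for illustration:

```scala
import org.apache.spark.SparkConf

// Sketch only: sets the label string the sink is expected to parse.
// "job_type" and "topic" are illustrative label names, not part of the PR.
val conf = new SparkConf()
  .setAppName("daily-pipeline")
  .set("spark.SPARK_METRIC_LABELS", "job_type=daily,topic=clicks")
```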
2. introduce the metrics-name-to-lowercase setting
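Assuming the new setting is wired up like the sink's other options in metrics.properties, enabling it might look like the fragment below. The property namespace is a guess and is not confirmed by this PR:

```
# Hypothetical metrics.properties fragment; the sink namespace and
# boolean format are assumptions, not taken from this PR.
*.sink.prometheus.metrics-name-to-lowercase=true
```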
Additional context
As mentioned, the "cooked" SparkConf should be passed when creating the Spark job; I don't think settings apply at runtime.

Checklist
To Do