Get Microsoft.Spark and Microsoft.Spark.Worker assembly version information #715
Conversation
string tempColName = "WorkerVersionInfo";
DataFrame workerInfoTempDf = df
    .Repartition(1000)
The 1000 is too big for local settings. Is there some Spark setting we can leverage to set this value? Or what do you think are safe defaults? We could also pass this in as a parameter and have a safe default of 10 or something.
/// <returns>
/// A <see cref="DataFrame"/> containing the <see cref="VersionSensor.VersionInfo"/>
/// </returns>
public DataFrame Version(int numPartitions = 10)
Just curious, why have we chosen numPartitions as 10 here?
Any number is fine, just some safe default to run for local / small clusters. Do you have a recommendation on what value we can use?
No, 10 is fine; I was just wondering if there was a specific reason.
Do we want to add a test covering this since it is a public API?
Also I am wondering if asking for numPartitions is really necessary for finding the version from the user perspective. Would having it as a constant inside the function pose a problem?
What would be the pros and cons of hardcoding the number instead?
Pros, I think, would be usability, since it might be hard to intuitively understand what number of partitions a user should enter and what it has to do with getting version information. Cons, I guess, include not being able to trigger every worker node; if some nodes have a different worker version on them, that would go uncaught. But that could happen even with taking numPartitions as an argument, so I'm not sure what the benefit there is, unless we document it in more depth so as to advise the user of the importance of estimating a 'correct' numPartitions value, maybe explaining in detail what best effort means, etc. Thoughts @imback82?
/// <returns>
/// A <see cref="DataFrame"/> containing the <see cref="VersionSensor.VersionInfo"/>
/// </returns>
public DataFrame Version(int numPartitions = 10)
Maybe we should name this AssemblyVersion/AssemblyVersionInfo/VersionInfo or something unique to dotnet, just in case Spark ever adds a method called Version?
/// <returns>
/// A <see cref="DataFrame"/> containing the <see cref="VersionSensor.VersionInfo"/>
/// </returns>
public DataFrame Version(int numPartitions = 10)
How about making this an extension method under an Experimental namespace?
Keep it in the same Microsoft.Spark project, just a different namespace, right?
Moved to Microsoft.Spark.Experimental.Sql.
/// <returns>
/// A <see cref="DataFrame"/> containing the <see cref="VersionSensor.VersionInfo"/>
/// </returns>
public DataFrame Version(int numPartitions = 10)
How about the assembly version on the driver?
Which assembly version on the driver? We are getting the Microsoft.Spark assembly version on the driver, and the Microsoft.Spark.Worker version on whichever machines the Spark executors are spinning up on.
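The mechanism being described can be sketched roughly as below. This is an illustrative approximation, not the actual VersionSensor implementation: the UDF body, the `df`/`numPartitions` variables, and the `"id"` column are assumptions for the sake of the example.

```csharp
using System;
using System.Net;
using System.Reflection;
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

// Driver side: reflection here sees the Microsoft.Spark assembly
// loaded by the driver process.
string driverVersion =
    typeof(SparkSession).Assembly.GetName().Version.ToString();

// Worker side: a UDF executes inside the Microsoft.Spark.Worker
// process, so reflection there reports that host and its assemblies.
// Repartitioning first spreads the probe tasks across executors so
// that, best effort, every executor host runs at least one of them.
Func<Column, Column> probe = Udf<int, string>(
    _ => $"{Dns.GetHostName()}: " +
         $"{Assembly.GetEntryAssembly()?.GetName().Version}");

DataFrame workerVersions = df
    .Repartition(numPartitions)
    .WithColumn("WorkerVersionInfo", probe(Col("id")))
    .Select("WorkerVersionInfo")
    .Distinct();
```

Because task placement is up to the scheduler, this only raises the probability of touching every host; it cannot guarantee it.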
internal string AssemblyName { get; set; }
internal string AssemblyVersion { get; set; }
internal string HostName { get; set; }
We can add BuildDate (by getting the assembly file creation date) if we think it'd be useful/nice to have.
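A minimal sketch of that idea, using the assembly file's creation time as a best-effort build date; the helper class and method names are hypothetical, not part of this PR:

```csharp
using System;
using System.IO;
using System.Reflection;

internal static class BuildDateHelper
{
    // Best-effort build date: the creation timestamp of the assembly
    // file on disk. This is a proxy, not an embedded build attribute,
    // and can be skewed by copy/deployment steps that reset timestamps.
    internal static DateTime GetBuildDate(Assembly assembly) =>
        File.GetCreationTimeUtc(assembly.Location);
}
```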
We can iterate on this as we get feedback.
src/csharp/Microsoft.Spark.E2ETest/IpcTests/Sql/SparkSessionTests.cs (resolved)
src/csharp/Microsoft.Spark/Experimental/Sql/SparkSessionExtensions.cs (resolved)
LGTM, thanks @suhsteve!
/// <returns>
/// A <see cref="DataFrame"/> containing the <see cref="AssemblyInfoProvider.AssemblyInfo"/>
/// </returns>
public static DataFrame Version(this SparkSession session, int numPartitions = 10)
Oh, I missed this. Should we also rename this?
Maybe GetAssemblyInfo?
Updated.
internal class AssemblyInfo
{
    internal static readonly StructType s_schema = new StructType(
Should we make this lazy?
Good point. Can you just move this and ToGenericRow to GetAssemblyInfo? I will probably move this to non-experimental in my next PR, since this class will be more general after moving it out.
Moved to SparkSessionExtensions.
A method is added that returns a DataFrame containing the current Microsoft.Spark assembly version running on the Driver, and makes a best effort attempt at determining the assembly version of the Microsoft.Spark.Worker.

There is no guarantee that a Spark Executor will be run on all the nodes in a cluster. To increase the likelihood, the Spark conf spark.executor.instances and the numPartitions parameter to GetAssemblyInfo(...) should be adjusted to a reasonable number relative to the number of nodes in the Spark cluster.

Building Microsoft.Spark.Worker with an updated version and rerunning the above yields

Fixes #713