This property specifies the number of cores each Spark executor can consume. When Kyvos submits a dataset build, cube build, or data profile job to Spark, it passes this value through to the same-named Spark property.
Values and behavior:
Any positive integer. Spark allocates the specified number of cores to each executor. The property can be set at three levels, with more specific levels taking precedence:
- Connection: If the property is set at the connection level, the value applies to all dataset build, cube build, and data profile jobs launched through Spark.
- Cube: If the property is set on a cube, its value overrides the connection-level value for that cube's build job.
- Dataset: If the property is set on a dataset, its value overrides the cube-level value for that dataset's build job.
NOTE: If the property is set on a dataset and that dataset is built on its own, the dataset-level value overrides the connection-level value for that dataset build job.
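The precedence described above (dataset over cube over connection) can be sketched as a small resolver. This is an illustration only; `effective_executor_cores` is a hypothetical helper, not a Kyvos API:

```python
def effective_executor_cores(connection=None, cube=None, dataset=None):
    """Return the value that wins for a build job, or None if unset at every level.

    The most specific level set takes precedence:
    dataset overrides cube, which overrides connection.
    """
    for value in (dataset, cube, connection):
        if value is not None:
            return value
    return None


# Cube-level setting overrides the connection-level setting.
print(effective_executor_cores(connection=4, cube=5))             # 5
# Dataset-level setting wins over both.
print(effective_executor_cores(connection=4, cube=5, dataset=6))  # 6
```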
Comes into effect:
This property comes into effect only when kyvos.build.execution.engine is set to Spark, that is, when the underlying build engine is Spark. The value can be changed at any time; the new value is applied from the next build onward.
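As a hedged illustration, assuming the Spark-side key is spark.executor.cores (Spark's standard property for cores per executor) and that both keys can be supplied as plain configuration properties at the connection level, the pairing might look like:

```properties
# Build engine must be Spark for the cores setting to take effect.
kyvos.build.execution.engine=Spark
# Passed through to Spark as-is; 4 cores per executor.
spark.executor.cores=4
```

The exact file or UI field where these are entered depends on your Kyvos deployment; only the property names and the dependency between them are taken from the text above.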
Dependencies and related properties:
Related property: kyvos.build.execution.engine, which must be set to Spark for this property to take effect. Recommended value: 4 or 5 cores per executor; 4 performed best in our lab tests.
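One common reason a smaller cores-per-executor value is preferred is how evenly executors pack onto a worker node. The node size below is an assumed number for illustration, not from the text:

```python
# Hypothetical 16-core worker node: compare how 4 vs. 5 cores per executor pack.
node_cores = 16  # assumed node size for illustration

for executor_cores in (4, 5):
    executors_per_node = node_cores // executor_cores   # whole executors that fit
    wasted = node_cores - executors_per_node * executor_cores  # cores left idle
    print(f"{executor_cores} cores/executor -> "
          f"{executors_per_node} executors, {wasted} core(s) unused")
```

On a 16-core node, 4 cores per executor leaves no cores idle, while 5 leaves one unused; actual behavior depends on your node sizes and other Spark memory/core settings.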