
CVSS: 9.1 | EPSS: 0% | CPEs: 14 | EXPL: 1

29 Jan 2020 — HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header. A flaw was found in Netty before version 4.1.44, where it accepted multiple Content-Length headers and also accepted both Transfer-Enco... • https://access.redhat.com/errata/RHSA-2020:0497 • CWE-444: Inconsistent Interpretation of HTTP Requests ('HTTP Request/Response Smuggling')
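A minimal sketch of the kind of ambiguous message this flaw lets through, for illustration only; the host, path, and body are made up, and patched Netty (4.1.44 and later) rejects requests whose framing headers conflict instead of guessing at the body length.

    // Hypothetical example of a request that mixes Content-Length with
    // Transfer-Encoding. RFC 7230 treats such conflicts as a smuggling risk,
    // so a hardened decoder should reject the message outright.
    public class AmbiguousRequestExample {
        public static void main(String[] args) {
            String ambiguousRequest =
                "POST /submit HTTP/1.1\r\n" +
                "Host: example.com\r\n" +              // illustrative host
                "Content-Length: 6\r\n" +              // claims the body is 6 bytes
                "Transfer-Encoding: chunked\r\n" +     // claims the body is chunked
                "\r\n" +
                "0\r\n" +
                "\r\n";
            System.out.println(ambiguousRequest);
        }
    }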

CVSS: 7.5 | EPSS: 0% | CPEs: 6 | EXPL: 1

18 Nov 2019 — A flaw was found in the org.codehaus.jackson:jackson-mapper-asl:1.9.x libraries. XML external entity vulnerabilities similar to CVE-2016-3720 also affect the codehaus jackson-mapper-asl libraries, but in different classes. • https://github.com/rusakovichma/CVE-2019-10172 • CWE-611: Improper Restriction of XML External Entity Reference
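As a general illustration of the CWE-611 class (not the jackson-mapper-asl fix itself), a hardened JAXP parser configuration that refuses DOCTYPE declarations and external entities looks roughly like this:

    import javax.xml.XMLConstants;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;

    // Defensive sketch using the standard JAXP API: disallow DOCTYPEs so that
    // external entity declarations can never be processed. Generic XXE
    // hardening, not the library's own change.
    public class XxeHardenedParser {
        public static DocumentBuilder newParser() throws Exception {
            DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
            dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
            dbf.setAttribute(XMLConstants.ACCESS_EXTERNAL_DTD, "");    // no external DTDs
            dbf.setAttribute(XMLConstants.ACCESS_EXTERNAL_SCHEMA, ""); // no external schemas
            return dbf.newDocumentBuilder();
        }
    }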

CVSS: 7.5 | EPSS: 0% | CPEs: 5 | EXPL: 0

07 Aug 2019 — Prior to Spark 2.3.3, in certain situations Spark would write user data to local disk unencrypted, even if spark.io.encryption.enabled=true. This includes cached blocks that are fetched to disk (controlled by spark.maxRemoteBlockSizeFetchToMem); in SparkR, using parallelize; in PySpark, using broadcast and parallelize; and use of Python UDFs. • https://lists.apache.org/thread.html/c2a39c207421797f82823a8aff488dcd332d9544038307bf69a2ba9e%40%3Cuser.spark.apache.org%3E • CWE-312: Cleartext Storage of Sensitive Information
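The configuration keys named in the description can be set on a SparkConf as sketched below; the values are illustrative, and on affected versions (before 2.3.3) the listed spill paths ignored the encryption flag, so setting it alone was not sufficient.

    import org.apache.spark.SparkConf;

    // Sketch of the properties mentioned in the advisory; values are examples only.
    public class IoEncryptionConfigExample {
        public static SparkConf build() {
            return new SparkConf()
                .setAppName("io-encryption-example")                // illustrative name
                .set("spark.io.encryption.enabled", "true")         // intended to encrypt spilled data
                // Remote blocks larger than this threshold are fetched to disk,
                // one of the code paths the CVE says bypassed encryption.
                .set("spark.maxRemoteBlockSizeFetchToMem", "200m"); // illustrative threshold
        }
    }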

CVSS: 5.5 | EPSS: 0% | CPEs: 5 | EXPL: 0

04 Feb 2019 — When using PySpark, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application. This affects versions 1.x, 2.0.x, 2.1.x, 2.2.0 to 2.2.2, and 2.3.0 to 2.3.1. • http://www.securityfocus.com/bid/106786

CVSS: 7.5 | EPSS: 0% | CPEs: 2 | EXPL: 0

24 Oct 2018 — Spark's Apache Maven-based build includes a convenience script, 'build/mvn', that downloads and runs a zinc server to speed up compilation. It has been included in release branches since 1.3.x, up to and including master. This server will accept connections from external hosts by default. A specially-crafted request to the zinc server could cause it to reveal information in files readable to the developer account running the build. Note that this issue does not affect end users of Spark, only developers bui... • http://www.securityfocus.com/bid/105756
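As a generic illustration of the underlying problem (not zinc's or the build script's actual code), the difference between a helper server reachable from other hosts and one bound only to loopback looks roughly like this:

    import java.net.InetAddress;
    import java.net.ServerSocket;

    // A server created with new ServerSocket(port) listens on all interfaces;
    // binding to the loopback address keeps a build-time helper local-only.
    public class LoopbackOnlyHelper {
        public static void main(String[] args) throws Exception {
            int port = 3030; // illustrative port
            try (ServerSocket localOnly =
                     new ServerSocket(port, 50, InetAddress.getLoopbackAddress())) {
                System.out.println("Listening on " + localOnly.getLocalSocketAddress());
            }
        }
    }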

CVSS: 4.9 | EPSS: 87% | CPEs: 1 | EXPL: 2

13 Aug 2018 — From version 1.3.0 onward, Apache Spark's standalone master exposes a REST API for job submission, in addition to the submission mechanism used by spark-submit. In standalone, the config property 'spark.authenticate.secret' establishes a shared secret for authenticating requests to submit jobs via spark-submit. However, the REST API does not use this or any other authentication mechanism, and this is not adequately documented. In this case, a user would be able to run a driver program without authenticating... • https://github.com/ivanitlearning/CVE-2018-11770 • CWE-287: Improper Authentication
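For context, the shared-secret settings the description refers to look like the sketch below (the secret value is a placeholder); on affected standalone clusters the REST submission endpoint does not consult this secret, so the practical responses are upgrading, disabling the REST endpoint, or restricting network access to the master.

    import org.apache.spark.SparkConf;

    // Sketch of the spark-submit authentication settings; on vulnerable versions
    // the standalone master's REST API bypasses them entirely.
    public class StandaloneAuthExample {
        public static SparkConf build() {
            return new SparkConf()
                .set("spark.authenticate", "true")
                .set("spark.authenticate.secret", "replace-with-a-real-secret"); // placeholder
        }
    }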

CVSS: 4.7 | EPSS: 0% | CPEs: 3 | EXPL: 0

12 Jul 2018 — In Apache Spark 1.0.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, when using PySpark or SparkR, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application. • https://lists.apache.org/thread.html/4d6d210e319a501b740293daaeeeadb51927111fb8261a3e4cd60060%40%3Cdev.spark.apache.org%3E • CWE-200: Exposure of Sensitive Information to an Unauthorized Actor

CVSS: 5.4 | EPSS: 53% | CPEs: 4 | EXPL: 0

12 Jul 2018 — In Apache Spark 2.1.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, it's possible for a malicious user to construct a URL pointing to a Spark cluster UI's job and stage info pages, and if a user can be tricked into accessing the URL, it can be used to cause script to execute and expose information from the user's view of the Spark UI. While some browsers, like recent versions of Chrome and Safari, are able to block this type of attack, current versions of Firefox (and possibly others) do not. • https://lists.apache.org/thread.html/5f241d2cda21cbcb3b63e46e474cf5f50cce66927f08399f4fab0aba%40%3Cdev.spark.apache.org%3E • CWE-200: Exposure of Sensitive Information to an Unauthorized Actor
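A generic illustration of the class of fix for this kind of reflected XSS (not the actual Spark UI patch): HTML-escape any request-derived value before echoing it into a rendered page.

    // Minimal escaping helper; real code would normally use a vetted library.
    public class HtmlEscapeExample {
        public static String escapeHtml(String s) {
            StringBuilder out = new StringBuilder(s.length());
            for (int i = 0; i < s.length(); i++) {
                char c = s.charAt(i);
                switch (c) {
                    case '<':  out.append("&lt;");   break;
                    case '>':  out.append("&gt;");   break;
                    case '&':  out.append("&amp;");  break;
                    case '"':  out.append("&quot;"); break;
                    case '\'': out.append("&#39;");  break;
                    default:   out.append(c);
                }
            }
            return out.toString();
        }

        public static void main(String[] args) {
            System.out.println(escapeHtml("<script>alert(1)</script>")); // rendered inert
        }
    }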

CVSS: 5.3 | EPSS: 0% | CPEs: 1 | EXPL: 0

31 Mar 2018 — In Spark before 2.7.2, a remote attacker can read unintended static files via various representations of absolute or relative pathnames, as demonstrated by file: URLs and directory traversal sequences. NOTE: this product is unrelated to Ignite Realtime Spark. • http://sparkjava.com/news#spark-272-released • CWE-22: Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
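This entry concerns the Spark web micro framework (sparkjava.com), not Apache Spark. A generic containment check for static-file serving, sketched under the assumption of a simple files-under-a-root layout (not the framework's actual patch), looks like this:

    import java.nio.file.Path;
    import java.nio.file.Paths;

    // Resolve the requested name against the static root, normalize it, and
    // refuse anything that escapes the root -- a standard traversal guard.
    public class StaticFileGuard {
        public static Path resolveOrReject(Path staticRoot, String requested) {
            Path root = staticRoot.toAbsolutePath().normalize();
            Path resolved = root.resolve(requested).normalize();
            if (!resolved.startsWith(root)) {
                throw new IllegalArgumentException("path escapes static root: " + requested);
            }
            return resolved;
        }

        public static void main(String[] args) {
            Path root = Paths.get("public");                   // illustrative root
            System.out.println(resolveOrReject(root, "css/site.css"));
            // resolveOrReject(root, "../etc/passwd") would throw.
        }
    }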

CVSS: 7.8 | EPSS: 0% | CPEs: 9 | EXPL: 0

13 Sep 2017 — In Apache Spark 1.6.0 until 2.1.1, the launcher API performs unsafe deserialization of data received by its socket. This makes applications launched programmatically using the launcher API potentially vulnerable to arbitrary code execution by an attacker with access to any user account on the local machine. It does not affect apps run by spark-submit or spark-shell. The attacker would be able to execute code as the user that ran the Spark application. Users are encouraged to update to version 2.2.0 or later... • http://www.securityfocus.com/bid/100823 • CWE-502: Deserialization of Untrusted Data
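A generic sketch of restricted deserialization (the advisory's remedy is upgrading to 2.2.0 or later; this is not the launcher's actual fix): an ObjectInputStream subclass that only resolves classes on an explicit allow-list, so an attacker-supplied stream cannot instantiate arbitrary gadget classes.

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InvalidClassException;
    import java.io.ObjectInputStream;
    import java.io.ObjectStreamClass;
    import java.util.Set;

    // Only classes named in ALLOWED may be deserialized; everything else is rejected.
    public class AllowListObjectInputStream extends ObjectInputStream {
        private static final Set<String> ALLOWED = Set.of(
            "java.lang.String", "java.lang.Integer",
            "java.lang.Number"); // illustrative list; Number is Integer's serializable superclass

        public AllowListObjectInputStream(InputStream in) throws IOException {
            super(in);
        }

        @Override
        protected Class<?> resolveClass(ObjectStreamClass desc)
                throws IOException, ClassNotFoundException {
            if (!ALLOWED.contains(desc.getName())) {
                throw new InvalidClassException(desc.getName(), "class not on allow-list");
            }
            return super.resolveClass(desc);
        }
    }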