CVE-2019-20445 – netty: HttpObjectDecoder.java allows a Content-Length header to be accompanied by a second Content-Length header
https://notcve.org/view.php?id=CVE-2019-20445
HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header. A flaw was found in Netty before version 4.1.44: it accepted multiple Content-Length headers, and it accepted a Transfer-Encoding header alongside a Content-Length header, where it should have rejected the message in both cases. When Netty is used as a server, this can result in a viable HTTP request smuggling vulnerability (an illustrative framing check is sketched after this entry). • https://access.redhat.com/errata/RHSA-2020:0497 https://access.redhat.com/errata/RHSA-2020:0567 https://access.redhat.com/errata/RHSA-2020:0601 https://access.redhat.com/errata/RHSA-2020:0605 https://access.redhat.com/errata/RHSA-2020:0606 https://access.redhat.com/errata/RHSA-2020:0804 https://access.redhat.com/errata/RHSA-2020:0805 https://access.redhat.com/errata/RHSA-2020:0806 https://access.redhat.com/errata/RHSA-2020:0811 https://github.com/netty/netty/compare& • CWE-444: Inconsistent Interpretation of HTTP Requests ('HTTP Request/Response Smuggling') •
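For illustration only, the sketch below (not Netty's actual 4.1.44 patch; the class name, method name, and pre-lower-cased header map are assumptions made for this example) shows the kind of framing check a server can apply in the spirit of RFC 7230: reject any message that carries more than one Content-Length value, or both Content-Length and Transfer-Encoding, instead of silently picking one of the conflicting lengths.

import java.util.List;
import java.util.Map;

public final class FramingCheck {

    // Returns true if the request framing is ambiguous and the message should be rejected.
    static boolean isAmbiguousFraming(Map<String, List<String>> headers) {
        List<String> contentLength = headers.getOrDefault("content-length", List.of());
        List<String> transferEncoding = headers.getOrDefault("transfer-encoding", List.of());

        // More than one Content-Length value: conflicting body lengths, reject.
        if (contentLength.size() > 1) {
            return true;
        }
        // Content-Length alongside Transfer-Encoding: also ambiguous, reject.
        return !contentLength.isEmpty() && !transferEncoding.isEmpty();
    }

    public static void main(String[] args) {
        Map<String, List<String>> smuggled = Map.of(
                "content-length", List.of("44"),
                "transfer-encoding", List.of("chunked"));
        System.out.println(isAmbiguousFraming(smuggled));  // prints: true
    }
}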
CVE-2019-10172 – jackson-mapper-asl: XML external entity vulnerability similar to CVE-2016-3720
https://notcve.org/view.php?id=CVE-2019-10172
A flaw was found in the org.codehaus.jackson:jackson-mapper-asl:1.9.x libraries. XML external entity (XXE) vulnerabilities similar to CVE-2016-3720 also affect the Codehaus jackson-mapper-asl libraries, but in different classes (a general XXE-hardening sketch follows this entry). • https://github.com/rusakovichma/CVE-2019-10172 https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2019-10172 https://lists.apache.org/thread.html/r0066c1e862613de402fee04e81cbe00bcd64b64a2711beb9a13c3b25%40%3Ccommits.cassandra.apache.org%3E https://lists.apache.org/thread.html/r04ecadefb27cda84b699130b11b96427f1d8a7a4066d8292f7f15ed8%40%3Ccommon-issues.hadoop.apache.org%3E https://lists.apache.org/thread.html/r08e1b73fabd986dcd2ddd7d09480504d1472264bed2f19b1d2002a9c%40%3Ccommon-issues.hadoop.apache.org%3E https://lists.apache.org/thread.html/r0d8c3e32a0a2d8a0b6118f5 • CWE-611: Improper Restriction of XML External Entity Reference •
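As a general hardening sketch (this is not the jackson-mapper-asl fix itself, and the class name below is a hypothetical example), an application that parses untrusted XML with JAXP can refuse DOCTYPE declarations and external entities up front, which blocks the XXE pattern this CVE describes.

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public final class SafeXmlFactory {

    // Builds a JAXP factory that refuses DOCTYPE declarations and external entities.
    public static DocumentBuilderFactory newHardenedFactory() throws ParserConfigurationException {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        // Disallowing DOCTYPEs blocks most XXE payloads outright.
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        // Defence in depth: also disable external general and parameter entities.
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        return dbf;
    }
}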
CVE-2019-10099
https://notcve.org/view.php?id=CVE-2019-10099
Prior to Spark 2.3.3, in certain situations Spark would write user data to local disk unencrypted, even if spark.io.encryption.enabled=true. This includes cached blocks that are fetched to disk (controlled by spark.maxRemoteBlockSizeFetchToMem); in SparkR, using parallelize; in PySpark, using broadcast and parallelize; and use of Python UDFs (a configuration sketch follows this entry). • https://lists.apache.org/thread.html/c2a39c207421797f82823a8aff488dcd332d9544038307bf69a2ba9e%40%3Cuser.spark.apache.org%3E https://lists.apache.org/thread.html/ra216b7b0dd82a2c12c2df9d6095e689eb3f3d28164e6b6587da69fae%40%3Ccommits.spark.apache.org%3E https://lists.apache.org/thread.html/rabe1d47e2bf8b8f6d9f3068c8d2679731d57fa73b3a7ed1fa82406d2%40%3Cissues.spark.apache.org%3E • CWE-312: Cleartext Storage of Sensitive Information •
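A minimal configuration sketch, assuming a standard spark-core dependency, showing the two properties named above; on affected versions (before 2.3.3) some spill paths could still reach local disk unencrypted even with spark.io.encryption.enabled=true, so upgrading is the actual remediation. The application name and the 200m threshold below are illustrative values only.

import org.apache.spark.SparkConf;

public final class SparkEncryptionConfig {

    // Returns a SparkConf carrying the settings referenced in the advisory.
    public static SparkConf hardenedConf() {
        return new SparkConf()
                .setAppName("io-encryption-demo")                    // illustrative name
                // Ask Spark to encrypt local disk I/O for shuffle and spilled blocks.
                .set("spark.io.encryption.enabled", "true")
                // Remote blocks larger than this are fetched to disk; on vulnerable
                // versions those fetched blocks could be written unencrypted.
                .set("spark.maxRemoteBlockSizeFetchToMem", "200m");  // illustrative threshold
    }
}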
CVE-2018-11760
https://notcve.org/view.php?id=CVE-2018-11760
When using PySpark, it is possible for a different local user to connect to the Spark application and impersonate the user running the Spark application. This affects versions 1.x, 2.0.x, 2.1.x, 2.2.0 to 2.2.2, and 2.3.0 to 2.3.1. • http://www.securityfocus.com/bid/106786 https://lists.apache.org/thread.html/6d015e56b3a3da968f86e0b6acc69f17ecc16b499389e12d8255bf6e%40%3Ccommits.spark.apache.org%3E https://lists.apache.org/thread.html/a86ee93d07b6f61b82b61a28049aed311f5cc9420d26cc95f1a9de7b%40%3Cuser.spark.apache.org%3E •
CVE-2018-11804
https://notcve.org/view.php?id=CVE-2018-11804
Spark's Apache Maven-based build includes a convenience script, 'build/mvn', that downloads and runs a zinc server to speed up compilation. It has been included in release branches since 1.3.x, up to and including master. This server will accept connections from external hosts by default. A specially-crafted request to the zinc server could cause it to reveal information in files readable to the developer account running the build. Note that this issue does not affect end users of Spark, only developers building Spark from source code. • http://www.securityfocus.com/bid/105756 https://lists.apache.org/thread.html/2b11aa4201e36f2ec8f728e722fe33758410f07784379cbefd0bda9d%40%3Cdev.spark.apache.org%3E https://spark.apache.org/security.html#CVE-2018-11804 •