
CVSS: 9.8 | EPSS: 0% | CPEs: 3 | EXPL: 0

Apache Hadoop's FileUtil.unTar(File, File) API does not escape the input file name before it is passed to the shell, so an attacker can inject arbitrary commands. In Hadoop 3.3 this is only used by InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user. It was used in Hadoop 2.x for YARN localization, which does enable remote code execution. It is also used in Apache Spark, from the SQL command ADD ARCHIVE. • https://lists.apache.org/thread/mxqnb39jfrwgs3j6phwvlrfq4mlox130 https://security.netapp.com/advisory/ntap-20220915-0007 • CWE-78: Improper Neutralization of Special Elements used in an OS Command ('OS Command Injection') •
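A minimal sketch of the CWE-78 pattern behind this entry: splicing an attacker-controlled file name into a shell command line versus passing it as a discrete argument. The class and method names below are hypothetical and this is not Hadoop's actual source, only an illustration of the bug class.

```java
import java.io.File;
import java.io.IOException;

public class UntarSketch {
    // Vulnerable pattern: the file name is concatenated into a shell command,
    // so a name such as "x.tar; touch /tmp/pwned" runs an extra command.
    static void untarViaShell(File inFile, File untarDir) throws IOException {
        String cmd = "cd '" + untarDir + "' && tar -xf " + inFile.getAbsolutePath();
        new ProcessBuilder("bash", "-c", cmd).inheritIO().start();
    }

    // Safer pattern: pass the name as a separate argument so the shell never
    // interprets it (or avoid shelling out entirely, as a pure-Java untar does).
    static void untarAsArgument(File inFile, File untarDir) throws IOException {
        new ProcessBuilder("tar", "-xf", inFile.getAbsolutePath(),
                "-C", untarDir.getAbsolutePath()).inheritIO().start();
    }
}
```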

CVSS: 9.8 | EPSS: 1% | CPEs: 4 | EXPL: 1

In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory which points to an external directory, and a subsequent TAR entry may extract an arbitrary file into that external directory using the symlink name. On Unix this is caught by the targetDirPath check because of the getCanonicalPath call; on Windows, however, getCanonicalPath does not resolve symbolic links, which bypasses the check. unpackEntries during TAR extraction follows symbolic links, which allows writing outside the expected base directory on Windows. • https://lists.apache.org/thread/hslo7wzw2449gv1jyjk8g6ttd7935fyz https://security.netapp.com/advisory/ntap-20220519-0004 • CWE-59: Improper Link Resolution Before File Access ('Link Following') •
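An illustrative sketch of the extraction guard described above (hypothetical names, not Hadoop's actual code). The guard is only as strong as path canonicalization: if getCanonicalPath() does not resolve a symlink created by an earlier TAR entry, a later entry can still land outside the base directory.

```java
import java.io.File;
import java.io.IOException;

public class ExtractionGuard {
    // Returns true if the entry may be written, i.e. it resolves inside baseDir.
    static boolean isWithinBase(File baseDir, String entryName) throws IOException {
        String basePath = baseDir.getCanonicalPath() + File.separator;
        File target = new File(baseDir, entryName);
        // On Unix, getCanonicalPath() follows symlinks, so an entry routed
        // through a malicious link resolves to its real (outside) location
        // and fails this prefix check. On Windows the link is not resolved,
        // the prefix check passes, and the write escapes baseDir.
        return target.getCanonicalPath().startsWith(basePath);
    }
}
```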

CVSS: 8.8 | EPSS: 1% | CPEs: 7 | EXPL: 0

In Apache Hadoop 3.2.0 to 3.2.1, 3.0.0-alpha1 to 3.1.3, and 2.0.0-alpha to 2.10.0, the WebHDFS client might send the SPNEGO authorization header to a remote URL without proper verification, which could lead to an access restriction bypass. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability. • https://lists.apache.org/thread.html/r0a534f1cde7555f7208e9f9b791c1ab396d215eaaef283b3a9153429%40%3Ccommits.druid.apache.org%3E https://lists.apache.org/thread.html/r49c9ab444ab1107c6a8be8a0d66602dec32a16d96c2631fec8d309fb%40%3Cissues.solr.apache.org%3E https://lists.apache.org/thread.html/r4a57de5215494c35c8304cf114be75d42df7abc6c0c54bf163c3e370%40%3Cissues.solr.apache.org%3E https://lists.apache.org/thread.html/r513758942356ccd0d14538ba18a09903fc72716d74be1cb727ea91ff%40%3Cgeneral.hadoop.apache.org%3E https://lists.apache.org/thread.html/r6341f2a468ced8872a71997aa1786ce036242413484f0fa68dc9ca02% • CWE-863: Incorrect Authorization •
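A hedged sketch of the class of check that was missing (the helper and parameter names are hypothetical, not the actual WebHDFS client code): a negotiated SPNEGO Authorization header should only be attached when the target host is the one the client authenticated to, and should not be blindly replayed on redirects.

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class SpnegoHeaderGuard {
    static void openWithAuth(URL url, String trustedHost, String spnegoToken)
            throws java.io.IOException {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Do not automatically follow redirects with the credential attached.
        conn.setInstanceFollowRedirects(false);
        // Only send the negotiated token to the host we actually authenticated to.
        if (url.getHost().equalsIgnoreCase(trustedHost)) {
            conn.setRequestProperty("Authorization", "Negotiate " + spnegoToken);
        }
        conn.connect();
    }
}
```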

CVSS: 7.5 | EPSS: 0% | CPEs: 27 | EXPL: 0

In Apache Hadoop 3.1.0 to 3.1.1, 3.0.0-alpha1 to 3.0.3, 2.9.0 to 2.9.1, and 2.0.0-alpha to 2.8.4, the user/group information can be corrupted across storing in fsimage and reading back from fsimage. • https://lists.apache.org/thread.html/2067a797b330530a6932f4b08f703b3173253d0a2b7c8c524e54adaf%40%3Cgeneral.hadoop.apache.org%3E https://lists.apache.org/thread.html/2c9cc65864be0058a5d5ed2025dfb9c700bf23d352b0c826c36ff96a%40%3Chdfs-dev.hadoop.apache.org%3E https://lists.apache.org/thread.html/72ca514e01cd5f08151e74f9929799b4cbe1b6e9e6cd24faa72ffcc6%40%3Cdev.lucene.apache.org%3E https://lists.apache.org/thread.html/9b609d4392d886711e694cf40d86f770022baf42a1b1aa97e8244c87%40%3Cdev.lucene.apache.org%3E https://lists.apache.org/thread.html/caacbbba2dcc1105163f76f3dfee5fbd22e0417e0783212787086378%4 • CWE-119: Improper Restriction of Operations within the Bounds of a Memory Buffer •
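A generic illustration of the failure class only (CWE-119-style bounds issues on a store/read round trip); the encoding below is hypothetical and is not the fsimage format: when a value is written into a field too narrow for it, it silently wraps, and the data read back no longer matches what was stored.

```java
public class RoundTripSketch {
    public static void main(String[] args) {
        int userId = 300;             // value associated with a user/group name
        byte stored = (byte) userId;  // width-limited field: 300 wraps to 44
        int readBack = stored & 0xFF; // reading back yields 44, not 300
        System.out.println("stored id " + userId + " read back as " + readBack);
    }
}
```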

CVSS: 8.8 | EPSS: 1% | CPEs: 12 | EXPL: 1

Apache Hadoop 3.1.0, 3.0.0-alpha to 3.0.2, 2.9.0 to 2.9.1, 2.8.0 to 2.8.4, 2.0.0-alpha to 2.7.6, and 0.23.0 to 0.23.11 are exploitable via the zip slip vulnerability in places that accept a zip file. • http://www.securityfocus.com/bid/105927 https://access.redhat.com/errata/RHSA-2019:3892 https://hadoop.apache.org/cve_list.html#cve-2018-8009-http-cve-mitre-org-cgi-bin-cvename-cgi-name-cve-2018-8009-zip-slip-impact-on-apache-hadoop https://lists.apache.org/thread.html/708d94141126eac03011144a971a6411fcac16d9c248d1d535a39451%40%3Csolr-user.lucene.apache.org%3E https://lists.apache.org/thread.html/a1c227745ce30acbcf388c5b0cc8423e8bf495d619cd0fa973f7f38d%40%3Cuser.hadoop.apache.org%3E https://lists.apache.org/thread.html/r4dd • CWE-20: Improper Input Validation CWE-22: Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal') •
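A minimal sketch of the zip slip pattern and the usual guard (illustrative, with hypothetical names; not Hadoop's code): an entry named "../../outside.txt" resolves above the extraction directory unless each entry is validated before writing.

```java
import java.io.File;
import java.io.IOException;

public class ZipSlipGuard {
    // Resolve the entry against the destination and reject anything that
    // escapes it; a "../"-laden entry name fails this check.
    static File safeResolve(File destDir, String entryName) throws IOException {
        File target = new File(destDir, entryName);
        String destPath = destDir.getCanonicalPath() + File.separator;
        if (!target.getCanonicalPath().startsWith(destPath)) {
            throw new IOException("Blocked zip-slip entry: " + entryName);
        }
        return target;
    }
}
```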