CVE-2017-5638 PoC — Apache Struts 2 Input Validation Error Vulnerability

Associated Vulnerability
Title: Apache Struts 2 Input Validation Error Vulnerability (CVE-2017-5638)
Description: The Jakarta Multipart parser in Apache Struts 2 2.3.x before 2.3.32 and 2.5.x before 2.5.10.1 has incorrect exception handling and error-message generation during file-upload attempts, which allows remote attackers to execute arbitrary commands via a crafted Content-Type, Content-Disposition, or Content-Length HTTP header, as exploited in the wild in March 2017 with a Content-Type header containing a #cmd= string.
Description
CVE-2017-5638
Readme
# S2-Reaper

This project collects URLs vulnerable to Struts2 S2-045 (CVE-2017-5638) from Google search results.

## Usage

```
python reaper.py
```

## About

`reaper.py` runs a Google search crawler with the keywords defined in `crawler.conf` to find vulnerable URLs.
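
The crawl logic itself lives in `reaper.py`; the sketch below only illustrates the general approach using the project's listed dependencies (`requests` and `beautifulsoup4`). The function name, default search URL, and result-link parsing are assumptions for illustration, not the repository's actual code.

```
import urllib.parse

import requests
from bs4 import BeautifulSoup


def google_search(keyword, base_url="https://www.google.com/search", pages=1):
    """Fetch Google result pages for `keyword` and collect the target URLs.

    Illustrative only: Google's markup changes often and automated queries
    are rate-limited, so a real crawler needs rotating user agents, delays,
    and optionally a proxy -- which is what crawler.conf configures.
    """
    found = []
    for page in range(pages):
        resp = requests.get(
            base_url,
            params={"q": keyword, "start": page * 10},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=10,
        )
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            # In the plain-HTML layout, result links look like /url?q=<target>&...
            if a["href"].startswith("/url?q="):
                query = urllib.parse.urlparse(a["href"]).query
                target = urllib.parse.parse_qs(query).get("q", [""])[0]
                if target.startswith("http"):
                    found.append(target)
    return found


# e.g. google_search("site:gov ext:action", pages=3)
```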

### `crawler.conf`

`base_url` : the base Google search URL

`keyword` : the search keywords, e.g. `site:gov ext:action`

`expect_num` : the expected number of search results to crawl

`http/socks` : set an HTTP/SOCKS5 proxy for the crawler
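
How `reaper.py` actually parses these options is not shown here; the snippet below is only a hypothetical sketch of loading them, assuming an INI-style file with a single `[crawler]` section. The real `crawler.conf` may be laid out differently, so check the file shipped with the repository.

```
import configparser

# Hypothetical loader for the options listed above; the section name and
# file layout are assumptions.
config = configparser.ConfigParser()
config.read("crawler.conf")

base_url = config.get("crawler", "base_url", fallback="https://www.google.com/search")
keyword = config.get("crawler", "keyword", fallback="site:gov ext:action")
expect_num = config.getint("crawler", "expect_num", fallback=100)

print(base_url, keyword, expect_num)
```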

## Dependence

Run the following command to install the required packages.

```
pip install beautifulsoup4 requests
```

If you want to use a SOCKS5 proxy, also install `requests[socks]` with pip.

```
pip install requests[socks]
```
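
With `requests[socks]` installed, a SOCKS5 proxy is passed to `requests` the same way as an HTTP proxy, just with a `socks5://` scheme (or `socks5h://` to resolve DNS through the proxy). The address below is a placeholder:

```
import requests

# Placeholder proxy address; point this at your own SOCKS5 endpoint,
# which is what the http/socks option in crawler.conf is for.
proxies = {
    "http": "socks5h://127.0.0.1:1080",
    "https": "socks5h://127.0.0.1:1080",
}

resp = requests.get("https://www.google.com", proxies=proxies, timeout=10)
print(resp.status_code)
```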

## Reference

> https://github.com/meibenjin/GoogleSearchCrawler
>
> http://www.freebuf.com/sectool/129224.html
File Snapshot

```
[4.0K]  /data/pocs/e528ff405a69443ad29617754c49701c65037dec
├── [ 189]  crawler.conf
├── [ 34K]  LICENSE
├── [ 853]  README.md
├── [ 10K]  reaper.py
└── [ 690]  user_agents.txt

0 directories, 5 files
```