AWVS13 Batch Scan Script
You can reach me through the channels below. Please credit the original source when reposting:
- Zhihu: Sp4rkW
- GitHub: Sp4rkW
- Bilibili: 一只技術君
- Blog: https://sp4rkw.blog.csdn.net/
- Email: getf_own@163.com
Preface
I have recently been reworking reaper's AWVS integration. My server is weak and can only scan four sites at a time before it locks up, so the existing batch script needed to be adjusted. The logic is simple:
- Take the web assets and start the scan task asynchronously from Django
- Pop the first four targets from the list, push them into AWVS, and scan them in slow mode
- Check the number of running scans once a minute; whenever fewer than 4 are running, add new targets to bring it back up to 4
- Repeat until the list is empty
The Django part of the code is omitted; I have extracted the AWVS part below for anyone to use.
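Purely for illustration (this is not the omitted Django code, just a hypothetical sketch): one simple way to fire the scan job off asynchronously is a background thread, with `awvs_reaper` being the function defined in the full script at the end of this post.

```python
# Hypothetical sketch only: a Django view that hands the target list off to a
# background thread so the HTTP request returns immediately.
import threading

from django.http import JsonResponse

def start_batch_scan(request):
    # Assume the web assets arrive as a newline-separated POST field "targets".
    domainlist = [line.strip() for line in request.POST.get("targets", "").splitlines() if line.strip()]
    # awvs_reaper is defined in the full script later in this post.
    threading.Thread(target=awvs_reaper, args=(domainlist,), daemon=True).start()
    return JsonResponse({"queued": len(domainlist)})
```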
Core endpoints
Dashboard endpoint
Method: GET
URL: /api/v1/me/stats
Response fields
| Field | Description |
| --- | --- |
| most_vulnerable_targets | Most vulnerable targets |
| scans_conducted_count | Total number of scans conducted |
| scans_running_count | Number of scans currently running |
| scans_waiting_count | Number of scans waiting to run |
| targets_count | Total number of targets |
| top_vulnerabilities | Distribution of the top vulnerabilities |
| vuln_count | Vulnerability counts (high / med / low) |
| vuln_count_by_criticality | Vulnerability counts grouped by severity |
| vulnerabilities_open_count | Total number of open vulnerabilities |
```python
import requests

# Query the dashboard statistics endpoint.
api_running_url = 'https://x/api/v1/me/stats'
headers = {'X-Auth': 'x', 'Content-type': 'application/json'}
r = requests.get(url=api_running_url, headers=headers, verify=False).json()
print(r['scans_running_count'])
```
Sample response:
```
{'most_vulnerable_targets': [], 'scans_conducted_count': 0, 'scans_running_count': 0, 'scans_waiting_count': 0, 'targets_count': 0, 'top_vulnerabilities': [], 'vuln_count': {'high': None, 'low': None, 'med': None}, 'vuln_count_by_criticality': {'critical': None, 'high': None, 'low': None, 'normal': None}, 'vulnerabilities_open_count': 0}
```
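Since the batch script later in this post keys everything off `scans_running_count`, a small helper that polls this endpoint until there is free capacity can be handy. A minimal sketch, assuming the same base URL and X-Auth token as above (the function and parameter names are my own, not part of the AWVS API):

```python
import time
import requests

def wait_for_capacity(base_url, token, max_running=4, interval=60):
    """Poll /api/v1/me/stats until fewer than max_running scans are active."""
    headers = {'X-Auth': token, 'Content-type': 'application/json'}
    while True:
        stats = requests.get(url=base_url + '/api/v1/me/stats',
                             headers=headers, verify=False).json()
        if int(stats['scans_running_count']) < max_running:
            return stats
        time.sleep(interval)  # check once a minute, like the batch script below
```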
Add target endpoint
Method: POST
URL: /api/v1/targets
Request parameters
| Parameter | Type | Description |
| --- | --- | --- |
| address | string | Target URL; must start with http or https |
| criticality | int | Criticality; allowed values: 30, 20, 10, 0; default 10 |
| description | string | Notes |
```python
api_add_url = "https://x/api/v1/targets"
headers = {'X-Auth': 'x', 'Content-type': 'application/json'}
data = '{"address":"http://vulnweb.com/","description":"create_by_reaper","criticality":"10"}'
r = requests.post(url=api_add_url, headers=headers, data=data, verify=False).json()
print(r)
```
Response fields
| Field | Description |
| --- | --- |
| address | Target URL |
| criticality | Criticality |
| description | Notes |
| type | Type |
| domain | Domain |
| target_id | Target id |
| target_type | Target type |
| canonical_address | Root domain |
| canonical_address_hash | Hash of the root domain |
Sample response:
```
{'address': 'http://vulnweb.com/', 'criticality': 10, 'description': 'create_by_reaper', 'type': 'default', 'domain': 'vulnweb.com', 'target_id': '13564b22-7fd8-46d5-b10f-3c87a6cc6afa', 'target_type': None, 'canonical_address': 'vulnweb.com', 'canonical_address_hash': '823a9c89d4aea02ab8a4f5d31fd603c7'}
```
Set the scan speed
Method: PATCH
URL: /api/v1/targets/{target_id}/configuration
Request parameters
| Parameter | Type | Description |
| --- | --- | --- |
| scan_speed | string | From slowest to fastest: sequential, slow, moderate, fast |
```python
api_speed_url = "https://x/api/v1/targets/{}/configuration".format(target_id)
data = json.dumps({"scan_speed": "sequential"})
r = requests.patch(url=api_speed_url, headers=headers, data=data, verify=False)
print(r)
```
Sample response:
```
<Response [204]>
```
Start a scan
Method: POST
URL: /api/v1/scans
Request parameters
| Parameter | Type | Description |
| --- | --- | --- |
| profile_id | string | Scan profile |
| ui_session_id | string | Optional |
| schedule | json | Scan schedule (defaults to immediate) |
| report_template_id | string | Report template (optional) |
| target_id | string | Target id |
Scan profile values (descriptions as translated by 國光):
| Profile | profile_id | Description |
| --- | --- | --- |
| Full Scan | 11111111-1111-1111-1111-111111111111 | Full scan |
| High Risk Vulnerabilities | 11111111-1111-1111-1111-111111111112 | High-risk vulnerabilities |
| Cross-site Scripting Vulnerabilities | 11111111-1111-1111-1111-111111111116 | XSS vulnerabilities |
| SQL Injection Vulnerabilities | 11111111-1111-1111-1111-111111111113 | SQL injection vulnerabilities |
| Weak Passwords | 11111111-1111-1111-1111-111111111115 | Weak password detection |
| Crawl Only | 11111111-1111-1111-1111-111111111117 | Crawl only |
| Malware Scan | 11111111-1111-1111-1111-111111111120 | Malware scan |
```python
# Start a Full Scan against the target created earlier
# (api_run_url points at /api/v1/scans).
data = '{"profile_id":"11111111-1111-1111-1111-111111111111","schedule":{"disable":false,"start_date":null,"time_sensitive":false},"target_id":"%s"}' % target_id
r = requests.post(url=api_run_url, headers=headers, data=data, verify=False).json()
print(r)
```
Sample response:
```
{'profile_id': '11111111-1111-1111-1111-111111111111', 'schedule': {'disable': False, 'start_date': None, 'time_sensitive': False, 'triggerable': False}, 'target_id': '13564b22-7fd8-46d5-b10f-3c87a6cc6afa', 'incremental': False, 'max_scan_time': 0, 'ui_session_id': None}
```
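If you want to launch profiles other than a Full Scan, keeping the IDs from the table above in a small lookup dict keeps the request building readable. A minimal sketch (the dict and helper names are my own, not part of the AWVS API):

```python
import json
import requests

# Profile IDs taken from the table above.
SCAN_PROFILES = {
    "full": "11111111-1111-1111-1111-111111111111",
    "high_risk": "11111111-1111-1111-1111-111111111112",
    "xss": "11111111-1111-1111-1111-111111111116",
    "sqli": "11111111-1111-1111-1111-111111111113",
    "weak_passwords": "11111111-1111-1111-1111-111111111115",
    "crawl_only": "11111111-1111-1111-1111-111111111117",
    "malware": "11111111-1111-1111-1111-111111111120",
}

def start_scan(api_run_url, headers, target_id, profile="full"):
    """POST /api/v1/scans for one target with the chosen profile."""
    data = json.dumps({
        "profile_id": SCAN_PROFILES[profile],
        "schedule": {"disable": False, "start_date": None, "time_sensitive": False},
        "target_id": target_id,
    })
    return requests.post(url=api_run_url, headers=headers, data=data, verify=False).json()
```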
The full script
```python
import requests
import json
import time
from requests.packages.urllib3.exceptions import InsecureRequestWarning

# AWVS ships a self-signed certificate, so silence the verify=False warnings.
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)

awvs_token = 'xxx'   # X-Auth API key
website = ""         # AWVS base URL

def awvs_reaper(domainlist):
    headers = {'X-Auth': awvs_token, 'Content-type': 'application/json;charset=utf8'}
    api_running_url = website + '/api/v1/me/stats'
    api_add_url = website + "/api/v1/targets"
    api_run_url = website + "/api/v1/scans"

    # Register every domain as a target and record its target_id.
    target_list = []
    for target in domainlist:
        data = '{"address":"%s","description":"create_by_reaper","criticality":"10"}' % target
        r = requests.post(url=api_add_url, headers=headers, data=data, verify=False).json()
        target_id = r['target_id']
        # Set the scan speed for this target (use "slow"/"sequential" on a weak server,
        # as mentioned in the preface).
        api_speed_url = website + "/api/v1/targets/{}/configuration".format(target_id)
        data = json.dumps({"scan_speed": "fast"})
        requests.patch(url=api_speed_url, headers=headers, data=data, verify=False)
        target_list.append(target_id)

    target_num = len(target_list)
    if target_num <= 4:
        # Few enough targets: start them all at once.
        for target_id in target_list:
            data = '{"profile_id":"11111111-1111-1111-1111-111111111111","schedule":{"disable":false,"start_date":null,"time_sensitive":false},"target_id":"%s"}' % target_id
            requests.post(url=api_run_url, headers=headers, data=data, verify=False).json()
    else:
        # More than 4 targets: keep at most 4 scans running, checking once a minute.
        r = requests.get(url=api_running_url, headers=headers, verify=False).json()
        runnum = int(r['scans_running_count'])
        flag = 0
        while flag < target_num:
            if runnum < 4:
                # There is free capacity: start the next target in the list.
                target_id = target_list[flag]
                flag = flag + 1
                data = '{"profile_id":"11111111-1111-1111-1111-111111111111","schedule":{"disable":false,"start_date":null,"time_sensitive":false},"target_id":"%s"}' % target_id
                requests.post(url=api_run_url, headers=headers, data=data, verify=False).json()
            # Refresh the running count every iteration so finished scans free up a slot.
            r = requests.get(url=api_running_url, headers=headers, verify=False).json()
            runnum = int(r['scans_running_count'])
            time.sleep(60)
    return target_num

if __name__ == "__main__":
    domainlist = ['http://10086.1.com', 'http://10087.1.com', 'http://10088.1.com']
    awvs_reaper(domainlist)
```
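To feed the script a larger asset list, one option is to read the targets from a text file, one URL per line. A small usage sketch (the targets.txt filename is just an example, not something the original script expects):

```python
# Hypothetical usage: load one URL per line from targets.txt and hand the
# list to awvs_reaper() from the script above.
with open("targets.txt", "r", encoding="utf-8") as f:
    domainlist = [line.strip() for line in f if line.strip()]

awvs_reaper(domainlist)
```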