Uploading large files to proftpd through paramiko times out

Jacob Roberts

I've set up an SFTP server using proftpd on my local machine. It works fine, except that it times out when uploading files larger than approximately 30,000 bytes.

Uploading from the command line through proftpd works without any problems, and using paramiko to upload to a different SFTP server also works. This leads me to think there is a bug specifically in the interaction between paramiko and proftpd.

I've made a small script to illustrate the problem:

import paramiko

transport = paramiko.Transport(('localhost', 2220))  # my proftpd SFTP port
transport.connect(username='x', password='x')
client = paramiko.SFTPClient.from_transport(transport)
with open('testimage.jpg', 'rb') as f:  # 35241 bytes
    content = f.read()
with client.open('testimage.jpg', 'wb') as f:  # binary mode, since it's a JPEG
    f.write(content)
transport.close()

My SFTP-specific proftpd configuration:

<IfModule mod_sftp.c>
    <VirtualHost 0.0.0.0>
        Include            /etc/proftpd/conf.d
        SFTPEngine         on
        SFTPLog            /var/log/proftpd/sftp.log
        Port               2220
        SFTPHostKey        /etc/ssh/ssh_host_rsa_key
        SFTPHostKey        /etc/ssh/ssh_host_dsa_key
        SFTPAuthMethods    password
        SFTPCompression    delayed
        MaxLoginAttempts   3
    </VirtualHost>
</IfModule>

After about 10 minutes, the script exits with this traceback:

Traceback (most recent call last):
  File "ftptest.py", line 9, in <module>
    f.write(content)
  File "/Library/Python/2.7/site-packages/paramiko/file.py", line 330, in write
    self._write_all(data)
  File "/Library/Python/2.7/site-packages/paramiko/file.py", line 447, in _write_all
    count = self._write(data)
  File "/Library/Python/2.7/site-packages/paramiko/sftp_file.py", line 176, in _write
    self._reqs.append(self.sftp._async_request(type(None), CMD_WRITE, self.handle, long(self._realpos), data[:chunk]))
  File "/Library/Python/2.7/site-packages/paramiko/sftp_client.py", line 668, in _async_request
    self._send_packet(t, msg)
  File "/Library/Python/2.7/site-packages/paramiko/sftp.py", line 170, in _send_packet
    self._write_all(out)
  File "/Library/Python/2.7/site-packages/paramiko/sftp.py", line 135, in _write_all
    raise EOFError()
EOFError

Using paramiko 1.15 and proftpd 1.3.5

Jacob Roberts

proftpd's default SFTP channel window size of 4 GB was too large for paramiko to handle, causing the data transfer to stall.

The issue was resolved by adding this to the proftpd SFTP configuration:

SFTPClientMatch ".*" channelWindowSize 3999MB
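For what it's worth, I suspect the 4 GB default sits right at the edge of what the SSH protocol can express: channel window sizes are carried in a 32-bit unsigned field (RFC 4254, section 5.2), so a 4 GB window (2**32 bytes) overflows it, while 3999 MB does not, which would explain why the smaller value unsticks the transfer. A quick sanity check, plain Python, no paramiko needed:

```python
# SSH channel window sizes travel in a 32-bit unsigned field (RFC 4254,
# section 5.2), so the largest expressible window is 2**32 - 1 bytes.
UINT32_MAX = 2**32 - 1

four_gb = 4 * 1024**3    # proftpd's default channel window: 4 GB
capped = 3999 * 1024**2  # the workaround value: 3999 MB

print(four_gb > UINT32_MAX)    # True -- 4 GB overflows the field
print(capped <= UINT32_MAX)    # True -- 3999 MB fits
```

If you can't touch the server configuration, paramiko 1.15 also accepts a `default_window_size` argument on `Transport`, which should let you shrink the advertised window from the client side instead.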

