I have run into a problem that has me completely stumped. I'm not sure whether it's a hardware issue or a software issue at this point. Recently my server was moved to a new data center. When I logged in after trying to synchronize with my local file system over SFTP, I saw a bunch of SSH processes. That's usually normal, so I didn't think anything of it. What I didn't notice, though, was that the synchronization was actually failing.
Right now, I cannot copy any files to the server, either through SFTP or through a web application (Trac in this case). In both cases the transfer fails. With SFTP, the file gets created but none of the data is transferred, so I end up with an empty file. When uploading a file to Trac, I can transfer about 500 KB before it stops and won't transfer any more, which seems like just enough to fill a buffer and no more.
Downloads from the server work just fine. I just downloaded a 100 MB file from the server using SFTP with no problems; it was a little slow, but it worked.
However, if I try to download files to the server, such as using fetch with a URL or freebsd-update fetch, it fails outright and won't download anything.

It seems like data going to the server is hosed, but data coming from it is fine. I've verified that there aren't any issues with the disk, and there doesn't seem to be any packet loss. It also doesn't seem to be purely a network issue, because HTTP and the like work fine for loading web pages, just not for uploading files.

I'm stumped. Anyone have any ideas on how I can debug this? I can SSH into the server; I just don't have console access at the moment.
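One theory I haven't been able to rule out yet is a path-MTU black hole introduced by the move (small packets get through, full-size ones silently vanish, which would match small transfers working and bulk transfers stalling). A sketch of what I was going to try, assuming a standard 1500-byte Ethernet MTU and with the server address as a placeholder:

```shell
# Probe for a path-MTU black hole after the data center move.
# Assumption: a standard 1500-byte Ethernet MTU; adjust if the link differs.
MTU=1500
IP_HEADER=20        # bytes, IPv4 header without options
ICMP_HEADER=8       # bytes, ICMP echo header

# Largest ICMP payload that fits in one packet with Don't Fragment set.
PAYLOAD=$((MTU - IP_HEADER - ICMP_HEADER))
echo "largest DF payload: $PAYLOAD bytes"

# On FreeBSD, ping's -D flag sets the Don't Fragment bit and -s sets the
# payload size, so (server address is a placeholder):
#   ping -D -s $PAYLOAD myserver.example   # should fail if the path MTU < $MTU
#   ping -D -s 1200 myserver.example       # should still succeed
```

If the full-size probe fails while the smaller one succeeds, that would point at a link along the new path with a smaller MTU and broken PMTU discovery rather than anything on the server itself.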