FreeBSD Jailed SFTP server for someone to upload files

Hello,

I would like to discuss with you the possibilities for a jailed SFTP server on FreeBSD that lets someone upload files.

What do you think about this method?
Maybe it could be made easier to set up on FreeBSD, so that a package handles it and prompts for the details on the CLI.

There are situations when you have a nice server out there and you want/need someone to upload important files, but you only want to give them minimal access to the system. You can use sshd with SFTP and the /sbin/nologin shell for that in a chroot environment (a dedicated, limited userspace). Note that SCP in fact requires a working shell, so you need to use SFTP in this case.

Create a new user account in the existing ftp group and with the /sbin/nologin shell:
Code:
# adduser
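If you prefer a non-interactive setup, something along these lines should also work (the username and comment are placeholders; note that this makes ftp the primary group, which still matches the Match Group rule below). Set a password afterwards with passwd:
Code:
# pw useradd username -m -g ftp -s /sbin/nologin -c "SFTP upload account"
# passwd username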

Alternatively, you can modify an existing account to share:
Code:
# pw groupmod ftp -m username
# pw usermod username -s /sbin/nologin
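You can double-check the resulting group membership and shell, for example:
Code:
# pw usershow username
# id username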

You need to set correct permissions on the user's home directory and create a public directory inside it for uploads:
Code:
# chown root:wheel /home/username
# mkdir /home/username/public
# chown username:ftp /home/username/public
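Note that sshd's ChrootDirectory requires the chroot target and all of its parent directories to be owned by root and not writable by group or others, so the directory modes matter; something like this should satisfy that (the modes are only a suggestion):
Code:
# chmod 755 /home/username
# chmod 770 /home/username/public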

Now modify the sshd configuration file /etc/ssh/sshd_config and append:
Code:
Match Group ftp
        ChrootDirectory         /home/%u
        ForceCommand            internal-sftp
        AllowTcpForwarding      no
        PermitTunnel            no
        X11Forwarding           no
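
Before restarting, you can check the file for syntax errors, e.g.:
Code:
# sshd -t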

Remember to restart sshd in order to apply the new configuration:
Code:
# service sshd restart
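From the client side the setup can then be tested with the stock sftp client; something like this, where the host and file names are placeholders:
Code:
$ sftp username@server.example.com
sftp> cd public
sftp> put important-file.tar.gz
sftp> quit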
Once the account is not necessary anymore, remember to remove it:

Code:
# pw userdel username
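If you also want to delete the home directory (and anything uploaded into public) at the same time, pw can do that as well:
Code:
# pw userdel username -r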

Ref:
 
Thanks for bringing up this issue. I have been giving it some thought, at a time when most self-hosted cloud services we use would not upload a large file until we used SFTP and then a CLI command to complete the process.

All the tricks, like raising the file size limits in php.ini, etc., didn't work.
I would rather find a simple SFTP client that does the upload. However, how many 'professionals' support and use this approach?
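For reference, the php.ini directives that usually govern upload size are shown below; the values are only an example, not a recommendation:
Code:
; php.ini (example values)
upload_max_filesize = 2G
post_max_size = 2G
memory_limit = 512M
max_execution_time = 300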
 
Depends how large your "large" files are.
I am (ab)using the picture upload in a web application to upload files; that works until the machine runs out of heap memory or the file gets too big for the database to store, somewhere around 1 GB.
 
Before we discuss the "how", let's discuss the "what" and "why".

You want people to upload files. Who is going to be allowed to upload? Anyone in the world? That opens the possibility of massive DoS attacks. If not everyone, how are you going to authenticate the users? How are you going to make the user authentication secure enough?

Even ignoring the possibility of malicious DoS attacks: you will need to control the data volume that is uploaded, otherwise uploads have the potential of running you out of disk space, which will "crash" your machine (perhaps not a full OS crash, but at least make it not work correctly any longer). How do you plan to do that? What is the data volume that uploaders really need? Is the storage long-term or short-term?
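As one possible sketch on FreeBSD, assuming each uploader's home directory is its own ZFS dataset (the pool and dataset names here are invented), a quota keeps a single user from filling the disk:
Code:
# zfs create -o quota=1G zroot/home/username
# zfs set quota=1G zroot/home/username
The first line creates the dataset with the quota already in place; the second sets a quota on a dataset that already exists.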

What is the protection for uploaded files? Are you going to allow people to re-read the stuff they have uploaded? Are you going to allow people other than the original uploader to read stuff? Depending on the answer, you will need a method of not only authenticating users, but differentiating users (Adam cannot read stuff that Bob uploaded, so you need to know how to tell Adam and Bob apart).

If you allow people to upload and re-download, then your service is likely to be used as an intermediate stash for illegal activities, like sharing of pirated files, child pornography, and to clandestinely transmit information between parties. I would put the phone number of a good criminal defense attorney on your speed-dial, you'll need it.

How big are the files going to be? Say for example you give users a quota of 1GB each. What happens if someone uploads 1GB of data, in the form of one billion 1-byte files? What happens if someone tries to upload a single 1TB file (which is easy with modern disks to store, and easy with modern network bandwidth to transmit, but you need to use restartable protocols, because a 1TB single-stream upload is likely to be interrupted)? Also, if you want to upload a file that big (which is not unusually big today), you probably need to use a protocol that allows for parallel upload over multiple sockets, since otherwise your TCP stack might become the limiting factor. Such protocols exist, but they are not SFTP, and they are based on http(s), which you hate for no logical reason.

Finally, what is the purpose of the uploaded files? How are you going to process them, or make them accessible?

Once you have a good idea about purpose and goals and "speeds and feeds", we can start discussing the how.
 
The simple SFTP is the best, because it is 100 percent free! You are the owner of your data, not Google.
 