Storing user-uploaded files on a web server

I am working on a website that allows users to upload files (pictures, etc.). I have no experience in this area, and I was hoping to get some input on the right way to store and index these files.

Although I would like an architecture that scales well for large amounts of data, I am not currently worried about extremely high (Facebook- or Google-scale) volumes.

I was thinking about storing files in the file system in

/files/{username}/ 

And then having an uploads database where each user has their own table containing the file names (and therefore URLs) of every file they uploaded, plus any other information I might want to save. Giving each user their own table seems very inefficient to me, but keeping records of all files in one table does not seem right either, since that would mean searching the entire table every time a file is accessed.

My reasoning for giving each user their own table was that it seemed like a neat, clear way to organize the data and would reduce search time when looking up a given user's files.
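For concreteness, this is roughly the file-system side of what I have in mind (a rough sketch; the root path and helper name are just placeholders):

```python
import os
import uuid

FILES_ROOT = "/files"  # placeholder root for the /files/{username}/ layout

def save_upload(username: str, original_name: str, data: bytes) -> str:
    """Save an uploaded file under /files/{username}/ and return its stored path."""
    user_dir = os.path.join(FILES_ROOT, username)
    os.makedirs(user_dir, exist_ok=True)        # create the per-user folder on first upload
    ext = os.path.splitext(original_name)[1]    # keep the original extension
    stored_name = uuid.uuid4().hex + ext        # unique name avoids collisions
    path = os.path.join(user_dir, stored_name)
    with open(path, "wb") as f:
        f.write(data)
    return path
```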

+7
2 answers

What Matt H suggests is a good idea if what you are trying to achieve is user-level access to the images. But if you are limited on database storage, keeping the images themselves as binary data in the database is inefficient, as stated.

Using a table per user is poor design. The user who uploaded the file should simply be a field/column in a single table that stores all file uploads, along with any file metadata. I suggest generating a GUID for the file name: it is guaranteed to be unique and is better than an auto-increment value, which is easy to guess if you are trying to prevent users from simply enumerating all the images.
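As a sketch, a single uploads table could look roughly like this (SQLite here; the table and column names are illustrative, not prescriptive):

```python
import sqlite3
import uuid

conn = sqlite3.connect("uploads.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS uploads (
        id        INTEGER PRIMARY KEY,            -- internal row id
        file_guid TEXT NOT NULL UNIQUE,           -- unguessable name used on disk and in URLs
        username  TEXT NOT NULL,                  -- who uploaded the file
        orig_name TEXT,                           -- original file name, kept as metadata
        uploaded  TEXT DEFAULT CURRENT_TIMESTAMP  -- upload time
    )
""")

def record_upload(username: str, orig_name: str) -> str:
    guid = uuid.uuid4().hex   # the GUID doubles as the stored file name
    conn.execute(
        "INSERT INTO uploads (file_guid, username, orig_name) VALUES (?, ?, ?)",
        (guid, username, orig_name),
    )
    conn.commit()
    return guid
```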

You are concerned about performance, but as long as you are not dealing with millions and millions of records, queries that select the images owned by a user, or uploaded within a certain period (assuming you store a timestamp or similar), are negligible in cost. If speed does become a problem, you can add a B-tree index on the username column, which will greatly speed up your per-user queries.
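If you do reach that point, adding the index is a one-liner (continuing the illustrative schema above; SQLite indexes are B-trees):

```python
import sqlite3

conn = sqlite3.connect("uploads.db")  # same illustrative database as above

# keeps per-user lookups from scanning the whole uploads table
conn.execute("CREATE INDEX IF NOT EXISTS idx_uploads_username ON uploads (username)")

# typical query: one user's files uploaded within a given period
rows = conn.execute(
    "SELECT file_guid, orig_name FROM uploads"
    " WHERE username = ? AND uploaded BETWEEN ? AND ?",
    ("alice", "2024-01-01 00:00:00", "2024-12-31 23:59:59"),
).fetchall()
```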

Back to the topic of security, access and organization. Store images in a folder per user (although depending on the number of users, the number of folders may grow to an unmanageable level). If you do not want the images to be publicly accessible, keep them in a folder outside the web root and have the application read the file and stream it back to render the image for the user. It is more complicated, but you are hiding the actual file from the Internet, and you can also check that every image request comes from an authenticated user.
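A rough sketch of that gateway approach, using Flask purely as an example (the storage path, session key, and table are assumptions, not a prescription):

```python
import os
import sqlite3
from flask import Flask, abort, send_file, session

app = Flask(__name__)
app.secret_key = "change-me"        # required for sessions; placeholder value
STORAGE_ROOT = "/srv/uploads"       # hypothetical folder outside the web root

@app.route("/images/<guid>")
def serve_image(guid):
    username = session.get("username")   # however your app tracks the logged-in user
    if not username:
        abort(401)
    conn = sqlite3.connect("uploads.db")
    row = conn.execute(
        "SELECT username FROM uploads WHERE file_guid = ?", (guid,)
    ).fetchone()
    conn.close()
    if row is None:
        abort(404)
    if row[0] != username:                # only the owner may view this file
        abort(403)
    # stream the file back to the browser; it never gets a public URL of its own
    return send_file(os.path.join(STORAGE_ROOT, username, guid),
                     mimetype="image/jpeg")  # in practice, store and look up the real type
```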

+3

It depends on the nature and structure of your application and database. I have used many methods, including folder-based storage, images stored as BLOBs in the database, and non-public folders whose files are served through an authentication gateway.

For external images that are not directly related to the application or database, for example temporary photos or the like, I usually just put them in a folder. Since your setup is images uploaded by users, I expect there will be metadata associated with each image, such as tags. In that case I would probably store the image in a database table, assuming I have the capacity for it. If the photos need to be protected, that is, inaccessible to other users without authentication, then the database brings its own security, while file-based storage requires some kind of gateway to prevent unauthorized access.
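As a rough illustration of that database-stored variant (SQLite here, with made-up column names), the image bytes simply go into a BLOB column next to the metadata:

```python
import sqlite3

conn = sqlite3.connect("images.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS images (
        id     INTEGER PRIMARY KEY,
        userid INTEGER NOT NULL,    -- owner of the image
        tags   TEXT,                -- e.g. a comma-separated tag list
        data   BLOB NOT NULL        -- the image bytes themselves
    )
""")

def store_image(userid: int, tags: str, path: str) -> None:
    with open(path, "rb") as f:
        conn.execute(
            "INSERT INTO images (userid, tags, data) VALUES (?, ?, ?)",
            (userid, tags, f.read()),
        )
    conn.commit()
```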

I would not use a table per user, just a single images table with columns such as id, userid, and the image path or data.

Does that help?

+3
