Which libraries handle big file uploads / chunked uploads? And shouldn't we just use chunking by default? Is there any drawback to the chunked method?

I use PHP for the backend and JS for the frontend.
I'm thinking of integrating with Cloudinary for storage. However, I've realized that handling file uploads and chunking is quite a lot of work.

I don't want to put much effort into this (actually, I've already spent a day just handling file uploads, only to find that it errors out when trying to upload big files). If you know any libraries that can handle this, I would really appreciate it.

My initial research turned up these options:

  1. Just use Cloudinary Upload Widget.
  2. blueimp jQuery-File-Upload
  3. FilePond

Anything else?

    Are you trying to upload from the browser directly to Cloudinary, or going to your PHP service first?

    It looks like they have a plain JavaScript example that creates a FormData object and makes the API call with fetch, which is likely the easiest option.
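
    For reference, a minimal sketch of that direct browser-to-Cloudinary approach (the cloud name and the unsigned upload preset below are placeholders, not real values; an unsigned preset has to be created in the Cloudinary dashboard first):

    ```javascript
    // Browser-side sketch: send the picked file straight to Cloudinary with fetch.
    // "your-cloud-name" and "your-unsigned-preset" are placeholder values.
    const CLOUDINARY_URL = 'https://api.cloudinary.com/v1_1/your-cloud-name/auto/upload';

    async function uploadToCloudinary(file) {
      const form = new FormData();
      form.append('file', file);                             // File object from an <input type="file">
      form.append('upload_preset', 'your-unsigned-preset');  // unsigned preset configured in Cloudinary

      const res = await fetch(CLOUDINARY_URL, { method: 'POST', body: form });
      if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
      return res.json(); // contains secure_url, public_id, etc.
    }

    document.querySelector('#file-input').addEventListener('change', (e) => {
      uploadToCloudinary(e.target.files[0])
        .then((result) => console.log('Uploaded:', result.secure_url))
        .catch(console.error);
    });
    ```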

    If you're trying to go from: browser --> PHP backend --> Cloudinary, then the ideal way is to do a multipart-form upload from the browser and find a way to make PHP stream/proxy that request over to Cloudinary so you avoid saving the file to disk and then relaying it to Cloudinary with a completely separate request.

      Yes, I'm trying to go browser -> PHP backend -> Cloudinary. I don't know why, but I feel insecure about using that plain JS approach (security matters).

      Have you actually done the stream/proxy approach? I ask because I managed to do it when the file was <1MB, but issues come up when I try to upload with chunking.

      I'm about to try Dropzone.js, too: https://github.com/Dodotree/DropzonePHPchunks. I'll probably report back if I succeed with it.
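
      From what I've seen so far, the chunking part in Dropzone 5.x is mostly configuration. A rough sketch of what I'm planning to try (the /upload.php endpoint and the chunk size are placeholders, and the PHP side still has to reassemble the chunks itself):

      ```javascript
      // Sketch of a Dropzone 5.x setup with chunked uploads enabled.
      // "/upload.php" is a placeholder endpoint; each chunk arrives as a normal
      // multipart POST with dzuuid / dzchunkindex / dztotalchunkcount fields
      // that the backend can use to reassemble the file.
      Dropzone.autoDiscover = false; // assumes Dropzone 5.x is loaded on the page

      const dz = new Dropzone('#upload-form', {
        url: '/upload.php',
        chunking: true,             // split large files into chunks
        forceChunking: true,        // chunk even small files, for one consistent code path
        chunkSize: 2 * 1024 * 1024, // 2 MB per chunk (placeholder value)
        parallelChunkUploads: false,
        retryChunks: true,
        chunksUploaded: (file, done) => {
          // All chunks are up; this is where you'd tell the backend to merge them.
          done();
        },
      });
      ```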

        So they actually have you covered there. On that page, if you scroll down to "Generating authentication signatures", you'll find what you need. Basically, rather than exposing your secret key to the world (highly insecure), you have your PHP backend generate what is essentially a one-use key so you can perform the upload from the browser. AWS S3 has similar functionality that I use all the time, with great success.
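
        To make that concrete, the signing step is roughly: sort the upload parameters, serialize them, append the API secret, and SHA-1 the result; the browser then sends the signature, timestamp and API key (but never the secret) along with the file. A sketch of that logic, written in Node-style JS here just for illustration even though in this thread it would live in the PHP backend (check Cloudinary's docs for the exact list of parameters to sign):

        ```javascript
        // Sketch of server-side signature generation for a Cloudinary signed upload.
        // The API secret never leaves the server; the browser only receives the signature.
        const crypto = require('crypto');

        function signUploadParams(paramsToSign, apiSecret) {
          // Sort params alphabetically and serialize as key=value&key=value...
          const toSign = Object.keys(paramsToSign)
            .sort()
            .map((key) => `${key}=${paramsToSign[key]}`)
            .join('&');
          // SHA-1 of the serialized params with the API secret appended.
          return crypto.createHash('sha1').update(toSign + apiSecret).digest('hex');
        }

        // Example: the backend would return something like this to the browser.
        const timestamp = Math.floor(Date.now() / 1000);
        const signature = signUploadParams({ timestamp }, process.env.CLOUDINARY_API_SECRET);
        console.log({ timestamp, signature }); // browser sends these plus api_key with the file
        ```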

        > Have you actually done the stream/proxy approach? I ask because I managed to do it when the file was <1MB, but issues come up when I try to upload with chunking.

        I've done it before, but with Node.js rather than PHP. With Node, since everything is basically a stream of data, you can easily take an incoming request and pipe it as the body of a new request to a new destination. "Chunking" for file uploads should be done using a multipart form, which your browser will automatically format with proper boundaries as it initiates the request.
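
        For what it's worth, the piping part in Node looks roughly like this (the upstream URL is a placeholder, and error handling is trimmed to the essentials):

        ```javascript
        // Sketch of the stream/proxy idea in Node: the incoming multipart request is
        // piped straight through to the storage service without touching the local disk.
        // "https://api.example.com/v1/upload" is a placeholder, not a real endpoint.
        const http = require('http');
        const https = require('https');

        http.createServer((req, res) => {
          if (req.method !== 'POST' || req.url !== '/upload') {
            res.writeHead(404);
            res.end();
            return;
          }

          // Re-create the request against the upstream service, preserving the
          // multipart content-type (which carries the boundary) and content-length.
          const proxyReq = https.request('https://api.example.com/v1/upload', {
            method: 'POST',
            headers: {
              'content-type': req.headers['content-type'],
              'content-length': req.headers['content-length'],
            },
          }, (proxyRes) => {
            // Relay the upstream status and body back to the browser.
            res.writeHead(proxyRes.statusCode, proxyRes.headers);
            proxyRes.pipe(res);
          });

          proxyReq.on('error', (err) => {
            res.writeHead(502);
            res.end('Upstream error: ' + err.message);
          });

          req.pipe(proxyReq); // the key line: stream the body through, no temp file
        }).listen(3000);
        ```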

          Haha, I've come across that too. However, I think it still needs cloudName defined on the frontend, so I felt insecure there as well.
          I think I'll fall back to that if everything else fails.

          @seanmcgary may I know what you used to handle the frontend part?

            Since fetch and XMLHttpRequest both handle multipart uploads, I've always rolled my own upload dialogs, using everything from jQuery to React in more recent times.
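
            A bare-bones version of that kind of hand-rolled uploader, sketched with XMLHttpRequest since it exposes upload progress for the dialog ("/upload" is a placeholder endpoint):

            ```javascript
            // Minimal hand-rolled uploader: multipart POST with a progress callback.
            // "/upload" is a placeholder endpoint; wire onProgress to your own dialog/UI.
            function uploadWithProgress(file, onProgress) {
              return new Promise((resolve, reject) => {
                const form = new FormData();
                form.append('file', file);

                const xhr = new XMLHttpRequest();
                xhr.open('POST', '/upload');

                // Fires periodically while the body is being sent.
                xhr.upload.onprogress = (e) => {
                  if (e.lengthComputable) onProgress(Math.round((e.loaded / e.total) * 100));
                };

                xhr.onload = () =>
                  xhr.status < 400 ? resolve(xhr.responseText) : reject(new Error(`HTTP ${xhr.status}`));
                xhr.onerror = () => reject(new Error('Network error'));

                xhr.send(form); // the browser sets the multipart boundary itself
              });
            }

            // Usage: update a progress bar while the file goes up.
            // uploadWithProgress(fileInput.files[0], (pct) => progressBar.value = pct);
            ```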
