ShareFile API fails for large files where browser succeeds (C#)

We like the ShareFile API so far, but we need it to handle large files.  We've uploaded 25 GB files through the browser (slowly, manually), yet we haven't managed even 1 GB through the API, whether with the default ShareFileV3 sample or any modification we've attempted so far.  Countless combinations of settings and approaches work for files under a gigabyte; none yet work for GB+ files.
Is anyone able to upload GB+ files with the ShareFile API?  Can anyone provide a working example that uploads files larger than a gigabyte?

Depending on how we modify the code, GB+ uploads fail with different errors while smaller files continue to work, notably:
  • `The stream does not support concurrent IO read or write operations.`
  • 404
  • `The request was aborted: The request was canceled.`
  • `An existing connection was forcibly closed by the remote host.`
I believe the modified C# code below rules out .NET memory limitations.  Code to show progress in the console is readily available on request; a minimal sketch follows.
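For reference, the progress reporting looks roughly like this (a minimal sketch; bytesWritten and totalBytes are stand-ins for the counters the upload loop below would maintain):

public static void ReportProgress(long bytesWritten, long totalBytes)
{
    // Overwrite the current console line with a running percentage.
    double percent = totalBytes == 0 ? 100.0 : 100.0 * bytesWritten / totalBytes;
    Console.Write(String.Format("\rUploaded {0:n0} of {1:n0} bytes ({2:f1}%)",
        bytesWritten, totalBytes, percent));
}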


public static void UploadMultiPartFile(string parameterName, FileInfo file, string uploadUrl)
        {
            // This works for files less than approx 1 gigabyte, but not more, even though 25GB files can be uploaded in the browser.
            // "Filedata", instead of "File1", for parameterName, also seems to work interchangeably https://community.sharefilesupport.com/citrixsharefile/topics/correct-way-to-do-streamed-upload
            Console.WriteLine("Garbage collecting...");
            GC.Collect();

            long fileBytes;
            byte[][] chunks; // Jagged array to require less contiguous ram, to avoid some OutOfMemoryExceptions
            byte[] buffer;
            long chunkCount;
            int bytesPerChunk = 1024 * 48; // 48 KiB is the ShareFile API default
            int lastChunkByteCount;

            // 1. Load content into ram, given Concurrent streaming IO operations are not supported.
            using (var fs = file.OpenRead())
            {
                fileBytes = fs.Length;
                if (fileBytes == 0)
                {
                    Console.WriteLine(String.Format("Skipping uploading empty file `{0}`.", file.Name));
                    return;
                }
                lastChunkByteCount = (int)(fileBytes % bytesPerChunk);
                chunkCount = (fileBytes / bytesPerChunk); // Truncates, rounding down
                if (lastChunkByteCount == 0)
                    lastChunkByteCount = bytesPerChunk; // Last chunk is uniform size
                else
                    chunkCount++; // Count last chunk

                chunks = new byte[chunkCount][];
                for (long c = 0; c < chunkCount; c++)
                {
                    buffer = chunks[c] = new byte[bytesPerChunk];
                    // Read() may return fewer bytes than requested; loop until the chunk is full or EOF.
                    int bufferBytesDone = 0, bytesRead;
                    while (bufferBytesDone < bytesPerChunk
                        && (bytesRead = fs.Read(buffer, bufferBytesDone, bytesPerChunk - bufferBytesDone)) > 0)
                        bufferBytesDone += bytesRead;
                }
            }

            // 2. Write content from ram to stream, given Concurrent streaming IO operations are not supported.
            string boundaryGuid = "upload-" + Guid.NewGuid().ToString("n");
            byte[] boundaryStartBytes = Encoding.UTF8.GetBytes("--" + boundaryGuid + "\r\n"); // Two dashes fewer than finish.
            byte[] headerBytes = Encoding.UTF8.GetBytes(String.Format(@"Content-Disposition: form-data; name=""{0}""; filename=""{1}""" +
                "\r\nContent-Type: application/octet-stream\r\n\r\n", parameterName, file.Name));
            byte[] boundaryFinishBytes = Encoding.UTF8.GetBytes("\r\n--" + boundaryGuid + "--\r\n"); // Two dashes more than start.

            //WebRequest.DefaultCachePolicy = new System.Net.Cache.RequestCachePolicy(System.Net.Cache.RequestCacheLevel.BypassCache);
            HttpWebRequest request = WebRequest.CreateHttp(uploadUrl);
            request.Timeout = 1000 * 60 * 60 * 15; // 15 hours is near maximum ShareFile supports // Timeout never yet reached
            request.Method = "POST";
            request.ContentType = "multipart/form-data; boundary=" + boundaryGuid;
            request.Credentials = CredentialCache.DefaultCredentials;
            //request.KeepAlive = false; // No success in any setting for gb+ files.
            //request.ProtocolVersion = HttpVersion.Version11; // No difference noticed between Version10 and Version11
            request.ReadWriteTimeout = 1000 * 60 * 30; // 30 minutes // Timeout never yet reached
            //request.ContentLength = 
            //    boundaryStartBytes.Length +
            //    headerBytes.Length + 
            //    fileBytes +
            //    boundaryFinishBytes.Length; // May be left automatic, except for gb+ files

            using (Stream postStream = request.GetRequestStream())
            // while smaller files work regardless, gb+ files get partway or 100% and then an exception occurs (different depending on buffer and request settings):
            // - `The stream does not support concurrent IO read or write operations.`
            // - 404
            // - `The request was aborted: The request was canceled.`
            // - `An existing connection was forcibly closed by the remote host.`
            //using (BufferedStream postStream = new BufferedStream(request.GetRequestStream(), bytesPerChunk))
            {
                // Write MIME header
                postStream.Write(boundaryStartBytes, 0, boundaryStartBytes.Length);
                postStream.Write(headerBytes, 0, headerBytes.Length);

                // Write chunks, up to last
                long lastChunkIndex = chunkCount - 1;
                for (long c = 0; c < lastChunkIndex; c++)
                {
                    buffer = chunks[c];
                    postStream.Write(buffer, 0, bytesPerChunk);
                }

                // Write last chunk, which may be a different size
                postStream.Write(chunks[lastChunkIndex], 0, lastChunkByteCount);

                // Write MIME footer
                postStream.Write(boundaryFinishBytes, 0, boundaryFinishBytes.Length);
            }

            // 3. Finish submittal to ShareFile
            Console.WriteLine("...getting response...");
            HttpWebResponse response = (HttpWebResponse)request.GetResponse();
            Console.WriteLine("Upload Status: " + response.StatusCode);
            response.Close();
        }
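One variant we have not exhausted: HttpWebRequest's AllowWriteStreamBuffering defaults to true, meaning the entire request body is buffered in memory before transmission, a cost that would only bite at GB+ sizes. Disabling it requires setting ContentLength up front, and it also permits streaming straight from disk instead of staging chunks in RAM. A sketch (these are standard HttpWebRequest members, though we have not confirmed this resolves the GB+ failures):

// Sketch of a non-buffering upload; requires System, System.IO, System.Net, System.Text.
public static void UploadMultiPartFileStreamed(string parameterName, FileInfo file, string uploadUrl)
{
    string boundary = "upload-" + Guid.NewGuid().ToString("n");
    byte[] startBytes = Encoding.UTF8.GetBytes("--" + boundary + "\r\n");
    byte[] headerBytes = Encoding.UTF8.GetBytes(String.Format(
        "Content-Disposition: form-data; name=\"{0}\"; filename=\"{1}\"\r\n" +
        "Content-Type: application/octet-stream\r\n\r\n", parameterName, file.Name));
    byte[] finishBytes = Encoding.UTF8.GetBytes("\r\n--" + boundary + "--\r\n");

    HttpWebRequest request = WebRequest.CreateHttp(uploadUrl);
    request.Method = "POST";
    request.ContentType = "multipart/form-data; boundary=" + boundary;
    request.Timeout = 1000 * 60 * 60 * 15;
    request.ReadWriteTimeout = 1000 * 60 * 30;
    // Do not buffer the whole body in memory; an explicit ContentLength is then required.
    request.AllowWriteStreamBuffering = false;
    request.ContentLength = startBytes.Length + headerBytes.Length + file.Length + finishBytes.Length;

    using (Stream postStream = request.GetRequestStream())
    using (FileStream fs = file.OpenRead())
    {
        postStream.Write(startBytes, 0, startBytes.Length);
        postStream.Write(headerBytes, 0, headerBytes.Length);
        byte[] buffer = new byte[1024 * 48]; // 48 KiB, matching the chunk size above
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
            postStream.Write(buffer, 0, bytesRead);
        postStream.Write(finishBytes, 0, finishBytes.Length);
    }

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        Console.WriteLine("Upload Status: " + response.StatusCode);
}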
OCAT

Brian, Employee
Have you tried using our .NET SDK? There are a number of upload functions implemented which would probably be a lot easier than doing it by hand. 
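For illustration, an SDK upload looks roughly like this (a sketch adapted from the ShareFile-NET README samples; verify the exact type and method names, e.g. UploadSpecificationRequest, GetAsyncFileUploader, and PlatformFileStream, against the SDK version you install):

// Sketch only; assumes an authenticated ShareFileClient (sfClient),
// a destination folder (parentFolder), and a FileInfo (file).
var uploadRequest = new UploadSpecificationRequest
{
    FileName = file.Name,
    FileSize = file.Length,
    Parent = parentFolder.url
};

using (var stream = file.OpenRead())
{
    var uploader = sfClient.GetAsyncFileUploader(uploadRequest,
        new PlatformFileStream(stream, file.Length, file.Name));
    var uploadResponse = await uploader.UploadAsync();
}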
OCAT
Thanks for your reply, Brian.  I started with the C#.NET code sample at https://api.sharefile.com/rest/samples/csharp.aspx , which was fine until large files, but I haven't yet tried a version using the SDK at https://github.com/citrix/ShareFile-NET .

Can you vouch for whether that SDK has been proven with multi-gigabyte uploads?  Are you aware of any API upload-size-limit permissions that may need to be checked?
Brian, Employee
Several of our internal tools are built on the SDK, and as far as I know they support large files. There are account-wide limits on file size, which I believe default to 10GB but may differ by plan level; you can consult your support rep for more information. The uploader will return a 413 if you try to prep a file that violates this limit.
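If it helps, the 413 can be detected in C# with standard WebException handling (a minimal sketch; prepRequest stands in for whichever HttpWebRequest initiates the upload):

try
{
    using (HttpWebResponse prepResponse = (HttpWebResponse)prepRequest.GetResponse())
    {
        // ...proceed with the upload...
    }
}
catch (WebException ex)
{
    HttpWebResponse errorResponse = ex.Response as HttpWebResponse;
    if (errorResponse != null && (int)errorResponse.StatusCode == 413)
        Console.WriteLine("File exceeds the account-wide size limit (HTTP 413 Request Entity Too Large).");
    else
        throw;
}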