bezzad / downloader

Fast, cross-platform and reliable multipart downloader with asynchronous progress events for .NET applications.

License: MIT License

C# 100.00%
download-manager downloader download-file stream-downloader multipart-download dotnet-core high-concurrency filedownloader multitasking dotnet-standard


downloader's Issues

GetFileSize may cause exception when HEAD method is not allowed

Some URLs only allow GET; using HEAD can cause a 403 exception.

protected async Task<long> GetFileSize(Uri address)
{
    var request = GetRequest("HEAD", address);
    using var response = await request.GetResponseAsync();
    // if (long.TryParse(response.Headers.Get("Content-Length"), out var respLength))
    return response.ContentLength;
}

Maybe we could use GET to retrieve Content-Length?
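A workaround sketch of that idea (plain HttpClient rather than the library's internal Request type, so treat it as an illustration, not the library's fix): probe with HEAD first, then fall back to a GET that stops at the response headers, so the body is never downloaded.

    // Sketch only: HEAD first, headers-only GET as a fallback when HEAD is rejected (e.g. 403).
    private static readonly HttpClient _client = new HttpClient();

    protected async Task<long> GetFileSize(Uri address)
    {
        try
        {
            using var headRequest = new HttpRequestMessage(HttpMethod.Head, address);
            using var headResponse = await _client.SendAsync(headRequest);
            headResponse.EnsureSuccessStatusCode();
            return headResponse.Content.Headers.ContentLength ?? -1L;
        }
        catch (HttpRequestException) // HEAD not allowed on this server
        {
            // GET, but only read the headers; the body is never buffered.
            using var getResponse = await _client.GetAsync(address, HttpCompletionOption.ResponseHeadersRead);
            getResponse.EnsureSuccessStatusCode();
            return getResponse.Content.Headers.ContentLength ?? -1L;
        }
    }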

[Feature Request] Download List of Urls

Thanks for your very helpful library.
A couple of features seem to be missing that would be good to have.
I have a list of URLs and I want them to be downloaded in order.
It would also be useful to be able to download multiple files at the same time.
Something like this:

var mylist = new List<string>
{
    "url1",
    "url2",
    ...
};

var downloader = new DownloadService(downloadOpt);
downloader.MaxNumberOfMultipleFileDownload = 2;
await downloader.DownloadFilesAsync(mylist);
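Until something like this exists in the library (DownloadFilesAsync and MaxNumberOfMultipleFileDownload above are the proposed API, not current members), a rough workaround sketch using only the overload shown elsewhere on this page, DownloadFileTaskAsync(url, path); mylist, downloadOpt and targetFolder are placeholders supplied by the caller:

    // Workaround sketch: one DownloadService per file, at most two files at a time.
    // Assumes: using System.Linq, System.IO, System.Threading.
    var gate = new SemaphoreSlim(2); // cap on simultaneous files

    var tasks = mylist.Select(async url =>
    {
        await gate.WaitAsync();
        try
        {
            var downloader = new DownloadService(downloadOpt);
            var fileName = Path.Combine(targetFolder, Path.GetFileName(new Uri(url).LocalPath));
            await downloader.DownloadFileTaskAsync(url, fileName);
        }
        finally
        {
            gate.Release();
        }
    });

    await Task.WhenAll(tasks);

To download strictly in order rather than two at a time, a plain foreach loop (or SemaphoreSlim(1)) does the same job.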

Deleting the package file right after the download completes throws `The process cannot access the file because it is being used by another process.`

Version 2.2.8

I'm trying to delete the package after the file download has completed, and I get an exception.

My delete method:

public static void Delete(this DownloadPackage package, string episodePath) {
	var formattedPath = GetFilePath(episodePath);
	if (!File.Exists(formattedPath)) return;
	File.Delete(formattedPath);
}

Message

The process cannot access the file 'D:\Download\Kawaii\1.package' because it is being used by another process.

The file 1.package is a DownloadPackage saved to a file using BinaryFormatter.

   at System.IO.FileStream.ValidateFileHandle(SafeFileHandle fileHandle)
   at System.IO.FileStream.CreateFileOpenHandle(FileMode mode, FileShare share, FileOptions options)
   at System.IO.FileStream.OpenHandle(FileMode mode, FileShare share, FileOptions options)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, Boolean useAsync)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
   at DownloaderLibrary.Helpers.DownloadPackageExtensions.SavePackage(DownloadPackage package, String episodePath) in D:\Projekty\AnimeVideoDownloader\AnimeVideoDownloader\DownloaderLibrary\Helpers\DownloadPackageExtensions.cs:line 23
   at DownloaderLibrary.Downloaders.BaseAnimeDownloader.<>c__DisplayClass13_0.<DownloadEpisode>b__0(Object sender, DownloadProgressChangedEventArgs args) in D:\Projekty\AnimeVideoDownloader\AnimeVideoDownloader\DownloaderLibrary\Downloaders\BaseAnimeDownloader.cs:line 128
   at Downloader.DownloadService.OnChunkDownloadProgressChanged(Object sender, DownloadProgressChangedEventArgs e)
   at Downloader.ChunkDownloader.OnDownloadProgressChanged(DownloadProgressChangedEventArgs e)
   at Downloader.ChunkDownloader.ReadStream(Stream stream, CancellationToken token)
   at Downloader.ChunkDownloader.DownloadChunk(Request downloadRequest, CancellationToken token)
   at Downloader.ChunkDownloader.Download(Request downloadRequest, CancellationToken cancellationToken)
   at Downloader.ChunkDownloader.Download(Request downloadRequest, CancellationToken cancellationToken)
   at Downloader.DownloadService.SerialDownload(CancellationToken cancellationToken)
   at Downloader.DownloadService.StartDownload()

The exception is thrown here.


Why is Downloader trying to access the file after the download has completed? How can I safely delete the package after the download finishes?
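One way to sequence this, sketched only from the events shown on this page (DownloadFileCompleted, DownloadProgressChanged) and not a confirmed fix from the library author; url, episodePath and the OnDownloadProgressChanged handler name are placeholders:

    // Sketch: delete only after DownloadFileCompleted has fired, and detach the
    // handler that calls SavePackage first, so no late callback re-opens 1.package.
    var completed = new TaskCompletionSource<bool>();
    downloader.DownloadFileCompleted += (s, e) => completed.TrySetResult(true);

    await downloader.DownloadFileTaskAsync(url, episodePath);
    await completed.Task;                                             // the completed event has run

    downloader.DownloadProgressChanged -= OnDownloadProgressChanged;  // your SavePackage handler
    package.Delete(episodePath);                                      // extension method from above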

Using .NET Framework throws "Unable to read data from the transport connection: The connection was closed."

Hello. I can't download a file normally using your package in a .NET Framework environment.
However, if I use the sample you provided, I have tested it many times and it works fine with no problem at all.
I did a Google search to try to find a related problem.
Here is what I found: https://social.msdn.microsoft.com/Forums/en-US/c620ce2c-c512-4c9f-a481-521ecd260039/systemioioexception-unable-to-read-data-from-the-transport-connection-the-connection-was-closed?forum=vstswebtest
Please take a look and try to fix this issue.
Finally, my configuration:

private async void DownloadClientAsync()
{
    var chunkCount = 8;
    var downloadOpt = new DownloadConfiguration()
    {
        ParallelDownload = true, // download parts of file as parallel or not
        BufferBlockSize = 8000, // usually, hosts support max to 8000 bytes
        ChunkCount = chunkCount, // file parts to download
        MaxTryAgainOnFailover = int.MaxValue, // the maximum number of times to fail
        OnTheFlyDownload = true, // caching in-memory mode
        Timeout = 3000 // timeout (millisecond) per stream block reader
    };
    var ds = new DownloadService(downloadOpt);
    ds.ChunkDownloadProgressChanged += OnChunkDownloadProgressChanged;
    ds.DownloadProgressChanged += OnDownloadProgressChanged;
    ds.DownloadFileCompleted += OnDownloadFileCompleted;

    var url = "http://resources.downloads.pokecity.club/2020.rar";
    System.Net.ServicePointManager.Expect100Continue = false;
    System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
    var fileName = Path.Combine(Path.GetTempPath(), "downloadtext.rar");
    await ds.DownloadFileAsync(url, fileName).ConfigureAwait(false);
    ds.Clear();
}

private void OnDownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e)
{
    var nonZeroSpeed = e.BytesPerSecondSpeed == 0 ? 0.0001 : e.BytesPerSecondSpeed;
    var estimateTime = (int)((e.TotalBytesToReceive - e.BytesReceived) / nonZeroSpeed);
    var isMins = estimateTime >= 60;
    var timeLeftUnit = "";
    if (isMins)
    {
        timeLeftUnit = "minutes"; // original string was "分钟"
        estimateTime /= 60;
    }

    //flowLayoutPanel1.Controls.Add();

    var title = $"{e.ProgressPercentage:N3}%  -  {CalcMemoryMensurableUnit(e.BytesPerSecondSpeed)}/s  -  " +
                $"[{CalcMemoryMensurableUnit(e.BytesReceived)} of {CalcMemoryMensurableUnit(e.TotalBytesToReceive)}], {estimateTime} {timeLeftUnit} left";
    Console.Title = title;
    //DownloadingProgressBar.Value = (int)(e.ProgressPercentage * 100);
}

The response ended prematurely, with at least 34256189 additional bytes expected.

System.IO.IOException
HResult=0x80131620
Message=The response ended prematurely, with at least 34256189 additional bytes expected.
Source=System.Net.Http
StackTrace:
at System.Net.Http.HttpConnection.ContentLengthReadStream.Read(Span`1 buffer)
at System.Net.Http.HttpBaseStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at Downloader.ThrottledStream.Read(Byte[] buffer, Int32 offset, Int32 count) in C:\Users\Fengyuan\Source\Repos\Downloader3\src\Downloader\ThrottledStream.cs:line 176
at System.IO.Stream.<>c.b__43_0(Object )
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)

The problem is not solved at all; it is still the same.
Download Url: http://resources.downloads.pokecity.club/2020.rar
Downloader configuration:

    var downloadOpt = new DownloadConfiguration
    {
        ParallelDownload = true, // download parts of file as parallel or not
        BufferBlockSize = 10240, // usually, hosts support max to 8000 bytes
        ChunkCount = 8, // file parts to download
        MaxTryAgainOnFailover = int.MaxValue, // the maximum number of times to fail.
        OnTheFlyDownload = false, // caching in-memory or not?
        Timeout = 1000, // timeout (millisecond) per stream block reader
        MaximumBytesPerSecond = 0, // 0 means no limit; 1024 * 1024 would limit speed to 1MB/s
        TempDirectory = "C:\\temp", // Set the temp path for buffering chunk files, the default path is Path.GetTempPath().
        RequestConfiguration = // config and customize request headers
        {
            Accept = "*/*",
            UserAgent = $"DownloaderSample/{Assembly.GetExecutingAssembly().GetName().Version.ToString(3)}",
            ProtocolVersion = HttpVersion.Version11,
            KeepAlive = false,
            UseDefaultCredentials = false
        }
    };

Not all chunks are downloaded in parallel

On .NET Framework 4.8 projects, not all of the chunks are downloaded in parallel; only one less than the total number of chunks downloads concurrently.

Here is the configuration with the output (screenshot):

I think this commit caused the issue: 4fcbc34. When I changed DefaultConnectionLimit back to 1000, the issue no longer occurred.

Note: this issue only occurs on .NET Framework; when I converted the project to .NET Core 3.1, it did not occur.
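For anyone hitting this on .NET Framework before a library fix lands, a minimal workaround sketch (this is the standard ServicePointManager knob in the consuming application, not a Downloader API):

    // Raise the per-host connection limit once at startup so all ChunkCount
    // chunk connections can open in parallel; the .NET Framework default is very low.
    System.Net.ServicePointManager.DefaultConnectionLimit = 1000;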

Download speed not maximized

The newly added feature/logic seems to slow down the download speed even when I set it to no limit. I tried downloading the file with IDM and it reached 30 MB/s (around 300 Mbps), but I only get around 6 MB/s (around 60 Mbps) with version 1.2. In version 1.1 the problem does not exist.

Download YouTube Video

Hey,

I have a question: how can I download a YouTube video with Downloader? I tried downloading the video with the video URL, but that didn't work.

Download M3U8 link and save to local disk issue.

Hello,
I want to test downloading M3U8 links and saving them to the local disk.
I created a C# console project targeting .NET 5.0 and added the NuGet package:
PM> Install-Package Downloader -Version 2.2.9
The following is my C# code (OS: Windows 10):

using Downloader;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.ComponentModel;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Reflection;
using System.Threading;
using System.Threading.Tasks;
using Newtonsoft.Json;

namespace DownloaderTest
{
    class Program
    {
        public const string M3U8_Link1 =
            @"http://playertest.longtailvideo.com/adaptive/oceans_aes/oceans_aes.m3u8";
        public const string Local_MP4_File = @"D:\TestDownload\1.mp4";

        static async Task Main()
        {
            try
            {
                var downloadOpt = GetDownloadConfiguration();
                var downloader = new DownloadService(downloadOpt);
                await downloader.DownloadFileTaskAsync(M3U8_Link1, Local_MP4_File);
            }
            catch (Exception e)
            {
                Console.Error.WriteLine(e);
                Debugger.Break();
            }
            Console.WriteLine("END");
            Console.Read();
        }

        private static DownloadConfiguration GetDownloadConfiguration()
        {
            string version = Assembly.GetExecutingAssembly().GetName().Version?.ToString(3) ?? "1";
            var cookies = new CookieContainer();
            cookies.Add(new Cookie("download-type", "test") { Domain = "domain.com" });
            return new DownloadConfiguration
            {
                BufferBlockSize = 10240,
                ChunkCount = 8,
                MaximumBytesPerSecond = 1024 * 1024,
                MaxTryAgainOnFailover = int.MaxValue,
                OnTheFlyDownload = false,
                ParallelDownload = true,
                TempDirectory = "C:\\temp",
                Timeout = 1000,
                RequestConfiguration = {
                    Accept = "*/*",
                    CookieContainer = cookies,
                    Headers = new WebHeaderCollection(),
                    KeepAlive = true,
                    ProtocolVersion = HttpVersion.Version11,
                    UseDefaultCredentials = false,
                    UserAgent = $"DownloaderSample/{version}"
                }
            };
        }
    }
}

After I ran my code, I saw the downloaded file "D:\TestDownload\1.mp4", but its size was only 1 KB.
I used another M3U8 downloader to download the same M3U8 link and save it to the local disk; its size was 29.7 MB, and it plays in VLC media player as an MP4 video of 118 seconds, nearly 2 minutes.
What is wrong with my code, or is there some other issue with the repo?
By the way, in my real-world job I have to download a lot of live M3U8 links and save them to local disks; most of them are live sports events lasting about 2 hours. Can I use this repo for that job?
Thanks,

Batch download problem

When I'm downloading files in bulk, say 1000 files, it stops downloading at around 990 files for some reason, and when I restart it, it continues and downloads the remaining 10 files. How can I solve this problem?

Download to Memory Stream ?

I want to check things in the downloaded files before writing them to disk, so I planned to download to a memory stream, do my checks, and then save to disk.

But I can't find a method to do it.
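A sketch under an assumption: the package-based DownloadFileAsync quoted in the last issue on this page returns Task&lt;Stream&gt;, so if your version also has a URL-only overload that returns a stream, the flow could look like this (verify the exact overload name in your version; downloadOpt, url and the target path are placeholders):

    // Sketch: download into memory, inspect, then persist only if the checks pass.
    var downloader = new DownloadService(downloadOpt);
    using Stream content = await downloader.DownloadFileTaskAsync(url); // assumed stream-returning overload

    using var memory = new MemoryStream();
    await content.CopyToAsync(memory);
    memory.Position = 0;

    // ... run your checks on `memory` here ...

    using var file = File.Create(@"C:\temp\checked-file.bin"); // hypothetical target path
    await memory.CopyToAsync(file);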

DownloadService returns 404 error.

I'm getting a 404 error when trying to download even a single file, but when I copy and paste the URL into my browser it downloads just fine.

An example url: https://edge.forgecdn.net/files/3533/204/TinySkeletons-v1.0.1-1.16.5-Forge.jar

Here's the code:

public class DownloadHandler : IEnableLogger
{
public Dictionary<string, string> ImportedFiles { get; set; } = new();

public async Task StartAsync(string manifestFile, string minecraftFolder)
{
    await using FileStream deserializationStream = System.IO.File.OpenRead(manifestFile);
    Manifest manifest = await JsonSerializer.DeserializeAsync<Manifest>(deserializationStream);

    using ForgeClient forge = new();

    foreach (Data.File file in manifest.Files)
    {
        string downloadUrl = await forge.Files.RetrieveDownloadUrl(file.ProjectId, file.FileId);
        string fileName = downloadUrl[(1 + downloadUrl.LastIndexOf('/'))..];

        // Start downloading.
        string destination = fileName.EndsWith(".jar") ? Path.GetFullPath(fileName, Path.GetFullPath("mods", minecraftFolder))
                                                       : Path.GetFullPath(fileName, Path.GetFullPath("resourcepacks", minecraftFolder));

        this.Log().Info($"URL: {downloadUrl}\n" +
                        $"Filename: {fileName}\n" +
                        $"Destination: {destination}\n");

        DownloadService downloader = await DownloadFile(downloadUrl, fileName, destination).ConfigureAwait(false);
        downloader.Clear();
    }
}

private async Task<DownloadService> DownloadFile(string downloadUrl, string fileName, string destination)
{
    DownloadService currentDownloader = new(new DownloadConfiguration
    {
        BufferBlockSize = 4096,
        ChunkCount = 8,
        MaxTryAgainOnFailover = 2,
        OnTheFlyDownload = false,
        ParallelDownload = true,
        TempDirectory = Paths.Temp,
        Timeout = 10000,
        RequestConfiguration =
        {
            Accept = "*/*",
            KeepAlive = true,
            ProtocolVersion = HttpVersion.Version11,
            UseDefaultCredentials = false,
            UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/6.0;)",
        }
    });

    currentDownloader.DownloadStarted += (sender, downloadArgs) => this.Log().Info($"Started downloading {downloadArgs.FileName}");
    currentDownloader.DownloadProgressChanged += (sender, progressArgs) => this.Log().Info($"{fileName} - {progressArgs.ProgressPercentage}%");
    currentDownloader.DownloadFileCompleted += (sender, completedArgs) => this.Log().Info($"Finished downloading {fileName}");

    await currentDownloader.DownloadFileTaskAsync(downloadUrl, destination).ConfigureAwait(false);
    return currentDownloader;
}

}

"File size is invalid!" when using DownloadFileAsync

I'm trying to download a relatively big file, but it seems like your HEAD request to get the file size doesn't resolve correctly in this case.
It throws the exception here:

throw new InvalidDataException("File size is invalid!");

The URL I'm trying to get is http://patch-dl.ffxiv.com/game/4e9a232b/H2017.06.06.0000.0001a.patch
When checking with Insomnia, I do get a correct Content-Length back via HEAD.

Any idea why this might be the case? Am I using it wrong?

Dispose of FileStorage removes the files

FileStorage's Dispose implementation removes the files. Should it just close the stream instead? Currently I am trying to extend the JsonConverter, which will internally use FileStorage. As this is disposable, I also implement IDisposable to call its Dispose method. This converter is then used in a using block when I am deserializing, which removes the files even before the download is resumed.

Long URL cannot be downloaded

The following link cannot be downloaded. I tested with and without a VPN and it does not work:

https://subf2m.co/subtitles/farsi_persian-text/x3cqTn4LAYuJK5RKSaSYnXewfSU6iVUgSYDkn9DehSsFVXf1R4u5miZZ6v7w6nRZrSkeo1pneD_1Dj7srraAtYr12bchHoiHZEBrJqc1aq-k9fa0rQPJuCL_FKHppoqL0?name=UmljayBhbmQgTW9ydHkgRmlmdGggU2Vhc29uIFBlcnNpYW4gc3VidGl0bGUgLSBTdWJmMm0gW3N1YmYybS5jb10uemlw

Internet Download Manager downloads it without any problems.

How to use a PAC proxy?

Hi @bezzad,
I have a PAC file proxy; how can I use it with DownloadService?
I tried this code, but it didn't work:

var downloadOpt = new DownloadConfiguration()
{
    RequestConfiguration =
    {
        Proxy = new WebProxy("www.myserver.com/proxy/can.pac", true)
    }
};
var downloader = new DownloadService(downloadOpt);

In Windows, I can set this URL as the proxy (in Internet Options) and it works fine.

[Feature Request] Pre-allocated and stream into final file location.

Would it be possible to have an option to make Downloader pre-allocate the final file and stream into it directly instead of using temporary files? With large downloads the moving of temp files into the final file can take some time, especially with an overzealous anti-virus program.
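For reference, a general .NET sketch of the mechanism being requested (not the library's current behavior): pre-allocate the final file once with SetLength, then let each chunk writer seek to its own offset; finalPath, totalFileSize, chunkStartOffset, buffer and bytesRead are placeholders.

    // Sketch of pre-allocation plus in-place chunk writes (no temp-file merge at the end).
    using (var file = new FileStream(finalPath, FileMode.Create, FileAccess.Write, FileShare.ReadWrite))
    {
        file.SetLength(totalFileSize); // reserve the full size up front
    }

    // Each chunk writer then opens the same file and writes at its own offset:
    using (var chunkWriter = new FileStream(finalPath, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
    {
        chunkWriter.Seek(chunkStartOffset, SeekOrigin.Begin);
        await chunkWriter.WriteAsync(buffer, 0, bytesRead);
    }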

[Feature Request] Get Url File Name

Please add a feature to get the file name and extension from the link, so there is no need to specify the name and extension in the path.

for example:

var downloader = new DownloadService(downloadOpt);
await downloader.DownloadFileAsync(url, @"C:\myFolder");
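Until such a feature exists, a workaround sketch: derive the name from the URL path and pass the full path to the existing overload (the example URL is hypothetical; servers that only send the name in a Content-Disposition header would still need that header parsed separately):

    // Workaround sketch: take the file name from the URL itself.
    var url = "https://example.com/files/setup-1.2.3.exe";    // hypothetical URL
    var fileName = Path.GetFileName(new Uri(url).LocalPath);  // "setup-1.2.3.exe"
    var fullPath = Path.Combine(@"C:\myFolder", fileName);

    var downloader = new DownloadService(downloadOpt);
    await downloader.DownloadFileTaskAsync(url, fullPath);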

StackOverflowException when retrying download for a long time

When DownloadFileAsync is encapsulated in a Task.Run, it produces a StackOverflowException when the download is retried for a long time.

This only happens on .NET Framework 4.8 and below, though.

Steps:

  1. Execute the Task.Run and wait for it
  2. Disconnect the internet connection to trigger the retry
  3. Wait for 5 minutes (the timeout in this case is set to 100)
  4. Reconnect the internet connection
  5. The download resumes but somehow stops before completing, and a StackOverflowException occurs

Here is some test code that I used.

await Task.Run((async () =>
{
    var downloader = new DownloadService(new DownloadConfiguration()
    {
        ParallelDownload = true, // download parts of file as parallel or not
        BufferBlockSize = 10240, // usually, hosts support max to 8000 bytes
        ChunkCount = 3, // file parts to download
        MaxTryAgainOnFailover = int.MaxValue, // the maximum number of times to fail.
        OnTheFlyDownload = false, // caching in-memory or not?
        Timeout = 100 // timeout (millisecond) per stream block reader
    });
    await downloader.DownloadFileAsync("http://ipv4.download.thinkbroadband.com/100MB.zip", "C:\\100MB.zip").ConfigureAwait(false);
}));

I found some threads about this:
https://stackoverflow.com/questions/44760486/stackoverflowexceptions-in-nested-async-methods-on-unwinding-of-the-stack
https://stackoverflow.com/questions/13808166/recursion-and-the-await-async-keywords

Based on that I tried adding a finally clause in ChunkDownloader::Download and the exception didn't occur anymore.

public async Task<Chunk> Download(Request downloadRequest, CancellationToken cancellationToken)
{
    try
    {
        await DownloadChunk(downloadRequest, cancellationToken);
        return Chunk;
    }
    catch (TaskCanceledException) // when stream reader timeout occurred 
    {
        // re-request and continue downloading...
        return await Download(downloadRequest, cancellationToken);
    }
    catch (WebException) when (Chunk.CanTryAgainOnFailover())
    {
        // when the host forcibly closed the connection.
        await Task.Delay(Chunk.Timeout, cancellationToken);
        // re-request and continue downloading...
        return await Download(downloadRequest, cancellationToken);
    }
    catch (Exception error) when (Chunk.CanTryAgainOnFailover() &&
                                  (error.HasSource("System.Net.Http") ||
                                   error.HasSource("System.Net.Sockets") ||
                                   error.HasSource("System.Net.Security") ||
                                   error.InnerException is SocketException))
    {
        Chunk.Timeout += TimeoutIncrement; // decrease download speed to down pressure on host
        await Task.Delay(Chunk.Timeout, cancellationToken);
        // re-request and continue downloading...
        return await Download(downloadRequest, cancellationToken);
    }
    finally
    {
        await Task.Yield();
    }
}

I'm not sure if this is the correct way to fix this one though.
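Another option, sketched from the snippet above rather than taken from the library: express the same retry policy as a loop, so a long retry cycle never accumulates awaiting frames at all (names follow the code above; the cancellation check at the top is an addition):

    public async Task<Chunk> Download(Request downloadRequest, CancellationToken cancellationToken)
    {
        while (true)
        {
            cancellationToken.ThrowIfCancellationRequested(); // added: let a real cancel exit the loop

            try
            {
                await DownloadChunk(downloadRequest, cancellationToken);
                return Chunk;
            }
            catch (TaskCanceledException) // stream reader timeout: loop and re-request
            {
            }
            catch (WebException) when (Chunk.CanTryAgainOnFailover())
            {
                // the host forcibly closed the connection
                await Task.Delay(Chunk.Timeout, cancellationToken);
            }
            catch (Exception error) when (Chunk.CanTryAgainOnFailover() &&
                                          (error.HasSource("System.Net.Http") ||
                                           error.HasSource("System.Net.Sockets") ||
                                           error.HasSource("System.Net.Security") ||
                                           error.InnerException is SocketException))
            {
                Chunk.Timeout += TimeoutIncrement; // back off before the next attempt
                await Task.Delay(Chunk.Timeout, cancellationToken);
            }
        }
    }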

Thank you for this great and clean code by the way.

Cannot download to UNC path

When trying to download to a network UNC path like \\server\d, it fails with the following exception:
Drive name must be a root directory (i.e. 'C:\') or a drive letter ('C'). (Parameter 'driveName')

This comes from this stacktrace:

System.ArgumentException: Drive name must be a root directory (i.e. 'C:\') or a drive letter ('C'). (Parameter 'driveName')
   at System.IO.DriveInfoInternal.NormalizeDriveName(String driveName)
   at System.IO.DriveInfo..ctor(String driveName)
   at Downloader.FileHelper.CheckDiskSize(String directory, Int64 actualSize)
   at Downloader.DownloadService.CheckSizes()
   at Downloader.DownloadService.Validate()
   at Downloader.DownloadService.StartDownload()
   at Downloader.DownloadService.DownloadFileAsync(String address, String fileName)

It's checking for the drive size here:

DriveInfo drive = new DriveInfo(Directory.GetDirectoryRoot(directory));

Unfortunately drive info does not support UNC paths:
https://docs.microsoft.com/en-us/dotnet/api/system.io.driveinfo.-ctor?view=net-5.0

Use this class to obtain information on drives. The drive name must be either an uppercase or lowercase letter from 'a' to 'z'. You cannot use this method to obtain information on drive names that are null or use UNC (\\server\share) paths.

I think it might be best to simply skip doing the CheckSizes function when a network path is found?
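A sketch of that guard (not the actual library change, and the exception message is a placeholder): bail out of the size check when the target is a UNC share, which DriveInfo cannot represent.

    public static void CheckDiskSize(string directory, long actualSize)
    {
        // DriveInfo cannot describe \\server\share paths, so skip the check for UNC targets.
        if (new Uri(Path.GetFullPath(directory)).IsUnc)
            return;

        var drive = new DriveInfo(Directory.GetDirectoryRoot(directory));
        if (drive.IsReady && drive.AvailableFreeSpace < actualSize)
            throw new IOException($"There is not enough space on the drive `{drive.Name}`."); // placeholder message
    }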

Resuming a download starts from the beginning and appends data to the temp file

Hello :)

I've found a weird bug.

When I resume my download it starts from the beginning but appends data to the temp file.

Process

As you can see, 41 MB of the 50 MB file had been downloaded; this is what was saved when I closed my app.

Now when I resume and download another 30 MB, the package looks correct and the temp file is the same, but the progress reports that it started from the beginning.

But the temp file is now 75 MB.

And if I let it download the whole file, it saves all the bytes from the temp file into the new one, so the downloaded file ends up at 125 MB!

Config

var config = new DownloadConfiguration {
  CheckDiskSizeBeforeDownload = true,
  OnTheFlyDownload = false,
  TempDirectory = Config.DownloadDirectory
};

Tested file: http://ipv4.download.thinkbroadband.com/50MB.zip (ignore the name 1.mp4, it is that 50MB.zip file)

Package serialized using BinaryFormatter

Version: 2.2.6

How do you resume a download?

I really want to use this in my project, but I can't find any kind of start range. Right now I use connection.SetRequestProperty("Range", "bytes=" + rFile.Length() + "-"); for my own download, just so I can pick up where I stopped, but I can't find anything like it in DownloadConfiguration or RequestConfiguration. Also, is this project supported on Android, and can it handle Android paths?
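For what it's worth, other issues on this page suggest resume works through the DownloadPackage rather than a Range setting: the library code quoted in the last issue assigns to a Package property and exposes DownloadFileAsync(DownloadPackage). A hedged sketch (SaveToDisk and LoadFromDisk are your own serialization helpers, and several reports on this page describe bugs in this path in 2.2.x):

    // Sketch: persist the package when stopping, feed it back in to resume.
    DownloadPackage package = downloader.Package;   // assumed public property (assigned in the quoted library code)
    SaveToDisk(package);                            // your own serializer (others here used BinaryFormatter)

    // ... later, in a new session ...
    DownloadPackage restored = LoadFromDisk();      // your own deserializer
    var resumer = new DownloadService(downloadOpt);
    await resumer.DownloadFileAsync(restored);      // overload shown in the last issue on this page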

Missing Interface, Events/EventArgs issue

I'd like to have an interface for DownloadService so I can mock it to facilitate testing.

There are also a couple of issues with the events that mean I can't convert them to Observables:

DownloadProgressChangedEventArgs should derive from EventArgs, and the event handlers should all be declared with the event keyword.

I've resolved these myself, with all tests passing, however I'm unable to push the branch to be reviewed. Could you please change the repo permissions to allow contributors to submit their changes?

Allow Pausing downloads

Please add a method to DownloadService to allow for pausing the active downloads.

This is useful for example when downloading big files and trying to offer a "Pause" feature for users, in case they need to use their bandwidth for something else.

Allow setting a speed limit on downloads

Please allow setting a speed limit for downloads in the DownloadConfiguration object.
It's an often requested feature for things like updaters that have to download big files, and would otherwise suck up all the bandwidth on the network.

If this is already possible and I haven't found the right way to do it, please let me know.

Thank you!
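Other issues on this page show a MaximumBytesPerSecond option in DownloadConfiguration (one comment there notes that 0 means no limit), so this may already be possible; a hedged sketch:

    var downloadOpt = new DownloadConfiguration
    {
        MaximumBytesPerSecond = 1024 * 1024 // cap at about 1 MB/s; 0 appears to mean unlimited
    };
    var downloader = new DownloadService(downloadOpt);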

Download goes over 100% when continuing a download from package info

Version: 2.2.4


When my app loads a saved package from a file (serialized in binary) and continues downloading, Downloader downloads and reports progress over 100% (and over TotalBytes)! (Check BytesReceived and Percent.)

It only happens when Downloader is continuing from the package.

My config:

var config = new DownloadConfiguration {
    CheckDiskSizeBeforeDownload = true,
    OnTheFlyDownload = false,
};

Suggestions:

  • My config is using OnTheFlyDownload set to false. I think that when it was true I didn't have this issue.
  • My code for saving and loading package info is here. Maybe there is something wrong?

Some downloads don't start

Hi, I used version 1.9.7 but updated to the newest version.

With the old version I was able to download from all URLs, but since the update I can't download from two of the four URLs I used.

The downloader does nothing after the DownloadFileTaskAsync call. I tried debugging into the events, but none of them even gets called.

The URLs which work are:
http://{servername}/8.20.zip
http://{servername}/ZEE600%202.0%20Installation.zip

and these don't:
http://{servername}/HelpFileEN.zip
http://{servername}/HelpFileDE.zip

I am able to download all the files via a browser, so the files or the access to them isn't the problem.

Hope you can help.

Windows 10 Build 1909
.NET Framework 4.7.2
Downloader 2.2.4


Best regards
Hightower1992

Download from Redirected Url

Hi, hope you are well.
It seems that it is not possible to download from a server that forwards the download link to another address.
For example, the following address will not be downloaded:

https://isubtitles.org/download/arrow/farsi-persian/1801961

If you try it with Internet Download Manager, you will see that this link is redirected to:
https://file.isubtitles.org/2020/11/16/arrow-farsi-persian-1801961.zip

Using try-catch

Hello,
Why aren't errors caught by try-catch?
I sometimes run into various errors and there is no way I can handle them with try-catch.
In the attached screenshot the program hits an exception, but the try-catch block is not executed.

Resume downloading doesn't start from saved temp file

Resuming a download doesn't start from the saved temp file; instead, it appends more data to it.

This is a temp file saved in my custom directory just to test things out.


The 1-chunkInfo file is the Package class saved to disk using BinaryFormatter.


It has more bytes (762,290,176 bytes) than ReceivedBytesSize, and even more than TotalBytesToReceive (762,290,176).


I am using these settings to download it:

var config = new DownloadConfiguration {
    CheckDiskSizeBeforeDownload = true,
    OnTheFlyDownload = false,
    TempDirectory = Config.DownloadDirectory
};

It starts the download from the beginning every time I pass the Package object to it. After downloading, it saves all the bytes into a file like this:


It is 1.5 GB instead of 640 MB!

Am I doing something incorrectly, or is it a bug?

Executing downloaded file immediately fails with access exception

When trying to execute the downloaded file immediately after OnDownloadFileCompleted is called:
Win32Exception: The process cannot access the file because it is being used by another process

Example:

private string filePath = @"J:\Test\hb_setup.exe";

private async Task StartDownload()
{
  var downloadOpt = new DownloadConfiguration()
  {
    BufferBlockSize = 10240, // usually, hosts support a max of 8000 bytes; default value is 8000
    ChunkCount = 8, // file parts to download; default value is 1
    MaxTryAgainOnFailover = 100, // the maximum number of times to fail
    OnTheFlyDownload = false, // caching in-memory or not? default value is true
    ParallelDownload = true, // download parts of the file in parallel or not; default value is false
    Timeout = 1000, // timeout (milliseconds) per stream block reader; default value is 1000
  };

  downloader = new DownloadService(downloadOpt);
  downloader.DownloadFileCompleted += OnDownloadFileCompleted;
  await downloader.DownloadFileTaskAsync(@"https://github.com/HandBrake/HandBrake/releases/download/1.3.3/HandBrake-1.3.3-x86_64-Win_GUI.exe", filePath);
}

private void OnDownloadFileCompleted(object sender, AsyncCompletedEventArgs e)
{
  if (e.Cancelled)
  {
    //user cancelled
  }
  else if (e.Error != null)
  {
    //error
  }
  else
  {
    Process.Start(filePath);
  }
}

I tried calling Dispose beforehand; still the same issue.

Big download stalls after a while

Hello, I'm embedding this library in my app and after updating to 2.2.8, certain big downloads seem to stall after a while.

Here is my configuration object:

    ParallelDownload = true, // download parts of file as parallel or not
    BufferBlockSize = 8000, // usually, hosts support max to 8000 bytes
    ChunkCount = 8, // file parts to download
    MaxTryAgainOnFailover = int.MaxValue, // the maximum number of times to fail
    OnTheFlyDownload = false, // caching in-memory mode
    Timeout = 10000, // timeout (millisecond) per stream block reader
    TempDirectory = Path.GetTempPath(), // this is the library default
    RequestConfiguration = new RequestConfiguration
    {
        UserAgent = "FFXIV PATCH CLIENT",
        Accept = "*/*"
    },
    MaximumBytesPerSecond = App.Settings.SpeedLimitBytes / MAX_DOWNLOADS_AT_ONCE

In this example, MaximumBytesPerSecond is confirmed to be set to 0 on initialization and as such will be its maximum value.

I currently run 4 downloads at once. After a while, all 4 get stuck. The BytesPerSecondSpeed on DownloadProgressChanged ends up at some value between 100-200 kbps and receives no further updates. Progress does not advance.

Is this a problem with my usage of the library? You can see the whole logic for downloading files here: https://github.com/goatcorp/FFXIVQuickLauncher/blob/master/XIVLauncher/Game/Patch/PatchManager.cs#L146

When downloads get stuck, even after the specified timeout, the error/cancelled objects are null.
I've ruled out the download server, as I can perform more than 4 simultaneous downloads using JDownloader just fine.
Disk space is not an issue.

Let me know if I can help you any further reproducing this.

Sample files:

http://patch-dl.ffxiv.com/game/4e9a232b/H2017.06.06.0000.0001a.patch
http://patch-dl.ffxiv.com/game/4e9a232b/H2017.06.06.0000.0001b.patch
http://patch-dl.ffxiv.com/game/4e9a232b/H2017.06.06.0000.0001c.patch
http://patch-dl.ffxiv.com/game/4e9a232b/H2017.06.06.0000.0001d.patch
[...]
http://patch-dl.ffxiv.com/game/4e9a232b/H2017.06.06.0000.0001m.patch

System.Net.WebException: 'An error occurred while sending the request. The response ended prematurely.'

This error occurs in version 1.2's sample project.

StackTrace:
System.Net.Http.HttpConnection.d__53.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
System.Net.Http.HttpConnectionPool.<SendWithNtConnectionAuthAsync>d__48.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
System.Net.Http.HttpConnectionPool.d__47.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
System.Net.Http.RedirectHandler.<SendAsync>d__4.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
System.Net.Http.DiagnosticsHandler.d__2.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
System.Net.Http.HttpClient.<FinishSendAsyncUnbuffered>d__71.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
System.Net.HttpWebRequest.d__194.MoveNext()
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
System.Net.HttpWebRequest.GetResponse()

Adding Cookie to download request

Hi,

I'm trying to download files from volafile.org rooms. To get files from it when I use WebClient, I have to add
client.Headers.Add(HttpRequestHeader.Cookie, "allow-download=1");
(without it, the request ends with a 403 Not Allowed error).
If I'm not mistaken, there is no way to do that using RequestConfiguration.
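The M3U8 issue earlier on this page shows RequestConfiguration exposing a CookieContainer, so this may already be supported; a hedged sketch (the Domain value is a guess and must match the host actually serving the files):

    var cookies = new CookieContainer();
    cookies.Add(new Cookie("allow-download", "1") { Domain = "volafile.org" }); // domain is an assumption

    var downloadOpt = new DownloadConfiguration
    {
        RequestConfiguration =
        {
            CookieContainer = cookies
        }
    };
    var downloader = new DownloadService(downloadOpt);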

High memory usage when downloading big files

This is more of a feature request, but I'll write it down here anyways:

It would be nice if the library used the final output file together with shared write streams during the download, instead of allocating a buffer for the whole file at all times.
This causes very high memory usage when downloading big files, especially when downloading several of them, since memory for each whole file is allocated for the entire duration.

Thanks!

Package.Chunks always ignored

Hi, I'm trying to continue a previously started download.

I saved the package to a custom JSON file and restore it to create a DownloadPackage object. It works fine, except it doesn't continue from the paused state!
The download starts with the last percentage but keeps going until it finishes at 100% plus the last state. E.g., if I paused at 37% and restart the app, the download finishes at 137%.

Reading the code, I've found something...

public async Task<Stream> DownloadFileAsync(DownloadPackage package)
{
    Package = package;
    InitialDownloader(package.Address.OriginalString);
    return await StartDownload().ConfigureAwait(false);
}

private async Task<Stream> StartDownload()
{
	// ...
	Package.Chunks = _chunkHub.ChunkFile(Package.TotalFileSize, Package.Options.ChunkCount);
}

In the code above, the package is read and StartDownload is called, but the chunks are always overwritten. I restored the package's content, but these values aren't read and the whole file is downloaded again.
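A sketch of the kind of guard that would preserve restored chunks (based only on the snippet above, not on the actual fix): only carve new chunks when the package does not already carry any.

    private async Task<Stream> StartDownload()
    {
        // ...
        if (Package.Chunks == null) // keep the chunks restored from a saved package
        {
            Package.Chunks = _chunkHub.ChunkFile(Package.TotalFileSize, Package.Options.ChunkCount);
        }
        // ...
    }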
