
azure / azure-storage-js

75 stars, 19 watchers, 31 forks, 1.73 MB

Microsoft Azure Storage Library for JavaScript. (This repo was moved to https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/storage on April 18, 2019, and will be deprecated.)

Home Page: https://docs.microsoft.com/en-us/javascript/api/overview/azure/storage/client?view=azure-node-preview

License: MIT License

Languages: JavaScript 2.71%, TypeScript 97.29%


azure-storage-js's People

Contributors

bterlson, devconcept, jiacfan, ljian3377, microsoft-github-policy-service[bot], microsoftopensource, msftgits, vinjiang, xiaoningliu


azure-storage-js's Issues

Replace SAS token while uploading to blob storage

Which service(blob, file, queue, table) does this issue concern?

Blob

Which version of the SDK was used?

latest

What's the Node.js/Browser version?

Chrome

What problem was encountered?

Is there any way to replace the SAS token while uploading to Blob Storage? The exact problem is described below.

Example:
Our token expires after 24 minutes. I would like to upload a 10 GB file, which takes about an hour on my network connection. After 24 minutes the upload fails with a 401 response (auth issue) because the token has expired. Is there any way to swap in a new, valid SAS token after, say, 23 minutes of uploading? This is just a question; of course we could extend the token lifetime to one or two hours, but I would rather not.

0–23 min with token A
23–46 min with token B
etc.

Steps to reproduce the issue?

Have you found a mitigation/solution?

no
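
One hedged workaround sketch, not an official API: stage the blocks yourself and rebuild the BlockBlobURL with a fresh SAS before each block, then commit. getFreshSasToken() is a hypothetical token provider and blobPath a placeholder; credentials in v10 are baked into the URL object's pipeline, so swapping tokens means swapping URL objects.

const { Aborter, AnonymousCredential, BlockBlobURL, StorageURL } = require("@azure/storage-blob");

async function uploadWithRotatingSas(blobPath, blocks) {
  const pipeline = StorageURL.newPipeline(new AnonymousCredential());
  const blockIds = [];
  for (let i = 0; i < blocks.length; i++) {
    // Mint a fresh SAS and a fresh URL object before each block (hypothetical provider).
    const blockBlobURL = new BlockBlobURL(`${blobPath}?${await getFreshSasToken()}`, pipeline);
    const blockId = Buffer.from(String(i).padStart(6, "0")).toString("base64");
    await blockBlobURL.stageBlock(Aborter.none, blockId, blocks[i], blocks[i].length);
    blockIds.push(blockId);
  }
  // Commit with whatever token is valid at the end.
  const finalURL = new BlockBlobURL(`${blobPath}?${await getFreshSasToken()}`, pipeline);
  await finalURL.commitBlockList(Aborter.none, blockIds);
}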

@Azure/storage-queue is causing builds to fail in Create React App TypeScript

Which service(blob, file, queue, table) does this issue concern?

queue

Which version of the SDK was used?

@azure/[email protected]

What's the Node.js/Browser version?

Node.js v10.14.1

What problem was encountered?

When attempting to run CRA-ts with the above version installed I get this error:

./node_modules/@azure/storage-queue/dist-esm/lib/index.browser.js
Module not found: [CaseSensitivePathsPlugin] /app/node_modules/@azure/storage-queue/dist-esm/lib/MessageIdURL.js does not match the corresponding path on disk MessageIDURL.js.

Steps to reproduce the issue?

  • Have the latest versions of Create React App TypeScript and @azure/storage-queue in package.json
  • Run npm start

Have you found a mitigation/solution?

Renaming MessageIDURL.js to MessageIdURL.js fixes the issue.

Reading an append blob to a string takes a long time and then returns HTTP 412

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

@azure/[email protected]

What's the Node.js/Browser version?

Node.js v10.14.1

What problem was encountered?

Following the example at https://github.com/Azure/azure-storage-js/blob/master/blob/samples/basic.sample.js, I was trying to read an append blob from my storage account into a string. The streamToString function took a very long time and finally failed with an HTTP 412 error and an errorCode of undefined. I suspect the error has something to do with the append blob constantly receiving new lines while I read it: it is a log file, and I would just like to read its current snapshot. I could not find any examples covering a scenario like mine. Any help would be appreciated!

The detailed error is below:

{ Error: Unexpected status code: 412
    at new RestError (C:\projects\xxx\RequestLogViewer\node_modules\@azure\ms-rest-js\dist\msRest.node.js:1397:28)
    at C:\projects\xxx\RequestLogViewer\node_modules\@azure\ms-rest-js\dist\msRest.node.js:1849:37
    at process._tickCallback (internal/process/next_tick.js:68:7)
  code: undefined,
  statusCode: 412,
  request:
   WebResource {
     streamResponseBody: true,
     url:
      'https://xxxstor.blob.core.windows.net/request-logs/2019-01-04.txt',
     method: 'GET',
     headers: HttpHeaders { _headersMap: [Object] },
     body: undefined,
     query: undefined,
     formData: undefined,
     withCredentials: false,
     abortSignal:
      a {
        _aborted: false,
        children: [],
        abortEventListeners: [Array],
        parent: undefined,
        key: undefined,
        value: undefined },
     timeout: 0,
     onUploadProgress: undefined,
     onDownloadProgress: undefined,
     operationSpec:
      { httpMethod: 'GET',
        path: '{containerName}/{blob}',
        urlParameters: [Array],
        queryParameters: [Array],
        headerParameters: [Array],
        responses: [Object],
        isXML: true,
        serializer: [Serializer] } },
  response:
   { body: undefined,
     headers: HttpHeaders { _headersMap: [Object] },
     status: 412 },
  body: undefined }

Steps to reproduce the issue?

Here is my code:

const {
  Aborter,
  BlobURL,
  ContainerURL,
  SharedKeyCredential,
  ServiceURL,
  StorageURL,
} = require('@azure/storage-blob');
const format = require('date-fns/format');

async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on('data', (data) => {
      chunks.push(data.toString());
    });
    readableStream.on('end', () => {
      resolve(chunks.join(''));
    });
    readableStream.on('error', reject);
  });
}

async function run() {
  const accountName = 'xxxstor';
  const accountKey = 'omitted';
  const credential = new SharedKeyCredential(accountName, accountKey);
  const pipeline = StorageURL.newPipeline(credential);
  const serviceURL = new ServiceURL(
    `https://${accountName}.blob.core.windows.net`,
    pipeline
  );
  const containerName = 'request-logs';
  const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
  const blobName = `${format(new Date(), 'YYYY-MM-DD[.txt]')}`;
  const blobURL = BlobURL.fromContainerURL(containerURL, blobName);
  console.log('Downloading blob...');
  const response = await blobURL.download(Aborter.none, 0);
  console.log('Reading response to string...');
  const body = await streamToString(response.readableStreamBody);
  console.log(body.length);
}

run().catch((err) => {
  console.error(err);
});

Have you found a mitigation/solution?

no
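
A hedged mitigation for a constantly-growing append blob: snapshot it first and read the snapshot, so the content cannot change mid-download (a plausible cause of the 412 precondition failure). A minimal sketch reusing blobURL and streamToString from the code above:

const snapshotResponse = await blobURL.createSnapshot(Aborter.none);
const snapshotURL = blobURL.withSnapshot(snapshotResponse.snapshot);
const response = await snapshotURL.download(Aborter.none, 0);
const body = await streamToString(response.readableStreamBody);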

REQUEST_ABORTED_ERROR when uploading blob

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

"@azure/storage-blob": "10.2.0-preview"

What's the Node.js/Browser version?

node 8.10

What problem was encountered?

I get REQUEST_ABORTED_ERROR when trying to upload a blob.
It happens only on the Azure deployment; on my local machine it works perfectly.

Steps to reproduce the issue?

call uploadStreamToBlockBlob with arguments:
Aborter.timeout(FIVE_MINUTES_IN_MS), readableStream, blockBlobURL, 4 * 1024 * 1024, 20

Have you found a mitigation/solution?

Nope

Downloading a blob should not reject on 304

Which service(blob, file, queue, table) does this issue concern?

Blob

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

v8.12.0

What problem was encountered?

When using BlobURL.download with the ifNoneMatch modifiedAccessCondition, I noticed that the promise gets rejected when the response code is 304.

It feels a bit unnatural, though, to have to catch and branch on the status code in this case:

       (.catch (fn [error]
                     (let [code (gobj/get error "statusCode")]
                       (cond
                         ;; not found
                         (= 404 code) (do (when cache
                                            (vswap! cache cache/evict blob-name))
                                          (reject nil))

                         ;; not modified
                         (= 304 code)
                         (resolve (some-> cache
                                          deref
                                          (cache/lookup blob-name)
                                          :content))

                         :else (reject error)))))

Promise rejections usually signal unrecoverable problems and stop the code flow. I think 30X responses should not be treated as exceptions. Even better, an option could let you choose what to reject on; some libraries do that.

Steps to reproduce the issue?

Pass a populated blobAccessConditions.modifiedAccessConditions.ifNoneMatch option when downloading any blob.

Have you found a mitigation/solution?

See above, not really satisfying.
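
For reference, a minimal JavaScript sketch of the same workaround, treating 304 as a cache-hit signal rather than a failure (the surrounding cache handling is left to the caller):

async function downloadIfModified(blobURL, etag) {
  try {
    return await blobURL.download(Aborter.none, 0, undefined, {
      blobAccessConditions: { modifiedAccessConditions: { ifNoneMatch: etag } }
    });
  } catch (err) {
    if (err.statusCode === 304) return null; // not modified: caller falls back to its cache
    throw err;
  }
}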

SharedKeyCredential is not importable in a browser environment.

Which service(blob, file, queue, table) does this issue concern?

blob storage

Which version of the SDK was used?

v10.1.0-preview-blob

What's the Node.js/Browser version?

Node 8.4.2 (node -v), Angular 5

What problem was encountered?

import { SharedKeyCredential, AnonymousCredential, TokenCredential } from "@azure/storage-blob";

SharedKeyCredential is not imported from @azure/storage-blob; all the other imports are added to the project fine.

Steps to reproduce the issue?

Install the package in any Angular project and import it in a component:

import { SharedKeyCredential, Credential, TokenCredential } from "@azure/storage-blob";

ngOnInit() {
  console.log(SharedKeyCredential); // undefined
}

The docs state: "ONLY AVAILABLE IN NODE.JS RUNTIME. SharedKeyCredential for account key authorization of Azure Storage service."

Is it possible to use this with Angular 5?

Need help from community.
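
The browser bundle omits SharedKeyCredential deliberately, since shipping an account key to clients would expose it. A hedged sketch of the browser-side alternative using a SAS token (SAS_TOKEN is a hypothetical placeholder, including its leading "?", issued by your backend):

import { AnonymousCredential, ServiceURL, StorageURL } from "@azure/storage-blob";

const serviceURL = new ServiceURL(
  `https://myaccount.blob.core.windows.net${SAS_TOKEN}`,
  StorageURL.newPipeline(new AnonymousCredential())
);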

Failing to upload using `uploadStreamToBlockBlob`

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

@azure/storage-blob ^10.3.0

What's the Node.js/Browser version?

node v9.10.1

What problem was encountered?

Attempting to upload a Node file stream as a blob using uploadStreamToBlockBlob fails, producing a confusing error whose cause I have been unable to pinpoint.

Steps to reproduce the issue?

I created a resolver function for apollo-server 2, where the resolver receives a file stream and then attempts to upload it as a new blob in a container named 'avatars'.

Resolver code

const uploadAvatar = isAuthenticatedResolver.createResolver(
  async (root, args, context, error) => {
    const { file } = args;
    const { stream, filename, mimetype, encoding } = await file;

    const account = "remix2";
    const accountKey =
      "redacted";

    // Use SharedKeyCredential with storage account and account key
    const sharedKeyCredential = new SharedKeyCredential(account, accountKey);

    // Use sharedKeyCredential, tokenCredential or anonymousCredential to create a pipeline
    const pipeline = StorageURL.newPipeline(sharedKeyCredential);

    const serviceURL = new ServiceURL(
      // When using AnonymousCredential, following url should include a valid SAS or support public access
      `https://${account}.blob.core.windows.net`,
      pipeline
    );

    const containerName = "avatars";
    const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);

    const blobName = "newblob" + new Date().getTime();
    const blobURL = BlobURL.fromContainerURL(containerURL, blobName);
    const blockBlobURL = BlockBlobURL.fromBlobURL(blobURL);
    const uploadBlobResponse = await uploadStreamToBlockBlob(
      Aborter.timeout(30 * 60 * 60 * 1000),
      stream,
      blockBlobURL,
      20
    );
    console.log(
      `Upload block blob ${blobName} successfully`,
      uploadBlobResponse
    );
  }
);

Error code

TypeError: Cannot read property 'on' of undefined
    at /Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/dist-esm/lib/utils/BufferScheduler.js:156:40
    at new Promise (<anonymous>)
    at e.<anonymous> (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/dist-esm/lib/utils/BufferScheduler.js:153:25)
    at /Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:97:23
    at Object.next (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:78:53)
    at /Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:71:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:67:12)
    at e.do (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/dist-esm/lib/utils/BufferScheduler.js:140:14)
    at /Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/dist-esm/lib/highlevel.node.js:304:58
    at /Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:97:23
    at Object.next (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:78:53)
    at /Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:71:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/node_modules/tslib/tslib.es6.js:67:12)
    at uploadStreamToBlockBlob (/Users/connorelsea/Projects/remix-server/node_modules/@azure/storage-blob/dist-esm/lib/highlevel.node.js:271:9)
    at _callee3$ (/Users/connorelsea/Projects/remix-server/resolvers/storage.js:179:38)
    at tryCatch (/Users/connorelsea/Projects/remix-server/node_modules/regenerator-runtime/runtime.js:65:40)
    at Generator.invoke [as _invoke] (/Users/connorelsea/Projects/remix-server/node_modules/regenerator-runtime/runtime.js:303:22)
    at Generator.prototype.(anonymous function) [as next] (/Users/connorelsea/Projects/remix-server/node_modules/regenerator-runtime/runtime.js:117:21)
    at step (/Users/connorelsea/Projects/remix-server/resolvers/storage.js:25:191)
    at /Users/connorelsea/Projects/remix-server/resolvers/storage.js:25:361

Have you found a mitigation/solution?

Not yet. I am reading through the code the stack trace points to, trying to understand the cause, but hoping someone here might know what I'm doing wrong.

How to list blobs?

I could not find how to list blobs under a certain prefix, similar to listBlobsSegmentedWithPrefix from the old SDK. I see there is a method for listing containers, but is there one for listing blobs?
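
A minimal sketch of the v10 counterpart: page through ContainerURL.listBlobFlatSegment, which accepts a prefix option (containerURL and the prefix value are assumed):

let marker;
do {
  const response = await containerURL.listBlobFlatSegment(Aborter.none, marker, {
    prefix: "myfolder/"
  });
  marker = response.nextMarker;
  for (const blob of response.segment.blobItems) {
    console.log(blob.name);
  }
} while (marker);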

uploadStreamToBlockBlob doesn't work

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

I don't know how to find the version of the SDK on Ubuntu.
Here is the output of az cli:
andrey@andrey-ubuntu:~$ az --version
azure-cli (2.0.49)

acr (2.1.7)
acs (2.3.9)
advisor (0.6.0)
ams (0.2.4)
appservice (0.2.5)
backup (1.2.1)
batch (3.4.1)
batchai (0.4.4)
billing (0.2.0)
botservice (0.1.1)
cdn (0.2.0)
cloud (2.1.0)
cognitiveservices (0.2.3)
command-modules-nspkg (2.0.2)
configure (2.0.18)
consumption (0.4.0)
container (0.3.7)
core (2.0.49)
cosmosdb (0.2.2)
dla (0.2.3)
dls (0.1.4)
dms (0.1.1)
eventgrid (0.2.0)
eventhubs (0.3.0)
extension (0.2.2)
feedback (2.1.4)
find (0.2.12)
hdinsight (0.1.0)
interactive (0.3.31)
iot (0.3.3)
iotcentral (0.1.3)
keyvault (2.2.5)
lab (0.1.2)
maps (0.3.2)
monitor (0.2.5)
network (2.2.7)
nspkg (3.0.3)
policyinsights (0.1.0)
profile (2.1.1)
rdbms (0.3.2)
redis (0.3.2)
relay (0.1.2)
reservations (0.4.0)
resource (2.1.5)
role (2.1.8)
search (0.1.1)
servicebus (0.3.1)
servicefabric (0.1.6)
signalr (1.0.0)
sql (2.1.5)
storage (2.2.3)
telemetry (1.0.0)
vm (2.2.6)

Python location '/opt/az/bin/python3'
Extensions directory '/home/andrey/.azure/cliextensions'

Python (Linux) 3.6.5 (default, Oct 18 2018, 19:51:37)
[GCC 7.3.0]

What's the Node.js/Browser version?

node 8.10

What problem was encountered?

uploadStreamToBlockBlob doesn't work

Steps to reproduce the issue?

const buf = Buffer.from([0x62, 0x75, 0x66, 0x66, 0x65, 0x72]);
const bufferStream = new (require('stream').PassThrough)();
bufferStream.end(buf);

const uploadRes = await uploadStreamToBlockBlob(Aborter.none, bufferStream, blockBlobURL, {
  blockSize: 2 * 1024 * 1024, // 2MB block size
  parallelism: 20, // 20 concurrency
  blobHTTPHeaders: { // headers returned with each request to the photo
    blobContentType: "image/jpeg",
    blobCacheControl: "max-age=0, must-revalidate"
  }
});

Have you found a mitigation/solution?

If I save the stream to a file and use uploadFileToBlockBlob, it works.
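
The likely cause, judging by the v10 signature: uploadStreamToBlockBlob takes bufferSize and maxBuffers as positional arguments before the options object, so the repro call above passes an options object where a number is expected. A corrected sketch:

const uploadRes = await uploadStreamToBlockBlob(
  Aborter.none,
  bufferStream,
  blockBlobURL,
  2 * 1024 * 1024, // bufferSize: 2MB per buffer
  20,              // maxBuffers: up to 20 buffers in flight
  {
    blobHTTPHeaders: {
      blobContentType: "image/jpeg",
      blobCacheControl: "max-age=0, must-revalidate"
    }
  }
);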

10.2.0 does not support Chinese file names

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10.2.0 and 10.1.0

What's the Node.js/Browser version?

v10.14.1

What problem was encountered?

Uploads with a Chinese file name fail to reach the blob container.

Steps to reproduce the issue?

Use the following function with a filePath that contains Chinese or other non-ASCII UTF-8 characters:
async function uploadStream(aborter, containerURL, filePath) {
  filePath = path.resolve(filePath);

  const fileName = encodeURI(path.basename(filePath));
  console.log(`${fileName} ${containerURL}`);
  const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, "test/" + fileName);
  console.log(`${blockBlobURL}`);

  const stream = fs.createReadStream(filePath, {
    highWaterMark: FOUR_MEGABYTES,
  });

  const uploadOptions = {
    bufferSize: FOUR_MEGABYTES,
    maxBuffers: 5,
  };

  return await uploadStreamToBlockBlob(
    aborter,
    stream,
    blockBlobURL,
    uploadOptions.bufferSize,
    uploadOptions.maxBuffers);
}

Have you found a mitigation/solution?
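
A hedged guess at a mitigation, assuming double encoding is the culprit: pass the raw UTF-8 name and let the SDK handle URL encoding, since calling encodeURI() yourself can encode the name twice and invalidate the request signature.

const fileName = path.basename(filePath); // no encodeURI(): the SDK encodes the URL itself
const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, "test/" + fileName);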

[Tests] incorrect assert.ok() on result from string.indexOf

They should verify that indexOf() !== -1 (or >= 0), or use includes().

>rg assert\.ok.*indexOf
blob\tests\containerurl.test.ts
173:    assert.ok(containerURL.url.indexOf(result.containerName));
176:    assert.ok(blobURLs[0].url.indexOf(result.segment.blobItems![0].name));
216:    assert.ok(containerURL.url.indexOf(result.containerName));
218:    assert.ok(blobURLs[0].url.indexOf(result.segment.blobItems![0].name));
238:    assert.ok(containerURL.url.indexOf(result2.containerName));
240:    assert.ok(blobURLs[0].url.indexOf(result2.segment.blobItems![0].name));
264:    assert.ok(containerURL.url.indexOf(result.containerName));
274:      assert.ok(blob.url.indexOf(result.segment.blobPrefixes![i++].name));
311:    assert.ok(containerURL.url.indexOf(result.containerName));
314:    assert.ok(blobURLs[0].url.indexOf(result.segment.blobPrefixes![0].name));
327:    assert.ok(containerURL.url.indexOf(result2.containerName));
330:    assert.ok(blobURLs[0].url.indexOf(result2.segment.blobPrefixes![0].name));
343:    assert.ok(containerURL.url.indexOf(result3.containerName));
348:    assert.ok(blobURLs[0].url.indexOf(result3.segment.blobItems![0].name));

file\tests\directoryurl.test.ts
104:    assert.ok(shareURL.url.indexOf(result.shareName));
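
The pitfall: indexOf() returns 0 (falsy) for a match at the start of the string and -1 (truthy) for no match, so assert.ok() on the raw index checks the wrong thing. A corrected form:

assert.ok(containerURL.url.indexOf(result.containerName) !== -1);
// or, more directly:
assert.ok(containerURL.url.includes(result.containerName));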

Missing setProperties function for blob

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

11.9.0

What problem was encountered?

Missing setProperties on BlobURL. There are getProperties and setMetadata though.

Steps to reproduce the issue?

N/A

Have you found a mitigation/solution?

No
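
For header-type properties specifically, v10's BlobURL.setHTTPHeaders appears to cover part of what a setProperties would do; a minimal sketch:

await blobURL.setHTTPHeaders(Aborter.none, {
  blobContentType: "text/plain",
  blobCacheControl: "max-age=3600"
});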

Refactor pipeline

This is not a bug, just a suggestion.

You create a new pipeline by creating a new array and then pushing elements.

const factories: RequestPolicyFactory[] = [];
factories.push(new TelemetryPolicyFactory(pipelineOptions.telemetry));
factories.push(new UniqueRequestIDPolicyFactory());
factories.push(new BrowserPolicyFactory());
factories.push(deserializationPolicy()); // Default deserializationPolicy is provided by protocol layer
factories.push(new RetryPolicyFactory(pipelineOptions.retryOptions));
factories.push(new LoggingPolicyFactory());
factories.push(credential);

Why not create your pipeline using this?

const factories: RequestPolicyFactory[] = [
  new TelemetryPolicyFactory(pipelineOptions.telemetry),
  new UniqueRequestIDPolicyFactory(),
  new BrowserPolicyFactory(),
  deserializationPolicy(), // Default deserializationPolicy is provided by protocol layer
  new RetryPolicyFactory(pipelineOptions.retryOptions),
  new LoggingPolicyFactory(),
  credential
];

The array literal makes the intent just as clear and removes a confusing level of indirection.


API reference link

I see you support Swagger, but I don't see where the generated API reference is located. Please put a link to the API reference in the README if it exists.

Samples should work with copy/paste

The hero sample in README.md includes top-level await and doesn't import anything. It should include the imports, as well as an async main function, a common pattern in samples; see the sketch below.

Under the samples directory, the examples seem to assume they're running in the project, and not as an external dependency.
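
A hedged sketch of what a copy/paste-ready hero sample might look like, with explicit imports and an async main (account name, key, and container are placeholders):

const {
  Aborter,
  ContainerURL,
  ServiceURL,
  SharedKeyCredential,
  StorageURL
} = require("@azure/storage-blob");

async function main() {
  const credential = new SharedKeyCredential("account", "accountKey");
  const serviceURL = new ServiceURL(
    "https://account.blob.core.windows.net",
    StorageURL.newPipeline(credential)
  );
  const containerURL = ContainerURL.fromServiceURL(serviceURL, "mycontainer");
  await containerURL.create(Aborter.none);
}

main().catch((err) => console.error(err));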

Aborter.timeout() scale may be misleading

The documentation for Aborter.timeout() states that its timeout parameter is expressed in "million-seconds":

/**
 * Creates a new Aborter instance with timeout in million-seconds.
 * Set parameter timeout to 0 will not create a timer.
 *
 * @static
 * @param {number} {timeout} in million-seconds
 * @returns {Aborter}
 * @memberof Aborter
 */
public static timeout(timeout: number): Aborter {
  return new Aborter(undefined, timeout);
}

However, Aborter.constructor() will pass this value directly to setTimeout, which uses milliseconds:

this.timer = setTimeout(() => {
  this.abort.call(this);
}, timeout);

Is this a typo, or am I missing something?

How to upload a local file from browser?

The BlobService has a method called createBlockBlobFromBrowserFile to upload an HTML5 File object to Blob storage. I can't find a similar call in this SDK to upload an HTML5 File object. Is there one?

I'm also confused by the different APIs available to upload a stream to Blob storage. There are the Azure BlobService, Azure Storage SDK 2.1, and now 10.1. Can I still use BlobService, or is it deprecated?
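
The v10 counterpart appears to be uploadBrowserDataToBlockBlob, which accepts a browser Blob, File, or ArrayBuffer; a minimal sketch (blockBlobURL is assumed to be constructed as elsewhere on this page):

import { Aborter, uploadBrowserDataToBlockBlob } from "@azure/storage-blob";

async function uploadFile(file, blockBlobURL) {
  await uploadBrowserDataToBlockBlob(Aborter.none, file, blockBlobURL, {
    blockSize: 4 * 1024 * 1024,
    parallelism: 20
  });
}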

Unable to set file HTTP headers

Which service(blob, file, queue, table) does this issue concern?

File

Which version of the SDK was used?

10.1.0

What's the Node.js/Browser version?

Node: 8.9.4

What problem was encountered?

When I try to set the content type header on a file upload, I get an authorization error. I assume the Authorization header is being stripped?

Steps to reproduce the issue?

await uploadFileToAzureFile(aborter, localFilePath, fileUrl, {
  progress: ev => console.log(ev),
  rangeSize: 4 * 1024 * 1024,
  parallelism: 20,
  fileHTTPHeaders: { 
    fileContentType: 'application/json'
  }
})

Which comes back with:

[14/02/2019 11:34:33]    { body: '<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:4c8445f8-601a-00c4-5359-c4b789000000\nTime:2019-02-14T11:35:30.1780358Z</Message><AuthenticationErrorDetail>Signature not valid in the specified time frame: Start [Wed, 13 Feb 2019 16:55:21 GMT] - Expiry [Thu, 14 Feb 2019 00:55:21 GMT] - Current [Thu, 14 Feb 2019 11:35:30 GMT]</AuthenticationErrorDetail></Error>',
[14/02/2019 11:34:33]      headers: HttpHeaders { _headersMap: [Object] },
[14/02/2019 11:34:33]      status: 403 },
[14/02/2019 11:34:33]   body:
[14/02/2019 11:34:33]    { message: 'Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:4c8445f8-601a-00c4-5359-c4b789000000\nTime:2019-02-14T11:35:30.1780358Z',
[14/02/2019 11:34:33]      Code: 'AuthenticationFailed',
[14/02/2019 11:34:33]      AuthenticationErrorDetail: 'Signature not valid in the specified time frame: Start [Wed, 13 Feb 2019 16:55:21 GMT] - Expiry [Thu, 14 Feb 2019 00:55:21 GMT] - Current [Thu, 14 Feb 2019 11:35:30 GMT]' } }
[14/02/2019 11:34:33] Executed 'Functions.generate-schema' (Succeeded, Id=c8f0f7bc-183a-419b-9639-750fc95b976a)
[14/02/2019 11:34:33] Executed HTTP request: {
[14/02/2019 11:34:33]   "requestId": "c427fd7a-d39a-4b21-8d5e-1747f7368a21",
[14/02/2019 11:34:33]   "method": "GET",
[14/02/2019 11:34:33]   "uri": "/api/generate-schema",
[14/02/2019 11:34:33]   "identities": [
[14/02/2019 11:34:33]     {
[14/02/2019 11:34:33]       "type": "WebJobsAuthLevel",
[14/02/2019 11:34:33]       "level": "Admin"
[14/02/2019 11:34:33]     }
[14/02/2019 11:34:33]   ],
[14/02/2019 11:34:33]   "status": 200,
[14/02/2019 11:34:33]   "duration": 2583
[14/02/2019 11:34:33] }

Have you found a mitigation/solution?

I looked at re-constructing the auth headers, but would rather not!

Aborter throws timeout error even though the BlockBlobURL download succeeds

Which service(blob, file, queue, table) does this issue concern?

Blob Storage

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

NodeJS 11.8

What problem was encountered?

The Aborter times out after the BlockBlobURL download method has already completed:

Error: The request was aborted
    at new RestError (/shared/node_modules/@azure/ms-rest-js/dist/msRest.node.js:1397:28)
    at a.<anonymous> (/shared/node_modules/@azure/storage-blob/dist/index.js:1:11269)
    at /shared/node_modules/@azure/storage-blob/dist/index.js:1:1277
    at Array.forEach (<anonymous>)
    at a.abort (/shared/node_modules/@azure/storage-blob/dist/index.js:1:1255)
    at Timeout.<anonymous> (/shared/node_modules/@azure/storage-blob/dist/index.js:1:519)
    at listOnTimeout (timers.js:324:15)
    at processTimers (timers.js:268:5)

Steps to reproduce the issue?

My Azure storage utils class AzureStorage.ts:

import {
  BlockBlobURL,
  uploadFileToBlockBlob,
  Aborter,
  SharedKeyCredential,
  ContainerURL,
  ServiceURL,
  StorageURL
} from '@azure/storage-blob';
const path = require('path');

const ONE_MINUTE = 60 * 1000;

class AzureStorage {
  credentials: SharedKeyCredential;
  accountName: string;
  timeout: number;
  serviceURL: ServiceURL;
  pipeline: any;

  constructor(accountName: string, accessKey: string, timeout: number = 2) {
    this.credentials = new SharedKeyCredential(accountName, accessKey);
    this.accountName = accountName;
    this.timeout = timeout;
    this.createPipeline = this.createPipeline.bind(this);
    this.createServiceURL = this.createServiceURL.bind(this);
    this.createContainerURL = this.createContainerURL.bind(this);
    this.createContainer = this.createContainer.bind(this);
    this.getContainerNames = this.getContainerNames.bind(this);
    this.uploadLocalFile = this.uploadLocalFile.bind(this);
    this.downloadFile = this.downloadFile.bind(this);
  }

  createPipeline() {
    this.pipeline = StorageURL.newPipeline(this.credentials);
    return this.pipeline;
  }

  createServiceURL() {
    this.serviceURL = new ServiceURL(
      `https://${this.accountName}.blob.core.windows.net`,
      this.createPipeline()
    );
    return this.serviceURL;
  }

  createContainerURL(containerName: string) {
    return ContainerURL.fromServiceURL(this.createServiceURL(), containerName);
  }

  async createContainer(containerName: string) {
    await this.createContainerURL(containerName).create(this.createAborter());
  }

  async getContainerNames() {
    let marker;
    return await this.createServiceURL().listContainersSegment(
      this.createAborter(),
      marker
    );
  }

  checkIfContainerExists(containerName: string, containers: any) {
    if (!containers) return false;
    const found = containers.find(
      container => containerName === container.name
    );
    return found !== undefined;
  }

  async uploadLocalFile(containerName: string, filePath: string) {
    filePath = path.resolve(filePath);
    const fileName = path.basename(filePath);
    const blockBlobURL = BlockBlobURL.fromContainerURL(
      this.createContainerURL(containerName),
      fileName
    );
    return await uploadFileToBlockBlob(this.createAborter(), filePath, blockBlobURL);
  }

  async downloadFile(containerName: string, fileName: string) {
    const blockBlobURL = BlockBlobURL.fromContainerURL(
      this.createContainerURL(containerName),
      fileName
    );
    const downloadResponse = await blockBlobURL.download(this.createAborter(), 0);
    return downloadResponse.readableStreamBody;
  }

  createAborter() {
    return Aborter.timeout(this.timeout * ONE_MINUTE);
  }
}

export default AzureStorage;

When I use the above class's downloadFile method to download a blob from Blob Storage, it succeeds and returns the readableStreamBody, but the Aborter still throws the timeout error after 1 minute.

Have you found a mitigation/solution?

Using Aborter.none. But then you don't have a working timeout when it's needed ;)

Can't set file Content-MD5

Which service(blob, file, queue, table) does this issue concern?

File

Which version of the SDK was used?

10.0.0

What's the Node.js/Browser version?

Firefox 63

What problem was encountered?

Can't set Content-MD5 on a file; it stays "-" on Azure.


Steps to reproduce the issue?

Just call:

uploadBrowserDataToAzureFile(
  aborter,
  file,
  fileURL,
  {
    fileHTTPHeaders: {
      // Note: new Uint8Array("any MD5 hash") coerces the string to NaN and produces
      // an empty array, so no MD5 bytes are actually sent.
      fileContentMD5: new Uint8Array("any MD5 hash")
    }
  });

Have you found a mitigation/solution?

Tried using setHTTPHeaders after uploading the file. Same problem.

Browser func - uploadBrowserDataToBlockBlob: TypeError: blockBlobURL.upload is not a function

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10.0.0-preview

What's the Node.js/Browser version?

Browser

What problem was encountered?

When attempting to upload a file from the browser using the function uploadBrowserDataToBlockBlob(), I receive a TypeError: blockBlobURL.upload is not a function.

This is using the pre-minified azure-storage.blob.min.js provided in the latest v10 preview release.

The function is supplied with the required parameters, as confirmed by console logging before invoking it (screenshot omitted).

I've received the same TypeError before when accidentally using a Node-only function in the browser. I attempted to use blockBlobURL.upload directly and received the same error.

Steps to reproduce the issue?

My code is pulled directly from the highlevel sample

I notice the browser functions are commented out, so they may not be working yet.

Also, does my pipeline need something defined for httpClient parameter? I haven't found any clear documentation on this.

Have you found a mitigation/solution?

No mitigation/solution found yet.

Unable to set blob content type on BlockBlobURL upload

Which service(blob, file, queue, table) does this issue concern?

Blob

Which version of the SDK was used?

2.3.1-preview

What's the Node.js/Browser version?

8.11.1

What problem was encountered?

Cannot set the content type for uploaded blob

Steps to reproduce the issue?

Upload blob with the following code:

let blobUpload = await blockBlobURL.upload(
  Aborter.none,
  content,
  content.length,
  { blobHTTPHeaders: { blobContentType: "text/html" } }
);

Have you found a mitigation/solution?

None found.

Cannot find name 'Event'

Which service(blob, file, queue, table) does this issue concern?

Encountered on blob, but I would guess it concerns more of them.

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

NodeJS v10.14.1

What problem was encountered?

Getting a TypeScript error that the type 'Event' was not found. If I enable the "dom" lib in my tsconfig.json, I can compile without any problems.

Steps to reproduce the issue?

Clone the repo and compile with TypeScript without the "dom" lib option. I used TypeScript v3.2.1.

Have you found a mitigation/solution?

The workaround is to add "dom" to lib in tsconfig.json. In a Node.js environment this is not optimal, however, because it makes many DOM types and functions available in TypeScript that would fail at run-time, since the DOM does not actually exist in Node.js.

err.code.toUpperCase is not a function

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

latest

What's the Node.js/Browser version?

Chrome

What problem was encountered?

RetryPolicy../node_modules/@azure/storage-blob/dist-esm/lib/policies/RetryPolicy.js.RetryPolicy.shouldRetry
TypeError: err.code.toUpperCase is not a function

Steps to reproduce the issue?

Use it in another library and install that into the main project.

Have you found a mitigation/solution?

no

Authentication error when blob name contains spaces

Hi,

While trying out this new library, I noticed that uploading files whose names contain spaces results in an authentication error.

Here is sample JS code to reproduce it:

// @ts-check
const { StorageURL, ServiceURL, SharedKeyCredential, uploadFileToBlockBlob, Aborter, ContainerURL, BlobURL, BlockBlobURL } = require("@azure/storage-blob");

const pipeline = StorageURL.newPipeline(
    new SharedKeyCredential("[account]", ""),
);

// List containers
const serviceURL = new ServiceURL(
    "https://[account].blob.core.windows.net",
    pipeline,
);

const containerURL = ContainerURL.fromServiceURL(serviceURL, "test");
const blobURL = BlobURL.fromContainerURL(containerURL, "blob with spaces.txt");
const blockBlobURL = BlockBlobURL.fromBlobURL(blobURL);
uploadFileToBlockBlob(Aborter.none, "./file with spaces.txt", blockBlobURL).then(() => {
    console.log("Success")
}).catch((e) => {
    console.log("Error", e);
})

Error returned is

Error { Error: <?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:4f27d66f-501e-0017-53a0-4e8893000000
Time:2018-09-17T16:07:02.7426199Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request 'd65Ec7bxhPHYaFNLPd5jIsUSAonRmxOgbSf/7owR25U=' is not the same as any computed signature. Server used following string to sign: 'PUT


12

application/octet-stream






x-ms-blob-type:BlockBlob
x-ms-client-request-id:4c723297-519f-4655-a05c-853e83d2f841
x-ms-date:Mon, 17 Sep 2018 16:07:01 GMT
x-ms-version:2018-03-28
/testtim1/test/blob%20with%20spaces.txt
timeout:60000'.</AuthenticationErrorDetail></Error>
    at new RestError (D:\dev\test\node_modules\ms-rest-js\dist\lib\restError.js:9:28)
    at D:\dev\test\node_modules\ms-rest-js\dist\lib\policies\deserializationPolicy.js:89:37
    at process._tickCallback (internal/process/next_tick.js:68:7)
  code: undefined,
  statusCode: 403,
  request:
   WebResource {
     streamResponseBody: false,
     url:
      'https://testtim1.blob.core.windows.net/test/blob with spaces.txt?timeout=60000',
     method: 'PUT',
     headers: HttpHeaders { _headersMap: [Object] },
     body: [Function],
     query: undefined,
     formData: undefined,
     withCredentials: false,
     abortSignal:
      Aborter {
        _aborted: false,
        children: [],
        abortEventListeners: [],
        parent: undefined,
        key: undefined,
        value: undefined },
     timeout: 0,
     onUploadProgress: undefined,
     onDownloadProgress: undefined,
     operationSpec:
      { httpMethod: 'PUT',
        path: '{containerName}/{blob}',
        urlParameters: [Array],
        queryParameters: [Array],
        headerParameters: [Array],
        requestBody: [Object],
        contentType: 'application/octet-stream',
        responses: [Object],
        isXML: true,
        serializer: [Serializer] },
     shouldDeserialize: undefined,
     operationResponseGetter: undefined },
  response:
   { body:
      '<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:4f27d66f-501e-0017-53a0-4e8893000000\nTime:2018-09-17T16:07:02.7426199Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request \'d65Ec7bxhPHYaFNLPd5jIsUSAonRmxOgbSf/7owR25U=\' is not the same as any computed signature. Server used following string to sign: \'PUT\n\n\n12\n\napplication/octet-stream\n\n\n\n\n\n\nx-ms-blob-type:BlockBlob\nx-ms-client-request-id:4c723297-519f-4655-a05c-853e83d2f841\nx-ms-date:Mon, 17 Sep 2018 16:07:01 GMT\nx-ms-version:2018-03-28\n/testtim1/test/blob%20with%20spaces.txt\ntimeout:60000\'.</AuthenticationErrorDetail></Error>',
     headers: HttpHeaders { _headersMap: [Object] },
     status: 403 },
  body:
   { message:
      'Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:4f27d66f-501e-0017-53a0-4e8893000000\nTime:2018-09-17T16:07:02.7426199Z' } }

As soon as I remove the spaces from the blob name it uploads just fine.

Set X-Request-ID / client-request-id Header per-upload

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10

What's the Node.js/Browser version?

What problem was encountered?

Trying to figure out whether you can set the X-Request-ID header per blob upload. We want to be able to set the X-Request-ID/client request id header for better logging and to send to Event Grid.

Based on the API there is a BlockBlobUploadHeaders, but I am unsure (1) how to use it in my current code, (2) whether the requestId can be set, and (3) whether this is even the X-Request-ID.

Here is my existing code; I am unsure whether the header can be set here:

const anonymousCredential = new AnonymousCredential();
const pipeline = StorageURL.newPipeline(anonymousCredential);
const serviceURL = new ServiceURL(
  `${storageAssetInfo.uri}${storageAssetInfo.sharedAccessSignature}`,
  pipeline,
);

const containerName = storageAssetInfo.name;
const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
const blobName = `${file.name}-${storageAssetInfo.name}`;
const blobUrl = BlobURL.fromContainerURL(containerURL, blobName);
const blockblobURL = BlockBlobURL.fromBlobURL(blobUrl);

const options: IUploadToBlockBlobOptions = {
  blockSize: this.uploadToBlockBlobOptionsService.getBlockSize(file.size),
  parallelism: this.uploadToBlockBlobOptionsService.getParallelism(),
  maxSingleShotSize: this.uploadToBlockBlobOptionsService.getMaxSingleShotSize(file.size),
  progress: (transferProgressEvent: TransferProgressEvent) => {
    this.onFileUploadProgressChanged(transferProgressEvent, file, this.uploadProgressSource);
  },
};

const blobUploadCommonResponse = await uploadBrowserDataToBlockBlob(Aborter.none, file, blockblobURL, options);

Steps to reproduce the issue?

None

Have you found a mitigation/solution?

No
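
A hedged sketch of one possible approach: inject the header with a custom request policy. The factory shape (create(nextPolicy, options) returning an object with sendRequest) comes from @azure/ms-rest-js; whether pipeline.factories can be extended this way before constructing URL objects is an assumption.

class RequestIdPolicyFactory {
  constructor(requestId) { this.requestId = requestId; }
  create(nextPolicy, options) {
    const requestId = this.requestId;
    return {
      sendRequest(request) {
        request.headers.set("x-ms-client-request-id", requestId);
        return nextPolicy.sendRequest(request);
      }
    };
  }
}

const pipeline = StorageURL.newPipeline(anonymousCredential);
pipeline.factories.push(new RequestIdPolicyFactory("my-request-id")); // assumption: factory list is mutable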

How to handle viewing files stored in Azure Storage with multiple MIME types

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10.0.3

What's the Node.js/Browser version?

8.11.1

What problem was encountered?

I have an Angular app that displays a table of records, and each record has a link to a file stored in Azure Storage.

Once the user clicks on a link, it executes this code:

  async downloadFile(file: UploadedFileMetaData) {
    if (this.downloading.has(file)) {
      return;
    }

    this.downloading.add(file);

    let url: string;

    try {
      url = await this.blobStorageService.getDownloadUrl(file);
    } catch (err) {
      throw err;
    }

    return this.http
      .get(url, {
        responseType: ResponseContentType.Blob
      })
      .map((res) => {
        return {
          filename: file.blobName,
          data: res.blob()
        };
      })
      .subscribe(
        (res) => {
          console.log('start download:', res);
          const fileUrl = window.URL.createObjectURL(res.data);
          const a = document.createElement('a');
          document.body.appendChild(a);

          a.setAttribute('style', 'display: none');

          a.href = fileUrl;

          a.download = res.filename;
          a.click();

          window.URL.revokeObjectURL(fileUrl);
          a.remove();
        },
        (error) => {
          this.downloading.delete(file);
          console.log('download error:', JSON.stringify(error));
        },
        () => {
          this.downloading.delete(file);
          console.log('Completed file download.');
        }
      );
  }

Is there any way I can get these files to open in the browser instead? Downloading files to the user's hard drive is not a great experience and does not seem to work on iPhone or iPad.

Steps to reproduce the issue?

Have you found a mitigation/solution?
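
One hedged thought: if the blob's Content-Type (and optionally Content-Disposition) is set correctly at upload time, the browser can render the file inline from a plain link to the blob URL instead of forcing a download. A minimal sketch at upload:

await blockBlobURL.upload(Aborter.none, content, content.length, {
  blobHTTPHeaders: {
    blobContentType: "application/pdf", // the file's real MIME type
    blobContentDisposition: "inline"
  }
});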

Read blob to text

Which service(blob, file, queue, table) does this issue concern?

Blob

Which version of the SDK was used?

10.2.0-preview

What's the Node.js/Browser version?

Browser

What problem was encountered?

Ease of use. It seems like it should be a pretty common scenario to read text from a blob. A lot of the other Azure Storage SDKs have some kind of readText function that reads text straight from a blob into a string. I am wondering whether that's coming here as well.

Have you found a mitigation/solution?

Yes. I am currently using the bodyToString workaround found in the example test file, which calls blobToString. It seems like something that would fit better inside the SDK.
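
For reference, a minimal browser-side sketch of that workaround: in the browser the download response exposes blobBody (a Promise of a Blob), which a FileReader can turn into a string.

async function blobToString(blob) {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.onloadend = () => resolve(fileReader.result);
    fileReader.onerror = reject;
    fileReader.readAsText(blob);
  });
}

const response = await blobURL.download(Aborter.none, 0);
const text = await blobToString(await response.blobBody);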

Support for Angular4 (Typescript < 2.4)

Which service(blob, file, queue, table) does this issue concern?

Blob (+ms-rest-js)

Which version of the SDK was used?

10.0.3

What's the Node.js/Browser version?

Angular 4 + dependencies (TypeScript 2.3.3)

What problem was encountered?

The library is incompatible with Angular 4. Angular 4 must use TypeScript 2.3.3, which doesn't support string enum declarations, used in:

  • queryCollectionFormat.d.ts
  • SASQueryParameters.d.ts

Steps to reproduce the issue?

  1. Use Angular4 or TypeScript 2.3.3

  2. Npm install the latest version of this.

  3. Reference it i.e. import { } from '@azure/storage-blob';

  4. You'll see the error 'In ambient enum declarations member initializer must be constant expression', i.e. no string enum declarations are allowed.

Have you found a mitigation/solution?

Not a solution, but I can fix this locally by changing the enums in the files above to numerical enums instead of strings, although I imagine this breaks some functionality.

... Or use the legacy SDK, although that sort of defeats the purpose, as Angular 4 is less than two years old.

how to track progress with uploadBrowserDataToBlockBlob

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

What problem was encountered?

How do I use the progress callback passed to uploadBrowserDataToBlockBlob?

      response = await Azure.uploadBrowserDataToBlockBlob(Azure.Aborter.none, file, blockBlobURL, {
        maxSingleShotSize: 4 * 1024 * 1024,
        progress: (progressEvent: TransferProgressEvent) => {
          console.log(progressEvent);
        }
      });

I just get an object with loadedBytes, which seems to be the full size of the block.

Is there a way I can track the progress with this?
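
A hedged sketch: files at or below maxSingleShotSize appear to go up as a single PUT, so progress fires essentially once; forcing block uploads with a small maxSingleShotSize and blockSize should make the callback fire per block:

await uploadBrowserDataToBlockBlob(Aborter.none, file, blockBlobURL, {
  maxSingleShotSize: 256 * 1024, // force block uploads above 256 KB
  blockSize: 1 * 1024 * 1024,    // 1 MB blocks
  progress: (ev) => console.log(`${ev.loadedBytes} / ${file.size} bytes`)
});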

can I use this library in a React Native app?


Uploading a stream to a block blob extremely slow

Which service(blob, file, queue, table) does this issue concern?

Blob

Which version of the SDK was used?

"@azure/storage-blob": "^10.2.0-preview"

What's the Node.js/Browser version?

Node.js

What problem was encountered?

Storage streams never drained / flushed.

Steps to reproduce the issue?

For one scenario we're using the tar package and uploading streams directly from tar archives, using the entry event.

Emits 'entry' events with tar.ReadEntry objects, which are themselves readable streams that you can pipe wherever. Each entry will not emit until the one before it is flushed through, so make sure to either consume the data (with on('data', ...) or .pipe(...)) or throw it away with .resume() to keep the stream flowing.

A few files upload fine, but then it gets 'stuck' and takes several minutes before, slowly, continuing on to upload more files. This indicates that BlockBlobURL.upload does not consume the file stream. The promise returned from blobUrl.upload also does not resolve.

Our code is something like this:

const { Parse } = require('tar');
const { createReadStream } = require('fs');

const parser = new Parse();
createReadStream(tarball).pipe(parser);

parser.on('entry', file => {
  blobUrl.upload(Aborter.none, () => file, file.size, {
    blobHTTPHeaders: {
      blobContentType: mime.getType(file.path)!,
    },
  });
});

There's a bit more stuff in there around handling promises and such, but that's the gist of it.

Using storage.createBlockBlobFromStreamAsync from the previous SDK worked fine in this scenario, and also manually calling uploadStreamToBlockBlob works...

uploadStreamToBlockBlob(Aborter.none, file, blobUrl, 2 * 1024 * 1024, 20, {
  blobHTTPHeaders: {
    blobContentType: mime.getType(file.path),
  },
});

...but the more ergonomic blobUrl.upload does not.

Have you found a mitigation/solution?

Above ^

Headers in README.md sample are undefined

The sample code in the README attempts to get headers, but the result is undefined. It also doesn't appear that headers is a valid property on downloadBlockBlobResponse.

Here's the sample code for reference:

console.log(`[headers]:${downloadBlockBlobResponse.headers}`);
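
A hedged note: in v10 the parsed response headers surface as typed properties on the response object itself, so individual values can be logged directly, e.g.:

console.log(`[content-type]: ${downloadBlockBlobResponse.contentType}`);
console.log(`[last-modified]: ${downloadBlockBlobResponse.lastModified}`);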

Where in the upload response do I get the details to retrieve the file?

Which service(blob, file, queue, table) does this issue concern?

blob

Which version of the SDK was used?

10.3

What's the Node.js/Browser version?

8.11.1

What problem was encountered?

I have a backing database table that stores additional searchable metadata about each file uploaded to Azure Storage, such as description, name, and other details. There will also be a pseudo directory structure, which is all very normal for document management.

I am making a call to uploadBrowserDataToBlockBlob, but I cannot see anything in the response that will help me retrieve the file later.

My code looks like this:

response = await Azure.uploadBrowserDataToBlockBlob(Azure.Aborter.none, file, blockBlobURL, options);

but I can't see anything in that response that will let me uniquely retrieve this file later, unless the blob name is the only handle I have.

In that case, am I best off generating a unique name as the identifier?

Steps to reproduce the issue?

Have you found a mitigation/solution?
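
One common pattern, sketched here as a suggestion rather than an SDK feature: generate the unique identifier up front, use it as the blob name, and persist it in the database row alongside the searchable metadata; the blob name then serves as the stable key for later retrieval.

const blobName = `${Date.now()}-${Math.random().toString(36).slice(2)}-${file.name}`;
const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, blobName);
await uploadBrowserDataToBlockBlob(Aborter.none, file, blockBlobURL, options);
// store blobName in the database record for later retrieval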

Using application/json header for file upload (x-ms-blob-type error)

Which service(blob, file, queue, table) does this issue concern?

File

Which version of the SDK was used?

10.1.0

What's the Node.js/Browser version?

Node: 8.9.4

What problem was encountered?

I'm not sure how to set the x-ms-blob-type header on a file upload. I can't find it documented. I am looking to upload a .graphql file. I think the content type should be application/json, but when I set this, I get the following error:

Code: 'MissingRequiredHeader',
HeaderName: 'x-ms-blob-type'

Steps to reproduce the issue?

await uploadFileToAzureFile(aborter, localFilePath, fileUrl, {
  progress: ev => console.log(ev),
  rangeSize: 4 * 1024 * 1024,
  parallelism: 20,
  fileHTTPHeaders: {
    fileContentType: 'application/json'
  }
})

Support for Tables

Is support for Tables planned for this library? Any ETA on that? I'd prefer to use the modern APIs as opposed to those available in the older library, which supports tables. Thanks!

Aborter does not abort uploads that are in timeout / retrying

Which service(blob, file, queue, table) does this issue concern?

Uploads in browser with uploadBrowserDataToBlockBlob()

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

Chrome Browser v72.0.3626.119

What problem was encountered?

When using an Aborter to abort pending uploads while the upload is in a timeout / retry loop because of connection problems (or similar issues), the upload is later resumed even though Aborter.aborted returns true.

Steps to reproduce the issue?

  1. Upload a large file to blob storage with uploadBrowserDataToBlockBlob() and an Aborter, so you have time to cancel the upload and the chunk upload is used. Aborter.none should suffice; no timeout is specified.
  2. During upload go offline. This can be achieved through your browsers' developer tools network tab.
  3. Cancel the upload process by calling aborter.abort().
  4. Go online again.
  5. View the network tab of your browser to see the upload resuming.

Have you found a mitigation/solution?

This is just an assumption and has not yet been tested / verified:
BlockBlob.stageBlock => StorageClientContext.sendOperationRequest() could check early for abortSignal.aborted and raise an error to fail the returned Promise.
Also in Batch.addOperation() the this.state === BatchStates.Error should be checked early before await operation(); is called.
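
A sketch of the suggested guard (hypothetical placement inside the SDK's batch loop; aborter and operation stand for the surrounding variables):

// Fail fast instead of retrying or resuming when the signal is already aborted.
if (aborter.aborted) {
  throw new Error("The operation was aborted.");
}
await operation();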

high-level samples Abort.timeout duration

Which service(blob, file, queue, table) does this issue concern?

Documentation samples

Which version of the SDK was used?

Irrelevant

What's the Node.js/Browser version?

Irrelevant

What problem was encountered?

The timeout duration in the samples appears to be wrong:
30 * 60 * 60 * 1000 is 30 hours, not 30 minutes as the comment says. It should be 30 * 60 * 1000, to avoid leading inattentive devs (do those even exist?) into a timeout duration that is a bit too big.

Steps to reproduce the issue?

Irrelevant

Have you found a mitigation/solution?

The time calculation should be 30 * 60 * 1000 at the following places :

https://github.com/Azure/azure-storage-js/blob/master/blob/samples/highlevel.sample.js#L60-L87

https://github.com/Azure/azure-storage-js/blob/master/blob/samples/highlevel.sample.js#L87

https://github.com/Azure/azure-storage-js/blob/master/file/samples/highlevel.sample.js#L67

https://github.com/Azure/azure-storage-js/blob/master/file/samples/highlevel.sample.js#L94

Gracefully handle 404 of reading from blob

Which service(blob, file, queue, table) does this issue concern?

Blob

Which version of the SDK was used?

10.3.0

What's the Node.js/Browser version?

Browser

What problem was encountered?

Not being able to gracefully catch 404 when trying to read text from a non-existent blob

Steps to reproduce the issue?

  • Get a blockBlobURL object for a non-existent blob but an existing container
  • Call await blockBlobURL.download(Aborter.none, 0); inside of a try catch
  • The catch is reached, but the 404 error has already been thrown and cannot be prevented

Have you found a mitigation/solution?

For now, we're listing all blobs in the container to check that the blob exists before trying to read from it, but that's not very performant.
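
Catching the 404 directly avoids listing the whole container; a minimal sketch (browsers will still log the failed network request in the console, but the code path is handled):

try {
  const response = await blockBlobURL.download(Aborter.none, 0);
  // ...consume the response
} catch (err) {
  if (err.statusCode === 404) {
    // blob does not exist: handle gracefully
  } else {
    throw err;
  }
}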

downloadBlobToBuffer may download wrong ranges when a non-zero offset is customized together with count

Which service(blob, file, queue, table) does this issue concern?

blob, file needs check too

Which version of the SDK was used?

all

What's the Node.js/Browser version?

Node.js

What problem was encountered?

downloadBlobToBuffer may download wrong ranges when a non-zero offset is customized together with count.

Steps to reproduce the issue?

      const chunkEnd =
        off + options.blockSize! < count! ? off + options.blockSize! : count!; // count is wrongly used as end position
      const response = await blobURL.download(
        aborter,
        off,
        chunkEnd - off + 1, // should not + 1
        {
          blobAccessConditions: options.blobAccessConditions,
          maxRetryRequests: options.maxRetryRequestsPerBlock
        }
      );
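
For contrast, a hedged sketch of the corrected math, where offset and count are the caller-supplied arguments and the end position is exclusive:

const chunkEnd = Math.min(off + options.blockSize, offset + count); // end relative to blob start
const response = await blobURL.download(
  aborter,
  off,
  chunkEnd - off, // pass the chunk length, without the extra + 1
  {
    blobAccessConditions: options.blobAccessConditions,
    maxRetryRequests: options.maxRetryRequestsPerBlock
  }
);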

Have you found a mitigation/solution?

N/A
