Add node_modules

This commit is contained in:
sysadmin 2023-07-29 20:33:28 +02:00
parent e122e7a6ae
commit eab4122ef5
344 changed files with 56300 additions and 1 deletion

1
.gitignore vendored

@@ -1,2 +1 @@
node_modules
commit.sh

1
node_modules/.bin/mkdirp generated vendored Normal file

@@ -0,0 +1 @@
../mkdirp/bin/cmd.js

70
node_modules/asap/CHANGES.md generated vendored Normal file

@@ -0,0 +1,70 @@
## 2.0.6
Version 2.0.4 adds support for React Native by clarifying in package.json that
the browser environment does not support Node.js domains.
Why this is necessary, we leave as an exercise for the user.
## 2.0.3
Version 2.0.3 fixes a bug when adjusting the capacity of the task queue.
## 2.0.1-2.0.2
Version 2.0.1 fixes a bug in the way redirects were expressed that affected the
function of Browserify, but which Mr would tolerate.
## 2.0.0
Version 2 of ASAP is a full rewrite with a few salient changes.
First, the ASAP source is CommonJS only and designed with [Browserify][] and
[Browserify-compatible][Mr] module loaders in mind.
[Browserify]: https://github.com/substack/node-browserify
[Mr]: https://github.com/montagejs/mr
The new version has been refactored in two dimensions.
Support for Node.js and browsers has been separated using Browserify
redirects, and ASAP has been divided into two modules.
The "raw" layer depends on the tasks to catch thrown exceptions and unravel
Node.js domains.
The full implementation of ASAP is loadable as `require("asap")` in both Node.js
and browsers.
The raw layer that lacks exception handling overhead is loadable as
`require("asap/raw")`.
The interface is the same for both layers.
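Loading the two layers side by side looks like this (a minimal sketch):
```javascript
var asap = require("asap");        // full layer with exception handling
var rawAsap = require("asap/raw"); // raw layer without that overhead

asap(function () { console.log("safe task"); });
rawAsap(function () { console.log("raw task"); });
```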
Tasks are no longer required to be functions, but can rather be any object that
implements `task.call()`.
With this feature you can recycle task objects to avoid garbage collector churn
and avoid closures in general.
The implementation has been rigorously documented so that our successors can
understand the scope of the problem that this module solves and all of its
nuances, ensuring that the next generation of implementations know what details
are essential.
- [asap.js](https://github.com/kriskowal/asap/blob/master/asap.js)
- [raw.js](https://github.com/kriskowal/asap/blob/master/raw.js)
- [browser-asap.js](https://github.com/kriskowal/asap/blob/master/browser-asap.js)
- [browser-raw.js](https://github.com/kriskowal/asap/blob/master/browser-raw.js)
The new version has also been rigorously tested across a broad spectrum of
browsers, in both the window and worker context.
The following charts capture the browser test results for the most recent
release.
The first chart shows test results for ASAP running in the main window context.
The second chart shows test results for ASAP running in a web worker context.
Test results are inconclusive (grey) on browsers that do not support web
workers.
These data are captured automatically by [Continuous
Integration][].
![Browser Compatibility](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-results-matrix.svg)
![Compatibility in Web Workers](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-worker-results-matrix.svg)
[Continuous Integration]: https://github.com/kriskowal/asap/blob/master/CONTRIBUTING.md

21
node_modules/asap/LICENSE.md generated vendored Normal file

@@ -0,0 +1,21 @@
Copyright 2009–2014 Contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.

237
node_modules/asap/README.md generated vendored Normal file

@@ -0,0 +1,237 @@
# ASAP
[![Build Status](https://travis-ci.org/kriskowal/asap.png?branch=master)](https://travis-ci.org/kriskowal/asap)
Promise and asynchronous observer libraries, as well as hand-rolled callback
programs and libraries, often need a mechanism to postpone the execution of a
callback until the next available event.
(See [Designing APIs for Asynchrony][Zalgo].)
The `asap` function executes a task **as soon as possible** but not before it
returns, waiting only for the completion of the current event and previously
scheduled tasks.
```javascript
asap(function () {
// ...
});
```
[Zalgo]: http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony
This CommonJS package provides an `asap` module that exports a function that
executes a task function *as soon as possible*.
ASAP strives to schedule events to occur before yielding for IO, reflow,
or redrawing.
Each event receives an independent stack, with only platform code in parent
frames and the events run in the order they are scheduled.
ASAP provides a fast event queue that will execute tasks until it is
empty before yielding to the JavaScript engine's underlying event-loop.
When a task gets added to a previously empty event queue, ASAP schedules a flush
event, preferring for that event to occur before the JavaScript engine has an
opportunity to perform IO tasks or rendering, thus making the first task and
subsequent tasks semantically indistinguishable.
ASAP uses a variety of techniques to preserve this invariant on different
versions of browsers and Node.js.
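A minimal sketch of that ordering in Node.js: a task queued with `asap` runs
before a zero-delay timer scheduled in the same event.
```javascript
var asap = require("asap");
// Both are scheduled from the same event; the flush event has priority over
// the timer's macrotask, so the asap task runs first.
setTimeout(function () {
    console.log("timer");
}, 0);
asap(function () {
    console.log("asap task"); // logged before "timer"
});
```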
By design, ASAP prevents input events from being handled until the task
queue is empty.
If the process is busy enough, this may cause incoming connection requests to be
dropped, and may cause existing connections to inform the sender to reduce the
transmission rate or stall.
ASAP allows this on the theory that, if there is enough work to do, there is no
sense in looking for trouble.
As a consequence, ASAP can interfere with smooth animation.
If your task should be tied to the rendering loop, consider using
`requestAnimationFrame` instead.
A long sequence of tasks can also trigger the long-running-script dialog.
If this is a problem, you may be able to use ASAP's cousin `setImmediate` to
break long processes into shorter intervals and periodically allow the browser
to breathe.
`setImmediate` will yield for IO, reflow, and repaint events.
It also returns a handler and can be canceled.
For a `setImmediate` shim, consider [YuzuJS setImmediate][setImmediate].
[setImmediate]: https://github.com/YuzuJS/setImmediate
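A sketch of that chunking pattern, assuming a global `setImmediate` (native or
shimmed); `items` and `processItem` are placeholders:
```javascript
function processInBatches(items, processItem, batchSize) {
    var index = 0;
    function runBatch() {
        var end = Math.min(index + batchSize, items.length);
        for (; index < end; index++) {
            processItem(items[index]);
        }
        if (index < items.length) {
            // Yield for IO, reflow, and repaint, then continue.
            setImmediate(runBatch);
        }
    }
    runBatch();
}
```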
Take care.
ASAP can sustain infinite recursive calls without warning.
It will not halt from a stack overflow, and it will not consume unbounded
memory.
This is behaviorally equivalent to an infinite loop.
Just as with infinite loops, you can monitor a Node.js process for this behavior
with a heart-beat signal.
As with infinite loops, a very small amount of caution goes a long way to
avoiding problems.
```javascript
function loop() {
asap(loop);
}
loop();
```
In browsers, if a task throws an exception, it will not interrupt the flushing
of high-priority tasks.
The exception will be postponed to a later, low-priority event to avoid
slow-downs.
In Node.js, if a task throws an exception, ASAP will resume flushing only if—and
only after—the error is handled by `domain.on("error")` or
`process.on("uncaughtException")`.
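A minimal Node.js sketch: once the error is handled, the rest of the queue
still runs.
```javascript
var asap = require("asap");
process.on("uncaughtException", function (error) {
    console.error("task failed:", error.message);
});
asap(function () {
    throw new Error("boom");
});
asap(function () {
    console.log("runs after the error above is handled");
});
```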
## Raw ASAP
Checking for exceptions comes at a cost.
The package also provides an `asap/raw` module that exports the underlying
implementation which is faster but stalls if a task throws an exception.
This internal version of the ASAP function does not check for errors.
If a task does throw an error, it will stall the event queue unless you manually
call `rawAsap.requestFlush()` before throwing the error, or any time after.
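A minimal sketch of guarding a raw task by hand (`mightThrow` is a placeholder
for your task body):
```js
var rawAsap = require("asap/raw");
rawAsap(function () {
    try {
        mightThrow();
    } catch (error) {
        // Request another flush so queued tasks are not stranded; the error
        // still propagates (and will crash Node.js if nothing handles it).
        rawAsap.requestFlush();
        throw error;
    }
});
```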
In Node.js, `asap/raw` also runs all tasks outside any domain.
If you need a task to be bound to your domain, you will have to do it manually.
```js
if (process.domain) {
task = process.domain.bind(task);
}
rawAsap(task);
```
## Tasks
A task may be any object that implements `call()`.
A function will suffice, but closures tend not to be reusable and can cause
garbage collector churn.
Both `asap` and `rawAsap` accept task objects to give you the option of
recycling task objects or using higher callable object abstractions.
See the `asap` source for an illustration.
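For example, a pooled task object might look like this (the pooling scheme is
illustrative, not part of the library):
```javascript
var asap = require("asap");
var pool = [];

function LogTask() {
    this.message = null;
}
// Any object with a `call` method is an acceptable task.
LogTask.prototype.call = function () {
    console.log(this.message);
    this.message = null;
    pool.push(this); // recycle rather than leaving garbage behind
};

function scheduleLog(message) {
    var task = pool.length ? pool.pop() : new LogTask();
    task.message = message;
    asap(task);
}

scheduleLog("hello");
scheduleLog("world");
```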
## Compatibility
ASAP is tested on Node.js v0.10 and in a broad spectrum of web browsers.
The following charts capture the browser test results for the most recent
release.
The first chart shows test results for ASAP running in the main window context.
The second chart shows test results for ASAP running in a web worker context.
Test results are inconclusive (grey) on browsers that do not support web
workers.
These data are captured automatically by [Continuous
Integration][].
[Continuous Integration]: https://github.com/kriskowal/asap/blob/master/CONTRIBUTING.md
![Browser Compatibility](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-results-matrix.svg)
![Compatibility in Web Workers](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-worker-results-matrix.svg)
## Caveats
When a task is added to an empty event queue, it is not always possible to
guarantee that the task queue will begin flushing immediately after the current
event.
However, once the task queue begins flushing, it will not yield until the queue
is empty, even if the queue grows while executing tasks.
The following browsers allow the use of [DOM mutation observers][] to access
the HTML [microtask queue][], and thus begin flushing ASAP's task queue
immediately at the end of the current event loop turn, before any rendering or
IO:
[microtask queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#microtask-queue
[DOM mutation observers]: http://dom.spec.whatwg.org/#mutation-observers
- Android 4-4.3
- Chrome 26-34
- Firefox 14-29
- Internet Explorer 11
- iPad Safari 6-7.1
- iPhone Safari 7-7.1
- Safari 6-7
In the absence of mutation observers, there are a few browsers, and situations
like web workers in some of the above browsers, where [message channels][]
would be a useful way to avoid falling back to timers.
Message channels give direct access to the HTML [task queue][], so the ASAP
task queue would flush after any already queued rendering and IO tasks, but
without having the minimum delay imposed by timers.
However, among these browsers, Internet Explorer 10 and Safari do not reliably
dispatch messages, so they are not worth the trouble to implement.
[message channels]: http://www.whatwg.org/specs/web-apps/current-work/multipage/web-messaging.html#message-channels
[task queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#concept-task
- Internet Explorer 10
- Safari 5.0-1
- Opera 11-12
In the absence of mutation observers, these browsers and the following browsers
all fall back to using `setTimeout` and `setInterval` to ensure that a `flush`
occurs.
The implementation uses both and cancels whatever handler loses the race, since
`setTimeout` tends to occasionally skip tasks in unisolated circumstances.
Timers generally delay the flushing of ASAP's task queue for four milliseconds.
- Firefox 3-13
- Internet Explorer 6-10
- iPad Safari 4.3
- Lynx 2.8.7
## Heritage
ASAP has been factored out of the [Q][] asynchronous promise library.
It originally had a naïve implementation in terms of `setTimeout`, but
[Malte Ubl][NonBlocking] provided an insight that `postMessage` might be
useful for creating a high-priority, no-delay event dispatch hack.
Since then, Internet Explorer proposed and implemented `setImmediate`.
Robert Katić began contributing to Q by measuring the performance of
the internal implementation of `asap`, paying particular attention to
error recovery.
Domenic, Robert, and Kris Kowal collectively settled on the current strategy of
unrolling the high-priority event queue internally regardless of what strategy
we used to dispatch the potentially lower-priority flush event.
Domenic went on to make ASAP cooperate with Node.js domains.
[Q]: https://github.com/kriskowal/q
[NonBlocking]: http://www.nonblocking.io/2011/06/windownexttick.html
For further reading, Nicholas Zakas provided a thorough article on [The
Case for setImmediate][NCZ].
[NCZ]: http://www.nczonline.net/blog/2013/07/09/the-case-for-setimmediate/
Ember's RSVP promise implementation later [adopted][RSVP ASAP] the name ASAP but
further developed the implementation.
In particular, the `MessagePort` implementation was abandoned due to interaction
[problems with Mobile Internet Explorer][IE Problems] in favor of an
implementation backed on the newer and more reliable DOM `MutationObserver`
interface.
These changes were back-ported into this library.
[IE Problems]: https://github.com/cujojs/when/issues/197
[RSVP ASAP]: https://github.com/tildeio/rsvp.js/blob/cddf7232546a9cf858524b75cde6f9edf72620a7/lib/rsvp/asap.js
In addition, ASAP was factored into `asap` and `asap/raw`, such that `asap` remained
exception-safe, but `asap/raw` provided a tight kernel that could be used for
tasks that guaranteed that they would not throw exceptions.
This core is useful for promise implementations that capture thrown errors in
rejected promises and do not need a second safety net.
At the same time, the exception handling in `asap` was factored into separate
implementations for Node.js and browsers, using the [Browserify][Browser
Config] `browser` property in `package.json` to instruct browser module loaders
and bundlers, including [Browserify][], [Mr][], and [Mop][], to use the
browser-only implementation.
[Browser Config]: https://gist.github.com/defunctzombie/4339901
[Browserify]: https://github.com/substack/node-browserify
[Mr]: https://github.com/montagejs/mr
[Mop]: https://github.com/montagejs/mop
## License
Copyright 2009-2014 by Contributors
MIT License (enclosed)

65
node_modules/asap/asap.js generated vendored Normal file

@@ -0,0 +1,65 @@
"use strict";
var rawAsap = require("./raw");
var freeTasks = [];
/**
* Calls a task as soon as possible after returning, in its own event, with
* priority over IO events. An exception thrown in a task can be handled by
* `process.on("uncaughtException")` or `domain.on("error")`, but will otherwise
* crash the process. If the error is handled, all subsequent tasks will
* resume.
*
* @param {{call}} task A callable object, typically a function that takes no
* arguments.
*/
module.exports = asap;
function asap(task) {
var rawTask;
if (freeTasks.length) {
rawTask = freeTasks.pop();
} else {
rawTask = new RawTask();
}
rawTask.task = task;
rawTask.domain = process.domain;
rawAsap(rawTask);
}
function RawTask() {
this.task = null;
this.domain = null;
}
RawTask.prototype.call = function () {
if (this.domain) {
this.domain.enter();
}
var threw = true;
try {
this.task.call();
threw = false;
// If the task throws an exception (presumably) Node.js restores the
// domain stack for the next event.
if (this.domain) {
this.domain.exit();
}
} finally {
// We use try/finally and a threw flag to avoid messing up stack traces
// when we catch and release errors.
if (threw) {
// In Node.js, uncaught exceptions are considered fatal errors.
// Re-throw them to interrupt flushing!
// Ensure that flushing continues if an uncaught exception is
// suppressed by a process.on("uncaughtException") or
// domain.on("error") listener.
rawAsap.requestFlush();
}
// If the task threw an error, we do not want to exit the domain here.
// Exiting the domain would prevent the domain from catching the error.
this.task = null;
this.domain = null;
freeTasks.push(this);
}
};

66
node_modules/asap/browser-asap.js generated vendored Normal file

@@ -0,0 +1,66 @@
"use strict";
// rawAsap provides everything we need except exception management.
var rawAsap = require("./raw");
// RawTasks are recycled to reduce GC churn.
var freeTasks = [];
// We queue errors to ensure they are thrown in right order (FIFO).
// Array-as-queue is good enough here, since we are just dealing with exceptions.
var pendingErrors = [];
var requestErrorThrow = rawAsap.makeRequestCallFromTimer(throwFirstError);
function throwFirstError() {
if (pendingErrors.length) {
throw pendingErrors.shift();
}
}
/**
* Calls a task as soon as possible after returning, in its own event, with priority
* over other events like animation, reflow, and repaint. An error thrown from an
* event will not interrupt, nor even substantially slow down the processing of
* other events, but will be rather postponed to a lower priority event.
* @param {{call}} task A callable object, typically a function that takes no
* arguments.
*/
module.exports = asap;
function asap(task) {
var rawTask;
if (freeTasks.length) {
rawTask = freeTasks.pop();
} else {
rawTask = new RawTask();
}
rawTask.task = task;
rawAsap(rawTask);
}
// We wrap tasks with recyclable task objects. A task object implements
// `call`, just like a function.
function RawTask() {
this.task = null;
}
// The sole purpose of wrapping the task is to catch the exception and recycle
// the task object after its single use.
RawTask.prototype.call = function () {
try {
this.task.call();
} catch (error) {
if (asap.onerror) {
// This hook exists purely for testing purposes.
// Its name will be periodically randomized to break any code that
// depends on its existence.
asap.onerror(error);
} else {
// In a web browser, exceptions are not fatal. However, to avoid
// slowing down the queue of pending tasks, we rethrow the error in a
// lower priority turn.
pendingErrors.push(error);
requestErrorThrow();
}
} finally {
this.task = null;
freeTasks[freeTasks.length] = this;
}
};

223
node_modules/asap/browser-raw.js generated vendored Normal file

@@ -0,0 +1,223 @@
"use strict";
// Use the fastest means possible to execute a task in its own turn, with
// priority over other events including IO, animation, reflow, and redraw
// events in browsers.
//
// An exception thrown by a task will permanently interrupt the processing of
// subsequent tasks. The higher level `asap` function ensures that if an
// exception is thrown by a task, the task queue will continue flushing as
// soon as possible, but if you use `rawAsap` directly, you are responsible for
// either ensuring that no exceptions are thrown from your task, or manually
// calling `rawAsap.requestFlush` if an exception is thrown.
module.exports = rawAsap;
function rawAsap(task) {
if (!queue.length) {
requestFlush();
flushing = true;
}
// Equivalent to push, but avoids a function call.
queue[queue.length] = task;
}
var queue = [];
// Once a flush has been requested, no further calls to `requestFlush` are
// necessary until the next `flush` completes.
var flushing = false;
// `requestFlush` is an implementation-specific method that attempts to kick
// off a `flush` event as quickly as possible. `flush` will attempt to exhaust
// the event queue before yielding to the browser's own event loop.
var requestFlush;
// The position of the next task to execute in the task queue. This is
// preserved between calls to `flush` so that it can be resumed if
// a task throws an exception.
var index = 0;
// If a task schedules additional tasks recursively, the task queue can grow
// unbounded. To prevent memory exhaustion, the task queue will periodically
// truncate already-completed tasks.
var capacity = 1024;
// The flush function processes all tasks that have been scheduled with
// `rawAsap` unless and until one of those tasks throws an exception.
// If a task throws an exception, `flush` ensures that its state will remain
// consistent and will resume where it left off when called again.
// However, `flush` does not make any arrangements to be called again if an
// exception is thrown.
function flush() {
while (index < queue.length) {
var currentIndex = index;
// Advance the index before calling the task. This ensures that we will
// begin flushing on the next task if the task throws an error.
index = index + 1;
queue[currentIndex].call();
// Prevent leaking memory for long chains of recursive calls to `asap`.
// If we call `asap` within tasks scheduled by `asap`, the queue will
// grow, but to avoid an O(n) walk for every task we execute, we don't
// shift tasks off the queue after they have been executed.
// Instead, we periodically shift 1024 tasks off the queue.
if (index > capacity) {
// Manually shift all values starting at the index back to the
// beginning of the queue.
for (var scan = 0, newLength = queue.length - index; scan < newLength; scan++) {
queue[scan] = queue[scan + index];
}
queue.length -= index;
index = 0;
}
}
queue.length = 0;
index = 0;
flushing = false;
}
// `requestFlush` is implemented using a strategy based on data collected from
// every available SauceLabs Selenium web driver worker at time of writing.
// https://docs.google.com/spreadsheets/d/1mG-5UYGup5qxGdEMWkhP6BWCz053NUb2E1QoUTU16uA/edit#gid=783724593
// Safari 6 and 6.1 for desktop, iPad, and iPhone are the only browsers that
// have WebKitMutationObserver but not un-prefixed MutationObserver.
// Must use `global` or `self` instead of `window` to work in both frames and web
// workers. `global` is a provision of Browserify, Mr, Mrs, or Mop.
/* globals self */
var scope = typeof global !== "undefined" ? global : self;
var BrowserMutationObserver = scope.MutationObserver || scope.WebKitMutationObserver;
// MutationObservers are desirable because they have high priority and work
// reliably everywhere they are implemented.
// They are implemented in all modern browsers.
//
// - Android 4-4.3
// - Chrome 26-34
// - Firefox 14-29
// - Internet Explorer 11
// - iPad Safari 6-7.1
// - iPhone Safari 7-7.1
// - Safari 6-7
if (typeof BrowserMutationObserver === "function") {
requestFlush = makeRequestCallFromMutationObserver(flush);
// MessageChannels are desirable because they give direct access to the HTML
// task queue, are implemented in Internet Explorer 10, Safari 5.0-1, and Opera
// 11-12, and in web workers in many engines.
// Although message channels yield to any queued rendering and IO tasks, they
// would be better than imposing the 4ms delay of timers.
// However, they do not work reliably in Internet Explorer or Safari.
// Internet Explorer 10 is the only browser that has setImmediate but does
// not have MutationObservers.
// Although setImmediate yields to the browser's renderer, it would be
// preferable to falling back to setTimeout since it does not have
// the minimum 4ms penalty.
// Unfortunately there appears to be a bug in Internet Explorer 10 Mobile (and
// Desktop to a lesser extent) that renders both setImmediate and
// MessageChannel useless for the purposes of ASAP.
// https://github.com/kriskowal/q/issues/396
// Timers are implemented universally.
// We fall back to timers in workers in most engines, and in foreground
// contexts in the following browsers.
// However, note that even this simple case requires nuances to operate in a
// broad spectrum of browsers.
//
// - Firefox 3-13
// - Internet Explorer 6-9
// - iPad Safari 4.3
// - Lynx 2.8.7
} else {
requestFlush = makeRequestCallFromTimer(flush);
}
// `requestFlush` requests that the high priority event queue be flushed as
// soon as possible.
// This is useful to prevent an error thrown in a task from stalling the event
// queue if the exception is handled by Node.js's
// `process.on("uncaughtException")` or by a domain.
rawAsap.requestFlush = requestFlush;
// To request a high priority event, we induce a mutation observer by toggling
// the text of a text node between "1" and "-1".
function makeRequestCallFromMutationObserver(callback) {
var toggle = 1;
var observer = new BrowserMutationObserver(callback);
var node = document.createTextNode("");
observer.observe(node, {characterData: true});
return function requestCall() {
toggle = -toggle;
node.data = toggle;
};
}
// The message channel technique was discovered by Malte Ubl and was the
// original foundation for this library.
// http://www.nonblocking.io/2011/06/windownexttick.html
// Safari 6.0.5 (at least) intermittently fails to create message ports on a
// page's first load. Thankfully, this version of Safari supports
// MutationObservers, so we don't need to fall back in that case.
// function makeRequestCallFromMessageChannel(callback) {
// var channel = new MessageChannel();
// channel.port1.onmessage = callback;
// return function requestCall() {
// channel.port2.postMessage(0);
// };
// }
// For reasons explained above, we are also unable to use `setImmediate`
// under any circumstances.
// Even if we were, there is another bug in Internet Explorer 10.
// It is not sufficient to assign `setImmediate` to `requestFlush` because
// `setImmediate` must be called *by name* and therefore must be wrapped in a
// closure.
// Never forget.
// function makeRequestCallFromSetImmediate(callback) {
// return function requestCall() {
// setImmediate(callback);
// };
// }
// Safari 6.0 has a problem where timers will get lost while the user is
// scrolling. This problem does not impact ASAP because Safari 6.0 supports
// mutation observers, so that implementation is used instead.
// However, if we ever elect to use timers in Safari, the prevalent work-around
// is to add a scroll event listener that calls for a flush.
// `setTimeout` does not call the passed callback if the delay is less than
// approximately 7 milliseconds in web workers in Firefox 8 through 18, and
// sometimes not even then.
function makeRequestCallFromTimer(callback) {
return function requestCall() {
// We dispatch a timeout with a specified delay of 0 for engines that
// can reliably accommodate that request. This will usually be snapped
// to a 4 millisecond delay, but once we're flushing, there's no delay
// between events.
var timeoutHandle = setTimeout(handleTimer, 0);
// However, since this timer gets frequently dropped in Firefox
// workers, we enlist an interval handle that will try to fire
// an event 20 times per second until it succeeds.
var intervalHandle = setInterval(handleTimer, 50);
function handleTimer() {
// Whichever timer succeeds will cancel both timers and
// execute the callback.
clearTimeout(timeoutHandle);
clearInterval(intervalHandle);
callback();
}
};
}
// This is for `asap.js` only.
// Its name will be periodically randomized to break any code that depends on
// its existence.
rawAsap.makeRequestCallFromTimer = makeRequestCallFromTimer;
// ASAP was originally a nextTick shim included in Q. This was factored out
// into this ASAP package. It was later adapted to RSVP which made further
// amendments. These decisions, particularly to marginalize MessageChannel and
// to capture the MutationObserver implementation in a closure, were integrated
// back into ASAP proper.
// https://github.com/tildeio/rsvp.js/blob/cddf7232546a9cf858524b75cde6f9edf72620a7/lib/rsvp/asap.js

87
node_modules/asap/package.json generated vendored Normal file

@@ -0,0 +1,87 @@
{
"_from": "asap@^2.0.0",
"_id": "asap@2.0.6",
"_inBundle": false,
"_integrity": "sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA==",
"_location": "/asap",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "asap@^2.0.0",
"name": "asap",
"escapedName": "asap",
"rawSpec": "^2.0.0",
"saveSpec": null,
"fetchSpec": "^2.0.0"
},
"_requiredBy": [
"/dezalgo"
],
"_resolved": "https://registry.npmjs.org/asap/-/asap-2.0.6.tgz",
"_shasum": "e50347611d7e690943208bbdafebcbc2fb866d46",
"_spec": "asap@^2.0.0",
"_where": "/home/ubuntu/formidable/node_modules/dezalgo",
"browser": {
"./asap": "./browser-asap.js",
"./asap.js": "./browser-asap.js",
"./raw": "./browser-raw.js",
"./raw.js": "./browser-raw.js",
"./test/domain.js": "./test/browser-domain.js"
},
"bugs": {
"url": "https://github.com/kriskowal/asap/issues"
},
"bundleDependencies": false,
"deprecated": false,
"description": "High-priority task queue for Node.js and browsers",
"devDependencies": {
"benchmark": "^1.0.0",
"events": "^1.0.1",
"jshint": "^2.5.1",
"knox": "^0.8.10",
"mr": "^2.0.5",
"opener": "^1.3.0",
"q": "^2.0.3",
"q-io": "^2.0.3",
"saucelabs": "^0.1.1",
"wd": "^0.2.21",
"weak-map": "^1.0.5"
},
"files": [
"raw.js",
"asap.js",
"browser-raw.js",
"browser-asap.js"
],
"homepage": "https://github.com/kriskowal/asap#readme",
"keywords": [
"event",
"task",
"queue"
],
"license": "MIT",
"main": "./asap.js",
"name": "asap",
"react-native": {
"domain": false
},
"repository": {
"type": "git",
"url": "git+https://github.com/kriskowal/asap.git"
},
"scripts": {
"benchmarks": "node benchmarks",
"lint": "jshint raw.js asap.js browser-raw.js browser-asap.js $(find scripts -name '*.js' | grep -v gauntlet)",
"test": "npm run lint && npm run test-node",
"test-browser": "node scripts/publish-bundle.js test/asap-test.js | xargs opener",
"test-node": "node test/asap-test.js",
"test-publish": "node scripts/publish-bundle.js test/asap-test.js | pbcopy",
"test-saucelabs": "node scripts/saucelabs.js test/asap-test.js scripts/saucelabs-spot-configurations.json",
"test-saucelabs-all": "node scripts/saucelabs.js test/asap-test.js scripts/saucelabs-all-configurations.json",
"test-saucelabs-worker": "node scripts/saucelabs-worker-test.js scripts/saucelabs-spot-configurations.json",
"test-saucelabs-worker-all": "node scripts/saucelabs-worker-test.js scripts/saucelabs-all-configurations.json",
"test-travis": "npm run lint && npm run test-node && npm run test-saucelabs && npm run test-saucelabs-worker"
},
"version": "2.0.6"
}

101
node_modules/asap/raw.js generated vendored Normal file

@@ -0,0 +1,101 @@
"use strict";
var domain; // The domain module is executed on demand
var hasSetImmediate = typeof setImmediate === "function";
// Use the fastest means possible to execute a task in its own turn, with
// priority over other events including network IO events in Node.js.
//
// An exception thrown by a task will permanently interrupt the processing of
// subsequent tasks. The higher level `asap` function ensures that if an
// exception is thrown by a task, the task queue will continue flushing as
// soon as possible, but if you use `rawAsap` directly, you are responsible for
// either ensuring that no exceptions are thrown from your task, or manually
// calling `rawAsap.requestFlush` if an exception is thrown.
module.exports = rawAsap;
function rawAsap(task) {
if (!queue.length) {
requestFlush();
flushing = true;
}
// Avoids a function call
queue[queue.length] = task;
}
var queue = [];
// Once a flush has been requested, no further calls to `requestFlush` are
// necessary until the next `flush` completes.
var flushing = false;
// The position of the next task to execute in the task queue. This is
// preserved between calls to `flush` so that it can be resumed if
// a task throws an exception.
var index = 0;
// If a task schedules additional tasks recursively, the task queue can grow
// unbounded. To prevent memory exhaustion, the task queue will periodically
// truncate already-completed tasks.
var capacity = 1024;
// The flush function processes all tasks that have been scheduled with
// `rawAsap` unless and until one of those tasks throws an exception.
// If a task throws an exception, `flush` ensures that its state will remain
// consistent and will resume where it left off when called again.
// However, `flush` does not make any arrangements to be called again if an
// exception is thrown.
function flush() {
while (index < queue.length) {
var currentIndex = index;
// Advance the index before calling the task. This ensures that we will
// begin flushing on the next task if the task throws an error.
index = index + 1;
queue[currentIndex].call();
// Prevent leaking memory for long chains of recursive calls to `asap`.
// If we call `asap` within tasks scheduled by `asap`, the queue will
// grow, but to avoid an O(n) walk for every task we execute, we don't
// shift tasks off the queue after they have been executed.
// Instead, we periodically shift 1024 tasks off the queue.
if (index > capacity) {
// Manually shift all values starting at the index back to the
// beginning of the queue.
for (var scan = 0, newLength = queue.length - index; scan < newLength; scan++) {
queue[scan] = queue[scan + index];
}
queue.length -= index;
index = 0;
}
}
queue.length = 0;
index = 0;
flushing = false;
}
rawAsap.requestFlush = requestFlush;
function requestFlush() {
// Ensure flushing is not bound to any domain.
// It is not sufficient to exit the domain, because domains exist on a stack.
// To execute code outside of any domain, the following dance is necessary.
var parentDomain = process.domain;
if (parentDomain) {
if (!domain) {
// Lazy execute the domain module.
// Only employed if the user elects to use domains.
domain = require("domain");
}
domain.active = process.domain = null;
}
// `setImmediate` is slower than `process.nextTick`, but `process.nextTick`
// cannot handle recursion.
// `requestFlush` will only be called recursively from `asap.js`, to resume
// flushing after an error is thrown into a domain.
// Conveniently, `setImmediate` was introduced in the same version
// `process.nextTick` started throwing recursion errors.
if (flushing && hasSetImmediate) {
setImmediate(flush);
} else {
process.nextTick(flush);
}
if (parentDomain) {
domain.active = process.domain = parentDomain;
}
}

145
node_modules/asn1.js-rfc2560/index.js generated vendored Normal file

@@ -0,0 +1,145 @@
var asn1 = require('asn1.js');
var rfc5280 = require('asn1.js-rfc5280');
var OCSPRequest = asn1.define('OCSPRequest', function() {
this.seq().obj(
this.key('tbsRequest').use(TBSRequest),
this.key('optionalSignature').optional().explicit(0).use(Signature)
);
});
exports.OCSPRequest = OCSPRequest;
var TBSRequest = asn1.define('TBSRequest', function() {
this.seq().obj(
this.key('version').def('v1').explicit(0).use(rfc5280.Version),
this.key('requestorName').optional().explicit(1).use(rfc5280.GeneralName),
this.key('requestList').seqof(Request),
this.key('requestExtensions').optional().explicit(2)
.seqof(rfc5280.Extension)
);
});
exports.TBSRequest = TBSRequest;
var Signature = asn1.define('Signature', function() {
this.seq().obj(
this.key('signatureAlgorithm').use(rfc5280.AlgorithmIdentifier),
this.key('signature').bitstr(),
this.key('certs').optional().explicit(0).seqof(rfc5280.Certificate)
);
});
exports.Signature = Signature;
var Request = asn1.define('Request', function() {
this.seq().obj(
this.key('reqCert').use(CertID),
this.key('singleRequestExtensions').optional().explicit(0).seqof(
rfc5280.Extension)
);
});
exports.Request = Request;
var OCSPResponse = asn1.define('OCSPResponse', function() {
this.seq().obj(
this.key('responseStatus').use(ResponseStatus),
this.key('responseBytes').optional().explicit(0).seq().obj(
this.key('responseType').objid({
'1 3 6 1 5 5 7 48 1 1': 'id-pkix-ocsp-basic'
}),
this.key('response').octstr()
)
);
});
exports.OCSPResponse = OCSPResponse;
var ResponseStatus = asn1.define('ResponseStatus', function() {
this.enum({
0: 'successful',
1: 'malformed_request',
2: 'internal_error',
3: 'try_later',
5: 'sig_required',
6: 'unauthorized'
});
});
exports.ResponseStatus = ResponseStatus;
var BasicOCSPResponse = asn1.define('BasicOCSPResponse', function() {
this.seq().obj(
this.key('tbsResponseData').use(ResponseData),
this.key('signatureAlgorithm').use(rfc5280.AlgorithmIdentifier),
this.key('signature').bitstr(),
this.key('certs').optional().explicit(0).seqof(rfc5280.Certificate)
);
});
exports.BasicOCSPResponse = BasicOCSPResponse;
var ResponseData = asn1.define('ResponseData', function() {
this.seq().obj(
this.key('version').def('v1').explicit(0).use(rfc5280.Version),
this.key('responderID').use(ResponderID),
this.key('producedAt').gentime(),
this.key('responses').seqof(SingleResponse),
this.key('responseExtensions').optional().explicit(0)
.seqof(rfc5280.Extension)
);
});
exports.ResponseData = ResponseData;
var ResponderID = asn1.define('ResponderId', function() {
this.choice({
byName: this.explicit(1).use(rfc5280.Name),
byKey: this.explicit(2).use(KeyHash)
});
});
exports.ResponderID = ResponderID;
var KeyHash = asn1.define('KeyHash', function() {
this.octstr();
});
exports.KeyHash = KeyHash;
var SingleResponse = asn1.define('SingleResponse', function() {
this.seq().obj(
this.key('certId').use(CertID),
this.key('certStatus').use(CertStatus),
this.key('thisUpdate').gentime(),
this.key('nextUpdate').optional().explicit(0).gentime(),
this.key('singleExtensions').optional().explicit(1).seqof(rfc5280.Extension)
);
});
exports.SingleResponse = SingleResponse;
var CertStatus = asn1.define('CertStatus', function() {
this.choice({
good: this.implicit(0).null_(),
revoked: this.implicit(1).use(RevokedInfo),
unknown: this.implicit(2).null_()
});
});
exports.CertStatus = CertStatus;
var RevokedInfo = asn1.define('RevokedInfo', function() {
this.seq().obj(
this.key('revocationTime').gentime(),
this.key('revocationReason').optional().explicit(0).use(rfc5280.ReasonCode)
);
});
exports.RevokedInfo = RevokedInfo;
var CertID = asn1.define('CertID', function() {
this.seq().obj(
this.key('hashAlgorithm').use(rfc5280.AlgorithmIdentifier),
this.key('issuerNameHash').octstr(),
this.key('issuerKeyHash').octstr(),
this.key('serialNumber').use(rfc5280.CertificateSerialNumber)
);
});
exports.CertID = CertID;
var Nonce = asn1.define('Nonce', function() {
this.octstr();
});
exports.Nonce = Nonce;
exports['id-pkix-ocsp'] = [ 1, 3, 6, 1, 5, 5, 7, 48, 1 ];
exports['id-pkix-ocsp-nonce'] = [ 1, 3, 6, 1, 5, 5, 7, 48, 1, 2 ];

60
node_modules/asn1.js-rfc2560/package.json generated vendored Normal file

@@ -0,0 +1,60 @@
{
"_from": "asn1.js-rfc2560@^4.0.0",
"_id": "asn1.js-rfc2560@4.0.6",
"_inBundle": false,
"_integrity": "sha512-ysf48ni+f/efNPilq4+ApbifUPcSW/xbDeQAh055I+grr2gXgNRQqHew7kkO70WSMQ2tEOURVwsK+dJqUNjIIg==",
"_location": "/asn1.js-rfc2560",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "asn1.js-rfc2560@^4.0.0",
"name": "asn1.js-rfc2560",
"escapedName": "asn1.js-rfc2560",
"rawSpec": "^4.0.0",
"saveSpec": null,
"fetchSpec": "^4.0.0"
},
"_requiredBy": [
"/ocsp"
],
"_resolved": "https://registry.npmjs.org/asn1.js-rfc2560/-/asn1.js-rfc2560-4.0.6.tgz",
"_shasum": "0975ce84768a8401e95884ad13e2d00e7b25a280",
"_spec": "asn1.js-rfc2560@^4.0.0",
"_where": "/home/ubuntu/OCSP/node_modules/ocsp",
"author": {
"name": "Fedor Indutny"
},
"bugs": {
"url": "https://github.com/indutny/asn1.js/issues"
},
"bundleDependencies": false,
"dependencies": {
"asn1.js-rfc5280": "^2.0.0"
},
"deprecated": false,
"description": "RFC2560 structures for asn1.js",
"devDependencies": {
"mocha": "^4.0.1"
},
"homepage": "https://github.com/indutny/asn1.js",
"keywords": [
"asn1",
"rfc2560",
"der"
],
"license": "MIT",
"main": "index.js",
"name": "asn1.js-rfc2560",
"peerDependencies": {
"asn1.js": "^4.4.0"
},
"repository": {
"type": "git",
"url": "git+ssh://git@github.com/indutny/asn1.js.git"
},
"scripts": {
"test": "mocha --reporter=spec test/*-test.js"
},
"version": "4.0.6"
}

892
node_modules/asn1.js-rfc5280/index.js generated vendored Normal file

@@ -0,0 +1,892 @@
var asn1 = require('asn1.js');
/**
* RFC5280 X509 and Extension Definitions
*/
var rfc5280 = exports;
// OIDs
var x509OIDs = {
'2 5 29 9': 'subjectDirectoryAttributes',
'2 5 29 14': 'subjectKeyIdentifier',
'2 5 29 15': 'keyUsage',
'2 5 29 17': 'subjectAlternativeName',
'2 5 29 18': 'issuerAlternativeName',
'2 5 29 19': 'basicConstraints',
'2 5 29 20': 'cRLNumber',
'2 5 29 21': 'reasonCode',
'2 5 29 24': 'invalidityDate',
'2 5 29 27': 'deltaCRLIndicator',
'2 5 29 28': 'issuingDistributionPoint',
'2 5 29 29': 'certificateIssuer',
'2 5 29 30': 'nameConstraints',
'2 5 29 31': 'cRLDistributionPoints',
'2 5 29 32': 'certificatePolicies',
'2 5 29 33': 'policyMappings',
'2 5 29 35': 'authorityKeyIdentifier',
'2 5 29 36': 'policyConstraints',
'2 5 29 37': 'extendedKeyUsage',
'2 5 29 46': 'freshestCRL',
'2 5 29 54': 'inhibitAnyPolicy',
'1 3 6 1 5 5 7 1 1': 'authorityInformationAccess',
'1 3 6 1 5 5 7 11': 'subjectInformationAccess'
};
// CertificateList ::= SEQUENCE {
// tbsCertList TBSCertList,
// signatureAlgorithm AlgorithmIdentifier,
// signature BIT STRING }
var CertificateList = asn1.define('CertificateList', function() {
this.seq().obj(
this.key('tbsCertList').use(TBSCertList),
this.key('signatureAlgorithm').use(AlgorithmIdentifier),
this.key('signature').bitstr()
);
});
rfc5280.CertificateList = CertificateList;
// AlgorithmIdentifier ::= SEQUENCE {
// algorithm OBJECT IDENTIFIER,
// parameters ANY DEFINED BY algorithm OPTIONAL }
var AlgorithmIdentifier = asn1.define('AlgorithmIdentifier', function() {
this.seq().obj(
this.key('algorithm').objid(),
this.key('parameters').optional().any()
);
});
rfc5280.AlgorithmIdentifier = AlgorithmIdentifier;
// Certificate ::= SEQUENCE {
// tbsCertificate TBSCertificate,
// signatureAlgorithm AlgorithmIdentifier,
// signature BIT STRING }
var Certificate = asn1.define('Certificate', function() {
this.seq().obj(
this.key('tbsCertificate').use(TBSCertificate),
this.key('signatureAlgorithm').use(AlgorithmIdentifier),
this.key('signature').bitstr()
);
});
rfc5280.Certificate = Certificate;
// TBSCertificate ::= SEQUENCE {
// version [0] Version DEFAULT v1,
// serialNumber CertificateSerialNumber,
// signature AlgorithmIdentifier,
// issuer Name,
// validity Validity,
// subject Name,
// subjectPublicKeyInfo SubjectPublicKeyInfo,
// issuerUniqueID [1] IMPLICIT UniqueIdentifier OPTIONAL,
// subjectUniqueID [2] IMPLICIT UniqueIdentifier OPTIONAL,
// extensions [3] Extensions OPTIONAL
var TBSCertificate = asn1.define('TBSCertificate', function() {
this.seq().obj(
this.key('version').def('v1').explicit(0).use(Version),
this.key('serialNumber').int(),
this.key('signature').use(AlgorithmIdentifier),
this.key('issuer').use(Name),
this.key('validity').use(Validity),
this.key('subject').use(Name),
this.key('subjectPublicKeyInfo').use(SubjectPublicKeyInfo),
this.key('issuerUniqueID').optional().implicit(1).bitstr(),
this.key('subjectUniqueID').optional().implicit(2).bitstr(),
this.key('extensions').optional().explicit(3).seqof(Extension)
);
});
rfc5280.TBSCertificate = TBSCertificate;
// Version ::= INTEGER { v1(0), v2(1), v3(2) }
var Version = asn1.define('Version', function() {
this.int({
0: 'v1',
1: 'v2',
2: 'v3'
});
});
rfc5280.Version = Version;
// Validity ::= SEQUENCE {
// notBefore Time,
// notAfter Time }
var Validity = asn1.define('Validity', function() {
this.seq().obj(
this.key('notBefore').use(Time),
this.key('notAfter').use(Time)
);
});
rfc5280.Validity = Validity;
// Time ::= CHOICE {
// utcTime UTCTime,
// generalTime GeneralizedTime }
var Time = asn1.define('Time', function() {
this.choice({
utcTime: this.utctime(),
genTime: this.gentime()
});
});
rfc5280.Time = Time;
// SubjectPublicKeyInfo ::= SEQUENCE {
// algorithm AlgorithmIdentifier,
// subjectPublicKey BIT STRING }
var SubjectPublicKeyInfo = asn1.define('SubjectPublicKeyInfo', function() {
this.seq().obj(
this.key('algorithm').use(AlgorithmIdentifier),
this.key('subjectPublicKey').bitstr()
);
});
rfc5280.SubjectPublicKeyInfo = SubjectPublicKeyInfo;
// TBSCertList ::= SEQUENCE {
// version Version OPTIONAL,
// signature AlgorithmIdentifier,
// issuer Name,
// thisUpdate Time,
// nextUpdate Time OPTIONAL,
// revokedCertificates SEQUENCE OF SEQUENCE {
// userCertificate CertificateSerialNumber,
// revocationDate Time,
// crlEntryExtensions Extensions OPTIONAL
// } OPTIONAL,
// crlExtensions [0] Extensions OPTIONAL }
var TBSCertList = asn1.define('TBSCertList', function() {
this.seq().obj(
this.key('version').optional().int(),
this.key('signature').use(AlgorithmIdentifier),
this.key('issuer').use(Name),
this.key('thisUpdate').use(Time),
this.key('nextUpdate').use(Time),
this.key('revokedCertificates').optional().seqof(RevokedCertificate),
this.key('crlExtensions').explicit(0).optional().seqof(Extension)
);
});
rfc5280.TBSCertList = TBSCertList;
var RevokedCertificate = asn1.define('RevokedCertificate', function() {
this.seq().obj(
this.key('userCertificate').use(CertificateSerialNumber),
this.key('revocationDate').use(Time),
this.key('crlEntryExtensions').optional().seqof(Extension)
)
});
// Extension ::= SEQUENCE {
// extnID OBJECT IDENTIFIER,
// critical BOOLEAN DEFAULT FALSE,
// extnValue OCTET STRING }
var Extension = asn1.define('Extension', function() {
this.seq().obj(
this.key('extnID').objid(x509OIDs),
this.key('critical').bool().def(false),
this.key('extnValue').octstr().contains(function(obj) {
var out = x509Extensions[obj.extnID];
// Cope with unknown extensions
return out ? out : asn1.define('OctString', function() { this.any() })
})
);
});
rfc5280.Extension = Extension;
// Name ::= CHOICE { -- only one possibility for now --
// rdnSequence RDNSequence }
var Name = asn1.define('Name', function() {
this.choice({
rdnSequence: this.use(RDNSequence)
});
});
rfc5280.Name = Name;
// GeneralName ::= CHOICE {
// otherName [0] AnotherName,
// rfc822Name [1] IA5String,
// dNSName [2] IA5String,
// x400Address [3] ORAddress,
// directoryName [4] Name,
// ediPartyName [5] EDIPartyName,
// uniformResourceIdentifier [6] IA5String,
// iPAddress [7] OCTET STRING,
// registeredID [8] OBJECT IDENTIFIER }
var GeneralName = asn1.define('GeneralName', function() {
this.choice({
otherName: this.implicit(0).use(AnotherName),
rfc822Name: this.implicit(1).ia5str(),
dNSName: this.implicit(2).ia5str(),
directoryName: this.explicit(4).use(Name),
ediPartyName: this.implicit(5).use(EDIPartyName),
uniformResourceIdentifier: this.implicit(6).ia5str(),
iPAddress: this.implicit(7).octstr(),
registeredID: this.implicit(8).objid()
});
});
rfc5280.GeneralName = GeneralName;
// GeneralNames ::= SEQUENCE SIZE (1..MAX) OF GeneralName
var GeneralNames = asn1.define('GeneralNames', function() {
this.seqof(GeneralName);
});
rfc5280.GeneralNames = GeneralNames;
// AnotherName ::= SEQUENCE {
// type-id OBJECT IDENTIFIER,
// value [0] EXPLICIT ANY DEFINED BY type-id }
var AnotherName = asn1.define('AnotherName', function() {
this.seq().obj(
this.key('type-id').objid(),
this.key('value').explicit(0).any()
);
});
rfc5280.AnotherName = AnotherName;
// EDIPartyName ::= SEQUENCE {
// nameAssigner [0] DirectoryString OPTIONAL,
// partyName [1] DirectoryString }
var EDIPartyName = asn1.define('EDIPartyName', function() {
this.seq().obj(
this.key('nameAssigner').implicit(0).optional().use(DirectoryString),
this.key('partyName').implicit(1).use(DirectoryString)
);
});
rfc5280.EDIPartyName = EDIPartyName;
// RDNSequence ::= SEQUENCE OF RelativeDistinguishedName
var RDNSequence = asn1.define('RDNSequence', function() {
this.seqof(RelativeDistinguishedName);
});
rfc5280.RDNSequence = RDNSequence;
// RelativeDistinguishedName ::=
// SET SIZE (1..MAX) OF AttributeTypeAndValue
var RelativeDistinguishedName = asn1.define('RelativeDistinguishedName',
function() {
this.setof(AttributeTypeAndValue);
});
rfc5280.RelativeDistinguishedName = RelativeDistinguishedName;
// AttributeTypeAndValue ::= SEQUENCE {
// type AttributeType,
// value AttributeValue }
var AttributeTypeAndValue = asn1.define('AttributeTypeAndValue', function() {
this.seq().obj(
this.key('type').use(AttributeType),
this.key('value').use(AttributeValue)
);
});
rfc5280.AttributeTypeAndValue = AttributeTypeAndValue;
// Attribute ::= SEQUENCE {
// type AttributeType,
// values SET OF AttributeValue }
var Attribute = asn1.define('Attribute', function() {
this.seq().obj(
this.key('type').use(AttributeType),
this.key('values').setof(AttributeValue)
);
});
rfc5280.Attribute = Attribute;
// AttributeType ::= OBJECT IDENTIFIER
var AttributeType = asn1.define('AttributeType', function() {
this.objid();
});
rfc5280.AttributeType = AttributeType;
// AttributeValue ::= ANY -- DEFINED BY AttributeType
var AttributeValue = asn1.define('AttributeValue', function() {
this.any();
});
rfc5280.AttributeValue = AttributeValue;
// DirectoryString ::= CHOICE {
// teletexString TeletexString (SIZE (1..MAX)),
// printableString PrintableString (SIZE (1..MAX)),
// universalString UniversalString (SIZE (1..MAX)),
// utf8String UTF8String (SIZE (1..MAX)),
// bmpString BMPString (SIZE (1..MAX)) }
var DirectoryString = asn1.define('DirectoryString', function() {
this.choice({
teletexString: this.t61str(),
printableString: this.printstr(),
universalString: this.unistr(),
utf8String: this.utf8str(),
bmpString: this.bmpstr()
});
});
rfc5280.DirectoryString = DirectoryString;
// AuthorityKeyIdentifier ::= SEQUENCE {
// keyIdentifier [0] KeyIdentifier OPTIONAL,
// authorityCertIssuer [1] GeneralNames OPTIONAL,
// authorityCertSerialNumber [2] CertificateSerialNumber OPTIONAL }
var AuthorityKeyIdentifier = asn1.define('AuthorityKeyIdentifier', function() {
this.seq().obj(
this.key('keyIdentifier').implicit(0).optional().use(KeyIdentifier),
this.key('authorityCertIssuer').implicit(1).optional().use(GeneralNames),
this.key('authorityCertSerialNumber').implicit(2).optional()
.use(CertificateSerialNumber)
);
});
rfc5280.AuthorityKeyIdentifier = AuthorityKeyIdentifier;
// KeyIdentifier ::= OCTET STRING
var KeyIdentifier = asn1.define('KeyIdentifier', function() {
this.octstr();
});
rfc5280.KeyIdentifier = KeyIdentifier;
// CertificateSerialNumber ::= INTEGER
var CertificateSerialNumber = asn1.define('CertificateSerialNumber',
function() {
this.int();
});
rfc5280.CertificateSerialNumber = CertificateSerialNumber;
// ORAddress ::= SEQUENCE {
// built-in-standard-attributes BuiltInStandardAttributes,
// built-in-domain-defined-attributes BuiltInDomainDefinedAttributes
// OPTIONAL,
// extension-attributes ExtensionAttributes OPTIONAL }
var ORAddress = asn1.define('ORAddress', function() {
this.seq().obj(
this.key('builtInStandardAttributes').use(BuiltInStandardAttributes),
this.key('builtInDomainDefinedAttributes').optional()
.use(BuiltInDomainDefinedAttributes),
this.key('extensionAttributes').optional().use(ExtensionAttributes)
);
});
rfc5280.ORAddress = ORAddress;
// BuiltInStandardAttributes ::= SEQUENCE {
// country-name CountryName OPTIONAL,
// administration-domain-name AdministrationDomainName OPTIONAL,
// network-address [0] IMPLICIT NetworkAddress OPTIONAL,
// terminal-identifier [1] IMPLICIT TerminalIdentifier OPTIONAL,
// private-domain-name [2] PrivateDomainName OPTIONAL,
// organization-name [3] IMPLICIT OrganizationName OPTIONAL,
// numeric-user-identifier [4] IMPLICIT NumericUserIdentifier OPTIONAL,
// personal-name [5] IMPLICIT PersonalName OPTIONAL,
// organizational-unit-names [6] IMPLICIT OrganizationalUnitNames OPTIONAL }
var BuiltInStandardAttributes = asn1.define('BuiltInStandardAttributes',
function() {
this.seq().obj(
this.key('countryName').optional().use(CountryName),
this.key('administrationDomainName').optional()
.use(AdministrationDomainName),
this.key('networkAddress').implicit(0).optional().use(NetworkAddress),
this.key('terminalIdentifier').implicit(1).optional()
.use(TerminalIdentifier),
this.key('privateDomainName').explicit(2).optional().use(PrivateDomainName),
this.key('organizationName').implicit(3).optional().use(OrganizationName),
this.key('numericUserIdentifier').implicit(4).optional()
.use(NumericUserIdentifier),
this.key('personalName').implicit(5).optional().use(PersonalName),
this.key('organizationalUnitNames').implicit(6).optional()
.use(OrganizationalUnitNames)
);
});
rfc5280.BuiltInStandardAttributes = BuiltInStandardAttributes;
// CountryName ::= CHOICE {
// x121-dcc-code NumericString,
// iso-3166-alpha2-code PrintableString }
var CountryName = asn1.define('CountryName', function() {
this.choice({
x121DccCode: this.numstr(),
iso3166Alpha2Code: this.printstr()
});
});
rfc5280.CountryName = CountryName;
// AdministrationDomainName ::= CHOICE {
// numeric NumericString,
// printable PrintableString }
var AdministrationDomainName = asn1.define('AdministrationDomainName',
function() {
this.choice({
numeric: this.numstr(),
printable: this.printstr()
});
});
rfc5280.AdministrationDomainName = AdministrationDomainName;
// NetworkAddress ::= X121Address
var NetworkAddress = asn1.define('NetworkAddress', function() {
this.use(X121Address);
});
rfc5280.NetworkAddress = NetworkAddress;
// X121Address ::= NumericString
var X121Address = asn1.define('X121Address', function() {
this.numstr();
});
rfc5280.X121Address = X121Address;
// TerminalIdentifier ::= PrintableString
var TerminalIdentifier = asn1.define('TerminalIdentifier', function() {
this.printstr();
});
rfc5280.TerminalIdentifier = TerminalIdentifier;
// PrivateDomainName ::= CHOICE {
// numeric NumericString,
// printable PrintableString }
var PrivateDomainName = asn1.define('PrivateDomainName', function() {
this.choice({
numeric: this.numstr(),
printable: this.printstr()
});
});
rfc5280.PrivateDomainName = PrivateDomainName;
// OrganizationName ::= PrintableString
var OrganizationName = asn1.define('OrganizationName', function() {
this.printstr();
});
rfc5280.OrganizationName = OrganizationName;
// NumericUserIdentifier ::= NumericString
var NumericUserIdentifier = asn1.define('NumericUserIdentifier', function() {
this.numstr();
});
rfc5280.NumericUserIdentifier = NumericUserIdentifier;
// PersonalName ::= SET {
// surname [0] IMPLICIT PrintableString,
// given-name [1] IMPLICIT PrintableString OPTIONAL,
// initials [2] IMPLICIT PrintableString OPTIONAL,
// generation-qualifier [3] IMPLICIT PrintableString OPTIONAL }
var PersonalName = asn1.define('PersonalName', function() {
this.set().obj(
this.key('surname').implicit(0).printstr(),
this.key('givenName').implicit(1).printstr(),
this.key('initials').implicit(2).printstr(),
this.key('generationQualifier').implicit(3).printstr()
);
});
rfc5280.PersonalName = PersonalName;
// OrganizationalUnitNames ::= SEQUENCE SIZE (1..ub-organizational-units)
// OF OrganizationalUnitName
var OrganizationalUnitNames = asn1.define('OrganizationalUnitNames',
function() {
this.seqof(OrganizationalUnitName);
});
rfc5280.OrganizationalUnitNames = OrganizationalUnitNames;
// OrganizationalUnitName ::= PrintableString (SIZE
// (1..ub-organizational-unit-name-length))
var OrganizationalUnitName = asn1.define('OrganizationalUnitName', function() {
this.printstr();
});
rfc5280.OrganizationalUnitName = OrganizationalUnitName;
// BuiltInDomainDefinedAttributes ::= SEQUENCE SIZE
// (1..ub-domain-defined-attributes)
// OF BuiltInDomainDefinedAttribute
var BuiltInDomainDefinedAttributes = asn1.define(
'BuiltInDomainDefinedAttributes', function() {
this.seqof(BuiltInDomainDefinedAttribute);
});
rfc5280.BuiltInDomainDefinedAttributes = BuiltInDomainDefinedAttributes;
// BuiltInDomainDefinedAttribute ::= SEQUENCE {
// type PrintableString (SIZE (1..ub-domain-defined-attribute-type-length)),
// value PrintableString (SIZE (1..ub-domain-defined-attribute-value-length))
//}
var BuiltInDomainDefinedAttribute = asn1.define('BuiltInDomainDefinedAttribute',
function() {
this.seq().obj(
this.key('type').printstr(),
this.key('value').printstr()
);
});
rfc5280.BuiltInDomainDefinedAttribute = BuiltInDomainDefinedAttribute;
// ExtensionAttributes ::= SET SIZE (1..ub-extension-attributes) OF
// ExtensionAttribute
var ExtensionAttributes = asn1.define('ExtensionAttributes', function() {
this.seqof(ExtensionAttribute);
});
rfc5280.ExtensionAttributes = ExtensionAttributes;
// ExtensionAttribute ::= SEQUENCE {
// extension-attribute-type [0] IMPLICIT INTEGER,
// extension-attribute-value [1] ANY DEFINED BY extension-attribute-type }
var ExtensionAttribute = asn1.define('ExtensionAttribute', function() {
this.seq().obj(
this.key('extensionAttributeType').implicit(0).int(),
this.key('extensionAttributeValue').any().explicit(1).int()
);
});
rfc5280.ExtensionAttribute = ExtensionAttribute;
// SubjectKeyIdentifier ::= KeyIdentifier
var SubjectKeyIdentifier = asn1.define('SubjectKeyIdentifier', function() {
this.use(KeyIdentifier);
});
rfc5280.SubjectKeyIdentifier = SubjectKeyIdentifier;
// KeyUsage ::= BIT STRING {
// digitalSignature (0),
// nonRepudiation (1), -- recent editions of X.509 have
// -- renamed this bit to contentCommitment
// keyEncipherment (2),
// dataEncipherment (3),
// keyAgreement (4),
// keyCertSign (5),
// cRLSign (6),
// encipherOnly (7),
// decipherOnly (8) }
var KeyUsage = asn1.define('KeyUsage', function() {
this.bitstr();
});
rfc5280.KeyUsage = KeyUsage;
// CertificatePolicies ::= SEQUENCE SIZE (1..MAX) OF PolicyInformation
var CertificatePolicies = asn1.define('CertificatePolicies', function() {
this.seqof(PolicyInformation);
});
rfc5280.CertificatePolicies = CertificatePolicies;
// PolicyInformation ::= SEQUENCE {
// policyIdentifier CertPolicyId,
// policyQualifiers SEQUENCE SIZE (1..MAX) OF PolicyQualifierInfo
// OPTIONAL }
var PolicyInformation = asn1.define('PolicyInformation', function() {
this.seq().obj(
this.key('policyIdentifier').use(CertPolicyId),
this.key('policyQualifiers').optional().use(PolicyQualifiers)
);
});
rfc5280.PolicyInformation = PolicyInformation;
// CertPolicyId ::= OBJECT IDENTIFIER
var CertPolicyId = asn1.define('CertPolicyId', function() {
this.objid();
});
rfc5280.CertPolicyId = CertPolicyId;
var PolicyQualifiers = asn1.define('PolicyQualifiers', function() {
this.seqof(PolicyQualifierInfo);
});
rfc5280.PolicyQualifiers = PolicyQualifiers;
// PolicyQualifierInfo ::= SEQUENCE {
// policyQualifierId PolicyQualifierId,
// qualifier ANY DEFINED BY policyQualifierId }
var PolicyQualifierInfo = asn1.define('PolicyQualifierInfo', function() {
this.seq().obj(
this.key('policyQualifierId').use(PolicyQualifierId),
this.key('qualifier').any()
);
});
rfc5280.PolicyQualifierInfo = PolicyQualifierInfo;
// PolicyQualifierId ::= OBJECT IDENTIFIER
var PolicyQualifierId = asn1.define('PolicyQualifierId', function() {
this.objid();
});
rfc5280.PolicyQualifierId = PolicyQualifierId;
// PolicyMappings ::= SEQUENCE SIZE (1..MAX) OF SEQUENCE {
// issuerDomainPolicy CertPolicyId,
// subjectDomainPolicy CertPolicyId }
var PolicyMappings = asn1.define('PolicyMappings', function() {
this.seqof(PolicyMapping);
});
rfc5280.PolicyMappings = PolicyMappings;
var PolicyMapping = asn1.define('PolicyMapping', function() {
this.seq().obj(
this.key('issuerDomainPolicy').use(CertPolicyId),
this.key('subjectDomainPolicy').use(CertPolicyId)
);
});
rfc5280.PolicyMapping = PolicyMapping;
// SubjectAltName ::= GeneralNames
var SubjectAlternativeName = asn1.define('SubjectAlternativeName', function() {
this.use(GeneralNames);
});
rfc5280.SubjectAlternativeName = SubjectAlternativeName;
// IssuerAltName ::= GeneralNames
var IssuerAlternativeName = asn1.define('IssuerAlternativeName', function() {
this.use(GeneralNames);
});
rfc5280.IssuerAlternativeName = IssuerAlternativeName;
// SubjectDirectoryAttributes ::= SEQUENCE SIZE (1..MAX) OF Attribute
var SubjectDirectoryAttributes = asn1.define('SubjectDirectoryAttributes',
function() {
this.seqof(Attribute);
});
rfc5280.SubjectDirectoryAttributes = SubjectDirectoryAttributes;
// BasicConstraints ::= SEQUENCE {
// cA BOOLEAN DEFAULT FALSE,
// pathLenConstraint INTEGER (0..MAX) OPTIONAL }
var BasicConstraints = asn1.define('BasicConstraints', function() {
this.seq().obj(
this.key('cA').bool().def(false),
this.key('pathLenConstraint').optional().int()
);
});
rfc5280.BasicConstraints = BasicConstraints;
// NameConstraints ::= SEQUENCE {
// permittedSubtrees [0] GeneralSubtrees OPTIONAL,
// excludedSubtrees [1] GeneralSubtrees OPTIONAL }
var NameConstraints = asn1.define('NameConstraints', function() {
this.seq().obj(
this.key('permittedSubtrees').implicit(0).optional().use(GeneralSubtrees),
this.key('excludedSubtrees').implicit(1).optional().use(GeneralSubtrees)
);
});
rfc5280.NameConstraints = NameConstraints;
// GeneralSubtrees ::= SEQUENCE SIZE (1..MAX) OF GeneralSubtree
var GeneralSubtrees = asn1.define('GeneralSubtrees', function() {
this.seqof(GeneralSubtree);
});
rfc5280.GeneralSubtrees = GeneralSubtrees;
// GeneralSubtree ::= SEQUENCE {
// base GeneralName,
// minimum [0] BaseDistance DEFAULT 0,
// maximum [1] BaseDistance OPTIONAL }
var GeneralSubtree = asn1.define('GeneralSubtree', function() {
this.seq().obj(
this.key('base').use(GeneralName),
this.key('minimum').implicit(0).def(0).use(BaseDistance),
this.key('maximum').implicit(1).optional().use(BaseDistance) // tag [1] per the GeneralSubtree definition above
);
});
rfc5280.GeneralSubtree = GeneralSubtree;
// BaseDistance ::= INTEGER
var BaseDistance = asn1.define('BaseDistance', function() {
this.int();
});
rfc5280.BaseDistance = BaseDistance;
// PolicyConstraints ::= SEQUENCE {
// requireExplicitPolicy [0] SkipCerts OPTIONAL,
// inhibitPolicyMapping [1] SkipCerts OPTIONAL }
var PolicyConstraints = asn1.define('PolicyConstraints', function() {
this.seq().obj(
this.key('requireExplicitPolicy').implicit(0).optional().use(SkipCerts),
this.key('inhibitPolicyMapping').implicit(1).optional().use(SkipCerts)
);
});
rfc5280.PolicyConstraints = PolicyConstraints;
// SkipCerts ::= INTEGER
var SkipCerts = asn1.define('SkipCerts', function() {
this.int();
});
rfc5280.SkipCerts = SkipCerts;
// ExtKeyUsageSyntax ::= SEQUENCE SIZE (1..MAX) OF KeyPurposeId
var ExtendedKeyUsage = asn1.define('ExtendedKeyUsage', function() {
this.seqof(KeyPurposeId);
});
rfc5280.ExtendedKeyUsage = ExtendedKeyUsage;
// KeyPurposeId ::= OBJECT IDENTIFIER
var KeyPurposeId = asn1.define('KeyPurposeId', function() {
this.objid();
});
rfc5280.KeyPurposeId = KeyPurposeId;
// CRLDistributionPoints ::= SEQUENCE SIZE (1..MAX) OF DistributionPoint
var CRLDistributionPoints = asn1.define('CRLDistributionPoints', function() {
this.seqof(DistributionPoint);
});
rfc5280.CRLDistributionPoints = CRLDistributionPoints;
// DistributionPoint ::= SEQUENCE {
// distributionPoint [0] DistributionPointName OPTIONAL,
// reasons [1] ReasonFlags OPTIONAL,
// cRLIssuer [2] GeneralNames OPTIONAL }
var DistributionPoint = asn1.define('DistributionPoint', function() {
this.seq().obj(
this.key('distributionPoint').optional().explicit(0)
.use(DistributionPointName),
this.key('reasons').optional().implicit(1).use(ReasonFlags),
this.key('cRLIssuer').optional().implicit(2).use(GeneralNames)
);
});
rfc5280.DistributionPoint = DistributionPoint;
// DistributionPointName ::= CHOICE {
// fullName [0] GeneralNames,
// nameRelativeToCRLIssuer [1] RelativeDistinguishedName }
var DistributionPointName = asn1.define('DistributionPointName', function() {
this.choice({
fullName: this.implicit(0).use(GeneralNames),
nameRelativeToCRLIssuer: this.implicit(1).use(RelativeDistinguishedName)
});
});
rfc5280.DistributionPointName = DistributionPointName;
// ReasonFlags ::= BIT STRING {
// unused (0),
// keyCompromise (1),
// cACompromise (2),
// affiliationChanged (3),
// superseded (4),
// cessationOfOperation (5),
// certificateHold (6),
// privilegeWithdrawn (7),
// aACompromise (8) }
var ReasonFlags = asn1.define('ReasonFlags', function() {
this.bitstr();
});
rfc5280.ReasonFlags = ReasonFlags;
// InhibitAnyPolicy ::= SkipCerts
var InhibitAnyPolicy = asn1.define('InhibitAnyPolicy', function() {
this.use(SkipCerts);
});
rfc5280.InhibitAnyPolicy = InhibitAnyPolicy;
// FreshestCRL ::= CRLDistributionPoints
var FreshestCRL = asn1.define('FreshestCRL', function() {
this.use(CRLDistributionPoints);
});
rfc5280.FreshestCRL = FreshestCRL;
// AuthorityInfoAccessSyntax ::=
// SEQUENCE SIZE (1..MAX) OF AccessDescription
var AuthorityInfoAccessSyntax = asn1.define('AuthorityInfoAccessSyntax',
function() {
this.seqof(AccessDescription);
});
rfc5280.AuthorityInfoAccessSyntax = AuthorityInfoAccessSyntax;
// AccessDescription ::= SEQUENCE {
// accessMethod OBJECT IDENTIFIER,
// accessLocation GeneralName }
var AccessDescription = asn1.define('AccessDescription', function() {
this.seq().obj(
this.key('accessMethod').objid(),
this.key('accessLocation').use(GeneralName)
);
});
rfc5280.AccessDescription = AccessDescription;
// SubjectInfoAccessSyntax ::=
// SEQUENCE SIZE (1..MAX) OF AccessDescription
var SubjectInformationAccess = asn1.define('SubjectInformationAccess',
function() {
this.seqof(AccessDescription);
});
rfc5280.SubjectInformationAccess = SubjectInformationAccess;
/**
* CRL Extensions
*/
// CRLNumber ::= INTEGER
var CRLNumber = asn1.define('CRLNumber', function() {
this.int();
});
rfc5280.CRLNumber = CRLNumber;
var DeltaCRLIndicator = asn1.define('DeltaCRLIndicator', function() {
this.use(CRLNumber);
});
rfc5280.DeltaCRLIndicator = DeltaCRLIndicator;
// IssuingDistributionPoint ::= SEQUENCE {
// distributionPoint [0] DistributionPointName OPTIONAL,
// onlyContainsUserCerts [1] BOOLEAN DEFAULT FALSE,
// onlyContainsCACerts [2] BOOLEAN DEFAULT FALSE,
// onlySomeReasons [3] ReasonFlags OPTIONAL,
// indirectCRL [4] BOOLEAN DEFAULT FALSE,
// onlyContainsAttributeCerts [5] BOOLEAN DEFAULT FALSE }
var IssuingDistributionPoint = asn1.define('IssuingDistributionPoint',
function() {
this.seq().obj(
this.key('distributionPoint').explicit(0).optional()
.use(DistributionPointName),
this.key('onlyContainsUserCerts').implicit(1).def(false).bool(),
this.key('onlyContainsCACerts').implicit(2).def(false).bool(),
this.key('onlySomeReasons').implicit(3).optional().use(ReasonFlags),
this.key('indirectCRL').implicit(4).def(false).bool(),
this.key('onlyContainsAttributeCerts').implicit(5).def(false).bool()
);
});
rfc5280.IssuingDistributionPoint = IssuingDistributionPoint;
// CRLReason ::= ENUMERATED {
// unspecified (0),
// keyCompromise (1),
// cACompromise (2),
// affiliationChanged (3),
// superseded (4),
// cessationOfOperation (5),
// certificateHold (6),
// -- value 7 is not used
// removeFromCRL (8),
// privilegeWithdrawn (9),
// aACompromise (10) }
var ReasonCode = asn1.define('ReasonCode', function() {
this.enum({
0: 'unspecified',
1: 'keyCompromise',
2: 'cACompromise',
3: 'affiliationChanged',
4: 'superseded',
5: 'cessationOfOperation',
6: 'certificateHold',
8: 'removeFromCRL',
9: 'privilegeWithdrawn',
10: 'aACompromise'
});
});
rfc5280.ReasonCode = ReasonCode;
// InvalidityDate ::= GeneralizedTime
var InvalidityDate = asn1.define('InvalidityDate', function() {
this.gentime();
});
rfc5280.InvalidityDate = InvalidityDate;
// CertificateIssuer ::= GeneralNames
var CertificateIssuer = asn1.define('CertificateIssuer', function() {
this.use(GeneralNames);
});
rfc5280.CertificateIssuer = CertificateIssuer;
// OID label to extension model mapping
var x509Extensions = {
subjectDirectoryAttributes: SubjectDirectoryAttributes,
subjectKeyIdentifier: SubjectKeyIdentifier,
keyUsage: KeyUsage,
subjectAlternativeName: SubjectAlternativeName,
issuerAlternativeName: IssuerAlternativeName,
basicConstraints: BasicConstraints,
cRLNumber: CRLNumber,
reasonCode: ReasonCode,
invalidityDate: InvalidityDate,
deltaCRLIndicator: DeltaCRLIndicator,
issuingDistributionPoint: IssuingDistributionPoint,
certificateIssuer: CertificateIssuer,
nameConstraints: NameConstraints,
cRLDistributionPoints: CRLDistributionPoints,
certificatePolicies: CertificatePolicies,
policyMappings: PolicyMappings,
authorityKeyIdentifier: AuthorityKeyIdentifier,
policyConstraints: PolicyConstraints,
extendedKeyUsage: ExtendedKeyUsage,
freshestCRL: FreshestCRL,
inhibitAnyPolicy: InhibitAnyPolicy,
authorityInformationAccess: AuthorityInfoAccessSyntax,
subjectInformationAccess: SubjectInformationAccess
};

58
node_modules/asn1.js-rfc5280/package.json generated vendored Normal file
View file

@ -0,0 +1,58 @@
{
"_from": "asn1.js-rfc5280@^2.0.0",
"_id": "asn1.js-rfc5280@2.0.1",
"_inBundle": false,
"_integrity": "sha512-1e2ypnvTbYD/GdxWK77tdLBahvo1fZUHlQJqAVUuZWdYj0rdjGcf2CWYUtbsyRYpYUMwMWLZFUtLxog8ZXTrcg==",
"_location": "/asn1.js-rfc5280",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "asn1.js-rfc5280@^2.0.0",
"name": "asn1.js-rfc5280",
"escapedName": "asn1.js-rfc5280",
"rawSpec": "^2.0.0",
"saveSpec": null,
"fetchSpec": "^2.0.0"
},
"_requiredBy": [
"/asn1.js-rfc2560",
"/ocsp"
],
"_resolved": "https://registry.npmjs.org/asn1.js-rfc5280/-/asn1.js-rfc5280-2.0.1.tgz",
"_shasum": "072f3dfc03f86d1faae7485c6197584ba2bb5ddc",
"_spec": "asn1.js-rfc5280@^2.0.0",
"_where": "/home/ubuntu/OCSP/node_modules/ocsp",
"author": {
"name": "Felix Hanley"
},
"bugs": {
"url": "https://github.com/indutny/asn1.js/issues"
},
"bundleDependencies": false,
"dependencies": {
"asn1.js": "^4.5.0"
},
"deprecated": false,
"description": "RFC5280 extension structures for asn1.js",
"devDependencies": {
"mocha": "^4.0.1"
},
"homepage": "https://github.com/indutny/asn1.js",
"keywords": [
"asn1",
"rfc5280",
"der"
],
"license": "MIT",
"main": "index.js",
"name": "asn1.js-rfc5280",
"repository": {
"type": "git",
"url": "git+ssh://git@github.com/indutny/asn1.js.git"
},
"scripts": {
"test": "mocha --reporter=spec test/*-test.js"
},
"version": "2.0.1"
}

100
node_modules/asn1.js/README.md generated vendored Normal file
View file

@ -0,0 +1,100 @@
# ASN1.js
ASN.1 DER Encoder/Decoder and DSL.
## Example
Define model:
```javascript
var asn = require('asn1.js');
var Human = asn.define('Human', function() {
this.seq().obj(
this.key('firstName').octstr(),
this.key('lastName').octstr(),
this.key('age').int(),
this.key('gender').enum({ 0: 'male', 1: 'female' }),
this.key('bio').seqof(Bio)
);
});
var Bio = asn.define('Bio', function() {
this.seq().obj(
this.key('time').gentime(),
this.key('description').octstr()
);
});
```
Encode data:
```javascript
var output = Human.encode({
firstName: 'Thomas',
lastName: 'Anderson',
age: 28,
gender: 'male',
bio: [
{
time: +new Date('31 March 1999'),
description: 'freedom of mind'
}
]
}, 'der');
```
Decode data:
```javascript
var human = Human.decode(output, 'der');
console.log(human);
/*
{ firstName: <Buffer 54 68 6f 6d 61 73>,
lastName: <Buffer 41 6e 64 65 72 73 6f 6e>,
age: 28,
gender: 'male',
bio:
[ { time: 922820400000,
description: <Buffer 66 72 65 65 64 6f 6d 20 6f 66 20 6d 69 6e 64> } ] }
*/
```
### Partial decode
It's possible to parse data without stopping on the first error. To do so, call:
```javascript
var human = Human.decode(output, 'der', { partial: true });
console.log(human);
/*
{ result: { ... },
errors: [ ... ] }
*/
```
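### PEM
The bundled encoders and decoders also include a PEM variant (see `lib/asn1/encoders/pem.js` and `lib/asn1/decoders/pem.js` further down in this commit). A minimal sketch, reusing the `Human` model defined above; the `'HUMAN'` label is an arbitrary choice for illustration:
```javascript
// Encode to PEM: the DER bytes are base64-wrapped between BEGIN/END
// markers that carry the supplied label.
var pem = Human.encode({
  firstName: 'Thomas',
  lastName: 'Anderson',
  age: 28,
  gender: 'male',
  bio: []
}, 'pem', { label: 'HUMAN' });

// Decoding needs the same label so the correct PEM section is located.
var parsed = Human.decode(pem, 'pem', { label: 'HUMAN' });
```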
#### LICENSE
This software is licensed under the MIT License.
Copyright Fedor Indutny, 2013.
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to permit
persons to whom the Software is furnished to do so, subject to the
following conditions:
The above copyright notice and this permission notice shall be included
in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
USE OR OTHER DEALINGS IN THE SOFTWARE.

9
node_modules/asn1.js/lib/asn1.js generated vendored Normal file
View file

@ -0,0 +1,9 @@
var asn1 = exports;
asn1.bignum = require('bn.js');
asn1.define = require('./asn1/api').define;
asn1.base = require('./asn1/base');
asn1.constants = require('./asn1/constants');
asn1.decoders = require('./asn1/decoders');
asn1.encoders = require('./asn1/encoders');

61
node_modules/asn1.js/lib/asn1/api.js generated vendored Normal file
View file

@ -0,0 +1,61 @@
var asn1 = require('../asn1');
var inherits = require('inherits');
var api = exports;
api.define = function define(name, body) {
return new Entity(name, body);
};
function Entity(name, body) {
this.name = name;
this.body = body;
this.decoders = {};
this.encoders = {};
};
Entity.prototype._createNamed = function createNamed(base) {
var named;
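// Try to compile a constructor whose function name matches the entity name
// (nicer stack traces); fall back to an anonymous wrapper where vm
// evaluation is unavailable (e.g. in browsers).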
try {
named = require('vm').runInThisContext(
'(function ' + this.name + '(entity) {\n' +
' this._initNamed(entity);\n' +
'})'
);
} catch (e) {
named = function (entity) {
this._initNamed(entity);
};
}
inherits(named, base);
named.prototype._initNamed = function initnamed(entity) {
base.call(this, entity);
};
return new named(this);
};
Entity.prototype._getDecoder = function _getDecoder(enc) {
enc = enc || 'der';
// Lazily create decoder
if (!this.decoders.hasOwnProperty(enc))
this.decoders[enc] = this._createNamed(asn1.decoders[enc]);
return this.decoders[enc];
};
Entity.prototype.decode = function decode(data, enc, options) {
return this._getDecoder(enc).decode(data, options);
};
Entity.prototype._getEncoder = function _getEncoder(enc) {
enc = enc || 'der';
// Lazily create encoder
if (!this.encoders.hasOwnProperty(enc))
this.encoders[enc] = this._createNamed(asn1.encoders[enc]);
return this.encoders[enc];
};
Entity.prototype.encode = function encode(data, enc, /* internal */ reporter) {
return this._getEncoder(enc).encode(data, reporter);
};

116
node_modules/asn1.js/lib/asn1/base/buffer.js generated vendored Normal file
View file

@ -0,0 +1,116 @@
var inherits = require('inherits');
var Reporter = require('../base').Reporter;
var Buffer = require('buffer').Buffer;
function DecoderBuffer(base, options) {
Reporter.call(this, options);
if (!Buffer.isBuffer(base)) {
this.error('Input not Buffer');
return;
}
this.base = base;
this.offset = 0;
this.length = base.length;
}
inherits(DecoderBuffer, Reporter);
exports.DecoderBuffer = DecoderBuffer;
DecoderBuffer.prototype.save = function save() {
return { offset: this.offset, reporter: Reporter.prototype.save.call(this) };
};
DecoderBuffer.prototype.restore = function restore(save) {
// Return skipped data
var res = new DecoderBuffer(this.base);
res.offset = save.offset;
res.length = this.offset;
this.offset = save.offset;
Reporter.prototype.restore.call(this, save.reporter);
return res;
};
DecoderBuffer.prototype.isEmpty = function isEmpty() {
return this.offset === this.length;
};
DecoderBuffer.prototype.readUInt8 = function readUInt8(fail) {
if (this.offset + 1 <= this.length)
return this.base.readUInt8(this.offset++, true);
else
return this.error(fail || 'DecoderBuffer overrun');
}
DecoderBuffer.prototype.skip = function skip(bytes, fail) {
if (!(this.offset + bytes <= this.length))
return this.error(fail || 'DecoderBuffer overrun');
var res = new DecoderBuffer(this.base);
// Share reporter state
res._reporterState = this._reporterState;
res.offset = this.offset;
res.length = this.offset + bytes;
this.offset += bytes;
return res;
}
DecoderBuffer.prototype.raw = function raw(save) {
return this.base.slice(save ? save.offset : this.offset, this.length);
}
function EncoderBuffer(value, reporter) {
if (Array.isArray(value)) {
this.length = 0;
this.value = value.map(function(item) {
if (!(item instanceof EncoderBuffer))
item = new EncoderBuffer(item, reporter);
this.length += item.length;
return item;
}, this);
} else if (typeof value === 'number') {
if (!(0 <= value && value <= 0xff))
return reporter.error('non-byte EncoderBuffer value');
this.value = value;
this.length = 1;
} else if (typeof value === 'string') {
this.value = value;
this.length = Buffer.byteLength(value);
} else if (Buffer.isBuffer(value)) {
this.value = value;
this.length = value.length;
} else {
return reporter.error('Unsupported type: ' + typeof value);
}
}
exports.EncoderBuffer = EncoderBuffer;
EncoderBuffer.prototype.join = function join(out, offset) {
if (!out)
out = new Buffer(this.length);
if (!offset)
offset = 0;
if (this.length === 0)
return out;
if (Array.isArray(this.value)) {
this.value.forEach(function(item) {
item.join(out, offset);
offset += item.length;
});
} else {
if (typeof this.value === 'number')
out[offset] = this.value;
else if (typeof this.value === 'string')
out.write(this.value, offset);
else if (Buffer.isBuffer(this.value))
this.value.copy(out, offset);
offset += this.length;
}
return out;
};

6
node_modules/asn1.js/lib/asn1/base/index.js generated vendored Normal file
View file

@ -0,0 +1,6 @@
var base = exports;
base.Reporter = require('./reporter').Reporter;
base.DecoderBuffer = require('./buffer').DecoderBuffer;
base.EncoderBuffer = require('./buffer').EncoderBuffer;
base.Node = require('./node');

634
node_modules/asn1.js/lib/asn1/base/node.js generated vendored Normal file
View file

@ -0,0 +1,634 @@
var Reporter = require('../base').Reporter;
var EncoderBuffer = require('../base').EncoderBuffer;
var DecoderBuffer = require('../base').DecoderBuffer;
var assert = require('minimalistic-assert');
// Supported tags
var tags = [
'seq', 'seqof', 'set', 'setof', 'objid', 'bool',
'gentime', 'utctime', 'null_', 'enum', 'int', 'objDesc',
'bitstr', 'bmpstr', 'charstr', 'genstr', 'graphstr', 'ia5str', 'iso646str',
'numstr', 'octstr', 'printstr', 't61str', 'unistr', 'utf8str', 'videostr'
];
// Public methods list
var methods = [
'key', 'obj', 'use', 'optional', 'explicit', 'implicit', 'def', 'choice',
'any', 'contains'
].concat(tags);
// Overrided methods list
var overrided = [
'_peekTag', '_decodeTag', '_use',
'_decodeStr', '_decodeObjid', '_decodeTime',
'_decodeNull', '_decodeInt', '_decodeBool', '_decodeList',
'_encodeComposite', '_encodeStr', '_encodeObjid', '_encodeTime',
'_encodeNull', '_encodeInt', '_encodeBool'
];
function Node(enc, parent) {
var state = {};
this._baseState = state;
state.enc = enc;
state.parent = parent || null;
state.children = null;
// State
state.tag = null;
state.args = null;
state.reverseArgs = null;
state.choice = null;
state.optional = false;
state.any = false;
state.obj = false;
state.use = null;
state.useDecoder = null;
state.key = null;
state['default'] = null;
state.explicit = null;
state.implicit = null;
state.contains = null;
// Should create new instance on each method
if (!state.parent) {
state.children = [];
this._wrap();
}
}
module.exports = Node;
var stateProps = [
'enc', 'parent', 'children', 'tag', 'args', 'reverseArgs', 'choice',
'optional', 'any', 'obj', 'use', 'alteredUse', 'key', 'default', 'explicit',
'implicit', 'contains'
];
Node.prototype.clone = function clone() {
var state = this._baseState;
var cstate = {};
stateProps.forEach(function(prop) {
cstate[prop] = state[prop];
});
var res = new this.constructor(cstate.parent);
res._baseState = cstate;
return res;
};
Node.prototype._wrap = function wrap() {
var state = this._baseState;
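// Each DSL method invoked on this node creates a child clone and delegates
// to it, so chained definition calls build up a tree of nodes.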
methods.forEach(function(method) {
this[method] = function _wrappedMethod() {
var clone = new this.constructor(this);
state.children.push(clone);
return clone[method].apply(clone, arguments);
};
}, this);
};
Node.prototype._init = function init(body) {
var state = this._baseState;
assert(state.parent === null);
body.call(this);
// Filter children
state.children = state.children.filter(function(child) {
return child._baseState.parent === this;
}, this);
assert.equal(state.children.length, 1, 'Root node can have only one child');
};
Node.prototype._useArgs = function useArgs(args) {
var state = this._baseState;
// Filter children and args
var children = args.filter(function(arg) {
return arg instanceof this.constructor;
}, this);
args = args.filter(function(arg) {
return !(arg instanceof this.constructor);
}, this);
if (children.length !== 0) {
assert(state.children === null);
state.children = children;
// Replace parent to maintain backward link
children.forEach(function(child) {
child._baseState.parent = this;
}, this);
}
if (args.length !== 0) {
assert(state.args === null);
state.args = args;
state.reverseArgs = args.map(function(arg) {
if (typeof arg !== 'object' || arg.constructor !== Object)
return arg;
var res = {};
Object.keys(arg).forEach(function(key) {
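// Normalize stringified numeric keys back to integers before building
// the reverse (value -> key) lookup map.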
if (key == (key | 0))
key |= 0;
var value = arg[key];
res[value] = key;
});
return res;
});
}
};
//
// Overrided methods
//
overrided.forEach(function(method) {
Node.prototype[method] = function _overrided() {
var state = this._baseState;
throw new Error(method + ' not implemented for encoding: ' + state.enc);
};
});
//
// Public methods
//
tags.forEach(function(tag) {
Node.prototype[tag] = function _tagMethod() {
var state = this._baseState;
var args = Array.prototype.slice.call(arguments);
assert(state.tag === null);
state.tag = tag;
this._useArgs(args);
return this;
};
});
Node.prototype.use = function use(item) {
assert(item);
var state = this._baseState;
assert(state.use === null);
state.use = item;
return this;
};
Node.prototype.optional = function optional() {
var state = this._baseState;
state.optional = true;
return this;
};
Node.prototype.def = function def(val) {
var state = this._baseState;
assert(state['default'] === null);
state['default'] = val;
state.optional = true;
return this;
};
Node.prototype.explicit = function explicit(num) {
var state = this._baseState;
assert(state.explicit === null && state.implicit === null);
state.explicit = num;
return this;
};
Node.prototype.implicit = function implicit(num) {
var state = this._baseState;
assert(state.explicit === null && state.implicit === null);
state.implicit = num;
return this;
};
Node.prototype.obj = function obj() {
var state = this._baseState;
var args = Array.prototype.slice.call(arguments);
state.obj = true;
if (args.length !== 0)
this._useArgs(args);
return this;
};
Node.prototype.key = function key(newKey) {
var state = this._baseState;
assert(state.key === null);
state.key = newKey;
return this;
};
Node.prototype.any = function any() {
var state = this._baseState;
state.any = true;
return this;
};
Node.prototype.choice = function choice(obj) {
var state = this._baseState;
assert(state.choice === null);
state.choice = obj;
this._useArgs(Object.keys(obj).map(function(key) {
return obj[key];
}));
return this;
};
Node.prototype.contains = function contains(item) {
var state = this._baseState;
assert(state.use === null);
state.contains = item;
return this;
};
//
// Decoding
//
Node.prototype._decode = function decode(input, options) {
var state = this._baseState;
// Decode root node
if (state.parent === null)
return input.wrapResult(state.children[0]._decode(input, options));
var result = state['default'];
var present = true;
var prevKey = null;
if (state.key !== null)
prevKey = input.enterKey(state.key);
// Check if tag is there
if (state.optional) {
var tag = null;
if (state.explicit !== null)
tag = state.explicit;
else if (state.implicit !== null)
tag = state.implicit;
else if (state.tag !== null)
tag = state.tag;
if (tag === null && !state.any) {
// Trial and Error
var save = input.save();
try {
if (state.choice === null)
this._decodeGeneric(state.tag, input, options);
else
this._decodeChoice(input, options);
present = true;
} catch (e) {
present = false;
}
input.restore(save);
} else {
present = this._peekTag(input, tag, state.any);
if (input.isError(present))
return present;
}
}
// Push object on stack
var prevObj;
if (state.obj && present)
prevObj = input.enterObject();
if (present) {
// Unwrap explicit values
if (state.explicit !== null) {
var explicit = this._decodeTag(input, state.explicit);
if (input.isError(explicit))
return explicit;
input = explicit;
}
var start = input.offset;
// Unwrap implicit and normal values
if (state.use === null && state.choice === null) {
if (state.any)
var save = input.save();
var body = this._decodeTag(
input,
state.implicit !== null ? state.implicit : state.tag,
state.any
);
if (input.isError(body))
return body;
if (state.any)
result = input.raw(save);
else
input = body;
}
if (options && options.track && state.tag !== null)
options.track(input.path(), start, input.length, 'tagged');
if (options && options.track && state.tag !== null)
options.track(input.path(), input.offset, input.length, 'content');
// Select proper method for tag
if (state.any)
result = result;
else if (state.choice === null)
result = this._decodeGeneric(state.tag, input, options);
else
result = this._decodeChoice(input, options);
if (input.isError(result))
return result;
// Decode children
if (!state.any && state.choice === null && state.children !== null) {
state.children.forEach(function decodeChildren(child) {
// NOTE: We are ignoring errors here, to let parser continue with other
// parts of encoded data
child._decode(input, options);
});
}
// Decode contained/encoded by schema, only in bit or octet strings
if (state.contains && (state.tag === 'octstr' || state.tag === 'bitstr')) {
var data = new DecoderBuffer(result);
result = this._getUse(state.contains, input._reporterState.obj)
._decode(data, options);
}
}
// Pop object
if (state.obj && present)
result = input.leaveObject(prevObj);
// Set key
if (state.key !== null && (result !== null || present === true))
input.leaveKey(prevKey, state.key, result);
else if (prevKey !== null)
input.exitKey(prevKey);
return result;
};
Node.prototype._decodeGeneric = function decodeGeneric(tag, input, options) {
var state = this._baseState;
if (tag === 'seq' || tag === 'set')
return null;
if (tag === 'seqof' || tag === 'setof')
return this._decodeList(input, tag, state.args[0], options);
else if (/str$/.test(tag))
return this._decodeStr(input, tag, options);
else if (tag === 'objid' && state.args)
return this._decodeObjid(input, state.args[0], state.args[1], options);
else if (tag === 'objid')
return this._decodeObjid(input, null, null, options);
else if (tag === 'gentime' || tag === 'utctime')
return this._decodeTime(input, tag, options);
else if (tag === 'null_')
return this._decodeNull(input, options);
else if (tag === 'bool')
return this._decodeBool(input, options);
else if (tag === 'objDesc')
return this._decodeStr(input, tag, options);
else if (tag === 'int' || tag === 'enum')
return this._decodeInt(input, state.args && state.args[0], options);
if (state.use !== null) {
return this._getUse(state.use, input._reporterState.obj)
._decode(input, options);
} else {
return input.error('unknown tag: ' + tag);
}
};
Node.prototype._getUse = function _getUse(entity, obj) {
var state = this._baseState;
// Create altered use decoder if implicit is set
state.useDecoder = this._use(entity, obj);
assert(state.useDecoder._baseState.parent === null);
state.useDecoder = state.useDecoder._baseState.children[0];
if (state.implicit !== state.useDecoder._baseState.implicit) {
state.useDecoder = state.useDecoder.clone();
state.useDecoder._baseState.implicit = state.implicit;
}
return state.useDecoder;
};
Node.prototype._decodeChoice = function decodeChoice(input, options) {
var state = this._baseState;
var result = null;
var match = false;
Object.keys(state.choice).some(function(key) {
var save = input.save();
var node = state.choice[key];
try {
var value = node._decode(input, options);
if (input.isError(value))
return false;
result = { type: key, value: value };
match = true;
} catch (e) {
input.restore(save);
return false;
}
return true;
}, this);
if (!match)
return input.error('Choice not matched');
return result;
};
//
// Encoding
//
Node.prototype._createEncoderBuffer = function createEncoderBuffer(data) {
return new EncoderBuffer(data, this.reporter);
};
Node.prototype._encode = function encode(data, reporter, parent) {
var state = this._baseState;
if (state['default'] !== null && state['default'] === data)
return;
var result = this._encodeValue(data, reporter, parent);
if (result === undefined)
return;
if (this._skipDefault(result, reporter, parent))
return;
return result;
};
Node.prototype._encodeValue = function encode(data, reporter, parent) {
var state = this._baseState;
// Decode root node
if (state.parent === null)
return state.children[0]._encode(data, reporter || new Reporter());
var result = null;
// Set reporter to share it with a child class
this.reporter = reporter;
// Check if data is there
if (state.optional && data === undefined) {
if (state['default'] !== null)
data = state['default']
else
return;
}
// Encode children first
var content = null;
var primitive = false;
if (state.any) {
// Anything that was given is translated to buffer
result = this._createEncoderBuffer(data);
} else if (state.choice) {
result = this._encodeChoice(data, reporter);
} else if (state.contains) {
content = this._getUse(state.contains, parent)._encode(data, reporter);
primitive = true;
} else if (state.children) {
content = state.children.map(function(child) {
if (child._baseState.tag === 'null_')
return child._encode(null, reporter, data);
if (child._baseState.key === null)
return reporter.error('Child should have a key');
var prevKey = reporter.enterKey(child._baseState.key);
if (typeof data !== 'object')
return reporter.error('Child expected, but input is not object');
var res = child._encode(data[child._baseState.key], reporter, data);
reporter.leaveKey(prevKey);
return res;
}, this).filter(function(child) {
return child;
});
content = this._createEncoderBuffer(content);
} else {
if (state.tag === 'seqof' || state.tag === 'setof') {
// TODO(indutny): this should be thrown on DSL level
if (!(state.args && state.args.length === 1))
return reporter.error('Too many args for : ' + state.tag);
if (!Array.isArray(data))
return reporter.error('seqof/setof, but data is not Array');
var child = this.clone();
child._baseState.implicit = null;
content = this._createEncoderBuffer(data.map(function(item) {
var state = this._baseState;
return this._getUse(state.args[0], data)._encode(item, reporter);
}, child));
} else if (state.use !== null) {
result = this._getUse(state.use, parent)._encode(data, reporter);
} else {
content = this._encodePrimitive(state.tag, data);
primitive = true;
}
}
// Encode data itself
var result;
if (!state.any && state.choice === null) {
var tag = state.implicit !== null ? state.implicit : state.tag;
var cls = state.implicit === null ? 'universal' : 'context';
if (tag === null) {
if (state.use === null)
reporter.error('Tag could be omitted only for .use()');
} else {
if (state.use === null)
result = this._encodeComposite(tag, primitive, cls, content);
}
}
// Wrap in explicit
if (state.explicit !== null)
result = this._encodeComposite(state.explicit, false, 'context', result);
return result;
};
Node.prototype._encodeChoice = function encodeChoice(data, reporter) {
var state = this._baseState;
var node = state.choice[data.type];
if (!node) {
assert(
false,
data.type + ' not found in ' +
JSON.stringify(Object.keys(state.choice)));
}
return node._encode(data.value, reporter);
};
Node.prototype._encodePrimitive = function encodePrimitive(tag, data) {
var state = this._baseState;
if (/str$/.test(tag))
return this._encodeStr(data, tag);
else if (tag === 'objid' && state.args)
return this._encodeObjid(data, state.reverseArgs[0], state.args[1]);
else if (tag === 'objid')
return this._encodeObjid(data, null, null);
else if (tag === 'gentime' || tag === 'utctime')
return this._encodeTime(data, tag);
else if (tag === 'null_')
return this._encodeNull();
else if (tag === 'int' || tag === 'enum')
return this._encodeInt(data, state.args && state.reverseArgs[0]);
else if (tag === 'bool')
return this._encodeBool(data);
else if (tag === 'objDesc')
return this._encodeStr(data, tag);
else
throw new Error('Unsupported tag: ' + tag);
};
Node.prototype._isNumstr = function isNumstr(str) {
return /^[0-9 ]*$/.test(str);
};
Node.prototype._isPrintstr = function isPrintstr(str) {
return /^[A-Za-z0-9 '\(\)\+,\-\.\/:=\?]*$/.test(str);
};

121
node_modules/asn1.js/lib/asn1/base/reporter.js generated vendored Normal file
View file

@ -0,0 +1,121 @@
var inherits = require('inherits');
function Reporter(options) {
this._reporterState = {
obj: null,
path: [],
options: options || {},
errors: []
};
}
exports.Reporter = Reporter;
Reporter.prototype.isError = function isError(obj) {
return obj instanceof ReporterError;
};
Reporter.prototype.save = function save() {
var state = this._reporterState;
return { obj: state.obj, pathLen: state.path.length };
};
Reporter.prototype.restore = function restore(data) {
var state = this._reporterState;
state.obj = data.obj;
state.path = state.path.slice(0, data.pathLen);
};
Reporter.prototype.enterKey = function enterKey(key) {
return this._reporterState.path.push(key);
};
Reporter.prototype.exitKey = function exitKey(index) {
var state = this._reporterState;
state.path = state.path.slice(0, index - 1);
};
Reporter.prototype.leaveKey = function leaveKey(index, key, value) {
var state = this._reporterState;
this.exitKey(index);
if (state.obj !== null)
state.obj[key] = value;
};
Reporter.prototype.path = function path() {
return this._reporterState.path.join('/');
};
Reporter.prototype.enterObject = function enterObject() {
var state = this._reporterState;
var prev = state.obj;
state.obj = {};
return prev;
};
Reporter.prototype.leaveObject = function leaveObject(prev) {
var state = this._reporterState;
var now = state.obj;
state.obj = prev;
return now;
};
Reporter.prototype.error = function error(msg) {
var err;
var state = this._reporterState;
var inherited = msg instanceof ReporterError;
if (inherited) {
err = msg;
} else {
err = new ReporterError(state.path.map(function(elem) {
return '[' + JSON.stringify(elem) + ']';
}).join(''), msg.message || msg, msg.stack);
}
if (!state.options.partial)
throw err;
if (!inherited)
state.errors.push(err);
return err;
};
Reporter.prototype.wrapResult = function wrapResult(result) {
var state = this._reporterState;
if (!state.options.partial)
return result;
return {
result: this.isError(result) ? null : result,
errors: state.errors
};
};
function ReporterError(path, msg) {
this.path = path;
this.rethrow(msg);
};
inherits(ReporterError, Error);
ReporterError.prototype.rethrow = function rethrow(msg) {
this.message = msg + ' at: ' + (this.path || '(shallow)');
if (Error.captureStackTrace)
Error.captureStackTrace(this, ReporterError);
if (!this.stack) {
try {
// IE only adds stack when thrown
throw new Error(this.message);
} catch (e) {
this.stack = e.stack;
}
}
return this;
};

42
node_modules/asn1.js/lib/asn1/constants/der.js generated vendored Normal file
View file

@ -0,0 +1,42 @@
var constants = require('../constants');
exports.tagClass = {
0: 'universal',
1: 'application',
2: 'context',
3: 'private'
};
exports.tagClassByName = constants._reverse(exports.tagClass);
exports.tag = {
0x00: 'end',
0x01: 'bool',
0x02: 'int',
0x03: 'bitstr',
0x04: 'octstr',
0x05: 'null_',
0x06: 'objid',
0x07: 'objDesc',
0x08: 'external',
0x09: 'real',
0x0a: 'enum',
0x0b: 'embed',
0x0c: 'utf8str',
0x0d: 'relativeOid',
0x10: 'seq',
0x11: 'set',
0x12: 'numstr',
0x13: 'printstr',
0x14: 't61str',
0x15: 'videostr',
0x16: 'ia5str',
0x17: 'utctime',
0x18: 'gentime',
0x19: 'graphstr',
0x1a: 'iso646str',
0x1b: 'genstr',
0x1c: 'unistr',
0x1d: 'charstr',
0x1e: 'bmpstr'
};
exports.tagByName = constants._reverse(exports.tag);

19
node_modules/asn1.js/lib/asn1/constants/index.js generated vendored Normal file
View file

@ -0,0 +1,19 @@
var constants = exports;
// Helper
constants._reverse = function reverse(map) {
var res = {};
Object.keys(map).forEach(function(key) {
// Convert key to integer if it is stringified
if ((key | 0) == key)
key = key | 0;
var value = map[key];
res[value] = key;
});
return res;
};
constants.der = require('./der');

324
node_modules/asn1.js/lib/asn1/decoders/der.js generated vendored Normal file
View file

@ -0,0 +1,324 @@
var inherits = require('inherits');
var asn1 = require('../../asn1');
var base = asn1.base;
var bignum = asn1.bignum;
// Import DER constants
var der = asn1.constants.der;
function DERDecoder(entity) {
this.enc = 'der';
this.name = entity.name;
this.entity = entity;
// Construct base tree
this.tree = new DERNode();
this.tree._init(entity.body);
};
module.exports = DERDecoder;
DERDecoder.prototype.decode = function decode(data, options) {
if (!(data instanceof base.DecoderBuffer))
data = new base.DecoderBuffer(data, options);
return this.tree._decode(data, options);
};
// Tree methods
function DERNode(parent) {
base.Node.call(this, 'der', parent);
}
inherits(DERNode, base.Node);
DERNode.prototype._peekTag = function peekTag(buffer, tag, any) {
if (buffer.isEmpty())
return false;
var state = buffer.save();
var decodedTag = derDecodeTag(buffer, 'Failed to peek tag: "' + tag + '"');
if (buffer.isError(decodedTag))
return decodedTag;
buffer.restore(state);
return decodedTag.tag === tag || decodedTag.tagStr === tag ||
(decodedTag.tagStr + 'of') === tag || any;
};
DERNode.prototype._decodeTag = function decodeTag(buffer, tag, any) {
var decodedTag = derDecodeTag(buffer,
'Failed to decode tag of "' + tag + '"');
if (buffer.isError(decodedTag))
return decodedTag;
var len = derDecodeLen(buffer,
decodedTag.primitive,
'Failed to get length of "' + tag + '"');
// Failure
if (buffer.isError(len))
return len;
if (!any &&
decodedTag.tag !== tag &&
decodedTag.tagStr !== tag &&
decodedTag.tagStr + 'of' !== tag) {
return buffer.error('Failed to match tag: "' + tag + '"');
}
if (decodedTag.primitive || len !== null)
return buffer.skip(len, 'Failed to match body of: "' + tag + '"');
// Indefinite length... find END tag
var state = buffer.save();
var res = this._skipUntilEnd(
buffer,
'Failed to skip indefinite length body: "' + this.tag + '"');
if (buffer.isError(res))
return res;
len = buffer.offset - state.offset;
buffer.restore(state);
return buffer.skip(len, 'Failed to match body of: "' + tag + '"');
};
DERNode.prototype._skipUntilEnd = function skipUntilEnd(buffer, fail) {
while (true) {
var tag = derDecodeTag(buffer, fail);
if (buffer.isError(tag))
return tag;
var len = derDecodeLen(buffer, tag.primitive, fail);
if (buffer.isError(len))
return len;
var res;
if (tag.primitive || len !== null)
res = buffer.skip(len)
else
res = this._skipUntilEnd(buffer, fail);
// Failure
if (buffer.isError(res))
return res;
if (tag.tagStr === 'end')
break;
}
};
DERNode.prototype._decodeList = function decodeList(buffer, tag, decoder,
options) {
var result = [];
while (!buffer.isEmpty()) {
var possibleEnd = this._peekTag(buffer, 'end');
if (buffer.isError(possibleEnd))
return possibleEnd;
var res = decoder.decode(buffer, 'der', options);
if (buffer.isError(res) && possibleEnd)
break;
result.push(res);
}
return result;
};
DERNode.prototype._decodeStr = function decodeStr(buffer, tag) {
if (tag === 'bitstr') {
var unused = buffer.readUInt8();
if (buffer.isError(unused))
return unused;
return { unused: unused, data: buffer.raw() };
} else if (tag === 'bmpstr') {
var raw = buffer.raw();
if (raw.length % 2 === 1)
return buffer.error('Decoding of string type: bmpstr length mismatch');
var str = '';
for (var i = 0; i < raw.length / 2; i++) {
str += String.fromCharCode(raw.readUInt16BE(i * 2));
}
return str;
} else if (tag === 'numstr') {
var numstr = buffer.raw().toString('ascii');
if (!this._isNumstr(numstr)) {
return buffer.error('Decoding of string type: ' +
'numstr unsupported characters');
}
return numstr;
} else if (tag === 'octstr') {
return buffer.raw();
} else if (tag === 'objDesc') {
return buffer.raw();
} else if (tag === 'printstr') {
var printstr = buffer.raw().toString('ascii');
if (!this._isPrintstr(printstr)) {
return buffer.error('Decoding of string type: ' +
'printstr unsupported characters');
}
return printstr;
} else if (/str$/.test(tag)) {
return buffer.raw().toString();
} else {
return buffer.error('Decoding of string type: ' + tag + ' unsupported');
}
};
DERNode.prototype._decodeObjid = function decodeObjid(buffer, values, relative) {
var result;
var identifiers = [];
var ident = 0;
while (!buffer.isEmpty()) {
var subident = buffer.readUInt8();
ident <<= 7;
ident |= subident & 0x7f;
if ((subident & 0x80) === 0) {
identifiers.push(ident);
ident = 0;
}
}
if (subident & 0x80)
identifiers.push(ident);
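// Per X.690, the first subidentifier packs the first two OID arcs
// as first * 40 + second.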
var first = (identifiers[0] / 40) | 0;
var second = identifiers[0] % 40;
if (relative)
result = identifiers;
else
result = [first, second].concat(identifiers.slice(1));
if (values) {
var tmp = values[result.join(' ')];
if (tmp === undefined)
tmp = values[result.join('.')];
if (tmp !== undefined)
result = tmp;
}
return result;
};
DERNode.prototype._decodeTime = function decodeTime(buffer, tag) {
var str = buffer.raw().toString();
if (tag === 'gentime') {
var year = str.slice(0, 4) | 0;
var mon = str.slice(4, 6) | 0;
var day = str.slice(6, 8) | 0;
var hour = str.slice(8, 10) | 0;
var min = str.slice(10, 12) | 0;
var sec = str.slice(12, 14) | 0;
} else if (tag === 'utctime') {
var year = str.slice(0, 2) | 0;
var mon = str.slice(2, 4) | 0;
var day = str.slice(4, 6) | 0;
var hour = str.slice(6, 8) | 0;
var min = str.slice(8, 10) | 0;
var sec = str.slice(10, 12) | 0;
if (year < 70)
year = 2000 + year;
else
year = 1900 + year;
} else {
return buffer.error('Decoding ' + tag + ' time is not supported yet');
}
return Date.UTC(year, mon - 1, day, hour, min, sec, 0);
};
DERNode.prototype._decodeNull = function decodeNull(buffer) {
return null;
};
DERNode.prototype._decodeBool = function decodeBool(buffer) {
var res = buffer.readUInt8();
if (buffer.isError(res))
return res;
else
return res !== 0;
};
DERNode.prototype._decodeInt = function decodeInt(buffer, values) {
// Bigint, return as it is (assume big endian)
var raw = buffer.raw();
var res = new bignum(raw);
if (values)
res = values[res.toString(10)] || res;
return res;
};
DERNode.prototype._use = function use(entity, obj) {
if (typeof entity === 'function')
entity = entity(obj);
return entity._getDecoder('der').tree;
};
// Utility methods
function derDecodeTag(buf, fail) {
var tag = buf.readUInt8(fail);
if (buf.isError(tag))
return tag;
var cls = der.tagClass[tag >> 6];
var primitive = (tag & 0x20) === 0;
// Multi-octet tag - load
if ((tag & 0x1f) === 0x1f) {
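// High-tag-number form: each subsequent octet contributes 7 tag bits;
// the MSB marks continuation.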
var oct = tag;
tag = 0;
while ((oct & 0x80) === 0x80) {
oct = buf.readUInt8(fail);
if (buf.isError(oct))
return oct;
tag <<= 7;
tag |= oct & 0x7f;
}
} else {
tag &= 0x1f;
}
var tagStr = der.tag[tag];
return {
cls: cls,
primitive: primitive,
tag: tag,
tagStr: tagStr
};
}
function derDecodeLen(buf, primitive, fail) {
var len = buf.readUInt8(fail);
if (buf.isError(len))
return len;
// Indefinite form
if (!primitive && len === 0x80)
return null;
// Definite form
if ((len & 0x80) === 0) {
// Short form
return len;
}
// Long form
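// The low 7 bits of the first octet give the number of length octets that follow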
var num = len & 0x7f;
if (num > 4)
return buf.error('length octect is too long');
len = 0;
for (var i = 0; i < num; i++) {
len <<= 8;
var j = buf.readUInt8(fail);
if (buf.isError(j))
return j;
len |= j;
}
return len;
}

4
node_modules/asn1.js/lib/asn1/decoders/index.js generated vendored Normal file
View file

@ -0,0 +1,4 @@
var decoders = exports;
decoders.der = require('./der');
decoders.pem = require('./pem');

49
node_modules/asn1.js/lib/asn1/decoders/pem.js generated vendored Normal file
View file

@ -0,0 +1,49 @@
var inherits = require('inherits');
var Buffer = require('buffer').Buffer;
var DERDecoder = require('./der');
function PEMDecoder(entity) {
DERDecoder.call(this, entity);
this.enc = 'pem';
};
inherits(PEMDecoder, DERDecoder);
module.exports = PEMDecoder;
PEMDecoder.prototype.decode = function decode(data, options) {
var lines = data.toString().split(/[\r\n]+/g);
var label = options.label.toUpperCase();
var re = /^-----(BEGIN|END) ([^-]+)-----$/;
var start = -1;
var end = -1;
for (var i = 0; i < lines.length; i++) {
var match = lines[i].match(re);
if (match === null)
continue;
if (match[2] !== label)
continue;
if (start === -1) {
if (match[1] !== 'BEGIN')
break;
start = i;
} else {
if (match[1] !== 'END')
break;
end = i;
break;
}
}
if (start === -1 || end === -1)
throw new Error('PEM section not found for: ' + label);
var base64 = lines.slice(start + 1, end).join('');
// Remove excessive symbols
base64 = base64.replace(/[^a-z0-9\+\/=]+/gi, '');
var input = new Buffer(base64, 'base64');
return DERDecoder.prototype.decode.call(this, input, options);
};

295
node_modules/asn1.js/lib/asn1/encoders/der.js generated vendored Normal file
View file

@ -0,0 +1,295 @@
var inherits = require('inherits');
var Buffer = require('buffer').Buffer;
var asn1 = require('../../asn1');
var base = asn1.base;
// Import DER constants
var der = asn1.constants.der;
function DEREncoder(entity) {
this.enc = 'der';
this.name = entity.name;
this.entity = entity;
// Construct base tree
this.tree = new DERNode();
this.tree._init(entity.body);
};
module.exports = DEREncoder;
DEREncoder.prototype.encode = function encode(data, reporter) {
return this.tree._encode(data, reporter).join();
};
// Tree methods
function DERNode(parent) {
base.Node.call(this, 'der', parent);
}
inherits(DERNode, base.Node);
DERNode.prototype._encodeComposite = function encodeComposite(tag,
primitive,
cls,
content) {
var encodedTag = encodeTag(tag, primitive, cls, this.reporter);
// Short form
if (content.length < 0x80) {
var header = new Buffer(2);
header[0] = encodedTag;
header[1] = content.length;
return this._createEncoderBuffer([ header, content ]);
}
// Long form
// Count octets required to store length
var lenOctets = 1;
for (var i = content.length; i >= 0x100; i >>= 8)
lenOctets++;
var header = new Buffer(1 + 1 + lenOctets);
header[0] = encodedTag;
header[1] = 0x80 | lenOctets;
for (var i = 1 + lenOctets, j = content.length; j > 0; i--, j >>= 8)
header[i] = j & 0xff;
return this._createEncoderBuffer([ header, content ]);
};
DERNode.prototype._encodeStr = function encodeStr(str, tag) {
if (tag === 'bitstr') {
return this._createEncoderBuffer([ str.unused | 0, str.data ]);
} else if (tag === 'bmpstr') {
var buf = new Buffer(str.length * 2);
for (var i = 0; i < str.length; i++) {
buf.writeUInt16BE(str.charCodeAt(i), i * 2);
}
return this._createEncoderBuffer(buf);
} else if (tag === 'numstr') {
if (!this._isNumstr(str)) {
return this.reporter.error('Encoding of string type: numstr supports ' +
'only digits and space');
}
return this._createEncoderBuffer(str);
} else if (tag === 'printstr') {
if (!this._isPrintstr(str)) {
return this.reporter.error('Encoding of string type: printstr supports ' +
'only latin upper and lower case letters, ' +
'digits, space, apostrophe, left and right ' +
'parenthesis, plus sign, comma, hyphen, ' +
'dot, slash, colon, equal sign, ' +
'question mark');
}
return this._createEncoderBuffer(str);
} else if (/str$/.test(tag)) {
return this._createEncoderBuffer(str);
} else if (tag === 'objDesc') {
return this._createEncoderBuffer(str);
} else {
return this.reporter.error('Encoding of string type: ' + tag +
' unsupported');
}
};
DERNode.prototype._encodeObjid = function encodeObjid(id, values, relative) {
if (typeof id === 'string') {
if (!values)
return this.reporter.error('string objid given, but no values map found');
if (!values.hasOwnProperty(id))
return this.reporter.error('objid not found in values map');
id = values[id].split(/[\s\.]+/g);
for (var i = 0; i < id.length; i++)
id[i] |= 0;
} else if (Array.isArray(id)) {
id = id.slice();
for (var i = 0; i < id.length; i++)
id[i] |= 0;
}
if (!Array.isArray(id)) {
return this.reporter.error('objid() should be either array or string, ' +
'got: ' + JSON.stringify(id));
}
if (!relative) {
if (id[1] >= 40)
return this.reporter.error('Second objid identifier OOB');
id.splice(0, 2, id[0] * 40 + id[1]);
}
// Count number of octets
var size = 0;
for (var i = 0; i < id.length; i++) {
var ident = id[i];
for (size++; ident >= 0x80; ident >>= 7)
size++;
}
var objid = new Buffer(size);
var offset = objid.length - 1;
for (var i = id.length - 1; i >= 0; i--) {
var ident = id[i];
objid[offset--] = ident & 0x7f;
while ((ident >>= 7) > 0)
objid[offset--] = 0x80 | (ident & 0x7f);
}
return this._createEncoderBuffer(objid);
};
function two(num) {
if (num < 10)
return '0' + num;
else
return num;
}
DERNode.prototype._encodeTime = function encodeTime(time, tag) {
var str;
var date = new Date(time);
if (tag === 'gentime') {
str = [
two(date.getFullYear()),
two(date.getUTCMonth() + 1),
two(date.getUTCDate()),
two(date.getUTCHours()),
two(date.getUTCMinutes()),
two(date.getUTCSeconds()),
'Z'
].join('');
} else if (tag === 'utctime') {
str = [
two(date.getFullYear() % 100),
two(date.getUTCMonth() + 1),
two(date.getUTCDate()),
two(date.getUTCHours()),
two(date.getUTCMinutes()),
two(date.getUTCSeconds()),
'Z'
].join('');
} else {
this.reporter.error('Encoding ' + tag + ' time is not supported yet');
}
return this._encodeStr(str, 'octstr');
};
DERNode.prototype._encodeNull = function encodeNull() {
return this._createEncoderBuffer('');
};
DERNode.prototype._encodeInt = function encodeInt(num, values) {
if (typeof num === 'string') {
if (!values)
return this.reporter.error('String int or enum given, but no values map');
if (!values.hasOwnProperty(num)) {
return this.reporter.error('Values map doesn\'t contain: ' +
JSON.stringify(num));
}
num = values[num];
}
// Bignum, assume big endian
if (typeof num !== 'number' && !Buffer.isBuffer(num)) {
var numArray = num.toArray();
if (!num.sign && numArray[0] & 0x80) {
numArray.unshift(0);
}
num = new Buffer(numArray);
}
if (Buffer.isBuffer(num)) {
var size = num.length;
if (num.length === 0)
size++;
var out = new Buffer(size);
num.copy(out);
if (num.length === 0)
out[0] = 0
return this._createEncoderBuffer(out);
}
if (num < 0x80)
return this._createEncoderBuffer(num);
if (num < 0x100)
return this._createEncoderBuffer([0, num]);
var size = 1;
for (var i = num; i >= 0x100; i >>= 8)
size++;
var out = new Array(size);
for (var i = out.length - 1; i >= 0; i--) {
out[i] = num & 0xff;
num >>= 8;
}
if(out[0] & 0x80) {
out.unshift(0);
}
return this._createEncoderBuffer(new Buffer(out));
};
DERNode.prototype._encodeBool = function encodeBool(value) {
return this._createEncoderBuffer(value ? 0xff : 0);
};
DERNode.prototype._use = function use(entity, obj) {
if (typeof entity === 'function')
entity = entity(obj);
return entity._getEncoder('der').tree;
};
DERNode.prototype._skipDefault = function skipDefault(dataBuffer, reporter, parent) {
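// DER requires values equal to their DEFAULT to be omitted from the encoding:
// report a match when the encoded data equals the encoded default so the
// caller can drop it.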
var state = this._baseState;
var i;
if (state['default'] === null)
return false;
var data = dataBuffer.join();
if (state.defaultBuffer === undefined)
state.defaultBuffer = this._encodeValue(state['default'], reporter, parent).join();
if (data.length !== state.defaultBuffer.length)
return false;
for (i=0; i < data.length; i++)
if (data[i] !== state.defaultBuffer[i])
return false;
return true;
};
// Utility methods
function encodeTag(tag, primitive, cls, reporter) {
var res;
if (tag === 'seqof')
tag = 'seq';
else if (tag === 'setof')
tag = 'set';
if (der.tagByName.hasOwnProperty(tag))
res = der.tagByName[tag];
else if (typeof tag === 'number' && (tag | 0) === tag)
res = tag;
else
return reporter.error('Unknown tag: ' + tag);
if (res >= 0x1f)
return reporter.error('Multi-octet tag encoding unsupported');
if (!primitive)
res |= 0x20;
res |= (der.tagClassByName[cls || 'universal'] << 6);
return res;
}

4
node_modules/asn1.js/lib/asn1/encoders/index.js generated vendored Normal file
View file

@ -0,0 +1,4 @@
var encoders = exports;
encoders.der = require('./der');
encoders.pem = require('./pem');

21
node_modules/asn1.js/lib/asn1/encoders/pem.js generated vendored Normal file
View file

@ -0,0 +1,21 @@
var inherits = require('inherits');
var DEREncoder = require('./der');
function PEMEncoder(entity) {
DEREncoder.call(this, entity);
this.enc = 'pem';
};
inherits(PEMEncoder, DEREncoder);
module.exports = PEMEncoder;
PEMEncoder.prototype.encode = function encode(data, options) {
var buf = DEREncoder.prototype.encode.call(this, data);
var p = buf.toString('base64');
var out = [ '-----BEGIN ' + options.label + '-----' ];
for (var i = 0; i < p.length; i += 64)
out.push(p.slice(i, i + 64));
out.push('-----END ' + options.label + '-----');
return out.join('\n');
};

59
node_modules/asn1.js/package.json generated vendored Normal file
View file

@ -0,0 +1,59 @@
{
"_from": "asn1.js@^4.8.0",
"_id": "asn1.js@4.10.1",
"_inBundle": false,
"_integrity": "sha512-p32cOF5q0Zqs9uBiONKYLm6BClCoBCM5O9JfeUSlnQLBTxYdTK+pW+nXflm8UkKd2UYlEbYz5qEi0JuZR9ckSw==",
"_location": "/asn1.js",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "asn1.js@^4.8.0",
"name": "asn1.js",
"escapedName": "asn1.js",
"rawSpec": "^4.8.0",
"saveSpec": null,
"fetchSpec": "^4.8.0"
},
"_requiredBy": [
"/asn1.js-rfc5280",
"/ocsp"
],
"_resolved": "https://registry.npmjs.org/asn1.js/-/asn1.js-4.10.1.tgz",
"_shasum": "b9c2bf5805f1e64aadeed6df3a2bfafb5a73f5a0",
"_spec": "asn1.js@^4.8.0",
"_where": "/home/ubuntu/OCSP/node_modules/ocsp",
"author": {
"name": "Fedor Indutny"
},
"bugs": {
"url": "https://github.com/indutny/asn1.js/issues"
},
"bundleDependencies": false,
"dependencies": {
"bn.js": "^4.0.0",
"inherits": "^2.0.1",
"minimalistic-assert": "^1.0.0"
},
"deprecated": false,
"description": "ASN.1 encoder and decoder",
"devDependencies": {
"mocha": "^2.3.4"
},
"homepage": "https://github.com/indutny/asn1.js",
"keywords": [
"asn.1",
"der"
],
"license": "MIT",
"main": "lib/asn1.js",
"name": "asn1.js",
"repository": {
"type": "git",
"url": "git+ssh://git@github.com/indutny/asn1.js.git"
},
"scripts": {
"test": "mocha --reporter spec test/*-test.js && cd rfc/2560 && npm i && npm test && cd ../../rfc/5280 && npm i && npm test"
},
"version": "4.10.1"
}

125
node_modules/async/CHANGELOG.md generated vendored Normal file
View file

@ -0,0 +1,125 @@
# v1.5.2
- Allow using `"consructor"` as an argument in `memoize` (#998)
- Give a better error messsage when `auto` dependency checking fails (#994)
- Various doc updates (#936, #956, #979, #1002)
# v1.5.1
- Fix issue with `pause` in `queue` with concurrency enabled (#946)
- `while` and `until` now pass the final result to callback (#963)
- `auto` will properly handle concurrency when there is no callback (#966)
- `auto` will now properly stop execution when an error occurs (#988, #993)
- Various doc fixes (#971, #980)
# v1.5.0
- Added `transform`, analogous to [`_.transform`](http://lodash.com/docs#transform) (#892)
- `map` now returns an object when an object is passed in, rather than array with non-numeric keys. `map` will begin always returning an array with numeric indexes in the next major release. (#873)
- `auto` now accepts an optional `concurrency` argument to limit the number of running tasks (#637)
- Added `queue#workersList()`, to retrieve the list of currently running tasks. (#891)
- Various code simplifications (#896, #904)
- Various doc fixes :scroll: (#890, #894, #903, #905, #912)
# v1.4.2
- Ensure coverage files don't get published on npm (#879)
# v1.4.1
- Add in overlooked `detectLimit` method (#866)
- Removed unnecessary files from npm releases (#861)
- Removed usage of a reserved word to prevent :boom: in older environments (#870)
# v1.4.0
- `asyncify` now supports promises (#840)
- Added `Limit` versions of `filter` and `reject` (#836)
- Add `Limit` versions of `detect`, `some` and `every` (#828, #829)
- `some`, `every` and `detect` now short circuit early (#828, #829)
- Improve detection of the global object (#804), enabling use in WebWorkers
- `whilst` now called with arguments from iterator (#823)
- `during` now gets called with arguments from iterator (#824)
- Code simplifications and optimizations aplenty ([diff](https://github.com/caolan/async/compare/v1.3.0...v1.4.0))
# v1.3.0
New Features:
- Added `constant`
- Added `asyncify`/`wrapSync` for making sync functions work with callbacks. (#671, #806)
- Added `during` and `doDuring`, which are like `whilst` with an async truth test. (#800)
- `retry` now accepts an `interval` parameter to specify a delay between retries. (#793)
- `async` should work better in Web Workers due to better `root` detection (#804)
- Callbacks are now optional in `whilst`, `doWhilst`, `until`, and `doUntil` (#642)
- Various internal updates (#786, #801, #802, #803)
- Various doc fixes (#790, #794)
Bug Fixes:
- `cargo` now exposes the `payload` size, and `cargo.payload` can be changed on the fly after the `cargo` is created. (#740, #744, #783)
# v1.2.1
Bug Fix:
- Small regression with synchronous iterator behavior in `eachSeries` with a 1-element array. Before 1.1.0, `eachSeries`'s callback was called on the same tick, which this patch restores. In 2.0.0, it will be called on the next tick. (#782)
# v1.2.0
New Features:
- Added `timesLimit` (#743)
- `concurrency` can be changed after initialization in `queue` by setting `q.concurrency`. The new concurrency will be reflected the next time a task is processed. (#747, #772)
Bug Fixes:
- Fixed a regression in `each` and family with empty arrays that have additional properties. (#775, #777)
# v1.1.1
Bug Fix:
- Small regression with synchronous iterator behavior in `eachSeries` with a 1-element array. Before 1.1.0, `eachSeries`'s callback was called on the same tick, which this patch restores. In 2.0.0, it will be called on the next tick. (#782)
# v1.1.0
New Features:
- `cargo` now supports all of the same methods and event callbacks as `queue`.
- Added `ensureAsync` - A wrapper that ensures an async function calls its callback on a later tick. (#769)
- Optimized `map`, `eachOf`, and `waterfall` families of functions
- Passing a `null` or `undefined` array to `map`, `each`, `parallel` and families will be treated as an empty array (#667).
- The callback is now optional for the composed results of `compose` and `seq`. (#618)
- Reduced file size by 4kb, (minified version by 1kb)
- Added code coverage through `nyc` and `coveralls` (#768)
Bug Fixes:
- `forever` will no longer stack overflow with a synchronous iterator (#622)
- `eachLimit` and other limit functions will stop iterating once an error occurs (#754)
- Always pass `null` in callbacks when there is no error (#439)
- Ensure proper conditions when calling `drain()` after pushing an empty data set to a queue (#668)
- `each` and family will properly handle an empty array (#578)
- `eachSeries` and family will finish if the underlying array is modified during execution (#557)
- `queue` will throw if a non-function is passed to `q.push()` (#593)
- Doc fixes (#629, #766)
# v1.0.0
No known breaking changes, we are simply complying with semver from here on out.
Changes:
- Start using a changelog!
- Add `forEachOf` for iterating over Objects (or to iterate Arrays with indexes available) (#168 #704 #321)
- Detect deadlocks in `auto` (#663)
- Better support for require.js (#527)
- Throw if queue created with concurrency `0` (#714)
- Fix unneeded iteration in `queue.resume()` (#758)
- Guard against timer mocking overriding `setImmediate` (#609 #611)
- Miscellaneous doc fixes (#542 #596 #615 #628 #631 #690 #729)
- Use single noop function internally (#546)
- Optimize internal `_each`, `_map` and `_keys` functions.

19
node_modules/async/LICENSE generated vendored Normal file
View file

@ -0,0 +1,19 @@
Copyright (c) 2010-2014 Caolan McMahon
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

1877
node_modules/async/README.md generated vendored Normal file

File diff suppressed because it is too large

1265
node_modules/async/dist/async.js generated vendored Normal file

File diff suppressed because it is too large

2
node_modules/async/dist/async.min.js generated vendored Normal file

File diff suppressed because one or more lines are too long

1265
node_modules/async/lib/async.js generated vendored Normal file

File diff suppressed because it is too large

113
node_modules/async/package.json generated vendored Normal file
View file

@ -0,0 +1,113 @@
{
"_from": "async@^1.5.2",
"_id": "async@1.5.2",
"_inBundle": false,
"_integrity": "sha512-nSVgobk4rv61R9PUSDtYt7mPVB2olxNR5RWJcAsH676/ef11bUZwvu7+RGYrYauVdDPcO519v68wRhXQtxsV9w==",
"_location": "/async",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "async@^1.5.2",
"name": "async",
"escapedName": "async",
"rawSpec": "^1.5.2",
"saveSpec": null,
"fetchSpec": "^1.5.2"
},
"_requiredBy": [
"/ocsp"
],
"_resolved": "https://registry.npmjs.org/async/-/async-1.5.2.tgz",
"_shasum": "ec6a61ae56480c0c3cb241c95618e20892f9672a",
"_spec": "async@^1.5.2",
"_where": "/home/ubuntu/OCSP/node_modules/ocsp",
"author": {
"name": "Caolan McMahon"
},
"bugs": {
"url": "https://github.com/caolan/async/issues"
},
"bundleDependencies": false,
"deprecated": false,
"description": "Higher-order functions and common patterns for asynchronous code",
"devDependencies": {
"benchmark": "github:bestiejs/benchmark.js",
"bluebird": "^2.9.32",
"chai": "^3.1.0",
"coveralls": "^2.11.2",
"es6-promise": "^2.3.0",
"jscs": "^1.13.1",
"jshint": "~2.8.0",
"karma": "^0.13.2",
"karma-browserify": "^4.2.1",
"karma-firefox-launcher": "^0.1.6",
"karma-mocha": "^0.2.0",
"karma-mocha-reporter": "^1.0.2",
"lodash": "^3.9.0",
"mkdirp": "~0.5.1",
"mocha": "^2.2.5",
"native-promise-only": "^0.8.0-a",
"nodeunit": ">0.0.0",
"nyc": "^2.1.0",
"rsvp": "^3.0.18",
"semver": "^4.3.6",
"uglify-js": "~2.4.0",
"xyz": "^0.5.0",
"yargs": "~3.9.1"
},
"files": [
"lib",
"dist/async.js",
"dist/async.min.js"
],
"homepage": "https://github.com/caolan/async#readme",
"jam": {
"main": "lib/async.js",
"include": [
"lib/async.js",
"README.md",
"LICENSE"
],
"categories": [
"Utilities"
]
},
"keywords": [
"async",
"callback",
"utility",
"module"
],
"license": "MIT",
"main": "lib/async.js",
"name": "async",
"repository": {
"type": "git",
"url": "git+https://github.com/caolan/async.git"
},
"scripts": {
"coverage": "nyc npm test && nyc report",
"coveralls": "nyc npm test && nyc report --reporter=text-lcov | coveralls",
"lint": "jshint lib/*.js test/*.js perf/*.js && jscs lib/*.js test/*.js perf/*.js",
"mocha-browser-test": "karma start",
"mocha-node-test": "mocha mocha_test/",
"mocha-test": "npm run mocha-node-test && npm run mocha-browser-test",
"nodeunit-test": "nodeunit test/test-async.js",
"test": "npm run-script lint && npm run nodeunit-test && npm run mocha-test"
},
"spm": {
"main": "lib/async.js"
},
"version": "1.5.2",
"volo": {
"main": "lib/async.js",
"ignore": [
"**/.*",
"node_modules",
"bower_components",
"test",
"tests"
]
}
}

19
node_modules/bn.js/LICENSE generated vendored Normal file
View file

@ -0,0 +1,19 @@
Copyright Fedor Indutny, 2015.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

200
node_modules/bn.js/README.md generated vendored Normal file
View file

@ -0,0 +1,200 @@
# <img src="./logo.png" alt="bn.js" width="160" height="160" />
> BigNum in pure javascript
[![Build Status](https://secure.travis-ci.org/indutny/bn.js.png)](http://travis-ci.org/indutny/bn.js)
## Install
`npm install --save bn.js`
## Usage
```js
const BN = require('bn.js');
var a = new BN('dead', 16);
var b = new BN('101010', 2);
var res = a.add(b);
console.log(res.toString(10)); // 57047
```
**Note**: decimals are not supported in this library.
## Notation
### Prefixes
There are several prefixes to instructions that affect the way they work. Here
is the list of them in the order of appearance in the function name:
* `i` - perform operation in-place, storing the result in the host object (on
which the method was invoked). Might be used to avoid number allocation costs
* `u` - unsigned, ignore the sign of operands when performing operation, or
always return positive value. Second case applies to reduction operations
like `mod()`. In such cases if the result will be negative - modulo will be
added to the result to make it positive
### Postfixes
The only available postfix at the moment is:
* `n` - which means that the argument of the function must be a plain JavaScript
Number. Decimals are not supported.
### Examples
* `a.iadd(b)` - perform addition on `a` and `b`, storing the result in `a`
* `a.umod(b)` - reduce `a` modulo `b`, returning positive value
* `a.iushln(13)` - shift bits of `a` left by 13
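For instance, a minimal sketch combining the prefix and postfix conventions:
```js
const BN = require('bn.js');
var a = new BN(-30);
var b = new BN(7);
a.iadd(b);                           // in-place: `a` is now -23
console.log(a.umod(b).toString());   // '5'  - unsigned remainder is positive
console.log(a.addn(100).toString()); // '77' - `n` postfix takes a plain JS number
```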
## Instructions
Prefixes/postfixes are put in parens at the end of the line. `endian` can be
either `le` (little-endian) or `be` (big-endian).
### Utilities
* `a.clone()` - clone number
* `a.toString(base, length)` - convert to base-string and pad with zeroes
* `a.toNumber()` - convert to JavaScript Number (limited to 53 bits)
* `a.toJSON()` - convert to JSON compatible hex string (alias of `toString(16)`)
* `a.toArray(endian, length)` - convert to byte `Array`, and optionally zero
pad to length, throwing if already exceeding
* `a.toArrayLike(type, endian, length)` - convert to an instance of `type`,
which must behave like an `Array`
* `a.toBuffer(endian, length)` - convert to Node.js Buffer (if available). For
compatibility with browserify and similar tools, use this instead:
`a.toArrayLike(Buffer, endian, length)`
* `a.bitLength()` - get number of bits occupied
* `a.zeroBits()` - return the number of less-significant consecutive zero bits
(example: `1010000` has 4 zero bits)
* `a.byteLength()` - return number of bytes occupied
* `a.isNeg()` - true if the number is negative
* `a.isEven()` - no comments
* `a.isOdd()` - no comments
* `a.isZero()` - no comments
* `a.cmp(b)` - compare numbers and return `-1` (a `<` b), `0` (a `==` b), or `1` (a `>` b)
depending on the comparison result (`ucmp`, `cmpn`)
* `a.lt(b)` - `a` less than `b` (`n`)
* `a.lte(b)` - `a` less than or equals `b` (`n`)
* `a.gt(b)` - `a` greater than `b` (`n`)
* `a.gte(b)` - `a` greater than or equals `b` (`n`)
* `a.eq(b)` - `a` equals `b` (`n`)
* `a.toTwos(width)` - convert to two's complement representation, where `width` is bit width
* `a.fromTwos(width)` - convert from two's complement representation, where `width` is the bit width
* `BN.isBN(object)` - returns true if the supplied `object` is a BN.js instance
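A short sketch of a few of the conversion and inspection utilities above:
```js
const BN = require('bn.js');
var a = new BN('1a2b3c', 16);
console.log(a.toString(10));     // '1715004'
console.log(a.toArray('be', 4)); // [ 0, 26, 43, 60 ] - zero-padded to 4 bytes
console.log(a.bitLength());      // 21
console.log(BN.isBN(a));         // true
```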
### Arithmetics
* `a.neg()` - negate sign (`i`)
* `a.abs()` - absolute value (`i`)
* `a.add(b)` - addition (`i`, `n`, `in`)
* `a.sub(b)` - subtraction (`i`, `n`, `in`)
* `a.mul(b)` - multiply (`i`, `n`, `in`)
* `a.sqr()` - square (`i`)
* `a.pow(b)` - raise `a` to the power of `b`
* `a.div(b)` - divide (`divn`, `idivn`)
* `a.mod(b)` - reduce modulo `b` (`u`, `n`) (but no `umodn`)
* `a.divRound(b)` - rounded division
### Bit operations
* `a.or(b)` - or (`i`, `u`, `iu`)
* `a.and(b)` - and (`i`, `u`, `iu`, `andln`) (NOTE: `andln` is going to be replaced
with `andn` in future)
* `a.xor(b)` - xor (`i`, `u`, `iu`)
* `a.setn(b)` - set specified bit to `1`
* `a.shln(b)` - shift left (`i`, `u`, `iu`)
* `a.shrn(b)` - shift right (`i`, `u`, `iu`)
* `a.testn(b)` - test if specified bit is set
* `a.maskn(b)` - clear bits with indexes higher or equal to `b` (`i`)
* `a.bincn(b)` - add `1 << b` to the number
* `a.notn(w)` - not (for the width specified by `w`) (`i`)
### Reduction
* `a.gcd(b)` - GCD
* `a.egcd(b)` - Extended GCD results (`{ a: ..., b: ..., gcd: ... }`)
* `a.invm(b)` - inverse `a` modulo `b`
## Fast reduction
When doing lots of reductions using the same modulo, it might be beneficial to
use some tricks, like [Montgomery multiplication][0] or a special algorithm
for a [Mersenne Prime][1].
### Reduction context
To enable these tricks, one should create a reduction context:
```js
var red = BN.red(num);
```
where `num` is just a BN instance.
Or:
```js
var red = BN.red(primeName);
```
Where `primeName` is either of these [Mersenne Primes][1]:
* `'k256'`
* `'p224'`
* `'p192'`
* `'p25519'`
Or:
```js
var red = BN.mont(num);
```
To reduce numbers with the [Montgomery trick][0]. `.mont()` is generally faster than
`.red(num)`, but slower than `BN.red(primeName)`.
### Converting numbers
Before performing anything in a reduction context, numbers should be converted
into it. Usually, this means that one should:
* Convert inputs to reduced ones
* Operate on them in reduction context
* Convert outputs back from the reduction context
Here is how one may convert numbers to `red`:
```js
var redA = a.toRed(red);
```
Where `red` is a reduction context created using the instructions above.
Here is how to convert them back:
```js
var a = redA.fromRed();
```
### Red instructions
Most of the instructions from the very start of this readme have their
counterparts in the red context:
* `a.redAdd(b)`, `a.redIAdd(b)`
* `a.redSub(b)`, `a.redISub(b)`
* `a.redShl(num)`
* `a.redMul(b)`, `a.redIMul(b)`
* `a.redSqr()`, `a.redISqr()`
* `a.redSqrt()` - square root modulo reduction context's prime
* `a.redInvm()` - modular inverse of the number
* `a.redNeg()`
* `a.redPow(b)` - modular exponentiation
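Putting the conversion steps and the red instructions together, a minimal sketch
looks like this:
```js
const BN = require('bn.js');
var red = BN.red('k256');                // reduction context for the secp256k1 prime
var a = new BN(123).toRed(red);          // convert into the context
var b = new BN(456).toRed(red);
var sum = a.redAdd(b);                   // operate inside the context
var cube = a.redPow(new BN(3));          // modular exponentiation
console.log(sum.fromRed().toString());   // '579'
console.log(cube.fromRed().toString());  // '1860867' - 123^3, well below the modulus
```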
## LICENSE
This software is licensed under the MIT License.
[0]: https://en.wikipedia.org/wiki/Montgomery_modular_multiplication
[1]: https://en.wikipedia.org/wiki/Mersenne_prime

3446
node_modules/bn.js/lib/bn.js generated vendored Normal file

File diff suppressed because it is too large

64
node_modules/bn.js/package.json generated vendored Normal file
View file

@ -0,0 +1,64 @@
{
"_from": "bn.js@^4.0.0",
"_id": "bn.js@4.12.0",
"_inBundle": false,
"_integrity": "sha512-c98Bf3tPniI+scsdk237ku1Dc3ujXQTSgyiPUDEOe7tRkhrqridvh8klBv0HCEso1OLOYcHuCv/cS6DNxKH+ZA==",
"_location": "/bn.js",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "bn.js@^4.0.0",
"name": "bn.js",
"escapedName": "bn.js",
"rawSpec": "^4.0.0",
"saveSpec": null,
"fetchSpec": "^4.0.0"
},
"_requiredBy": [
"/asn1.js"
],
"_resolved": "https://registry.npmjs.org/bn.js/-/bn.js-4.12.0.tgz",
"_shasum": "775b3f278efbb9718eec7361f483fb36fbbfea88",
"_spec": "bn.js@^4.0.0",
"_where": "/home/ubuntu/OCSP/node_modules/asn1.js",
"author": {
"name": "Fedor Indutny",
"email": "fedor@indutny.com"
},
"browser": {
"buffer": false
},
"bugs": {
"url": "https://github.com/indutny/bn.js/issues"
},
"bundleDependencies": false,
"deprecated": false,
"description": "Big number implementation in pure javascript",
"devDependencies": {
"istanbul": "^0.3.5",
"mocha": "^2.1.0",
"semistandard": "^7.0.4"
},
"homepage": "https://github.com/indutny/bn.js",
"keywords": [
"BN",
"BigNum",
"Big number",
"Modulo",
"Montgomery"
],
"license": "MIT",
"main": "lib/bn.js",
"name": "bn.js",
"repository": {
"type": "git",
"url": "git+ssh://git@github.com/indutny/bn.js.git"
},
"scripts": {
"lint": "semistandard",
"test": "npm run lint && npm run unit",
"unit": "mocha --reporter=spec test/*-test.js"
},
"version": "4.12.0"
}

1
node_modules/call-bind/.eslintignore generated vendored Normal file
View file

@ -0,0 +1 @@
coverage/

17
node_modules/call-bind/.eslintrc generated vendored Normal file
View file

@ -0,0 +1,17 @@
{
"root": true,
"extends": "@ljharb",
"rules": {
"func-name-matching": 0,
"id-length": 0,
"new-cap": [2, {
"capIsNewExceptions": [
"GetIntrinsic",
],
}],
"no-magic-numbers": 0,
"operator-linebreak": [2, "before"],
},
}

12
node_modules/call-bind/.github/FUNDING.yml generated vendored Normal file
View file

@ -0,0 +1,12 @@
# These are supported funding model platforms
github: [ljharb]
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: npm/call-bind
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

13
node_modules/call-bind/.nycrc generated vendored Normal file
View file

@ -0,0 +1,13 @@
{
"all": true,
"check-coverage": false,
"reporter": ["text-summary", "text", "html", "json"],
"lines": 86,
"statements": 85.93,
"functions": 82.43,
"branches": 76.06,
"exclude": [
"coverage",
"test"
]
}

42
node_modules/call-bind/CHANGELOG.md generated vendored Normal file
View file

@ -0,0 +1,42 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [v1.0.2](https://github.com/ljharb/call-bind/compare/v1.0.1...v1.0.2) - 2021-01-11
### Commits
- [Fix] properly include the receiver in the bound length [`dbae7bc`](https://github.com/ljharb/call-bind/commit/dbae7bc676c079a0d33c0a43e9ef92cb7b01345d)
## [v1.0.1](https://github.com/ljharb/call-bind/compare/v1.0.0...v1.0.1) - 2021-01-08
### Commits
- [Tests] migrate tests to Github Actions [`b6db284`](https://github.com/ljharb/call-bind/commit/b6db284c36f8ccd195b88a6764fe84b7223a0da1)
- [meta] do not publish github action workflow files [`ec7fe46`](https://github.com/ljharb/call-bind/commit/ec7fe46e60cfa4764ee943d2755f5e5a366e578e)
- [Fix] preserve original functions length when possible [`adbceaa`](https://github.com/ljharb/call-bind/commit/adbceaa3cac4b41ea78bb19d7ccdbaaf7e0bdadb)
- [Tests] gather coverage data on every job [`d69e23c`](https://github.com/ljharb/call-bind/commit/d69e23cc65f101ba1d4c19bb07fa8eb0ec624be8)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `aud`, `tape` [`2fd3586`](https://github.com/ljharb/call-bind/commit/2fd3586c5d47b335364c14293114c6b625ae1f71)
- [Deps] update `get-intrinsic` [`f23e931`](https://github.com/ljharb/call-bind/commit/f23e9318cc271c2add8bb38cfded85ee7baf8eee)
- [Deps] update `get-intrinsic` [`72d9f44`](https://github.com/ljharb/call-bind/commit/72d9f44e184465ba8dd3fb48260bbcff234985f2)
- [meta] fix FUNDING.yml [`e723573`](https://github.com/ljharb/call-bind/commit/e723573438c5a68dcec31fb5d96ea6b7e4a93be8)
- [eslint] ignore coverage output [`15e76d2`](https://github.com/ljharb/call-bind/commit/15e76d28a5f43e504696401e5b31ebb78ee1b532)
- [meta] add Automatic Rebase and Require Allow Edits workflows [`8fa4dab`](https://github.com/ljharb/call-bind/commit/8fa4dabb23ba3dd7bb92c9571c1241c08b56e4b6)
## v1.0.0 - 2020-10-30
### Commits
- Initial commit [`306cf98`](https://github.com/ljharb/call-bind/commit/306cf98c7ec9e7ef66b653ec152277ac1381eb50)
- Tests [`e10d0bb`](https://github.com/ljharb/call-bind/commit/e10d0bbdadc7a10ecedc9a1c035112d3e368b8df)
- Implementation [`43852ed`](https://github.com/ljharb/call-bind/commit/43852eda0f187327b7fad2423ca972149a52bd65)
- npm init [`408f860`](https://github.com/ljharb/call-bind/commit/408f860b773a2f610805fd3613d0d71bac1b6249)
- [meta] add Automatic Rebase and Require Allow Edits workflows [`fb349b2`](https://github.com/ljharb/call-bind/commit/fb349b2e48defbec8b5ec8a8395cc8f69f220b13)
- [meta] add `auto-changelog` [`c4001fc`](https://github.com/ljharb/call-bind/commit/c4001fc43031799ef908211c98d3b0fb2b60fde4)
- [meta] add "funding"; create `FUNDING.yml` [`d4d6d29`](https://github.com/ljharb/call-bind/commit/d4d6d2974a14bc2e98830468eda7fe6d6a776717)
- [Tests] add `npm run lint` [`dedfb98`](https://github.com/ljharb/call-bind/commit/dedfb98bd0ecefb08ddb9a94061bd10cde4332af)
- Only apps should have lockfiles [`54ac776`](https://github.com/ljharb/call-bind/commit/54ac77653db45a7361dc153d2f478e743f110650)
- [meta] add `safe-publish-latest` [`9ea8e43`](https://github.com/ljharb/call-bind/commit/9ea8e435b950ce9b705559cd651039f9bf40140f)

21
node_modules/call-bind/LICENSE generated vendored Normal file
View file

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2020 Jordan Harband
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

2
node_modules/call-bind/README.md generated vendored Normal file
View file

@ -0,0 +1,2 @@
# call-bind
Robustly `.call.bind()` a function.
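A minimal usage sketch:
```js
'use strict';
var callBind = require('call-bind');
var callBound = require('call-bind/callBound');
// Bind a prototype method so the receiver is passed as the first argument.
var $slice = callBind(Array.prototype.slice);
console.log($slice([1, 2, 3], 1)); // [ 2, 3 ]
// `callBound` looks up an intrinsic by name and call-binds it in one step.
var $indexOf = callBound('String.prototype.indexOf');
console.log($indexOf('call-bind', '-')); // 4
```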

15
node_modules/call-bind/callBound.js generated vendored Normal file
View file

@ -0,0 +1,15 @@
'use strict';
var GetIntrinsic = require('get-intrinsic');
var callBind = require('./');
var $indexOf = callBind(GetIntrinsic('String.prototype.indexOf'));
module.exports = function callBoundIntrinsic(name, allowMissing) {
var intrinsic = GetIntrinsic(name, !!allowMissing);
if (typeof intrinsic === 'function' && $indexOf(name, '.prototype.') > -1) {
return callBind(intrinsic);
}
return intrinsic;
};

47
node_modules/call-bind/index.js generated vendored Normal file
View file

@ -0,0 +1,47 @@
'use strict';
var bind = require('function-bind');
var GetIntrinsic = require('get-intrinsic');
var $apply = GetIntrinsic('%Function.prototype.apply%');
var $call = GetIntrinsic('%Function.prototype.call%');
var $reflectApply = GetIntrinsic('%Reflect.apply%', true) || bind.call($call, $apply);
var $gOPD = GetIntrinsic('%Object.getOwnPropertyDescriptor%', true);
var $defineProperty = GetIntrinsic('%Object.defineProperty%', true);
var $max = GetIntrinsic('%Math.max%');
if ($defineProperty) {
try {
$defineProperty({}, 'a', { value: 1 });
} catch (e) {
// IE 8 has a broken defineProperty
$defineProperty = null;
}
}
module.exports = function callBind(originalFunction) {
var func = $reflectApply(bind, $call, arguments);
if ($gOPD && $defineProperty) {
var desc = $gOPD(func, 'length');
if (desc.configurable) {
// original length, plus the receiver, minus any additional arguments (after the receiver)
$defineProperty(
func,
'length',
{ value: 1 + $max(0, originalFunction.length - (arguments.length - 1)) }
);
}
}
return func;
};
var applyBind = function applyBind() {
return $reflectApply(bind, $apply, arguments);
};
if ($defineProperty) {
$defineProperty(module.exports, 'apply', { value: applyBind });
} else {
module.exports.apply = applyBind;
}

108
node_modules/call-bind/package.json generated vendored Normal file
View file

@ -0,0 +1,108 @@
{
"_from": "call-bind@^1.0.0",
"_id": "call-bind@1.0.2",
"_inBundle": false,
"_integrity": "sha512-7O+FbCihrB5WGbFYesctwmTKae6rOiIzmz1icreWJ+0aA7LJfuqhEso2T9ncpcFtzMQtzXf2QGGueWJGTYsqrA==",
"_location": "/call-bind",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "call-bind@^1.0.0",
"name": "call-bind",
"escapedName": "call-bind",
"rawSpec": "^1.0.0",
"saveSpec": null,
"fetchSpec": "^1.0.0"
},
"_requiredBy": [
"/side-channel"
],
"_resolved": "https://registry.npmjs.org/call-bind/-/call-bind-1.0.2.tgz",
"_shasum": "b1d4e89e688119c3c9a903ad30abb2f6a919be3c",
"_spec": "call-bind@^1.0.0",
"_where": "/home/ubuntu/formidable/node_modules/side-channel",
"author": {
"name": "Jordan Harband",
"email": "ljharb@gmail.com"
},
"auto-changelog": {
"output": "CHANGELOG.md",
"template": "keepachangelog",
"unreleased": false,
"commitLimit": false,
"backfillLimit": false,
"hideCredit": true
},
"bugs": {
"url": "https://github.com/ljharb/call-bind/issues"
},
"bundleDependencies": false,
"dependencies": {
"function-bind": "^1.1.1",
"get-intrinsic": "^1.0.2"
},
"deprecated": false,
"description": "Robustly `.call.bind()` a function",
"devDependencies": {
"@ljharb/eslint-config": "^17.3.0",
"aud": "^1.1.3",
"auto-changelog": "^2.2.1",
"eslint": "^7.17.0",
"nyc": "^10.3.2",
"safe-publish-latest": "^1.1.4",
"tape": "^5.1.1"
},
"exports": {
".": [
{
"default": "./index.js"
},
"./index.js"
],
"./callBound": [
{
"default": "./callBound.js"
},
"./callBound.js"
],
"./package.json": "./package.json"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
},
"homepage": "https://github.com/ljharb/call-bind#readme",
"keywords": [
"javascript",
"ecmascript",
"es",
"js",
"callbind",
"callbound",
"call",
"bind",
"bound",
"call-bind",
"call-bound",
"function",
"es-abstract"
],
"license": "MIT",
"main": "index.js",
"name": "call-bind",
"repository": {
"type": "git",
"url": "git+https://github.com/ljharb/call-bind.git"
},
"scripts": {
"lint": "eslint --ext=.js,.mjs .",
"posttest": "aud --production",
"postversion": "auto-changelog && git add CHANGELOG.md && git commit --no-edit --amend && git tag -f \"v$(node -e \"console.log(require('./package.json').version)\")\"",
"prepublish": "safe-publish-latest",
"pretest": "npm run lint",
"test": "npm run tests-only",
"tests-only": "nyc tape 'test/*'",
"version": "auto-changelog && git add CHANGELOG.md"
},
"version": "1.0.2"
}

55
node_modules/call-bind/test/callBound.js generated vendored Normal file
View file

@ -0,0 +1,55 @@
'use strict';
var test = require('tape');
var callBound = require('../callBound');
test('callBound', function (t) {
// static primitive
t.equal(callBound('Array.length'), Array.length, 'Array.length yields itself');
t.equal(callBound('%Array.length%'), Array.length, '%Array.length% yields itself');
// static non-function object
t.equal(callBound('Array.prototype'), Array.prototype, 'Array.prototype yields itself');
t.equal(callBound('%Array.prototype%'), Array.prototype, '%Array.prototype% yields itself');
t.equal(callBound('Array.constructor'), Array.constructor, 'Array.constructor yields itself');
t.equal(callBound('%Array.constructor%'), Array.constructor, '%Array.constructor% yields itself');
// static function
t.equal(callBound('Date.parse'), Date.parse, 'Date.parse yields itself');
t.equal(callBound('%Date.parse%'), Date.parse, '%Date.parse% yields itself');
// prototype primitive
t.equal(callBound('Error.prototype.message'), Error.prototype.message, 'Error.prototype.message yields itself');
t.equal(callBound('%Error.prototype.message%'), Error.prototype.message, '%Error.prototype.message% yields itself');
// prototype function
t.notEqual(callBound('Object.prototype.toString'), Object.prototype.toString, 'Object.prototype.toString does not yield itself');
t.notEqual(callBound('%Object.prototype.toString%'), Object.prototype.toString, '%Object.prototype.toString% does not yield itself');
t.equal(callBound('Object.prototype.toString')(true), Object.prototype.toString.call(true), 'call-bound Object.prototype.toString calls into the original');
t.equal(callBound('%Object.prototype.toString%')(true), Object.prototype.toString.call(true), 'call-bound %Object.prototype.toString% calls into the original');
t['throws'](
function () { callBound('does not exist'); },
SyntaxError,
'nonexistent intrinsic throws'
);
t['throws'](
function () { callBound('does not exist', true); },
SyntaxError,
'allowMissing arg still throws for unknown intrinsic'
);
/* globals WeakRef: false */
t.test('real but absent intrinsic', { skip: typeof WeakRef !== 'undefined' }, function (st) {
st['throws'](
function () { callBound('WeakRef'); },
TypeError,
'real but absent intrinsic throws'
);
st.equal(callBound('WeakRef', true), undefined, 'allowMissing arg avoids exception');
st.end();
});
t.end();
});

66
node_modules/call-bind/test/index.js generated vendored Normal file
View file

@ -0,0 +1,66 @@
'use strict';
var callBind = require('../');
var bind = require('function-bind');
var test = require('tape');
/*
* older engines have length nonconfigurable
* in io.js v3, it is configurable except on bound functions, hence the .bind()
*/
var functionsHaveConfigurableLengths = !!(
Object.getOwnPropertyDescriptor
&& Object.getOwnPropertyDescriptor(bind.call(function () {}), 'length').configurable
);
test('callBind', function (t) {
var sentinel = { sentinel: true };
var func = function (a, b) {
// eslint-disable-next-line no-invalid-this
return [this, a, b];
};
t.equal(func.length, 2, 'original function length is 2');
t.deepEqual(func(), [undefined, undefined, undefined], 'unbound func with too few args');
t.deepEqual(func(1, 2), [undefined, 1, 2], 'unbound func with right args');
t.deepEqual(func(1, 2, 3), [undefined, 1, 2], 'unbound func with too many args');
var bound = callBind(func);
t.equal(bound.length, func.length + 1, 'function length is preserved', { skip: !functionsHaveConfigurableLengths });
t.deepEqual(bound(), [undefined, undefined, undefined], 'bound func with too few args');
t.deepEqual(bound(1, 2), [1, 2, undefined], 'bound func with right args');
t.deepEqual(bound(1, 2, 3), [1, 2, 3], 'bound func with too many args');
var boundR = callBind(func, sentinel);
t.equal(boundR.length, func.length, 'function length is preserved', { skip: !functionsHaveConfigurableLengths });
t.deepEqual(boundR(), [sentinel, undefined, undefined], 'bound func with receiver, with too few args');
t.deepEqual(boundR(1, 2), [sentinel, 1, 2], 'bound func with receiver, with right args');
t.deepEqual(boundR(1, 2, 3), [sentinel, 1, 2], 'bound func with receiver, with too many args');
var boundArg = callBind(func, sentinel, 1);
t.equal(boundArg.length, func.length - 1, 'function length is preserved', { skip: !functionsHaveConfigurableLengths });
t.deepEqual(boundArg(), [sentinel, 1, undefined], 'bound func with receiver and arg, with too few args');
t.deepEqual(boundArg(2), [sentinel, 1, 2], 'bound func with receiver and arg, with right arg');
t.deepEqual(boundArg(2, 3), [sentinel, 1, 2], 'bound func with receiver and arg, with too many args');
t.test('callBind.apply', function (st) {
var aBound = callBind.apply(func);
st.deepEqual(aBound(sentinel), [sentinel, undefined, undefined], 'apply-bound func with no args');
st.deepEqual(aBound(sentinel, [1], 4), [sentinel, 1, undefined], 'apply-bound func with too few args');
st.deepEqual(aBound(sentinel, [1, 2], 4), [sentinel, 1, 2], 'apply-bound func with right args');
var aBoundArg = callBind.apply(func);
st.deepEqual(aBoundArg(sentinel, [1, 2, 3], 4), [sentinel, 1, 2], 'apply-bound func with too many args');
st.deepEqual(aBoundArg(sentinel, [1, 2], 4), [sentinel, 1, 2], 'apply-bound func with right args');
st.deepEqual(aBoundArg(sentinel, [1], 4), [sentinel, 1, undefined], 'apply-bound func with too few args');
var aBoundR = callBind.apply(func, sentinel);
st.deepEqual(aBoundR([1, 2, 3], 4), [sentinel, 1, 2], 'apply-bound func with receiver and too many args');
st.deepEqual(aBoundR([1, 2], 4), [sentinel, 1, 2], 'apply-bound func with receiver and right args');
st.deepEqual(aBoundR([1], 4), [sentinel, 1, undefined], 'apply-bound func with receiver and too few args');
st.end();
});
t.end();
});

15
node_modules/chownr/LICENSE generated vendored Normal file
View file

@ -0,0 +1,15 @@
The ISC License
Copyright (c) Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

3
node_modules/chownr/README.md generated vendored Normal file
View file

@ -0,0 +1,3 @@
Like `chown -R`.
Takes the same arguments as `fs.chown()`.
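A minimal usage sketch (the path and uid/gid values below are made up for
illustration):
```js
const chownr = require('chownr')
// Recursively hand ./uploads (and everything under it) to uid/gid 1000.
chownr('./uploads', 1000, 1000, er => {
  if (er) throw er
  console.log('ownership updated')
})
// Or synchronously:
chownr.sync('./uploads', 1000, 1000)
```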

167
node_modules/chownr/chownr.js generated vendored Normal file
View file

@ -0,0 +1,167 @@
'use strict'
const fs = require('fs')
const path = require('path')
/* istanbul ignore next */
const LCHOWN = fs.lchown ? 'lchown' : 'chown'
/* istanbul ignore next */
const LCHOWNSYNC = fs.lchownSync ? 'lchownSync' : 'chownSync'
/* istanbul ignore next */
const needEISDIRHandled = fs.lchown &&
!process.version.match(/v1[1-9]+\./) &&
!process.version.match(/v10\.[6-9]/)
const lchownSync = (path, uid, gid) => {
try {
return fs[LCHOWNSYNC](path, uid, gid)
} catch (er) {
if (er.code !== 'ENOENT')
throw er
}
}
/* istanbul ignore next */
const chownSync = (path, uid, gid) => {
try {
return fs.chownSync(path, uid, gid)
} catch (er) {
if (er.code !== 'ENOENT')
throw er
}
}
/* istanbul ignore next */
const handleEISDIR =
needEISDIRHandled ? (path, uid, gid, cb) => er => {
// Node prior to v10 had a very questionable implementation of
// fs.lchown, which would always try to call fs.open on a directory
// Fall back to fs.chown in those cases.
if (!er || er.code !== 'EISDIR')
cb(er)
else
fs.chown(path, uid, gid, cb)
}
: (_, __, ___, cb) => cb
/* istanbul ignore next */
const handleEISDirSync =
needEISDIRHandled ? (path, uid, gid) => {
try {
return lchownSync(path, uid, gid)
} catch (er) {
if (er.code !== 'EISDIR')
throw er
chownSync(path, uid, gid)
}
}
: (path, uid, gid) => lchownSync(path, uid, gid)
// fs.readdir could only accept an options object as of node v6
const nodeVersion = process.version
let readdir = (path, options, cb) => fs.readdir(path, options, cb)
let readdirSync = (path, options) => fs.readdirSync(path, options)
/* istanbul ignore next */
if (/^v4\./.test(nodeVersion))
readdir = (path, options, cb) => fs.readdir(path, cb)
const chown = (cpath, uid, gid, cb) => {
fs[LCHOWN](cpath, uid, gid, handleEISDIR(cpath, uid, gid, er => {
// Skip ENOENT error
cb(er && er.code !== 'ENOENT' ? er : null)
}))
}
const chownrKid = (p, child, uid, gid, cb) => {
if (typeof child === 'string')
return fs.lstat(path.resolve(p, child), (er, stats) => {
// Skip ENOENT error
if (er)
return cb(er.code !== 'ENOENT' ? er : null)
stats.name = child
chownrKid(p, stats, uid, gid, cb)
})
if (child.isDirectory()) {
chownr(path.resolve(p, child.name), uid, gid, er => {
if (er)
return cb(er)
const cpath = path.resolve(p, child.name)
chown(cpath, uid, gid, cb)
})
} else {
const cpath = path.resolve(p, child.name)
chown(cpath, uid, gid, cb)
}
}
const chownr = (p, uid, gid, cb) => {
readdir(p, { withFileTypes: true }, (er, children) => {
// any error other than ENOTDIR or ENOTSUP means it's not readable,
// or doesn't exist. give up.
if (er) {
if (er.code === 'ENOENT')
return cb()
else if (er.code !== 'ENOTDIR' && er.code !== 'ENOTSUP')
return cb(er)
}
if (er || !children.length)
return chown(p, uid, gid, cb)
let len = children.length
let errState = null
const then = er => {
if (errState)
return
if (er)
return cb(errState = er)
if (-- len === 0)
return chown(p, uid, gid, cb)
}
children.forEach(child => chownrKid(p, child, uid, gid, then))
})
}
const chownrKidSync = (p, child, uid, gid) => {
if (typeof child === 'string') {
try {
const stats = fs.lstatSync(path.resolve(p, child))
stats.name = child
child = stats
} catch (er) {
if (er.code === 'ENOENT')
return
else
throw er
}
}
if (child.isDirectory())
chownrSync(path.resolve(p, child.name), uid, gid)
handleEISDirSync(path.resolve(p, child.name), uid, gid)
}
const chownrSync = (p, uid, gid) => {
let children
try {
children = readdirSync(p, { withFileTypes: true })
} catch (er) {
if (er.code === 'ENOENT')
return
else if (er.code === 'ENOTDIR' || er.code === 'ENOTSUP')
return handleEISDirSync(p, uid, gid)
else
throw er
}
if (children && children.length)
children.forEach(child => chownrKidSync(p, child, uid, gid))
return handleEISDirSync(p, uid, gid)
}
module.exports = chownr
chownr.sync = chownrSync

65
node_modules/chownr/package.json generated vendored Normal file
View file

@ -0,0 +1,65 @@
{
"_from": "chownr@^2.0.0",
"_id": "chownr@2.0.0",
"_inBundle": false,
"_integrity": "sha512-bIomtDF5KGpdogkLd9VspvFzk9KfpyyGlS8YFVZl7TGPBHL5snIOnxeshwVgPteQ9b4Eydl+pVbIyE1DcvCWgQ==",
"_location": "/chownr",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "chownr@^2.0.0",
"name": "chownr",
"escapedName": "chownr",
"rawSpec": "^2.0.0",
"saveSpec": null,
"fetchSpec": "^2.0.0"
},
"_requiredBy": [
"/tar"
],
"_resolved": "https://registry.npmjs.org/chownr/-/chownr-2.0.0.tgz",
"_shasum": "15bfbe53d2eab4cf70f18a8cd68ebe5b3cb1dece",
"_spec": "chownr@^2.0.0",
"_where": "/home/ubuntu/fix/node_modules/tar",
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"bugs": {
"url": "https://github.com/isaacs/chownr/issues"
},
"bundleDependencies": false,
"deprecated": false,
"description": "like `chown -R`",
"devDependencies": {
"mkdirp": "0.3",
"rimraf": "^2.7.1",
"tap": "^14.10.6"
},
"engines": {
"node": ">=10"
},
"files": [
"chownr.js"
],
"homepage": "https://github.com/isaacs/chownr#readme",
"license": "ISC",
"main": "chownr.js",
"name": "chownr",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/chownr.git"
},
"scripts": {
"postversion": "npm publish",
"prepublishOnly": "git push origin --follow-tags",
"preversion": "npm test",
"test": "tap"
},
"tap": {
"check-coverage": true
},
"version": "2.0.0"
}

15
node_modules/dezalgo/LICENSE generated vendored Normal file
View file

@ -0,0 +1,15 @@
The ISC License
Copyright (c) Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

29
node_modules/dezalgo/README.md generated vendored Normal file
View file

@ -0,0 +1,29 @@
# dezalgo
Contain async insanity so that the dark pony lord doesn't eat souls
See [this blog
post](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony).
## USAGE
Pass a callback to `dezalgo` and it will ensure that it is *always*
called in a future tick, and never in this tick.
```javascript
var dz = require('dezalgo')
var fs = require('fs')
var cache = {}
function maybeSync(arg, cb) {
  cb = dz(cb)
  // this will actually defer to nextTick
  if (cache[arg]) return cb(null, cache[arg])
  fs.readFile(arg, function (er, data) {
    // since this is *already* deferred, it will call immediately
    if (er) return cb(er)
    cb(null, cache[arg] = data)
  })
}
```

22
node_modules/dezalgo/dezalgo.js generated vendored Normal file
View file

@ -0,0 +1,22 @@
var wrappy = require('wrappy')
module.exports = wrappy(dezalgo)
var asap = require('asap')
function dezalgo (cb) {
var sync = true
asap(function () {
sync = false
})
return function zalgoSafe() {
var args = arguments
var me = this
if (sync)
asap(function() {
cb.apply(me, args)
})
else
cb.apply(me, args)
}
}

75
node_modules/dezalgo/package.json generated vendored Normal file
View file

@ -0,0 +1,75 @@
{
"_from": "dezalgo@^1.0.4",
"_id": "dezalgo@1.0.4",
"_inBundle": false,
"_integrity": "sha512-rXSP0bf+5n0Qonsb+SVVfNfIsimO4HEtmnIpPHY8Q1UCzKlQrDMfdobr8nJOOsRgWCyMRqeSBQzmWUMq7zvVig==",
"_location": "/dezalgo",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "dezalgo@^1.0.4",
"name": "dezalgo",
"escapedName": "dezalgo",
"rawSpec": "^1.0.4",
"saveSpec": null,
"fetchSpec": "^1.0.4"
},
"_requiredBy": [
"/formidable"
],
"_resolved": "https://registry.npmjs.org/dezalgo/-/dezalgo-1.0.4.tgz",
"_shasum": "751235260469084c132157dfa857f386d4c33d81",
"_spec": "dezalgo@^1.0.4",
"_where": "/home/ubuntu/formidable/node_modules/formidable",
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"bugs": {
"url": "https://github.com/npm/dezalgo/issues"
},
"bundleDependencies": false,
"dependencies": {
"asap": "^2.0.0",
"wrappy": "1"
},
"deprecated": false,
"description": "Contain async insanity so that the dark pony lord doesn't eat souls",
"devDependencies": {
"tap": "^12.4.0"
},
"directories": {
"test": "test"
},
"files": [
"dezalgo.js"
],
"homepage": "https://github.com/npm/dezalgo",
"keywords": [
"async",
"zalgo",
"the dark pony",
"he comes",
"asynchrony of all holy and good",
"To invoke the hive mind representing chaos",
"Invoking the feeling of chaos. /Without order",
"The Nezperdian Hive Mind of Chaos, (zalgo………………)",
"He who waits beyond the wall",
"ZALGO",
"HE COMES",
"there used to be some funky unicode keywords here, but it broke the npm website on chrome, so they were removed, sorry"
],
"license": "ISC",
"main": "dezalgo.js",
"name": "dezalgo",
"repository": {
"type": "git",
"url": "git+https://github.com/npm/dezalgo.git"
},
"scripts": {
"test": "tap test/*.js"
},
"version": "1.0.4"
}

92
node_modules/formidable/CHANGELOG.md generated vendored Normal file
View file

@ -0,0 +1,92 @@
# Changelog
### Unreleased (`canary` & `dev` dist-tags)
* feat: add options.filter ([#716](https://github.com/node-formidable/formidable/pull/716))
* feat: add code and httpCode to most errors ([#686](https://github.com/node-formidable/formidable/pull/686))
* rename: option.hash into option.hashAlgorithm ([#689](https://github.com/node-formidable/formidable/pull/689))
* rename: file.path into file.filepath ([#689](https://github.com/node-formidable/formidable/pull/689))
* rename: file.type into file.mimetype ([#689](https://github.com/node-formidable/formidable/pull/689))
* refactor: split file.name into file.newFilename and file.originalFilename ([#689](https://github.com/node-formidable/formidable/pull/689))
* feat: prevent directory traversal attacks by default ([#689](https://github.com/node-formidable/formidable/pull/689))
* meta: stop including test files in npm ([7003c](https://github.com/node-formidable/formidable/commit/7003cd6133f90c384081accb51743688d5e1f4be))
* fix: handle invalid filenames ([d0a34](https://github.com/node-formidable/formidable/commit/d0a3484b048b8c177e62d66aecb03f5928f7a857))
* feat: add fileWriteStreamHandler option
* feat: add allowEmptyFiles and minFileSize options
* feat: Array support for fields and files ([#380](https://github.com/node-formidable/node-formidable/pull/380), [#340](https://github.com/node-formidable/node-formidable/pull/340), [#367](https://github.com/node-formidable/node-formidable/pull/367), [#33](https://github.com/node-formidable/node-formidable/issues/33), [#498](https://github.com/node-formidable/node-formidable/issues/498), [#280](https://github.com/node-formidable/node-formidable/issues/280), [#483](https://github.com/node-formidable/node-formidable/issues/483))
* possible partial fix of [#386](https://github.com/node-formidable/node-formidable/pull/386) with #380 (need tests and better implementation)
* refactor: use hasOwnProperty in check against files/fields ([#522](https://github.com/node-formidable/node-formidable/pull/522))
* meta: do not promote `IncomingForm` and add `exports.default` ([#529](https://github.com/node-formidable/node-formidable/pull/529))
* meta: Improve examples and tests ([#523](https://github.com/node-formidable/node-formidable/pull/523))
* refactor: First step of Code quality improvements ([#525](https://github.com/node-formidable/node-formidable/pull/525))
* chore(funding): remove patreon & add npm funding field ([#532](https://github.com/node-formidable/node-formidable/pull/532))
* feat: use Modern Streams API ([#531](https://github.com/node-formidable/node-formidable/pull/531))
* fix: urlencoded parsing to emit end [#543](https://github.com/node-formidable/node-formidable/pull/543), introduced in [#531](https://github.com/node-formidable/node-formidable/pull/531)
* fix(tests): include multipart and qs parser unit tests, part of [#415](https://github.com/node-formidable/node-formidable/issues/415)
* fix: reorganize exports + move parsers to `src/parsers/`
* fix: update docs and examples [#544](https://github.com/node-formidable/node-formidable/pull/544) ([#248](https://github.com/node-formidable/node-formidable/issues/248), [#335](https://github.com/node-formidable/node-formidable/issues/335), [#371](https://github.com/node-formidable/node-formidable/issues/371), [#372](https://github.com/node-formidable/node-formidable/issues/372), [#387](https://github.com/node-formidable/node-formidable/issues/387), partly [#471](https://github.com/node-formidable/node-formidable/issues/471), [#535](https://github.com/node-formidable/node-formidable/issues/535))
* feat: introduce Plugins API, fix silent failing tests ([#545](https://github.com/node-formidable/node-formidable/pull/545), [#391](https://github.com/node-formidable/node-formidable/pull/391), [#407](https://github.com/node-formidable/node-formidable/pull/407), [#386](https://github.com/node-formidable/node-formidable/pull/386), [#374](https://github.com/node-formidable/node-formidable/pull/374), [#521](https://github.com/node-formidable/node-formidable/pull/521), [#267](https://github.com/node-formidable/node-formidable/pull/267))
* fix: exposing file writable stream errors ([#520](https://github.com/node-formidable/node-formidable/pull/520), [#316](https://github.com/node-formidable/node-formidable/pull/316), [#469](https://github.com/node-formidable/node-formidable/pull/469), [#470](https://github.com/node-formidable/node-formidable/pull/470))
* feat: custom file (re)naming, thru options.filename ([#591](https://github.com/node-formidable/node-formidable/pull/591), [#84](https://github.com/node-formidable/node-formidable/issues/84), [#86](https://github.com/node-formidable/node-formidable/issues/86), [#94](https://github.com/node-formidable/node-formidable/issues/94), [#154](https://github.com/node-formidable/node-formidable/issues/154), [#158](https://github.com/node-formidable/node-formidable/issues/158), [#488](https://github.com/node-formidable/node-formidable/issues/488), [#595](https://github.com/node-formidable/node-formidable/issues/595))
### v1.2.1 (2018-03-20)
* `maxFileSize` option with default of 200MB (Charlike Mike Reagent, Nima Shahri)
* Simplified buffering in JSON parser to avoid denial of service attack (Kornel)
* Fixed upload file cleanup on aborted requests (liaoweiqiang)
* Fixed error handling of closed _writeStream (Vitalii)
### v1.1.1 (2017-01-15)
* Fix DeprecationWarning about os.tmpDir() (Christian)
* Update `buffer.write` order of arguments for Node 7 (Kornel Lesiński)
* JSON Parser emits error events to the IncomingForm (alessio.montagnani)
* Improved Content-Disposition parsing (Sebastien)
* Access WriteStream of fs during runtime instead of include time (Jonas Amundsen)
* Use built-in toString to convert buffer to hex (Charmander)
* Add hash to json if present (Nick Stamas)
* Add license to package.json (Simen Bekkhus)
### v1.0.14 (2013-05-03)
* Add failing hash tests. (Ben Trask)
* Enable hash calculation again (Eugene Girshov)
* Test for immediate data events (Tim Smart)
* Re-arrange IncomingForm#parse (Tim Smart)
### v1.0.13
* Only update hash if update method exists (Sven Lito)
* According to travis v0.10 needs to go quoted (Sven Lito)
* Bumping build node versions (Sven Lito)
* Additional fix for empty requests (Eugene Girshov)
* Change the default to 1000, to match the new Node behaviour. (OrangeDog)
* Add ability to control maxKeys in the querystring parser. (OrangeDog)
* Adjust test case to work with node 0.9.x (Eugene Girshov)
* Update package.json (Sven Lito)
* Path adjustment according to eb4468b (Markus Ast)
### v1.0.12
* Emit error on aborted connections (Eugene Girshov)
* Add support for empty requests (Eugene Girshov)
* Fix name/filename handling in Content-Disposition (jesperp)
* Tolerate malformed closing boundary in multipart (Eugene Girshov)
* Ignore preamble in multipart messages (Eugene Girshov)
* Add support for application/json (Mike Frey, Carlos Rodriguez)
* Add support for Base64 encoding (Elmer Bulthuis)
* Add File#toJSON (TJ Holowaychuk)
* Remove support for Node.js 0.4 & 0.6 (Andrew Kelley)
* Documentation improvements (Sven Lito, Andre Azevedo)
* Add support for application/octet-stream (Ion Lupascu, Chris Scribner)
* Use os.tmpdir() to get tmp directory (Andrew Kelley)
* Improve package.json (Andrew Kelley, Sven Lito)
* Fix benchmark script (Andrew Kelley)
* Fix scope issue in incoming_forms (Sven Lito)
* Fix file handle leak on error (OrangeDog)
---
[First commit, #3270eb4b1f8b (May 4th, 2010)](https://github.com/node-formidable/formidable/commit/3270eb4b1f8bb667b8c12f64c36a4e7b854216d8)

21
node_modules/formidable/LICENSE generated vendored Normal file
View file

@ -0,0 +1,21 @@
The MIT License (MIT)
Copyright (c) 2011-present Felix Geisendörfer, and contributors.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

826
node_modules/formidable/README.md generated vendored Normal file
View file

@ -0,0 +1,826 @@
<p align="center">
<img alt="npm formidable package logo" src="https://raw.githubusercontent.com/node-formidable/formidable/master/logo.png" />
</p>
# formidable [![npm version][npmv-img]][npmv-url] [![MIT license][license-img]][license-url] [![Libera Manifesto][libera-manifesto-img]][libera-manifesto-url] [![Twitter][twitter-img]][twitter-url]
> A Node.js module for parsing form data, especially file uploads.
[![Code style][codestyle-img]][codestyle-url]
[![codecoverage][codecov-img]][codecov-url]
[![linux build status][linux-build-img]][build-url]
[![windows build status][windows-build-img]][build-url]
[![macos build status][macos-build-img]][build-url]
If you have any _how-to_ kind of questions, please read the [Contributing
Guide][contributing-url] and [Code of Conduct][code_of_conduct-url]
documents.<br /> For bug reports and feature requests, [please create an
issue][open-issue-url] or ping [@tunnckoCore / @3a1FcBx0](https://twitter.com/3a1FcBx0)
on Twitter.
[![Conventional Commits][ccommits-img]][ccommits-url]
[![Minimum Required Nodejs][nodejs-img]][npmv-url]
[![Tidelift Subscription][tidelift-img]][tidelift-url]
[![Buy me a Kofi][kofi-img]][kofi-url]
[![Renovate App Status][renovateapp-img]][renovateapp-url]
[![Make A Pull Request][prs-welcome-img]][prs-welcome-url]
This project is [semantically versioned](https://semver.org) and available as
part of the [Tidelift Subscription][tidelift-url] for professional grade
assurances, enhanced support and security.
[Learn more.](https://tidelift.com/subscription/pkg/npm-formidable?utm_source=npm-formidable&utm_medium=referral&utm_campaign=enterprise)
_The maintainers of `formidable` and thousands of other packages are working
with Tidelift to deliver commercial support and maintenance for the Open Source
dependencies you use to build your applications. Save time, reduce risk, and
improve code health, while paying the maintainers of the exact dependencies you
use._
[![][npm-weekly-img]][npmv-url] [![][npm-monthly-img]][npmv-url]
[![][npm-yearly-img]][npmv-url] [![][npm-alltime-img]][npmv-url]
## Project Status: Maintained
_Check [VERSION NOTES](https://github.com/node-formidable/formidable/blob/master/VERSION_NOTES.md) for more information on v1, v2, and v3 plans, NPM dist-tags and branches._
This module was initially developed by
[**@felixge**](https://github.com/felixge) for
[Transloadit](http://transloadit.com/), a service focused on uploading and
encoding images and videos. It has been battle-tested against hundreds of GBs of
file uploads from a large variety of clients, is considered production-ready,
and has been used in production for years.
Currently, we are a few maintainers trying to keep the project going. :) More
contributors are always welcome! :heart: Jump on
[issue #412](https://github.com/felixge/node-formidable/issues/412); it is closed,
but if you are interested we can discuss it and add you, subject to strict
rules such as enabling Two-Factor Auth in your npm and GitHub accounts.
## Highlights
- [Fast (~900-2500 mb/sec)](#benchmarks) & streaming multipart parser
- Automatically writing file uploads to disk (optional, see
[`options.fileWriteStreamHandler`](#options))
- [Plugins API](#useplugin-plugin) - allowing custom parsers and plugins
- Low memory footprint
- Graceful error handling
- Very high test coverage
## Install
This project requires `Node.js >= 10.13`. Install it using
[yarn](https://yarnpkg.com) or [npm](https://npmjs.com).<br /> _We highly
recommend using Yarn if you plan to contribute to this project._
This is a low-level package, and if you're using a high-level framework it _may_
already be included. Check the examples below and the [examples/](https://github.com/node-formidable/formidable/tree/master/examples) folder.
```sh
# v2
npm install formidable
npm install formidable@latest
npm install formidable@v2
# or v3
npm install formidable@v3
```
_**Note:** In the near future, v3 will be published on the `latest` NPM dist-tag. Pre-release versions will continue to be published on the `canary` dist-tag._
## Examples
For more examples look at the `examples/` directory.
### with Node.js http module
Parse an incoming file upload, using
[Node.js's built-in `http` module](https://nodejs.org/api/http.html).
```js
const http = require('http');
const formidable = require('formidable');
const server = http.createServer((req, res) => {
if (req.url === '/api/upload' && req.method.toLowerCase() === 'post') {
// parse a file upload
const form = formidable({ multiples: true });
form.parse(req, (err, fields, files) => {
if (err) {
res.writeHead(err.httpCode || 400, { 'Content-Type': 'text/plain' });
res.end(String(err));
return;
}
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ fields, files }, null, 2));
});
return;
}
// show a file upload form
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end(`
<h2>With Node.js <code>"http"</code> module</h2>
<form action="/api/upload" enctype="multipart/form-data" method="post">
<div>Text field title: <input type="text" name="title" /></div>
<div>File: <input type="file" name="multipleFiles" multiple="multiple" /></div>
<input type="submit" value="Upload" />
</form>
`);
});
server.listen(8080, () => {
console.log('Server listening on http://localhost:8080/ ...');
});
```
### with Express.js
There are multiple ways to do this, but Formidable just needs the Node.js request
stream, so something like the following example should work just fine, without
any third-party [Express.js](https://ghub.now.sh/express) middleware.
Or try the
[examples/with-express.js](https://github.com/node-formidable/formidable/blob/master/examples/with-express.js)
```js
const express = require('express');
const formidable = require('formidable');
const app = express();
app.get('/', (req, res) => {
res.send(`
<h2>With <code>"express"</code> npm package</h2>
<form action="/api/upload" enctype="multipart/form-data" method="post">
<div>Text field title: <input type="text" name="title" /></div>
<div>File: <input type="file" name="someExpressFiles" multiple="multiple" /></div>
<input type="submit" value="Upload" />
</form>
`);
});
app.post('/api/upload', (req, res, next) => {
const form = formidable({ multiples: true });
form.parse(req, (err, fields, files) => {
if (err) {
next(err);
return;
}
res.json({ fields, files });
});
});
app.listen(3000, () => {
console.log('Server listening on http://localhost:3000 ...');
});
```
### with Koa and Formidable
Of course, with [Koa v1, v2 or future v3](https://ghub.now.sh/koa) things
are very similar. You can use `formidable` manually as shown below, or through
the [koa-better-body](https://ghub.now.sh/koa-better-body) package, which uses
`formidable` under the hood and supports more features and different
request bodies; check its documentation for more info.
_Note: this example assumes Koa v2. Be aware that you should pass `ctx.req`,
which is Node.js's request, and **NOT** `ctx.request`, which is Koa's request
object; there is a difference._
```js
const Koa = require('koa');
const formidable = require('formidable');
const app = new Koa();
app.on('error', (err) => {
console.error('server error', err);
});
app.use(async (ctx, next) => {
if (ctx.url === '/api/upload' && ctx.method.toLowerCase() === 'post') {
const form = formidable({ multiples: true });
// not very elegant, but that's for now if you don't want to use `koa-better-body`
// or other middlewares.
await new Promise((resolve, reject) => {
form.parse(ctx.req, (err, fields, files) => {
if (err) {
reject(err);
return;
}
ctx.set('Content-Type', 'application/json');
ctx.status = 200;
ctx.state = { fields, files };
ctx.body = JSON.stringify(ctx.state, null, 2);
resolve();
});
});
await next();
return;
}
// show a file upload form
ctx.set('Content-Type', 'text/html');
ctx.status = 200;
ctx.body = `
<h2>With <code>"koa"</code> npm package</h2>
<form action="/api/upload" enctype="multipart/form-data" method="post">
<div>Text field title: <input type="text" name="title" /></div>
<div>File: <input type="file" name="koaFiles" multiple="multiple" /></div>
<input type="submit" value="Upload" />
</form>
`;
});
app.use((ctx) => {
console.log('The next middleware is called');
console.log('Results:', ctx.state);
});
app.listen(3000, () => {
console.log('Server listening on http://localhost:3000 ...');
});
```
## Benchmarks
The benchmark is quite old and comes from the old codebase, but it is probably
still representative. Previously the numbers were around ~500 mb/sec; after moving
to the new Node.js Streams API it is faster, and you can clearly see the
differences between Node versions.
_Note: much better benchmarking could and should be done in the future._
Benchmarked on 8GB RAM, Xeon X3440 (2.53 GHz, 4 cores, 8 threads)
```
~/github/node-formidable master
nve --parallel 8 10 12 13 node benchmark/bench-multipart-parser.js
⬢ Node 8
1261.08 mb/sec
⬢ Node 10
1113.04 mb/sec
⬢ Node 12
2107.00 mb/sec
⬢ Node 13
2566.42 mb/sec
```
![benchmark January 29th, 2020](./benchmark/2020-01-29_xeon-x3440.png)
## API
### Formidable / IncomingForm
All of the forms shown below are equivalent.
_Please pass [`options`](#options) to the function/constructor; do not assign
them to the instance `form` afterwards._
```js
const formidable = require('formidable');
const form = formidable(options);
// or
const { formidable } = require('formidable');
const form = formidable(options);
// or
const { IncomingForm } = require('formidable');
const form = new IncomingForm(options);
// or
const { Formidable } = require('formidable');
const form = new Formidable(options);
```
### Options
See its defaults in [src/Formidable.js DEFAULT_OPTIONS](./src/Formidable.js)
(the `DEFAULT_OPTIONS` constant). A combined usage sketch follows this list.
- `options.encoding` **{string}** - default `'utf-8'`; sets encoding for
incoming form fields,
- `options.uploadDir` **{string}** - default `os.tmpdir()`; the directory for
placing file uploads in. You can move them later by using `fs.rename()`.
- `options.keepExtensions` **{boolean}** - default `false`; whether or not to
include the extensions of the original files
- `options.allowEmptyFiles` **{boolean}** - default `true`; whether to allow
uploading empty files
- `options.minFileSize` **{number}** - default `1` (1 byte); the minimum size of
an uploaded file.
- `options.maxFileSize` **{number}** - default `200 * 1024 * 1024` (200mb);
limits the size of each uploaded file.
- `options.maxFields` **{number}** - default `1000`; limits the number of fields; set 0 for unlimited
- `options.maxFieldsSize` **{number}** - default `20 * 1024 * 1024` (20mb);
limit the amount of memory all fields together (except files) can allocate in
bytes.
- `options.hashAlgorithm` **{string | false}** - default `false`; include checksums calculated
for incoming files, set this to some hash algorithm, see
[crypto.createHash](https://nodejs.org/api/crypto.html#crypto_crypto_createhash_algorithm_options)
for available algorithms
- `options.fileWriteStreamHandler` **{function}** - default `null`; by default
every parsed file is written to the host machine's file system. The function
should return an instance of a
[Writable stream](https://nodejs.org/api/stream.html#stream_class_stream_writable)
that will receive the uploaded file data. With this option you can fully control
where the uploaded file data will be streamed to.
If you want to write uploaded files to another kind of storage
(AWS S3, Azure Blob Storage, Google Cloud Storage) or to private file storage,
this is the option you're looking for. When this option is defined, the default
behavior of writing the file to the host machine's file system is lost.
- `options.multiples` **{boolean}** - default `false`; when you call the
`.parse` method, the `files` argument (of the callback) will contain arrays of
files for inputs which submit multiple files using the HTML5 `multiple`
attribute. Also, the `fields` argument will contain arrays of values for
fields that have names ending with '[]'.
- `options.filename` **{function}** - default `undefined`; use it to control the
`newFilename`. Must return a string; the result will be joined with `options.uploadDir`.
- `options.filter` **{function}** - default is a function that always returns true.
Use it to filter files before they are uploaded. Must return a boolean.
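Putting a few of these together, a minimal combined sketch (the values shown are illustrative, not recommendations):
```js
const formidable = require('formidable');
const form = formidable({
  uploadDir: `${__dirname}/uploads`, // hypothetical folder; it must already exist
  keepExtensions: true,              // keep .png, .pdf, ... on the stored files
  maxFileSize: 10 * 1024 * 1024,     // 10mb instead of the 200mb default
  hashAlgorithm: 'sha256',           // compute a checksum per file
  multiples: true,                   // collect arrays for <input multiple>
});
```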
#### `options.filename` **{function}** function (name, ext, part, form) -> string
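A minimal sketch of a `filename` function, assuming you simply want a timestamp-prefixed name (the returned string is joined with `options.uploadDir`; `ext` is `''` unless `keepExtensions` is true):
```js
const form = formidable({
  keepExtensions: true,
  filename: (name, ext, part, form) => {
    // `name` and `ext` are derived from the client's original filename
    return `${Date.now()}-${name}${ext}`;
  },
});
```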
_**Note:** If the combined size of the fields, or the size of any single file,
exceeds its limit, an `'error'` event is fired._
```js
// The amount of bytes received for this form so far.
form.bytesReceived;
```
```js
// The expected number of bytes in this form.
form.bytesExpected;
```
#### `options.filter` **{function}** function ({name, originalFilename, mimetype}) -> boolean
**Note:** use an outside variable to cancel all uploads upon the first error (a
sketch follows the example below)
```js
const options = {
filter: function ({name, originalFilename, mimetype}) {
// keep only images
return mimetype && mimetype.includes("image");
}
};
```
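A minimal sketch of the outside-variable approach mentioned in the note above (the variable name is illustrative):
```js
let cancelUploads = false; // lives outside the filter so it persists across parts
const options = {
  filter: function ({ name, originalFilename, mimetype }) {
    // keep only images
    const valid = Boolean(mimetype && mimetype.includes('image'));
    if (!valid) {
      // remember the first failure so every following part is rejected too
      cancelUploads = true;
    }
    return valid && !cancelUploads;
  },
};
```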
### .parse(request, callback)
Parses an incoming Node.js `request` containing form data. If `callback` is
provided, all fields and files are collected and passed to the callback.
```js
const formidable = require('formidable');
const form = formidable({ multiples: true, uploadDir: __dirname });
form.parse(req, (err, fields, files) => {
console.log('fields:', fields);
console.log('files:', files);
});
```
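If you prefer `async`/`await`, a minimal sketch that wraps the callback-based `.parse()` in a Promise (this wrapper is not part of the formidable API; it is purely illustrative):
```js
const parseForm = (form, req) =>
  new Promise((resolve, reject) => {
    form.parse(req, (err, fields, files) => {
      if (err) {
        reject(err);
        return;
      }
      resolve({ fields, files });
    });
  });
// usage inside an async request handler:
// const { fields, files } = await parseForm(formidable({ multiples: true }), req);
```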
You may overwrite this method if you are interested in directly accessing the
multipart stream. Doing so will disable any `'field'` / `'file'` events
processing which would occur otherwise, making you fully responsible for
handling the processing.
About `uploadDir`, given the following directory structure
```
project-name
├── src
│ └── server.js
└── uploads
└── image.jpg
```
`__dirname` would be the same directory as the source file itself (src)
```js
`${__dirname}/../uploads`
```
to put files in uploads.
Omitting `__dirname` would make the path relative to the current working directory, which is only the same thing if server.js is launched from `src`, but not if it is launched from `project-name`.
`null` will use the default, which is `os.tmpdir()`.
Note: If the directory does not exist, the uploaded files are __silently discarded__. To make sure it exists:
```js
import {createNecessaryDirectoriesSync} from "filesac";
const uploadPath = `${__dirname}/../uploads`;
createNecessaryDirectoriesSync(`${uploadPath}/x`);
```
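Alternatively, a minimal sketch that uses only Node's built-in `fs` module instead of a third-party helper:
```js
const fs = require('fs');
const uploadPath = `${__dirname}/../uploads`;
// recursive: true makes this a no-op if the directory already exists
fs.mkdirSync(uploadPath, { recursive: true });
```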
In the example below, we listen on a couple of events and redirect them to the
`data` listener, so you can do whatever you choose there, based on whether it is
before the file has been emitted, a header value, a header name, a field, a
file, and so on.
Alternatively, you can just override `form.onPart`, as shown a bit
later.
```js
form.once('error', console.error);
form.on('fileBegin', (formname, file) => {
form.emit('data', { name: 'fileBegin', formname, value: file });
});
form.on('file', (formname, file) => {
form.emit('data', { name: 'file', formname, value: file });
});
form.on('field', (fieldName, fieldValue) => {
form.emit('data', { name: 'field', key: fieldName, value: fieldValue });
});
form.once('end', () => {
console.log('Done!');
});
// If you want to customize whatever you want...
form.on('data', ({ name, key, value, buffer, start, end, formname, ...more }) => {
if (name === 'partBegin') {
}
if (name === 'partData') {
}
if (name === 'headerField') {
}
if (name === 'headerValue') {
}
if (name === 'headerEnd') {
}
if (name === 'headersEnd') {
}
if (name === 'field') {
console.log('field name:', key);
console.log('field value:', value);
}
if (name === 'file') {
console.log('file:', formname, value);
}
if (name === 'fileBegin') {
console.log('fileBegin:', formname, value);
}
});
```
### .use(plugin: Plugin)
A method that allows you to extend the Formidable library. By default we include
4 plugins, which essentially are adapters that plug in the different built-in parsers.
**The plugins added by this method are always enabled.**
_See [src/plugins/](./src/plugins/) for more detailed look on default plugins._
The `plugin` param has the following signature:
```typescript
function(formidable: Formidable, options: Options): void;
```
The architecture is simple. The `plugin` is a function that is passed the
Formidable instance (the `form` across the README examples) and the options.
**Note:** the plugin function's `this` context is also the same instance.
```js
const formidable = require('formidable');
const form = formidable({ keepExtensions: true });
form.use((self, options) => {
// self === this === form
console.log('woohoo, custom plugin');
// do your stuff; check `src/plugins` for inspiration
});
form.parse(req, (error, fields, files) => {
console.log('done!');
});
```
**Important to note:** inside a plugin, `this.options`, `self.options` and
`options` MAY or MAY NOT be the same. The general best practice is to always use
`this`, so you can later test your plugin independently and more easily.
If you want to disable some parsing capabilities of Formidable, you can disable
the plugin which corresponds to the parser. For example, if you want to disable
multipart parsing (so the [src/parsers/Multipart.js](./src/parsers/Multipart.js)
which is used in [src/plugins/multipart.js](./src/plugins/multipart.js)), then
you can remove it from the `options.enabledPlugins`, like so
```js
const { Formidable } = require('formidable');
const form = new Formidable({
hashAlgorithm: 'sha1',
enabledPlugins: ['octetstream', 'querystring', 'json'],
});
```
**Be aware** that the order _MAY_ be important too. The names correspond 1:1 to
the files in the [src/plugins/](./src/plugins) folder.
Pull requests for new built-in plugins MAY be accepted - for example, a more
advanced querystring parser. Add your plugin as a new file in the `src/plugins/`
folder (lowercased) and follow how the other plugins are made.
### form.onPart
If you want Formidable to handle only certain parts for you, you can do
something like the following. See
[#387](https://github.com/node-formidable/node-formidable/issues/387) for
inspiration; you can, for example, validate the mime-type (a sketch follows the
examples below).
```js
const form = formidable();
form.onPart = (part) => {
part.on('data', (buffer) => {
// do whatever you want here
});
};
```
For example, force Formidable to be used only on non-file "parts" (i.e., HTML
fields):
```js
const form = formidable();
form.onPart = function (part) {
// let formidable handle only non-file parts
if (part.originalFilename === '' || !part.mimetype) {
// used internally, please do not override!
form._handlePart(part);
}
};
```
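And a hedged sketch of the mime-type validation idea referenced above (issue #387): only image parts are handed back to formidable, everything else is simply not consumed. Whether silently skipping a part is appropriate depends on your use case.
```js
const form = formidable();
form.onPart = (part) => {
  // files whose mimetype is not an image are skipped entirely
  if (part.mimetype && !part.mimetype.includes('image')) {
    return;
  }
  form._handlePart(part); // used internally, please do not override!
};
```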
### File
```ts
export interface File {
// The size of the uploaded file in bytes.
// If the file is still being uploaded (see the `'fileBegin'` event),
// this property says how many bytes of the file have been written to disk so far.
size: number;
// The path this file is being written to. You can modify this in the `'fileBegin'` event in
// case you are unhappy with the way formidable generates a temporary path for your files.
filepath: string;
// The name this file had according to the uploading client.
originalFilename: string | null;
// Calculated based on the options provided.
newFilename: string | null;
// The mime type of this file, according to the uploading client.
mimetype: string | null;
// A Date object (or `null`) containing the time this file was last written to.
// Mostly here for compatibility with the [W3C File API Draft](http://dev.w3.org/2006/webapi/FileAPI/).
mtime: Date | null;
hashAlgorithm: false | 'sha1' | 'md5' | 'sha256';
// If `options.hashAlgorithm` was set, you can read the hex digest out of this property
// (at the end it will be a string).
hash: string | object | null;
}
```
#### file.toJSON()
This method returns a JSON representation of the file, allowing you to
`JSON.stringify()` the file, which is useful for logging and for responding to
requests.
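A minimal usage sketch: `JSON.stringify()` picks up `toJSON()` automatically.
```js
form.on('file', (formname, file) => {
  // logs the JSON representation produced by file.toJSON()
  console.log(JSON.stringify(file, null, 2));
});
```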
### Events
#### `'progress'`
Emitted after each incoming chunk of data that has been parsed. Can be used to
roll your own progress bar.
```js
form.on('progress', (bytesReceived, bytesExpected) => {});
```
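For example, a minimal percentage sketch (guarding against an unknown `bytesExpected`):
```js
form.on('progress', (bytesReceived, bytesExpected) => {
  if (bytesExpected) {
    const percent = ((bytesReceived / bytesExpected) * 100).toFixed(2);
    console.log(`upload progress: ${percent}%`);
  }
});
```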
#### `'field'`
Emitted whenever a field / value pair has been received.
```js
form.on('field', (name, value) => {});
```
#### `'fileBegin'`
Emitted whenever a new file is detected in the upload stream. Use this event if
you want to stream the file to somewhere else while buffering the upload on the
file system.
```js
form.on('fileBegin', (formName, file) => {
// accessible here
// formName the name in the form (<input name="thisname" type="file">) or http filename for octetstream
// file.originalFilename http filename or null if there was a parsing error
// file.newFilename generated hexoid or what options.filename returned
// file.filepath default path name as per options.uploadDir and options.filename
// file.filepath = CUSTOM_PATH // to change the final path
});
```
#### `'file'`
Emitted whenever a field / file pair has been received. `file` is an instance of
`File`.
```js
form.on('file', (formname, file) => {
// same as fileBegin, except
// it is too late to change file.filepath
// file.hash is available if options.hashAlgorithm was used
});
```
#### `'error'`
Emitted when there is an error processing the incoming form. A request that
experiences an error is automatically paused; you will have to manually call
`request.resume()` if you want the request to continue firing `'data'` events.
May have `error.httpCode` and `error.code` attached.
```js
form.on('error', (err) => {});
```
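A minimal sketch that turns the error into an HTTP response (assumes the `res` object from the plain `http` example earlier):
```js
form.on('error', (err) => {
  res.writeHead(err.httpCode || 400, { 'Content-Type': 'text/plain' });
  res.end(String(err));
});
```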
#### `'aborted'`
Emitted when the request was aborted by the user. Right now this can be due to a
'timeout' or 'close' event on the socket. After this event is emitted, an
`error` event will follow. In the future there will be a separate 'timeout'
event (needs a change in the node core).
```js
form.on('aborted', () => {});
```
#### `'end'`
Emitted when the entire request has been received, and all contained files have
finished flushing to disk. This is a great place for you to send your response.
```js
form.on('end', () => {});
```
## Changelog
[./CHANGELOG.md](./CHANGELOG.md)
## Ports & Credits
- [multipart-parser](http://github.com/FooBarWidget/multipart-parser): a C++
parser based on formidable
- [Ryan Dahl](http://twitter.com/ryah) for his work on
[http-parser](http://github.com/ry/http-parser) which heavily inspired the
initial `multipart_parser.js`.
## Contributing
If the documentation is unclear or has a typo, please click on the page's `Edit`
button (pencil icon) and suggest a correction. If you would like to help us fix
a bug or add a new feature, please check our [Contributing
Guide][contributing-url]. Pull requests are welcome!
Thanks goes to these wonderful people
([emoji key](https://allcontributors.org/docs/en/emoji-key)):
<!-- ALL-CONTRIBUTORS-LIST:START -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tr>
<td align="center"><a href="https://twitter.com/felixge"><img src="https://avatars3.githubusercontent.com/u/15000?s=460&v=4" width="100px;" alt=""/><br /><sub><b>Felix Geisendörfer</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/commits?author=felixge" title="Code">💻</a> <a href="#design-felixge" title="Design">🎨</a> <a href="#ideas-felixge" title="Ideas, Planning, & Feedback">🤔</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=felixge" title="Documentation">📖</a></td>
<td align="center"><a href="https://tunnckoCore.com"><img src="https://avatars3.githubusercontent.com/u/5038030?v=4" width="100px;" alt=""/><br /><sub><b>Charlike Mike Reagent</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/issues?q=author%3AtunnckoCore" title="Bug reports">🐛</a> <a href="#infra-tunnckoCore" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a> <a href="#design-tunnckoCore" title="Design">🎨</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=tunnckoCore" title="Code">💻</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=tunnckoCore" title="Documentation">📖</a> <a href="#example-tunnckoCore" title="Examples">💡</a> <a href="#ideas-tunnckoCore" title="Ideas, Planning, & Feedback">🤔</a> <a href="#maintenance-tunnckoCore" title="Maintenance">🚧</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=tunnckoCore" title="Tests">⚠️</a></td>
<td align="center"><a href="https://github.com/kedarv"><img src="https://avatars1.githubusercontent.com/u/1365665?v=4" width="100px;" alt=""/><br /><sub><b>Kedar</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/commits?author=kedarv" title="Code">💻</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=kedarv" title="Tests">⚠️</a> <a href="#question-kedarv" title="Answering Questions">💬</a> <a href="https://github.com/node-formidable/node-formidable/issues?q=author%3Akedarv" title="Bug reports">🐛</a></td>
<td align="center"><a href="https://github.com/GrosSacASac"><img src="https://avatars0.githubusercontent.com/u/5721194?v=4" width="100px;" alt=""/><br /><sub><b>Walle Cyril</b></sub></a><br /><a href="#question-GrosSacASac" title="Answering Questions">💬</a> <a href="https://github.com/node-formidable/node-formidable/issues?q=author%3AGrosSacASac" title="Bug reports">🐛</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=GrosSacASac" title="Code">💻</a> <a href="#financial-GrosSacASac" title="Financial">💵</a> <a href="#ideas-GrosSacASac" title="Ideas, Planning, & Feedback">🤔</a> <a href="#maintenance-GrosSacASac" title="Maintenance">🚧</a></td>
<td align="center"><a href="https://github.com/xarguments"><img src="https://avatars2.githubusercontent.com/u/40522463?v=4" width="100px;" alt=""/><br /><sub><b>Xargs</b></sub></a><br /><a href="#question-xarguments" title="Answering Questions">💬</a> <a href="https://github.com/node-formidable/node-formidable/issues?q=author%3Axarguments" title="Bug reports">🐛</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=xarguments" title="Code">💻</a> <a href="#maintenance-xarguments" title="Maintenance">🚧</a></td>
<td align="center"><a href="https://github.com/Amit-A"><img src="https://avatars1.githubusercontent.com/u/7987238?v=4" width="100px;" alt=""/><br /><sub><b>Amit-A</b></sub></a><br /><a href="#question-Amit-A" title="Answering Questions">💬</a> <a href="https://github.com/node-formidable/node-formidable/issues?q=author%3AAmit-A" title="Bug reports">🐛</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=Amit-A" title="Code">💻</a></td>
</tr>
<tr>
<td align="center"><a href="https://charmander.me/"><img src="https://avatars1.githubusercontent.com/u/1889843?v=4" width="100px;" alt=""/><br /><sub><b>Charmander</b></sub></a><br /><a href="#question-charmander" title="Answering Questions">💬</a> <a href="https://github.com/node-formidable/node-formidable/issues?q=author%3Acharmander" title="Bug reports">🐛</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=charmander" title="Code">💻</a> <a href="#ideas-charmander" title="Ideas, Planning, & Feedback">🤔</a> <a href="#maintenance-charmander" title="Maintenance">🚧</a></td>
<td align="center"><a href="https://twitter.com/dylan_piercey"><img src="https://avatars2.githubusercontent.com/u/4985201?v=4" width="100px;" alt=""/><br /><sub><b>Dylan Piercey</b></sub></a><br /><a href="#ideas-DylanPiercey" title="Ideas, Planning, & Feedback">🤔</a></td>
<td align="center"><a href="http://ochrona.jawne.info.pl"><img src="https://avatars1.githubusercontent.com/u/3618479?v=4" width="100px;" alt=""/><br /><sub><b>Adam Dobrawy</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/issues?q=author%3Aad-m" title="Bug reports">🐛</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=ad-m" title="Documentation">📖</a></td>
<td align="center"><a href="https://github.com/amitrohatgi"><img src="https://avatars3.githubusercontent.com/u/12177021?v=4" width="100px;" alt=""/><br /><sub><b>amitrohatgi</b></sub></a><br /><a href="#ideas-amitrohatgi" title="Ideas, Planning, & Feedback">🤔</a></td>
<td align="center"><a href="https://github.com/fengxinming"><img src="https://avatars2.githubusercontent.com/u/6262382?v=4" width="100px;" alt=""/><br /><sub><b>Jesse Feng</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/issues?q=author%3Afengxinming" title="Bug reports">🐛</a></td>
<td align="center"><a href="https://qtmsheep.com"><img src="https://avatars1.githubusercontent.com/u/7271496?v=4" width="100px;" alt=""/><br /><sub><b>Nathanael Demacon</b></sub></a><br /><a href="#question-quantumsheep" title="Answering Questions">💬</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=quantumsheep" title="Code">💻</a> <a href="https://github.com/node-formidable/node-formidable/pulls?q=is%3Apr+reviewed-by%3Aquantumsheep" title="Reviewed Pull Requests">👀</a></td>
</tr>
<tr>
<td align="center"><a href="https://github.com/MunMunMiao"><img src="https://avatars1.githubusercontent.com/u/18216142?v=4" width="100px;" alt=""/><br /><sub><b>MunMunMiao</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/issues?q=author%3AMunMunMiao" title="Bug reports">🐛</a></td>
<td align="center"><a href="https://github.com/gabipetrovay"><img src="https://avatars0.githubusercontent.com/u/1170398?v=4" width="100px;" alt=""/><br /><sub><b>Gabriel Petrovay</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/issues?q=author%3Agabipetrovay" title="Bug reports">🐛</a> <a href="https://github.com/node-formidable/node-formidable/commits?author=gabipetrovay" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/Elzair"><img src="https://avatars0.githubusercontent.com/u/2352818?v=4" width="100px;" alt=""/><br /><sub><b>Philip Woods</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/commits?author=Elzair" title="Code">💻</a> <a href="#ideas-Elzair" title="Ideas, Planning, & Feedback">🤔</a></td>
<td align="center"><a href="https://github.com/dmolim"><img src="https://avatars2.githubusercontent.com/u/7090374?v=4" width="100px;" alt=""/><br /><sub><b>Dmitry Ivonin</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/commits?author=dmolim" title="Documentation">📖</a></td>
<td align="center"><a href="https://audiobox.fm"><img src="https://avatars1.githubusercontent.com/u/12844?v=4" width="100px;" alt=""/><br /><sub><b>Claudio Poli</b></sub></a><br /><a href="https://github.com/node-formidable/node-formidable/commits?author=masterkain" title="Code">💻</a></td>
</tr>
</table>
<!-- markdownlint-enable -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->
From a [Felix blog post](https://felixge.de/2013/03/11/the-pull-request-hack/):
- [Sven Lito](https://github.com/svnlto) for fixing bugs and merging patches
- [egirshov](https://github.com/egirshov) for contributing many improvements to the node-formidable multipart parser
- [Andrew Kelley](https://github.com/superjoe30) for also helping with fixing bugs and making improvements
- [Mike Frey](https://github.com/mikefrey) for contributing JSON support
## License
Formidable is licensed under the [MIT License][license-url].
<!-- badges -->
<!-- prettier-ignore-start -->
[codestyle-url]: https://github.com/airbnb/javascript
[codestyle-img]: https://badgen.net/badge/code%20style/airbnb%20%2B%20prettier/ff5a5f?icon=airbnb&cache=300
[codecov-url]: https://codecov.io/gh/node-formidable/formidable
[codecov-img]: https://badgen.net/codecov/c/github/node-formidable/formidable/master?icon=codecov
[npmv-canary-img]: https://badgen.net/npm/v/formidable/canary?icon=npm
[npmv-dev-img]: https://badgen.net/npm/v/formidable/dev?icon=npm
[npmv-img]: https://badgen.net/npm/v/formidable?icon=npm
[npmv-url]: https://npmjs.com/package/formidable
[license-img]: https://badgen.net/npm/license/formidable
[license-url]: https://github.com/node-formidable/formidable/blob/master/LICENSE
[chat-img]: https://badgen.net/badge/chat/on%20gitter/46BC99?icon=gitter
[chat-url]: https://gitter.im/node-formidable/Lobby
[libera-manifesto-url]: https://liberamanifesto.com
[libera-manifesto-img]: https://badgen.net/badge/libera/manifesto/grey
[renovateapp-url]: https://renovatebot.com
[renovateapp-img]: https://badgen.net/badge/renovate/enabled/green?cache=300
[prs-welcome-img]: https://badgen.net/badge/PRs/welcome/green?cache=300
[prs-welcome-url]: http://makeapullrequest.com
[twitter-url]: https://twitter.com/3a1fcBx0
[twitter-img]: https://badgen.net/twitter/follow/3a1fcBx0?icon=twitter&color=1da1f2&cache=300
[npm-weekly-img]: https://badgen.net/npm/dw/formidable?icon=npm&cache=300
[npm-monthly-img]: https://badgen.net/npm/dm/formidable?icon=npm&cache=300
[npm-yearly-img]: https://badgen.net/npm/dy/formidable?icon=npm&cache=300
[npm-alltime-img]: https://badgen.net/npm/dt/formidable?icon=npm&cache=300&label=total%20downloads
[nodejs-img]: https://badgen.net/badge/node/>=%2010.13/green?cache=300
[ccommits-url]: https://conventionalcommits.org/
[ccommits-img]: https://badgen.net/badge/conventional%20commits/v1.0.0/green?cache=300
[contributing-url]: https://github.com/node-formidable/.github/blob/master/CONTRIBUTING.md
[code_of_conduct-url]: https://github.com/node-formidable/.github/blob/master/CODE_OF_CONDUCT.md
[open-issue-url]: https://github.com/node-formidable/formidable/issues/new
[tidelift-url]: https://tidelift.com/subscription/pkg/npm-formidable?utm_source=npm-formidable&utm_medium=referral&utm_campaign=enterprise
[tidelift-img]: https://badgen.net/badge/tidelift/subscription/4B5168?labelColor=F6914D
[kofi-url]: https://ko-fi.com/tunnckoCore/commissions
[kofi-img]: https://badgen.net/badge/ko-fi/support/29abe0c2?cache=300&icon=https://rawcdn.githack.com/tunnckoCore/badgen-icons/f8264c6414e0bec449dd86f2241d50a9b89a1203/icons/kofi.svg
[linux-build-img]: https://badgen.net/github/checks/node-formidable/formidable/master/ubuntu?cache=300&label=linux%20build&icon=github
[macos-build-img]: https://badgen.net/github/checks/node-formidable/formidable/master/macos?cache=300&label=macos%20build&icon=github
[windows-build-img]: https://badgen.net/github/checks/node-formidable/formidable/master/windows?cache=300&label=windows%20build&icon=github
[build-url]: https://github.com/node-formidable/formidable/actions?query=workflow%3Anodejs
<!-- prettier-ignore-end -->

130
node_modules/formidable/package.json generated vendored Normal file
View file

@ -0,0 +1,130 @@
{
"_from": "formidable",
"_id": "formidable@2.1.1",
"_inBundle": false,
"_integrity": "sha512-0EcS9wCFEzLvfiks7omJ+SiYJAiD+TzK4Pcw1UlUoGnhUxDcMKjt0P7x8wEb0u6OHu8Nb98WG3nxtlF5C7bvUQ==",
"_location": "/formidable",
"_phantomChildren": {},
"_requested": {
"type": "tag",
"registry": true,
"raw": "formidable",
"name": "formidable",
"escapedName": "formidable",
"rawSpec": "",
"saveSpec": null,
"fetchSpec": "latest"
},
"_requiredBy": [
"#USER",
"/"
],
"_resolved": "https://registry.npmjs.org/formidable/-/formidable-2.1.1.tgz",
"_shasum": "81269cbea1a613240049f5f61a9d97731517414f",
"_spec": "formidable",
"_where": "/home/ubuntu/formidable",
"bugs": {
"url": "https://github.com/node-formidable/formidable/issues"
},
"bundleDependencies": false,
"commitlint": {
"extends": [
"@commitlint/config-conventional"
]
},
"dependencies": {
"dezalgo": "^1.0.4",
"hexoid": "^1.0.0",
"once": "^1.4.0",
"qs": "^6.11.0"
},
"deprecated": false,
"description": "A node.js module for parsing form data, especially file uploads.",
"devDependencies": {
"@commitlint/cli": "8.3.5",
"@commitlint/config-conventional": "8.3.4",
"@tunnckocore/prettier-config": "1.3.8",
"del-cli": "3.0.0",
"eslint": "6.8.0",
"eslint-config-airbnb-base": "14.1.0",
"eslint-config-prettier": "6.11.0",
"eslint-plugin-import": "2.20.2",
"eslint-plugin-prettier": "3.1.3",
"express": "4.17.1",
"husky": "4.2.5",
"jest": "25.4.0",
"koa": "2.11.0",
"lint-staged": "10.2.7",
"make-dir-cli": "2.0.0",
"nyc": "15.0.1",
"prettier": "2.0.5",
"prettier-plugin-pkgjson": "0.2.8",
"request": "2.88.2",
"supertest": "4.0.2"
},
"files": [
"src"
],
"funding": "https://ko-fi.com/tunnckoCore/commissions",
"homepage": "https://github.com/node-formidable/formidable",
"husky": {
"hooks": {
"pre-commit": "git status --porcelain && yarn lint-staged",
"commit-msg": "yarn commitlint -E HUSKY_GIT_PARAMS"
}
},
"jest": {
"verbose": true
},
"keywords": [
"multipart",
"form",
"data",
"querystring",
"www",
"json",
"ulpoad",
"file"
],
"license": "MIT",
"lint-staged": {
"!*.{js,jsx,ts,tsx}": [
"yarn run fmt:prepare"
],
"*.{js,jsx,ts,tsx}": [
"yarn run lint"
]
},
"main": "./src/index.js",
"name": "formidable",
"publishConfig": {
"access": "public",
"tag": "latest"
},
"renovate": {
"extends": [
"@tunnckocore",
":pinAllExceptPeerDependencies"
]
},
"repository": {
"type": "git",
"url": "git+https://github.com/node-formidable/formidable.git"
},
"scripts": {
"bench": "node benchmark",
"fmt": "yarn run fmt:prepare '**/*'",
"fmt:prepare": "prettier --write",
"lint": "yarn run lint:prepare .",
"lint:prepare": "eslint --cache --fix --quiet --format codeframe",
"postreinstall": "yarn setup",
"pretest": "del-cli ./test/tmp && make-dir ./test/tmp",
"pretest:ci": "yarn run pretest",
"reinstall": "del-cli ./node_modules ./yarn.lock",
"setup": "yarn",
"test": "jest --coverage",
"test:ci": "nyc jest --coverage",
"test:jest": "jest --coverage"
},
"version": "2.1.1"
}

617
node_modules/formidable/src/Formidable.js generated vendored Normal file
View file

@ -0,0 +1,617 @@
/* eslint-disable class-methods-use-this */
/* eslint-disable no-underscore-dangle */
'use strict';
const os = require('os');
const path = require('path');
const hexoid = require('hexoid');
const once = require('once');
const dezalgo = require('dezalgo');
const { EventEmitter } = require('events');
const { StringDecoder } = require('string_decoder');
const qs = require('qs');
const toHexoId = hexoid(25);
const DEFAULT_OPTIONS = {
maxFields: 1000,
maxFieldsSize: 20 * 1024 * 1024,
maxFileSize: 200 * 1024 * 1024,
minFileSize: 1,
allowEmptyFiles: true,
keepExtensions: false,
encoding: 'utf-8',
hashAlgorithm: false,
uploadDir: os.tmpdir(),
multiples: false,
enabledPlugins: ['octetstream', 'querystring', 'multipart', 'json'],
fileWriteStreamHandler: null,
defaultInvalidName: 'invalid-name',
filter: function () {
return true;
},
};
const PersistentFile = require('./PersistentFile');
const VolatileFile = require('./VolatileFile');
const DummyParser = require('./parsers/Dummy');
const MultipartParser = require('./parsers/Multipart');
const errors = require('./FormidableError.js');
const { FormidableError } = errors;
function hasOwnProp(obj, key) {
return Object.prototype.hasOwnProperty.call(obj, key);
}
class IncomingForm extends EventEmitter {
constructor(options = {}) {
super();
this.options = { ...DEFAULT_OPTIONS, ...options };
const dir = path.resolve(
this.options.uploadDir || this.options.uploaddir || os.tmpdir(),
);
this.uploaddir = dir;
this.uploadDir = dir;
// initialize with null
[
'error',
'headers',
'type',
'bytesExpected',
'bytesReceived',
'_parser',
].forEach((key) => {
this[key] = null;
});
this._setUpRename();
this._flushing = 0;
this._fieldsSize = 0;
this._fileSize = 0;
this._plugins = [];
this.openedFiles = [];
this.options.enabledPlugins = []
.concat(this.options.enabledPlugins)
.filter(Boolean);
if (this.options.enabledPlugins.length === 0) {
throw new FormidableError(
'expect at least 1 enabled builtin plugin, see options.enabledPlugins',
errors.missingPlugin,
);
}
this.options.enabledPlugins.forEach((pluginName) => {
const plgName = pluginName.toLowerCase();
// eslint-disable-next-line import/no-dynamic-require, global-require
this.use(require(path.join(__dirname, 'plugins', `${plgName}.js`)));
});
this._setUpMaxFields();
}
use(plugin) {
if (typeof plugin !== 'function') {
throw new FormidableError(
'.use: expect `plugin` to be a function',
errors.pluginFunction,
);
}
this._plugins.push(plugin.bind(this));
return this;
}
parse(req, cb) {
this.pause = () => {
try {
req.pause();
} catch (err) {
// the stream was destroyed
if (!this.ended) {
// before it was completed, crash & burn
this._error(err);
}
return false;
}
return true;
};
this.resume = () => {
try {
req.resume();
} catch (err) {
// the stream was destroyed
if (!this.ended) {
// before it was completed, crash & burn
this._error(err);
}
return false;
}
return true;
};
// Setup callback first, so we don't miss anything from data events emitted immediately.
if (cb) {
const callback = once(dezalgo(cb));
const fields = {};
let mockFields = '';
const files = {};
this.on('field', (name, value) => {
if (
this.options.multiples &&
(this.type === 'multipart' || this.type === 'urlencoded')
) {
const mObj = { [name]: value };
mockFields = mockFields
? `${mockFields}&${qs.stringify(mObj)}`
: `${qs.stringify(mObj)}`;
} else {
fields[name] = value;
}
});
this.on('file', (name, file) => {
// TODO: too much nesting
if (this.options.multiples) {
if (hasOwnProp(files, name)) {
if (!Array.isArray(files[name])) {
files[name] = [files[name]];
}
files[name].push(file);
} else {
files[name] = file;
}
} else {
files[name] = file;
}
});
this.on('error', (err) => {
callback(err, fields, files);
});
this.on('end', () => {
if (this.options.multiples) {
Object.assign(fields, qs.parse(mockFields));
}
callback(null, fields, files);
});
}
// Parse headers and setup the parser, ready to start listening for data.
this.writeHeaders(req.headers);
// Start listening for data.
req
.on('error', (err) => {
this._error(err);
})
.on('aborted', () => {
this.emit('aborted');
this._error(new FormidableError('Request aborted', errors.aborted));
})
.on('data', (buffer) => {
try {
this.write(buffer);
} catch (err) {
this._error(err);
}
})
.on('end', () => {
if (this.error) {
return;
}
if (this._parser) {
this._parser.end();
}
this._maybeEnd();
});
return this;
}
writeHeaders(headers) {
this.headers = headers;
this._parseContentLength();
this._parseContentType();
if (!this._parser) {
this._error(
new FormidableError(
'no parser found',
errors.noParser,
415, // Unsupported Media Type
),
);
return;
}
this._parser.once('error', (error) => {
this._error(error);
});
}
write(buffer) {
if (this.error) {
return null;
}
if (!this._parser) {
this._error(
new FormidableError('uninitialized parser', errors.uninitializedParser),
);
return null;
}
this.bytesReceived += buffer.length;
this.emit('progress', this.bytesReceived, this.bytesExpected);
this._parser.write(buffer);
return this.bytesReceived;
}
pause() {
// this does nothing, unless overwritten in IncomingForm.parse
return false;
}
resume() {
// this does nothing, unless overwritten in IncomingForm.parse
return false;
}
onPart(part) {
// this method can be overwritten by the user
this._handlePart(part);
}
_handlePart(part) {
if (part.originalFilename && typeof part.originalFilename !== 'string') {
this._error(
new FormidableError(
`the part.originalFilename should be string when it exists`,
errors.filenameNotString,
),
);
return;
}
// This MUST check exactly for undefined. You can not change it to !part.originalFilename.
// todo: uncomment when switch tests to Jest
// console.log(part);
// ? NOTE(@tunnckocore): no it can be any falsey value, it most probably depends on what's returned
// from somewhere else. Where recently I changed the return statements
// and such thing because code style
// ? NOTE(@tunnckocore): or even better, if there is no mimetype, then it's for sure a field
// ? NOTE(@tunnckocore): originalFilename is an empty string when a field?
if (!part.mimetype) {
let value = '';
const decoder = new StringDecoder(
part.transferEncoding || this.options.encoding,
);
part.on('data', (buffer) => {
this._fieldsSize += buffer.length;
if (this._fieldsSize > this.options.maxFieldsSize) {
this._error(
new FormidableError(
`options.maxFieldsSize (${this.options.maxFieldsSize} bytes) exceeded, received ${this._fieldsSize} bytes of field data`,
errors.maxFieldsSizeExceeded,
413, // Payload Too Large
),
);
return;
}
value += decoder.write(buffer);
});
part.on('end', () => {
this.emit('field', part.name, value);
});
return;
}
if (!this.options.filter(part)) {
return;
}
this._flushing += 1;
const newFilename = this._getNewName(part);
const filepath = this._joinDirectoryName(newFilename);
const file = this._newFile({
newFilename,
filepath,
originalFilename: part.originalFilename,
mimetype: part.mimetype,
});
file.on('error', (err) => {
this._error(err);
});
this.emit('fileBegin', part.name, file);
file.open();
this.openedFiles.push(file);
part.on('data', (buffer) => {
this._fileSize += buffer.length;
if (this._fileSize < this.options.minFileSize) {
this._error(
new FormidableError(
`options.minFileSize (${this.options.minFileSize} bytes) inferior, received ${this._fileSize} bytes of file data`,
errors.smallerThanMinFileSize,
400,
),
);
return;
}
if (this._fileSize > this.options.maxFileSize) {
this._error(
new FormidableError(
`options.maxFileSize (${this.options.maxFileSize} bytes) exceeded, received ${this._fileSize} bytes of file data`,
errors.biggerThanMaxFileSize,
413,
),
);
return;
}
if (buffer.length === 0) {
return;
}
this.pause();
file.write(buffer, () => {
this.resume();
});
});
part.on('end', () => {
if (!this.options.allowEmptyFiles && this._fileSize === 0) {
this._error(
new FormidableError(
`options.allowEmptyFiles is false, file size should be greater than 0`,
errors.noEmptyFiles,
400,
),
);
return;
}
file.end(() => {
this._flushing -= 1;
this.emit('file', part.name, file);
this._maybeEnd();
});
});
}
// eslint-disable-next-line max-statements
_parseContentType() {
if (this.bytesExpected === 0) {
this._parser = new DummyParser(this, this.options);
return;
}
if (!this.headers['content-type']) {
this._error(
new FormidableError(
'bad content-type header, no content-type',
errors.missingContentType,
400,
),
);
return;
}
const results = [];
const _dummyParser = new DummyParser(this, this.options);
// eslint-disable-next-line no-plusplus
for (let idx = 0; idx < this._plugins.length; idx++) {
const plugin = this._plugins[idx];
let pluginReturn = null;
try {
pluginReturn = plugin(this, this.options) || this;
} catch (err) {
// directly throw from the `form.parse` method;
// there is no other better way, except a handle through options
const error = new FormidableError(
`plugin on index ${idx} failed with: ${err.message}`,
errors.pluginFailed,
500,
);
error.idx = idx;
throw error;
}
Object.assign(this, pluginReturn);
// todo: use Set/Map and pass plugin name instead of the `idx` index
this.emit('plugin', idx, pluginReturn);
results.push(pluginReturn);
}
this.emit('pluginsResults', results);
// NOTE: probably not needed, because we check options.enabledPlugins in the constructor
// if (results.length === 0 /* && results.length !== this._plugins.length */) {
// this._error(
// new Error(
// `bad content-type header, unknown content-type: ${this.headers['content-type']}`,
// ),
// );
// }
}
_error(err, eventName = 'error') {
// if (!err && this.error) {
// this.emit('error', this.error);
// return;
// }
if (this.error || this.ended) {
return;
}
this.error = err;
this.emit(eventName, err);
if (Array.isArray(this.openedFiles)) {
this.openedFiles.forEach((file) => {
file.destroy();
});
}
}
_parseContentLength() {
this.bytesReceived = 0;
if (this.headers['content-length']) {
this.bytesExpected = parseInt(this.headers['content-length'], 10);
} else if (this.headers['transfer-encoding'] === undefined) {
this.bytesExpected = 0;
}
if (this.bytesExpected !== null) {
this.emit('progress', this.bytesReceived, this.bytesExpected);
}
}
_newParser() {
return new MultipartParser(this.options);
}
_newFile({ filepath, originalFilename, mimetype, newFilename }) {
return this.options.fileWriteStreamHandler
? new VolatileFile({
newFilename,
filepath,
originalFilename,
mimetype,
createFileWriteStream: this.options.fileWriteStreamHandler,
hashAlgorithm: this.options.hashAlgorithm,
})
: new PersistentFile({
newFilename,
filepath,
originalFilename,
mimetype,
hashAlgorithm: this.options.hashAlgorithm,
});
}
_getFileName(headerValue) {
// matches either a quoted-string or a token (RFC 2616 section 19.5.1)
const m = headerValue.match(
/\bfilename=("(.*?)"|([^()<>{}[\]@,;:"?=\s/\t]+))($|;\s)/i,
);
if (!m) return null;
const match = m[2] || m[3] || '';
let originalFilename = match.substr(match.lastIndexOf('\\') + 1);
originalFilename = originalFilename.replace(/%22/g, '"');
originalFilename = originalFilename.replace(/&#([\d]{4});/g, (_, code) =>
String.fromCharCode(code),
);
return originalFilename;
}
_getExtension(str) {
if (!str) {
return '';
}
const basename = path.basename(str);
const firstDot = basename.indexOf('.');
const lastDot = basename.lastIndexOf('.');
const extname = path.extname(basename).replace(/(\.[a-z0-9]+).*/i, '$1');
if (firstDot === lastDot) {
return extname;
}
return basename.slice(firstDot, lastDot) + extname;
}
_joinDirectoryName(name) {
const newPath = path.join(this.uploadDir, name);
// prevent directory traversal attacks
if (!newPath.startsWith(this.uploadDir)) {
return path.join(this.uploadDir, this.options.defaultInvalidName);
}
return newPath;
}
_setUpRename() {
const hasRename = typeof this.options.filename === 'function';
if (hasRename) {
this._getNewName = (part) => {
let ext = '';
let name = this.options.defaultInvalidName;
if (part.originalFilename) {
// can be null
({ ext, name } = path.parse(part.originalFilename));
if (this.options.keepExtensions !== true) {
ext = '';
}
}
return this.options.filename.call(this, name, ext, part, this);
};
} else {
this._getNewName = (part) => {
const name = toHexoId();
if (part && this.options.keepExtensions) {
const originalFilename = typeof part === 'string' ? part : part.originalFilename;
return `${name}${this._getExtension(originalFilename)}`;
}
return name;
}
}
}
_setUpMaxFields() {
if (this.options.maxFields !== 0) {
let fieldsCount = 0;
this.on('field', () => {
fieldsCount += 1;
if (fieldsCount > this.options.maxFields) {
this._error(
new FormidableError(
`options.maxFields (${this.options.maxFields}) exceeded`,
errors.maxFieldsExceeded,
413,
),
);
}
});
}
}
_maybeEnd() {
// console.log('ended', this.ended);
// console.log('_flushing', this._flushing);
// console.log('error', this.error);
if (!this.ended || this._flushing || this.error) {
return;
}
this.emit('end');
}
}
IncomingForm.DEFAULT_OPTIONS = DEFAULT_OPTIONS;
module.exports = IncomingForm;

45
node_modules/formidable/src/FormidableError.js generated vendored Normal file
View file

@ -0,0 +1,45 @@
/* eslint-disable no-plusplus */
const missingPlugin = 1000;
const pluginFunction = 1001;
const aborted = 1002;
const noParser = 1003;
const uninitializedParser = 1004;
const filenameNotString = 1005;
const maxFieldsSizeExceeded = 1006;
const maxFieldsExceeded = 1007;
const smallerThanMinFileSize = 1008;
const biggerThanMaxFileSize = 1009;
const noEmptyFiles = 1010;
const missingContentType = 1011;
const malformedMultipart = 1012;
const missingMultipartBoundary = 1013;
const unknownTransferEncoding = 1014;
const FormidableError = class extends Error {
constructor(message, internalCode, httpCode = 500) {
super(message);
this.code = internalCode;
this.httpCode = httpCode;
}
};
module.exports = {
missingPlugin,
pluginFunction,
aborted,
noParser,
uninitializedParser,
filenameNotString,
maxFieldsSizeExceeded,
maxFieldsExceeded,
smallerThanMinFileSize,
biggerThanMaxFileSize,
noEmptyFiles,
missingContentType,
malformedMultipart,
missingMultipartBoundary,
unknownTransferEncoding,
FormidableError,
};

87
node_modules/formidable/src/PersistentFile.js generated vendored Normal file
View file

@ -0,0 +1,87 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const fs = require('fs');
const crypto = require('crypto');
const { EventEmitter } = require('events');
class PersistentFile extends EventEmitter {
constructor({ filepath, newFilename, originalFilename, mimetype, hashAlgorithm }) {
super();
this.lastModifiedDate = null;
Object.assign(this, { filepath, newFilename, originalFilename, mimetype, hashAlgorithm });
this.size = 0;
this._writeStream = null;
if (typeof this.hashAlgorithm === 'string') {
this.hash = crypto.createHash(this.hashAlgorithm);
} else {
this.hash = null;
}
}
open() {
this._writeStream = new fs.WriteStream(this.filepath);
this._writeStream.on('error', (err) => {
this.emit('error', err);
});
}
toJSON() {
const json = {
size: this.size,
filepath: this.filepath,
newFilename: this.newFilename,
mimetype: this.mimetype,
mtime: this.lastModifiedDate,
length: this.length,
originalFilename: this.originalFilename,
};
if (this.hash && this.hash !== '') {
json.hash = this.hash;
}
return json;
}
toString() {
return `PersistentFile: ${this.newFilename}, Original: ${this.originalFilename}, Path: ${this.filepath}`;
}
write(buffer, cb) {
if (this.hash) {
this.hash.update(buffer);
}
if (this._writeStream.closed) {
cb();
return;
}
this._writeStream.write(buffer, () => {
this.lastModifiedDate = new Date();
this.size += buffer.length;
this.emit('progress', this.size);
cb();
});
}
end(cb) {
if (this.hash) {
this.hash = this.hash.digest('hex');
}
this._writeStream.end(() => {
this.emit('end');
cb();
});
}
destroy() {
this._writeStream.destroy();
fs.unlink(this.filepath, () => {});
}
}
module.exports = PersistentFile;

82
node_modules/formidable/src/VolatileFile.js generated vendored Normal file
View file

@ -0,0 +1,82 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const crypto = require('crypto');
const { EventEmitter } = require('events');
class VolatileFile extends EventEmitter {
constructor({ filepath, newFilename, originalFilename, mimetype, hashAlgorithm, createFileWriteStream }) {
super();
this.lastModifiedDate = null;
Object.assign(this, { filepath, newFilename, originalFilename, mimetype, hashAlgorithm, createFileWriteStream });
this.size = 0;
this._writeStream = null;
if (typeof this.hashAlgorithm === 'string') {
this.hash = crypto.createHash(this.hashAlgorithm);
} else {
this.hash = null;
}
}
open() {
this._writeStream = this.createFileWriteStream(this);
this._writeStream.on('error', (err) => {
this.emit('error', err);
});
}
destroy() {
this._writeStream.destroy();
}
toJSON() {
const json = {
size: this.size,
newFilename: this.newFilename,
length: this.length,
originalFilename: this.originalFilename,
mimetype: this.mimetype,
};
if (this.hash && this.hash !== '') {
json.hash = this.hash;
}
return json;
}
toString() {
return `VolatileFile: ${this.originalFilename}`;
}
write(buffer, cb) {
if (this.hash) {
this.hash.update(buffer);
}
if (this._writeStream.closed || this._writeStream.destroyed) {
cb();
return;
}
this._writeStream.write(buffer, () => {
this.size += buffer.length;
this.emit('progress', this.size);
cb();
});
}
end(cb) {
if (this.hash) {
this.hash = this.hash.digest('hex');
}
this._writeStream.end(() => {
this.emit('end');
cb();
});
}
}
module.exports = VolatileFile;

38
node_modules/formidable/src/index.js generated vendored Normal file
View file

@ -0,0 +1,38 @@
'use strict';
const PersistentFile = require('./PersistentFile');
const VolatileFile = require('./VolatileFile');
const Formidable = require('./Formidable');
const FormidableError = require('./FormidableError');
const plugins = require('./plugins/index');
const parsers = require('./parsers/index');
// make it available without requiring the `new` keyword
// if you want it access `const formidable.IncomingForm` as v1
const formidable = (...args) => new Formidable(...args);
module.exports = Object.assign(formidable, {
errors: FormidableError,
File: PersistentFile,
PersistentFile,
VolatileFile,
Formidable,
formidable,
// alias
IncomingForm: Formidable,
// parsers
...parsers,
parsers,
// misc
defaultOptions: Formidable.DEFAULT_OPTIONS,
enabledPlugins: Formidable.DEFAULT_OPTIONS.enabledPlugins,
// plugins
plugins: {
...plugins,
},
});

21
node_modules/formidable/src/parsers/Dummy.js generated vendored Normal file
View file

@ -0,0 +1,21 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const { Transform } = require('stream');
class DummyParser extends Transform {
constructor(incomingForm, options = {}) {
super();
this.globalOptions = { ...options };
this.incomingForm = incomingForm;
}
_flush(callback) {
this.incomingForm.ended = true;
this.incomingForm._maybeEnd();
callback();
}
}
module.exports = DummyParser;

35
node_modules/formidable/src/parsers/JSON.js generated vendored Normal file
View file

@ -0,0 +1,35 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const { Transform } = require('stream');
class JSONParser extends Transform {
constructor(options = {}) {
super({ readableObjectMode: true });
this.chunks = [];
this.globalOptions = { ...options };
}
_transform(chunk, encoding, callback) {
this.chunks.push(String(chunk)); // todo consider using a string decoder
callback();
}
_flush(callback) {
try {
const fields = JSON.parse(this.chunks.join(''));
Object.keys(fields).forEach((key) => {
const value = fields[key];
this.push({ key, value });
});
} catch (e) {
callback(e);
return;
}
this.chunks = null;
callback();
}
}
module.exports = JSONParser;

347
node_modules/formidable/src/parsers/Multipart.js generated vendored Normal file
View file

@ -0,0 +1,347 @@
/* eslint-disable no-fallthrough */
/* eslint-disable no-bitwise */
/* eslint-disable no-plusplus */
/* eslint-disable no-underscore-dangle */
'use strict';
const { Transform } = require('stream');
const errors = require('../FormidableError.js');
const { FormidableError } = errors;
let s = 0;
const STATE = {
PARSER_UNINITIALIZED: s++,
START: s++,
START_BOUNDARY: s++,
HEADER_FIELD_START: s++,
HEADER_FIELD: s++,
HEADER_VALUE_START: s++,
HEADER_VALUE: s++,
HEADER_VALUE_ALMOST_DONE: s++,
HEADERS_ALMOST_DONE: s++,
PART_DATA_START: s++,
PART_DATA: s++,
PART_END: s++,
END: s++,
};
let f = 1;
const FBOUNDARY = { PART_BOUNDARY: f, LAST_BOUNDARY: (f *= 2) };
const LF = 10;
const CR = 13;
const SPACE = 32;
const HYPHEN = 45;
const COLON = 58;
const A = 97;
const Z = 122;
function lower(c) {
return c | 0x20;
}
exports.STATES = {};
Object.keys(STATE).forEach((stateName) => {
exports.STATES[stateName] = STATE[stateName];
});
class MultipartParser extends Transform {
constructor(options = {}) {
super({ readableObjectMode: true });
this.boundary = null;
this.boundaryChars = null;
this.lookbehind = null;
this.bufferLength = 0;
this.state = STATE.PARSER_UNINITIALIZED;
this.globalOptions = { ...options };
this.index = null;
this.flags = 0;
}
_flush(done) {
if (
(this.state === STATE.HEADER_FIELD_START && this.index === 0) ||
(this.state === STATE.PART_DATA && this.index === this.boundary.length)
) {
this._handleCallback('partEnd');
this._handleCallback('end');
done();
} else if (this.state !== STATE.END) {
done(
new FormidableError(
`MultipartParser.end(): stream ended unexpectedly: ${this.explain()}`,
errors.malformedMultipart,
400,
),
);
}
}
initWithBoundary(str) {
this.boundary = Buffer.from(`\r\n--${str}`);
this.lookbehind = Buffer.alloc(this.boundary.length + 8);
this.state = STATE.START;
this.boundaryChars = {};
for (let i = 0; i < this.boundary.length; i++) {
this.boundaryChars[this.boundary[i]] = true;
}
}
// eslint-disable-next-line max-params
_handleCallback(name, buf, start, end) {
if (start !== undefined && start === end) {
return;
}
this.push({ name, buffer: buf, start, end });
}
// eslint-disable-next-line max-statements
_transform(buffer, _, done) {
let i = 0;
let prevIndex = this.index;
let { index, state, flags } = this;
const { lookbehind, boundary, boundaryChars } = this;
const boundaryLength = boundary.length;
const boundaryEnd = boundaryLength - 1;
this.bufferLength = buffer.length;
let c = null;
let cl = null;
const setMark = (name, idx) => {
this[`${name}Mark`] = typeof idx === 'number' ? idx : i;
};
const clearMarkSymbol = (name) => {
delete this[`${name}Mark`];
};
const dataCallback = (name, shouldClear) => {
const markSymbol = `${name}Mark`;
if (!(markSymbol in this)) {
return;
}
if (!shouldClear) {
this._handleCallback(name, buffer, this[markSymbol], buffer.length);
setMark(name, 0);
} else {
this._handleCallback(name, buffer, this[markSymbol], i);
clearMarkSymbol(name);
}
};
for (i = 0; i < this.bufferLength; i++) {
c = buffer[i];
switch (state) {
case STATE.PARSER_UNINITIALIZED:
return i;
case STATE.START:
index = 0;
state = STATE.START_BOUNDARY;
case STATE.START_BOUNDARY:
if (index === boundary.length - 2) {
if (c === HYPHEN) {
flags |= FBOUNDARY.LAST_BOUNDARY;
} else if (c !== CR) {
return i;
}
index++;
break;
} else if (index - 1 === boundary.length - 2) {
if (flags & FBOUNDARY.LAST_BOUNDARY && c === HYPHEN) {
this._handleCallback('end');
state = STATE.END;
flags = 0;
} else if (!(flags & FBOUNDARY.LAST_BOUNDARY) && c === LF) {
index = 0;
this._handleCallback('partBegin');
state = STATE.HEADER_FIELD_START;
} else {
return i;
}
break;
}
if (c !== boundary[index + 2]) {
index = -2;
}
if (c === boundary[index + 2]) {
index++;
}
break;
case STATE.HEADER_FIELD_START:
state = STATE.HEADER_FIELD;
setMark('headerField');
index = 0;
case STATE.HEADER_FIELD:
if (c === CR) {
clearMarkSymbol('headerField');
state = STATE.HEADERS_ALMOST_DONE;
break;
}
index++;
if (c === HYPHEN) {
break;
}
if (c === COLON) {
if (index === 1) {
// empty header field
return i;
}
dataCallback('headerField', true);
state = STATE.HEADER_VALUE_START;
break;
}
cl = lower(c);
if (cl < A || cl > Z) {
return i;
}
break;
case STATE.HEADER_VALUE_START:
if (c === SPACE) {
break;
}
setMark('headerValue');
state = STATE.HEADER_VALUE;
case STATE.HEADER_VALUE:
if (c === CR) {
dataCallback('headerValue', true);
this._handleCallback('headerEnd');
state = STATE.HEADER_VALUE_ALMOST_DONE;
}
break;
case STATE.HEADER_VALUE_ALMOST_DONE:
if (c !== LF) {
return i;
}
state = STATE.HEADER_FIELD_START;
break;
case STATE.HEADERS_ALMOST_DONE:
if (c !== LF) {
return i;
}
this._handleCallback('headersEnd');
state = STATE.PART_DATA_START;
break;
case STATE.PART_DATA_START:
state = STATE.PART_DATA;
setMark('partData');
case STATE.PART_DATA:
prevIndex = index;
if (index === 0) {
// Boyer-Moore derived algorithm to safely skip non-boundary data
i += boundaryEnd;
while (i < this.bufferLength && !(buffer[i] in boundaryChars)) {
i += boundaryLength;
}
i -= boundaryEnd;
c = buffer[i];
}
if (index < boundary.length) {
if (boundary[index] === c) {
if (index === 0) {
dataCallback('partData', true);
}
index++;
} else {
index = 0;
}
} else if (index === boundary.length) {
index++;
if (c === CR) {
// CR = part boundary
flags |= FBOUNDARY.PART_BOUNDARY;
} else if (c === HYPHEN) {
// HYPHEN = end boundary
flags |= FBOUNDARY.LAST_BOUNDARY;
} else {
index = 0;
}
} else if (index - 1 === boundary.length) {
if (flags & FBOUNDARY.PART_BOUNDARY) {
index = 0;
if (c === LF) {
// unset the PART_BOUNDARY flag
flags &= ~FBOUNDARY.PART_BOUNDARY;
this._handleCallback('partEnd');
this._handleCallback('partBegin');
state = STATE.HEADER_FIELD_START;
break;
}
} else if (flags & FBOUNDARY.LAST_BOUNDARY) {
if (c === HYPHEN) {
this._handleCallback('partEnd');
this._handleCallback('end');
state = STATE.END;
flags = 0;
} else {
index = 0;
}
} else {
index = 0;
}
}
if (index > 0) {
// when matching a possible boundary, keep a lookbehind reference
// in case it turns out to be a false lead
lookbehind[index - 1] = c;
} else if (prevIndex > 0) {
// if our boundary turned out to be rubbish, the captured lookbehind
// belongs to partData
this._handleCallback('partData', lookbehind, 0, prevIndex);
prevIndex = 0;
setMark('partData');
// reconsider the current character even though it interrupted the sequence;
// it could be the beginning of a new sequence
i--;
}
break;
case STATE.END:
break;
default:
return i;
}
}
dataCallback('headerField');
dataCallback('headerValue');
dataCallback('partData');
this.index = index;
this.state = state;
this.flags = flags;
done();
return this.bufferLength;
}
explain() {
return `state = ${MultipartParser.stateToString(this.state)}`;
}
}
// eslint-disable-next-line consistent-return
MultipartParser.stateToString = (stateNumber) => {
// eslint-disable-next-line no-restricted-syntax, guard-for-in
for (const stateName in STATE) {
const number = STATE[stateName];
if (number === stateNumber) return stateName;
}
};
module.exports = Object.assign(MultipartParser, { STATES: exports.STATES });

12
node_modules/formidable/src/parsers/OctetStream.js generated vendored Normal file
View file

@ -0,0 +1,12 @@
'use strict';
const { PassThrough } = require('stream');
class OctetStreamParser extends PassThrough {
constructor(options = {}) {
super();
this.globalOptions = { ...options };
}
}
module.exports = OctetStreamParser;

38
node_modules/formidable/src/parsers/Querystring.js generated vendored Normal file
View file

@ -0,0 +1,38 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const { Transform } = require('stream');
const querystring = require('querystring');
// This is a buffering parser, not quite as nice as the multipart one.
// If I find time I'll rewrite this to be fully streaming as well
class QuerystringParser extends Transform {
constructor(options = {}) {
super({ readableObjectMode: true });
this.globalOptions = { ...options };
this.buffer = '';
this.bufferLength = 0;
}
_transform(buffer, encoding, callback) {
this.buffer += buffer.toString('ascii');
this.bufferLength = this.buffer.length;
callback();
}
_flush(callback) {
const fields = querystring.parse(this.buffer, '&', '=');
// eslint-disable-next-line no-restricted-syntax, guard-for-in
for (const key in fields) {
this.push({
key,
value: fields[key],
});
}
this.buffer = '';
callback();
}
}
module.exports = QuerystringParser;

121
node_modules/formidable/src/parsers/StreamingQuerystring.js generated vendored Normal file
View file

@ -0,0 +1,121 @@
// not used
/* eslint-disable no-underscore-dangle */
'use strict';
const { Transform } = require('stream');
const errors = require('../FormidableError.js');
const { FormidableError } = errors;
const AMPERSAND = 38;
const EQUALS = 61;
class QuerystringParser extends Transform {
constructor(options = {}) {
super({ readableObjectMode: true });
const { maxFieldSize } = options;
this.maxFieldLength = maxFieldSize;
this.buffer = Buffer.from('');
this.fieldCount = 0;
this.sectionStart = 0;
this.key = '';
this.readingKey = true;
}
_transform(buffer, encoding, callback) {
let len = buffer.length;
if (this.buffer && this.buffer.length) {
// we have some data left over from the last write which we are in the middle of processing
len += this.buffer.length;
buffer = Buffer.concat([this.buffer, buffer], len);
}
for (let i = this.buffer.length || 0; i < len; i += 1) {
const c = buffer[i];
if (this.readingKey) {
// KEY, check for =
if (c === EQUALS) {
this.key = this.getSection(buffer, i);
this.readingKey = false;
this.sectionStart = i + 1;
} else if (c === AMPERSAND) {
// just key, no value. Prepare to read another key
this.emitField(this.getSection(buffer, i));
this.sectionStart = i + 1;
}
// VALUE, check for &
} else if (c === AMPERSAND) {
this.emitField(this.key, this.getSection(buffer, i));
this.sectionStart = i + 1;
}
if (
this.maxFieldLength &&
i - this.sectionStart === this.maxFieldLength
) {
callback(
new FormidableError(
`${
this.readingKey ? 'Key' : `Value for ${this.key}`
} longer than maxFieldLength`,
errors.maxFieldsSizeExceeded,
413,
),
);
// stop parsing; the callback has already been invoked with the error
return;
}
}
// Prepare the remaining key or value (from sectionStart to the end) for the next write() or for end()
len -= this.sectionStart;
if (len) {
// i.e. unless the last character was a & or =, keep the unfinished
// section (from sectionStart to the end) for the next _transform()
this.buffer = buffer.slice(this.sectionStart);
} else this.buffer = null;
this.sectionStart = 0;
callback();
}
_flush(callback) {
// Emit the last field
if (this.readingKey) {
// we only have a key if there's something in the buffer. We definitely have no value
if (this.buffer && this.buffer.length) {
this.emitField(this.buffer.toString('ascii'));
}
} else {
// We have a key, we may or may not have a value
this.emitField(
this.key,
this.buffer && this.buffer.length && this.buffer.toString('ascii'),
);
}
this.buffer = '';
callback();
}
getSection(buffer, i) {
if (i === this.sectionStart) return '';
return buffer.toString('ascii', this.sectionStart, i);
}
emitField(key, val) {
this.key = '';
this.readingKey = true;
this.push({ key, value: val || '' });
}
}
module.exports = QuerystringParser;
// const q = new QuerystringParser({maxFieldSize: 100});
// (async function() {
// for await (const chunk of q) {
// console.log(chunk);
// }
// })();
// q.write("a=b&c=d")
// q.end()

17
node_modules/formidable/src/parsers/index.js generated vendored Normal file
View file

@ -0,0 +1,17 @@
'use strict';
const JSONParser = require('./JSON');
const DummyParser = require('./Dummy');
const MultipartParser = require('./Multipart');
const OctetStreamParser = require('./OctetStream');
const QueryStringParser = require('./Querystring');
Object.assign(exports, {
JSONParser,
DummyParser,
MultipartParser,
OctetStreamParser,
OctetstreamParser: OctetStreamParser,
QueryStringParser,
QuerystringParser: QueryStringParser,
});

13
node_modules/formidable/src/plugins/index.js generated vendored Normal file
View file

@ -0,0 +1,13 @@
'use strict';
const octetstream = require('./octetstream');
const querystring = require('./querystring');
const multipart = require('./multipart');
const json = require('./json');
Object.assign(exports, {
octetstream,
querystring,
multipart,
json,
});

38
node_modules/formidable/src/plugins/json.js generated vendored Normal file
View file

@ -0,0 +1,38 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const JSONParser = require('../parsers/JSON');
// the `options` is also available through the `this.options` / `formidable.options`
module.exports = function plugin(formidable, options) {
// the `this` context is always formidable, as the first argument of a plugin
// but this allows us to customize/test each plugin
/* istanbul ignore next */
const self = this || formidable;
if (/json/i.test(self.headers['content-type'])) {
init.call(self, self, options);
}
};
// Note that it's a good practice (but it's up to you) to use the `this.options` instead
// of the passed `options` (second) param, because when you decide
// to test the plugin you can pass custom `this` context to it (and so `this.options`)
function init(_self, _opts) {
this.type = 'json';
const parser = new JSONParser(this.options);
parser.on('data', ({ key, value }) => {
this.emit('field', key, value);
});
parser.once('end', () => {
this.ended = true;
this._maybeEnd();
});
this._parser = parser;
}

173
node_modules/formidable/src/plugins/multipart.js generated vendored Normal file
View file

@ -0,0 +1,173 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const { Stream } = require('stream');
const MultipartParser = require('../parsers/Multipart');
const errors = require('../FormidableError.js');
const { FormidableError } = errors;
// the `options` is also available through the `options` / `formidable.options`
module.exports = function plugin(formidable, options) {
// the `this` context is always formidable, as the first argument of a plugin
// but this allows us to customize/test each plugin
/* istanbul ignore next */
const self = this || formidable;
// NOTE: we (currently) support both multipart/form-data and multipart/related
const multipart = /multipart/i.test(self.headers['content-type']);
if (multipart) {
const m = self.headers['content-type'].match(
/boundary=(?:"([^"]+)"|([^;]+))/i,
);
if (m) {
const initMultipart = createInitMultipart(m[1] || m[2]);
initMultipart.call(self, self, options); // lgtm [js/superfluous-trailing-arguments]
} else {
const err = new FormidableError(
'bad content-type header, no multipart boundary',
errors.missingMultipartBoundary,
400,
);
self._error(err);
}
}
};
// Note that it's a good practice (but it's up to you) to use the `this.options` instead
// of the passed `options` (second) param, because when you decide
// to test the plugin you can pass custom `this` context to it (and so `this.options`)
function createInitMultipart(boundary) {
return function initMultipart() {
this.type = 'multipart';
const parser = new MultipartParser(this.options);
let headerField;
let headerValue;
let part;
parser.initWithBoundary(boundary);
// eslint-disable-next-line max-statements, consistent-return
parser.on('data', ({ name, buffer, start, end }) => {
if (name === 'partBegin') {
part = new Stream();
part.readable = true;
part.headers = {};
part.name = null;
part.originalFilename = null;
part.mimetype = null;
part.transferEncoding = this.options.encoding;
part.transferBuffer = '';
headerField = '';
headerValue = '';
} else if (name === 'headerField') {
headerField += buffer.toString(this.options.encoding, start, end);
} else if (name === 'headerValue') {
headerValue += buffer.toString(this.options.encoding, start, end);
} else if (name === 'headerEnd') {
headerField = headerField.toLowerCase();
part.headers[headerField] = headerValue;
// matches either a quoted-string or a token (RFC 2616 section 19.5.1)
const m = headerValue.match(
// eslint-disable-next-line no-useless-escape
/\bname=("([^"]*)"|([^\(\)<>@,;:\\"\/\[\]\?=\{\}\s\t/]+))/i,
);
if (headerField === 'content-disposition') {
if (m) {
part.name = m[2] || m[3] || '';
}
part.originalFilename = this._getFileName(headerValue);
} else if (headerField === 'content-type') {
part.mimetype = headerValue;
} else if (headerField === 'content-transfer-encoding') {
part.transferEncoding = headerValue.toLowerCase();
}
headerField = '';
headerValue = '';
} else if (name === 'headersEnd') {
switch (part.transferEncoding) {
case 'binary':
case '7bit':
case '8bit':
case 'utf-8': {
const dataPropagation = (ctx) => {
if (ctx.name === 'partData') {
part.emit('data', ctx.buffer.slice(ctx.start, ctx.end));
}
};
const dataStopPropagation = (ctx) => {
if (ctx.name === 'partEnd') {
part.emit('end');
parser.removeListener('data', dataPropagation);
parser.removeListener('data', dataStopPropagation);
}
};
parser.on('data', dataPropagation);
parser.on('data', dataStopPropagation);
break;
}
case 'base64': {
const dataPropagation = (ctx) => {
if (ctx.name === 'partData') {
part.transferBuffer += ctx.buffer
.slice(ctx.start, ctx.end)
.toString('ascii');
/*
Four bytes (chars) of base64 convert to three bytes of binary
data, so we always work with a number of input bytes that is
divisible by 4; the decoded output is then divisible by 3.
*/
const offset = parseInt(part.transferBuffer.length / 4, 10) * 4;
part.emit(
'data',
Buffer.from(
part.transferBuffer.substring(0, offset),
'base64',
),
);
part.transferBuffer = part.transferBuffer.substring(offset);
}
};
const dataStopPropagation = (ctx) => {
if (ctx.name === 'partEnd') {
part.emit('data', Buffer.from(part.transferBuffer, 'base64'));
part.emit('end');
parser.removeListener('data', dataPropagation);
parser.removeListener('data', dataStopPropagation);
}
};
parser.on('data', dataPropagation);
parser.on('data', dataStopPropagation);
break;
}
default:
return this._error(
new FormidableError(
'unknown transfer-encoding',
errors.unknownTransferEncoding,
501,
),
);
}
this.onPart(part);
} else if (name === 'end') {
this.ended = true;
this._maybeEnd();
}
});
this._parser = parser;
};
}

86
node_modules/formidable/src/plugins/octetstream.js generated vendored Normal file
View file

@ -0,0 +1,86 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const OctetStreamParser = require('../parsers/OctetStream');
// the `options` is also available through the `options` / `formidable.options`
module.exports = function plugin(formidable, options) {
// the `this` context is always formidable, as the first argument of a plugin
// but this allows us to customize/test each plugin
/* istanbul ignore next */
const self = this || formidable;
if (/octet-stream/i.test(self.headers['content-type'])) {
init.call(self, self, options);
}
return self;
};
// Note that it's a good practice (but it's up to you) to use the `this.options` instead
// of the passed `options` (second) param, because when you decide
// to test the plugin you can pass custom `this` context to it (and so `this.options`)
function init(_self, _opts) {
this.type = 'octet-stream';
const originalFilename = this.headers['x-file-name'];
const mimetype = this.headers['content-type'];
const thisPart = {
originalFilename,
mimetype,
};
const newFilename = this._getNewName(thisPart);
const filepath = this._joinDirectoryName(newFilename);
const file = this._newFile({
newFilename,
filepath,
originalFilename,
mimetype,
});
this.emit('fileBegin', originalFilename, file);
file.open();
this.openedFiles.push(file);
this._flushing += 1;
this._parser = new OctetStreamParser(this.options);
// Keep track of writes that haven't finished so we don't emit the file before it's done being written
let outstandingWrites = 0;
this._parser.on('data', (buffer) => {
this.pause();
outstandingWrites += 1;
file.write(buffer, () => {
outstandingWrites -= 1;
this.resume();
if (this.ended) {
this._parser.emit('doneWritingFile');
}
});
});
this._parser.on('end', () => {
this._flushing -= 1;
this.ended = true;
const done = () => {
file.end(() => {
this.emit('file', 'file', file);
this._maybeEnd();
});
};
if (outstandingWrites === 0) {
done();
} else {
this._parser.once('doneWritingFile', done);
}
});
return this;
}

42
node_modules/formidable/src/plugins/querystring.js generated vendored Normal file
View file

@ -0,0 +1,42 @@
/* eslint-disable no-underscore-dangle */
'use strict';
const QuerystringParser = require('../parsers/Querystring');
// the `options` is also available through the `this.options` / `formidable.options`
module.exports = function plugin(formidable, options) {
// the `this` context is always formidable, as the first argument of a plugin
// but this allows us to customize/test each plugin
/* istanbul ignore next */
const self = this || formidable;
if (/urlencoded/i.test(self.headers['content-type'])) {
init.call(self, self, options);
}
return self;
};
// Note that it's a good practice (but it's up to you) to use the `this.options` instead
// of the passed `options` (second) param, because when you decide
// to test the plugin you can pass custom `this` context to it (and so `this.options`)
function init(_self, _opts) {
this.type = 'urlencoded';
const parser = new QuerystringParser(this.options);
parser.on('data', ({ key, value }) => {
this.emit('field', key, value);
});
parser.once('end', () => {
this.ended = true;
this._maybeEnd();
});
this._parser = parser;
return this;
}

15
node_modules/fs-minipass/LICENSE generated vendored Normal file
View file

@ -0,0 +1,15 @@
The ISC License
Copyright (c) Isaac Z. Schlueter and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

70
node_modules/fs-minipass/README.md generated vendored Normal file
View file

@ -0,0 +1,70 @@
# fs-minipass
Filesystem streams based on [minipass](http://npm.im/minipass).
4 classes are exported:
- ReadStream
- ReadStreamSync
- WriteStream
- WriteStreamSync
When using `ReadStreamSync`, all of the data is made available
immediately upon consuming the stream. Nothing is buffered in memory
when the stream is constructed. If the stream is piped to a writer,
then it will synchronously `read()` and emit data into the writer as
fast as the writer can consume it. (That is, it will respect
backpressure.) If you call `stream.read()` then it will read the
entire file and return the contents.
When using `WriteStreamSync`, every write is flushed to the file
synchronously. If your writes all come in a single tick, then it'll
write it all out in a single tick. It's as synchronous as you are.
The async versions work much like their node builtin counterparts,
with the exception of introducing significantly less Stream machinery
overhead.
## USAGE
It's just streams, you pipe them or read() them or write() to them.
```js
const fsm = require('fs-minipass')
const readStream = new fsm.ReadStream('file.txt')
const writeStream = new fsm.WriteStream('output.txt')
writeStream.write('some file header or whatever\n')
readStream.pipe(writeStream)
```
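As a rough sketch of the synchronous variants described above (`file.txt` and `copy.txt` are hypothetical file names):
```js
const fsm = require('fs-minipass')

// ReadStreamSync reads synchronously; attaching a data handler
// drains everything before on() returns
const rs = new fsm.ReadStreamSync('file.txt')
const chunks = []
rs.on('data', c => chunks.push(c))
const contents = Buffer.concat(chunks)

// WriteStreamSync flushes each write to disk before returning
const ws = new fsm.WriteStreamSync('copy.txt')
ws.write(contents)
ws.end()
```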
## ReadStream(path, options)
Path string is required, but somewhat irrelevant if an open file
descriptor is passed in as an option.
Options:
- `fd` Pass in a numeric file descriptor, if the file is already open.
- `readSize` The size of reads to do, defaults to 16MB
- `size` The size of the file, if known. Prevents zero-byte read()
call at the end.
- `autoClose` Set to `false` to prevent the file descriptor from being
closed when the file is done being read.
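For instance, a minimal sketch using these options (the file name and chunk size here are hypothetical):
```js
const fs = require('fs')
const fsm = require('fs-minipass')

// reuse a descriptor we opened ourselves, read in 1MB chunks,
// and keep the fd open after the stream finishes
const fd = fs.openSync('data.bin', 'r')
const rs = new fsm.ReadStream('data.bin', {
  fd,
  readSize: 1024 * 1024,
  autoClose: false,
})
rs.on('end', () => fs.closeSync(fd))
rs.pipe(process.stdout)
```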
## WriteStream(path, options)
Path string is required, but somewhat irrelevant if an open file
descriptor is passed in as an option.
Options:
- `fd` Pass in a numeric file descriptor, if the file is already open.
- `mode` The mode to create the file with. Defaults to `0o666`.
- `start` The position in the file to start writing.  If not
specified, then the file will start writing at position zero, and be
truncated by default.
- `autoClose` Set to `false` to prevent the file descriptor from being
closed when the stream is ended.
- `flags` Flags to use when opening the file.  Irrelevant if `fd` is
passed in, since the file won't be opened in that case.  Defaults to
`'r+'` if a `start` position is specified, or `'w'` otherwise (see the
sketch below).
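For example, a rough sketch (the file name is hypothetical) that patches a region of an existing file without truncating it:
```js
const fsm = require('fs-minipass')

// a start offset makes the default flag 'r+' (falling back to 'w'
// if the file does not exist yet), so nothing is truncated
const ws = new fsm.WriteStream('output.txt', {
  start: 1024,      // begin writing at byte offset 1024
  mode: 0o644,      // only used if the file has to be created
})
ws.write('patched region\n')
ws.end()
ws.on('close', () => console.log('done writing'))
```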

432
node_modules/fs-minipass/index.js generated vendored Normal file
View file

@ -0,0 +1,432 @@
'use strict'
const MiniPass = require('minipass')
const EE = require('events').EventEmitter
const fs = require('fs')
let writev = fs.writev
/* istanbul ignore next */
if (!writev) {
// This entire block can be removed if support for earlier than Node.js
// 12.9.0 is not needed.
try {
const binding = process.binding('fs')
const FSReqWrap = binding.FSReqWrap || binding.FSReqCallback
writev = (fd, iovec, pos, cb) => {
const done = (er, bw) => cb(er, bw, iovec)
const req = new FSReqWrap()
req.oncomplete = done
binding.writeBuffers(fd, iovec, pos, req)
}
} catch(ex) {
//Patched for Bun
writev = (fd, iovec, pos, cb) => {
var fbuf = Buffer.concat(iovec)
fs.write(fd, fbuf, 0, fbuf.byteLength, pos, (er, bw) => {
cb(er,bw,iovec)
});
}
}
}
const _autoClose = Symbol('_autoClose')
const _close = Symbol('_close')
const _ended = Symbol('_ended')
const _fd = Symbol('_fd')
const _finished = Symbol('_finished')
const _flags = Symbol('_flags')
const _flush = Symbol('_flush')
const _handleChunk = Symbol('_handleChunk')
const _makeBuf = Symbol('_makeBuf')
const _mode = Symbol('_mode')
const _needDrain = Symbol('_needDrain')
const _onerror = Symbol('_onerror')
const _onopen = Symbol('_onopen')
const _onread = Symbol('_onread')
const _onwrite = Symbol('_onwrite')
const _open = Symbol('_open')
const _path = Symbol('_path')
const _pos = Symbol('_pos')
const _queue = Symbol('_queue')
const _read = Symbol('_read')
const _readSize = Symbol('_readSize')
const _reading = Symbol('_reading')
const _remain = Symbol('_remain')
const _size = Symbol('_size')
const _write = Symbol('_write')
const _writing = Symbol('_writing')
const _defaultFlag = Symbol('_defaultFlag')
const _errored = Symbol('_errored')
class ReadStream extends MiniPass {
constructor (path, opt) {
opt = opt || {}
super(opt)
this.readable = true
this.writable = false
if (typeof path !== 'string')
throw new TypeError('path must be a string')
this[_errored] = false
this[_fd] = typeof opt.fd === 'number' ? opt.fd : null
this[_path] = path
this[_readSize] = opt.readSize || 16*1024*1024
this[_reading] = false
this[_size] = typeof opt.size === 'number' ? opt.size : Infinity
this[_remain] = this[_size]
this[_autoClose] = typeof opt.autoClose === 'boolean' ?
opt.autoClose : true
if (typeof this[_fd] === 'number')
this[_read]()
else
this[_open]()
}
get fd () { return this[_fd] }
get path () { return this[_path] }
write () {
throw new TypeError('this is a readable stream')
}
end () {
throw new TypeError('this is a readable stream')
}
[_open] () {
fs.open(this[_path], 'r', (er, fd) => this[_onopen](er, fd))
}
[_onopen] (er, fd) {
if (er)
this[_onerror](er)
else {
this[_fd] = fd
this.emit('open', fd)
this[_read]()
}
}
[_makeBuf] () {
return Buffer.allocUnsafe(Math.min(this[_readSize], this[_remain]))
}
[_read] () {
if (!this[_reading]) {
this[_reading] = true
const buf = this[_makeBuf]()
/* istanbul ignore if */
if (buf.length === 0)
return process.nextTick(() => this[_onread](null, 0, buf))
fs.read(this[_fd], buf, 0, buf.length, null, (er, br, buf) =>
this[_onread](er, br, buf))
}
}
[_onread] (er, br, buf) {
this[_reading] = false
if (er)
this[_onerror](er)
else if (this[_handleChunk](br, buf))
this[_read]()
}
[_close] () {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd]
this[_fd] = null
fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'))
}
}
[_onerror] (er) {
this[_reading] = true
this[_close]()
this.emit('error', er)
}
[_handleChunk] (br, buf) {
let ret = false
// no effect if infinite
this[_remain] -= br
if (br > 0)
ret = super.write(br < buf.length ? buf.slice(0, br) : buf)
if (br === 0 || this[_remain] <= 0) {
ret = false
this[_close]()
super.end()
}
return ret
}
emit (ev, data) {
switch (ev) {
case 'prefinish':
case 'finish':
break
case 'drain':
if (typeof this[_fd] === 'number')
this[_read]()
break
case 'error':
if (this[_errored])
return
this[_errored] = true
return super.emit(ev, data)
default:
return super.emit(ev, data)
}
}
}
class ReadStreamSync extends ReadStream {
[_open] () {
let threw = true
try {
this[_onopen](null, fs.openSync(this[_path], 'r'))
threw = false
} finally {
if (threw)
this[_close]()
}
}
[_read] () {
let threw = true
try {
if (!this[_reading]) {
this[_reading] = true
do {
const buf = this[_makeBuf]()
/* istanbul ignore next */
const br = buf.length === 0 ? 0
: fs.readSync(this[_fd], buf, 0, buf.length, null)
if (!this[_handleChunk](br, buf))
break
} while (true)
this[_reading] = false
}
threw = false
} finally {
if (threw)
this[_close]()
}
}
[_close] () {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd]
this[_fd] = null
fs.closeSync(fd)
this.emit('close')
}
}
}
class WriteStream extends EE {
constructor (path, opt) {
opt = opt || {}
super(opt)
this.readable = false
this.writable = true
this[_errored] = false
this[_writing] = false
this[_ended] = false
this[_needDrain] = false
this[_queue] = []
this[_path] = path
this[_fd] = typeof opt.fd === 'number' ? opt.fd : null
this[_mode] = opt.mode === undefined ? 0o666 : opt.mode
this[_pos] = typeof opt.start === 'number' ? opt.start : null
this[_autoClose] = typeof opt.autoClose === 'boolean' ?
opt.autoClose : true
// truncating makes no sense when writing into the middle
const defaultFlag = this[_pos] !== null ? 'r+' : 'w'
this[_defaultFlag] = opt.flags === undefined
this[_flags] = this[_defaultFlag] ? defaultFlag : opt.flags
if (this[_fd] === null)
this[_open]()
}
emit (ev, data) {
if (ev === 'error') {
if (this[_errored])
return
this[_errored] = true
}
return super.emit(ev, data)
}
get fd () { return this[_fd] }
get path () { return this[_path] }
[_onerror] (er) {
this[_close]()
this[_writing] = true
this.emit('error', er)
}
[_open] () {
fs.open(this[_path], this[_flags], this[_mode],
(er, fd) => this[_onopen](er, fd))
}
[_onopen] (er, fd) {
if (this[_defaultFlag] &&
this[_flags] === 'r+' &&
er && er.code === 'ENOENT') {
this[_flags] = 'w'
this[_open]()
} else if (er)
this[_onerror](er)
else {
this[_fd] = fd
this.emit('open', fd)
this[_flush]()
}
}
end (buf, enc) {
if (buf)
this.write(buf, enc)
this[_ended] = true
// synthetic after-write logic, where drain/finish live
if (!this[_writing] && !this[_queue].length &&
typeof this[_fd] === 'number')
this[_onwrite](null, 0)
return this
}
write (buf, enc) {
if (typeof buf === 'string')
buf = Buffer.from(buf, enc)
if (this[_ended]) {
this.emit('error', new Error('write() after end()'))
return false
}
if (this[_fd] === null || this[_writing] || this[_queue].length) {
this[_queue].push(buf)
this[_needDrain] = true
return false
}
this[_writing] = true
this[_write](buf)
return true
}
[_write] (buf) {
fs.write(this[_fd], buf, 0, buf.length, this[_pos], (er, bw) =>
this[_onwrite](er, bw))
}
[_onwrite] (er, bw) {
if (er)
this[_onerror](er)
else {
if (this[_pos] !== null)
this[_pos] += bw
if (this[_queue].length)
this[_flush]()
else {
this[_writing] = false
if (this[_ended] && !this[_finished]) {
this[_finished] = true
this[_close]()
this.emit('finish')
} else if (this[_needDrain]) {
this[_needDrain] = false
this.emit('drain')
}
}
}
}
[_flush] () {
if (this[_queue].length === 0) {
if (this[_ended])
this[_onwrite](null, 0)
} else if (this[_queue].length === 1)
this[_write](this[_queue].pop())
else {
const iovec = this[_queue]
this[_queue] = []
writev(this[_fd], iovec, this[_pos],
(er, bw) => this[_onwrite](er, bw))
}
}
[_close] () {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd]
this[_fd] = null
fs.close(fd, er => er ? this.emit('error', er) : this.emit('close'))
}
}
}
class WriteStreamSync extends WriteStream {
[_open] () {
let fd
// only wrap in a try{} block if we know we'll retry, to avoid
// the rethrow obscuring the error's source frame in most cases.
if (this[_defaultFlag] && this[_flags] === 'r+') {
try {
fd = fs.openSync(this[_path], this[_flags], this[_mode])
} catch (er) {
if (er.code === 'ENOENT') {
this[_flags] = 'w'
return this[_open]()
} else
throw er
}
} else
fd = fs.openSync(this[_path], this[_flags], this[_mode])
this[_onopen](null, fd)
}
[_close] () {
if (this[_autoClose] && typeof this[_fd] === 'number') {
const fd = this[_fd]
this[_fd] = null
fs.closeSync(fd)
this.emit('close')
}
}
[_write] (buf) {
// throw the original, but try to close if it fails
let threw = true
try {
this[_onwrite](null,
fs.writeSync(this[_fd], buf, 0, buf.length, this[_pos]))
threw = false
} finally {
if (threw)
try { this[_close]() } catch (_) {}
}
}
}
exports.ReadStream = ReadStream
exports.ReadStreamSync = ReadStreamSync
exports.WriteStream = WriteStream
exports.WriteStreamSync = WriteStreamSync

15
node_modules/fs-minipass/node_modules/minipass/LICENSE generated vendored Normal file
View file

@ -0,0 +1,15 @@
The ISC License
Copyright (c) 2017-2022 npm, Inc., Isaac Z. Schlueter, and Contributors
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

728
node_modules/fs-minipass/node_modules/minipass/README.md generated vendored Normal file
View file

@ -0,0 +1,728 @@
# minipass
A _very_ minimal implementation of a [PassThrough
stream](https://nodejs.org/api/stream.html#stream_class_stream_passthrough)
[It's very
fast](https://docs.google.com/spreadsheets/d/1oObKSrVwLX_7Ut4Z6g3fZW-AX1j1-k6w-cDsrkaSbHM/edit#gid=0)
for objects, strings, and buffers.
Supports `pipe()`ing (including multi-`pipe()` and backpressure transmission),
buffering data until either a `data` event handler or `pipe()` is added (so
you don't lose the first chunk), and most other cases where PassThrough is
a good idea.
There is a `read()` method, but it's much more efficient to consume data
from this stream via `'data'` events or by calling `pipe()` into some other
stream. Calling `read()` requires the buffer to be flattened in some
cases, which requires copying memory.
If you set `objectMode: true` in the options, then whatever is written will
be emitted. Otherwise, it'll do a minimal amount of Buffer copying to
ensure proper Streams semantics when `read(n)` is called.
`objectMode` can also be set by doing `stream.objectMode = true`, or by
writing any non-string/non-buffer data. `objectMode` cannot be set to
false once it is set.
This is not a `through` or `through2` stream. It doesn't transform the
data, it just passes it right through. If you want to transform the data,
extend the class, and override the `write()` method. Once you're done
transforming the data however you want, call `super.write()` with the
transform output.
For some examples of streams that extend Minipass in various ways, check
out:
- [minizlib](http://npm.im/minizlib)
- [fs-minipass](http://npm.im/fs-minipass)
- [tar](http://npm.im/tar)
- [minipass-collect](http://npm.im/minipass-collect)
- [minipass-flush](http://npm.im/minipass-flush)
- [minipass-pipeline](http://npm.im/minipass-pipeline)
- [tap](http://npm.im/tap)
- [tap-parser](http://npm.im/tap-parser)
- [treport](http://npm.im/treport)
- [minipass-fetch](http://npm.im/minipass-fetch)
- [pacote](http://npm.im/pacote)
- [make-fetch-happen](http://npm.im/make-fetch-happen)
- [cacache](http://npm.im/cacache)
- [ssri](http://npm.im/ssri)
- [npm-registry-fetch](http://npm.im/npm-registry-fetch)
- [minipass-json-stream](http://npm.im/minipass-json-stream)
- [minipass-sized](http://npm.im/minipass-sized)
## Differences from Node.js Streams
There are several things that make Minipass streams different from (and in
some ways superior to) Node.js core streams.
Please read these caveats if you are familiar with node-core streams and
intend to use Minipass streams in your programs.
You can avoid most of these differences entirely (for a very
small performance penalty) by setting `{async: true}` in the
constructor options.
### Timing
Minipass streams are designed to support synchronous use-cases. Thus, data
is emitted as soon as it is available, always. It is buffered until read,
but no longer. Another way to look at it is that Minipass streams are
exactly as synchronous as the logic that writes into them.
This can be surprising if your code relies on `PassThrough.write()` always
providing data on the next tick rather than the current one, or being able
to call `resume()` and not have the entire buffer disappear immediately.
However, without this synchronicity guarantee, there would be no way for
Minipass to achieve the speeds it does, or support the synchronous use
cases that it does. Simply put, waiting takes time.
This non-deferring approach makes Minipass streams much easier to reason
about, especially in the context of Promises and other flow-control
mechanisms.
Example:
```js
const Minipass = require('minipass')
const stream = new Minipass()
stream.on('data', () => console.log('data event'))
console.log('before write')
stream.write('hello')
console.log('after write')
// output:
// before write
// data event
// after write
```
### Exception: Async Opt-In
If you wish to have a Minipass stream with behavior that more
closely mimics Node.js core streams, you can set the stream in
async mode either by setting `async: true` in the constructor
options, or by setting `stream.async = true` later on.
```js
const Minipass = require('minipass')
const asyncStream = new Minipass({ async: true })
asyncStream.on('data', () => console.log('data event'))
console.log('before write')
asyncStream.write('hello')
console.log('after write')
// output:
// before write
// after write
// data event <-- this is deferred until the next tick
```
Switching _out_ of async mode is unsafe, as it could cause data
corruption, and so is not enabled. Example:
```js
const Minipass = require('minipass')
const stream = new Minipass({ encoding: 'utf8' })
stream.on('data', chunk => console.log(chunk))
stream.async = true
console.log('before writes')
stream.write('hello')
setStreamSyncAgainSomehow(stream) // <-- this doesn't actually exist!
stream.write('world')
console.log('after writes')
// hypothetical output would be:
// before writes
// world
// after writes
// hello
// NOT GOOD!
```
To avoid this problem, once set into async mode, any attempt to
make the stream sync again will be ignored.
```js
const Minipass = require('minipass')
const stream = new Minipass({ encoding: 'utf8' })
stream.on('data', chunk => console.log(chunk))
stream.async = true
console.log('before writes')
stream.write('hello')
stream.async = false // <-- no-op, stream already async
stream.write('world')
console.log('after writes')
// actual output:
// before writes
// after writes
// hello
// world
```
### No High/Low Water Marks
Node.js core streams will optimistically fill up a buffer, returning `true`
on all writes until the limit is hit, even if the data has nowhere to go.
Then, they will not attempt to draw more data in until the buffer size dips
below a minimum value.
Minipass streams are much simpler. The `write()` method will return `true`
if the data has somewhere to go (which is to say, given the timing
guarantees, that the data is already there by the time `write()` returns).
If the data has nowhere to go, then `write()` returns false, and the data
sits in a buffer, to be drained out immediately as soon as anyone consumes
it.
Since nothing is ever buffered unnecessarily, there is much less
copying data, and less bookkeeping about buffer capacity levels.
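A tiny sketch of the `write()` return semantics described above:
```js
const Minipass = require('minipass')
const mp = new Minipass({ encoding: 'utf8' })

// nobody is consuming yet, so the chunk is buffered: write() returns false
console.log(mp.write('queued'))   // false

// adding a data handler starts the flow and drains the buffer synchronously
mp.on('data', chunk => console.log('got', chunk))

// the stream is now flowing, so write() returns true
console.log(mp.write('flowing'))  // true
```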
### Hazards of Buffering (or: Why Minipass Is So Fast)
Since data written to a Minipass stream is immediately written all the way
through the pipeline, and `write()` always returns true/false based on
whether the data was fully flushed, backpressure is communicated
immediately to the upstream caller. This minimizes buffering.
Consider this case:
```js
const {PassThrough} = require('stream')
const p1 = new PassThrough({ highWaterMark: 1024 })
const p2 = new PassThrough({ highWaterMark: 1024 })
const p3 = new PassThrough({ highWaterMark: 1024 })
const p4 = new PassThrough({ highWaterMark: 1024 })
p1.pipe(p2).pipe(p3).pipe(p4)
p4.on('data', () => console.log('made it through'))
// this returns false and buffers, then writes to p2 on next tick (1)
// p2 returns false and buffers, pausing p1, then writes to p3 on next tick (2)
// p3 returns false and buffers, pausing p2, then writes to p4 on next tick (3)
// p4 returns false and buffers, pausing p3, then emits 'data' and 'drain'
// on next tick (4)
// p3 sees p4's 'drain' event, and calls resume(), emitting 'resume' and
// 'drain' on next tick (5)
// p2 sees p3's 'drain', calls resume(), emits 'resume' and 'drain' on next tick (6)
// p1 sees p2's 'drain', calls resume(), emits 'resume' and 'drain' on next
// tick (7)
p1.write(Buffer.alloc(2048)) // returns false
```
Along the way, the data was buffered and deferred at each stage, and
multiple event deferrals happened, for an unblocked pipeline where it was
perfectly safe to write all the way through!
Furthermore, setting a `highWaterMark` of `1024` might lead someone reading
the code to think an advisory maximum of 1KiB is being set for the
pipeline. However, the actual advisory buffering level is the _sum_ of
`highWaterMark` values, since each one has its own bucket.
Consider the Minipass case:
```js
const m1 = new Minipass()
const m2 = new Minipass()
const m3 = new Minipass()
const m4 = new Minipass()
m1.pipe(m2).pipe(m3).pipe(m4)
m4.on('data', () => console.log('made it through'))
// m1 is flowing, so it writes the data to m2 immediately
// m2 is flowing, so it writes the data to m3 immediately
// m3 is flowing, so it writes the data to m4 immediately
// m4 is flowing, so it fires the 'data' event immediately, returns true
// m4's write returned true, so m3 is still flowing, returns true
// m3's write returned true, so m2 is still flowing, returns true
// m2's write returned true, so m1 is still flowing, returns true
// No event deferrals or buffering along the way!
m1.write(Buffer.alloc(2048)) // returns true
```
It is extremely unlikely that you _don't_ want to buffer any data written,
or _ever_ buffer data that can be flushed all the way through. Neither
node-core streams nor Minipass ever fail to buffer written data, but
node-core streams do a lot of unnecessary buffering and pausing.
As always, the faster implementation is the one that does less stuff and
waits less time to do it.
### Immediately emit `end` for empty streams (when not paused)
If a stream is not paused, and `end()` is called before writing any data
into it, then it will emit `end` immediately.
If you have logic that occurs on the `end` event which you don't want to
potentially happen immediately (for example, closing file descriptors,
moving on to the next entry in an archive parse stream, etc.) then be sure
to call `stream.pause()` on creation, and then `stream.resume()` once you
are ready to respond to the `end` event.
However, this is _usually_ not a problem because:
### Emit `end` When Asked
One hazard of immediately emitting `'end'` is that you may not yet have had
a chance to add a listener. In order to avoid this hazard, Minipass
streams safely re-emit the `'end'` event if a new listener is added after
`'end'` has been emitted.
Ie, if you do `stream.on('end', someFunction)`, and the stream has already
emitted `end`, then it will call the handler right away. (You can think of
this somewhat like attaching a new `.then(fn)` to a previously-resolved
Promise.)
To avoid calling handlers that would not expect multiple `end` events, all
listeners are removed from the `'end'` event whenever it is emitted.
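A short sketch of that behavior:
```js
const Minipass = require('minipass')
const mp = new Minipass({ encoding: 'utf8' })
mp.on('data', () => {})   // consuming, so the stream is flowing
mp.end('bye')             // 'end' fires right away

// a listener attached after the fact is still called
mp.on('end', () => console.log('saw end anyway'))
```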
### Emit `error` When Asked
The most recent error object passed to the `'error'` event is
stored on the stream. If a new `'error'` event handler is added,
and an error was previously emitted, then the event handler will
be called immediately (or on `process.nextTick` in the case of
async streams).
This makes it much more difficult to end up trying to interact
with a broken stream, if the error handler is added after an
error was previously emitted.
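For instance (a minimal sketch):
```js
const Minipass = require('minipass')
const mp = new Minipass()
mp.on('error', er => console.error('first handler:', er.message))
mp.emit('error', new Error('boom'))

// the stored error is replayed to handlers added later, so this
// fires immediately instead of silently missing the failure
mp.on('error', er => console.error('late handler:', er.message))
```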
### Impact of "immediate flow" on Tee-streams
A "tee stream" is a stream piping to multiple destinations:
```js
const tee = new Minipass()
tee.pipe(dest1)
tee.pipe(dest2)
tee.write('foo') // goes to both destinations
```
Since Minipass streams _immediately_ process any pending data through the
pipeline when a new pipe destination is added, this can have surprising
effects, especially when a stream comes in from some other function and may
or may not have data in its buffer.
```js
// WARNING! WILL LOSE DATA!
const src = new Minipass()
src.write('foo')
src.pipe(dest1) // 'foo' chunk flows to dest1 immediately, and is gone
src.pipe(dest2) // gets nothing!
```
One solution is to create a dedicated tee-stream junction that pipes to
both locations, and then pipe to _that_ instead.
```js
// Safe example: tee to both places
const src = new Minipass()
src.write('foo')
const tee = new Minipass()
tee.pipe(dest1)
tee.pipe(dest2)
src.pipe(tee) // tee gets 'foo', pipes to both locations
```
The same caveat applies to `on('data')` event listeners. The first one
added will _immediately_ receive all of the data, leaving nothing for the
second:
```js
// WARNING! WILL LOSE DATA!
const src = new Minipass()
src.write('foo')
src.on('data', handler1) // receives 'foo' right away
src.on('data', handler2) // nothing to see here!
```
Using a dedicated tee-stream can be used in this case as well:
```js
// Safe example: tee to both data handlers
const src = new Minipass()
src.write('foo')
const tee = new Minipass()
tee.on('data', handler1)
tee.on('data', handler2)
src.pipe(tee)
```
All of the hazards in this section are avoided by setting `{
async: true }` in the Minipass constructor, or by setting
`stream.async = true` afterwards. Note that this does add some
overhead, so should only be done in cases where you are willing
to lose a bit of performance in order to avoid having to refactor
program logic.
## USAGE
It's a stream! Use it like a stream and it'll most likely do what you
want.
```js
const Minipass = require('minipass')
const mp = new Minipass(options) // optional: { encoding, objectMode }
mp.write('foo')
mp.pipe(someOtherStream)
mp.end('bar')
```
### OPTIONS
* `encoding` How would you like the data coming _out_ of the stream to be
encoded? Accepts any values that can be passed to `Buffer.toString()`.
* `objectMode` Emit data exactly as it comes in. This will be flipped on
by default if you write() something other than a string or Buffer at any
point. Setting `objectMode: true` will prevent setting any encoding
value.
* `async` Defaults to `false`. Set to `true` to defer data
emission until next tick. This reduces performance slightly,
but makes Minipass streams use timing behavior closer to Node
core streams. See [Timing](#timing) for more details.
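For example, each option corresponds to a constructor call like the following sketch:
```js
const Minipass = require('minipass')

// string mode: chunks come out as utf8 strings
const text = new Minipass({ encoding: 'utf8' })

// object mode: whatever is written is emitted as-is
const objects = new Minipass({ objectMode: true })

// async mode: data emission is deferred to the next tick,
// closer to node-core stream timing
const deferred = new Minipass({ async: true })
```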
### API
Implements the user-facing portions of Node.js's `Readable` and `Writable`
streams.
### Methods
* `write(chunk, [encoding], [callback])` - Put data in. (Note that, in the
base Minipass class, the same data will come out.) Returns `false` if
the stream will buffer the next write, or true if it's still in "flowing"
mode.
* `end([chunk, [encoding]], [callback])` - Signal that you have no more
data to write. This will queue an `end` event to be fired when all the
data has been consumed.
* `setEncoding(encoding)` - Set the encoding for data coming out of the stream.
This can only be done once.
* `pause()` - No more data for a while, please. This also prevents `end`
from being emitted for empty streams until the stream is resumed.
* `resume()` - Resume the stream. If there's data in the buffer, it is all
discarded. Any buffered events are immediately emitted.
* `pipe(dest)` - Send all output to the stream provided. When
data is emitted, it is immediately written to any and all pipe
destinations. (Or written on next tick in `async` mode.)
* `unpipe(dest)` - Stop piping to the destination stream. This
is immediate, meaning that any asynchronously queued data will
_not_ make it to the destination when running in `async` mode.
* `options.end` - Boolean, end the destination stream when
the source stream ends. Default `true`.
* `options.proxyErrors` - Boolean, proxy `error` events from
the source stream to the destination stream. Note that
errors are _not_ proxied after the pipeline terminates,
either due to the source emitting `'end'` or manually
unpiping with `src.unpipe(dest)`. Default `false`.
* `on(ev, fn)`, `emit(ev, fn)` - Minipass streams are EventEmitters. Some
events are given special treatment, however. (See below under "events".)
* `promise()` - Returns a Promise that resolves when the stream emits
`end`, or rejects if the stream emits `error`.
* `collect()` - Return a Promise that resolves on `end` with an array
containing each chunk of data that was emitted, or rejects if the stream
emits `error`. Note that this consumes the stream data.
* `concat()` - Same as `collect()`, but concatenates the data into a single
Buffer object. Will reject the returned promise if the stream is in
objectMode, or if it goes into objectMode by the end of the data.
* `read(n)` - Consume `n` bytes of data out of the buffer. If `n` is not
provided, then consume all of it. If `n` bytes are not available, then
it returns null. **Note** consuming streams in this way is less
efficient, and can lead to unnecessary Buffer copying.
* `destroy([er])` - Destroy the stream. If an error is provided, then an
`'error'` event is emitted. If the stream has a `close()` method, and
has not emitted a `'close'` event yet, then `stream.close()` will be
called. Any Promises returned by `.promise()`, `.collect()` or
`.concat()` will be rejected. After being destroyed, writing to the
stream will emit an error. No more data will be emitted if the stream is
destroyed, even if it was previously buffered.
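A brief sketch combining the `pipe()` options, `read()`, and error proxying (the stream names are arbitrary):
```js
const Minipass = require('minipass')
const src = new Minipass({ encoding: 'utf8' })
const dest = new Minipass({ encoding: 'utf8' })
dest.on('error', er => console.error('dest saw:', er.message))

// keep dest open when src ends, and forward src errors to dest
src.pipe(dest, { end: false, proxyErrors: true })

src.write('hello')
console.log(dest.read())   // 'hello'

src.emit('error', new Error('upstream failure'))   // proxied to dest
```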
### Properties
* `bufferLength` Read-only. Total number of bytes buffered, or in the case
of objectMode, the total number of objects.
* `encoding` The encoding that has been set. (Setting this is equivalent
to calling `setEncoding(enc)` and has the same prohibition against
setting multiple times.)
* `flowing` Read-only. Boolean indicating whether a chunk written to the
stream will be immediately emitted.
* `emittedEnd` Read-only. Boolean indicating whether the end-ish events
(ie, `end`, `prefinish`, `finish`) have been emitted. Note that
listening on any end-ish event will immediately re-emit it if it has
already been emitted.
* `writable` Whether the stream is writable. Default `true`. Set to
`false` when `end()` is called.
* `readable` Whether the stream is readable. Default `true`.
* `buffer` A [yallist](http://npm.im/yallist) linked list of chunks written
to the stream that have not yet been emitted. (It's probably a bad idea
to mess with this.)
* `pipes` A [yallist](http://npm.im/yallist) linked list of streams that
this stream is piping into. (It's probably a bad idea to mess with
this.)
* `destroyed` A getter that indicates whether the stream was destroyed.
* `paused` True if the stream has been explicitly paused, otherwise false.
* `objectMode` Indicates whether the stream is in `objectMode`. Once set
to `true`, it cannot be set to `false`.
### Events
* `data` Emitted when there's data to read. Argument is the data to read.
This is never emitted while not flowing. If a listener is attached, that
will resume the stream.
* `end` Emitted when there's no more data to read. This will be emitted
immediately for empty streams when `end()` is called. If a listener is
attached, and `end` was already emitted, then it will be emitted again.
All listeners are removed when `end` is emitted.
* `prefinish` An end-ish event that follows the same logic as `end` and is
emitted in the same conditions where `end` is emitted. Emitted after
`'end'`.
* `finish` An end-ish event that follows the same logic as `end` and is
emitted in the same conditions where `end` is emitted. Emitted after
`'prefinish'`.
* `close` An indication that an underlying resource has been released.
Minipass does not emit this event, but will defer it until after `end`
has been emitted, since it throws off some stream libraries otherwise.
* `drain` Emitted when the internal buffer empties, and it is again
suitable to `write()` into the stream.
* `readable` Emitted when data is buffered and ready to be read by a
consumer.
* `resume` Emitted when stream changes state from buffering to flowing
mode. (Ie, when `resume` is called, `pipe` is called, or a `data` event
listener is added.)
### Static Methods
* `Minipass.isStream(stream)` Returns `true` if the argument is a stream,
and false otherwise. To be considered a stream, the object must be
either an instance of Minipass, or an EventEmitter that has either a
`pipe()` method, or both `write()` and `end()` methods. (Pretty much any
stream in node-land will return `true` for this.)
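A quick sketch of how the check behaves:
```js
const Minipass = require('minipass')
const EE = require('events')

Minipass.isStream(new Minipass())     // true
Minipass.isStream(process.stdin)      // true: EventEmitter with a pipe() method
Minipass.isStream(new EE())           // false: no pipe(), write(), or end()
Minipass.isStream({ write () {}, end () {} })   // false: not an EventEmitter
```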
## EXAMPLES
Here are some examples of things you can do with Minipass streams.
### simple "are you done yet" promise
```js
mp.promise().then(() => {
// stream is finished
}, er => {
// stream emitted an error
})
```
### collecting
```js
mp.collect().then(all => {
// all is an array of all the data emitted
// encoding is supported in this case, so
// the result will be a collection of strings if
// an encoding is specified, or buffers/objects if not.
//
// In an async function, you may do
// const data = await stream.collect()
})
```
### collecting into a single blob
This is a bit slower because it concatenates the data into one chunk for
you, but if you're going to do it yourself anyway, it's convenient this
way:
```js
mp.concat().then(onebigchunk => {
// onebigchunk is a string if the stream
// had an encoding set, or a buffer otherwise.
})
```
### iteration
You can iterate over streams synchronously or asynchronously in platforms
that support it.
Synchronous iteration will end when the currently available data is
consumed, even if the `end` event has not been reached. In string and
buffer mode, the data is concatenated, so unless multiple writes are
occurring in the same tick as the `read()`, sync iteration loops will
generally only have a single iteration.
To consume chunks in this way exactly as they have been written, with no
flattening, create the stream with the `{ objectMode: true }` option.
```js
const mp = new Minipass({ objectMode: true })
mp.write('a')
mp.write('b')
for (let letter of mp) {
console.log(letter) // a, b
}
mp.write('c')
mp.write('d')
for (let letter of mp) {
console.log(letter) // c, d
}
mp.write('e')
mp.end()
for (let letter of mp) {
console.log(letter) // e
}
for (let letter of mp) {
console.log(letter) // nothing
}
```
Asynchronous iteration will continue until the end event is reached,
consuming all of the data.
```js
const mp = new Minipass({ encoding: 'utf8' })
// some source of some data
let i = 5
const inter = setInterval(() => {
if (i-- > 0)
mp.write(Buffer.from('foo\n', 'utf8'))
else {
mp.end()
clearInterval(inter)
}
}, 100)
// consume the data with asynchronous iteration
async function consume () {
for await (let chunk of mp) {
console.log(chunk)
}
return 'ok'
}
consume().then(res => console.log(res))
// logs `foo\n` 5 times, and then `ok`
```
### subclass that `console.log()`s everything written into it
```js
class Logger extends Minipass {
write (chunk, encoding, callback) {
console.log('WRITE', chunk, encoding)
return super.write(chunk, encoding, callback)
}
end (chunk, encoding, callback) {
console.log('END', chunk, encoding)
return super.end(chunk, encoding, callback)
}
}
someSource.pipe(new Logger()).pipe(someDest)
```
### same thing, but using an inline anonymous class
```js
// js classes are fun
someSource
.pipe(new (class extends Minipass {
emit (ev, ...data) {
// let's also log events, because debugging some weird thing
console.log('EMIT', ev)
return super.emit(ev, ...data)
}
write (chunk, encoding, callback) {
console.log('WRITE', chunk, encoding)
return super.write(chunk, encoding, callback)
}
end (chunk, encoding, callback) {
console.log('END', chunk, encoding)
return super.end(chunk, encoding, callback)
}
}))
.pipe(someDest)
```
### subclass that defers 'end' for some reason
```js
class SlowEnd extends Minipass {
emit (ev, ...args) {
if (ev === 'end') {
console.log('going to end, hold on a sec')
setTimeout(() => {
console.log('ok, ready to end now')
super.emit('end', ...args)
}, 100)
} else {
return super.emit(ev, ...args)
}
}
}
```
### transform that creates newline-delimited JSON
```js
class NDJSONEncode extends Minipass {
write (obj, cb) {
try {
// JSON.stringify can throw, emit an error on that
return super.write(JSON.stringify(obj) + '\n', 'utf8', cb)
} catch (er) {
this.emit('error', er)
}
}
end (obj, cb) {
if (typeof obj === 'function') {
cb = obj
obj = undefined
}
if (obj !== undefined) {
this.write(obj)
}
return super.end(cb)
}
}
```
### transform that parses newline-delimited JSON
```js
class NDJSONDecode extends Minipass {
constructor (options) {
// always be in object mode, as far as Minipass is concerned
super({ objectMode: true })
this._jsonBuffer = ''
}
write (chunk, encoding, cb) {
if (typeof chunk === 'string' &&
typeof encoding === 'string' &&
encoding !== 'utf8') {
chunk = Buffer.from(chunk, encoding).toString()
} else if (Buffer.isBuffer(chunk)) {
chunk = chunk.toString()
}
if (typeof encoding === 'function') {
cb = encoding
}
const jsonData = (this._jsonBuffer + chunk).split('\n')
this._jsonBuffer = jsonData.pop()
for (let i = 0; i < jsonData.length; i++) {
try {
// JSON.parse can throw, emit an error on that
super.write(JSON.parse(jsonData[i]))
} catch (er) {
this.emit('error', er)
continue
}
}
if (cb)
cb()
}
}
```

155
node_modules/fs-minipass/node_modules/minipass/index.d.ts generated vendored Normal file
View file

@ -0,0 +1,155 @@
/// <reference types="node" />
import { EventEmitter } from 'events'
import { Stream } from 'stream'
declare namespace Minipass {
type Encoding = BufferEncoding | 'buffer' | null
interface Writable extends EventEmitter {
end(): any
write(chunk: any, ...args: any[]): any
}
interface Readable extends EventEmitter {
pause(): any
resume(): any
pipe(): any
}
interface Pipe<R, W> {
src: Minipass<R, W>
dest: Writable
opts: PipeOptions
}
type DualIterable<T> = Iterable<T> & AsyncIterable<T>
type ContiguousData = Buffer | ArrayBufferLike | ArrayBufferView | string
type BufferOrString = Buffer | string
interface StringOptions {
encoding: BufferEncoding
objectMode?: boolean
async?: boolean
}
interface BufferOptions {
encoding?: null | 'buffer'
objectMode?: boolean
async?: boolean
}
interface ObjectModeOptions {
objectMode: true
async?: boolean
}
interface PipeOptions {
end?: boolean
proxyErrors?: boolean
}
type Options<T> = T extends string
? StringOptions
: T extends Buffer
? BufferOptions
: ObjectModeOptions
}
declare class Minipass<
RType extends any = Buffer,
WType extends any = RType extends Minipass.BufferOrString
? Minipass.ContiguousData
: RType
>
extends Stream
implements Minipass.DualIterable<RType>
{
static isStream(stream: any): stream is Minipass.Readable | Minipass.Writable
readonly bufferLength: number
readonly flowing: boolean
readonly writable: boolean
readonly readable: boolean
readonly paused: boolean
readonly emittedEnd: boolean
readonly destroyed: boolean
/**
* Not technically private or readonly, but not safe to mutate.
*/
private readonly buffer: RType[]
private readonly pipes: Minipass.Pipe<RType, WType>[]
/**
* Technically writable, but mutating it can change the type,
* so is not safe to do in TypeScript.
*/
readonly objectMode: boolean
async: boolean
/**
* Note: encoding is not actually read-only, and setEncoding(enc)
* exists. However, this type definition will insist that TypeScript
* programs declare the type of a Minipass stream up front, and if
* that type is string, then an encoding MUST be set in the ctor. If
* the type is Buffer, then the encoding must be missing, or set to
* 'buffer' or null. If the type is anything else, then objectMode
* must be set in the constructor options. So there is effectively
* no allowed way that a TS program can set the encoding after
* construction, as doing so will destroy any hope of type safety.
* TypeScript does not provide many options for changing the type of
* an object at run-time, which is what changing the encoding does.
*/
readonly encoding: Minipass.Encoding
// setEncoding(encoding: Encoding): void
// Options required if not reading buffers
constructor(
...args: RType extends Buffer
? [] | [Minipass.Options<RType>]
: [Minipass.Options<RType>]
)
write(chunk: WType, cb?: () => void): boolean
write(chunk: WType, encoding?: Minipass.Encoding, cb?: () => void): boolean
read(size?: number): RType
end(cb?: () => void): this
end(chunk: any, cb?: () => void): this
end(chunk: any, encoding?: Minipass.Encoding, cb?: () => void): this
pause(): void
resume(): void
promise(): Promise<void>
collect(): Promise<RType[]>
concat(): RType extends Minipass.BufferOrString ? Promise<RType> : never
destroy(er?: any): void
pipe<W extends Minipass.Writable>(dest: W, opts?: Minipass.PipeOptions): W
unpipe<W extends Minipass.Writable>(dest: W): void
/**
* alias for on()
*/
addEventHandler(event: string, listener: (...args: any[]) => any): this
on(event: string, listener: (...args: any[]) => any): this
on(event: 'data', listener: (chunk: RType) => any): this
on(event: 'error', listener: (error: any) => any): this
on(
event:
| 'readable'
| 'drain'
| 'resume'
| 'end'
| 'prefinish'
| 'finish'
| 'close',
listener: () => any
): this
[Symbol.iterator](): Iterator<RType>
[Symbol.asyncIterator](): AsyncIterator<RType>
}
export = Minipass

649
node_modules/fs-minipass/node_modules/minipass/index.js generated vendored Normal file
View file

@ -0,0 +1,649 @@
'use strict'
const proc = typeof process === 'object' && process ? process : {
stdout: null,
stderr: null,
}
const EE = require('events')
const Stream = require('stream')
const SD = require('string_decoder').StringDecoder
const EOF = Symbol('EOF')
const MAYBE_EMIT_END = Symbol('maybeEmitEnd')
const EMITTED_END = Symbol('emittedEnd')
const EMITTING_END = Symbol('emittingEnd')
const EMITTED_ERROR = Symbol('emittedError')
const CLOSED = Symbol('closed')
const READ = Symbol('read')
const FLUSH = Symbol('flush')
const FLUSHCHUNK = Symbol('flushChunk')
const ENCODING = Symbol('encoding')
const DECODER = Symbol('decoder')
const FLOWING = Symbol('flowing')
const PAUSED = Symbol('paused')
const RESUME = Symbol('resume')
const BUFFERLENGTH = Symbol('bufferLength')
const BUFFERPUSH = Symbol('bufferPush')
const BUFFERSHIFT = Symbol('bufferShift')
const OBJECTMODE = Symbol('objectMode')
const DESTROYED = Symbol('destroyed')
const EMITDATA = Symbol('emitData')
const EMITEND = Symbol('emitEnd')
const EMITEND2 = Symbol('emitEnd2')
const ASYNC = Symbol('async')
const defer = fn => Promise.resolve().then(fn)
// TODO remove when Node v8 support drops
const doIter = global._MP_NO_ITERATOR_SYMBOLS_ !== '1'
const ASYNCITERATOR = doIter && Symbol.asyncIterator
|| Symbol('asyncIterator not implemented')
const ITERATOR = doIter && Symbol.iterator
|| Symbol('iterator not implemented')
// events that mean 'the stream is over'
// these are treated specially, and re-emitted
// if they are listened for after emitting.
const isEndish = ev =>
ev === 'end' ||
ev === 'finish' ||
ev === 'prefinish'
const isArrayBuffer = b => b instanceof ArrayBuffer ||
typeof b === 'object' &&
b.constructor &&
b.constructor.name === 'ArrayBuffer' &&
b.byteLength >= 0
const isArrayBufferView = b => !Buffer.isBuffer(b) && ArrayBuffer.isView(b)
class Pipe {
constructor (src, dest, opts) {
this.src = src
this.dest = dest
this.opts = opts
this.ondrain = () => src[RESUME]()
dest.on('drain', this.ondrain)
}
unpipe () {
this.dest.removeListener('drain', this.ondrain)
}
// istanbul ignore next - only here for the prototype
proxyErrors () {}
end () {
this.unpipe()
if (this.opts.end)
this.dest.end()
}
}
class PipeProxyErrors extends Pipe {
unpipe () {
this.src.removeListener('error', this.proxyErrors)
super.unpipe()
}
constructor (src, dest, opts) {
super(src, dest, opts)
this.proxyErrors = er => dest.emit('error', er)
src.on('error', this.proxyErrors)
}
}
module.exports = class Minipass extends Stream {
constructor (options) {
super()
this[FLOWING] = false
// whether we're explicitly paused
this[PAUSED] = false
this.pipes = []
this.buffer = []
this[OBJECTMODE] = options && options.objectMode || false
if (this[OBJECTMODE])
this[ENCODING] = null
else
this[ENCODING] = options && options.encoding || null
if (this[ENCODING] === 'buffer')
this[ENCODING] = null
this[ASYNC] = options && !!options.async || false
this[DECODER] = this[ENCODING] ? new SD(this[ENCODING]) : null
this[EOF] = false
this[EMITTED_END] = false
this[EMITTING_END] = false
this[CLOSED] = false
this[EMITTED_ERROR] = null
this.writable = true
this.readable = true
this[BUFFERLENGTH] = 0
this[DESTROYED] = false
}
get bufferLength () { return this[BUFFERLENGTH] }
get encoding () { return this[ENCODING] }
set encoding (enc) {
if (this[OBJECTMODE])
throw new Error('cannot set encoding in objectMode')
if (this[ENCODING] && enc !== this[ENCODING] &&
(this[DECODER] && this[DECODER].lastNeed || this[BUFFERLENGTH]))
throw new Error('cannot change encoding')
if (this[ENCODING] !== enc) {
this[DECODER] = enc ? new SD(enc) : null
if (this.buffer.length)
this.buffer = this.buffer.map(chunk => this[DECODER].write(chunk))
}
this[ENCODING] = enc
}
setEncoding (enc) {
this.encoding = enc
}
get objectMode () { return this[OBJECTMODE] }
set objectMode (om) { this[OBJECTMODE] = this[OBJECTMODE] || !!om }
get ['async'] () { return this[ASYNC] }
set ['async'] (a) { this[ASYNC] = this[ASYNC] || !!a }
write (chunk, encoding, cb) {
if (this[EOF])
throw new Error('write after end')
if (this[DESTROYED]) {
this.emit('error', Object.assign(
new Error('Cannot call write after a stream was destroyed'),
{ code: 'ERR_STREAM_DESTROYED' }
))
return true
}
if (typeof encoding === 'function')
cb = encoding, encoding = 'utf8'
if (!encoding)
encoding = 'utf8'
const fn = this[ASYNC] ? defer : f => f()
// convert array buffers and typed array views into buffers
// at some point in the future, we may want to do the opposite!
// leave strings and buffers as-is
// anything else switches us into object mode
if (!this[OBJECTMODE] && !Buffer.isBuffer(chunk)) {
if (isArrayBufferView(chunk))
chunk = Buffer.from(chunk.buffer, chunk.byteOffset, chunk.byteLength)
else if (isArrayBuffer(chunk))
chunk = Buffer.from(chunk)
else if (typeof chunk !== 'string')
// use the setter so we throw if we have encoding set
this.objectMode = true
}
// handle object mode up front, since it's simpler
// this yields better performance, fewer checks later.
if (this[OBJECTMODE]) {
/* istanbul ignore if - maybe impossible? */
if (this.flowing && this[BUFFERLENGTH] !== 0)
this[FLUSH](true)
if (this.flowing)
this.emit('data', chunk)
else
this[BUFFERPUSH](chunk)
if (this[BUFFERLENGTH] !== 0)
this.emit('readable')
if (cb)
fn(cb)
return this.flowing
}
// at this point the chunk is a buffer or string
// don't buffer it up or send it to the decoder
if (!chunk.length) {
if (this[BUFFERLENGTH] !== 0)
this.emit('readable')
if (cb)
fn(cb)
return this.flowing
}
// fast-path writing strings of same encoding to a stream with
// an empty buffer, skipping the buffer/decoder dance
if (typeof chunk === 'string' &&
// unless it is a string already ready for us to use
!(encoding === this[ENCODING] && !this[DECODER].lastNeed)) {
chunk = Buffer.from(chunk, encoding)
}
if (Buffer.isBuffer(chunk) && this[ENCODING])
chunk = this[DECODER].write(chunk)
// Note: flushing CAN potentially switch us into not-flowing mode
if (this.flowing && this[BUFFERLENGTH] !== 0)
this[FLUSH](true)
if (this.flowing)
this.emit('data', chunk)
else
this[BUFFERPUSH](chunk)
if (this[BUFFERLENGTH] !== 0)
this.emit('readable')
if (cb)
fn(cb)
return this.flowing
}
read (n) {
if (this[DESTROYED])
return null
if (this[BUFFERLENGTH] === 0 || n === 0 || n > this[BUFFERLENGTH]) {
this[MAYBE_EMIT_END]()
return null
}
if (this[OBJECTMODE])
n = null
if (this.buffer.length > 1 && !this[OBJECTMODE]) {
if (this.encoding)
this.buffer = [this.buffer.join('')]
else
this.buffer = [Buffer.concat(this.buffer, this[BUFFERLENGTH])]
}
const ret = this[READ](n || null, this.buffer[0])
this[MAYBE_EMIT_END]()
return ret
}
[READ] (n, chunk) {
if (n === chunk.length || n === null)
this[BUFFERSHIFT]()
else {
this.buffer[0] = chunk.slice(n)
chunk = chunk.slice(0, n)
this[BUFFERLENGTH] -= n
}
this.emit('data', chunk)
if (!this.buffer.length && !this[EOF])
this.emit('drain')
return chunk
}
end (chunk, encoding, cb) {
if (typeof chunk === 'function')
cb = chunk, chunk = null
if (typeof encoding === 'function')
cb = encoding, encoding = 'utf8'
if (chunk)
this.write(chunk, encoding)
if (cb)
this.once('end', cb)
this[EOF] = true
this.writable = false
// if we haven't written anything, then go ahead and emit,
// even if we're not reading.
// we'll re-emit if a new 'end' listener is added anyway.
// This makes MP more suitable to write-only use cases.
if (this.flowing || !this[PAUSED])
this[MAYBE_EMIT_END]()
return this
}
// don't let the internal resume be overwritten
[RESUME] () {
if (this[DESTROYED])
return
this[PAUSED] = false
this[FLOWING] = true
this.emit('resume')
if (this.buffer.length)
this[FLUSH]()
else if (this[EOF])
this[MAYBE_EMIT_END]()
else
this.emit('drain')
}
resume () {
return this[RESUME]()
}
pause () {
this[FLOWING] = false
this[PAUSED] = true
}
get destroyed () {
return this[DESTROYED]
}
get flowing () {
return this[FLOWING]
}
get paused () {
return this[PAUSED]
}
[BUFFERPUSH] (chunk) {
if (this[OBJECTMODE])
this[BUFFERLENGTH] += 1
else
this[BUFFERLENGTH] += chunk.length
this.buffer.push(chunk)
}
[BUFFERSHIFT] () {
if (this.buffer.length) {
if (this[OBJECTMODE])
this[BUFFERLENGTH] -= 1
else
this[BUFFERLENGTH] -= this.buffer[0].length
}
return this.buffer.shift()
}
[FLUSH] (noDrain) {
do {} while (this[FLUSHCHUNK](this[BUFFERSHIFT]()))
if (!noDrain && !this.buffer.length && !this[EOF])
this.emit('drain')
}
[FLUSHCHUNK] (chunk) {
return chunk ? (this.emit('data', chunk), this.flowing) : false
}
pipe (dest, opts) {
if (this[DESTROYED])
return
const ended = this[EMITTED_END]
opts = opts || {}
if (dest === proc.stdout || dest === proc.stderr)
opts.end = false
else
opts.end = opts.end !== false
opts.proxyErrors = !!opts.proxyErrors
// piping an ended stream ends immediately
if (ended) {
if (opts.end)
dest.end()
} else {
this.pipes.push(!opts.proxyErrors ? new Pipe(this, dest, opts)
: new PipeProxyErrors(this, dest, opts))
if (this[ASYNC])
defer(() => this[RESUME]())
else
this[RESUME]()
}
return dest
}
unpipe (dest) {
const p = this.pipes.find(p => p.dest === dest)
if (p) {
this.pipes.splice(this.pipes.indexOf(p), 1)
p.unpipe()
}
}
addListener (ev, fn) {
return this.on(ev, fn)
}
on (ev, fn) {
const ret = super.on(ev, fn)
if (ev === 'data' && !this.pipes.length && !this.flowing)
this[RESUME]()
else if (ev === 'readable' && this[BUFFERLENGTH] !== 0)
super.emit('readable')
else if (isEndish(ev) && this[EMITTED_END]) {
super.emit(ev)
this.removeAllListeners(ev)
} else if (ev === 'error' && this[EMITTED_ERROR]) {
if (this[ASYNC])
defer(() => fn.call(this, this[EMITTED_ERROR]))
else
fn.call(this, this[EMITTED_ERROR])
}
return ret
}
get emittedEnd () {
return this[EMITTED_END]
}
[MAYBE_EMIT_END] () {
if (!this[EMITTING_END] &&
!this[EMITTED_END] &&
!this[DESTROYED] &&
this.buffer.length === 0 &&
this[EOF]) {
this[EMITTING_END] = true
this.emit('end')
this.emit('prefinish')
this.emit('finish')
if (this[CLOSED])
this.emit('close')
this[EMITTING_END] = false
}
}
emit (ev, data, ...extra) {
// error and close are only events allowed after calling destroy()
if (ev !== 'error' && ev !== 'close' && ev !== DESTROYED && this[DESTROYED])
return
else if (ev === 'data') {
return !data ? false
: this[ASYNC] ? defer(() => this[EMITDATA](data))
: this[EMITDATA](data)
} else if (ev === 'end') {
return this[EMITEND]()
} else if (ev === 'close') {
this[CLOSED] = true
// don't emit close before 'end' and 'finish'
if (!this[EMITTED_END] && !this[DESTROYED])
return
const ret = super.emit('close')
this.removeAllListeners('close')
return ret
} else if (ev === 'error') {
this[EMITTED_ERROR] = data
const ret = super.emit('error', data)
this[MAYBE_EMIT_END]()
return ret
} else if (ev === 'resume') {
const ret = super.emit('resume')
this[MAYBE_EMIT_END]()
return ret
} else if (ev === 'finish' || ev === 'prefinish') {
const ret = super.emit(ev)
this.removeAllListeners(ev)
return ret
}
// Some other unknown event
const ret = super.emit(ev, data, ...extra)
this[MAYBE_EMIT_END]()
return ret
}
[EMITDATA] (data) {
for (const p of this.pipes) {
if (p.dest.write(data) === false)
this.pause()
}
const ret = super.emit('data', data)
this[MAYBE_EMIT_END]()
return ret
}
[EMITEND] () {
if (this[EMITTED_END])
return
this[EMITTED_END] = true
this.readable = false
if (this[ASYNC])
defer(() => this[EMITEND2]())
else
this[EMITEND2]()
}
[EMITEND2] () {
if (this[DECODER]) {
const data = this[DECODER].end()
if (data) {
for (const p of this.pipes) {
p.dest.write(data)
}
super.emit('data', data)
}
}
for (const p of this.pipes) {
p.end()
}
const ret = super.emit('end')
this.removeAllListeners('end')
return ret
}
// const all = await stream.collect()
collect () {
const buf = []
if (!this[OBJECTMODE])
buf.dataLength = 0
// set the promise first, in case an error is raised
// by triggering the flow here.
const p = this.promise()
this.on('data', c => {
buf.push(c)
if (!this[OBJECTMODE])
buf.dataLength += c.length
})
return p.then(() => buf)
}
// const data = await stream.concat()
concat () {
return this[OBJECTMODE]
? Promise.reject(new Error('cannot concat in objectMode'))
: this.collect().then(buf =>
this[OBJECTMODE]
? Promise.reject(new Error('cannot concat in objectMode'))
: this[ENCODING] ? buf.join('') : Buffer.concat(buf, buf.dataLength))
}
// stream.promise().then(() => done, er => emitted error)
promise () {
return new Promise((resolve, reject) => {
this.on(DESTROYED, () => reject(new Error('stream destroyed')))
this.on('error', er => reject(er))
this.on('end', () => resolve())
})
}
// for await (let chunk of stream)
[ASYNCITERATOR] () {
const next = () => {
const res = this.read()
if (res !== null)
return Promise.resolve({ done: false, value: res })
if (this[EOF])
return Promise.resolve({ done: true })
let resolve = null
let reject = null
const onerr = er => {
this.removeListener('data', ondata)
this.removeListener('end', onend)
reject(er)
}
const ondata = value => {
this.removeListener('error', onerr)
this.removeListener('end', onend)
this.pause()
resolve({ value: value, done: !!this[EOF] })
}
const onend = () => {
this.removeListener('error', onerr)
this.removeListener('data', ondata)
resolve({ done: true })
}
const ondestroy = () => onerr(new Error('stream destroyed'))
return new Promise((res, rej) => {
reject = rej
resolve = res
this.once(DESTROYED, ondestroy)
this.once('error', onerr)
this.once('end', onend)
this.once('data', ondata)
})
}
return { next }
}
// for (let chunk of stream)
[ITERATOR] () {
const next = () => {
const value = this.read()
const done = value === null
return { value, done }
}
return { next }
}
destroy (er) {
if (this[DESTROYED]) {
if (er)
this.emit('error', er)
else
this.emit(DESTROYED)
return this
}
this[DESTROYED] = true
// throw away all buffered data, it's never coming out
this.buffer.length = 0
this[BUFFERLENGTH] = 0
if (typeof this.close === 'function' && !this[CLOSED])
this.close()
if (er)
this.emit('error', er)
else // if no error to emit, still reject pending promises
this.emit(DESTROYED)
return this
}
static isStream (s) {
return !!s && (s instanceof Minipass || s instanceof Stream ||
s instanceof EE && (
typeof s.pipe === 'function' || // readable
(typeof s.write === 'function' && typeof s.end === 'function') // writable
))
}
}

89
node_modules/fs-minipass/node_modules/minipass/package.json generated vendored Normal file
View file

@ -0,0 +1,89 @@
{
"_from": "minipass@^3.0.0",
"_id": "minipass@3.3.6",
"_inBundle": false,
"_integrity": "sha512-DxiNidxSEK+tHG6zOIklvNOwm3hvCrbUrdtzY74U6HKTJxvIDfOUL5W5P2Ghd3DTkhhKPYGqeNUIh5qcM4YBfw==",
"_location": "/fs-minipass/minipass",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "minipass@^3.0.0",
"name": "minipass",
"escapedName": "minipass",
"rawSpec": "^3.0.0",
"saveSpec": null,
"fetchSpec": "^3.0.0"
},
"_requiredBy": [
"/fs-minipass"
],
"_resolved": "https://registry.npmjs.org/minipass/-/minipass-3.3.6.tgz",
"_shasum": "7bba384db3a1520d18c9c0e5251c3444e95dd94a",
"_spec": "minipass@^3.0.0",
"_where": "/home/ubuntu/fix/node_modules/fs-minipass",
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"bugs": {
"url": "https://github.com/isaacs/minipass/issues"
},
"bundleDependencies": false,
"dependencies": {
"yallist": "^4.0.0"
},
"deprecated": false,
"description": "minimal implementation of a PassThrough stream",
"devDependencies": {
"@types/node": "^17.0.41",
"end-of-stream": "^1.4.0",
"prettier": "^2.6.2",
"tap": "^16.2.0",
"through2": "^2.0.3",
"ts-node": "^10.8.1",
"typescript": "^4.7.3"
},
"engines": {
"node": ">=8"
},
"files": [
"index.d.ts",
"index.js"
],
"homepage": "https://github.com/isaacs/minipass#readme",
"keywords": [
"passthrough",
"stream"
],
"license": "ISC",
"main": "index.js",
"name": "minipass",
"prettier": {
"semi": false,
"printWidth": 80,
"tabWidth": 2,
"useTabs": false,
"singleQuote": true,
"jsxSingleQuote": false,
"bracketSameLine": true,
"arrowParens": "avoid",
"endOfLine": "lf"
},
"repository": {
"type": "git",
"url": "git+https://github.com/isaacs/minipass.git"
},
"scripts": {
"postpublish": "git push origin --follow-tags",
"postversion": "npm publish",
"preversion": "npm test",
"test": "tap"
},
"tap": {
"check-coverage": true
},
"types": "index.d.ts",
"version": "3.3.6"
}

68
node_modules/fs-minipass/package.json generated vendored Normal file
View file

@ -0,0 +1,68 @@
{
"_from": "fs-minipass@^2.0.0",
"_id": "fs-minipass@2.1.0",
"_inBundle": false,
"_integrity": "sha512-V/JgOLFCS+R6Vcq0slCuaeWEdNC3ouDlJMNIsacH2VtALiu9mV4LPrHc5cDl8k5aw6J8jwgWWpiTo5RYhmIzvg==",
"_location": "/fs-minipass",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "fs-minipass@^2.0.0",
"name": "fs-minipass",
"escapedName": "fs-minipass",
"rawSpec": "^2.0.0",
"saveSpec": null,
"fetchSpec": "^2.0.0"
},
"_requiredBy": [
"/tar"
],
"_resolved": "https://registry.npmjs.org/fs-minipass/-/fs-minipass-2.1.0.tgz",
"_shasum": "7f5036fdbf12c63c169190cbe4199c852271f9fb",
"_spec": "fs-minipass@^2.0.0",
"_where": "/media/serveradmin/Server/developement/node_modules/tar",
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"bugs": {
"url": "https://github.com/npm/fs-minipass/issues"
},
"bundleDependencies": false,
"dependencies": {
"minipass": "^3.0.0"
},
"deprecated": false,
"description": "fs read and write streams based on minipass",
"devDependencies": {
"mutate-fs": "^2.0.1",
"tap": "^14.6.4"
},
"engines": {
"node": ">= 8"
},
"files": [
"index.js"
],
"homepage": "https://github.com/npm/fs-minipass#readme",
"keywords": [],
"license": "ISC",
"main": "index.js",
"name": "fs-minipass",
"repository": {
"type": "git",
"url": "git+https://github.com/npm/fs-minipass.git"
},
"scripts": {
"postpublish": "git push origin --follow-tags",
"postversion": "npm publish",
"preversion": "npm test",
"test": "tap"
},
"tap": {
"check-coverage": true
},
"version": "2.1.0"
}

20
node_modules/function-bind/.editorconfig generated vendored Normal file
View file

@ -0,0 +1,20 @@
root = true
[*]
indent_style = tab
indent_size = 4
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
max_line_length = 120
[CHANGELOG.md]
indent_style = space
indent_size = 2
[*.json]
max_line_length = off
[Makefile]
max_line_length = off

15
node_modules/function-bind/.eslintrc generated vendored Normal file
View file

@ -0,0 +1,15 @@
{
"root": true,
"extends": "@ljharb",
"rules": {
"func-name-matching": 0,
"indent": [2, 4],
"max-nested-callbacks": [2, 3],
"max-params": [2, 3],
"max-statements": [2, 20],
"no-new-func": [1],
"strict": [0]
}
}

176
node_modules/function-bind/.jscs.json generated vendored Normal file
View file

@ -0,0 +1,176 @@
{
"es3": true,
"additionalRules": [],
"requireSemicolons": true,
"disallowMultipleSpaces": true,
"disallowIdentifierNames": [],
"requireCurlyBraces": {
"allExcept": [],
"keywords": ["if", "else", "for", "while", "do", "try", "catch"]
},
"requireSpaceAfterKeywords": ["if", "else", "for", "while", "do", "switch", "return", "try", "catch", "function"],
"disallowSpaceAfterKeywords": [],
"disallowSpaceBeforeComma": true,
"disallowSpaceAfterComma": false,
"disallowSpaceBeforeSemicolon": true,
"disallowNodeTypes": [
"DebuggerStatement",
"ForInStatement",
"LabeledStatement",
"SwitchCase",
"SwitchStatement",
"WithStatement"
],
"requireObjectKeysOnNewLine": { "allExcept": ["sameLine"] },
"requireSpacesInAnonymousFunctionExpression": { "beforeOpeningRoundBrace": true, "beforeOpeningCurlyBrace": true },
"requireSpacesInNamedFunctionExpression": { "beforeOpeningCurlyBrace": true },
"disallowSpacesInNamedFunctionExpression": { "beforeOpeningRoundBrace": true },
"requireSpacesInFunctionDeclaration": { "beforeOpeningCurlyBrace": true },
"disallowSpacesInFunctionDeclaration": { "beforeOpeningRoundBrace": true },
"requireSpaceBetweenArguments": true,
"disallowSpacesInsideParentheses": true,
"disallowSpacesInsideArrayBrackets": true,
"disallowQuotedKeysInObjects": { "allExcept": ["reserved"] },
"disallowSpaceAfterObjectKeys": true,
"requireCommaBeforeLineBreak": true,
"disallowSpaceAfterPrefixUnaryOperators": ["++", "--", "+", "-", "~", "!"],
"requireSpaceAfterPrefixUnaryOperators": [],
"disallowSpaceBeforePostfixUnaryOperators": ["++", "--"],
"requireSpaceBeforePostfixUnaryOperators": [],
"disallowSpaceBeforeBinaryOperators": [],
"requireSpaceBeforeBinaryOperators": ["+", "-", "/", "*", "=", "==", "===", "!=", "!=="],
"requireSpaceAfterBinaryOperators": ["+", "-", "/", "*", "=", "==", "===", "!=", "!=="],
"disallowSpaceAfterBinaryOperators": [],
"disallowImplicitTypeConversion": ["binary", "string"],
"disallowKeywords": ["with", "eval"],
"requireKeywordsOnNewLine": [],
"disallowKeywordsOnNewLine": ["else"],
"requireLineFeedAtFileEnd": true,
"disallowTrailingWhitespace": true,
"disallowTrailingComma": true,
"excludeFiles": ["node_modules/**", "vendor/**"],
"disallowMultipleLineStrings": true,
"requireDotNotation": { "allExcept": ["keywords"] },
"requireParenthesesAroundIIFE": true,
"validateLineBreaks": "LF",
"validateQuoteMarks": {
"escape": true,
"mark": "'"
},
"disallowOperatorBeforeLineBreak": [],
"requireSpaceBeforeKeywords": [
"do",
"for",
"if",
"else",
"switch",
"case",
"try",
"catch",
"finally",
"while",
"with",
"return"
],
"validateAlignedFunctionParameters": {
"lineBreakAfterOpeningBraces": true,
"lineBreakBeforeClosingBraces": true
},
"requirePaddingNewLinesBeforeExport": true,
"validateNewlineAfterArrayElements": {
"maximum": 8
},
"requirePaddingNewLinesAfterUseStrict": true,
"disallowArrowFunctions": true,
"disallowMultiLineTernary": true,
"validateOrderInObjectKeys": "asc-insensitive",
"disallowIdenticalDestructuringNames": true,
"disallowNestedTernaries": { "maxLevel": 1 },
"requireSpaceAfterComma": { "allExcept": ["trailing"] },
"requireAlignedMultilineParams": false,
"requireSpacesInGenerator": {
"afterStar": true
},
"disallowSpacesInGenerator": {
"beforeStar": true
},
"disallowVar": false,
"requireArrayDestructuring": false,
"requireEnhancedObjectLiterals": false,
"requireObjectDestructuring": false,
"requireEarlyReturn": false,
"requireCapitalizedConstructorsNew": {
"allExcept": ["Function", "String", "Object", "Symbol", "Number", "Date", "RegExp", "Error", "Boolean", "Array"]
},
"requireImportAlphabetized": false,
"requireSpaceBeforeObjectValues": true,
"requireSpaceBeforeDestructuredValues": true,
"disallowSpacesInsideTemplateStringPlaceholders": true,
"disallowArrayDestructuringReturn": false,
"requireNewlineBeforeSingleStatementsInIf": false,
"disallowUnusedVariables": true,
"requireSpacesInsideImportedObjectBraces": true,
"requireUseStrict": true
}

22
node_modules/function-bind/.npmignore generated vendored Normal file
View file

@ -0,0 +1,22 @@
# gitignore
.DS_Store
.monitor
.*.swp
.nodemonignore
releases
*.log
*.err
fleet.json
public/browserify
bin/*.json
.bin
build
compile
.lock-wscript
coverage
node_modules
# Only apps should have lockfiles
npm-shrinkwrap.json
package-lock.json
yarn.lock

168
node_modules/function-bind/.travis.yml generated vendored Normal file
View file

@ -0,0 +1,168 @@
language: node_js
os:
- linux
node_js:
- "8.4"
- "7.10"
- "6.11"
- "5.12"
- "4.8"
- "iojs-v3.3"
- "iojs-v2.5"
- "iojs-v1.8"
- "0.12"
- "0.10"
- "0.8"
before_install:
- 'if [ "${TRAVIS_NODE_VERSION}" = "0.6" ]; then npm install -g npm@1.3 ; elif [ "${TRAVIS_NODE_VERSION}" != "0.9" ]; then case "$(npm --version)" in 1.*) npm install -g npm@1.4.28 ;; 2.*) npm install -g npm@2 ;; esac ; fi'
- 'if [ "${TRAVIS_NODE_VERSION}" != "0.6" ] && [ "${TRAVIS_NODE_VERSION}" != "0.9" ]; then if [ "${TRAVIS_NODE_VERSION%${TRAVIS_NODE_VERSION#[0-9]}}" = "0" ] || [ "${TRAVIS_NODE_VERSION:0:4}" = "iojs" ]; then npm install -g npm@4.5 ; else npm install -g npm; fi; fi'
install:
- 'if [ "${TRAVIS_NODE_VERSION}" = "0.6" ]; then nvm install 0.8 && npm install -g npm@1.3 && npm install -g npm@1.4.28 && npm install -g npm@2 && npm install && nvm use "${TRAVIS_NODE_VERSION}"; else npm install; fi;'
script:
- 'if [ -n "${PRETEST-}" ]; then npm run pretest ; fi'
- 'if [ -n "${POSTTEST-}" ]; then npm run posttest ; fi'
- 'if [ -n "${COVERAGE-}" ]; then npm run coverage ; fi'
- 'if [ -n "${TEST-}" ]; then npm run tests-only ; fi'
sudo: false
env:
- TEST=true
matrix:
fast_finish: true
include:
- node_js: "node"
env: PRETEST=true
- node_js: "4"
env: COVERAGE=true
- node_js: "8.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "8.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "8.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "8.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.9"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.8"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.7"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.6"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.5"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.4"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "7.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.10"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.9"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.8"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.7"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.6"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.5"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.4"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "6.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.11"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.10"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.9"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.8"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.7"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.6"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.5"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.4"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "5.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.7"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.6"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.5"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.4"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "4.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v3.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v3.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v3.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v2.4"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v2.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v2.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v2.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v2.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.7"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.6"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.5"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.4"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.3"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.2"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.1"
env: TEST=true ALLOW_FAILURE=true
- node_js: "iojs-v1.0"
env: TEST=true ALLOW_FAILURE=true
- node_js: "0.11"
env: TEST=true ALLOW_FAILURE=true
- node_js: "0.9"
env: TEST=true ALLOW_FAILURE=true
- node_js: "0.6"
env: TEST=true ALLOW_FAILURE=true
- node_js: "0.4"
env: TEST=true ALLOW_FAILURE=true
allow_failures:
- os: osx
- env: TEST=true ALLOW_FAILURE=true

20
node_modules/function-bind/LICENSE generated vendored Normal file
View file

@ -0,0 +1,20 @@
Copyright (c) 2013 Raynos.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

48
node_modules/function-bind/README.md generated vendored Normal file
View file

@ -0,0 +1,48 @@
# function-bind
<!--
[![build status][travis-svg]][travis-url]
[![NPM version][npm-badge-svg]][npm-url]
[![Coverage Status][5]][6]
[![gemnasium Dependency Status][7]][8]
[![Dependency status][deps-svg]][deps-url]
[![Dev Dependency status][dev-deps-svg]][dev-deps-url]
-->
<!-- [![browser support][11]][12] -->
Implementation of function.prototype.bind
## Example
I mainly do this for unit tests I run on PhantomJS.
PhantomJS does not have Function.prototype.bind :(
```js
Function.prototype.bind = require("function-bind")
```
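The export can also be used without patching the prototype; a small, hypothetical sketch (not from the original README), since the module exports either the native `Function.prototype.bind` or its own implementation:
```js
var bind = require("function-bind")

function greet(greeting, name) {
    return greeting + ", " + name + "!"
}

// invoke bind as a method of the target function
var hello = bind.call(greet, null, "Hello")
console.log(hello("world")) // "Hello, world!"
```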
## Installation
`npm install function-bind`
## Contributors
- Raynos
## MIT Licensed
[travis-svg]: https://travis-ci.org/Raynos/function-bind.svg
[travis-url]: https://travis-ci.org/Raynos/function-bind
[npm-badge-svg]: https://badge.fury.io/js/function-bind.svg
[npm-url]: https://npmjs.org/package/function-bind
[5]: https://coveralls.io/repos/Raynos/function-bind/badge.png
[6]: https://coveralls.io/r/Raynos/function-bind
[7]: https://gemnasium.com/Raynos/function-bind.png
[8]: https://gemnasium.com/Raynos/function-bind
[deps-svg]: https://david-dm.org/Raynos/function-bind.svg
[deps-url]: https://david-dm.org/Raynos/function-bind
[dev-deps-svg]: https://david-dm.org/Raynos/function-bind/dev-status.svg
[dev-deps-url]: https://david-dm.org/Raynos/function-bind#info=devDependencies
[11]: https://ci.testling.com/Raynos/function-bind.png
[12]: https://ci.testling.com/Raynos/function-bind

52
node_modules/function-bind/implementation.js generated vendored Normal file
View file

@ -0,0 +1,52 @@
'use strict';
/* eslint no-invalid-this: 1 */
var ERROR_MESSAGE = 'Function.prototype.bind called on incompatible ';
var slice = Array.prototype.slice;
var toStr = Object.prototype.toString;
var funcType = '[object Function]';
module.exports = function bind(that) {
var target = this;
if (typeof target !== 'function' || toStr.call(target) !== funcType) {
throw new TypeError(ERROR_MESSAGE + target);
}
var args = slice.call(arguments, 1);
var bound;
var binder = function () {
if (this instanceof bound) {
var result = target.apply(
this,
args.concat(slice.call(arguments))
);
if (Object(result) === result) {
return result;
}
return this;
} else {
return target.apply(
that,
args.concat(slice.call(arguments))
);
}
};
var boundLength = Math.max(0, target.length - args.length);
var boundArgs = [];
for (var i = 0; i < boundLength; i++) {
boundArgs.push('$' + i);
}
bound = Function('binder', 'return function (' + boundArgs.join(',') + '){ return binder.apply(this,arguments); }')(binder);
if (target.prototype) {
var Empty = function Empty() {};
Empty.prototype = target.prototype;
bound.prototype = new Empty();
Empty.prototype = null;
}
return bound;
};

5
node_modules/function-bind/index.js generated vendored Normal file
View file

@ -0,0 +1,5 @@
'use strict';
var implementation = require('./implementation');
module.exports = Function.prototype.bind || implementation;
