Merge branch 'develop' into add-channels
Commit 9c830273da: 6620 changed files with 174387 additions and 136546 deletions
@@ -6,7 +6,7 @@
  * @author Jay Trees <github.jay@grandel.anonaddy.me>
  */
 
-define('VERSION', '0.4.0');
+define('VERSION', '0.5.0');
 define('ROOT', __DIR__);
 define('DEFAULT_LOCALE', 'en_GB');
 
node_modules/.package-lock.json (generated, vendored): 819 lines; diff suppressed because it is too large.
node_modules/@actions/core/LICENSE.md (new file, generated, vendored): 9 lines
@@ -0,0 +1,9 @@
The MIT License (MIT)

Copyright 2019 GitHub

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
node_modules/@actions/core/README.md (new file, generated, vendored): 312 lines
@@ -0,0 +1,312 @@
|
|||
# `@actions/core`
|
||||
|
||||
> Core functions for setting results, logging, registering secrets and exporting variables across actions
|
||||
|
||||
## Usage
|
||||
|
||||
### Import the package
|
||||
|
||||
```js
|
||||
// javascript
|
||||
const core = require('@actions/core');
|
||||
|
||||
// typescript
|
||||
import * as core from '@actions/core';
|
||||
```
|
||||
|
||||
#### Inputs/Outputs
|
||||
|
||||
Action inputs can be read with `getInput` which returns a `string` or `getBooleanInput` which parses a boolean based on the [yaml 1.2 specification](https://yaml.org/spec/1.2/spec.html#id2804923). If `required` is set to false, the input should have a default value in `action.yml`.
|
||||
|
||||
Outputs can be set with `setOutput` which makes them available to be mapped into inputs of other actions to ensure they are decoupled.
|
||||
|
||||
```js
|
||||
const myInput = core.getInput('inputName', { required: true });
|
||||
const myBooleanInput = core.getBooleanInput('booleanInputName', { required: true });
|
||||
const myMultilineInput = core.getMultilineInput('multilineInputName', { required: true });
|
||||
core.setOutput('outputKey', 'outputVal');
|
||||
```
|
||||
|
||||
#### Exporting variables
|
||||
|
||||
Since each step runs in a separate process, you can use `exportVariable` to add a variable to the environment blocks of this step and future steps.
|
||||
|
||||
```js
|
||||
core.exportVariable('envVar', 'Val');
|
||||
```
|
||||
|
||||
#### Setting a secret
|
||||
|
||||
Setting a secret registers the secret with the runner to ensure it is masked in logs.
|
||||
|
||||
```js
|
||||
core.setSecret('myPassword');
|
||||
```
|
||||
|
||||
#### PATH Manipulation
|
||||
|
||||
To make a tool's path available in the path for the remainder of the job (without altering the machine or container's state), use `addPath`. The runner will prepend the given path to the job's PATH.
|
||||
|
||||
```js
|
||||
core.addPath('/path/to/mytool');
|
||||
```
|
||||
|
||||
#### Exit codes
|
||||
|
||||
You should use this library to set the failing exit code for your action. If a status is not set and the script runs to completion, the action will conclude successfully.
|
||||
|
||||
```js
|
||||
const core = require('@actions/core');
|
||||
|
||||
try {
|
||||
// Do stuff
|
||||
}
|
||||
catch (err) {
|
||||
// setFailed logs the message and sets a failing exit code
|
||||
core.setFailed(`Action failed with error ${err}`);
|
||||
}
|
||||
```
|
||||
|
||||
Note that `setNeutral` is not yet implemented in actions V2 but equivalent functionality is being planned.
|
||||
|
||||
#### Logging
|
||||
|
||||
Finally, this library provides some utilities for logging. Note that debug logging is hidden from the logs by default. This behavior can be toggled by enabling the [Step Debug Logs](../../docs/action-debugging.md#step-debug-logs).
|
||||
|
||||
```js
|
||||
const core = require('@actions/core');
|
||||
|
||||
const myInput = core.getInput('input');
|
||||
try {
|
||||
core.debug('Inside try block');
|
||||
|
||||
if (!myInput) {
|
||||
core.warning('myInput was not set');
|
||||
}
|
||||
|
||||
if (core.isDebug()) {
|
||||
// curl -v https://github.com
|
||||
} else {
|
||||
// curl https://github.com
|
||||
}
|
||||
|
||||
// Do stuff
|
||||
core.info('Output to the actions build log')
|
||||
|
||||
core.notice('This is a message that will also emit an annotation')
|
||||
}
|
||||
catch (err) {
|
||||
core.error(`Error ${err}, action may still succeed though`);
|
||||
}
|
||||
```
|
||||
|
||||
This library can also wrap chunks of output in foldable groups.
|
||||
|
||||
```js
|
||||
const core = require('@actions/core')
|
||||
|
||||
// Manually wrap output
|
||||
core.startGroup('Do some function')
|
||||
doSomeFunction()
|
||||
core.endGroup()
|
||||
|
||||
// Wrap an asynchronous function call
|
||||
const result = await core.group('Do something async', async () => {
|
||||
const response = await doSomeHTTPRequest()
|
||||
return response
|
||||
})
|
||||
```
|
||||
|
||||
#### Annotations
|
||||
|
||||
This library has 3 methods that will produce [annotations](https://docs.github.com/en/rest/reference/checks#create-a-check-run).
|
||||
```js
|
||||
core.error('This is a bad error. This will also fail the build.')
|
||||
|
||||
core.warning('Something went wrong, but it\'s not bad enough to fail the build.')
|
||||
|
||||
core.notice('Something happened that you might want to know about.')
|
||||
```
|
||||
|
||||
These will surface to the UI in the Actions page and on Pull Requests. They look something like this:
|
||||
|
||||
![Annotations Image](../../docs/assets/annotations.png)
|
||||
|
||||
These annotations can also be attached to particular lines and columns of your source files to show exactly where a problem is occurring.
|
||||
|
||||
These options are:
|
||||
```typescript
|
||||
export interface AnnotationProperties {
|
||||
/**
|
||||
* A title for the annotation.
|
||||
*/
|
||||
title?: string
|
||||
|
||||
/**
|
||||
* The name of the file for which the annotation should be created.
|
||||
*/
|
||||
file?: string
|
||||
|
||||
/**
|
||||
* The start line for the annotation.
|
||||
*/
|
||||
startLine?: number
|
||||
|
||||
/**
|
||||
* The end line for the annotation. Defaults to `startLine` when `startLine` is provided.
|
||||
*/
|
||||
endLine?: number
|
||||
|
||||
/**
|
||||
* The start column for the annotation. Cannot be sent when `startLine` and `endLine` are different values.
|
||||
*/
|
||||
startColumn?: number
|
||||
|
||||
/**
|
||||
* The start column for the annotation. Cannot be sent when `startLine` and `endLine` are different values.
|
||||
* Defaults to `startColumn` when `startColumn` is provided.
|
||||
*/
|
||||
endColumn?: number
|
||||
}
|
||||
```
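These properties are passed as the optional second argument of `core.error`, `core.warning`, and `core.notice` (see the `error(message, properties)` signatures in `lib/core.d.ts` later in this diff). A minimal sketch, with a hypothetical file path and positions:

```js
const core = require('@actions/core');

// Annotate a specific source location. endLine defaults to startLine here,
// so startColumn/endColumn may also be supplied.
core.warning('Deprecated API call', {
  title: 'Deprecation',
  file: 'src/index.js', // hypothetical path
  startLine: 10,
  startColumn: 5,
  endColumn: 12
});
```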
|
||||
|
||||
#### Styling output
|
||||
|
||||
Colored output is supported in the Action logs via standard [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code). 3/4 bit, 8 bit and 24 bit colors are all supported.
|
||||
|
||||
Foreground colors:
|
||||
|
||||
```js
|
||||
// 3/4 bit
|
||||
core.info('\u001b[35mThis foreground will be magenta')
|
||||
|
||||
// 8 bit
|
||||
core.info('\u001b[38;5;6mThis foreground will be cyan')
|
||||
|
||||
// 24 bit
|
||||
core.info('\u001b[38;2;255;0;0mThis foreground will be bright red')
|
||||
```
|
||||
|
||||
Background colors:
|
||||
|
||||
```js
|
||||
// 3/4 bit
|
||||
core.info('\u001b[43mThis background will be yellow');
|
||||
|
||||
// 8 bit
|
||||
core.info('\u001b[48;5;6mThis background will be cyan')
|
||||
|
||||
// 24 bit
|
||||
core.info('\u001b[48;2;255;0;0mThis background will be bright red')
|
||||
```
|
||||
|
||||
Special styles:
|
||||
|
||||
```js
|
||||
core.info('\u001b[1mBold text')
|
||||
core.info('\u001b[3mItalic text')
|
||||
core.info('\u001b[4mUnderlined text')
|
||||
```
|
||||
|
||||
ANSI escape codes can be combined with one another:
|
||||
|
||||
```js
|
||||
core.info('\u001b[31;46mRed foreground with a cyan background and \u001b[1mbold text at the end');
|
||||
```
|
||||
|
||||
> Note: Escape codes reset at the start of each line
|
||||
|
||||
```js
|
||||
core.info('\u001b[35mThis foreground will be magenta')
|
||||
core.info('This foreground will reset to the default')
|
||||
```
|
||||
|
||||
Manually typing escape codes can be a little difficult, but you can use third party modules such as [ansi-styles](https://github.com/chalk/ansi-styles).
|
||||
|
||||
```js
|
||||
const style = require('ansi-styles');
|
||||
core.info(style.color.ansi16m.hex('#abcdef') + 'Hello world!')
|
||||
```
|
||||
|
||||
#### Action state
|
||||
|
||||
You can use this library to save state and get state for sharing information between a given wrapper action:
|
||||
|
||||
**action.yml**:
|
||||
|
||||
```yaml
|
||||
name: 'Wrapper action sample'
|
||||
inputs:
|
||||
name:
|
||||
default: 'GitHub'
|
||||
runs:
|
||||
using: 'node12'
|
||||
main: 'main.js'
|
||||
post: 'cleanup.js'
|
||||
```
|
||||
|
||||
In action's `main.js`:
|
||||
|
||||
```js
|
||||
const core = require('@actions/core');
|
||||
|
||||
core.saveState("pidToKill", 12345);
|
||||
```
|
||||
|
||||
In action's `cleanup.js`:
|
||||
|
||||
```js
|
||||
const core = require('@actions/core');
|
||||
|
||||
var pid = core.getState("pidToKill");
|
||||
|
||||
process.kill(pid);
|
||||
```
|
||||
|
||||
#### OIDC Token
|
||||
|
||||
You can use these methods to interact with the GitHub OIDC provider and get a JWT ID token, which can be used to obtain an access token from third-party cloud providers.
|
||||
|
||||
**Method Name**: getIDToken()
|
||||
|
||||
**Inputs**
|
||||
|
||||
audience : optional
|
||||
|
||||
**Outputs**
|
||||
|
||||
A [JWT](https://jwt.io/) ID Token
|
||||
|
||||
In action's `main.ts`:
|
||||
```js
|
||||
const core = require('@actions/core');
|
||||
async function getIDTokenAction(): Promise<void> {
|
||||
|
||||
const audience = core.getInput('audience', {required: false})
|
||||
|
||||
const id_token1 = await core.getIDToken() // ID Token with default audience
|
||||
const id_token2 = await core.getIDToken(audience) // ID token with custom audience
|
||||
|
||||
// this id_token can be used to get access token from third party cloud providers
|
||||
}
|
||||
getIDTokenAction()
|
||||
```
|
||||
|
||||
In action's `actions.yml`:
|
||||
|
||||
```yaml
|
||||
name: 'GetIDToken'
|
||||
description: 'Get ID token from Github OIDC provider'
|
||||
inputs:
|
||||
audience:
|
||||
description: 'Audience for which the ID token is intended for'
|
||||
required: false
|
||||
outputs:
|
||||
id_token1:
|
||||
description: 'ID token obtained from OIDC provider'
|
||||
id_token2:
|
||||
description: 'ID token obtained from OIDC provider'
|
||||
runs:
|
||||
using: 'node12'
|
||||
main: 'dist/index.js'
|
||||
```
|
node_modules/@actions/core/lib/command.d.ts (new file, generated, vendored): 15 lines
@@ -0,0 +1,15 @@
export interface CommandProperties {
    [key: string]: any;
}
/**
 * Commands
 *
 * Command Format:
 *   ::name key=value,key=value::message
 *
 * Examples:
 *   ::warning::This is the message
 *   ::set-env name=MY_VAR::some value
 */
export declare function issueCommand(command: string, properties: CommandProperties, message: any): void;
export declare function issue(name: string, message?: string): void;
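For orientation, the `::name key=value::message` format documented above is exactly what `issueCommand` writes to stdout; a small sketch of the mapping (the property values are illustrative):

```js
const { issueCommand } = require('@actions/core/lib/command');

// Writes to stdout: ::warning file=app.js,line=1::Something went wrong
issueCommand('warning', { file: 'app.js', line: '1' }, 'Something went wrong');
```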
node_modules/@actions/core/lib/command.js (new file, generated, vendored): 92 lines
@@ -0,0 +1,92 @@
|
|||
"use strict";
|
||||
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
|
||||
if (k2 === undefined) k2 = k;
|
||||
Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
|
||||
}) : (function(o, m, k, k2) {
|
||||
if (k2 === undefined) k2 = k;
|
||||
o[k2] = m[k];
|
||||
}));
|
||||
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
|
||||
Object.defineProperty(o, "default", { enumerable: true, value: v });
|
||||
}) : function(o, v) {
|
||||
o["default"] = v;
|
||||
});
|
||||
var __importStar = (this && this.__importStar) || function (mod) {
|
||||
if (mod && mod.__esModule) return mod;
|
||||
var result = {};
|
||||
if (mod != null) for (var k in mod) if (k !== "default" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
|
||||
__setModuleDefault(result, mod);
|
||||
return result;
|
||||
};
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.issue = exports.issueCommand = void 0;
|
||||
const os = __importStar(require("os"));
|
||||
const utils_1 = require("./utils");
|
||||
/**
|
||||
* Commands
|
||||
*
|
||||
* Command Format:
|
||||
* ::name key=value,key=value::message
|
||||
*
|
||||
* Examples:
|
||||
* ::warning::This is the message
|
||||
* ::set-env name=MY_VAR::some value
|
||||
*/
|
||||
function issueCommand(command, properties, message) {
|
||||
const cmd = new Command(command, properties, message);
|
||||
process.stdout.write(cmd.toString() + os.EOL);
|
||||
}
|
||||
exports.issueCommand = issueCommand;
|
||||
function issue(name, message = '') {
|
||||
issueCommand(name, {}, message);
|
||||
}
|
||||
exports.issue = issue;
|
||||
const CMD_STRING = '::';
|
||||
class Command {
|
||||
constructor(command, properties, message) {
|
||||
if (!command) {
|
||||
command = 'missing.command';
|
||||
}
|
||||
this.command = command;
|
||||
this.properties = properties;
|
||||
this.message = message;
|
||||
}
|
||||
toString() {
|
||||
let cmdStr = CMD_STRING + this.command;
|
||||
if (this.properties && Object.keys(this.properties).length > 0) {
|
||||
cmdStr += ' ';
|
||||
let first = true;
|
||||
for (const key in this.properties) {
|
||||
if (this.properties.hasOwnProperty(key)) {
|
||||
const val = this.properties[key];
|
||||
if (val) {
|
||||
if (first) {
|
||||
first = false;
|
||||
}
|
||||
else {
|
||||
cmdStr += ',';
|
||||
}
|
||||
cmdStr += `${key}=${escapeProperty(val)}`;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
cmdStr += `${CMD_STRING}${escapeData(this.message)}`;
|
||||
return cmdStr;
|
||||
}
|
||||
}
|
||||
function escapeData(s) {
|
||||
return utils_1.toCommandValue(s)
|
||||
.replace(/%/g, '%25')
|
||||
.replace(/\r/g, '%0D')
|
||||
.replace(/\n/g, '%0A');
|
||||
}
|
||||
function escapeProperty(s) {
|
||||
return utils_1.toCommandValue(s)
|
||||
.replace(/%/g, '%25')
|
||||
.replace(/\r/g, '%0D')
|
||||
.replace(/\n/g, '%0A')
|
||||
.replace(/:/g, '%3A')
|
||||
.replace(/,/g, '%2C');
|
||||
}
|
||||
//# sourceMappingURL=command.js.map
|
node_modules/@actions/core/lib/command.js.map (new file, generated, vendored): 1 line
@@ -0,0 +1 @@
{"version":3,"file":"command.js","sourceRoot":"","sources":["../src/command.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;AAAA,uCAAwB;AACxB,mCAAsC;AAWtC;;;;;;;;;GASG;AACH,SAAgB,YAAY,CAC1B,OAAe,EACf,UAA6B,EAC7B,OAAY;IAEZ,MAAM,GAAG,GAAG,IAAI,OAAO,CAAC,OAAO,EAAE,UAAU,EAAE,OAAO,CAAC,CAAA;IACrD,OAAO,CAAC,MAAM,CAAC,KAAK,CAAC,GAAG,CAAC,QAAQ,EAAE,GAAG,EAAE,CAAC,GAAG,CAAC,CAAA;AAC/C,CAAC;AAPD,oCAOC;AAED,SAAgB,KAAK,CAAC,IAAY,EAAE,OAAO,GAAG,EAAE;IAC9C,YAAY,CAAC,IAAI,EAAE,EAAE,EAAE,OAAO,CAAC,CAAA;AACjC,CAAC;AAFD,sBAEC;AAED,MAAM,UAAU,GAAG,IAAI,CAAA;AAEvB,MAAM,OAAO;IAKX,YAAY,OAAe,EAAE,UAA6B,EAAE,OAAe;QACzE,IAAI,CAAC,OAAO,EAAE;YACZ,OAAO,GAAG,iBAAiB,CAAA;SAC5B;QAED,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;QACtB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAA;QAC5B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAA;IACxB,CAAC;IAED,QAAQ;QACN,IAAI,MAAM,GAAG,UAAU,GAAG,IAAI,CAAC,OAAO,CAAA;QAEtC,IAAI,IAAI,CAAC,UAAU,IAAI,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;YAC9D,MAAM,IAAI,GAAG,CAAA;YACb,IAAI,KAAK,GAAG,IAAI,CAAA;YAChB,KAAK,MAAM,GAAG,IAAI,IAAI,CAAC,UAAU,EAAE;gBACjC,IAAI,IAAI,CAAC,UAAU,CAAC,cAAc,CAAC,GAAG,CAAC,EAAE;oBACvC,MAAM,GAAG,GAAG,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;oBAChC,IAAI,GAAG,EAAE;wBACP,IAAI,KAAK,EAAE;4BACT,KAAK,GAAG,KAAK,CAAA;yBACd;6BAAM;4BACL,MAAM,IAAI,GAAG,CAAA;yBACd;wBAED,MAAM,IAAI,GAAG,GAAG,IAAI,cAAc,CAAC,GAAG,CAAC,EAAE,CAAA;qBAC1C;iBACF;aACF;SACF;QAED,MAAM,IAAI,GAAG,UAAU,GAAG,UAAU,CAAC,IAAI,CAAC,OAAO,CAAC,EAAE,CAAA;QACpD,OAAO,MAAM,CAAA;IACf,CAAC;CACF;AAED,SAAS,UAAU,CAAC,CAAM;IACxB,OAAO,sBAAc,CAAC,CAAC,CAAC;SACrB,OAAO,CAAC,IAAI,EAAE,KAAK,CAAC;SACpB,OAAO,CAAC,KAAK,EAAE,KAAK,CAAC;SACrB,OAAO,CAAC,KAAK,EAAE,KAAK,CAAC,CAAA;AAC1B,CAAC;AAED,SAAS,cAAc,CAAC,CAAM;IAC5B,OAAO,sBAAc,CAAC,CAAC,CAAC;SACrB,OAAO,CAAC,IAAI,EAAE,KAAK,CAAC;SACpB,OAAO,CAAC,KAAK,EAAE,KAAK,CAAC;SACrB,OAAO,CAAC,KAAK,EAAE,KAAK,CAAC;SACrB,OAAO,CAAC,IAAI,EAAE,KAAK,CAAC;SACpB,OAAO,CAAC,IAAI,EAAE,KAAK,CAAC,CAAA;AACzB,CAAC"}
node_modules/@actions/core/lib/core.d.ts (new file, generated, vendored): 186 lines
@@ -0,0 +1,186 @@
|
|||
/**
|
||||
* Interface for getInput options
|
||||
*/
|
||||
export interface InputOptions {
|
||||
/** Optional. Whether the input is required. If required and not present, will throw. Defaults to false */
|
||||
required?: boolean;
|
||||
/** Optional. Whether leading/trailing whitespace will be trimmed for the input. Defaults to true */
|
||||
trimWhitespace?: boolean;
|
||||
}
|
||||
/**
|
||||
* The code to exit an action
|
||||
*/
|
||||
export declare enum ExitCode {
|
||||
/**
|
||||
* A code indicating that the action was successful
|
||||
*/
|
||||
Success = 0,
|
||||
/**
|
||||
* A code indicating that the action was a failure
|
||||
*/
|
||||
Failure = 1
|
||||
}
|
||||
/**
|
||||
* Optional properties that can be sent with annotation commands (notice, error, and warning)
|
||||
* See: https://docs.github.com/en/rest/reference/checks#create-a-check-run for more information about annotations.
|
||||
*/
|
||||
export interface AnnotationProperties {
|
||||
/**
|
||||
* A title for the annotation.
|
||||
*/
|
||||
title?: string;
|
||||
/**
|
||||
* The path of the file for which the annotation should be created.
|
||||
*/
|
||||
file?: string;
|
||||
/**
|
||||
* The start line for the annotation.
|
||||
*/
|
||||
startLine?: number;
|
||||
/**
|
||||
* The end line for the annotation. Defaults to `startLine` when `startLine` is provided.
|
||||
*/
|
||||
endLine?: number;
|
||||
/**
|
||||
* The start column for the annotation. Cannot be sent when `startLine` and `endLine` are different values.
|
||||
*/
|
||||
startColumn?: number;
|
||||
/**
|
||||
* The start column for the annotation. Cannot be sent when `startLine` and `endLine` are different values.
|
||||
* Defaults to `startColumn` when `startColumn` is provided.
|
||||
*/
|
||||
endColumn?: number;
|
||||
}
|
||||
/**
|
||||
* Sets env variable for this action and future actions in the job
|
||||
* @param name the name of the variable to set
|
||||
* @param val the value of the variable. Non-string values will be converted to a string via JSON.stringify
|
||||
*/
|
||||
export declare function exportVariable(name: string, val: any): void;
|
||||
/**
|
||||
* Registers a secret which will get masked from logs
|
||||
* @param secret value of the secret
|
||||
*/
|
||||
export declare function setSecret(secret: string): void;
|
||||
/**
|
||||
* Prepends inputPath to the PATH (for this action and future actions)
|
||||
* @param inputPath
|
||||
*/
|
||||
export declare function addPath(inputPath: string): void;
|
||||
/**
|
||||
* Gets the value of an input.
|
||||
* Unless trimWhitespace is set to false in InputOptions, the value is also trimmed.
|
||||
* Returns an empty string if the value is not defined.
|
||||
*
|
||||
* @param name name of the input to get
|
||||
* @param options optional. See InputOptions.
|
||||
* @returns string
|
||||
*/
|
||||
export declare function getInput(name: string, options?: InputOptions): string;
|
||||
/**
|
||||
* Gets the values of a multiline input. Each value is also trimmed.
|
||||
*
|
||||
* @param name name of the input to get
|
||||
* @param options optional. See InputOptions.
|
||||
* @returns string[]
|
||||
*
|
||||
*/
|
||||
export declare function getMultilineInput(name: string, options?: InputOptions): string[];
|
||||
/**
|
||||
* Gets the input value of the boolean type in the YAML 1.2 "core schema" specification.
|
||||
* Support boolean input list: `true | True | TRUE | false | False | FALSE` .
|
||||
* The return value is also in boolean type.
|
||||
* ref: https://yaml.org/spec/1.2/spec.html#id2804923
|
||||
*
|
||||
* @param name name of the input to get
|
||||
* @param options optional. See InputOptions.
|
||||
* @returns boolean
|
||||
*/
|
||||
export declare function getBooleanInput(name: string, options?: InputOptions): boolean;
|
||||
/**
|
||||
* Sets the value of an output.
|
||||
*
|
||||
* @param name name of the output to set
|
||||
* @param value value to store. Non-string values will be converted to a string via JSON.stringify
|
||||
*/
|
||||
export declare function setOutput(name: string, value: any): void;
|
||||
/**
|
||||
* Enables or disables the echoing of commands into stdout for the rest of the step.
|
||||
* Echoing is disabled by default if ACTIONS_STEP_DEBUG is not set.
|
||||
*
|
||||
*/
|
||||
export declare function setCommandEcho(enabled: boolean): void;
|
||||
/**
|
||||
* Sets the action status to failed.
|
||||
* When the action exits it will be with an exit code of 1
|
||||
* @param message add error issue message
|
||||
*/
|
||||
export declare function setFailed(message: string | Error): void;
|
||||
/**
|
||||
* Gets whether Actions Step Debug is on or not
|
||||
*/
|
||||
export declare function isDebug(): boolean;
|
||||
/**
|
||||
* Writes debug message to user log
|
||||
* @param message debug message
|
||||
*/
|
||||
export declare function debug(message: string): void;
|
||||
/**
|
||||
* Adds an error issue
|
||||
* @param message error issue message. Errors will be converted to string via toString()
|
||||
* @param properties optional properties to add to the annotation.
|
||||
*/
|
||||
export declare function error(message: string | Error, properties?: AnnotationProperties): void;
|
||||
/**
|
||||
* Adds a warning issue
|
||||
* @param message warning issue message. Errors will be converted to string via toString()
|
||||
* @param properties optional properties to add to the annotation.
|
||||
*/
|
||||
export declare function warning(message: string | Error, properties?: AnnotationProperties): void;
|
||||
/**
|
||||
* Adds a notice issue
|
||||
* @param message notice issue message. Errors will be converted to string via toString()
|
||||
* @param properties optional properties to add to the annotation.
|
||||
*/
|
||||
export declare function notice(message: string | Error, properties?: AnnotationProperties): void;
|
||||
/**
|
||||
* Writes info to log with console.log.
|
||||
* @param message info message
|
||||
*/
|
||||
export declare function info(message: string): void;
|
||||
/**
|
||||
* Begin an output group.
|
||||
*
|
||||
* Output until the next `groupEnd` will be foldable in this group
|
||||
*
|
||||
* @param name The name of the output group
|
||||
*/
|
||||
export declare function startGroup(name: string): void;
|
||||
/**
|
||||
* End an output group.
|
||||
*/
|
||||
export declare function endGroup(): void;
|
||||
/**
|
||||
* Wrap an asynchronous function call in a group.
|
||||
*
|
||||
* Returns the same type as the function itself.
|
||||
*
|
||||
* @param name The name of the group
|
||||
* @param fn The function to wrap in the group
|
||||
*/
|
||||
export declare function group<T>(name: string, fn: () => Promise<T>): Promise<T>;
|
||||
/**
|
||||
* Saves state for current action, the state can only be retrieved by this action's post job execution.
|
||||
*
|
||||
* @param name name of the state to store
|
||||
* @param value value to store. Non-string values will be converted to a string via JSON.stringify
|
||||
*/
|
||||
export declare function saveState(name: string, value: any): void;
|
||||
/**
|
||||
* Gets the value of a state set by this action's main execution.
|
||||
*
|
||||
* @param name name of the state to get
|
||||
* @returns string
|
||||
*/
|
||||
export declare function getState(name: string): string;
|
||||
export declare function getIDToken(aud?: string): Promise<string>;
|
node_modules/@actions/core/lib/core.js (new file, generated, vendored): 312 lines
@@ -0,0 +1,312 @@
|
|||
"use strict";
|
||||
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
|
||||
if (k2 === undefined) k2 = k;
|
||||
Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
|
||||
}) : (function(o, m, k, k2) {
|
||||
if (k2 === undefined) k2 = k;
|
||||
o[k2] = m[k];
|
||||
}));
|
||||
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
|
||||
Object.defineProperty(o, "default", { enumerable: true, value: v });
|
||||
}) : function(o, v) {
|
||||
o["default"] = v;
|
||||
});
|
||||
var __importStar = (this && this.__importStar) || function (mod) {
|
||||
if (mod && mod.__esModule) return mod;
|
||||
var result = {};
|
||||
if (mod != null) for (var k in mod) if (k !== "default" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
|
||||
__setModuleDefault(result, mod);
|
||||
return result;
|
||||
};
|
||||
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
|
||||
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
|
||||
return new (P || (P = Promise))(function (resolve, reject) {
|
||||
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
|
||||
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
|
||||
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
|
||||
step((generator = generator.apply(thisArg, _arguments || [])).next());
|
||||
});
|
||||
};
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.getIDToken = exports.getState = exports.saveState = exports.group = exports.endGroup = exports.startGroup = exports.info = exports.notice = exports.warning = exports.error = exports.debug = exports.isDebug = exports.setFailed = exports.setCommandEcho = exports.setOutput = exports.getBooleanInput = exports.getMultilineInput = exports.getInput = exports.addPath = exports.setSecret = exports.exportVariable = exports.ExitCode = void 0;
|
||||
const command_1 = require("./command");
|
||||
const file_command_1 = require("./file-command");
|
||||
const utils_1 = require("./utils");
|
||||
const os = __importStar(require("os"));
|
||||
const path = __importStar(require("path"));
|
||||
const oidc_utils_1 = require("./oidc-utils");
|
||||
/**
|
||||
* The code to exit an action
|
||||
*/
|
||||
var ExitCode;
|
||||
(function (ExitCode) {
|
||||
/**
|
||||
* A code indicating that the action was successful
|
||||
*/
|
||||
ExitCode[ExitCode["Success"] = 0] = "Success";
|
||||
/**
|
||||
* A code indicating that the action was a failure
|
||||
*/
|
||||
ExitCode[ExitCode["Failure"] = 1] = "Failure";
|
||||
})(ExitCode = exports.ExitCode || (exports.ExitCode = {}));
|
||||
//-----------------------------------------------------------------------
|
||||
// Variables
|
||||
//-----------------------------------------------------------------------
|
||||
/**
|
||||
* Sets env variable for this action and future actions in the job
|
||||
* @param name the name of the variable to set
|
||||
* @param val the value of the variable. Non-string values will be converted to a string via JSON.stringify
|
||||
*/
|
||||
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||
function exportVariable(name, val) {
|
||||
const convertedVal = utils_1.toCommandValue(val);
|
||||
process.env[name] = convertedVal;
|
||||
const filePath = process.env['GITHUB_ENV'] || '';
|
||||
if (filePath) {
|
||||
const delimiter = '_GitHubActionsFileCommandDelimeter_';
|
||||
const commandValue = `${name}<<${delimiter}${os.EOL}${convertedVal}${os.EOL}${delimiter}`;
|
||||
file_command_1.issueCommand('ENV', commandValue);
|
||||
}
|
||||
else {
|
||||
command_1.issueCommand('set-env', { name }, convertedVal);
|
||||
}
|
||||
}
|
||||
exports.exportVariable = exportVariable;
|
||||
/**
|
||||
* Registers a secret which will get masked from logs
|
||||
* @param secret value of the secret
|
||||
*/
|
||||
function setSecret(secret) {
|
||||
command_1.issueCommand('add-mask', {}, secret);
|
||||
}
|
||||
exports.setSecret = setSecret;
|
||||
/**
|
||||
* Prepends inputPath to the PATH (for this action and future actions)
|
||||
* @param inputPath
|
||||
*/
|
||||
function addPath(inputPath) {
|
||||
const filePath = process.env['GITHUB_PATH'] || '';
|
||||
if (filePath) {
|
||||
file_command_1.issueCommand('PATH', inputPath);
|
||||
}
|
||||
else {
|
||||
command_1.issueCommand('add-path', {}, inputPath);
|
||||
}
|
||||
process.env['PATH'] = `${inputPath}${path.delimiter}${process.env['PATH']}`;
|
||||
}
|
||||
exports.addPath = addPath;
|
||||
/**
|
||||
* Gets the value of an input.
|
||||
* Unless trimWhitespace is set to false in InputOptions, the value is also trimmed.
|
||||
* Returns an empty string if the value is not defined.
|
||||
*
|
||||
* @param name name of the input to get
|
||||
* @param options optional. See InputOptions.
|
||||
* @returns string
|
||||
*/
|
||||
function getInput(name, options) {
|
||||
const val = process.env[`INPUT_${name.replace(/ /g, '_').toUpperCase()}`] || '';
|
||||
if (options && options.required && !val) {
|
||||
throw new Error(`Input required and not supplied: ${name}`);
|
||||
}
|
||||
if (options && options.trimWhitespace === false) {
|
||||
return val;
|
||||
}
|
||||
return val.trim();
|
||||
}
|
||||
exports.getInput = getInput;
|
||||
/**
|
||||
* Gets the values of a multiline input. Each value is also trimmed.
|
||||
*
|
||||
* @param name name of the input to get
|
||||
* @param options optional. See InputOptions.
|
||||
* @returns string[]
|
||||
*
|
||||
*/
|
||||
function getMultilineInput(name, options) {
|
||||
const inputs = getInput(name, options)
|
||||
.split('\n')
|
||||
.filter(x => x !== '');
|
||||
return inputs;
|
||||
}
|
||||
exports.getMultilineInput = getMultilineInput;
|
||||
/**
|
||||
* Gets the input value of the boolean type in the YAML 1.2 "core schema" specification.
|
||||
* Support boolean input list: `true | True | TRUE | false | False | FALSE` .
|
||||
* The return value is also in boolean type.
|
||||
* ref: https://yaml.org/spec/1.2/spec.html#id2804923
|
||||
*
|
||||
* @param name name of the input to get
|
||||
* @param options optional. See InputOptions.
|
||||
* @returns boolean
|
||||
*/
|
||||
function getBooleanInput(name, options) {
|
||||
const trueValue = ['true', 'True', 'TRUE'];
|
||||
const falseValue = ['false', 'False', 'FALSE'];
|
||||
const val = getInput(name, options);
|
||||
if (trueValue.includes(val))
|
||||
return true;
|
||||
if (falseValue.includes(val))
|
||||
return false;
|
||||
throw new TypeError(`Input does not meet YAML 1.2 "Core Schema" specification: ${name}\n` +
|
||||
`Support boolean input list: \`true | True | TRUE | false | False | FALSE\``);
|
||||
}
|
||||
exports.getBooleanInput = getBooleanInput;
|
||||
/**
|
||||
* Sets the value of an output.
|
||||
*
|
||||
* @param name name of the output to set
|
||||
* @param value value to store. Non-string values will be converted to a string via JSON.stringify
|
||||
*/
|
||||
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||
function setOutput(name, value) {
|
||||
process.stdout.write(os.EOL);
|
||||
command_1.issueCommand('set-output', { name }, value);
|
||||
}
|
||||
exports.setOutput = setOutput;
|
||||
/**
|
||||
* Enables or disables the echoing of commands into stdout for the rest of the step.
|
||||
* Echoing is disabled by default if ACTIONS_STEP_DEBUG is not set.
|
||||
*
|
||||
*/
|
||||
function setCommandEcho(enabled) {
|
||||
command_1.issue('echo', enabled ? 'on' : 'off');
|
||||
}
|
||||
exports.setCommandEcho = setCommandEcho;
|
||||
//-----------------------------------------------------------------------
|
||||
// Results
|
||||
//-----------------------------------------------------------------------
|
||||
/**
|
||||
* Sets the action status to failed.
|
||||
* When the action exits it will be with an exit code of 1
|
||||
* @param message add error issue message
|
||||
*/
|
||||
function setFailed(message) {
|
||||
process.exitCode = ExitCode.Failure;
|
||||
error(message);
|
||||
}
|
||||
exports.setFailed = setFailed;
|
||||
//-----------------------------------------------------------------------
|
||||
// Logging Commands
|
||||
//-----------------------------------------------------------------------
|
||||
/**
|
||||
* Gets whether Actions Step Debug is on or not
|
||||
*/
|
||||
function isDebug() {
|
||||
return process.env['RUNNER_DEBUG'] === '1';
|
||||
}
|
||||
exports.isDebug = isDebug;
|
||||
/**
|
||||
* Writes debug message to user log
|
||||
* @param message debug message
|
||||
*/
|
||||
function debug(message) {
|
||||
command_1.issueCommand('debug', {}, message);
|
||||
}
|
||||
exports.debug = debug;
|
||||
/**
|
||||
* Adds an error issue
|
||||
* @param message error issue message. Errors will be converted to string via toString()
|
||||
* @param properties optional properties to add to the annotation.
|
||||
*/
|
||||
function error(message, properties = {}) {
|
||||
command_1.issueCommand('error', utils_1.toCommandProperties(properties), message instanceof Error ? message.toString() : message);
|
||||
}
|
||||
exports.error = error;
|
||||
/**
|
||||
* Adds a warning issue
|
||||
* @param message warning issue message. Errors will be converted to string via toString()
|
||||
* @param properties optional properties to add to the annotation.
|
||||
*/
|
||||
function warning(message, properties = {}) {
|
||||
command_1.issueCommand('warning', utils_1.toCommandProperties(properties), message instanceof Error ? message.toString() : message);
|
||||
}
|
||||
exports.warning = warning;
|
||||
/**
|
||||
* Adds a notice issue
|
||||
* @param message notice issue message. Errors will be converted to string via toString()
|
||||
* @param properties optional properties to add to the annotation.
|
||||
*/
|
||||
function notice(message, properties = {}) {
|
||||
command_1.issueCommand('notice', utils_1.toCommandProperties(properties), message instanceof Error ? message.toString() : message);
|
||||
}
|
||||
exports.notice = notice;
|
||||
/**
|
||||
* Writes info to log with console.log.
|
||||
* @param message info message
|
||||
*/
|
||||
function info(message) {
|
||||
process.stdout.write(message + os.EOL);
|
||||
}
|
||||
exports.info = info;
|
||||
/**
|
||||
* Begin an output group.
|
||||
*
|
||||
* Output until the next `groupEnd` will be foldable in this group
|
||||
*
|
||||
* @param name The name of the output group
|
||||
*/
|
||||
function startGroup(name) {
|
||||
command_1.issue('group', name);
|
||||
}
|
||||
exports.startGroup = startGroup;
|
||||
/**
|
||||
* End an output group.
|
||||
*/
|
||||
function endGroup() {
|
||||
command_1.issue('endgroup');
|
||||
}
|
||||
exports.endGroup = endGroup;
|
||||
/**
|
||||
* Wrap an asynchronous function call in a group.
|
||||
*
|
||||
* Returns the same type as the function itself.
|
||||
*
|
||||
* @param name The name of the group
|
||||
* @param fn The function to wrap in the group
|
||||
*/
|
||||
function group(name, fn) {
|
||||
return __awaiter(this, void 0, void 0, function* () {
|
||||
startGroup(name);
|
||||
let result;
|
||||
try {
|
||||
result = yield fn();
|
||||
}
|
||||
finally {
|
||||
endGroup();
|
||||
}
|
||||
return result;
|
||||
});
|
||||
}
|
||||
exports.group = group;
|
||||
//-----------------------------------------------------------------------
|
||||
// Wrapper action state
|
||||
//-----------------------------------------------------------------------
|
||||
/**
|
||||
* Saves state for current action, the state can only be retrieved by this action's post job execution.
|
||||
*
|
||||
* @param name name of the state to store
|
||||
* @param value value to store. Non-string values will be converted to a string via JSON.stringify
|
||||
*/
|
||||
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||
function saveState(name, value) {
|
||||
command_1.issueCommand('save-state', { name }, value);
|
||||
}
|
||||
exports.saveState = saveState;
|
||||
/**
|
||||
* Gets the value of a state set by this action's main execution.
|
||||
*
|
||||
* @param name name of the state to get
|
||||
* @returns string
|
||||
*/
|
||||
function getState(name) {
|
||||
return process.env[`STATE_${name}`] || '';
|
||||
}
|
||||
exports.getState = getState;
|
||||
function getIDToken(aud) {
|
||||
return __awaiter(this, void 0, void 0, function* () {
|
||||
return yield oidc_utils_1.OidcClient.getIDToken(aud);
|
||||
});
|
||||
}
|
||||
exports.getIDToken = getIDToken;
|
||||
//# sourceMappingURL=core.js.map
|
node_modules/@actions/core/lib/core.js.map (new file, generated, vendored): 1 line
@@ -0,0 +1 @@
{"version":3,"file":"core.js","sourceRoot":"","sources":["../src/core.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,uCAA6C;AAC7C,iDAA+D;AAC/D,mCAA2D;AAE3D,uCAAwB;AACxB,2CAA4B;AAE5B,6CAAuC;AAavC;;GAEG;AACH,IAAY,QAUX;AAVD,WAAY,QAAQ;IAClB;;OAEG;IACH,6CAAW,CAAA;IAEX;;OAEG;IACH,6CAAW,CAAA;AACb,CAAC,EAVW,QAAQ,GAAR,gBAAQ,KAAR,gBAAQ,QAUnB;AAuCD,yEAAyE;AACzE,YAAY;AACZ,yEAAyE;AAEzE;;;;GAIG;AACH,8DAA8D;AAC9D,SAAgB,cAAc,CAAC,IAAY,EAAE,GAAQ;IACnD,MAAM,YAAY,GAAG,sBAAc,CAAC,GAAG,CAAC,CAAA;IACxC,OAAO,CAAC,GAAG,CAAC,IAAI,CAAC,GAAG,YAAY,CAAA;IAEhC,MAAM,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,IAAI,EAAE,CAAA;IAChD,IAAI,QAAQ,EAAE;QACZ,MAAM,SAAS,GAAG,qCAAqC,CAAA;QACvD,MAAM,YAAY,GAAG,GAAG,IAAI,KAAK,SAAS,GAAG,EAAE,CAAC,GAAG,GAAG,YAAY,GAAG,EAAE,CAAC,GAAG,GAAG,SAAS,EAAE,CAAA;QACzF,2BAAgB,CAAC,KAAK,EAAE,YAAY,CAAC,CAAA;KACtC;SAAM;QACL,sBAAY,CAAC,SAAS,EAAE,EAAC,IAAI,EAAC,EAAE,YAAY,CAAC,CAAA;KAC9C;AACH,CAAC;AAZD,wCAYC;AAED;;;GAGG;AACH,SAAgB,SAAS,CAAC,MAAc;IACtC,sBAAY,CAAC,UAAU,EAAE,EAAE,EAAE,MAAM,CAAC,CAAA;AACtC,CAAC;AAFD,8BAEC;AAED;;;GAGG;AACH,SAAgB,OAAO,CAAC,SAAiB;IACvC,MAAM,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,IAAI,EAAE,CAAA;IACjD,IAAI,QAAQ,EAAE;QACZ,2BAAgB,CAAC,MAAM,EAAE,SAAS,CAAC,CAAA;KACpC;SAAM;QACL,sBAAY,CAAC,UAAU,EAAE,EAAE,EAAE,SAAS,CAAC,CAAA;KACxC;IACD,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,GAAG,GAAG,SAAS,GAAG,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,EAAE,CAAA;AAC7E,CAAC;AARD,0BAQC;AAED;;;;;;;;GAQG;AACH,SAAgB,QAAQ,CAAC,IAAY,EAAE,OAAsB;IAC3D,MAAM,GAAG,GACP,OAAO,CAAC,GAAG,CAAC,SAAS,IAAI,CAAC,OAAO,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC,WAAW,EAAE,EAAE,CAAC,IAAI,EAAE,CAAA;IACrE,IAAI,OAAO,IAAI,OAAO,CAAC,QAAQ,IAAI,CAAC,GAAG,EAAE;QACvC,MAAM,IAAI,KAAK,CAAC,oCAAoC,IAAI,EAAE,CAAC,CAAA;KAC5D;IAED,IAAI,OAAO,IAAI,OAAO,CAAC,cAAc,KAAK,KAAK,EAAE;QAC/C,OAAO,GAAG,CAAA;KACX;IAED,OAAO,GAAG,CAAC,IAAI,EAAE,CAAA;AACnB,CAAC;AAZD,4BAYC;AAED;;;;;;;GAOG;AACH,SAAgB,iBAAiB,CAC/B,IAAY,EACZ,OAAsB;IAEtB,MAAM,MAAM,GAAa,QAAQ,CAAC,IAAI,EAAE,OAAO,CAAC;SAC7C,KAAK,CAAC,IAAI,CAAC;SACX,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,KAAK,EAAE,CAAC,CAAA;IAExB,OAAO,MAAM,CAAA;AACf,CAAC;AATD,8CASC;AAED;;;;;;;;;GASG;AACH,SAAgB,eAAe,CAAC,IAAY,EAAE,OAAsB;IAClE,MAAM,SAAS,GAAG,CAAC,MAAM,EAAE,MAAM,EAAE,MAAM,CAAC,CAAA;IAC1C,MAAM,UAAU,GAAG,CAAC,OAAO,EAAE,OAAO,EAAE,OAAO,CAAC,CAAA;IAC9C,MAAM,GAAG,GAAG,QAAQ,CAAC,IAAI,EAAE,OAAO,CAAC,CAAA;IACnC,IAAI,SAAS,CAAC,QAAQ,CAAC,GAAG,CAAC;QAAE,OAAO,IAAI,CAAA;IACxC,IAAI,UAAU,CAAC,QAAQ,CAAC,GAAG,CAAC;QAAE,OAAO,KAAK,CAAA;IAC1C,MAAM,IAAI,SAAS,CACjB,6DAA6D,IAAI,IAAI;QACnE,4EAA4E,CAC/E,CAAA;AACH,CAAC;AAVD,0CAUC;AAED;;;;;GAKG;AACH,8DAA8D;AAC9D,SAAgB,SAAS,CAAC,IAAY,EAAE,KAAU;IAChD,OAAO,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,CAAC,GAAG,CAAC,CAAA;IAC5B,sBAAY,CAAC,YAAY,EAAE,EAAC,IAAI,EAAC,EAAE,KAAK,CAAC,CAAA;AAC3C,CAAC;AAHD,8BAGC;AAED;;;;GAIG;AACH,SAAgB,cAAc,CAAC,OAAgB;IAC7C,eAAK,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,CAAC,CAAA;AACvC,CAAC;AAFD,wCAEC;AAED,yEAAyE;AACzE,UAAU;AACV,yEAAyE;AAEzE;;;;GAIG;AACH,SAAgB,SAAS,CAAC,OAAuB;IAC/C,OAAO,CAAC,QAAQ,GAAG,QAAQ,CAAC,OAAO,CAAA;IAEnC,KAAK,CAAC,OAAO,CAAC,CAAA;AAChB,CAAC;AAJD,8BAIC;AAED,yEAAyE;AACzE,mBAAmB;AACnB,yEAAyE;AAEzE;;GAEG;AACH,SAAgB,OAAO;IACrB,OAAO,OAAO,CAAC,GAAG,CAAC,cAAc,CAAC,KAAK,GAAG,CAAA;AAC5C,CAAC;AAFD,0BAEC;AAED;;;GAGG;AACH,SAAgB,KAAK,CAAC,OAAe;IACnC,sBAAY,CAAC,OAAO,EAAE,EAAE,EAAE,OAAO,CAAC,CAAA;AACpC,CAAC;AAFD,sBAEC;AAED;;;;GAIG;AACH,SAAgB,KAAK,CACnB,OAAuB,EACvB,aAAmC,EAAE;IAErC,sBAAY,CACV,OAAO,EACP,2BAAmB,CAAC,UAAU,CAAC,EAC/B,OAAO,YAAY,KAAK,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC,OAAO,CACxD,CAAA;AACH,CAAC;AATD,sBASC;AAED;;;;GAIG;AACH,SAAgB,OAAO,CACrB,OAAuB,EACvB
,aAAmC,EAAE;IAErC,sBAAY,CACV,SAAS,EACT,2BAAmB,CAAC,UAAU,CAAC,EAC/B,OAAO,YAAY,KAAK,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC,OAAO,CACxD,CAAA;AACH,CAAC;AATD,0BASC;AAED;;;;GAIG;AACH,SAAgB,MAAM,CACpB,OAAuB,EACvB,aAAmC,EAAE;IAErC,sBAAY,CACV,QAAQ,EACR,2BAAmB,CAAC,UAAU,CAAC,EAC/B,OAAO,YAAY,KAAK,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC,OAAO,CACxD,CAAA;AACH,CAAC;AATD,wBASC;AAED;;;GAGG;AACH,SAAgB,IAAI,CAAC,OAAe;IAClC,OAAO,CAAC,MAAM,CAAC,KAAK,CAAC,OAAO,GAAG,EAAE,CAAC,GAAG,CAAC,CAAA;AACxC,CAAC;AAFD,oBAEC;AAED;;;;;;GAMG;AACH,SAAgB,UAAU,CAAC,IAAY;IACrC,eAAK,CAAC,OAAO,EAAE,IAAI,CAAC,CAAA;AACtB,CAAC;AAFD,gCAEC;AAED;;GAEG;AACH,SAAgB,QAAQ;IACtB,eAAK,CAAC,UAAU,CAAC,CAAA;AACnB,CAAC;AAFD,4BAEC;AAED;;;;;;;GAOG;AACH,SAAsB,KAAK,CAAI,IAAY,EAAE,EAAoB;;QAC/D,UAAU,CAAC,IAAI,CAAC,CAAA;QAEhB,IAAI,MAAS,CAAA;QAEb,IAAI;YACF,MAAM,GAAG,MAAM,EAAE,EAAE,CAAA;SACpB;gBAAS;YACR,QAAQ,EAAE,CAAA;SACX;QAED,OAAO,MAAM,CAAA;IACf,CAAC;CAAA;AAZD,sBAYC;AAED,yEAAyE;AACzE,uBAAuB;AACvB,yEAAyE;AAEzE;;;;;GAKG;AACH,8DAA8D;AAC9D,SAAgB,SAAS,CAAC,IAAY,EAAE,KAAU;IAChD,sBAAY,CAAC,YAAY,EAAE,EAAC,IAAI,EAAC,EAAE,KAAK,CAAC,CAAA;AAC3C,CAAC;AAFD,8BAEC;AAED;;;;;GAKG;AACH,SAAgB,QAAQ,CAAC,IAAY;IACnC,OAAO,OAAO,CAAC,GAAG,CAAC,SAAS,IAAI,EAAE,CAAC,IAAI,EAAE,CAAA;AAC3C,CAAC;AAFD,4BAEC;AAED,SAAsB,UAAU,CAAC,GAAY;;QAC3C,OAAO,MAAM,uBAAU,CAAC,UAAU,CAAC,GAAG,CAAC,CAAA;IACzC,CAAC;CAAA;AAFD,gCAEC"}
node_modules/@actions/core/lib/file-command.d.ts (new file, generated, vendored): 1 line
@@ -0,0 +1 @@
export declare function issueCommand(command: string, message: any): void;
node_modules/@actions/core/lib/file-command.js (new file, generated, vendored): 42 lines
@@ -0,0 +1,42 @@
|
|||
"use strict";
|
||||
// For internal use, subject to change.
|
||||
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
|
||||
if (k2 === undefined) k2 = k;
|
||||
Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
|
||||
}) : (function(o, m, k, k2) {
|
||||
if (k2 === undefined) k2 = k;
|
||||
o[k2] = m[k];
|
||||
}));
|
||||
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
|
||||
Object.defineProperty(o, "default", { enumerable: true, value: v });
|
||||
}) : function(o, v) {
|
||||
o["default"] = v;
|
||||
});
|
||||
var __importStar = (this && this.__importStar) || function (mod) {
|
||||
if (mod && mod.__esModule) return mod;
|
||||
var result = {};
|
||||
if (mod != null) for (var k in mod) if (k !== "default" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
|
||||
__setModuleDefault(result, mod);
|
||||
return result;
|
||||
};
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.issueCommand = void 0;
|
||||
// We use any as a valid input type
|
||||
/* eslint-disable @typescript-eslint/no-explicit-any */
|
||||
const fs = __importStar(require("fs"));
|
||||
const os = __importStar(require("os"));
|
||||
const utils_1 = require("./utils");
|
||||
function issueCommand(command, message) {
|
||||
const filePath = process.env[`GITHUB_${command}`];
|
||||
if (!filePath) {
|
||||
throw new Error(`Unable to find environment variable for file command ${command}`);
|
||||
}
|
||||
if (!fs.existsSync(filePath)) {
|
||||
throw new Error(`Missing file at path: ${filePath}`);
|
||||
}
|
||||
fs.appendFileSync(filePath, `${utils_1.toCommandValue(message)}${os.EOL}`, {
|
||||
encoding: 'utf8'
|
||||
});
|
||||
}
|
||||
exports.issueCommand = issueCommand;
|
||||
//# sourceMappingURL=file-command.js.map
|
node_modules/@actions/core/lib/file-command.js.map (new file, generated, vendored): 1 line
@@ -0,0 +1 @@
{"version":3,"file":"file-command.js","sourceRoot":"","sources":["../src/file-command.ts"],"names":[],"mappings":";AAAA,uCAAuC;;;;;;;;;;;;;;;;;;;;;;AAEvC,mCAAmC;AACnC,uDAAuD;AAEvD,uCAAwB;AACxB,uCAAwB;AACxB,mCAAsC;AAEtC,SAAgB,YAAY,CAAC,OAAe,EAAE,OAAY;IACxD,MAAM,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,UAAU,OAAO,EAAE,CAAC,CAAA;IACjD,IAAI,CAAC,QAAQ,EAAE;QACb,MAAM,IAAI,KAAK,CACb,wDAAwD,OAAO,EAAE,CAClE,CAAA;KACF;IACD,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,QAAQ,CAAC,EAAE;QAC5B,MAAM,IAAI,KAAK,CAAC,yBAAyB,QAAQ,EAAE,CAAC,CAAA;KACrD;IAED,EAAE,CAAC,cAAc,CAAC,QAAQ,EAAE,GAAG,sBAAc,CAAC,OAAO,CAAC,GAAG,EAAE,CAAC,GAAG,EAAE,EAAE;QACjE,QAAQ,EAAE,MAAM;KACjB,CAAC,CAAA;AACJ,CAAC;AAdD,oCAcC"}
node_modules/@actions/core/lib/oidc-utils.d.ts (new file, generated, vendored): 7 lines
@@ -0,0 +1,7 @@
export declare class OidcClient {
    private static createHttpClient;
    private static getRequestToken;
    private static getIDTokenUrl;
    private static getCall;
    static getIDToken(audience?: string): Promise<string>;
}
node_modules/@actions/core/lib/oidc-utils.js (new file, generated, vendored): 77 lines
@@ -0,0 +1,77 @@
|
|||
"use strict";
|
||||
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
|
||||
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
|
||||
return new (P || (P = Promise))(function (resolve, reject) {
|
||||
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
|
||||
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
|
||||
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
|
||||
step((generator = generator.apply(thisArg, _arguments || [])).next());
|
||||
});
|
||||
};
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.OidcClient = void 0;
|
||||
const http_client_1 = require("@actions/http-client");
|
||||
const auth_1 = require("@actions/http-client/auth");
|
||||
const core_1 = require("./core");
|
||||
class OidcClient {
|
||||
static createHttpClient(allowRetry = true, maxRetry = 10) {
|
||||
const requestOptions = {
|
||||
allowRetries: allowRetry,
|
||||
maxRetries: maxRetry
|
||||
};
|
||||
return new http_client_1.HttpClient('actions/oidc-client', [new auth_1.BearerCredentialHandler(OidcClient.getRequestToken())], requestOptions);
|
||||
}
|
||||
static getRequestToken() {
|
||||
const token = process.env['ACTIONS_ID_TOKEN_REQUEST_TOKEN'];
|
||||
if (!token) {
|
||||
throw new Error('Unable to get ACTIONS_ID_TOKEN_REQUEST_TOKEN env variable');
|
||||
}
|
||||
return token;
|
||||
}
|
||||
static getIDTokenUrl() {
|
||||
const runtimeUrl = process.env['ACTIONS_ID_TOKEN_REQUEST_URL'];
|
||||
if (!runtimeUrl) {
|
||||
throw new Error('Unable to get ACTIONS_ID_TOKEN_REQUEST_URL env variable');
|
||||
}
|
||||
return runtimeUrl;
|
||||
}
|
||||
static getCall(id_token_url) {
|
||||
var _a;
|
||||
return __awaiter(this, void 0, void 0, function* () {
|
||||
const httpclient = OidcClient.createHttpClient();
|
||||
const res = yield httpclient
|
||||
.getJson(id_token_url)
|
||||
.catch(error => {
|
||||
throw new Error(`Failed to get ID Token. \n
|
||||
Error Code : ${error.statusCode}\n
|
||||
Error Message: ${error.result.message}`);
|
||||
});
|
||||
const id_token = (_a = res.result) === null || _a === void 0 ? void 0 : _a.value;
|
||||
if (!id_token) {
|
||||
throw new Error('Response json body do not have ID Token field');
|
||||
}
|
||||
return id_token;
|
||||
});
|
||||
}
|
||||
static getIDToken(audience) {
|
||||
return __awaiter(this, void 0, void 0, function* () {
|
||||
try {
|
||||
// New ID Token is requested from action service
|
||||
let id_token_url = OidcClient.getIDTokenUrl();
|
||||
if (audience) {
|
||||
const encodedAudience = encodeURIComponent(audience);
|
||||
id_token_url = `${id_token_url}&audience=${encodedAudience}`;
|
||||
}
|
||||
core_1.debug(`ID token url is ${id_token_url}`);
|
||||
const id_token = yield OidcClient.getCall(id_token_url);
|
||||
core_1.setSecret(id_token);
|
||||
return id_token;
|
||||
}
|
||||
catch (error) {
|
||||
throw new Error(`Error message: ${error.message}`);
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
exports.OidcClient = OidcClient;
|
||||
//# sourceMappingURL=oidc-utils.js.map
|
node_modules/@actions/core/lib/oidc-utils.js.map (new file, generated, vendored): 1 line
@@ -0,0 +1 @@
{"version":3,"file":"oidc-utils.js","sourceRoot":"","sources":["../src/oidc-utils.ts"],"names":[],"mappings":";;;;;;;;;;;;AAGA,sDAA+C;AAC/C,oDAAiE;AACjE,iCAAuC;AAKvC,MAAa,UAAU;IACb,MAAM,CAAC,gBAAgB,CAC7B,UAAU,GAAG,IAAI,EACjB,QAAQ,GAAG,EAAE;QAEb,MAAM,cAAc,GAAoB;YACtC,YAAY,EAAE,UAAU;YACxB,UAAU,EAAE,QAAQ;SACrB,CAAA;QAED,OAAO,IAAI,wBAAU,CACnB,qBAAqB,EACrB,CAAC,IAAI,8BAAuB,CAAC,UAAU,CAAC,eAAe,EAAE,CAAC,CAAC,EAC3D,cAAc,CACf,CAAA;IACH,CAAC;IAEO,MAAM,CAAC,eAAe;QAC5B,MAAM,KAAK,GAAG,OAAO,CAAC,GAAG,CAAC,gCAAgC,CAAC,CAAA;QAC3D,IAAI,CAAC,KAAK,EAAE;YACV,MAAM,IAAI,KAAK,CACb,2DAA2D,CAC5D,CAAA;SACF;QACD,OAAO,KAAK,CAAA;IACd,CAAC;IAEO,MAAM,CAAC,aAAa;QAC1B,MAAM,UAAU,GAAG,OAAO,CAAC,GAAG,CAAC,8BAA8B,CAAC,CAAA;QAC9D,IAAI,CAAC,UAAU,EAAE;YACf,MAAM,IAAI,KAAK,CAAC,yDAAyD,CAAC,CAAA;SAC3E;QACD,OAAO,UAAU,CAAA;IACnB,CAAC;IAEO,MAAM,CAAO,OAAO,CAAC,YAAoB;;;YAC/C,MAAM,UAAU,GAAG,UAAU,CAAC,gBAAgB,EAAE,CAAA;YAEhD,MAAM,GAAG,GAAG,MAAM,UAAU;iBACzB,OAAO,CAAgB,YAAY,CAAC;iBACpC,KAAK,CAAC,KAAK,CAAC,EAAE;gBACb,MAAM,IAAI,KAAK,CACb;uBACa,KAAK,CAAC,UAAU;yBACd,KAAK,CAAC,MAAM,CAAC,OAAO,EAAE,CACtC,CAAA;YACH,CAAC,CAAC,CAAA;YAEJ,MAAM,QAAQ,SAAG,GAAG,CAAC,MAAM,0CAAE,KAAK,CAAA;YAClC,IAAI,CAAC,QAAQ,EAAE;gBACb,MAAM,IAAI,KAAK,CAAC,+CAA+C,CAAC,CAAA;aACjE;YACD,OAAO,QAAQ,CAAA;;KAChB;IAED,MAAM,CAAO,UAAU,CAAC,QAAiB;;YACvC,IAAI;gBACF,gDAAgD;gBAChD,IAAI,YAAY,GAAW,UAAU,CAAC,aAAa,EAAE,CAAA;gBACrD,IAAI,QAAQ,EAAE;oBACZ,MAAM,eAAe,GAAG,kBAAkB,CAAC,QAAQ,CAAC,CAAA;oBACpD,YAAY,GAAG,GAAG,YAAY,aAAa,eAAe,EAAE,CAAA;iBAC7D;gBAED,YAAK,CAAC,mBAAmB,YAAY,EAAE,CAAC,CAAA;gBAExC,MAAM,QAAQ,GAAG,MAAM,UAAU,CAAC,OAAO,CAAC,YAAY,CAAC,CAAA;gBACvD,gBAAS,CAAC,QAAQ,CAAC,CAAA;gBACnB,OAAO,QAAQ,CAAA;aAChB;YAAC,OAAO,KAAK,EAAE;gBACd,MAAM,IAAI,KAAK,CAAC,kBAAkB,KAAK,CAAC,OAAO,EAAE,CAAC,CAAA;aACnD;QACH,CAAC;KAAA;CACF;AAzED,gCAyEC"}
node_modules/@actions/core/lib/utils.d.ts (new file, generated, vendored): 14 lines
@@ -0,0 +1,14 @@
import { AnnotationProperties } from './core';
import { CommandProperties } from './command';
/**
 * Sanitizes an input into a string so it can be passed into issueCommand safely
 * @param input input to sanitize into a string
 */
export declare function toCommandValue(input: any): string;
/**
 *
 * @param annotationProperties
 * @returns The command properties to send with the actual annotation command
 * See IssueCommandProperties: https://github.com/actions/runner/blob/main/src/Runner.Worker/ActionCommandManager.cs#L646
 */
export declare function toCommandProperties(annotationProperties: AnnotationProperties): CommandProperties;
node_modules/@actions/core/lib/utils.js (new file, generated, vendored): 40 lines
@@ -0,0 +1,40 @@
|
|||
"use strict";
|
||||
// We use any as a valid input type
|
||||
/* eslint-disable @typescript-eslint/no-explicit-any */
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
exports.toCommandProperties = exports.toCommandValue = void 0;
|
||||
/**
|
||||
* Sanitizes an input into a string so it can be passed into issueCommand safely
|
||||
* @param input input to sanitize into a string
|
||||
*/
|
||||
function toCommandValue(input) {
|
||||
if (input === null || input === undefined) {
|
||||
return '';
|
||||
}
|
||||
else if (typeof input === 'string' || input instanceof String) {
|
||||
return input;
|
||||
}
|
||||
return JSON.stringify(input);
|
||||
}
|
||||
exports.toCommandValue = toCommandValue;
|
||||
/**
|
||||
*
|
||||
* @param annotationProperties
|
||||
* @returns The command properties to send with the actual annotation command
|
||||
* See IssueCommandProperties: https://github.com/actions/runner/blob/main/src/Runner.Worker/ActionCommandManager.cs#L646
|
||||
*/
|
||||
function toCommandProperties(annotationProperties) {
|
||||
if (!Object.keys(annotationProperties).length) {
|
||||
return {};
|
||||
}
|
||||
return {
|
||||
title: annotationProperties.title,
|
||||
file: annotationProperties.file,
|
||||
line: annotationProperties.startLine,
|
||||
endLine: annotationProperties.endLine,
|
||||
col: annotationProperties.startColumn,
|
||||
endColumn: annotationProperties.endColumn
|
||||
};
|
||||
}
|
||||
exports.toCommandProperties = toCommandProperties;
|
||||
//# sourceMappingURL=utils.js.map
|
node_modules/@actions/core/lib/utils.js.map (new file, generated, vendored): 1 line
@@ -0,0 +1 @@
{"version":3,"file":"utils.js","sourceRoot":"","sources":["../src/utils.ts"],"names":[],"mappings":";AAAA,mCAAmC;AACnC,uDAAuD;;;AAKvD;;;GAGG;AACH,SAAgB,cAAc,CAAC,KAAU;IACvC,IAAI,KAAK,KAAK,IAAI,IAAI,KAAK,KAAK,SAAS,EAAE;QACzC,OAAO,EAAE,CAAA;KACV;SAAM,IAAI,OAAO,KAAK,KAAK,QAAQ,IAAI,KAAK,YAAY,MAAM,EAAE;QAC/D,OAAO,KAAe,CAAA;KACvB;IACD,OAAO,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,CAAA;AAC9B,CAAC;AAPD,wCAOC;AAED;;;;;GAKG;AACH,SAAgB,mBAAmB,CACjC,oBAA0C;IAE1C,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,oBAAoB,CAAC,CAAC,MAAM,EAAE;QAC7C,OAAO,EAAE,CAAA;KACV;IAED,OAAO;QACL,KAAK,EAAE,oBAAoB,CAAC,KAAK;QACjC,IAAI,EAAE,oBAAoB,CAAC,IAAI;QAC/B,IAAI,EAAE,oBAAoB,CAAC,SAAS;QACpC,OAAO,EAAE,oBAAoB,CAAC,OAAO;QACrC,GAAG,EAAE,oBAAoB,CAAC,WAAW;QACrC,SAAS,EAAE,oBAAoB,CAAC,SAAS;KAC1C,CAAA;AACH,CAAC;AAfD,kDAeC"}
node_modules/@actions/core/package.json (new file, generated, vendored): 44 lines
@@ -0,0 +1,44 @@
{
  "name": "@actions/core",
  "version": "1.6.0",
  "description": "Actions core lib",
  "keywords": [
    "github",
    "actions",
    "core"
  ],
  "homepage": "https://github.com/actions/toolkit/tree/main/packages/core",
  "license": "MIT",
  "main": "lib/core.js",
  "types": "lib/core.d.ts",
  "directories": {
    "lib": "lib",
    "test": "__tests__"
  },
  "files": [
    "lib",
    "!.DS_Store"
  ],
  "publishConfig": {
    "access": "public"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/actions/toolkit.git",
    "directory": "packages/core"
  },
  "scripts": {
    "audit-moderate": "npm install && npm audit --json --audit-level=moderate > audit.json",
    "test": "echo \"Error: run tests from root\" && exit 1",
    "tsc": "tsc"
  },
  "bugs": {
    "url": "https://github.com/actions/toolkit/issues"
  },
  "dependencies": {
    "@actions/http-client": "^1.0.11"
  },
  "devDependencies": {
    "@types/node": "^12.0.2"
  }
}
21  node_modules/@actions/http-client/LICENSE  generated  vendored  Normal file
@@ -0,0 +1,21 @@
Actions Http Client for Node.js
|
||||
|
||||
Copyright (c) GitHub, Inc.
|
||||
|
||||
All rights reserved.
|
||||
|
||||
MIT License
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and
|
||||
associated documentation files (the "Software"), to deal in the Software without restriction,
|
||||
including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
|
||||
and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so,
|
||||
subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT
|
||||
LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
|
||||
NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
|
||||
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
|
||||
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
|
79  node_modules/@actions/http-client/README.md  generated  vendored  Normal file
@@ -0,0 +1,79 @@

<p align="center">
  <img src="actions.png">
</p>

# Actions Http-Client

[![Http Status](https://github.com/actions/http-client/workflows/http-tests/badge.svg)](https://github.com/actions/http-client/actions)

A lightweight HTTP client optimized for use with actions, TypeScript with generics and async await.

## Features

- HTTP client with TypeScript generics and async/await/Promises
- Typings included so no need to acquire separately (great for intellisense and no versioning drift)
- [Proxy support](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-self-hosted-runners#using-a-proxy-server-with-self-hosted-runners) just works with actions and the runner
- Targets ES2019 (runner runs actions with node 12+). Only supported on node 12+.
- Basic, Bearer and PAT Support out of the box. Extensible handlers for others.
- Redirects supported

Features and releases [here](./RELEASES.md)

## Install

```
npm install @actions/http-client --save
```

## Samples

See the [HTTP](./__tests__) tests for detailed examples.

## Errors

### HTTP

The HTTP client does not throw unless truly exceptional.

* A request that successfully executes resulting in a 404, 500 etc... will return a response object with a status code and a body.
* Redirects (3xx) will be followed by default.

See [HTTP tests](./__tests__) for detailed examples.

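A minimal sketch of the no-throw behaviour described above; the URL and user-agent string are illustrative, not part of this package:

```js
const httpm = require('@actions/http-client');

async function main() {
  const client = new httpm.HttpClient('my-user-agent');
  const res = await client.get('https://api.github.com/zen');

  // No exception on 4xx/5xx: inspect the status code and body yourself.
  if (res.message.statusCode !== httpm.HttpCodes.OK) {
    console.error(`request failed with status ${res.message.statusCode}`);
  }
  console.log(await res.readBody());
}

main();
```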
## Debugging

To enable detailed console logging of all HTTP requests and responses, set the NODE_DEBUG environment variable:

```
export NODE_DEBUG=http
```

## Node support

The http-client is built using the latest LTS version of Node 12. It may work on previous node LTS versions but it's tested and officially supported on Node 12+.

## Support and Versioning

We follow semver and will hold compatibility between major versions and increment the minor version with new features and capabilities (while holding compat).

## Contributing

We welcome PRs. Please create an issue and, if applicable, a design before proceeding with code.

Install dependencies once:

```bash
$ npm install
```

To build:

```bash
$ npm run build
```

To run all tests:
```bash
$ npm test
```
26  node_modules/@actions/http-client/RELEASES.md  generated  vendored  Normal file
@@ -0,0 +1,26 @@
## Releases

## 1.0.10

Contains a bug fix where proxy is defined without a user and password. See [PR here](https://github.com/actions/http-client/pull/42)

## 1.0.9
Throw HttpClientError instead of a generic Error from the \<verb>Json() helper methods when the server responds with a non-successful status code.

## 1.0.8
Fixed security issue where a redirect (e.g. 302) to another domain would pass headers. The fix was to strip the authorization header if the hostname was different. More [details in PR #27](https://github.com/actions/http-client/pull/27)

## 1.0.7
Update NPM dependencies and add 429 to the list of HttpCodes

## 1.0.6
Automatically sends Content-Type and Accept application/json headers for \<verb>Json() helper methods if not set in the client or parameters.

## 1.0.5
Adds \<verb>Json() helper methods for json over http scenarios.

## 1.0.4
Started to add \<verb>Json() helper methods. Do not use this release for that. Use >= 1.0.5 since there was an issue with types.

## 1.0.1 to 1.0.3
Adds proxy support.
BIN  node_modules/@actions/http-client/actions.png  generated  vendored  Normal file
Binary file not shown. After: 33 KiB.
23  node_modules/@actions/http-client/auth.d.ts  generated  vendored  Normal file
@@ -0,0 +1,23 @@
import ifm = require('./interfaces');
|
||||
export declare class BasicCredentialHandler implements ifm.IRequestHandler {
|
||||
username: string;
|
||||
password: string;
|
||||
constructor(username: string, password: string);
|
||||
prepareRequest(options: any): void;
|
||||
canHandleAuthentication(response: ifm.IHttpClientResponse): boolean;
|
||||
handleAuthentication(httpClient: ifm.IHttpClient, requestInfo: ifm.IRequestInfo, objs: any): Promise<ifm.IHttpClientResponse>;
|
||||
}
|
||||
export declare class BearerCredentialHandler implements ifm.IRequestHandler {
|
||||
token: string;
|
||||
constructor(token: string);
|
||||
prepareRequest(options: any): void;
|
||||
canHandleAuthentication(response: ifm.IHttpClientResponse): boolean;
|
||||
handleAuthentication(httpClient: ifm.IHttpClient, requestInfo: ifm.IRequestInfo, objs: any): Promise<ifm.IHttpClientResponse>;
|
||||
}
|
||||
export declare class PersonalAccessTokenCredentialHandler implements ifm.IRequestHandler {
|
||||
token: string;
|
||||
constructor(token: string);
|
||||
prepareRequest(options: any): void;
|
||||
canHandleAuthentication(response: ifm.IHttpClientResponse): boolean;
|
||||
handleAuthentication(httpClient: ifm.IHttpClient, requestInfo: ifm.IRequestInfo, objs: any): Promise<ifm.IHttpClientResponse>;
|
||||
}
|
58  node_modules/@actions/http-client/auth.js  generated  vendored  Normal file
@@ -0,0 +1,58 @@
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
class BasicCredentialHandler {
|
||||
constructor(username, password) {
|
||||
this.username = username;
|
||||
this.password = password;
|
||||
}
|
||||
prepareRequest(options) {
|
||||
options.headers['Authorization'] =
|
||||
'Basic ' +
|
||||
Buffer.from(this.username + ':' + this.password).toString('base64');
|
||||
}
|
||||
// This handler cannot handle 401
|
||||
canHandleAuthentication(response) {
|
||||
return false;
|
||||
}
|
||||
handleAuthentication(httpClient, requestInfo, objs) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
exports.BasicCredentialHandler = BasicCredentialHandler;
|
||||
class BearerCredentialHandler {
|
||||
constructor(token) {
|
||||
this.token = token;
|
||||
}
|
||||
// currently implements pre-authorization
|
||||
// TODO: support preAuth = false where it hooks on 401
|
||||
prepareRequest(options) {
|
||||
options.headers['Authorization'] = 'Bearer ' + this.token;
|
||||
}
|
||||
// This handler cannot handle 401
|
||||
canHandleAuthentication(response) {
|
||||
return false;
|
||||
}
|
||||
handleAuthentication(httpClient, requestInfo, objs) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
exports.BearerCredentialHandler = BearerCredentialHandler;
|
||||
class PersonalAccessTokenCredentialHandler {
|
||||
constructor(token) {
|
||||
this.token = token;
|
||||
}
|
||||
// currently implements pre-authorization
|
||||
// TODO: support preAuth = false where it hooks on 401
|
||||
prepareRequest(options) {
|
||||
options.headers['Authorization'] =
|
||||
'Basic ' + Buffer.from('PAT:' + this.token).toString('base64');
|
||||
}
|
||||
// This handler cannot handle 401
|
||||
canHandleAuthentication(response) {
|
||||
return false;
|
||||
}
|
||||
handleAuthentication(httpClient, requestInfo, objs) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
exports.PersonalAccessTokenCredentialHandler = PersonalAccessTokenCredentialHandler;
|
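A short sketch of how one of these handlers plugs into HttpClient: the handler's prepareRequest adds the Authorization header to every outgoing request. The token and URL below are placeholders:

```js
const httpm = require('@actions/http-client');
const auth = require('@actions/http-client/auth');

// Any of the three handlers (Basic, Bearer, PAT) can be passed in the handlers array.
const bearer = new auth.BearerCredentialHandler('my-token');
const client = new httpm.HttpClient('my-user-agent', [bearer]);

// prepareRequest() sets `Authorization: Bearer my-token` before the request is sent.
client.getJson('https://api.github.com/user').then(res => {
  console.log(res.statusCode, res.result);
});
```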
124  node_modules/@actions/http-client/index.d.ts  generated  vendored  Normal file
@@ -0,0 +1,124 @@
/// <reference types="node" />
|
||||
import http = require('http');
|
||||
import ifm = require('./interfaces');
|
||||
export declare enum HttpCodes {
|
||||
OK = 200,
|
||||
MultipleChoices = 300,
|
||||
MovedPermanently = 301,
|
||||
ResourceMoved = 302,
|
||||
SeeOther = 303,
|
||||
NotModified = 304,
|
||||
UseProxy = 305,
|
||||
SwitchProxy = 306,
|
||||
TemporaryRedirect = 307,
|
||||
PermanentRedirect = 308,
|
||||
BadRequest = 400,
|
||||
Unauthorized = 401,
|
||||
PaymentRequired = 402,
|
||||
Forbidden = 403,
|
||||
NotFound = 404,
|
||||
MethodNotAllowed = 405,
|
||||
NotAcceptable = 406,
|
||||
ProxyAuthenticationRequired = 407,
|
||||
RequestTimeout = 408,
|
||||
Conflict = 409,
|
||||
Gone = 410,
|
||||
TooManyRequests = 429,
|
||||
InternalServerError = 500,
|
||||
NotImplemented = 501,
|
||||
BadGateway = 502,
|
||||
ServiceUnavailable = 503,
|
||||
GatewayTimeout = 504
|
||||
}
|
||||
export declare enum Headers {
|
||||
Accept = "accept",
|
||||
ContentType = "content-type"
|
||||
}
|
||||
export declare enum MediaTypes {
|
||||
ApplicationJson = "application/json"
|
||||
}
|
||||
/**
|
||||
* Returns the proxy URL, depending upon the supplied url and proxy environment variables.
|
||||
* @param serverUrl The server URL where the request will be sent. For example, https://api.github.com
|
||||
*/
|
||||
export declare function getProxyUrl(serverUrl: string): string;
|
||||
export declare class HttpClientError extends Error {
|
||||
constructor(message: string, statusCode: number);
|
||||
statusCode: number;
|
||||
result?: any;
|
||||
}
|
||||
export declare class HttpClientResponse implements ifm.IHttpClientResponse {
|
||||
constructor(message: http.IncomingMessage);
|
||||
message: http.IncomingMessage;
|
||||
readBody(): Promise<string>;
|
||||
}
|
||||
export declare function isHttps(requestUrl: string): boolean;
|
||||
export declare class HttpClient {
|
||||
userAgent: string | undefined;
|
||||
handlers: ifm.IRequestHandler[];
|
||||
requestOptions: ifm.IRequestOptions;
|
||||
private _ignoreSslError;
|
||||
private _socketTimeout;
|
||||
private _allowRedirects;
|
||||
private _allowRedirectDowngrade;
|
||||
private _maxRedirects;
|
||||
private _allowRetries;
|
||||
private _maxRetries;
|
||||
private _agent;
|
||||
private _proxyAgent;
|
||||
private _keepAlive;
|
||||
private _disposed;
|
||||
constructor(userAgent?: string, handlers?: ifm.IRequestHandler[], requestOptions?: ifm.IRequestOptions);
|
||||
options(requestUrl: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
get(requestUrl: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
del(requestUrl: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
post(requestUrl: string, data: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
patch(requestUrl: string, data: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
put(requestUrl: string, data: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
head(requestUrl: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
sendStream(verb: string, requestUrl: string, stream: NodeJS.ReadableStream, additionalHeaders?: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
/**
|
||||
* Gets a typed object from an endpoint
|
||||
* Be aware that not found returns a null. Other errors (4xx, 5xx) reject the promise
|
||||
*/
|
||||
getJson<T>(requestUrl: string, additionalHeaders?: ifm.IHeaders): Promise<ifm.ITypedResponse<T>>;
|
||||
postJson<T>(requestUrl: string, obj: any, additionalHeaders?: ifm.IHeaders): Promise<ifm.ITypedResponse<T>>;
|
||||
putJson<T>(requestUrl: string, obj: any, additionalHeaders?: ifm.IHeaders): Promise<ifm.ITypedResponse<T>>;
|
||||
patchJson<T>(requestUrl: string, obj: any, additionalHeaders?: ifm.IHeaders): Promise<ifm.ITypedResponse<T>>;
|
||||
/**
|
||||
* Makes a raw http request.
|
||||
* All other methods such as get, post, patch, and request ultimately call this.
|
||||
* Prefer get, del, post and patch
|
||||
*/
|
||||
request(verb: string, requestUrl: string, data: string | NodeJS.ReadableStream, headers: ifm.IHeaders): Promise<ifm.IHttpClientResponse>;
|
||||
/**
|
||||
* Needs to be called if keepAlive is set to true in request options.
|
||||
*/
|
||||
dispose(): void;
|
||||
/**
|
||||
* Raw request.
|
||||
* @param info
|
||||
* @param data
|
||||
*/
|
||||
requestRaw(info: ifm.IRequestInfo, data: string | NodeJS.ReadableStream): Promise<ifm.IHttpClientResponse>;
|
||||
/**
|
||||
* Raw request with callback.
|
||||
* @param info
|
||||
* @param data
|
||||
* @param onResult
|
||||
*/
|
||||
requestRawWithCallback(info: ifm.IRequestInfo, data: string | NodeJS.ReadableStream, onResult: (err: any, res: ifm.IHttpClientResponse) => void): void;
|
||||
/**
|
||||
* Gets an http agent. This function is useful when you need an http agent that handles
|
||||
* routing through a proxy server - depending upon the url and proxy environment variables.
|
||||
* @param serverUrl The server URL where the request will be sent. For example, https://api.github.com
|
||||
*/
|
||||
getAgent(serverUrl: string): http.Agent;
|
||||
private _prepareRequest;
|
||||
private _mergeHeaders;
|
||||
private _getExistingOrDefaultHeader;
|
||||
private _getAgent;
|
||||
private _performExponentialBackoff;
|
||||
private static dateTimeDeserializer;
|
||||
private _processResponse;
|
||||
}
|
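The getJson contract documented above (a 404 resolves with a null result; other 4xx/5xx reject with HttpClientError) might be consumed like this; the URL is illustrative:

```js
const httpm = require('@actions/http-client');

async function fetchJson(url) {
  const client = new httpm.HttpClient('my-user-agent');
  try {
    const res = await client.getJson(url);
    // A 404 resolves normally with res.result === null rather than rejecting.
    return res.result;
  } catch (err) {
    // Other non-2xx codes reject with an HttpClientError carrying statusCode.
    console.error(`status ${err.statusCode}: ${err.message}`);
    throw err;
  }
}
```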
537  node_modules/@actions/http-client/index.js  generated  vendored  Normal file
@@ -0,0 +1,537 @@
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
const http = require("http");
|
||||
const https = require("https");
|
||||
const pm = require("./proxy");
|
||||
let tunnel;
|
||||
var HttpCodes;
|
||||
(function (HttpCodes) {
|
||||
HttpCodes[HttpCodes["OK"] = 200] = "OK";
|
||||
HttpCodes[HttpCodes["MultipleChoices"] = 300] = "MultipleChoices";
|
||||
HttpCodes[HttpCodes["MovedPermanently"] = 301] = "MovedPermanently";
|
||||
HttpCodes[HttpCodes["ResourceMoved"] = 302] = "ResourceMoved";
|
||||
HttpCodes[HttpCodes["SeeOther"] = 303] = "SeeOther";
|
||||
HttpCodes[HttpCodes["NotModified"] = 304] = "NotModified";
|
||||
HttpCodes[HttpCodes["UseProxy"] = 305] = "UseProxy";
|
||||
HttpCodes[HttpCodes["SwitchProxy"] = 306] = "SwitchProxy";
|
||||
HttpCodes[HttpCodes["TemporaryRedirect"] = 307] = "TemporaryRedirect";
|
||||
HttpCodes[HttpCodes["PermanentRedirect"] = 308] = "PermanentRedirect";
|
||||
HttpCodes[HttpCodes["BadRequest"] = 400] = "BadRequest";
|
||||
HttpCodes[HttpCodes["Unauthorized"] = 401] = "Unauthorized";
|
||||
HttpCodes[HttpCodes["PaymentRequired"] = 402] = "PaymentRequired";
|
||||
HttpCodes[HttpCodes["Forbidden"] = 403] = "Forbidden";
|
||||
HttpCodes[HttpCodes["NotFound"] = 404] = "NotFound";
|
||||
HttpCodes[HttpCodes["MethodNotAllowed"] = 405] = "MethodNotAllowed";
|
||||
HttpCodes[HttpCodes["NotAcceptable"] = 406] = "NotAcceptable";
|
||||
HttpCodes[HttpCodes["ProxyAuthenticationRequired"] = 407] = "ProxyAuthenticationRequired";
|
||||
HttpCodes[HttpCodes["RequestTimeout"] = 408] = "RequestTimeout";
|
||||
HttpCodes[HttpCodes["Conflict"] = 409] = "Conflict";
|
||||
HttpCodes[HttpCodes["Gone"] = 410] = "Gone";
|
||||
HttpCodes[HttpCodes["TooManyRequests"] = 429] = "TooManyRequests";
|
||||
HttpCodes[HttpCodes["InternalServerError"] = 500] = "InternalServerError";
|
||||
HttpCodes[HttpCodes["NotImplemented"] = 501] = "NotImplemented";
|
||||
HttpCodes[HttpCodes["BadGateway"] = 502] = "BadGateway";
|
||||
HttpCodes[HttpCodes["ServiceUnavailable"] = 503] = "ServiceUnavailable";
|
||||
HttpCodes[HttpCodes["GatewayTimeout"] = 504] = "GatewayTimeout";
|
||||
})(HttpCodes = exports.HttpCodes || (exports.HttpCodes = {}));
|
||||
var Headers;
|
||||
(function (Headers) {
|
||||
Headers["Accept"] = "accept";
|
||||
Headers["ContentType"] = "content-type";
|
||||
})(Headers = exports.Headers || (exports.Headers = {}));
|
||||
var MediaTypes;
|
||||
(function (MediaTypes) {
|
||||
MediaTypes["ApplicationJson"] = "application/json";
|
||||
})(MediaTypes = exports.MediaTypes || (exports.MediaTypes = {}));
|
||||
/**
|
||||
* Returns the proxy URL, depending upon the supplied url and proxy environment variables.
|
||||
* @param serverUrl The server URL where the request will be sent. For example, https://api.github.com
|
||||
*/
|
||||
function getProxyUrl(serverUrl) {
|
||||
let proxyUrl = pm.getProxyUrl(new URL(serverUrl));
|
||||
return proxyUrl ? proxyUrl.href : '';
|
||||
}
|
||||
exports.getProxyUrl = getProxyUrl;
|
||||
const HttpRedirectCodes = [
|
||||
HttpCodes.MovedPermanently,
|
||||
HttpCodes.ResourceMoved,
|
||||
HttpCodes.SeeOther,
|
||||
HttpCodes.TemporaryRedirect,
|
||||
HttpCodes.PermanentRedirect
|
||||
];
|
||||
const HttpResponseRetryCodes = [
|
||||
HttpCodes.BadGateway,
|
||||
HttpCodes.ServiceUnavailable,
|
||||
HttpCodes.GatewayTimeout
|
||||
];
|
||||
const RetryableHttpVerbs = ['OPTIONS', 'GET', 'DELETE', 'HEAD'];
|
||||
const ExponentialBackoffCeiling = 10;
|
||||
const ExponentialBackoffTimeSlice = 5;
|
||||
class HttpClientError extends Error {
|
||||
constructor(message, statusCode) {
|
||||
super(message);
|
||||
this.name = 'HttpClientError';
|
||||
this.statusCode = statusCode;
|
||||
Object.setPrototypeOf(this, HttpClientError.prototype);
|
||||
}
|
||||
}
|
||||
exports.HttpClientError = HttpClientError;
|
||||
class HttpClientResponse {
|
||||
constructor(message) {
|
||||
this.message = message;
|
||||
}
|
||||
readBody() {
|
||||
return new Promise(async (resolve, reject) => {
|
||||
let output = Buffer.alloc(0);
|
||||
this.message.on('data', (chunk) => {
|
||||
output = Buffer.concat([output, chunk]);
|
||||
});
|
||||
this.message.on('end', () => {
|
||||
resolve(output.toString());
|
||||
});
|
||||
});
|
||||
}
|
||||
}
|
||||
exports.HttpClientResponse = HttpClientResponse;
|
||||
function isHttps(requestUrl) {
|
||||
let parsedUrl = new URL(requestUrl);
|
||||
return parsedUrl.protocol === 'https:';
|
||||
}
|
||||
exports.isHttps = isHttps;
|
||||
class HttpClient {
|
||||
constructor(userAgent, handlers, requestOptions) {
|
||||
this._ignoreSslError = false;
|
||||
this._allowRedirects = true;
|
||||
this._allowRedirectDowngrade = false;
|
||||
this._maxRedirects = 50;
|
||||
this._allowRetries = false;
|
||||
this._maxRetries = 1;
|
||||
this._keepAlive = false;
|
||||
this._disposed = false;
|
||||
this.userAgent = userAgent;
|
||||
this.handlers = handlers || [];
|
||||
this.requestOptions = requestOptions;
|
||||
if (requestOptions) {
|
||||
if (requestOptions.ignoreSslError != null) {
|
||||
this._ignoreSslError = requestOptions.ignoreSslError;
|
||||
}
|
||||
this._socketTimeout = requestOptions.socketTimeout;
|
||||
if (requestOptions.allowRedirects != null) {
|
||||
this._allowRedirects = requestOptions.allowRedirects;
|
||||
}
|
||||
if (requestOptions.allowRedirectDowngrade != null) {
|
||||
this._allowRedirectDowngrade = requestOptions.allowRedirectDowngrade;
|
||||
}
|
||||
if (requestOptions.maxRedirects != null) {
|
||||
this._maxRedirects = Math.max(requestOptions.maxRedirects, 0);
|
||||
}
|
||||
if (requestOptions.keepAlive != null) {
|
||||
this._keepAlive = requestOptions.keepAlive;
|
||||
}
|
||||
if (requestOptions.allowRetries != null) {
|
||||
this._allowRetries = requestOptions.allowRetries;
|
||||
}
|
||||
if (requestOptions.maxRetries != null) {
|
||||
this._maxRetries = requestOptions.maxRetries;
|
||||
}
|
||||
}
|
||||
}
|
||||
options(requestUrl, additionalHeaders) {
|
||||
return this.request('OPTIONS', requestUrl, null, additionalHeaders || {});
|
||||
}
|
||||
get(requestUrl, additionalHeaders) {
|
||||
return this.request('GET', requestUrl, null, additionalHeaders || {});
|
||||
}
|
||||
del(requestUrl, additionalHeaders) {
|
||||
return this.request('DELETE', requestUrl, null, additionalHeaders || {});
|
||||
}
|
||||
post(requestUrl, data, additionalHeaders) {
|
||||
return this.request('POST', requestUrl, data, additionalHeaders || {});
|
||||
}
|
||||
patch(requestUrl, data, additionalHeaders) {
|
||||
return this.request('PATCH', requestUrl, data, additionalHeaders || {});
|
||||
}
|
||||
put(requestUrl, data, additionalHeaders) {
|
||||
return this.request('PUT', requestUrl, data, additionalHeaders || {});
|
||||
}
|
||||
head(requestUrl, additionalHeaders) {
|
||||
return this.request('HEAD', requestUrl, null, additionalHeaders || {});
|
||||
}
|
||||
sendStream(verb, requestUrl, stream, additionalHeaders) {
|
||||
return this.request(verb, requestUrl, stream, additionalHeaders);
|
||||
}
|
||||
/**
|
||||
* Gets a typed object from an endpoint
|
||||
* Be aware that not found returns a null. Other errors (4xx, 5xx) reject the promise
|
||||
*/
|
||||
async getJson(requestUrl, additionalHeaders = {}) {
|
||||
additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);
|
||||
let res = await this.get(requestUrl, additionalHeaders);
|
||||
return this._processResponse(res, this.requestOptions);
|
||||
}
|
||||
async postJson(requestUrl, obj, additionalHeaders = {}) {
|
||||
let data = JSON.stringify(obj, null, 2);
|
||||
additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);
|
||||
additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);
|
||||
let res = await this.post(requestUrl, data, additionalHeaders);
|
||||
return this._processResponse(res, this.requestOptions);
|
||||
}
|
||||
async putJson(requestUrl, obj, additionalHeaders = {}) {
|
||||
let data = JSON.stringify(obj, null, 2);
|
||||
additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);
|
||||
additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);
|
||||
let res = await this.put(requestUrl, data, additionalHeaders);
|
||||
return this._processResponse(res, this.requestOptions);
|
||||
}
|
||||
async patchJson(requestUrl, obj, additionalHeaders = {}) {
|
||||
let data = JSON.stringify(obj, null, 2);
|
||||
additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);
|
||||
additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);
|
||||
let res = await this.patch(requestUrl, data, additionalHeaders);
|
||||
return this._processResponse(res, this.requestOptions);
|
||||
}
|
||||
/**
|
||||
* Makes a raw http request.
|
||||
* All other methods such as get, post, patch, and request ultimately call this.
|
||||
* Prefer get, del, post and patch
|
||||
*/
|
||||
async request(verb, requestUrl, data, headers) {
|
||||
if (this._disposed) {
|
||||
throw new Error('Client has already been disposed.');
|
||||
}
|
||||
let parsedUrl = new URL(requestUrl);
|
||||
let info = this._prepareRequest(verb, parsedUrl, headers);
|
||||
// Only perform retries on reads since writes may not be idempotent.
|
||||
let maxTries = this._allowRetries && RetryableHttpVerbs.indexOf(verb) != -1
|
||||
? this._maxRetries + 1
|
||||
: 1;
|
||||
let numTries = 0;
|
||||
let response;
|
||||
while (numTries < maxTries) {
|
||||
response = await this.requestRaw(info, data);
|
||||
// Check if it's an authentication challenge
|
||||
if (response &&
|
||||
response.message &&
|
||||
response.message.statusCode === HttpCodes.Unauthorized) {
|
||||
let authenticationHandler;
|
||||
for (let i = 0; i < this.handlers.length; i++) {
|
||||
if (this.handlers[i].canHandleAuthentication(response)) {
|
||||
authenticationHandler = this.handlers[i];
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (authenticationHandler) {
|
||||
return authenticationHandler.handleAuthentication(this, info, data);
|
||||
}
|
||||
else {
|
||||
// We have received an unauthorized response but have no handlers to handle it.
|
||||
// Let the response return to the caller.
|
||||
return response;
|
||||
}
|
||||
}
|
||||
let redirectsRemaining = this._maxRedirects;
|
||||
while (HttpRedirectCodes.indexOf(response.message.statusCode) != -1 &&
|
||||
this._allowRedirects &&
|
||||
redirectsRemaining > 0) {
|
||||
const redirectUrl = response.message.headers['location'];
|
||||
if (!redirectUrl) {
|
||||
// if there's no location to redirect to, we won't
|
||||
break;
|
||||
}
|
||||
let parsedRedirectUrl = new URL(redirectUrl);
|
||||
if (parsedUrl.protocol == 'https:' &&
|
||||
parsedUrl.protocol != parsedRedirectUrl.protocol &&
|
||||
!this._allowRedirectDowngrade) {
|
||||
throw new Error('Redirect from HTTPS to HTTP protocol. This downgrade is not allowed for security reasons. If you want to allow this behavior, set the allowRedirectDowngrade option to true.');
|
||||
}
|
||||
// we need to finish reading the response before reassigning response
|
||||
// which will leak the open socket.
|
||||
await response.readBody();
|
||||
// strip authorization header if redirected to a different hostname
|
||||
if (parsedRedirectUrl.hostname !== parsedUrl.hostname) {
|
||||
for (let header in headers) {
|
||||
// header names are case insensitive
|
||||
if (header.toLowerCase() === 'authorization') {
|
||||
delete headers[header];
|
||||
}
|
||||
}
|
||||
}
|
||||
// let's make the request with the new redirectUrl
|
||||
info = this._prepareRequest(verb, parsedRedirectUrl, headers);
|
||||
response = await this.requestRaw(info, data);
|
||||
redirectsRemaining--;
|
||||
}
|
||||
if (HttpResponseRetryCodes.indexOf(response.message.statusCode) == -1) {
|
||||
// If not a retry code, return immediately instead of retrying
|
||||
return response;
|
||||
}
|
||||
numTries += 1;
|
||||
if (numTries < maxTries) {
|
||||
await response.readBody();
|
||||
await this._performExponentialBackoff(numTries);
|
||||
}
|
||||
}
|
||||
return response;
|
||||
}
|
||||
/**
|
||||
* Needs to be called if keepAlive is set to true in request options.
|
||||
*/
|
||||
dispose() {
|
||||
if (this._agent) {
|
||||
this._agent.destroy();
|
||||
}
|
||||
this._disposed = true;
|
||||
}
|
||||
/**
|
||||
* Raw request.
|
||||
* @param info
|
||||
* @param data
|
||||
*/
|
||||
requestRaw(info, data) {
|
||||
return new Promise((resolve, reject) => {
|
||||
let callbackForResult = function (err, res) {
|
||||
if (err) {
|
||||
reject(err);
|
||||
}
|
||||
resolve(res);
|
||||
};
|
||||
this.requestRawWithCallback(info, data, callbackForResult);
|
||||
});
|
||||
}
|
||||
/**
|
||||
* Raw request with callback.
|
||||
* @param info
|
||||
* @param data
|
||||
* @param onResult
|
||||
*/
|
||||
requestRawWithCallback(info, data, onResult) {
|
||||
let socket;
|
||||
if (typeof data === 'string') {
|
||||
info.options.headers['Content-Length'] = Buffer.byteLength(data, 'utf8');
|
||||
}
|
||||
let callbackCalled = false;
|
||||
let handleResult = (err, res) => {
|
||||
if (!callbackCalled) {
|
||||
callbackCalled = true;
|
||||
onResult(err, res);
|
||||
}
|
||||
};
|
||||
let req = info.httpModule.request(info.options, (msg) => {
|
||||
let res = new HttpClientResponse(msg);
|
||||
handleResult(null, res);
|
||||
});
|
||||
req.on('socket', sock => {
|
||||
socket = sock;
|
||||
});
|
||||
// If we ever get disconnected, we want the socket to timeout eventually
|
||||
req.setTimeout(this._socketTimeout || 3 * 60000, () => {
|
||||
if (socket) {
|
||||
socket.end();
|
||||
}
|
||||
handleResult(new Error('Request timeout: ' + info.options.path), null);
|
||||
});
|
||||
req.on('error', function (err) {
|
||||
// err has statusCode property
|
||||
// res should have headers
|
||||
handleResult(err, null);
|
||||
});
|
||||
if (data && typeof data === 'string') {
|
||||
req.write(data, 'utf8');
|
||||
}
|
||||
if (data && typeof data !== 'string') {
|
||||
data.on('close', function () {
|
||||
req.end();
|
||||
});
|
||||
data.pipe(req);
|
||||
}
|
||||
else {
|
||||
req.end();
|
||||
}
|
||||
}
|
||||
/**
|
||||
* Gets an http agent. This function is useful when you need an http agent that handles
|
||||
* routing through a proxy server - depending upon the url and proxy environment variables.
|
||||
* @param serverUrl The server URL where the request will be sent. For example, https://api.github.com
|
||||
*/
|
||||
getAgent(serverUrl) {
|
||||
let parsedUrl = new URL(serverUrl);
|
||||
return this._getAgent(parsedUrl);
|
||||
}
|
||||
_prepareRequest(method, requestUrl, headers) {
|
||||
const info = {};
|
||||
info.parsedUrl = requestUrl;
|
||||
const usingSsl = info.parsedUrl.protocol === 'https:';
|
||||
info.httpModule = usingSsl ? https : http;
|
||||
const defaultPort = usingSsl ? 443 : 80;
|
||||
info.options = {};
|
||||
info.options.host = info.parsedUrl.hostname;
|
||||
info.options.port = info.parsedUrl.port
|
||||
? parseInt(info.parsedUrl.port)
|
||||
: defaultPort;
|
||||
info.options.path =
|
||||
(info.parsedUrl.pathname || '') + (info.parsedUrl.search || '');
|
||||
info.options.method = method;
|
||||
info.options.headers = this._mergeHeaders(headers);
|
||||
if (this.userAgent != null) {
|
||||
info.options.headers['user-agent'] = this.userAgent;
|
||||
}
|
||||
info.options.agent = this._getAgent(info.parsedUrl);
|
||||
// gives handlers an opportunity to participate
|
||||
if (this.handlers) {
|
||||
this.handlers.forEach(handler => {
|
||||
handler.prepareRequest(info.options);
|
||||
});
|
||||
}
|
||||
return info;
|
||||
}
|
||||
_mergeHeaders(headers) {
|
||||
const lowercaseKeys = obj => Object.keys(obj).reduce((c, k) => ((c[k.toLowerCase()] = obj[k]), c), {});
|
||||
if (this.requestOptions && this.requestOptions.headers) {
|
||||
return Object.assign({}, lowercaseKeys(this.requestOptions.headers), lowercaseKeys(headers));
|
||||
}
|
||||
return lowercaseKeys(headers || {});
|
||||
}
|
||||
_getExistingOrDefaultHeader(additionalHeaders, header, _default) {
|
||||
const lowercaseKeys = obj => Object.keys(obj).reduce((c, k) => ((c[k.toLowerCase()] = obj[k]), c), {});
|
||||
let clientHeader;
|
||||
if (this.requestOptions && this.requestOptions.headers) {
|
||||
clientHeader = lowercaseKeys(this.requestOptions.headers)[header];
|
||||
}
|
||||
return additionalHeaders[header] || clientHeader || _default;
|
||||
}
|
||||
_getAgent(parsedUrl) {
|
||||
let agent;
|
||||
let proxyUrl = pm.getProxyUrl(parsedUrl);
|
||||
let useProxy = proxyUrl && proxyUrl.hostname;
|
||||
if (this._keepAlive && useProxy) {
|
||||
agent = this._proxyAgent;
|
||||
}
|
||||
if (this._keepAlive && !useProxy) {
|
||||
agent = this._agent;
|
||||
}
|
||||
// if agent is already assigned use that agent.
|
||||
if (!!agent) {
|
||||
return agent;
|
||||
}
|
||||
const usingSsl = parsedUrl.protocol === 'https:';
|
||||
let maxSockets = 100;
|
||||
if (!!this.requestOptions) {
|
||||
maxSockets = this.requestOptions.maxSockets || http.globalAgent.maxSockets;
|
||||
}
|
||||
if (useProxy) {
|
||||
// If using proxy, need tunnel
|
||||
if (!tunnel) {
|
||||
tunnel = require('tunnel');
|
||||
}
|
||||
const agentOptions = {
|
||||
maxSockets: maxSockets,
|
||||
keepAlive: this._keepAlive,
|
||||
proxy: {
|
||||
...((proxyUrl.username || proxyUrl.password) && {
|
||||
proxyAuth: `${proxyUrl.username}:${proxyUrl.password}`
|
||||
}),
|
||||
host: proxyUrl.hostname,
|
||||
port: proxyUrl.port
|
||||
}
|
||||
};
|
||||
let tunnelAgent;
|
||||
const overHttps = proxyUrl.protocol === 'https:';
|
||||
if (usingSsl) {
|
||||
tunnelAgent = overHttps ? tunnel.httpsOverHttps : tunnel.httpsOverHttp;
|
||||
}
|
||||
else {
|
||||
tunnelAgent = overHttps ? tunnel.httpOverHttps : tunnel.httpOverHttp;
|
||||
}
|
||||
agent = tunnelAgent(agentOptions);
|
||||
this._proxyAgent = agent;
|
||||
}
|
||||
// if reusing agent across request and tunneling agent isn't assigned create a new agent
|
||||
if (this._keepAlive && !agent) {
|
||||
const options = { keepAlive: this._keepAlive, maxSockets: maxSockets };
|
||||
agent = usingSsl ? new https.Agent(options) : new http.Agent(options);
|
||||
this._agent = agent;
|
||||
}
|
||||
// if not using private agent and tunnel agent isn't setup then use global agent
|
||||
if (!agent) {
|
||||
agent = usingSsl ? https.globalAgent : http.globalAgent;
|
||||
}
|
||||
if (usingSsl && this._ignoreSslError) {
|
||||
// we don't want to set NODE_TLS_REJECT_UNAUTHORIZED=0 since that will affect request for entire process
|
||||
// http.RequestOptions doesn't expose a way to modify RequestOptions.agent.options
|
||||
// we have to cast it to any and change it directly
|
||||
agent.options = Object.assign(agent.options || {}, {
|
||||
rejectUnauthorized: false
|
||||
});
|
||||
}
|
||||
return agent;
|
||||
}
|
||||
_performExponentialBackoff(retryNumber) {
|
||||
retryNumber = Math.min(ExponentialBackoffCeiling, retryNumber);
|
||||
const ms = ExponentialBackoffTimeSlice * Math.pow(2, retryNumber);
|
||||
return new Promise(resolve => setTimeout(() => resolve(), ms));
|
||||
}
|
||||
static dateTimeDeserializer(key, value) {
|
||||
if (typeof value === 'string') {
|
||||
let a = new Date(value);
|
||||
if (!isNaN(a.valueOf())) {
|
||||
return a;
|
||||
}
|
||||
}
|
||||
return value;
|
||||
}
|
||||
async _processResponse(res, options) {
|
||||
return new Promise(async (resolve, reject) => {
|
||||
const statusCode = res.message.statusCode;
|
||||
const response = {
|
||||
statusCode: statusCode,
|
||||
result: null,
|
||||
headers: {}
|
||||
};
|
||||
// not found leads to null obj returned
|
||||
if (statusCode == HttpCodes.NotFound) {
|
||||
resolve(response);
|
||||
}
|
||||
let obj;
|
||||
let contents;
|
||||
// get the result from the body
|
||||
try {
|
||||
contents = await res.readBody();
|
||||
if (contents && contents.length > 0) {
|
||||
if (options && options.deserializeDates) {
|
||||
obj = JSON.parse(contents, HttpClient.dateTimeDeserializer);
|
||||
}
|
||||
else {
|
||||
obj = JSON.parse(contents);
|
||||
}
|
||||
response.result = obj;
|
||||
}
|
||||
response.headers = res.message.headers;
|
||||
}
|
||||
catch (err) {
|
||||
// Invalid resource (contents not json); leaving result obj null
|
||||
}
|
||||
// note that 3xx redirects are handled by the http layer.
|
||||
if (statusCode > 299) {
|
||||
let msg;
|
||||
// if exception/error in body, attempt to get better error
|
||||
if (obj && obj.message) {
|
||||
msg = obj.message;
|
||||
}
|
||||
else if (contents && contents.length > 0) {
|
||||
// it may be the case that the exception is in the body message as string
|
||||
msg = contents;
|
||||
}
|
||||
else {
|
||||
msg = 'Failed request: (' + statusCode + ')';
|
||||
}
|
||||
let err = new HttpClientError(msg, statusCode);
|
||||
err.result = response.result;
|
||||
reject(err);
|
||||
}
|
||||
else {
|
||||
resolve(response);
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
exports.HttpClient = HttpClient;
|
49  node_modules/@actions/http-client/interfaces.d.ts  generated  vendored  Normal file
@@ -0,0 +1,49 @@
/// <reference types="node" />
|
||||
import http = require('http');
|
||||
export interface IHeaders {
|
||||
[key: string]: any;
|
||||
}
|
||||
export interface IHttpClient {
|
||||
options(requestUrl: string, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
get(requestUrl: string, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
del(requestUrl: string, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
post(requestUrl: string, data: string, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
patch(requestUrl: string, data: string, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
put(requestUrl: string, data: string, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
sendStream(verb: string, requestUrl: string, stream: NodeJS.ReadableStream, additionalHeaders?: IHeaders): Promise<IHttpClientResponse>;
|
||||
request(verb: string, requestUrl: string, data: string | NodeJS.ReadableStream, headers: IHeaders): Promise<IHttpClientResponse>;
|
||||
requestRaw(info: IRequestInfo, data: string | NodeJS.ReadableStream): Promise<IHttpClientResponse>;
|
||||
requestRawWithCallback(info: IRequestInfo, data: string | NodeJS.ReadableStream, onResult: (err: any, res: IHttpClientResponse) => void): void;
|
||||
}
|
||||
export interface IRequestHandler {
|
||||
prepareRequest(options: http.RequestOptions): void;
|
||||
canHandleAuthentication(response: IHttpClientResponse): boolean;
|
||||
handleAuthentication(httpClient: IHttpClient, requestInfo: IRequestInfo, objs: any): Promise<IHttpClientResponse>;
|
||||
}
|
||||
export interface IHttpClientResponse {
|
||||
message: http.IncomingMessage;
|
||||
readBody(): Promise<string>;
|
||||
}
|
||||
export interface IRequestInfo {
|
||||
options: http.RequestOptions;
|
||||
parsedUrl: URL;
|
||||
httpModule: any;
|
||||
}
|
||||
export interface IRequestOptions {
|
||||
headers?: IHeaders;
|
||||
socketTimeout?: number;
|
||||
ignoreSslError?: boolean;
|
||||
allowRedirects?: boolean;
|
||||
allowRedirectDowngrade?: boolean;
|
||||
maxRedirects?: number;
|
||||
maxSockets?: number;
|
||||
keepAlive?: boolean;
|
||||
deserializeDates?: boolean;
|
||||
allowRetries?: boolean;
|
||||
maxRetries?: number;
|
||||
}
|
||||
export interface ITypedResponse<T> {
|
||||
statusCode: number;
|
||||
result: T | null;
|
||||
headers: Object;
|
||||
}
|
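These request options are passed as the third HttpClient constructor argument; a small sketch with illustrative values (per index.js, retries only apply to the idempotent verbs GET, HEAD, DELETE and OPTIONS):

```js
const httpm = require('@actions/http-client');

// The option names follow IRequestOptions above; the values here are illustrative.
const client = new httpm.HttpClient('my-user-agent', [], {
  allowRetries: true,   // retry 502/503/504 responses on read requests
  maxRetries: 2,
  socketTimeout: 30000  // milliseconds before an idle socket is timed out
});
```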
2  node_modules/@actions/http-client/interfaces.js  generated  vendored  Normal file
@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
39  node_modules/@actions/http-client/package.json  generated  vendored  Normal file
@@ -0,0 +1,39 @@
{
|
||||
"name": "@actions/http-client",
|
||||
"version": "1.0.11",
|
||||
"description": "Actions Http Client",
|
||||
"main": "index.js",
|
||||
"scripts": {
|
||||
"build": "rm -Rf ./_out && tsc && cp package*.json ./_out && cp *.md ./_out && cp LICENSE ./_out && cp actions.png ./_out",
|
||||
"test": "jest",
|
||||
"format": "prettier --write *.ts && prettier --write **/*.ts",
|
||||
"format-check": "prettier --check *.ts && prettier --check **/*.ts",
|
||||
"audit-check": "npm audit --audit-level=moderate"
|
||||
},
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git+https://github.com/actions/http-client.git"
|
||||
},
|
||||
"keywords": [
|
||||
"Actions",
|
||||
"Http"
|
||||
],
|
||||
"author": "GitHub, Inc.",
|
||||
"license": "MIT",
|
||||
"bugs": {
|
||||
"url": "https://github.com/actions/http-client/issues"
|
||||
},
|
||||
"homepage": "https://github.com/actions/http-client#readme",
|
||||
"devDependencies": {
|
||||
"@types/jest": "^25.1.4",
|
||||
"@types/node": "^12.12.31",
|
||||
"jest": "^25.1.0",
|
||||
"prettier": "^2.0.4",
|
||||
"proxy": "^1.0.1",
|
||||
"ts-jest": "^25.2.1",
|
||||
"typescript": "^3.8.3"
|
||||
},
|
||||
"dependencies": {
|
||||
"tunnel": "0.0.6"
|
||||
}
|
||||
}
|
2  node_modules/@actions/http-client/proxy.d.ts  generated  vendored  Normal file
@@ -0,0 +1,2 @@
export declare function getProxyUrl(reqUrl: URL): URL | undefined;
export declare function checkBypass(reqUrl: URL): boolean;
57  node_modules/@actions/http-client/proxy.js  generated  vendored  Normal file
@@ -0,0 +1,57 @@
"use strict";
|
||||
Object.defineProperty(exports, "__esModule", { value: true });
|
||||
function getProxyUrl(reqUrl) {
|
||||
let usingSsl = reqUrl.protocol === 'https:';
|
||||
let proxyUrl;
|
||||
if (checkBypass(reqUrl)) {
|
||||
return proxyUrl;
|
||||
}
|
||||
let proxyVar;
|
||||
if (usingSsl) {
|
||||
proxyVar = process.env['https_proxy'] || process.env['HTTPS_PROXY'];
|
||||
}
|
||||
else {
|
||||
proxyVar = process.env['http_proxy'] || process.env['HTTP_PROXY'];
|
||||
}
|
||||
if (proxyVar) {
|
||||
proxyUrl = new URL(proxyVar);
|
||||
}
|
||||
return proxyUrl;
|
||||
}
|
||||
exports.getProxyUrl = getProxyUrl;
|
||||
function checkBypass(reqUrl) {
|
||||
if (!reqUrl.hostname) {
|
||||
return false;
|
||||
}
|
||||
let noProxy = process.env['no_proxy'] || process.env['NO_PROXY'] || '';
|
||||
if (!noProxy) {
|
||||
return false;
|
||||
}
|
||||
// Determine the request port
|
||||
let reqPort;
|
||||
if (reqUrl.port) {
|
||||
reqPort = Number(reqUrl.port);
|
||||
}
|
||||
else if (reqUrl.protocol === 'http:') {
|
||||
reqPort = 80;
|
||||
}
|
||||
else if (reqUrl.protocol === 'https:') {
|
||||
reqPort = 443;
|
||||
}
|
||||
// Format the request hostname and hostname with port
|
||||
let upperReqHosts = [reqUrl.hostname.toUpperCase()];
|
||||
if (typeof reqPort === 'number') {
|
||||
upperReqHosts.push(`${upperReqHosts[0]}:${reqPort}`);
|
||||
}
|
||||
// Compare request host against noproxy
|
||||
for (let upperNoProxyItem of noProxy
|
||||
.split(',')
|
||||
.map(x => x.trim().toUpperCase())
|
||||
.filter(x => x)) {
|
||||
if (upperReqHosts.some(x => x === upperNoProxyItem)) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return false;
|
||||
}
|
||||
exports.checkBypass = checkBypass;
|
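A sketch of how these proxy helpers react to the environment; the proxy address and host names are illustrative:

```js
const pm = require('@actions/http-client/proxy');

// https requests consult https_proxy/HTTPS_PROXY, http requests consult http_proxy/HTTP_PROXY.
process.env['https_proxy'] = 'http://127.0.0.1:8080';
console.log(pm.getProxyUrl(new URL('https://api.github.com')).href);
// -> http://127.0.0.1:8080/

// Hosts listed in no_proxy/NO_PROXY bypass the proxy entirely.
process.env['no_proxy'] = 'api.github.com';
console.log(pm.checkBypass(new URL('https://api.github.com'))); // true
console.log(pm.getProxyUrl(new URL('https://api.github.com'))); // undefined
```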
5  node_modules/@choojs/findup/.travis.yml  generated  vendored
@@ -1,5 +0,0 @@
language: node_js
node_js:
- "stable"
- "8"
- "6"
144  node_modules/@choojs/findup/README.md  generated  vendored
@@ -1,144 +0,0 @@
[![build status](https://secure.travis-ci.org/choojs/findup.png)](http://travis-ci.org/choojs/findup)
|
||||
@choojs/findup
|
||||
=======
|
||||
|
||||
> This is a fork of [Filirom1/findup](https://github.com/Filirom1/findup), pending [#16](https://github.com/Filirom1/findup/pull/16).
|
||||
|
||||
### Install
|
||||
|
||||
```sh
|
||||
npm install -g @choojs/findup
|
||||
```
|
||||
|
||||
### Usage
|
||||
|
||||
Find up a file in ancestor's dir
|
||||
|
||||
|
||||
.
|
||||
├── config.json
|
||||
└── f
|
||||
└── e
|
||||
└── d
|
||||
└── c
|
||||
├── b
|
||||
│ └── a
|
||||
└── config.json
|
||||
|
||||
### Options
|
||||
|
||||
- `maxdepth`: (Number, default -1) How far to traverse before giving up. If maxdepth is `-1`, then there is no limit.
|
||||
|
||||
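A small sketch of the `maxdepth` option with the async form, following the fixture layout above (see also the examples below):

```js
var findup = require('@choojs/findup');

// Stop after walking at most two directories up from the start dir.
findup(__dirname + '/f/e/d/c/b/a', 'config.json', { maxdepth: 2 }, function(err, dir){
  // dir === __dirname + '/f/e/d/c'
});

// maxdepth: 0 never leaves the start dir, so this reports new Error('not found').
findup(__dirname + '/f/e/d/c/b/a', 'config.json', { maxdepth: 0 }, function(err, dir){
  // err.message === 'not found'
});
```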
#### Async
|
||||
|
||||
findup(dir, fileName, options, callback)
|
||||
findup(dir, iterator, options, callback) with `iterator(dir, cb)` where cb only accept `true` or `false`
|
||||
|
||||
```js
|
||||
var findup = require('@choojs/findup');
|
||||
|
||||
|
||||
findup(__dirname + '/f/e/d/c/b/a', 'config.json', function(err, dir){
|
||||
// if(e) e === new Error('not found')
|
||||
// dir === '/f/e/d/c'
|
||||
});
|
||||
```
|
||||
|
||||
or
|
||||
|
||||
```js
|
||||
findup(__dirname + '/f/e/d/c/b/a', function(dir, cb){
|
||||
require('path').exists(dir + '/config.json', cb);
|
||||
}, function(err, dir){
|
||||
// if(e) e === new Error('not found')
|
||||
// dir === '/f/e/d/c'
|
||||
});
|
||||
```
|
||||
|
||||
#### EventEmitter
|
||||
|
||||
findup(dir, fileName, options)
|
||||
|
||||
```js
|
||||
var findup = require('@choojs/findup');
|
||||
var fup = findup(__dirname + '/f/e/d/c/b/a', 'config.json');
|
||||
```
|
||||
|
||||
findup(dir, iterator, options) with `iterator(dir, cb)` where cb only accept `true` or `false`
|
||||
|
||||
```js
|
||||
var findup = require('@choojs/findup');
|
||||
var fup = findup(__dirname + '/f/e/d/c/b/a', function(dir, cb){
|
||||
require('path').exists(dir + '/config.json', cb);
|
||||
});
|
||||
```
|
||||
|
||||
findup return an EventEmitter. 3 events are emitted: `found`, `error`, `end`
|
||||
|
||||
`found` event is emitted each time a file is found.
|
||||
|
||||
You can stop the traversing by calling `stop` manually.
|
||||
|
||||
```js
|
||||
fup.on('found', function(dir){
|
||||
// dir === '/f/e/d/c'
|
||||
fup.stop();
|
||||
});
|
||||
```
|
||||
|
||||
`error` event is emitted when error happens
|
||||
|
||||
```js
|
||||
fup.on('error', function(e){
|
||||
// if(e) e === new Error('not found')
|
||||
});
|
||||
```
|
||||
|
||||
`end` event is emitted at the end of the traversing or after `stop()` is
|
||||
called.
|
||||
|
||||
```js
|
||||
fup.on('end', function(){
|
||||
// happy end
|
||||
});
|
||||
```
|
||||
|
||||
#### Sync
|
||||
|
||||
findup(dir, fileName)
|
||||
findup(dir, iteratorSync) with `iteratorSync` return `true` or `false`
|
||||
```js
|
||||
var findup = require('@choojs/findup');
|
||||
|
||||
try{
|
||||
var dir = findup.sync(__dirname + '/f/e/d/c/b/a', 'config.json'); // dir === '/f/e/d/c'
|
||||
}catch(e){
|
||||
// if(e) e === new Error('not found')
|
||||
}
|
||||
```
|
||||
|
||||
#### CLI
|
||||
```js
|
||||
npm install -g @choojs/findup
|
||||
|
||||
$ cd test/fixture/f/e/d/c/b/a/
|
||||
$ findup package.json
|
||||
/root/findup/package.json
|
||||
```
|
||||
|
||||
Usage
|
||||
|
||||
```
|
||||
$ findup -h
|
||||
|
||||
Usage: findup [FILE]
|
||||
|
||||
--name, -n The name of the file to find
|
||||
--dir, -d The directory where we will start walking up $PWD
|
||||
--help, -h show usage false
|
||||
--verbose, -v print log false
|
||||
```
|
||||
|
||||
### LICENSE MIT
|
||||
|
||||
### Read the tests :)
|
41  node_modules/@choojs/findup/bin/findup.js  generated  vendored
@@ -1,41 +0,0 @@
#!/usr/bin/env node
|
||||
|
||||
var findup = require('..'),
|
||||
path = require('path'),
|
||||
pkg = require('../package'),
|
||||
program = require('commander'),
|
||||
options = {},
|
||||
optionKeys = ['name', 'dir', 'maxdepth', 'verbose'],
|
||||
EXIT_FAILURE = -1;
|
||||
|
||||
program
|
||||
.version(pkg.version)
|
||||
.option('--name <name>', 'The name of the file to find', String)
|
||||
.option('--dir <dir>', 'The directory where we will start walking up', process.cwd(), path)
|
||||
.option('--maxdepth <levels>', 'Ascend at most <levels> levels before giving up. -1 means no limit', -1, Number)
|
||||
.option('--verbose', 'print log', false, Boolean)
|
||||
.parse(process.argv);
|
||||
|
||||
optionKeys.forEach(function(optionKey){
|
||||
if (optionKey === 'maxdepth') {
|
||||
options[optionKey] = +program[optionKey];
|
||||
} else {
|
||||
options[optionKey] = program[optionKey];
|
||||
}
|
||||
});
|
||||
|
||||
if(program.args && program.args.length >=1 && !options.name){
|
||||
options.name = program.args[0];
|
||||
}
|
||||
|
||||
if(!options.name) {
|
||||
program.outputHelp();
|
||||
process.exit(EXIT_FAILURE);
|
||||
}
|
||||
|
||||
var file = options.name;
|
||||
|
||||
findup(options.dir, file, options, function(err, dir){
|
||||
if(err) return console.error(err.message ? err.message : err);
|
||||
console.log(path.join(dir, file));
|
||||
});
|
120  node_modules/@choojs/findup/index.js  generated  vendored
@@ -1,120 +0,0 @@
var fs = require('fs'),
|
||||
Path = require('path'),
|
||||
util = require('util'),
|
||||
EE = require('events').EventEmitter;
|
||||
|
||||
function fsExists(file, cb) {
|
||||
if (!fs.access) return fs.exists(file, cb);
|
||||
fs.access(file, function(err) {
|
||||
cb(err ? false : true);
|
||||
});
|
||||
}
|
||||
|
||||
function fsExistsSync(file) {
|
||||
if (!fs.accessSync) return fs.existsSync(file);
|
||||
try {
|
||||
fs.accessSync(file);
|
||||
} catch(err) {
|
||||
return false;
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
module.exports = function(dir, iterator, options, callback){
|
||||
return FindUp(dir, iterator, options, callback);
|
||||
};
|
||||
|
||||
function FindUp(dir, iterator, options, callback){
|
||||
if (!(this instanceof FindUp)) {
|
||||
return new FindUp(dir, iterator, options, callback);
|
||||
}
|
||||
if(typeof options === 'function'){
|
||||
callback = options;
|
||||
options = {};
|
||||
}
|
||||
options = options || {};
|
||||
|
||||
EE.call(this);
|
||||
this.found = false;
|
||||
this.stopPlease = false;
|
||||
var self = this;
|
||||
|
||||
if(typeof iterator === 'string'){
|
||||
var file = iterator;
|
||||
iterator = function(dir, cb){
|
||||
return fsExists(Path.join(dir, file), cb);
|
||||
};
|
||||
}
|
||||
|
||||
if(callback) {
|
||||
this.on('found', function(dir){
|
||||
if(options.verbose) console.log(('found '+ dir ));
|
||||
callback(null, dir);
|
||||
self.stop();
|
||||
});
|
||||
|
||||
this.on('end', function(){
|
||||
if(options.verbose) console.log('end');
|
||||
if(!self.found) callback(new Error('not found'));
|
||||
});
|
||||
|
||||
this.on('error', function(err){
|
||||
if(options.verbose) console.log('error', err);
|
||||
callback(err);
|
||||
});
|
||||
}
|
||||
|
||||
this._find(dir, iterator, options, callback);
|
||||
}
|
||||
util.inherits(FindUp, EE);
|
||||
|
||||
FindUp.prototype._find = function(dir, iterator, options, callback, currentDepth){
|
||||
var self = this;
|
||||
if (typeof currentDepth !== 'number') currentDepth = 0;
|
||||
|
||||
iterator(dir, function(exists){
|
||||
if(options.verbose) console.log(('traverse '+ dir));
|
||||
if (typeof options.maxdepth === 'number' && options.maxdepth >= 0 && currentDepth > options.maxdepth) {
|
||||
return self.emit('end');
|
||||
}
|
||||
currentDepth++;
|
||||
if(exists) {
|
||||
self.found = true;
|
||||
self.emit('found', dir);
|
||||
}
|
||||
|
||||
var parentDir = Path.join(dir, '..');
|
||||
if (self.stopPlease) return self.emit('end');
|
||||
if (dir === parentDir) return self.emit('end');
|
||||
if(dir.indexOf('../../') !== -1 ) return self.emit('error', new Error(dir + ' is not correct.'));
|
||||
self._find(parentDir, iterator, options, callback, currentDepth);
|
||||
});
|
||||
};
|
||||
|
||||
FindUp.prototype.stop = function(){
|
||||
this.stopPlease = true;
|
||||
};
|
||||
|
||||
module.exports.FindUp = FindUp;
|
||||
|
||||
module.exports.sync = function(dir, iteratorSync, options){
|
||||
if(typeof iteratorSync === 'string'){
|
||||
var file = iteratorSync;
|
||||
iteratorSync = function(dir){
|
||||
return fsExistsSync(Path.join(dir, file));
|
||||
};
|
||||
}
|
||||
options = options || {};
|
||||
var initialDir = dir;
|
||||
var currentDepth = 0;
|
||||
while(dir !== Path.join(dir, '..')){
|
||||
if (typeof options.maxdepth === 'number' && options.maxdepth >= 0 && currentDepth > options.maxdepth) {
|
||||
break;
|
||||
}
|
||||
currentDepth++;
|
||||
if(dir.indexOf('../../') !== -1 ) throw new Error(initialDir + ' is not correct.');
|
||||
if(iteratorSync(dir)) return dir;
|
||||
dir = Path.join(dir, '..');
|
||||
}
|
||||
throw new Error('not found');
|
||||
};
|
61  node_modules/@choojs/findup/package.json  generated  vendored
@@ -1,61 +0,0 @@
{
|
||||
"_args": [
|
||||
[
|
||||
"@choojs/findup@0.2.1",
|
||||
"F:\\laragon\\www\\wishthis"
|
||||
]
|
||||
],
|
||||
"_from": "@choojs/findup@0.2.1",
|
||||
"_id": "@choojs/findup@0.2.1",
|
||||
"_inBundle": false,
|
||||
"_integrity": "sha512-YstAqNb0MCN8PjdLCDfRsBcGVRN41f3vgLvaI0IrIcBp4AqILRSS0DeWNGkicC+f/zRIPJLc+9RURVSepwvfBw==",
|
||||
"_location": "/@choojs/findup",
|
||||
"_phantomChildren": {},
|
||||
"_requested": {
|
||||
"type": "version",
|
||||
"registry": true,
|
||||
"raw": "@choojs/findup@0.2.1",
|
||||
"name": "@choojs/findup",
|
||||
"escapedName": "@choojs%2ffindup",
|
||||
"scope": "@choojs",
|
||||
"rawSpec": "0.2.1",
|
||||
"saveSpec": null,
|
||||
"fetchSpec": "0.2.1"
|
||||
},
|
||||
"_requiredBy": [
|
||||
"/rtlcss"
|
||||
],
|
||||
"_resolved": "https://registry.npmjs.org/@choojs/findup/-/findup-0.2.1.tgz",
|
||||
"_spec": "0.2.1",
|
||||
"_where": "F:\\laragon\\www\\wishthis",
|
||||
"author": {
|
||||
"name": "Filirom1",
|
||||
"email": "filirom1@gmail.com"
|
||||
},
|
||||
"bin": {
|
||||
"findup": "bin/findup.js"
|
||||
},
|
||||
"bugs": {
|
||||
"url": "https://github.com/choojs/findup/issues"
|
||||
},
|
||||
"dependencies": {
|
||||
"commander": "^2.15.1"
|
||||
},
|
||||
"description": "Find a file by walking up the directory tree",
|
||||
"devDependencies": {
|
||||
"chai": "^4.1.2",
|
||||
"mocha": "^5.0.5"
|
||||
},
|
||||
"homepage": "https://github.com/choojs/findup",
|
||||
"license": "MIT",
|
||||
"main": "index.js",
|
||||
"name": "@choojs/findup",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git+https://github.com/choojs/findup.git"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "mocha ./test/*.js"
|
||||
},
|
||||
"version": "0.2.1"
|
||||
}
|
164  node_modules/@choojs/findup/test/findup-test.js  generated  vendored
@@ -1,164 +0,0 @@
var assert = require('chai').assert,
|
||||
Path = require('path'),
|
||||
fs = require('fs'),
|
||||
findup = require('..');
|
||||
|
||||
describe('find-up', function(){
|
||||
var fixtureDir = Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c', 'b', 'a'),
|
||||
fsExists = fs.exists ? fs.exists : Path.exists;
|
||||
it('accept a function', function(done){
|
||||
findup(fixtureDir, function(dir, cb){
|
||||
return fsExists(Path.join(dir, 'config.json'), cb);
|
||||
}, function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('accept a string', function(done){
|
||||
findup(fixtureDir, 'config.json', function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('is usable with the Object syntax', function(done) {
|
||||
new findup.FindUp(fixtureDir, 'config.json', function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('find several files when using with the EventEmitter syntax', function(done){
|
||||
var ee = new findup.FindUp(fixtureDir, 'config.json');
|
||||
ee.once('found', function(file){
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
|
||||
ee.once('found', function(file){
|
||||
assert.equal(file, Path.join(__dirname, 'fixture'));
|
||||
|
||||
ee.once('end', function(){
|
||||
done();
|
||||
});
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
it('return files in top dir', function(done){
|
||||
findup(fixtureDir, 'top.json', function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c', 'b', 'a'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('return files in root dir', function(done){
|
||||
findup(fixtureDir, 'dev', function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, '/');
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('return an error when looking for non existiong files', function(done){
|
||||
findup(fixtureDir, 'toto.json', function(err, file){
|
||||
assert.isNotNull(err);
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('return an error when looking in a non existing directory', function(done){
|
||||
findup('dsqkjfnqsdkjghq', 'toto.json', function(err, file){
|
||||
assert.isNotNull(err);
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('trigger an error event when looking in a non existing directory', function(done){
|
||||
findup('dsqkjfnqsdkjghq', 'toto.json').on('error', function(err, files){
|
||||
assert.isNotNull(err);
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
describe('Sync API', function(){
|
||||
it('accept a string', function(){
|
||||
var file = findup.sync(fixtureDir, 'config.json');
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
});
|
||||
|
||||
it('return a file in top dir', function(){
|
||||
var file = findup.sync(fixtureDir, 'top.json');
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c', 'b', 'a'));
|
||||
});
|
||||
|
||||
it('throw error when looking for a non existing file', function(){
|
||||
assert.throw(function(){
|
||||
findup.sync(fixtureDir, 'toto.json');
|
||||
});
|
||||
});
|
||||
|
||||
it('throw error when looking for in a non existing directory', function(){
|
||||
assert.throw(function(){
|
||||
findup.sync('uhjhbjkg,nfg', 'toto.json');
|
||||
});
|
||||
});
|
||||
|
||||
it('should work with maxdepth (top, maxdepth 0)', function(){
|
||||
var file = findup.sync(fixtureDir, 'top.json', {maxdepth: 0});
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c', 'b', 'a'));
|
||||
});
|
||||
|
||||
it('should work with maxdepth (config, maxdepth 0)', function(){
|
||||
try {
|
||||
findup.sync(fixtureDir, 'config.json', {maxdepth: 0});
|
||||
} catch (err) {
|
||||
assert.equal(err.message, 'not found');
|
||||
}
|
||||
});
|
||||
|
||||
it('should work with maxdepth (config, maxdepth 2)', function(){
|
||||
var file = findup.sync(fixtureDir, 'config.json', {maxdepth: 2});
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
});
|
||||
|
||||
it('should work with maxdepth (config, maxdepth -1)', function(){
|
||||
var file = findup.sync(fixtureDir, 'config.json', {maxdepth: -1});
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
});
|
||||
});
|
||||
|
||||
it('should work with maxdepth (top, maxdepth 0)', function(done){
|
||||
findup(fixtureDir, 'top.json', {maxdepth: 0}, function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c', 'b', 'a'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('should work with maxdepth (config, maxdepth 0)', function(done){
|
||||
findup(fixtureDir, 'config.json', {maxdepth: 0}, function(err, file){
|
||||
assert.equal(err.message, 'not found');
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('should work with maxdepth (config, maxdepth 2)', function(done){
|
||||
findup(fixtureDir, 'config.json', {maxdepth: 2}, function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('should work with maxdepth (config, maxdepth -1)', function(done){
|
||||
findup(fixtureDir, 'config.json', {maxdepth: -1}, function(err, file){
|
||||
assert.ifError(err);
|
||||
assert.equal(file, Path.join(__dirname, 'fixture', 'f', 'e', 'd', 'c'));
|
||||
done();
|
||||
});
|
||||
});
|
||||
});
|
0 node_modules/@choojs/findup/test/fixture/config.json generated vendored
0 node_modules/@choojs/findup/test/fixture/f/e/d/c/b/a/top.json generated vendored
0 node_modules/@choojs/findup/test/fixture/f/e/d/c/config.json generated vendored
1 node_modules/@choojs/findup/test/mocha.opts generated vendored
@@ -1 +0,0 @@
--reporter spec
248 node_modules/ansi-escapes/index.d.ts generated vendored Normal file
@@ -0,0 +1,248 @@
/// <reference types="node"/>
|
||||
import {LiteralUnion} from 'type-fest';
|
||||
|
||||
declare namespace ansiEscapes {
|
||||
interface ImageOptions {
|
||||
/**
|
||||
The width is given as a number followed by a unit, or the word `'auto'`.
|
||||
|
||||
- `N`: N character cells.
|
||||
- `Npx`: N pixels.
|
||||
- `N%`: N percent of the session's width or height.
|
||||
- `auto`: The image's inherent size will be used to determine an appropriate dimension.
|
||||
*/
|
||||
readonly width?: LiteralUnion<'auto', number | string>;
|
||||
|
||||
/**
|
||||
The height is given as a number followed by a unit, or the word `'auto'`.
|
||||
|
||||
- `N`: N character cells.
|
||||
- `Npx`: N pixels.
|
||||
- `N%`: N percent of the session's width or height.
|
||||
- `auto`: The image's inherent size will be used to determine an appropriate dimension.
|
||||
*/
|
||||
readonly height?: LiteralUnion<'auto', number | string>;
|
||||
|
||||
readonly preserveAspectRatio?: boolean;
|
||||
}
|
||||
|
||||
interface AnnotationOptions {
|
||||
/**
|
||||
Nonzero number of columns to annotate.
|
||||
|
||||
Default: The remainder of the line.
|
||||
*/
|
||||
readonly length?: number;
|
||||
|
||||
/**
|
||||
Starting X coordinate.
|
||||
|
||||
Must be used with `y` and `length`.
|
||||
|
||||
Default: The cursor position
|
||||
*/
|
||||
readonly x?: number;
|
||||
|
||||
/**
|
||||
Starting Y coordinate.
|
||||
|
||||
Must be used with `x` and `length`.
|
||||
|
||||
Default: Cursor position.
|
||||
*/
|
||||
readonly y?: number;
|
||||
|
||||
/**
|
||||
Create a "hidden" annotation.
|
||||
|
||||
Annotations created this way can be shown using the "Show Annotations" iTerm command.
|
||||
*/
|
||||
readonly isHidden?: boolean;
|
||||
}
|
||||
}
|
||||
|
||||
declare const ansiEscapes: {
|
||||
/**
|
||||
Set the absolute position of the cursor. `x0` `y0` is the top left of the screen.
|
||||
*/
|
||||
cursorTo(x: number, y?: number): string;
|
||||
|
||||
/**
|
||||
Set the position of the cursor relative to its current position.
|
||||
*/
|
||||
cursorMove(x: number, y?: number): string;
|
||||
|
||||
/**
|
||||
Move cursor up a specific amount of rows.
|
||||
|
||||
@param count - Count of rows to move up. Default is `1`.
|
||||
*/
|
||||
cursorUp(count?: number): string;
|
||||
|
||||
/**
|
||||
Move cursor down a specific amount of rows.
|
||||
|
||||
@param count - Count of rows to move down. Default is `1`.
|
||||
*/
|
||||
cursorDown(count?: number): string;
|
||||
|
||||
/**
|
||||
Move cursor forward a specific amount of rows.
|
||||
|
||||
@param count - Count of rows to move forward. Default is `1`.
|
||||
*/
|
||||
cursorForward(count?: number): string;
|
||||
|
||||
/**
|
||||
Move cursor backward a specific amount of rows.
|
||||
|
||||
@param count - Count of rows to move backward. Default is `1`.
|
||||
*/
|
||||
cursorBackward(count?: number): string;
|
||||
|
||||
/**
|
||||
Move cursor to the left side.
|
||||
*/
|
||||
cursorLeft: string;
|
||||
|
||||
/**
|
||||
Save cursor position.
|
||||
*/
|
||||
cursorSavePosition: string;
|
||||
|
||||
/**
|
||||
Restore saved cursor position.
|
||||
*/
|
||||
cursorRestorePosition: string;
|
||||
|
||||
/**
|
||||
Get cursor position.
|
||||
*/
|
||||
cursorGetPosition: string;
|
||||
|
||||
/**
|
||||
Move cursor to the next line.
|
||||
*/
|
||||
cursorNextLine: string;
|
||||
|
||||
/**
|
||||
Move cursor to the previous line.
|
||||
*/
|
||||
cursorPrevLine: string;
|
||||
|
||||
/**
|
||||
Hide cursor.
|
||||
*/
|
||||
cursorHide: string;
|
||||
|
||||
/**
|
||||
Show cursor.
|
||||
*/
|
||||
cursorShow: string;
|
||||
|
||||
/**
|
||||
Erase from the current cursor position up the specified amount of rows.
|
||||
|
||||
@param count - Count of rows to erase.
|
||||
*/
|
||||
eraseLines(count: number): string;
|
||||
|
||||
/**
|
||||
Erase from the current cursor position to the end of the current line.
|
||||
*/
|
||||
eraseEndLine: string;
|
||||
|
||||
/**
|
||||
Erase from the current cursor position to the start of the current line.
|
||||
*/
|
||||
eraseStartLine: string;
|
||||
|
||||
/**
|
||||
Erase the entire current line.
|
||||
*/
|
||||
eraseLine: string;
|
||||
|
||||
/**
|
||||
Erase the screen from the current line down to the bottom of the screen.
|
||||
*/
|
||||
eraseDown: string;
|
||||
|
||||
/**
|
||||
Erase the screen from the current line up to the top of the screen.
|
||||
*/
|
||||
eraseUp: string;
|
||||
|
||||
/**
|
||||
Erase the screen and move the cursor the top left position.
|
||||
*/
|
||||
eraseScreen: string;
|
||||
|
||||
/**
|
||||
Scroll display up one line.
|
||||
*/
|
||||
scrollUp: string;
|
||||
|
||||
/**
|
||||
Scroll display down one line.
|
||||
*/
|
||||
scrollDown: string;
|
||||
|
||||
/**
|
||||
Clear the terminal screen. (Viewport)
|
||||
*/
|
||||
clearScreen: string;
|
||||
|
||||
/**
|
||||
Clear the whole terminal, including scrollback buffer. (Not just the visible part of it)
|
||||
*/
|
||||
clearTerminal: string;
|
||||
|
||||
/**
|
||||
Output a beeping sound.
|
||||
*/
|
||||
beep: string;
|
||||
|
||||
/**
|
||||
Create a clickable link.
|
||||
|
||||
[Supported terminals.](https://gist.github.com/egmontkob/eb114294efbcd5adb1944c9f3cb5feda) Use [`supports-hyperlinks`](https://github.com/jamestalmage/supports-hyperlinks) to detect link support.
|
||||
*/
|
||||
link(text: string, url: string): string;
|
||||
|
||||
/**
|
||||
Display an image.
|
||||
|
||||
_Currently only supported on iTerm2 >=3_
|
||||
|
||||
See [term-img](https://github.com/sindresorhus/term-img) for a higher-level module.
|
||||
|
||||
@param buffer - Buffer of an image. Usually read in with `fs.readFile()`.
|
||||
*/
|
||||
image(buffer: Buffer, options?: ansiEscapes.ImageOptions): string;
|
||||
|
||||
iTerm: {
|
||||
/**
|
||||
[Inform iTerm2](https://www.iterm2.com/documentation-escape-codes.html) of the current directory to help semantic history and enable [Cmd-clicking relative paths](https://coderwall.com/p/b7e82q/quickly-open-files-in-iterm-with-cmd-click).
|
||||
|
||||
@param cwd - Current directory. Default: `process.cwd()`.
|
||||
*/
|
||||
setCwd(cwd?: string): string;
|
||||
|
||||
/**
|
||||
An annotation looks like this when shown:
|
||||
|
||||
![screenshot of iTerm annotation](https://user-images.githubusercontent.com/924465/64382136-b60ac700-cfe9-11e9-8a35-9682e8dc4b72.png)
|
||||
|
||||
See the [iTerm Proprietary Escape Codes documentation](https://iterm2.com/documentation-escape-codes.html) for more information.
|
||||
|
||||
@param message - The message to display within the annotation. The `|` character is disallowed and will be stripped.
|
||||
@returns An escape code which will create an annotation when printed in iTerm2.
|
||||
*/
|
||||
annotation(message: string, options?: ansiEscapes.AnnotationOptions): string;
|
||||
};
|
||||
|
||||
// TODO: remove this in the next major version
|
||||
default: typeof ansiEscapes;
|
||||
};
|
||||
|
||||
export = ansiEscapes;
|
114 node_modules/ansi-escapes/index.js generated vendored
@@ -1,12 +1,15 @@
'use strict';
|
||||
const x = module.exports;
|
||||
const ansiEscapes = module.exports;
|
||||
// TODO: remove this in the next major version
|
||||
module.exports.default = ansiEscapes;
|
||||
|
||||
const ESC = '\u001B[';
|
||||
const OSC = '\u001B]';
|
||||
const BEL = '\u0007';
|
||||
const SEP = ';';
|
||||
const isTerminalApp = process.env.TERM_PROGRAM === 'Apple_Terminal';
|
||||
|
||||
x.cursorTo = (x, y) => {
|
||||
ansiEscapes.cursorTo = (x, y) => {
|
||||
if (typeof x !== 'number') {
|
||||
throw new TypeError('The `x` argument is required');
|
||||
}
|
||||
|
@ -18,7 +21,7 @@ x.cursorTo = (x, y) => {
|
|||
return ESC + (y + 1) + ';' + (x + 1) + 'H';
|
||||
};
|
||||
|
||||
x.cursorMove = (x, y) => {
|
||||
ansiEscapes.cursorMove = (x, y) => {
|
||||
if (typeof x !== 'number') {
|
||||
throw new TypeError('The `x` argument is required');
|
||||
}
|
||||
|
@ -40,56 +43,56 @@ x.cursorMove = (x, y) => {
|
|||
return ret;
|
||||
};
|
||||
|
||||
x.cursorUp = count => ESC + (typeof count === 'number' ? count : 1) + 'A';
|
||||
x.cursorDown = count => ESC + (typeof count === 'number' ? count : 1) + 'B';
|
||||
x.cursorForward = count => ESC + (typeof count === 'number' ? count : 1) + 'C';
|
||||
x.cursorBackward = count => ESC + (typeof count === 'number' ? count : 1) + 'D';
|
||||
ansiEscapes.cursorUp = (count = 1) => ESC + count + 'A';
|
||||
ansiEscapes.cursorDown = (count = 1) => ESC + count + 'B';
|
||||
ansiEscapes.cursorForward = (count = 1) => ESC + count + 'C';
|
||||
ansiEscapes.cursorBackward = (count = 1) => ESC + count + 'D';
|
||||
|
||||
x.cursorLeft = ESC + 'G';
|
||||
x.cursorSavePosition = ESC + (isTerminalApp ? '7' : 's');
|
||||
x.cursorRestorePosition = ESC + (isTerminalApp ? '8' : 'u');
|
||||
x.cursorGetPosition = ESC + '6n';
|
||||
x.cursorNextLine = ESC + 'E';
|
||||
x.cursorPrevLine = ESC + 'F';
|
||||
x.cursorHide = ESC + '?25l';
|
||||
x.cursorShow = ESC + '?25h';
|
||||
ansiEscapes.cursorLeft = ESC + 'G';
|
||||
ansiEscapes.cursorSavePosition = isTerminalApp ? '\u001B7' : ESC + 's';
|
||||
ansiEscapes.cursorRestorePosition = isTerminalApp ? '\u001B8' : ESC + 'u';
|
||||
ansiEscapes.cursorGetPosition = ESC + '6n';
|
||||
ansiEscapes.cursorNextLine = ESC + 'E';
|
||||
ansiEscapes.cursorPrevLine = ESC + 'F';
|
||||
ansiEscapes.cursorHide = ESC + '?25l';
|
||||
ansiEscapes.cursorShow = ESC + '?25h';
|
||||
|
||||
x.eraseLines = count => {
|
||||
ansiEscapes.eraseLines = count => {
|
||||
let clear = '';
|
||||
|
||||
for (let i = 0; i < count; i++) {
|
||||
clear += x.eraseLine + (i < count - 1 ? x.cursorUp() : '');
|
||||
clear += ansiEscapes.eraseLine + (i < count - 1 ? ansiEscapes.cursorUp() : '');
|
||||
}
|
||||
|
||||
if (count) {
|
||||
clear += x.cursorLeft;
|
||||
clear += ansiEscapes.cursorLeft;
|
||||
}
|
||||
|
||||
return clear;
|
||||
};
|
||||
|
||||
x.eraseEndLine = ESC + 'K';
|
||||
x.eraseStartLine = ESC + '1K';
|
||||
x.eraseLine = ESC + '2K';
|
||||
x.eraseDown = ESC + 'J';
|
||||
x.eraseUp = ESC + '1J';
|
||||
x.eraseScreen = ESC + '2J';
|
||||
x.scrollUp = ESC + 'S';
|
||||
x.scrollDown = ESC + 'T';
|
||||
ansiEscapes.eraseEndLine = ESC + 'K';
|
||||
ansiEscapes.eraseStartLine = ESC + '1K';
|
||||
ansiEscapes.eraseLine = ESC + '2K';
|
||||
ansiEscapes.eraseDown = ESC + 'J';
|
||||
ansiEscapes.eraseUp = ESC + '1J';
|
||||
ansiEscapes.eraseScreen = ESC + '2J';
|
||||
ansiEscapes.scrollUp = ESC + 'S';
|
||||
ansiEscapes.scrollDown = ESC + 'T';
|
||||
|
||||
x.clearScreen = '\u001Bc';
|
||||
ansiEscapes.clearScreen = '\u001Bc';
|
||||
|
||||
x.clearTerminal = process.platform === 'win32' ?
|
||||
`${x.eraseScreen}${ESC}0f` :
|
||||
ansiEscapes.clearTerminal = process.platform === 'win32' ?
|
||||
`${ansiEscapes.eraseScreen}${ESC}0f` :
|
||||
// 1. Erases the screen (Only done in case `2` is not supported)
|
||||
// 2. Erases the whole screen including scrollback buffer
|
||||
// 3. Moves cursor to the top-left position
|
||||
// More info: https://www.real-world-systems.com/docs/ANSIcode.html
|
||||
`${x.eraseScreen}${ESC}3J${ESC}H`;
|
||||
`${ansiEscapes.eraseScreen}${ESC}3J${ESC}H`;
|
||||
|
||||
x.beep = BEL;
|
||||
ansiEscapes.beep = BEL;
|
||||
|
||||
x.link = (text, url) => {
|
||||
ansiEscapes.link = (text, url) => {
|
||||
return [
|
||||
OSC,
|
||||
'8',
|
||||
|
@ -106,26 +109,49 @@ x.link = (text, url) => {
|
|||
].join('');
|
||||
};
|
||||
|
||||
x.image = (buf, opts) => {
|
||||
opts = opts || {};
|
||||
ansiEscapes.image = (buffer, options = {}) => {
|
||||
let ret = `${OSC}1337;File=inline=1`;
|
||||
|
||||
let ret = OSC + '1337;File=inline=1';
|
||||
|
||||
if (opts.width) {
|
||||
ret += `;width=${opts.width}`;
|
||||
if (options.width) {
|
||||
ret += `;width=${options.width}`;
|
||||
}
|
||||
|
||||
if (opts.height) {
|
||||
ret += `;height=${opts.height}`;
|
||||
if (options.height) {
|
||||
ret += `;height=${options.height}`;
|
||||
}
|
||||
|
||||
if (opts.preserveAspectRatio === false) {
|
||||
if (options.preserveAspectRatio === false) {
|
||||
ret += ';preserveAspectRatio=0';
|
||||
}
|
||||
|
||||
return ret + ':' + buf.toString('base64') + BEL;
|
||||
return ret + ':' + buffer.toString('base64') + BEL;
|
||||
};
|
||||
|
||||
x.iTerm = {};
|
||||
ansiEscapes.iTerm = {
|
||||
setCwd: (cwd = process.cwd()) => `${OSC}50;CurrentDir=${cwd}${BEL}`,
|
||||
|
||||
x.iTerm.setCwd = cwd => OSC + '50;CurrentDir=' + (cwd || process.cwd()) + BEL;
|
||||
annotation: (message, options = {}) => {
|
||||
let ret = `${OSC}1337;`;
|
||||
|
||||
const hasX = typeof options.x !== 'undefined';
|
||||
const hasY = typeof options.y !== 'undefined';
|
||||
if ((hasX || hasY) && !(hasX && hasY && typeof options.length !== 'undefined')) {
|
||||
throw new Error('`x`, `y` and `length` must be defined when `x` or `y` is defined');
|
||||
}
|
||||
|
||||
message = message.replace(/\|/g, '');
|
||||
|
||||
ret += options.isHidden ? 'AddHiddenAnnotation=' : 'AddAnnotation=';
|
||||
|
||||
if (options.length > 0) {
|
||||
ret +=
|
||||
(hasX ?
|
||||
[message, options.length, options.x, options.y] :
|
||||
[options.length, message]).join('|');
|
||||
} else {
|
||||
ret += message;
|
||||
}
|
||||
|
||||
return ret + BEL;
|
||||
}
|
||||
};
|
||||
|
|
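The hunks above rename the module-internal alias from `x` to `ansiEscapes`, turn the optional `count` of the cursor helpers into a default parameter, and fold `setCwd` and `annotation` into a single `iTerm` object. A short consumption sketch (the cursor example mirrors the one in the package readme below; the link target is only an illustration):

```js
// Sketch: writing ansi-escapes sequences to stdout after the 4.x refactor above.
const ansiEscapes = require('ansi-escapes');

// Move the cursor up two rows and back to column 0, then erase that line.
process.stdout.write(ansiEscapes.cursorUp(2) + ansiEscapes.cursorLeft);
process.stdout.write(ansiEscapes.eraseLine);

// Emit a clickable hyperlink on terminals that support it.
process.stdout.write(ansiEscapes.link('ansi-escapes', 'https://github.com/sindresorhus/ansi-escapes') + '\n');
```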
2 node_modules/ansi-escapes/license generated vendored
@@ -1,6 +1,6 @@
MIT License

-Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
+Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (https://sindresorhus.com)

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

138 node_modules/ansi-escapes/package.json generated vendored
@@ -1,85 +1,57 @@
{
|
||||
"_args": [
|
||||
[
|
||||
"ansi-escapes@3.2.0",
|
||||
"F:\\laragon\\www\\wishthis"
|
||||
]
|
||||
],
|
||||
"_from": "ansi-escapes@3.2.0",
|
||||
"_id": "ansi-escapes@3.2.0",
|
||||
"_inBundle": false,
|
||||
"_integrity": "sha512-cBhpre4ma+U0T1oM5fXg7Dy1Jw7zzwv7lt/GoCpr+hDQJoYnKVPLL4dCvSEFMmQurOQvSrwT7SL/DAlhBI97RQ==",
|
||||
"_location": "/ansi-escapes",
|
||||
"_phantomChildren": {},
|
||||
"_requested": {
|
||||
"type": "version",
|
||||
"registry": true,
|
||||
"raw": "ansi-escapes@3.2.0",
|
||||
"name": "ansi-escapes",
|
||||
"escapedName": "ansi-escapes",
|
||||
"rawSpec": "3.2.0",
|
||||
"saveSpec": null,
|
||||
"fetchSpec": "3.2.0"
|
||||
},
|
||||
"_requiredBy": [
|
||||
"/inquirer"
|
||||
],
|
||||
"_resolved": "https://registry.npmjs.org/ansi-escapes/-/ansi-escapes-3.2.0.tgz",
|
||||
"_spec": "3.2.0",
|
||||
"_where": "F:\\laragon\\www\\wishthis",
|
||||
"author": {
|
||||
"name": "Sindre Sorhus",
|
||||
"email": "sindresorhus@gmail.com",
|
||||
"url": "sindresorhus.com"
|
||||
},
|
||||
"bugs": {
|
||||
"url": "https://github.com/sindresorhus/ansi-escapes/issues"
|
||||
},
|
||||
"description": "ANSI escape codes for manipulating the terminal",
|
||||
"devDependencies": {
|
||||
"ava": "*",
|
||||
"xo": "*"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=4"
|
||||
},
|
||||
"files": [
|
||||
"index.js"
|
||||
],
|
||||
"homepage": "https://github.com/sindresorhus/ansi-escapes#readme",
|
||||
"keywords": [
|
||||
"ansi",
|
||||
"terminal",
|
||||
"console",
|
||||
"cli",
|
||||
"string",
|
||||
"tty",
|
||||
"escape",
|
||||
"escapes",
|
||||
"formatting",
|
||||
"shell",
|
||||
"xterm",
|
||||
"log",
|
||||
"logging",
|
||||
"command-line",
|
||||
"text",
|
||||
"vt100",
|
||||
"sequence",
|
||||
"control",
|
||||
"code",
|
||||
"codes",
|
||||
"cursor",
|
||||
"iterm",
|
||||
"iterm2"
|
||||
],
|
||||
"license": "MIT",
|
||||
"name": "ansi-escapes",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git+https://github.com/sindresorhus/ansi-escapes.git"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "xo && ava"
|
||||
},
|
||||
"version": "3.2.0"
|
||||
"name": "ansi-escapes",
|
||||
"version": "4.3.2",
|
||||
"description": "ANSI escape codes for manipulating the terminal",
|
||||
"license": "MIT",
|
||||
"repository": "sindresorhus/ansi-escapes",
|
||||
"funding": "https://github.com/sponsors/sindresorhus",
|
||||
"author": {
|
||||
"name": "Sindre Sorhus",
|
||||
"email": "sindresorhus@gmail.com",
|
||||
"url": "https://sindresorhus.com"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "xo && ava && tsd"
|
||||
},
|
||||
"files": [
|
||||
"index.js",
|
||||
"index.d.ts"
|
||||
],
|
||||
"keywords": [
|
||||
"ansi",
|
||||
"terminal",
|
||||
"console",
|
||||
"cli",
|
||||
"string",
|
||||
"tty",
|
||||
"escape",
|
||||
"escapes",
|
||||
"formatting",
|
||||
"shell",
|
||||
"xterm",
|
||||
"log",
|
||||
"logging",
|
||||
"command-line",
|
||||
"text",
|
||||
"vt100",
|
||||
"sequence",
|
||||
"control",
|
||||
"code",
|
||||
"codes",
|
||||
"cursor",
|
||||
"iterm",
|
||||
"iterm2"
|
||||
],
|
||||
"dependencies": {
|
||||
"type-fest": "^0.21.3"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@types/node": "^13.7.7",
|
||||
"ava": "^2.1.0",
|
||||
"tsd": "^0.14.0",
|
||||
"xo": "^0.25.3"
|
||||
}
|
||||
}
|
||||
|
|
93 node_modules/ansi-escapes/readme.md generated vendored
@@ -1,15 +1,13 @@
# ansi-escapes [![Build Status](https://travis-ci.org/sindresorhus/ansi-escapes.svg?branch=master)](https://travis-ci.org/sindresorhus/ansi-escapes)
|
||||
# ansi-escapes
|
||||
|
||||
> [ANSI escape codes](http://www.termsys.demon.co.uk/vtansi.htm) for manipulating the terminal
|
||||
|
||||
|
||||
## Install
|
||||
|
||||
```
|
||||
$ npm install ansi-escapes
|
||||
```
|
||||
|
||||
|
||||
## Usage
|
||||
|
||||
```js
|
||||
|
@ -20,14 +18,13 @@ process.stdout.write(ansiEscapes.cursorUp(2) + ansiEscapes.cursorLeft);
|
|||
//=> '\u001B[2A\u001B[1000D'
|
||||
```
|
||||
|
||||
|
||||
## API
|
||||
|
||||
### cursorTo(x, [y])
|
||||
### cursorTo(x, y?)
|
||||
|
||||
Set the absolute position of the cursor. `x0` `y0` is the top left of the screen.
|
||||
|
||||
### cursorMove(x, [y])
|
||||
### cursorMove(x, y?)
|
||||
|
||||
Set the position of the cursor relative to its current position.
|
||||
|
||||
|
@ -41,11 +38,11 @@ Move cursor down a specific amount of rows. Default is `1`.
|
|||
|
||||
### cursorForward(count)
|
||||
|
||||
Move cursor forward a specific amount of rows. Default is `1`.
|
||||
Move cursor forward a specific amount of columns. Default is `1`.
|
||||
|
||||
### cursorBackward(count)
|
||||
|
||||
Move cursor backward a specific amount of rows. Default is `1`.
|
||||
Move cursor backward a specific amount of columns. Default is `1`.
|
||||
|
||||
### cursorLeft
|
||||
|
||||
|
@ -133,7 +130,7 @@ Create a clickable link.
|
|||
|
||||
[Supported terminals.](https://gist.github.com/egmontkob/eb114294efbcd5adb1944c9f3cb5feda) Use [`supports-hyperlinks`](https://github.com/jamestalmage/supports-hyperlinks) to detect link support.
|
||||
|
||||
### image(input, [options])
|
||||
### image(filePath, options?)
|
||||
|
||||
Display an image.
|
||||
|
||||
|
@ -149,10 +146,12 @@ Buffer of an image. Usually read in with `fs.readFile()`.
|
|||
|
||||
#### options
|
||||
|
||||
Type: `object`
|
||||
|
||||
##### width
|
||||
##### height
|
||||
|
||||
Type: `string` `number`
|
||||
Type: `string | number`
|
||||
|
||||
The width and height are given as a number followed by a unit, or the word "auto".
|
||||
|
||||
|
@ -163,22 +162,84 @@ The width and height are given as a number followed by a unit, or the word "auto
|
|||
|
||||
##### preserveAspectRatio
|
||||
|
||||
Type: `boolean`<br>
|
||||
Type: `boolean`\
|
||||
Default: `true`
|
||||
|
||||
### iTerm.setCwd([path])
|
||||
### iTerm.setCwd(path?)
|
||||
|
||||
Type: `string`<br>
|
||||
Type: `string`\
|
||||
Default: `process.cwd()`
|
||||
|
||||
[Inform iTerm2](https://www.iterm2.com/documentation-escape-codes.html) of the current directory to help semantic history and enable [Cmd-clicking relative paths](https://coderwall.com/p/b7e82q/quickly-open-files-in-iterm-with-cmd-click).
|
||||
|
||||
### iTerm.annotation(message, options?)
|
||||
|
||||
Creates an escape code to display an "annotation" in iTerm2.
|
||||
|
||||
An annotation looks like this when shown:
|
||||
|
||||
<img src="https://user-images.githubusercontent.com/924465/64382136-b60ac700-cfe9-11e9-8a35-9682e8dc4b72.png" width="500">
|
||||
|
||||
See the [iTerm Proprietary Escape Codes documentation](https://iterm2.com/documentation-escape-codes.html) for more information.
|
||||
|
||||
#### message
|
||||
|
||||
Type: `string`
|
||||
|
||||
The message to display within the annotation.
|
||||
|
||||
The `|` character is disallowed and will be stripped.
|
||||
|
||||
#### options
|
||||
|
||||
Type: `object`
|
||||
|
||||
##### length
|
||||
|
||||
Type: `number`\
|
||||
Default: The remainder of the line
|
||||
|
||||
Nonzero number of columns to annotate.
|
||||
|
||||
##### x
|
||||
|
||||
Type: `number`\
|
||||
Default: Cursor position
|
||||
|
||||
Starting X coordinate.
|
||||
|
||||
Must be used with `y` and `length`.
|
||||
|
||||
##### y
|
||||
|
||||
Type: `number`\
|
||||
Default: Cursor position
|
||||
|
||||
Starting Y coordinate.
|
||||
|
||||
Must be used with `x` and `length`.
|
||||
|
||||
##### isHidden
|
||||
|
||||
Type: `boolean`\
|
||||
Default: `false`
|
||||
|
||||
Create a "hidden" annotation.
|
||||
|
||||
Annotations created this way can be shown using the "Show Annotations" iTerm command.
|
||||
|
||||
## Related
|
||||
|
||||
- [ansi-styles](https://github.com/chalk/ansi-styles) - ANSI escape codes for styling strings in the terminal
|
||||
|
||||
---
|
||||
|
||||
## License
|
||||
|
||||
MIT © [Sindre Sorhus](https://sindresorhus.com)
|
||||
<div align="center">
|
||||
<b>
|
||||
<a href="https://tidelift.com/subscription/pkg/npm-ansi-escapes?utm_source=npm-ansi-escapes&utm_medium=referral&utm_campaign=readme">Get professional support for this package with a Tidelift subscription</a>
|
||||
</b>
|
||||
<br>
|
||||
<sub>
|
||||
Tidelift helps make open source sustainable for maintainers while giving companies<br>assurances about security, maintenance, and licensing for their dependencies.
|
||||
</sub>
|
||||
</div>
|
||||
|
|
64 node_modules/atob/package.json generated vendored
@@ -1,56 +1,24 @@
{
|
||||
"_args": [
|
||||
[
|
||||
"atob@2.1.2",
|
||||
"F:\\laragon\\www\\wishthis"
|
||||
]
|
||||
],
|
||||
"_from": "atob@2.1.2",
|
||||
"_id": "atob@2.1.2",
|
||||
"_inBundle": false,
|
||||
"_integrity": "sha512-Wm6ukoaOGJi/73p/cl2GvLjTI5JM1k/O14isD73YML8StrH/7/lRFgmg8nICZgD3bZZvjwCGxtMOD3wWNAu8cg==",
|
||||
"_location": "/atob",
|
||||
"_phantomChildren": {},
|
||||
"_requested": {
|
||||
"type": "version",
|
||||
"registry": true,
|
||||
"raw": "atob@2.1.2",
|
||||
"name": "atob",
|
||||
"escapedName": "atob",
|
||||
"rawSpec": "2.1.2",
|
||||
"saveSpec": null,
|
||||
"fetchSpec": "2.1.2"
|
||||
},
|
||||
"_requiredBy": [
|
||||
"/source-map-resolve"
|
||||
],
|
||||
"_resolved": "https://registry.npmjs.org/atob/-/atob-2.1.2.tgz",
|
||||
"_spec": "2.1.2",
|
||||
"_where": "F:\\laragon\\www\\wishthis",
|
||||
"author": {
|
||||
"name": "AJ ONeal",
|
||||
"email": "coolaj86@gmail.com",
|
||||
"url": "https://coolaj86.com"
|
||||
},
|
||||
"bin": {
|
||||
"atob": "bin/atob.js"
|
||||
},
|
||||
"browser": "browser-atob.js",
|
||||
"description": "atob for Node.JS and Linux / Mac / Windows CLI (it's a one-liner)",
|
||||
"engines": {
|
||||
"node": ">= 4.5.0"
|
||||
},
|
||||
"homepage": "https://git.coolaj86.com/coolaj86/atob.js.git",
|
||||
"keywords": [
|
||||
"atob",
|
||||
"browser"
|
||||
],
|
||||
"license": "(MIT OR Apache-2.0)",
|
||||
"main": "node-atob.js",
|
||||
"name": "atob",
|
||||
"homepage": "https://git.coolaj86.com/coolaj86/atob.js.git",
|
||||
"description": "atob for Node.JS and Linux / Mac / Windows CLI (it's a one-liner)",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git://git.coolaj86.com/coolaj86/atob.js.git"
|
||||
},
|
||||
"keywords": [
|
||||
"atob",
|
||||
"browser"
|
||||
],
|
||||
"author": "AJ ONeal <coolaj86@gmail.com> (https://coolaj86.com)",
|
||||
"engines": {
|
||||
"node": ">= 4.5.0"
|
||||
},
|
||||
"main": "node-atob.js",
|
||||
"browser": "browser-atob.js",
|
||||
"bin": {
|
||||
"atob": "bin/atob.js"
|
||||
},
|
||||
"license": "(MIT OR Apache-2.0)",
|
||||
"version": "2.1.2"
|
||||
}
|
||||
|
|
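The rewritten `atob` manifest above still describes the package as a base64 decoder for Node.js and a CLI "one-liner". A tiny, hedged sketch of the programmatic use (the encoded input is just an example):

```js
// Sketch: decoding a base64 string with the atob package in Node.js.
const atob = require('atob');

console.log(atob('aGVsbG8gd29ybGQ=')); // => 'hello world'
```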
74 node_modules/autoprefixer/package.json generated vendored
@@ -1,42 +1,22 @@
{
|
||||
"_args": [
|
||||
[
|
||||
"autoprefixer@9.8.8",
|
||||
"F:\\laragon\\www\\wishthis"
|
||||
]
|
||||
"name": "autoprefixer",
|
||||
"version": "9.8.8",
|
||||
"description": "Parse CSS and add vendor prefixes to CSS rules using values from the Can I Use website",
|
||||
"keywords": [
|
||||
"autoprefixer",
|
||||
"css",
|
||||
"prefix",
|
||||
"postcss",
|
||||
"postcss-plugin"
|
||||
],
|
||||
"_from": "autoprefixer@9.8.8",
|
||||
"_id": "autoprefixer@9.8.8",
|
||||
"_inBundle": false,
|
||||
"_integrity": "sha512-eM9d/swFopRt5gdJ7jrpCwgvEMIayITpojhkkSMRsFHYuH5bkSQ4p/9qTEHtmNudUZh22Tehu7I6CxAW0IXTKA==",
|
||||
"_location": "/autoprefixer",
|
||||
"_phantomChildren": {},
|
||||
"_requested": {
|
||||
"type": "version",
|
||||
"registry": true,
|
||||
"raw": "autoprefixer@9.8.8",
|
||||
"name": "autoprefixer",
|
||||
"escapedName": "autoprefixer",
|
||||
"rawSpec": "9.8.8",
|
||||
"saveSpec": null,
|
||||
"fetchSpec": "9.8.8"
|
||||
},
|
||||
"_requiredBy": [
|
||||
"/gulp-autoprefixer"
|
||||
],
|
||||
"_resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-9.8.8.tgz",
|
||||
"_spec": "9.8.8",
|
||||
"_where": "F:\\laragon\\www\\wishthis",
|
||||
"author": {
|
||||
"name": "Andrey Sitnik",
|
||||
"email": "andrey@sitnik.ru"
|
||||
},
|
||||
"bin": {
|
||||
"autoprefixer": "bin/autoprefixer"
|
||||
},
|
||||
"bugs": {
|
||||
"url": "https://github.com/postcss/autoprefixer/issues"
|
||||
"bin": "./bin/autoprefixer",
|
||||
"funding": {
|
||||
"type": "tidelift",
|
||||
"url": "https://tidelift.com/funding/github/npm/autoprefixer"
|
||||
},
|
||||
"author": "Andrey Sitnik <andrey@sitnik.ru>",
|
||||
"license": "MIT",
|
||||
"repository": "postcss/autoprefixer",
|
||||
"dependencies": {
|
||||
"browserslist": "^4.12.0",
|
||||
"caniuse-lite": "^1.0.30001109",
|
||||
|
@ -46,25 +26,5 @@
|
|||
"postcss": "^7.0.32",
|
||||
"postcss-value-parser": "^4.1.0"
|
||||
},
|
||||
"description": "Parse CSS and add vendor prefixes to CSS rules using values from the Can I Use website",
|
||||
"funding": {
|
||||
"type": "tidelift",
|
||||
"url": "https://tidelift.com/funding/github/npm/autoprefixer"
|
||||
},
|
||||
"homepage": "https://github.com/postcss/autoprefixer#readme",
|
||||
"keywords": [
|
||||
"autoprefixer",
|
||||
"css",
|
||||
"prefix",
|
||||
"postcss",
|
||||
"postcss-plugin"
|
||||
],
|
||||
"license": "MIT",
|
||||
"main": "lib/autoprefixer",
|
||||
"name": "autoprefixer",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git+https://github.com/postcss/autoprefixer.git"
|
||||
},
|
||||
"version": "9.8.8"
|
||||
"main": "lib/autoprefixer"
|
||||
}
|
||||
|
|
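autoprefixer's trimmed manifest above keeps `postcss ^7` and `browserslist` as dependencies, so version 9.x still runs as a PostCSS plugin. A hedged wiring sketch under that assumption (the browser targets and the sample CSS are illustrative, not taken from this repository):

```js
// Sketch: running autoprefixer 9.x as a PostCSS 7 plugin.
const postcss = require('postcss');
const autoprefixer = require('autoprefixer');

postcss([autoprefixer({ overrideBrowserslist: ['last 2 versions'] })])
  .process('::placeholder { color: gray; }', { from: undefined })
  .then(result => {
    // Vendor-prefixed variants are added where the targets require them.
    console.log(result.css);
  });
```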
21 node_modules/base64-js/LICENSE generated vendored Normal file
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2014 Jameson Little

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
34 node_modules/base64-js/README.md generated vendored Normal file
@@ -0,0 +1,34 @@
base64-js
=========

`base64-js` does basic base64 encoding/decoding in pure JS.

[![build status](https://secure.travis-ci.org/beatgammit/base64-js.png)](http://travis-ci.org/beatgammit/base64-js)

Many browsers already have base64 encoding/decoding functionality, but it is for text data, not all-purpose binary data.

Sometimes encoding/decoding binary data in the browser is useful, and that is what this module does.

## install

With [npm](https://npmjs.org) do:

`npm install base64-js` and `var base64js = require('base64-js')`

For use in web browsers do:

`<script src="base64js.min.js"></script>`

[Get supported base64-js with the Tidelift Subscription](https://tidelift.com/subscription/pkg/npm-base64-js?utm_source=npm-base64-js&utm_medium=referral&utm_campaign=readme)

## methods

`base64js` has three exposed functions, `byteLength`, `toByteArray` and `fromByteArray`, which both take a single argument.

* `byteLength` - Takes a base64 string and returns length of byte array
* `toByteArray` - Takes a base64 string and returns a byte array
* `fromByteArray` - Takes a byte array and returns a base64 string

## license

MIT
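The readme above documents exactly three exported functions. A short round-trip sketch using them (the sample bytes are arbitrary):

```js
// Sketch: round-tripping bytes through base64-js.
const base64js = require('base64-js');

const bytes = new Uint8Array([0x68, 0x69, 0x21]);   // "hi!"
const encoded = base64js.fromByteArray(bytes);      // 'aGkh'
console.log(encoded);
console.log(base64js.byteLength(encoded));          // 3
console.log(base64js.toByteArray(encoded));         // Uint8Array [ 104, 105, 33 ]
```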
1 node_modules/base64-js/base64js.min.js generated vendored Normal file
@@ -0,0 +1 @@
(function(a){if("object"==typeof exports&&"undefined"!=typeof module)module.exports=a();else if("function"==typeof define&&define.amd)define([],a);else{var b;b="undefined"==typeof window?"undefined"==typeof global?"undefined"==typeof self?this:self:global:window,b.base64js=a()}})(function(){return function(){function b(d,e,g){function a(j,i){if(!e[j]){if(!d[j]){var f="function"==typeof require&&require;if(!i&&f)return f(j,!0);if(h)return h(j,!0);var c=new Error("Cannot find module '"+j+"'");throw c.code="MODULE_NOT_FOUND",c}var k=e[j]={exports:{}};d[j][0].call(k.exports,function(b){var c=d[j][1][b];return a(c||b)},k,k.exports,b,d,e,g)}return e[j].exports}for(var h="function"==typeof require&&require,c=0;c<g.length;c++)a(g[c]);return a}return b}()({"/":[function(a,b,c){'use strict';function d(a){var b=a.length;if(0<b%4)throw new Error("Invalid string. Length must be a multiple of 4");var c=a.indexOf("=");-1===c&&(c=b);var d=c===b?0:4-c%4;return[c,d]}function e(a,b,c){return 3*(b+c)/4-c}function f(a){var b,c,f=d(a),g=f[0],h=f[1],j=new m(e(a,g,h)),k=0,n=0<h?g-4:g;for(c=0;c<n;c+=4)b=l[a.charCodeAt(c)]<<18|l[a.charCodeAt(c+1)]<<12|l[a.charCodeAt(c+2)]<<6|l[a.charCodeAt(c+3)],j[k++]=255&b>>16,j[k++]=255&b>>8,j[k++]=255&b;return 2===h&&(b=l[a.charCodeAt(c)]<<2|l[a.charCodeAt(c+1)]>>4,j[k++]=255&b),1===h&&(b=l[a.charCodeAt(c)]<<10|l[a.charCodeAt(c+1)]<<4|l[a.charCodeAt(c+2)]>>2,j[k++]=255&b>>8,j[k++]=255&b),j}function g(a){return k[63&a>>18]+k[63&a>>12]+k[63&a>>6]+k[63&a]}function h(a,b,c){for(var d,e=[],f=b;f<c;f+=3)d=(16711680&a[f]<<16)+(65280&a[f+1]<<8)+(255&a[f+2]),e.push(g(d));return e.join("")}function j(a){for(var b,c=a.length,d=c%3,e=[],f=16383,g=0,j=c-d;g<j;g+=f)e.push(h(a,g,g+f>j?j:g+f));return 1===d?(b=a[c-1],e.push(k[b>>2]+k[63&b<<4]+"==")):2===d&&(b=(a[c-2]<<8)+a[c-1],e.push(k[b>>10]+k[63&b>>4]+k[63&b<<2]+"=")),e.join("")}c.byteLength=function(a){var b=d(a),c=b[0],e=b[1];return 3*(c+e)/4-e},c.toByteArray=f,c.fromByteArray=j;for(var k=[],l=[],m="undefined"==typeof Uint8Array?Array:Uint8Array,n="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/",o=0,p=n.length;o<p;++o)k[o]=n[o],l[n.charCodeAt(o)]=o;l[45]=62,l[95]=63},{}]},{},[])("/")});
3 node_modules/base64-js/index.d.ts generated vendored Normal file
@@ -0,0 +1,3 @@
export function byteLength(b64: string): number;
export function toByteArray(b64: string): Uint8Array;
export function fromByteArray(uint8: Uint8Array): string;
150 node_modules/base64-js/index.js generated vendored Normal file
@@ -0,0 +1,150 @@
'use strict'
|
||||
|
||||
exports.byteLength = byteLength
|
||||
exports.toByteArray = toByteArray
|
||||
exports.fromByteArray = fromByteArray
|
||||
|
||||
var lookup = []
|
||||
var revLookup = []
|
||||
var Arr = typeof Uint8Array !== 'undefined' ? Uint8Array : Array
|
||||
|
||||
var code = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'
|
||||
for (var i = 0, len = code.length; i < len; ++i) {
|
||||
lookup[i] = code[i]
|
||||
revLookup[code.charCodeAt(i)] = i
|
||||
}
|
||||
|
||||
// Support decoding URL-safe base64 strings, as Node.js does.
|
||||
// See: https://en.wikipedia.org/wiki/Base64#URL_applications
|
||||
revLookup['-'.charCodeAt(0)] = 62
|
||||
revLookup['_'.charCodeAt(0)] = 63
|
||||
|
||||
function getLens (b64) {
|
||||
var len = b64.length
|
||||
|
||||
if (len % 4 > 0) {
|
||||
throw new Error('Invalid string. Length must be a multiple of 4')
|
||||
}
|
||||
|
||||
// Trim off extra bytes after placeholder bytes are found
|
||||
// See: https://github.com/beatgammit/base64-js/issues/42
|
||||
var validLen = b64.indexOf('=')
|
||||
if (validLen === -1) validLen = len
|
||||
|
||||
var placeHoldersLen = validLen === len
|
||||
? 0
|
||||
: 4 - (validLen % 4)
|
||||
|
||||
return [validLen, placeHoldersLen]
|
||||
}
|
||||
|
||||
// base64 is 4/3 + up to two characters of the original data
|
||||
function byteLength (b64) {
|
||||
var lens = getLens(b64)
|
||||
var validLen = lens[0]
|
||||
var placeHoldersLen = lens[1]
|
||||
return ((validLen + placeHoldersLen) * 3 / 4) - placeHoldersLen
|
||||
}
|
||||
|
||||
function _byteLength (b64, validLen, placeHoldersLen) {
|
||||
return ((validLen + placeHoldersLen) * 3 / 4) - placeHoldersLen
|
||||
}
|
||||
|
||||
function toByteArray (b64) {
|
||||
var tmp
|
||||
var lens = getLens(b64)
|
||||
var validLen = lens[0]
|
||||
var placeHoldersLen = lens[1]
|
||||
|
||||
var arr = new Arr(_byteLength(b64, validLen, placeHoldersLen))
|
||||
|
||||
var curByte = 0
|
||||
|
||||
// if there are placeholders, only get up to the last complete 4 chars
|
||||
var len = placeHoldersLen > 0
|
||||
? validLen - 4
|
||||
: validLen
|
||||
|
||||
var i
|
||||
for (i = 0; i < len; i += 4) {
|
||||
tmp =
|
||||
(revLookup[b64.charCodeAt(i)] << 18) |
|
||||
(revLookup[b64.charCodeAt(i + 1)] << 12) |
|
||||
(revLookup[b64.charCodeAt(i + 2)] << 6) |
|
||||
revLookup[b64.charCodeAt(i + 3)]
|
||||
arr[curByte++] = (tmp >> 16) & 0xFF
|
||||
arr[curByte++] = (tmp >> 8) & 0xFF
|
||||
arr[curByte++] = tmp & 0xFF
|
||||
}
|
||||
|
||||
if (placeHoldersLen === 2) {
|
||||
tmp =
|
||||
(revLookup[b64.charCodeAt(i)] << 2) |
|
||||
(revLookup[b64.charCodeAt(i + 1)] >> 4)
|
||||
arr[curByte++] = tmp & 0xFF
|
||||
}
|
||||
|
||||
if (placeHoldersLen === 1) {
|
||||
tmp =
|
||||
(revLookup[b64.charCodeAt(i)] << 10) |
|
||||
(revLookup[b64.charCodeAt(i + 1)] << 4) |
|
||||
(revLookup[b64.charCodeAt(i + 2)] >> 2)
|
||||
arr[curByte++] = (tmp >> 8) & 0xFF
|
||||
arr[curByte++] = tmp & 0xFF
|
||||
}
|
||||
|
||||
return arr
|
||||
}
|
||||
|
||||
function tripletToBase64 (num) {
|
||||
return lookup[num >> 18 & 0x3F] +
|
||||
lookup[num >> 12 & 0x3F] +
|
||||
lookup[num >> 6 & 0x3F] +
|
||||
lookup[num & 0x3F]
|
||||
}
|
||||
|
||||
function encodeChunk (uint8, start, end) {
|
||||
var tmp
|
||||
var output = []
|
||||
for (var i = start; i < end; i += 3) {
|
||||
tmp =
|
||||
((uint8[i] << 16) & 0xFF0000) +
|
||||
((uint8[i + 1] << 8) & 0xFF00) +
|
||||
(uint8[i + 2] & 0xFF)
|
||||
output.push(tripletToBase64(tmp))
|
||||
}
|
||||
return output.join('')
|
||||
}
|
||||
|
||||
function fromByteArray (uint8) {
|
||||
var tmp
|
||||
var len = uint8.length
|
||||
var extraBytes = len % 3 // if we have 1 byte left, pad 2 bytes
|
||||
var parts = []
|
||||
var maxChunkLength = 16383 // must be multiple of 3
|
||||
|
||||
// go through the array every three bytes, we'll deal with trailing stuff later
|
||||
for (var i = 0, len2 = len - extraBytes; i < len2; i += maxChunkLength) {
|
||||
parts.push(encodeChunk(uint8, i, (i + maxChunkLength) > len2 ? len2 : (i + maxChunkLength)))
|
||||
}
|
||||
|
||||
// pad the end with zeros, but make sure to not forget the extra bytes
|
||||
if (extraBytes === 1) {
|
||||
tmp = uint8[len - 1]
|
||||
parts.push(
|
||||
lookup[tmp >> 2] +
|
||||
lookup[(tmp << 4) & 0x3F] +
|
||||
'=='
|
||||
)
|
||||
} else if (extraBytes === 2) {
|
||||
tmp = (uint8[len - 2] << 8) + uint8[len - 1]
|
||||
parts.push(
|
||||
lookup[tmp >> 10] +
|
||||
lookup[(tmp >> 4) & 0x3F] +
|
||||
lookup[(tmp << 2) & 0x3F] +
|
||||
'='
|
||||
)
|
||||
}
|
||||
|
||||
return parts.join('')
|
||||
}
|
47 node_modules/base64-js/package.json generated vendored Normal file
@@ -0,0 +1,47 @@
{
|
||||
"name": "base64-js",
|
||||
"description": "Base64 encoding/decoding in pure JS",
|
||||
"version": "1.5.1",
|
||||
"author": "T. Jameson Little <t.jameson.little@gmail.com>",
|
||||
"typings": "index.d.ts",
|
||||
"bugs": {
|
||||
"url": "https://github.com/beatgammit/base64-js/issues"
|
||||
},
|
||||
"devDependencies": {
|
||||
"babel-minify": "^0.5.1",
|
||||
"benchmark": "^2.1.4",
|
||||
"browserify": "^16.3.0",
|
||||
"standard": "*",
|
||||
"tape": "4.x"
|
||||
},
|
||||
"homepage": "https://github.com/beatgammit/base64-js",
|
||||
"keywords": [
|
||||
"base64"
|
||||
],
|
||||
"license": "MIT",
|
||||
"main": "index.js",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git://github.com/beatgammit/base64-js.git"
|
||||
},
|
||||
"scripts": {
|
||||
"build": "browserify -s base64js -r ./ | minify > base64js.min.js",
|
||||
"lint": "standard",
|
||||
"test": "npm run lint && npm run unit",
|
||||
"unit": "tape test/*.js"
|
||||
},
|
||||
"funding": [
|
||||
{
|
||||
"type": "github",
|
||||
"url": "https://github.com/sponsors/feross"
|
||||
},
|
||||
{
|
||||
"type": "patreon",
|
||||
"url": "https://www.patreon.com/feross"
|
||||
},
|
||||
{
|
||||
"type": "consulting",
|
||||
"url": "https://feross.org/support"
|
||||
}
|
||||
]
|
||||
}
|
17 node_modules/bl/.travis.yml generated vendored Normal file
@@ -0,0 +1,17 @@
sudo: false
arch:
  - amd64
  - ppc64le
language: node_js
node_js:
  - '6'
  - '8'
  - '10'
  - '12'
  - '14'
  - '15'
  - lts/*
notifications:
  email:
    - rod@vagg.org
    - matteo.collina@gmail.com
396 node_modules/bl/BufferList.js generated vendored Normal file
@@ -0,0 +1,396 @@
'use strict'
|
||||
|
||||
const { Buffer } = require('buffer')
|
||||
const symbol = Symbol.for('BufferList')
|
||||
|
||||
function BufferList (buf) {
|
||||
if (!(this instanceof BufferList)) {
|
||||
return new BufferList(buf)
|
||||
}
|
||||
|
||||
BufferList._init.call(this, buf)
|
||||
}
|
||||
|
||||
BufferList._init = function _init (buf) {
|
||||
Object.defineProperty(this, symbol, { value: true })
|
||||
|
||||
this._bufs = []
|
||||
this.length = 0
|
||||
|
||||
if (buf) {
|
||||
this.append(buf)
|
||||
}
|
||||
}
|
||||
|
||||
BufferList.prototype._new = function _new (buf) {
|
||||
return new BufferList(buf)
|
||||
}
|
||||
|
||||
BufferList.prototype._offset = function _offset (offset) {
|
||||
if (offset === 0) {
|
||||
return [0, 0]
|
||||
}
|
||||
|
||||
let tot = 0
|
||||
|
||||
for (let i = 0; i < this._bufs.length; i++) {
|
||||
const _t = tot + this._bufs[i].length
|
||||
if (offset < _t || i === this._bufs.length - 1) {
|
||||
return [i, offset - tot]
|
||||
}
|
||||
tot = _t
|
||||
}
|
||||
}
|
||||
|
||||
BufferList.prototype._reverseOffset = function (blOffset) {
|
||||
const bufferId = blOffset[0]
|
||||
let offset = blOffset[1]
|
||||
|
||||
for (let i = 0; i < bufferId; i++) {
|
||||
offset += this._bufs[i].length
|
||||
}
|
||||
|
||||
return offset
|
||||
}
|
||||
|
||||
BufferList.prototype.get = function get (index) {
|
||||
if (index > this.length || index < 0) {
|
||||
return undefined
|
||||
}
|
||||
|
||||
const offset = this._offset(index)
|
||||
|
||||
return this._bufs[offset[0]][offset[1]]
|
||||
}
|
||||
|
||||
BufferList.prototype.slice = function slice (start, end) {
|
||||
if (typeof start === 'number' && start < 0) {
|
||||
start += this.length
|
||||
}
|
||||
|
||||
if (typeof end === 'number' && end < 0) {
|
||||
end += this.length
|
||||
}
|
||||
|
||||
return this.copy(null, 0, start, end)
|
||||
}
|
||||
|
||||
BufferList.prototype.copy = function copy (dst, dstStart, srcStart, srcEnd) {
|
||||
if (typeof srcStart !== 'number' || srcStart < 0) {
|
||||
srcStart = 0
|
||||
}
|
||||
|
||||
if (typeof srcEnd !== 'number' || srcEnd > this.length) {
|
||||
srcEnd = this.length
|
||||
}
|
||||
|
||||
if (srcStart >= this.length) {
|
||||
return dst || Buffer.alloc(0)
|
||||
}
|
||||
|
||||
if (srcEnd <= 0) {
|
||||
return dst || Buffer.alloc(0)
|
||||
}
|
||||
|
||||
const copy = !!dst
|
||||
const off = this._offset(srcStart)
|
||||
const len = srcEnd - srcStart
|
||||
let bytes = len
|
||||
let bufoff = (copy && dstStart) || 0
|
||||
let start = off[1]
|
||||
|
||||
// copy/slice everything
|
||||
if (srcStart === 0 && srcEnd === this.length) {
|
||||
if (!copy) {
|
||||
// slice, but full concat if multiple buffers
|
||||
return this._bufs.length === 1
|
||||
? this._bufs[0]
|
||||
: Buffer.concat(this._bufs, this.length)
|
||||
}
|
||||
|
||||
// copy, need to copy individual buffers
|
||||
for (let i = 0; i < this._bufs.length; i++) {
|
||||
this._bufs[i].copy(dst, bufoff)
|
||||
bufoff += this._bufs[i].length
|
||||
}
|
||||
|
||||
return dst
|
||||
}
|
||||
|
||||
// easy, cheap case where it's a subset of one of the buffers
|
||||
if (bytes <= this._bufs[off[0]].length - start) {
|
||||
return copy
|
||||
? this._bufs[off[0]].copy(dst, dstStart, start, start + bytes)
|
||||
: this._bufs[off[0]].slice(start, start + bytes)
|
||||
}
|
||||
|
||||
if (!copy) {
|
||||
// a slice, we need something to copy in to
|
||||
dst = Buffer.allocUnsafe(len)
|
||||
}
|
||||
|
||||
for (let i = off[0]; i < this._bufs.length; i++) {
|
||||
const l = this._bufs[i].length - start
|
||||
|
||||
if (bytes > l) {
|
||||
this._bufs[i].copy(dst, bufoff, start)
|
||||
bufoff += l
|
||||
} else {
|
||||
this._bufs[i].copy(dst, bufoff, start, start + bytes)
|
||||
bufoff += l
|
||||
break
|
||||
}
|
||||
|
||||
bytes -= l
|
||||
|
||||
if (start) {
|
||||
start = 0
|
||||
}
|
||||
}
|
||||
|
||||
// safeguard so that we don't return uninitialized memory
|
||||
if (dst.length > bufoff) return dst.slice(0, bufoff)
|
||||
|
||||
return dst
|
||||
}
|
||||
|
||||
BufferList.prototype.shallowSlice = function shallowSlice (start, end) {
|
||||
start = start || 0
|
||||
end = typeof end !== 'number' ? this.length : end
|
||||
|
||||
if (start < 0) {
|
||||
start += this.length
|
||||
}
|
||||
|
||||
if (end < 0) {
|
||||
end += this.length
|
||||
}
|
||||
|
||||
if (start === end) {
|
||||
return this._new()
|
||||
}
|
||||
|
||||
const startOffset = this._offset(start)
|
||||
const endOffset = this._offset(end)
|
||||
const buffers = this._bufs.slice(startOffset[0], endOffset[0] + 1)
|
||||
|
||||
if (endOffset[1] === 0) {
|
||||
buffers.pop()
|
||||
} else {
|
||||
buffers[buffers.length - 1] = buffers[buffers.length - 1].slice(0, endOffset[1])
|
||||
}
|
||||
|
||||
if (startOffset[1] !== 0) {
|
||||
buffers[0] = buffers[0].slice(startOffset[1])
|
||||
}
|
||||
|
||||
return this._new(buffers)
|
||||
}
|
||||
|
||||
BufferList.prototype.toString = function toString (encoding, start, end) {
|
||||
return this.slice(start, end).toString(encoding)
|
||||
}
|
||||
|
||||
BufferList.prototype.consume = function consume (bytes) {
|
||||
// first, normalize the argument, in accordance with how Buffer does it
|
||||
bytes = Math.trunc(bytes)
|
||||
// do nothing if not a positive number
|
||||
if (Number.isNaN(bytes) || bytes <= 0) return this
|
||||
|
||||
while (this._bufs.length) {
|
||||
if (bytes >= this._bufs[0].length) {
|
||||
bytes -= this._bufs[0].length
|
||||
this.length -= this._bufs[0].length
|
||||
this._bufs.shift()
|
||||
} else {
|
||||
this._bufs[0] = this._bufs[0].slice(bytes)
|
||||
this.length -= bytes
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
return this
|
||||
}
|
||||
|
||||
BufferList.prototype.duplicate = function duplicate () {
|
||||
const copy = this._new()
|
||||
|
||||
for (let i = 0; i < this._bufs.length; i++) {
|
||||
copy.append(this._bufs[i])
|
||||
}
|
||||
|
||||
return copy
|
||||
}
|
||||
|
||||
BufferList.prototype.append = function append (buf) {
|
||||
if (buf == null) {
|
||||
return this
|
||||
}
|
||||
|
||||
if (buf.buffer) {
|
||||
// append a view of the underlying ArrayBuffer
|
||||
this._appendBuffer(Buffer.from(buf.buffer, buf.byteOffset, buf.byteLength))
|
||||
} else if (Array.isArray(buf)) {
|
||||
for (let i = 0; i < buf.length; i++) {
|
||||
this.append(buf[i])
|
||||
}
|
||||
} else if (this._isBufferList(buf)) {
|
||||
// unwrap argument into individual BufferLists
|
||||
for (let i = 0; i < buf._bufs.length; i++) {
|
||||
this.append(buf._bufs[i])
|
||||
}
|
||||
} else {
|
||||
// coerce number arguments to strings, since Buffer(number) does
|
||||
// uninitialized memory allocation
|
||||
if (typeof buf === 'number') {
|
||||
buf = buf.toString()
|
||||
}
|
||||
|
||||
this._appendBuffer(Buffer.from(buf))
|
||||
}
|
||||
|
||||
return this
|
||||
}
|
||||
|
||||
BufferList.prototype._appendBuffer = function appendBuffer (buf) {
|
||||
this._bufs.push(buf)
|
||||
this.length += buf.length
|
||||
}
|
||||
|
||||
BufferList.prototype.indexOf = function (search, offset, encoding) {
|
||||
if (encoding === undefined && typeof offset === 'string') {
|
||||
encoding = offset
|
||||
offset = undefined
|
||||
}
|
||||
|
||||
if (typeof search === 'function' || Array.isArray(search)) {
|
||||
throw new TypeError('The "value" argument must be one of type string, Buffer, BufferList, or Uint8Array.')
|
||||
} else if (typeof search === 'number') {
|
||||
search = Buffer.from([search])
|
||||
} else if (typeof search === 'string') {
|
||||
search = Buffer.from(search, encoding)
|
||||
} else if (this._isBufferList(search)) {
|
||||
search = search.slice()
|
||||
} else if (Array.isArray(search.buffer)) {
|
||||
search = Buffer.from(search.buffer, search.byteOffset, search.byteLength)
|
||||
} else if (!Buffer.isBuffer(search)) {
|
||||
search = Buffer.from(search)
|
||||
}
|
||||
|
||||
offset = Number(offset || 0)
|
||||
|
||||
if (isNaN(offset)) {
|
||||
offset = 0
|
||||
}
|
||||
|
||||
if (offset < 0) {
|
||||
offset = this.length + offset
|
||||
}
|
||||
|
||||
if (offset < 0) {
|
||||
offset = 0
|
||||
}
|
||||
|
||||
if (search.length === 0) {
|
||||
return offset > this.length ? this.length : offset
|
||||
}
|
||||
|
||||
const blOffset = this._offset(offset)
|
||||
let blIndex = blOffset[0] // index of which internal buffer we're working on
|
||||
let buffOffset = blOffset[1] // offset of the internal buffer we're working on
|
||||
|
||||
// scan over each buffer
|
||||
for (; blIndex < this._bufs.length; blIndex++) {
|
||||
const buff = this._bufs[blIndex]
|
||||
|
||||
while (buffOffset < buff.length) {
|
||||
const availableWindow = buff.length - buffOffset
|
||||
|
||||
if (availableWindow >= search.length) {
|
||||
const nativeSearchResult = buff.indexOf(search, buffOffset)
|
||||
|
||||
if (nativeSearchResult !== -1) {
|
||||
return this._reverseOffset([blIndex, nativeSearchResult])
|
||||
}
|
||||
|
||||
buffOffset = buff.length - search.length + 1 // end of native search window
|
||||
} else {
|
||||
const revOffset = this._reverseOffset([blIndex, buffOffset])
|
||||
|
||||
if (this._match(revOffset, search)) {
|
||||
return revOffset
|
||||
}
|
||||
|
||||
buffOffset++
|
||||
}
|
||||
}
|
||||
|
||||
buffOffset = 0
|
||||
}
|
||||
|
||||
return -1
|
||||
}
|
||||
|
||||
BufferList.prototype._match = function (offset, search) {
|
||||
if (this.length - offset < search.length) {
|
||||
return false
|
||||
}
|
||||
|
||||
for (let searchOffset = 0; searchOffset < search.length; searchOffset++) {
|
||||
if (this.get(offset + searchOffset) !== search[searchOffset]) {
|
||||
return false
|
||||
}
|
||||
}
|
||||
return true
|
||||
}
|
||||
|
||||
;(function () {
|
||||
const methods = {
|
||||
readDoubleBE: 8,
|
||||
readDoubleLE: 8,
|
||||
readFloatBE: 4,
|
||||
readFloatLE: 4,
|
||||
readInt32BE: 4,
|
||||
readInt32LE: 4,
|
||||
readUInt32BE: 4,
|
||||
readUInt32LE: 4,
|
||||
readInt16BE: 2,
|
||||
readInt16LE: 2,
|
||||
readUInt16BE: 2,
|
||||
readUInt16LE: 2,
|
||||
readInt8: 1,
|
||||
readUInt8: 1,
|
||||
readIntBE: null,
|
||||
readIntLE: null,
|
||||
readUIntBE: null,
|
||||
readUIntLE: null
|
||||
}
|
||||
|
||||
for (const m in methods) {
|
||||
(function (m) {
|
||||
if (methods[m] === null) {
|
||||
BufferList.prototype[m] = function (offset, byteLength) {
|
||||
return this.slice(offset, offset + byteLength)[m](0, byteLength)
|
||||
}
|
||||
} else {
|
||||
BufferList.prototype[m] = function (offset = 0) {
|
||||
return this.slice(offset, offset + methods[m])[m](0)
|
||||
}
|
||||
}
|
||||
}(m))
|
||||
}
|
||||
}())
|
||||
|
||||
// Used internally by the class and also as an indicator of this object being
|
||||
// a `BufferList`. It's not possible to use `instanceof BufferList` in a browser
|
||||
// environment because there could be multiple different copies of the
|
||||
// BufferList class and some `BufferList`s might be `BufferList`s.
|
||||
BufferList.prototype._isBufferList = function _isBufferList (b) {
|
||||
return b instanceof BufferList || BufferList.isBufferList(b)
|
||||
}
|
||||
|
||||
BufferList.isBufferList = function isBufferList (b) {
|
||||
return b != null && b[symbol]
|
||||
}
|
||||
|
||||
module.exports = BufferList
|
13 node_modules/bl/LICENSE.md generated vendored Normal file
@@ -0,0 +1,13 @@
The MIT License (MIT)
=====================

Copyright (c) 2013-2019 bl contributors
----------------------------------

*bl contributors listed at <https://github.com/rvagg/bl#contributors>*

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
247
node_modules/bl/README.md
generated
vendored
Normal file
|
@ -0,0 +1,247 @@
|
|||
# bl *(BufferList)*
|
||||
|
||||
[![Build Status](https://api.travis-ci.com/rvagg/bl.svg?branch=master)](https://travis-ci.com/rvagg/bl/)
|
||||
|
||||
**A Node.js Buffer list collector, reader and streamer thingy.**
|
||||
|
||||
[![NPM](https://nodei.co/npm/bl.svg)](https://nodei.co/npm/bl/)
|
||||
|
||||
**bl** is a storage object for collections of Node Buffers, exposing them with the main Buffer readable API. Also works as a duplex stream so you can collect buffers from a stream that emits them and emit buffers to a stream that consumes them!
|
||||
|
||||
The original buffers are kept intact and copies are only done as necessary. Any reads that require the use of a single original buffer will return a slice of that buffer only (which references the same memory as the original buffer). Reads that span buffers perform concatenation as required and return the results transparently.
|
||||
|
||||
```js
|
||||
const { BufferList } = require('bl')
|
||||
|
||||
const bl = new BufferList()
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append('hi') // bl will also accept & convert Strings
|
||||
bl.append(Buffer.from('j'))
|
||||
bl.append(Buffer.from([ 0x3, 0x4 ]))
|
||||
|
||||
console.log(bl.length) // 12
|
||||
|
||||
console.log(bl.slice(0, 10).toString('ascii')) // 'abcdefghij'
|
||||
console.log(bl.slice(3, 10).toString('ascii')) // 'defghij'
|
||||
console.log(bl.slice(3, 6).toString('ascii')) // 'def'
|
||||
console.log(bl.slice(3, 8).toString('ascii')) // 'defgh'
|
||||
console.log(bl.slice(5, 10).toString('ascii')) // 'fghij'
|
||||
|
||||
console.log(bl.indexOf('def')) // 3
|
||||
console.log(bl.indexOf('asdf')) // -1
|
||||
|
||||
// or just use toString!
|
||||
console.log(bl.toString()) // 'abcdefghij\u0003\u0004'
|
||||
console.log(bl.toString('ascii', 3, 8)) // 'defgh'
|
||||
console.log(bl.toString('ascii', 5, 10)) // 'fghij'
|
||||
|
||||
// other standard Buffer readables
|
||||
console.log(bl.readUInt16BE(10)) // 0x0304
|
||||
console.log(bl.readUInt16LE(10)) // 0x0403
|
||||
```
|
||||
|
||||
Give it a callback in the constructor and use it just like **[concat-stream](https://github.com/maxogden/node-concat-stream)**:
|
||||
|
||||
```js
|
||||
const { BufferListStream } = require('bl')
|
||||
const fs = require('fs')
|
||||
|
||||
fs.createReadStream('README.md')
|
||||
.pipe(BufferListStream((err, data) => { // note 'new' isn't strictly required
|
||||
// `data` is a complete Buffer object containing the full data
|
||||
console.log(data.toString())
|
||||
}))
|
||||
```
|
||||
|
||||
Note that when you use the *callback* method like this, the resulting `data` parameter is a concatenation of all `Buffer` objects in the list. If you want to avoid the overhead of this concatenation (in cases of extreme performance consciousness), then avoid the *callback* method and just listen to `'end'` instead, like a standard Stream.
|
||||
|
||||
Or to fetch a URL using [hyperquest](https://github.com/substack/hyperquest) (should work with [request](http://github.com/mikeal/request) and even plain Node http too!):
|
||||
|
||||
```js
|
||||
const hyperquest = require('hyperquest')
|
||||
const { BufferListStream } = require('bl')
|
||||
|
||||
const url = 'https://raw.github.com/rvagg/bl/master/README.md'
|
||||
|
||||
hyperquest(url).pipe(BufferListStream((err, data) => {
|
||||
console.log(data.toString())
|
||||
}))
|
||||
```
|
||||
|
||||
Or, use it as a readable stream to recompose a list of Buffers to an output source:
|
||||
|
||||
```js
|
||||
const { BufferListStream } = require('bl')
|
||||
const fs = require('fs')
|
||||
|
||||
var bl = new BufferListStream()
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append(Buffer.from('hi'))
|
||||
bl.append(Buffer.from('j'))
|
||||
|
||||
bl.pipe(fs.createWriteStream('gibberish.txt'))
|
||||
```
|
||||
|
||||
## API
|
||||
|
||||
* <a href="#ctor"><code><b>new BufferList([ buf ])</b></code></a>
|
||||
* <a href="#isBufferList"><code><b>BufferList.isBufferList(obj)</b></code></a>
|
||||
* <a href="#length"><code>bl.<b>length</b></code></a>
|
||||
* <a href="#append"><code>bl.<b>append(buffer)</b></code></a>
|
||||
* <a href="#get"><code>bl.<b>get(index)</b></code></a>
|
||||
* <a href="#indexOf"><code>bl.<b>indexOf(value[, byteOffset][, encoding])</b></code></a>
|
||||
* <a href="#slice"><code>bl.<b>slice([ start[, end ] ])</b></code></a>
|
||||
* <a href="#shallowSlice"><code>bl.<b>shallowSlice([ start[, end ] ])</b></code></a>
|
||||
* <a href="#copy"><code>bl.<b>copy(dest, [ destStart, [ srcStart [, srcEnd ] ] ])</b></code></a>
|
||||
* <a href="#duplicate"><code>bl.<b>duplicate()</b></code></a>
|
||||
* <a href="#consume"><code>bl.<b>consume(bytes)</b></code></a>
|
||||
* <a href="#toString"><code>bl.<b>toString([encoding, [ start, [ end ]]])</b></code></a>
|
||||
* <a href="#readXX"><code>bl.<b>readDoubleBE()</b></code>, <code>bl.<b>readDoubleLE()</b></code>, <code>bl.<b>readFloatBE()</b></code>, <code>bl.<b>readFloatLE()</b></code>, <code>bl.<b>readInt32BE()</b></code>, <code>bl.<b>readInt32LE()</b></code>, <code>bl.<b>readUInt32BE()</b></code>, <code>bl.<b>readUInt32LE()</b></code>, <code>bl.<b>readInt16BE()</b></code>, <code>bl.<b>readInt16LE()</b></code>, <code>bl.<b>readUInt16BE()</b></code>, <code>bl.<b>readUInt16LE()</b></code>, <code>bl.<b>readInt8()</b></code>, <code>bl.<b>readUInt8()</b></code></a>
|
||||
* <a href="#ctorStream"><code><b>new BufferListStream([ callback ])</b></code></a>
|
||||
|
||||
--------------------------------------------------------
|
||||
<a name="ctor"></a>
|
||||
### new BufferList([ Buffer | Buffer array | BufferList | BufferList array | String ])
|
||||
No arguments are _required_ for the constructor, but you can initialise the list by passing in a single `Buffer` object or an array of `Buffer` objects.
|
||||
|
||||
`new` is not strictly required; if you don't instantiate a new object, it will be done automatically for you, so you can create a new instance simply with:
|
||||
|
||||
```js
|
||||
const { BufferList } = require('bl')
|
||||
const bl = BufferList()
|
||||
|
||||
// equivalent to:
|
||||
|
||||
const { BufferList } = require('bl')
|
||||
const bl = new BufferList()
|
||||
```
|
||||
|
||||
--------------------------------------------------------
|
||||
<a name="isBufferList"></a>
|
||||
### BufferList.isBufferList(obj)
|
||||
Determines if the passed object is a `BufferList`. It will return `true` if the passed object is an instance of `BufferList` **or** `BufferListStream` and `false` otherwise.
|
||||
|
||||
N.B. this won't return `true` for `BufferList` or `BufferListStream` instances created by versions of this library before this static method was added.
|
||||
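For illustration, a minimal sketch of the check in practice (the expected outputs in the comments follow from the description above):

```js
const { BufferList, BufferListStream } = require('bl')

console.log(BufferList.isBufferList(new BufferList()))       // true
console.log(BufferList.isBufferList(new BufferListStream())) // true
console.log(BufferList.isBufferList(Buffer.from('abc')))     // false -- a plain Buffer is not a BufferList
```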
|
||||
--------------------------------------------------------
|
||||
<a name="length"></a>
|
||||
### bl.length
|
||||
Get the length of the list in bytes. This is the sum of the lengths of all of the buffers contained in the list, minus any initial offset for a semi-consumed buffer at the beginning. Should accurately represent the total number of bytes that can be read from the list.
|
||||
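As a small sketch of how the consumed offset affects `length` (values chosen for this example, not taken from the upstream docs):

```js
const { BufferList } = require('bl')

const bl = new BufferList(Buffer.from('abcde')) // 5 bytes

bl.consume(2) // discard 'ab' from the start of the list

console.log(bl.length) // 3 -- only the readable bytes are counted
```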
|
||||
--------------------------------------------------------
|
||||
<a name="append"></a>
|
||||
### bl.append(Buffer | Buffer array | BufferList | BufferList array | String)
|
||||
`append(buffer)` adds an additional buffer or BufferList to the internal list. `this` is returned so it can be chained.
|
||||
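A minimal sketch of chained appends, using only the argument types listed in the heading above (the byte counts in the comments belong to this example, not the upstream README):

```js
const { BufferList } = require('bl')

// append() returns `this`, so calls can be chained
const bl = new BufferList()
  .append(Buffer.from('abc'))                   // Buffer
  .append('def')                                // String, converted to a Buffer
  .append([Buffer.from('g'), Buffer.from('h')]) // Buffer array

console.log(bl.length) // 8
```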
|
||||
--------------------------------------------------------
|
||||
<a name="get"></a>
|
||||
### bl.get(index)
|
||||
`get()` will return the byte at the specified index.
|
||||
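A short sketch showing that indexing works transparently across internal buffer boundaries (values picked for this illustration):

```js
const { BufferList } = require('bl')

const bl = new BufferList([Buffer.from('ab'), Buffer.from('cd')])

console.log(bl.get(1)) // 98 -- 'b', from the first internal buffer
console.log(bl.get(2)) // 99 -- 'c', from the second internal buffer
```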
|
||||
--------------------------------------------------------
|
||||
<a name="indexOf"></a>
|
||||
### bl.indexOf(value[, byteOffset][, encoding])
|
||||
The `indexOf()` method returns the first index at which the given value can be found in the BufferList, or `-1` if it is not present.
|
||||
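For example (a hedged sketch; the search values are invented for this illustration):

```js
const { BufferList } = require('bl')

const bl = new BufferList([Buffer.from('abc'), Buffer.from('abc')]) // 'abcabc'

console.log(bl.indexOf('bc'))              // 1
console.log(bl.indexOf('bc', 2))           // 4 -- search starts at byteOffset 2
console.log(bl.indexOf(Buffer.from('zz'))) // -1 -- not present
```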
|
||||
--------------------------------------------------------
|
||||
<a name="slice"></a>
|
||||
### bl.slice([ start, [ end ] ])
|
||||
`slice()` returns a new `Buffer` object containing the bytes within the range specified. Both `start` and `end` are optional and will default to the beginning and end of the list respectively.
|
||||
|
||||
If the requested range spans a single internal buffer then a slice of that buffer will be returned which shares the original memory range of that Buffer. If the range spans multiple buffers then copy operations will likely occur to give you a uniform Buffer.
|
||||
|
||||
--------------------------------------------------------
|
||||
<a name="shallowSlice"></a>
|
||||
### bl.shallowSlice([ start, [ end ] ])
|
||||
`shallowSlice()` returns a new `BufferList` object containing the bytes within the range specified. Both `start` and `end` are optional and will default to the beginning and end of the list respectively.
|
||||
|
||||
No copies will be performed. All buffers in the result share memory with the original list.
|
||||
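A minimal sketch of a zero-copy slice spanning two internal buffers (example values, not from the upstream docs):

```js
const { BufferList } = require('bl')

const bl = new BufferList([Buffer.from('abcd'), Buffer.from('efgh')])
const shallow = bl.shallowSlice(2, 6) // spans both internal buffers, no bytes copied

console.log(shallow.length)     // 4
console.log(shallow.toString()) // 'cdef'
```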
|
||||
--------------------------------------------------------
|
||||
<a name="copy"></a>
|
||||
### bl.copy(dest, [ destStart, [ srcStart [, srcEnd ] ] ])
|
||||
`copy()` copies the content of the list into the `dest` buffer, starting at `destStart` and containing the bytes within the range specified by `srcStart` to `srcEnd`. `destStart`, `srcStart` and `srcEnd` are optional and default to the beginning of the `dest` buffer, and the beginning and end of the list, respectively.
|
||||
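For instance, a sketch of copying a sub-range into a pre-allocated Buffer (offsets chosen for this example):

```js
const { BufferList } = require('bl')

const bl = new BufferList([Buffer.from('hello '), Buffer.from('world')]) // 'hello world'
const dest = Buffer.alloc(5)

bl.copy(dest, 0, 6, 11) // copy the 5 bytes starting at offset 6 ('world') to the start of dest

console.log(dest.toString()) // 'world'
```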
|
||||
--------------------------------------------------------
|
||||
<a name="duplicate"></a>
|
||||
### bl.duplicate()
|
||||
`duplicate()` performs a **shallow copy** of the list. The internal Buffers remain the same, so if you change the underlying Buffers, the change will be reflected in both the original and the duplicate. This method is needed if you want to call `consume()` or `pipe()` and still keep the original list. Example:
|
||||
|
||||
```js
|
||||
var bl = new BufferListStream()
|
||||
|
||||
bl.append('hello')
|
||||
bl.append(' world')
|
||||
bl.append('\n')
|
||||
|
||||
bl.duplicate().pipe(process.stdout, { end: false })
|
||||
|
||||
console.log(bl.toString())
|
||||
```
|
||||
|
||||
--------------------------------------------------------
|
||||
<a name="consume"></a>
|
||||
### bl.consume(bytes)
|
||||
`consume()` will shift bytes *off the start of the list*. The number of bytes consumed doesn't need to line up with the sizes of the internal Buffers; initial offsets will be calculated accordingly in order to give you a consistent view of the data.
|
||||
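A short sketch of consuming across an internal buffer boundary (example values only):

```js
const { BufferList } = require('bl')

const bl = new BufferList([Buffer.from('abc'), Buffer.from('def')])

bl.consume(4) // discards 'abc' and the 'd' at the start of the second buffer

console.log(bl.length)     // 2
console.log(bl.toString()) // 'ef'
```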
|
||||
--------------------------------------------------------
|
||||
<a name="toString"></a>
|
||||
### bl.toString([encoding, [ start, [ end ]]])
|
||||
`toString()` will return a string representation of the buffer. The optional `start` and `end` arguments are passed on to `slice()`, while the `encoding` is passed on to `toString()` of the resulting Buffer. See the [Buffer#toString()](http://nodejs.org/docs/latest/api/buffer.html#buffer_buf_tostring_encoding_start_end) documentation for more information.
|
||||
|
||||
--------------------------------------------------------
|
||||
<a name="readXX"></a>
|
||||
### bl.readDoubleBE(), bl.readDoubleLE(), bl.readFloatBE(), bl.readFloatLE(), bl.readInt32BE(), bl.readInt32LE(), bl.readUInt32BE(), bl.readUInt32LE(), bl.readInt16BE(), bl.readInt16LE(), bl.readUInt16BE(), bl.readUInt16LE(), bl.readInt8(), bl.readUInt8()
|
||||
|
||||
All of the standard byte-reading methods of the `Buffer` interface are implemented and will operate across internal Buffer boundaries transparently.
|
||||
|
||||
See the <b><code>[Buffer](http://nodejs.org/docs/latest/api/buffer.html)</code></b> documentation for how these work.
|
||||
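As a hedged sketch of a read spanning two internal buffers (the byte values are invented for this example):

```js
const { BufferList } = require('bl')

// 0x01 and 0x02 live in two separate internal buffers
const bl = new BufferList([Buffer.from([0x01]), Buffer.from([0x02])])

console.log(bl.readUInt16BE(0)) // 258 (0x0102) -- the read crosses the buffer boundary
console.log(bl.readUInt16LE(0)) // 513 (0x0201)
```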
|
||||
--------------------------------------------------------
|
||||
<a name="ctorStream"></a>
|
||||
### new BufferListStream([ callback | Buffer | Buffer array | BufferList | BufferList array | String ])
|
||||
**BufferListStream** is a Node **[Duplex Stream](http://nodejs.org/docs/latest/api/stream.html#stream_class_stream_duplex)**, so it can be read from and written to like a standard Node stream. You can also `pipe()` to and from a **BufferListStream** instance.
|
||||
|
||||
The constructor takes an optional callback; if supplied, the callback will be called with an error argument followed by a reference to the **bl** instance when `bl.end()` is called (i.e. from a piped stream). This is a convenient way of collecting the entire contents of a stream, particularly when the stream is *chunky*, such as a network stream.
|
||||
|
||||
Normally, no arguments are required for the constructor, but you can initialise the list by passing in a single `Buffer` object or an array of `Buffer` objects.
|
||||
|
||||
`new` is not strictly required; if you don't instantiate a new object, it will be done automatically for you, so you can create a new instance simply with:
|
||||
|
||||
```js
|
||||
const { BufferListStream } = require('bl')
|
||||
const bl = BufferListStream()
|
||||
|
||||
// equivalent to:
|
||||
|
||||
const { BufferListStream } = require('bl')
|
||||
const bl = new BufferListStream()
|
||||
```
|
||||
|
||||
N.B. For backwards compatibility reasons, `BufferListStream` is the **default** export when you `require('bl')`:
|
||||
|
||||
```js
|
||||
const { BufferListStream } = require('bl')
|
||||
// equivalent to:
|
||||
const BufferListStream = require('bl')
|
||||
```
|
||||
|
||||
--------------------------------------------------------
|
||||
|
||||
## Contributors
|
||||
|
||||
**bl** is brought to you by the following hackers:
|
||||
|
||||
* [Rod Vagg](https://github.com/rvagg)
|
||||
* [Matteo Collina](https://github.com/mcollina)
|
||||
* [Jarett Cruger](https://github.com/jcrugzz)
|
||||
|
||||
<a name="license"></a>
|
||||
## License & copyright
|
||||
|
||||
Copyright (c) 2013-2019 bl contributors (listed above).
|
||||
|
||||
bl is licensed under the MIT license. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE.md file for more details.
|
84
node_modules/bl/bl.js
generated
vendored
Normal file
|
@ -0,0 +1,84 @@
|
|||
'use strict'
|
||||
|
||||
const DuplexStream = require('readable-stream').Duplex
|
||||
const inherits = require('inherits')
|
||||
const BufferList = require('./BufferList')
|
||||
|
||||
function BufferListStream (callback) {
|
||||
if (!(this instanceof BufferListStream)) {
|
||||
return new BufferListStream(callback)
|
||||
}
|
||||
|
||||
if (typeof callback === 'function') {
|
||||
this._callback = callback
|
||||
|
||||
const piper = function piper (err) {
|
||||
if (this._callback) {
|
||||
this._callback(err)
|
||||
this._callback = null
|
||||
}
|
||||
}.bind(this)
|
||||
|
||||
this.on('pipe', function onPipe (src) {
|
||||
src.on('error', piper)
|
||||
})
|
||||
this.on('unpipe', function onUnpipe (src) {
|
||||
src.removeListener('error', piper)
|
||||
})
|
||||
|
||||
callback = null
|
||||
}
|
||||
|
||||
BufferList._init.call(this, callback)
|
||||
DuplexStream.call(this)
|
||||
}
|
||||
|
||||
inherits(BufferListStream, DuplexStream)
|
||||
Object.assign(BufferListStream.prototype, BufferList.prototype)
|
||||
|
||||
BufferListStream.prototype._new = function _new (callback) {
|
||||
return new BufferListStream(callback)
|
||||
}
|
||||
|
||||
BufferListStream.prototype._write = function _write (buf, encoding, callback) {
|
||||
this._appendBuffer(buf)
|
||||
|
||||
if (typeof callback === 'function') {
|
||||
callback()
|
||||
}
|
||||
}
|
||||
|
||||
BufferListStream.prototype._read = function _read (size) {
|
||||
if (!this.length) {
|
||||
return this.push(null)
|
||||
}
|
||||
|
||||
size = Math.min(size, this.length)
|
||||
this.push(this.slice(0, size))
|
||||
this.consume(size)
|
||||
}
|
||||
|
||||
BufferListStream.prototype.end = function end (chunk) {
|
||||
DuplexStream.prototype.end.call(this, chunk)
|
||||
|
||||
if (this._callback) {
|
||||
this._callback(null, this.slice())
|
||||
this._callback = null
|
||||
}
|
||||
}
|
||||
|
||||
BufferListStream.prototype._destroy = function _destroy (err, cb) {
|
||||
this._bufs.length = 0
|
||||
this.length = 0
|
||||
cb(err)
|
||||
}
|
||||
|
||||
BufferListStream.prototype._isBufferList = function _isBufferList (b) {
|
||||
return b instanceof BufferListStream || b instanceof BufferList || BufferListStream.isBufferList(b)
|
||||
}
|
||||
|
||||
BufferListStream.isBufferList = BufferList.isBufferList
|
||||
|
||||
module.exports = BufferListStream
|
||||
module.exports.BufferListStream = BufferListStream
|
||||
module.exports.BufferList = BufferList
|
38
node_modules/bl/node_modules/readable-stream/CONTRIBUTING.md
generated
vendored
Normal file
|
@ -0,0 +1,38 @@
|
|||
# Developer's Certificate of Origin 1.1
|
||||
|
||||
By making a contribution to this project, I certify that:
|
||||
|
||||
* (a) The contribution was created in whole or in part by me and I
|
||||
have the right to submit it under the open source license
|
||||
indicated in the file; or
|
||||
|
||||
* (b) The contribution is based upon previous work that, to the best
|
||||
of my knowledge, is covered under an appropriate open source
|
||||
license and I have the right under that license to submit that
|
||||
work with modifications, whether created in whole or in part
|
||||
by me, under the same open source license (unless I am
|
||||
permitted to submit under a different license), as indicated
|
||||
in the file; or
|
||||
|
||||
* (c) The contribution was provided directly to me by some other
|
||||
person who certified (a), (b) or (c) and I have not modified
|
||||
it.
|
||||
|
||||
* (d) I understand and agree that this project and the contribution
|
||||
are public and that a record of the contribution (including all
|
||||
personal information I submit with it, including my sign-off) is
|
||||
maintained indefinitely and may be redistributed consistent with
|
||||
this project or the open source license(s) involved.
|
||||
|
||||
## Moderation Policy
|
||||
|
||||
The [Node.js Moderation Policy] applies to this WG.
|
||||
|
||||
## Code of Conduct
|
||||
|
||||
The [Node.js Code of Conduct][] applies to this WG.
|
||||
|
||||
[Node.js Code of Conduct]:
|
||||
https://github.com/nodejs/node/blob/master/CODE_OF_CONDUCT.md
|
||||
[Node.js Moderation Policy]:
|
||||
https://github.com/nodejs/TSC/blob/master/Moderation-Policy.md
|
136
node_modules/bl/node_modules/readable-stream/GOVERNANCE.md
generated
vendored
Normal file
|
@ -0,0 +1,136 @@
|
|||
### Streams Working Group
|
||||
|
||||
The Node.js Streams is jointly governed by a Working Group
|
||||
(WG)
|
||||
that is responsible for high-level guidance of the project.
|
||||
|
||||
The WG has final authority over this project including:
|
||||
|
||||
* Technical direction
|
||||
* Project governance and process (including this policy)
|
||||
* Contribution policy
|
||||
* GitHub repository hosting
|
||||
* Conduct guidelines
|
||||
* Maintaining the list of additional Collaborators
|
||||
|
||||
For the current list of WG members, see the project
|
||||
[README.md](./README.md#current-project-team-members).
|
||||
|
||||
### Collaborators
|
||||
|
||||
The readable-stream GitHub repository is
|
||||
maintained by the WG and additional Collaborators who are added by the
|
||||
WG on an ongoing basis.
|
||||
|
||||
Individuals making significant and valuable contributions are made
|
||||
Collaborators and given commit-access to the project. These
|
||||
individuals are identified by the WG and their addition as
|
||||
Collaborators is discussed during the WG meeting.
|
||||
|
||||
_Note:_ If you make a significant contribution and are not considered
|
||||
for commit-access log an issue or contact a WG member directly and it
|
||||
will be brought up in the next WG meeting.
|
||||
|
||||
Modifications of the contents of the readable-stream repository are
|
||||
made on
|
||||
a collaborative basis. Anybody with a GitHub account may propose a
|
||||
modification via pull request and it will be considered by the project
|
||||
Collaborators. All pull requests must be reviewed and accepted by a
|
||||
Collaborator with sufficient expertise who is able to take full
|
||||
responsibility for the change. In the case of pull requests proposed
|
||||
by an existing Collaborator, an additional Collaborator is required
|
||||
for sign-off. Consensus should be sought if additional Collaborators
|
||||
participate and there is disagreement around a particular
|
||||
modification. See _Consensus Seeking Process_ below for further detail
|
||||
on the consensus model used for governance.
|
||||
|
||||
Collaborators may opt to elevate significant or controversial
|
||||
modifications, or modifications that have not found consensus to the
|
||||
WG for discussion by assigning the ***WG-agenda*** tag to a pull
|
||||
request or issue. The WG should serve as the final arbiter where
|
||||
required.
|
||||
|
||||
For the current list of Collaborators, see the project
|
||||
[README.md](./README.md#members).
|
||||
|
||||
### WG Membership
|
||||
|
||||
WG seats are not time-limited. There is no fixed size of the WG.
|
||||
However, the expected target is between 6 and 12, to ensure adequate
|
||||
coverage of important areas of expertise, balanced with the ability to
|
||||
make decisions efficiently.
|
||||
|
||||
There is no specific set of requirements or qualifications for WG
|
||||
membership beyond these rules.
|
||||
|
||||
The WG may add additional members to the WG by unanimous consensus.
|
||||
|
||||
A WG member may be removed from the WG by voluntary resignation, or by
|
||||
unanimous consensus of all other WG members.
|
||||
|
||||
Changes to WG membership should be posted in the agenda, and may be
|
||||
suggested as any other agenda item (see "WG Meetings" below).
|
||||
|
||||
If an addition or removal is proposed during a meeting, and the full
|
||||
WG is not in attendance to participate, then the addition or removal
|
||||
is added to the agenda for the subsequent meeting. This is to ensure
|
||||
that all members are given the opportunity to participate in all
|
||||
membership decisions. If a WG member is unable to attend a meeting
|
||||
where a planned membership decision is being made, then their consent
|
||||
is assumed.
|
||||
|
||||
No more than 1/3 of the WG members may be affiliated with the same
|
||||
employer. If removal or resignation of a WG member, or a change of
|
||||
employment by a WG member, creates a situation where more than 1/3 of
|
||||
the WG membership shares an employer, then the situation must be
|
||||
immediately remedied by the resignation or removal of one or more WG
|
||||
members affiliated with the over-represented employer(s).
|
||||
|
||||
### WG Meetings
|
||||
|
||||
The WG meets occasionally on a Google Hangout On Air. A designated moderator
|
||||
approved by the WG runs the meeting. Each meeting should be
|
||||
published to YouTube.
|
||||
|
||||
Items are added to the WG agenda that are considered contentious or
|
||||
are modifications of governance, contribution policy, WG membership,
|
||||
or release process.
|
||||
|
||||
The intention of the agenda is not to approve or review all patches;
|
||||
that should happen continuously on GitHub and be handled by the larger
|
||||
group of Collaborators.
|
||||
|
||||
Any community member or contributor can ask that something be added to
|
||||
the next meeting's agenda by logging a GitHub Issue. Any Collaborator,
|
||||
WG member or the moderator can add the item to the agenda by adding
|
||||
the ***WG-agenda*** tag to the issue.
|
||||
|
||||
Prior to each WG meeting the moderator will share the Agenda with
|
||||
members of the WG. WG members can add any items they like to the
|
||||
agenda at the beginning of each meeting. The moderator and the WG
|
||||
cannot veto or remove items.
|
||||
|
||||
The WG may invite persons or representatives from certain projects to
|
||||
participate in a non-voting capacity.
|
||||
|
||||
The moderator is responsible for summarizing the discussion of each
|
||||
agenda item and sends it as a pull request after the meeting.
|
||||
|
||||
### Consensus Seeking Process
|
||||
|
||||
The WG follows a
|
||||
[Consensus
|
||||
Seeking](http://en.wikipedia.org/wiki/Consensus-seeking_decision-making)
|
||||
decision-making model.
|
||||
|
||||
When an agenda item has appeared to reach a consensus the moderator
|
||||
will ask "Does anyone object?" as a final call for dissent from the
|
||||
consensus.
|
||||
|
||||
If an agenda item cannot reach a consensus a WG member can call for
|
||||
either a closing vote or a vote to table the issue to the next
|
||||
meeting. The call for a vote must be seconded by a majority of the WG
|
||||
or else the discussion will continue. Simple majority wins.
|
||||
|
||||
Note that changes to WG membership require a majority consensus. See
|
||||
"WG Membership" above.
|
47
node_modules/bl/node_modules/readable-stream/LICENSE
generated
vendored
Normal file
|
@ -0,0 +1,47 @@
|
|||
Node.js is licensed for use as follows:
|
||||
|
||||
"""
|
||||
Copyright Node.js contributors. All rights reserved.
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to
|
||||
deal in the Software without restriction, including without limitation the
|
||||
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
|
||||
sell copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
|
||||
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
|
||||
IN THE SOFTWARE.
|
||||
"""
|
||||
|
||||
This license applies to parts of Node.js originating from the
|
||||
https://github.com/joyent/node repository:
|
||||
|
||||
"""
|
||||
Copyright Joyent, Inc. and other Node contributors. All rights reserved.
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to
|
||||
deal in the Software without restriction, including without limitation the
|
||||
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
|
||||
sell copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
|
||||
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
|
||||
IN THE SOFTWARE.
|
||||
"""
|
106
node_modules/bl/node_modules/readable-stream/README.md
generated
vendored
Normal file
|
@ -0,0 +1,106 @@
|
|||
# readable-stream
|
||||
|
||||
***Node.js core streams for userland*** [![Build Status](https://travis-ci.com/nodejs/readable-stream.svg?branch=master)](https://travis-ci.com/nodejs/readable-stream)
|
||||
|
||||
|
||||
[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)
|
||||
[![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/)
|
||||
|
||||
|
||||
[![Sauce Test Status](https://saucelabs.com/browser-matrix/readabe-stream.svg)](https://saucelabs.com/u/readabe-stream)
|
||||
|
||||
```bash
|
||||
npm install --save readable-stream
|
||||
```
|
||||
|
||||
This package is a mirror of the streams implementations in Node.js.
|
||||
|
||||
Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v10.19.0/docs/api/stream.html).
|
||||
|
||||
If you want to guarantee a stable streams base, regardless of what version of
|
||||
Node you, or the users of your libraries, are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core; for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html).
|
||||
|
||||
As of version 2.0.0 **readable-stream** uses semantic versioning.
|
||||
|
||||
## Version 3.x.x
|
||||
|
||||
v3.x.x of `readable-stream` is a cut from Node 10. This version supports Node 6, 8, and 10, as well as evergreen browsers, IE 11 and latest Safari. The breaking changes introduced by v3 are composed of the combined breaking changes in [Node v9](https://nodejs.org/en/blog/release/v9.0.0/) and [Node v10](https://nodejs.org/en/blog/release/v10.0.0/), as follows:
|
||||
|
||||
1. Error codes: https://github.com/nodejs/node/pull/13310,
|
||||
https://github.com/nodejs/node/pull/13291,
|
||||
https://github.com/nodejs/node/pull/16589,
|
||||
https://github.com/nodejs/node/pull/15042,
|
||||
https://github.com/nodejs/node/pull/15665,
|
||||
https://github.com/nodejs/readable-stream/pull/344
|
||||
2. 'readable' has precedence over flowing
|
||||
https://github.com/nodejs/node/pull/18994
|
||||
3. make virtual methods errors consistent
|
||||
https://github.com/nodejs/node/pull/18813
|
||||
4. updated streams error handling
|
||||
https://github.com/nodejs/node/pull/18438
|
||||
5. writable.end should return this.
|
||||
https://github.com/nodejs/node/pull/18780
|
||||
6. readable continues to read when push('')
|
||||
https://github.com/nodejs/node/pull/18211
|
||||
7. add custom inspect to BufferList
|
||||
https://github.com/nodejs/node/pull/17907
|
||||
8. always defer 'readable' with nextTick
|
||||
https://github.com/nodejs/node/pull/17979
|
||||
|
||||
## Version 2.x.x
|
||||
v2.x.x of `readable-stream` is a cut of the stream module from Node 8 (there have been no semver-major changes from Node 4 to 8). This version supports all Node.js versions from 0.8, as well as evergreen browsers and IE 10 & 11.
|
||||
|
||||
### Big Thanks
|
||||
|
||||
Cross-browser Testing Platform and Open Source <3 Provided by [Sauce Labs][sauce]
|
||||
|
||||
# Usage
|
||||
|
||||
You can swap your `require('stream')` with `require('readable-stream')`
|
||||
without any changes, if you are just using one of the main classes and
|
||||
functions.
|
||||
|
||||
```js
|
||||
const {
|
||||
Readable,
|
||||
Writable,
|
||||
Transform,
|
||||
Duplex,
|
||||
pipeline,
|
||||
finished
|
||||
} = require('readable-stream')
|
||||
```
|
||||
|
||||
Note that `require('stream')` will return `Stream`, while
|
||||
`require('readable-stream')` will return `Readable`. We discourage using
|
||||
whatever is exported directly, but rather use one of the properties as
|
||||
shown in the example above.
|
||||
|
||||
# Streams Working Group
|
||||
|
||||
`readable-stream` is maintained by the Streams Working Group, which
|
||||
oversees the development and maintenance of the Streams API within
|
||||
Node.js. The responsibilities of the Streams Working Group include:
|
||||
|
||||
* Addressing stream issues on the Node.js issue tracker.
|
||||
* Authoring and editing stream documentation within the Node.js project.
|
||||
* Reviewing changes to stream subclasses within the Node.js project.
|
||||
* Redirecting changes to streams from the Node.js project to this
|
||||
project.
|
||||
* Assisting in the implementation of stream providers within Node.js.
|
||||
* Recommending versions of `readable-stream` to be included in Node.js.
|
||||
* Messaging about the future of streams to give the community advance
|
||||
notice of changes.
|
||||
|
||||
<a name="members"></a>
|
||||
## Team Members
|
||||
|
||||
* **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com>
|
||||
- Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242
|
||||
* **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com>
|
||||
* **Matteo Collina** ([@mcollina](https://github.com/mcollina)) <matteo.collina@gmail.com>
|
||||
- Release GPG key: 3ABC01543F22DD2239285CDD818674489FBC127E
|
||||
* **Irina Shestak** ([@lrlna](https://github.com/lrlna)) <shestak.irina@gmail.com>
|
||||
* **Yoshua Wuyts** ([@yoshuawuyts](https://github.com/yoshuawuyts)) &lt;yoshuawuyts@gmail.com&gt;
|
||||
|
||||
[sauce]: https://saucelabs.com
|
127
node_modules/bl/node_modules/readable-stream/errors-browser.js
generated
vendored
Normal file
|
@ -0,0 +1,127 @@
|
|||
'use strict';
|
||||
|
||||
function _inheritsLoose(subClass, superClass) { subClass.prototype = Object.create(superClass.prototype); subClass.prototype.constructor = subClass; subClass.__proto__ = superClass; }
|
||||
|
||||
var codes = {};
|
||||
|
||||
function createErrorType(code, message, Base) {
|
||||
if (!Base) {
|
||||
Base = Error;
|
||||
}
|
||||
|
||||
function getMessage(arg1, arg2, arg3) {
|
||||
if (typeof message === 'string') {
|
||||
return message;
|
||||
} else {
|
||||
return message(arg1, arg2, arg3);
|
||||
}
|
||||
}
|
||||
|
||||
var NodeError =
|
||||
/*#__PURE__*/
|
||||
function (_Base) {
|
||||
_inheritsLoose(NodeError, _Base);
|
||||
|
||||
function NodeError(arg1, arg2, arg3) {
|
||||
return _Base.call(this, getMessage(arg1, arg2, arg3)) || this;
|
||||
}
|
||||
|
||||
return NodeError;
|
||||
}(Base);
|
||||
|
||||
NodeError.prototype.name = Base.name;
|
||||
NodeError.prototype.code = code;
|
||||
codes[code] = NodeError;
|
||||
} // https://github.com/nodejs/node/blob/v10.8.0/lib/internal/errors.js
|
||||
|
||||
|
||||
function oneOf(expected, thing) {
|
||||
if (Array.isArray(expected)) {
|
||||
var len = expected.length;
|
||||
expected = expected.map(function (i) {
|
||||
return String(i);
|
||||
});
|
||||
|
||||
if (len > 2) {
|
||||
return "one of ".concat(thing, " ").concat(expected.slice(0, len - 1).join(', '), ", or ") + expected[len - 1];
|
||||
} else if (len === 2) {
|
||||
return "one of ".concat(thing, " ").concat(expected[0], " or ").concat(expected[1]);
|
||||
} else {
|
||||
return "of ".concat(thing, " ").concat(expected[0]);
|
||||
}
|
||||
} else {
|
||||
return "of ".concat(thing, " ").concat(String(expected));
|
||||
}
|
||||
} // https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/startsWith
|
||||
|
||||
|
||||
function startsWith(str, search, pos) {
|
||||
return str.substr(!pos || pos < 0 ? 0 : +pos, search.length) === search;
|
||||
} // https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/endsWith
|
||||
|
||||
|
||||
function endsWith(str, search, this_len) {
|
||||
if (this_len === undefined || this_len > str.length) {
|
||||
this_len = str.length;
|
||||
}
|
||||
|
||||
return str.substring(this_len - search.length, this_len) === search;
|
||||
} // https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/includes
|
||||
|
||||
|
||||
function includes(str, search, start) {
|
||||
if (typeof start !== 'number') {
|
||||
start = 0;
|
||||
}
|
||||
|
||||
if (start + search.length > str.length) {
|
||||
return false;
|
||||
} else {
|
||||
return str.indexOf(search, start) !== -1;
|
||||
}
|
||||
}
|
||||
|
||||
createErrorType('ERR_INVALID_OPT_VALUE', function (name, value) {
|
||||
return 'The value "' + value + '" is invalid for option "' + name + '"';
|
||||
}, TypeError);
|
||||
createErrorType('ERR_INVALID_ARG_TYPE', function (name, expected, actual) {
|
||||
// determiner: 'must be' or 'must not be'
|
||||
var determiner;
|
||||
|
||||
if (typeof expected === 'string' && startsWith(expected, 'not ')) {
|
||||
determiner = 'must not be';
|
||||
expected = expected.replace(/^not /, '');
|
||||
} else {
|
||||
determiner = 'must be';
|
||||
}
|
||||
|
||||
var msg;
|
||||
|
||||
if (endsWith(name, ' argument')) {
|
||||
// For cases like 'first argument'
|
||||
msg = "The ".concat(name, " ").concat(determiner, " ").concat(oneOf(expected, 'type'));
|
||||
} else {
|
||||
var type = includes(name, '.') ? 'property' : 'argument';
|
||||
msg = "The \"".concat(name, "\" ").concat(type, " ").concat(determiner, " ").concat(oneOf(expected, 'type'));
|
||||
}
|
||||
|
||||
msg += ". Received type ".concat(typeof actual);
|
||||
return msg;
|
||||
}, TypeError);
|
||||
createErrorType('ERR_STREAM_PUSH_AFTER_EOF', 'stream.push() after EOF');
|
||||
createErrorType('ERR_METHOD_NOT_IMPLEMENTED', function (name) {
|
||||
return 'The ' + name + ' method is not implemented';
|
||||
});
|
||||
createErrorType('ERR_STREAM_PREMATURE_CLOSE', 'Premature close');
|
||||
createErrorType('ERR_STREAM_DESTROYED', function (name) {
|
||||
return 'Cannot call ' + name + ' after a stream was destroyed';
|
||||
});
|
||||
createErrorType('ERR_MULTIPLE_CALLBACK', 'Callback called multiple times');
|
||||
createErrorType('ERR_STREAM_CANNOT_PIPE', 'Cannot pipe, not readable');
|
||||
createErrorType('ERR_STREAM_WRITE_AFTER_END', 'write after end');
|
||||
createErrorType('ERR_STREAM_NULL_VALUES', 'May not write null values to stream', TypeError);
|
||||
createErrorType('ERR_UNKNOWN_ENCODING', function (arg) {
|
||||
return 'Unknown encoding: ' + arg;
|
||||
}, TypeError);
|
||||
createErrorType('ERR_STREAM_UNSHIFT_AFTER_END_EVENT', 'stream.unshift() after end event');
|
||||
module.exports.codes = codes;
|
116
node_modules/bl/node_modules/readable-stream/errors.js
generated
vendored
Normal file
|
@ -0,0 +1,116 @@
|
|||
'use strict';
|
||||
|
||||
const codes = {};
|
||||
|
||||
function createErrorType(code, message, Base) {
|
||||
if (!Base) {
|
||||
Base = Error
|
||||
}
|
||||
|
||||
function getMessage (arg1, arg2, arg3) {
|
||||
if (typeof message === 'string') {
|
||||
return message
|
||||
} else {
|
||||
return message(arg1, arg2, arg3)
|
||||
}
|
||||
}
|
||||
|
||||
class NodeError extends Base {
|
||||
constructor (arg1, arg2, arg3) {
|
||||
super(getMessage(arg1, arg2, arg3));
|
||||
}
|
||||
}
|
||||
|
||||
NodeError.prototype.name = Base.name;
|
||||
NodeError.prototype.code = code;
|
||||
|
||||
codes[code] = NodeError;
|
||||
}
|
||||
|
||||
// https://github.com/nodejs/node/blob/v10.8.0/lib/internal/errors.js
|
||||
function oneOf(expected, thing) {
|
||||
if (Array.isArray(expected)) {
|
||||
const len = expected.length;
|
||||
expected = expected.map((i) => String(i));
|
||||
if (len > 2) {
|
||||
return `one of ${thing} ${expected.slice(0, len - 1).join(', ')}, or ` +
|
||||
expected[len - 1];
|
||||
} else if (len === 2) {
|
||||
return `one of ${thing} ${expected[0]} or ${expected[1]}`;
|
||||
} else {
|
||||
return `of ${thing} ${expected[0]}`;
|
||||
}
|
||||
} else {
|
||||
return `of ${thing} ${String(expected)}`;
|
||||
}
|
||||
}
|
||||
|
||||
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/startsWith
|
||||
function startsWith(str, search, pos) {
|
||||
return str.substr(!pos || pos < 0 ? 0 : +pos, search.length) === search;
|
||||
}
|
||||
|
||||
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/endsWith
|
||||
function endsWith(str, search, this_len) {
|
||||
if (this_len === undefined || this_len > str.length) {
|
||||
this_len = str.length;
|
||||
}
|
||||
return str.substring(this_len - search.length, this_len) === search;
|
||||
}
|
||||
|
||||
// https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/includes
|
||||
function includes(str, search, start) {
|
||||
if (typeof start !== 'number') {
|
||||
start = 0;
|
||||
}
|
||||
|
||||
if (start + search.length > str.length) {
|
||||
return false;
|
||||
} else {
|
||||
return str.indexOf(search, start) !== -1;
|
||||
}
|
||||
}
|
||||
|
||||
createErrorType('ERR_INVALID_OPT_VALUE', function (name, value) {
|
||||
return 'The value "' + value + '" is invalid for option "' + name + '"'
|
||||
}, TypeError);
|
||||
createErrorType('ERR_INVALID_ARG_TYPE', function (name, expected, actual) {
|
||||
// determiner: 'must be' or 'must not be'
|
||||
let determiner;
|
||||
if (typeof expected === 'string' && startsWith(expected, 'not ')) {
|
||||
determiner = 'must not be';
|
||||
expected = expected.replace(/^not /, '');
|
||||
} else {
|
||||
determiner = 'must be';
|
||||
}
|
||||
|
||||
let msg;
|
||||
if (endsWith(name, ' argument')) {
|
||||
// For cases like 'first argument'
|
||||
msg = `The ${name} ${determiner} ${oneOf(expected, 'type')}`;
|
||||
} else {
|
||||
const type = includes(name, '.') ? 'property' : 'argument';
|
||||
msg = `The "${name}" ${type} ${determiner} ${oneOf(expected, 'type')}`;
|
||||
}
|
||||
|
||||
msg += `. Received type ${typeof actual}`;
|
||||
return msg;
|
||||
}, TypeError);
|
||||
createErrorType('ERR_STREAM_PUSH_AFTER_EOF', 'stream.push() after EOF');
|
||||
createErrorType('ERR_METHOD_NOT_IMPLEMENTED', function (name) {
|
||||
return 'The ' + name + ' method is not implemented'
|
||||
});
|
||||
createErrorType('ERR_STREAM_PREMATURE_CLOSE', 'Premature close');
|
||||
createErrorType('ERR_STREAM_DESTROYED', function (name) {
|
||||
return 'Cannot call ' + name + ' after a stream was destroyed';
|
||||
});
|
||||
createErrorType('ERR_MULTIPLE_CALLBACK', 'Callback called multiple times');
|
||||
createErrorType('ERR_STREAM_CANNOT_PIPE', 'Cannot pipe, not readable');
|
||||
createErrorType('ERR_STREAM_WRITE_AFTER_END', 'write after end');
|
||||
createErrorType('ERR_STREAM_NULL_VALUES', 'May not write null values to stream', TypeError);
|
||||
createErrorType('ERR_UNKNOWN_ENCODING', function (arg) {
|
||||
return 'Unknown encoding: ' + arg
|
||||
}, TypeError);
|
||||
createErrorType('ERR_STREAM_UNSHIFT_AFTER_END_EVENT', 'stream.unshift() after end event');
|
||||
|
||||
module.exports.codes = codes;
|
17
node_modules/bl/node_modules/readable-stream/experimentalWarning.js
generated
vendored
Normal file
|
@ -0,0 +1,17 @@
|
|||
'use strict'
|
||||
|
||||
var experimentalWarnings = new Set();
|
||||
|
||||
function emitExperimentalWarning(feature) {
|
||||
if (experimentalWarnings.has(feature)) return;
|
||||
var msg = feature + ' is an experimental feature. This feature could ' +
|
||||
'change at any time';
|
||||
experimentalWarnings.add(feature);
|
||||
process.emitWarning(msg, 'ExperimentalWarning');
|
||||
}
|
||||
|
||||
function noop() {}
|
||||
|
||||
module.exports.emitExperimentalWarning = process.emitWarning
|
||||
? emitExperimentalWarning
|
||||
: noop;
|
139
node_modules/bl/node_modules/readable-stream/lib/_stream_duplex.js
generated
vendored
Normal file
|
@ -0,0 +1,139 @@
|
|||
// Copyright Joyent, Inc. and other Node contributors.
|
||||
//
|
||||
// Permission is hereby granted, free of charge, to any person obtaining a
|
||||
// copy of this software and associated documentation files (the
|
||||
// "Software"), to deal in the Software without restriction, including
|
||||
// without limitation the rights to use, copy, modify, merge, publish,
|
||||
// distribute, sublicense, and/or sell copies of the Software, and to permit
|
||||
// persons to whom the Software is furnished to do so, subject to the
|
||||
// following conditions:
|
||||
//
|
||||
// The above copyright notice and this permission notice shall be included
|
||||
// in all copies or substantial portions of the Software.
|
||||
//
|
||||
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
|
||||
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
|
||||
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
|
||||
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
|
||||
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
|
||||
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
|
||||
// USE OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
// a duplex stream is just a stream that is both readable and writable.
|
||||
// Since JS doesn't have multiple prototypal inheritance, this class
|
||||
// prototypally inherits from Readable, and then parasitically from
|
||||
// Writable.
|
||||
'use strict';
|
||||
/*<replacement>*/
|
||||
|
||||
var objectKeys = Object.keys || function (obj) {
|
||||
var keys = [];
|
||||
|
||||
for (var key in obj) {
|
||||
keys.push(key);
|
||||
}
|
||||
|
||||
return keys;
|
||||
};
|
||||
/*</replacement>*/
|
||||
|
||||
|
||||
module.exports = Duplex;
|
||||
|
||||
var Readable = require('./_stream_readable');
|
||||
|
||||
var Writable = require('./_stream_writable');
|
||||
|
||||
require('inherits')(Duplex, Readable);
|
||||
|
||||
{
|
||||
// Allow the keys array to be GC'ed.
|
||||
var keys = objectKeys(Writable.prototype);
|
||||
|
||||
for (var v = 0; v < keys.length; v++) {
|
||||
var method = keys[v];
|
||||
if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method];
|
||||
}
|
||||
}
|
||||
|
||||
function Duplex(options) {
|
||||
if (!(this instanceof Duplex)) return new Duplex(options);
|
||||
Readable.call(this, options);
|
||||
Writable.call(this, options);
|
||||
this.allowHalfOpen = true;
|
||||
|
||||
if (options) {
|
||||
if (options.readable === false) this.readable = false;
|
||||
if (options.writable === false) this.writable = false;
|
||||
|
||||
if (options.allowHalfOpen === false) {
|
||||
this.allowHalfOpen = false;
|
||||
this.once('end', onend);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Object.defineProperty(Duplex.prototype, 'writableHighWaterMark', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
return this._writableState.highWaterMark;
|
||||
}
|
||||
});
|
||||
Object.defineProperty(Duplex.prototype, 'writableBuffer', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
return this._writableState && this._writableState.getBuffer();
|
||||
}
|
||||
});
|
||||
Object.defineProperty(Duplex.prototype, 'writableLength', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
return this._writableState.length;
|
||||
}
|
||||
}); // the no-half-open enforcer
|
||||
|
||||
function onend() {
|
||||
// If the writable side ended, then we're ok.
|
||||
if (this._writableState.ended) return; // no more data can be written.
|
||||
// But allow more writes to happen in this tick.
|
||||
|
||||
process.nextTick(onEndNT, this);
|
||||
}
|
||||
|
||||
function onEndNT(self) {
|
||||
self.end();
|
||||
}
|
||||
|
||||
Object.defineProperty(Duplex.prototype, 'destroyed', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
if (this._readableState === undefined || this._writableState === undefined) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return this._readableState.destroyed && this._writableState.destroyed;
|
||||
},
|
||||
set: function set(value) {
|
||||
// we ignore the value if the stream
|
||||
// has not been initialized yet
|
||||
if (this._readableState === undefined || this._writableState === undefined) {
|
||||
return;
|
||||
} // backward compatibility, the user is explicitly
|
||||
// managing destroyed
|
||||
|
||||
|
||||
this._readableState.destroyed = value;
|
||||
this._writableState.destroyed = value;
|
||||
}
|
||||
});
|
39
node_modules/bl/node_modules/readable-stream/lib/_stream_passthrough.js
generated
vendored
Normal file
|
@ -0,0 +1,39 @@
|
|||
// Copyright Joyent, Inc. and other Node contributors.
|
||||
//
|
||||
// Permission is hereby granted, free of charge, to any person obtaining a
|
||||
// copy of this software and associated documentation files (the
|
||||
// "Software"), to deal in the Software without restriction, including
|
||||
// without limitation the rights to use, copy, modify, merge, publish,
|
||||
// distribute, sublicense, and/or sell copies of the Software, and to permit
|
||||
// persons to whom the Software is furnished to do so, subject to the
|
||||
// following conditions:
|
||||
//
|
||||
// The above copyright notice and this permission notice shall be included
|
||||
// in all copies or substantial portions of the Software.
|
||||
//
|
||||
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
|
||||
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
|
||||
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
|
||||
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
|
||||
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
|
||||
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
|
||||
// USE OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
// a passthrough stream.
|
||||
// basically just the most minimal sort of Transform stream.
|
||||
// Every written chunk gets output as-is.
|
||||
'use strict';
|
||||
|
||||
module.exports = PassThrough;
|
||||
|
||||
var Transform = require('./_stream_transform');
|
||||
|
||||
require('inherits')(PassThrough, Transform);
|
||||
|
||||
function PassThrough(options) {
|
||||
if (!(this instanceof PassThrough)) return new PassThrough(options);
|
||||
Transform.call(this, options);
|
||||
}
|
||||
|
||||
PassThrough.prototype._transform = function (chunk, encoding, cb) {
|
||||
cb(null, chunk);
|
||||
};
|
1124
node_modules/bl/node_modules/readable-stream/lib/_stream_readable.js
generated
vendored
Normal file
File diff suppressed because it is too large
Load diff
201
node_modules/bl/node_modules/readable-stream/lib/_stream_transform.js
generated
vendored
Normal file
|
@ -0,0 +1,201 @@
|
|||
// Copyright Joyent, Inc. and other Node contributors.
|
||||
//
|
||||
// Permission is hereby granted, free of charge, to any person obtaining a
|
||||
// copy of this software and associated documentation files (the
|
||||
// "Software"), to deal in the Software without restriction, including
|
||||
// without limitation the rights to use, copy, modify, merge, publish,
|
||||
// distribute, sublicense, and/or sell copies of the Software, and to permit
|
||||
// persons to whom the Software is furnished to do so, subject to the
|
||||
// following conditions:
|
||||
//
|
||||
// The above copyright notice and this permission notice shall be included
|
||||
// in all copies or substantial portions of the Software.
|
||||
//
|
||||
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
|
||||
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
|
||||
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
|
||||
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
|
||||
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
|
||||
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
|
||||
// USE OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
// a transform stream is a readable/writable stream where you do
|
||||
// something with the data. Sometimes it's called a "filter",
|
||||
// but that's not a great name for it, since that implies a thing where
|
||||
// some bits pass through, and others are simply ignored. (That would
|
||||
// be a valid example of a transform, of course.)
|
||||
//
|
||||
// While the output is causally related to the input, it's not a
|
||||
// necessarily symmetric or synchronous transformation. For example,
|
||||
// a zlib stream might take multiple plain-text writes(), and then
|
||||
// emit a single compressed chunk some time in the future.
|
||||
//
|
||||
// Here's how this works:
|
||||
//
|
||||
// The Transform stream has all the aspects of the readable and writable
|
||||
// stream classes. When you write(chunk), that calls _write(chunk,cb)
|
||||
// internally, and returns false if there's a lot of pending writes
|
||||
// buffered up. When you call read(), that calls _read(n) until
|
||||
// there's enough pending readable data buffered up.
|
||||
//
|
||||
// In a transform stream, the written data is placed in a buffer. When
|
||||
// _read(n) is called, it transforms the queued up data, calling the
|
||||
// buffered _write cb's as it consumes chunks. If consuming a single
|
||||
// written chunk would result in multiple output chunks, then the first
|
||||
// outputted bit calls the readcb, and subsequent chunks just go into
|
||||
// the read buffer, and will cause it to emit 'readable' if necessary.
|
||||
//
|
||||
// This way, back-pressure is actually determined by the reading side,
|
||||
// since _read has to be called to start processing a new chunk. However,
|
||||
// a pathological inflate type of transform can cause excessive buffering
|
||||
// here. For example, imagine a stream where every byte of input is
|
||||
// interpreted as an integer from 0-255, and then results in that many
|
||||
// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
|
||||
// 1kb of data being output. In this case, you could write a very small
|
||||
// amount of input, and end up with a very large amount of output. In
|
||||
// such a pathological inflating mechanism, there'd be no way to tell
|
||||
// the system to stop doing the transform. A single 4MB write could
|
||||
// cause the system to run out of memory.
|
||||
//
|
||||
// However, even in such a pathological case, only a single written chunk
|
||||
// would be consumed, and then the rest would wait (un-transformed) until
|
||||
// the results of the previous transformed chunk were consumed.
|
||||
'use strict';
|
||||
|
||||
module.exports = Transform;
|
||||
|
||||
var _require$codes = require('../errors').codes,
|
||||
ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
|
||||
ERR_MULTIPLE_CALLBACK = _require$codes.ERR_MULTIPLE_CALLBACK,
|
||||
ERR_TRANSFORM_ALREADY_TRANSFORMING = _require$codes.ERR_TRANSFORM_ALREADY_TRANSFORMING,
|
||||
ERR_TRANSFORM_WITH_LENGTH_0 = _require$codes.ERR_TRANSFORM_WITH_LENGTH_0;
|
||||
|
||||
var Duplex = require('./_stream_duplex');
|
||||
|
||||
require('inherits')(Transform, Duplex);
|
||||
|
||||
function afterTransform(er, data) {
|
||||
var ts = this._transformState;
|
||||
ts.transforming = false;
|
||||
var cb = ts.writecb;
|
||||
|
||||
if (cb === null) {
|
||||
return this.emit('error', new ERR_MULTIPLE_CALLBACK());
|
||||
}
|
||||
|
||||
ts.writechunk = null;
|
||||
ts.writecb = null;
|
||||
if (data != null) // single equals check for both `null` and `undefined`
|
||||
this.push(data);
|
||||
cb(er);
|
||||
var rs = this._readableState;
|
||||
rs.reading = false;
|
||||
|
||||
if (rs.needReadable || rs.length < rs.highWaterMark) {
|
||||
this._read(rs.highWaterMark);
|
||||
}
|
||||
}
|
||||
|
||||
function Transform(options) {
|
||||
if (!(this instanceof Transform)) return new Transform(options);
|
||||
Duplex.call(this, options);
|
||||
this._transformState = {
|
||||
afterTransform: afterTransform.bind(this),
|
||||
needTransform: false,
|
||||
transforming: false,
|
||||
writecb: null,
|
||||
writechunk: null,
|
||||
writeencoding: null
|
||||
}; // start out asking for a readable event once data is transformed.
|
||||
|
||||
this._readableState.needReadable = true; // we have implemented the _read method, and done the other things
|
||||
// that Readable wants before the first _read call, so unset the
|
||||
// sync guard flag.
|
||||
|
||||
this._readableState.sync = false;
|
||||
|
||||
if (options) {
|
||||
if (typeof options.transform === 'function') this._transform = options.transform;
|
||||
if (typeof options.flush === 'function') this._flush = options.flush;
|
||||
} // When the writable side finishes, then flush out anything remaining.
|
||||
|
||||
|
||||
this.on('prefinish', prefinish);
|
||||
}
|
||||
|
||||
function prefinish() {
|
||||
var _this = this;
|
||||
|
||||
if (typeof this._flush === 'function' && !this._readableState.destroyed) {
|
||||
this._flush(function (er, data) {
|
||||
done(_this, er, data);
|
||||
});
|
||||
} else {
|
||||
done(this, null, null);
|
||||
}
|
||||
}
|
||||
|
||||
Transform.prototype.push = function (chunk, encoding) {
|
||||
this._transformState.needTransform = false;
|
||||
return Duplex.prototype.push.call(this, chunk, encoding);
|
||||
}; // This is the part where you do stuff!
|
||||
// override this function in implementation classes.
|
||||
// 'chunk' is an input chunk.
|
||||
//
|
||||
// Call `push(newChunk)` to pass along transformed output
|
||||
// to the readable side. You may call 'push' zero or more times.
|
||||
//
|
||||
// Call `cb(err)` when you are done with this chunk. If you pass
|
||||
// an error, then that'll put the hurt on the whole operation. If you
|
||||
// never call cb(), then you'll never get another chunk.
|
||||
|
||||
|
||||
Transform.prototype._transform = function (chunk, encoding, cb) {
|
||||
cb(new ERR_METHOD_NOT_IMPLEMENTED('_transform()'));
|
||||
};
|
||||
|
||||
Transform.prototype._write = function (chunk, encoding, cb) {
|
||||
var ts = this._transformState;
|
||||
ts.writecb = cb;
|
||||
ts.writechunk = chunk;
|
||||
ts.writeencoding = encoding;
|
||||
|
||||
if (!ts.transforming) {
|
||||
var rs = this._readableState;
|
||||
if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark);
|
||||
}
|
||||
}; // Doesn't matter what the args are here.
|
||||
// _transform does all the work.
|
||||
// That we got here means that the readable side wants more data.
|
||||
|
||||
|
||||
Transform.prototype._read = function (n) {
|
||||
var ts = this._transformState;
|
||||
|
||||
if (ts.writechunk !== null && !ts.transforming) {
|
||||
ts.transforming = true;
|
||||
|
||||
this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
|
||||
} else {
|
||||
// mark that we need a transform, so that any data that comes in
|
||||
// will get processed, now that we've asked for it.
|
||||
ts.needTransform = true;
|
||||
}
|
||||
};
|
||||
|
||||
Transform.prototype._destroy = function (err, cb) {
|
||||
Duplex.prototype._destroy.call(this, err, function (err2) {
|
||||
cb(err2);
|
||||
});
|
||||
};
|
||||
|
||||
function done(stream, er, data) {
|
||||
if (er) return stream.emit('error', er);
|
||||
if (data != null) // single equals check for both `null` and `undefined`
|
||||
stream.push(data); // TODO(BridgeAR): Write a test for these two error cases
|
||||
// if there's nothing in the write buffer, then that means
|
||||
// that nothing more will ever be provided
|
||||
|
||||
if (stream._writableState.length) throw new ERR_TRANSFORM_WITH_LENGTH_0();
|
||||
if (stream._transformState.transforming) throw new ERR_TRANSFORM_ALREADY_TRANSFORMING();
|
||||
return stream.push(null);
|
||||
}
|
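For reviewers skimming this vendored file, a minimal, hypothetical usage sketch of the Transform class defined above (illustrative only, not part of the diff; the `upper` stream is an invented name):

// Illustrative sketch only, not part of the vendored file.
const { Transform } = require('readable-stream');

const upper = new Transform({
  transform(chunk, encoding, callback) {
    // push transformed output to the readable side, then ask for the next chunk
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(upper).pipe(process.stdout);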
697 node_modules/bl/node_modules/readable-stream/lib/_stream_writable.js generated vendored Normal file
@@ -0,0 +1,697 @@
// Copyright Joyent, Inc. and other Node contributors.
|
||||
//
|
||||
// Permission is hereby granted, free of charge, to any person obtaining a
|
||||
// copy of this software and associated documentation files (the
|
||||
// "Software"), to deal in the Software without restriction, including
|
||||
// without limitation the rights to use, copy, modify, merge, publish,
|
||||
// distribute, sublicense, and/or sell copies of the Software, and to permit
|
||||
// persons to whom the Software is furnished to do so, subject to the
|
||||
// following conditions:
|
||||
//
|
||||
// The above copyright notice and this permission notice shall be included
|
||||
// in all copies or substantial portions of the Software.
|
||||
//
|
||||
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
|
||||
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
|
||||
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
|
||||
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
|
||||
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
|
||||
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
|
||||
// USE OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
// A bit simpler than readable streams.
|
||||
// Implement an async ._write(chunk, encoding, cb), and it'll handle all
|
||||
// the drain event emission and buffering.
|
||||
'use strict';
|
||||
|
||||
module.exports = Writable;
|
||||
/* <replacement> */
|
||||
|
||||
function WriteReq(chunk, encoding, cb) {
|
||||
this.chunk = chunk;
|
||||
this.encoding = encoding;
|
||||
this.callback = cb;
|
||||
this.next = null;
|
||||
} // It seems a linked list but it is not
|
||||
// there will be only 2 of these for each stream
|
||||
|
||||
|
||||
function CorkedRequest(state) {
|
||||
var _this = this;
|
||||
|
||||
this.next = null;
|
||||
this.entry = null;
|
||||
|
||||
this.finish = function () {
|
||||
onCorkedFinish(_this, state);
|
||||
};
|
||||
}
|
||||
/* </replacement> */
|
||||
|
||||
/*<replacement>*/
|
||||
|
||||
|
||||
var Duplex;
|
||||
/*</replacement>*/
|
||||
|
||||
Writable.WritableState = WritableState;
|
||||
/*<replacement>*/
|
||||
|
||||
var internalUtil = {
|
||||
deprecate: require('util-deprecate')
|
||||
};
|
||||
/*</replacement>*/
|
||||
|
||||
/*<replacement>*/
|
||||
|
||||
var Stream = require('./internal/streams/stream');
|
||||
/*</replacement>*/
|
||||
|
||||
|
||||
var Buffer = require('buffer').Buffer;
|
||||
|
||||
var OurUint8Array = global.Uint8Array || function () {};
|
||||
|
||||
function _uint8ArrayToBuffer(chunk) {
|
||||
return Buffer.from(chunk);
|
||||
}
|
||||
|
||||
function _isUint8Array(obj) {
|
||||
return Buffer.isBuffer(obj) || obj instanceof OurUint8Array;
|
||||
}
|
||||
|
||||
var destroyImpl = require('./internal/streams/destroy');
|
||||
|
||||
var _require = require('./internal/streams/state'),
|
||||
getHighWaterMark = _require.getHighWaterMark;
|
||||
|
||||
var _require$codes = require('../errors').codes,
|
||||
ERR_INVALID_ARG_TYPE = _require$codes.ERR_INVALID_ARG_TYPE,
|
||||
ERR_METHOD_NOT_IMPLEMENTED = _require$codes.ERR_METHOD_NOT_IMPLEMENTED,
|
||||
ERR_MULTIPLE_CALLBACK = _require$codes.ERR_MULTIPLE_CALLBACK,
|
||||
ERR_STREAM_CANNOT_PIPE = _require$codes.ERR_STREAM_CANNOT_PIPE,
|
||||
ERR_STREAM_DESTROYED = _require$codes.ERR_STREAM_DESTROYED,
|
||||
ERR_STREAM_NULL_VALUES = _require$codes.ERR_STREAM_NULL_VALUES,
|
||||
ERR_STREAM_WRITE_AFTER_END = _require$codes.ERR_STREAM_WRITE_AFTER_END,
|
||||
ERR_UNKNOWN_ENCODING = _require$codes.ERR_UNKNOWN_ENCODING;
|
||||
|
||||
var errorOrDestroy = destroyImpl.errorOrDestroy;
|
||||
|
||||
require('inherits')(Writable, Stream);
|
||||
|
||||
function nop() {}
|
||||
|
||||
function WritableState(options, stream, isDuplex) {
|
||||
Duplex = Duplex || require('./_stream_duplex');
|
||||
options = options || {}; // Duplex streams are both readable and writable, but share
|
||||
// the same options object.
|
||||
// However, some cases require setting options to different
|
||||
// values for the readable and the writable sides of the duplex stream,
|
||||
// e.g. options.readableObjectMode vs. options.writableObjectMode, etc.
|
||||
|
||||
if (typeof isDuplex !== 'boolean') isDuplex = stream instanceof Duplex; // object stream flag to indicate whether or not this stream
|
||||
// contains buffers or objects.
|
||||
|
||||
this.objectMode = !!options.objectMode;
|
||||
if (isDuplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // the point at which write() starts returning false
|
||||
// Note: 0 is a valid value, means that we always return false if
|
||||
// the entire buffer is not flushed immediately on write()
|
||||
|
||||
this.highWaterMark = getHighWaterMark(this, options, 'writableHighWaterMark', isDuplex); // if _final has been called
|
||||
|
||||
this.finalCalled = false; // drain event flag.
|
||||
|
||||
this.needDrain = false; // at the start of calling end()
|
||||
|
||||
this.ending = false; // when end() has been called, and returned
|
||||
|
||||
this.ended = false; // when 'finish' is emitted
|
||||
|
||||
this.finished = false; // has it been destroyed
|
||||
|
||||
this.destroyed = false; // should we decode strings into buffers before passing to _write?
|
||||
// this is here so that some node-core streams can optimize string
|
||||
// handling at a lower level.
|
||||
|
||||
var noDecode = options.decodeStrings === false;
|
||||
this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string
|
||||
// encoding is 'binary' so we have to make this configurable.
|
||||
// Everything else in the universe uses 'utf8', though.
|
||||
|
||||
this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement
|
||||
// of how much we're waiting to get pushed to some underlying
|
||||
// socket or file.
|
||||
|
||||
this.length = 0; // a flag to see when we're in the middle of a write.
|
||||
|
||||
this.writing = false; // when true all writes will be buffered until .uncork() call
|
||||
|
||||
this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately,
|
||||
// or on a later tick. We set this to true at first, because any
|
||||
// actions that shouldn't happen until "later" should generally also
|
||||
// not happen before the first write call.
|
||||
|
||||
this.sync = true; // a flag to know if we're processing previously buffered items, which
|
||||
// may call the _write() callback in the same tick, so that we don't
|
||||
// end up in an overlapped onwrite situation.
|
||||
|
||||
this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb)
|
||||
|
||||
this.onwrite = function (er) {
|
||||
onwrite(stream, er);
|
||||
}; // the callback that the user supplies to write(chunk,encoding,cb)
|
||||
|
||||
|
||||
this.writecb = null; // the amount that is being written when _write is called.
|
||||
|
||||
this.writelen = 0;
|
||||
this.bufferedRequest = null;
|
||||
this.lastBufferedRequest = null; // number of pending user-supplied write callbacks
|
||||
// this must be 0 before 'finish' can be emitted
|
||||
|
||||
this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs
|
||||
// This is relevant for synchronous Transform streams
|
||||
|
||||
this.prefinished = false; // True if the error was already emitted and should not be thrown again
|
||||
|
||||
this.errorEmitted = false; // Should close be emitted on destroy. Defaults to true.
|
||||
|
||||
this.emitClose = options.emitClose !== false; // Should .destroy() be called after 'finish' (and potentially 'end')
|
||||
|
||||
this.autoDestroy = !!options.autoDestroy; // count buffered requests
|
||||
|
||||
this.bufferedRequestCount = 0; // allocate the first CorkedRequest, there is always
|
||||
// one allocated and free to use, and we maintain at most two
|
||||
|
||||
this.corkedRequestsFree = new CorkedRequest(this);
|
||||
}
|
||||
|
||||
WritableState.prototype.getBuffer = function getBuffer() {
|
||||
var current = this.bufferedRequest;
|
||||
var out = [];
|
||||
|
||||
while (current) {
|
||||
out.push(current);
|
||||
current = current.next;
|
||||
}
|
||||
|
||||
return out;
|
||||
};
|
||||
|
||||
(function () {
|
||||
try {
|
||||
Object.defineProperty(WritableState.prototype, 'buffer', {
|
||||
get: internalUtil.deprecate(function writableStateBufferGetter() {
|
||||
return this.getBuffer();
|
||||
}, '_writableState.buffer is deprecated. Use _writableState.getBuffer ' + 'instead.', 'DEP0003')
|
||||
});
|
||||
} catch (_) {}
|
||||
})(); // Test _writableState for inheritance to account for Duplex streams,
|
||||
// whose prototype chain only points to Readable.
|
||||
|
||||
|
||||
var realHasInstance;
|
||||
|
||||
if (typeof Symbol === 'function' && Symbol.hasInstance && typeof Function.prototype[Symbol.hasInstance] === 'function') {
|
||||
realHasInstance = Function.prototype[Symbol.hasInstance];
|
||||
Object.defineProperty(Writable, Symbol.hasInstance, {
|
||||
value: function value(object) {
|
||||
if (realHasInstance.call(this, object)) return true;
|
||||
if (this !== Writable) return false;
|
||||
return object && object._writableState instanceof WritableState;
|
||||
}
|
||||
});
|
||||
} else {
|
||||
realHasInstance = function realHasInstance(object) {
|
||||
return object instanceof this;
|
||||
};
|
||||
}
|
||||
|
||||
function Writable(options) {
|
||||
Duplex = Duplex || require('./_stream_duplex'); // Writable ctor is applied to Duplexes, too.
|
||||
// `realHasInstance` is necessary because using plain `instanceof`
|
||||
// would return false, as no `_writableState` property is attached.
|
||||
// Trying to use the custom `instanceof` for Writable here will also break the
|
||||
// Node.js LazyTransform implementation, which has a non-trivial getter for
|
||||
// `_writableState` that would lead to infinite recursion.
|
||||
// Checking for a Stream.Duplex instance is faster here instead of inside
|
||||
// the WritableState constructor, at least with V8 6.5
|
||||
|
||||
var isDuplex = this instanceof Duplex;
|
||||
if (!isDuplex && !realHasInstance.call(Writable, this)) return new Writable(options);
|
||||
this._writableState = new WritableState(options, this, isDuplex); // legacy.
|
||||
|
||||
this.writable = true;
|
||||
|
||||
if (options) {
|
||||
if (typeof options.write === 'function') this._write = options.write;
|
||||
if (typeof options.writev === 'function') this._writev = options.writev;
|
||||
if (typeof options.destroy === 'function') this._destroy = options.destroy;
|
||||
if (typeof options.final === 'function') this._final = options.final;
|
||||
}
|
||||
|
||||
Stream.call(this);
|
||||
} // Otherwise people can pipe Writable streams, which is just wrong.
|
||||
|
||||
|
||||
Writable.prototype.pipe = function () {
|
||||
errorOrDestroy(this, new ERR_STREAM_CANNOT_PIPE());
|
||||
};
|
||||
|
||||
function writeAfterEnd(stream, cb) {
|
||||
var er = new ERR_STREAM_WRITE_AFTER_END(); // TODO: defer error events consistently everywhere, not just the cb
|
||||
|
||||
errorOrDestroy(stream, er);
|
||||
process.nextTick(cb, er);
|
||||
} // Checks that a user-supplied chunk is valid, especially for the particular
|
||||
// mode the stream is in. Currently this means that `null` is never accepted
|
||||
// and undefined/non-string values are only allowed in object mode.
|
||||
|
||||
|
||||
function validChunk(stream, state, chunk, cb) {
|
||||
var er;
|
||||
|
||||
if (chunk === null) {
|
||||
er = new ERR_STREAM_NULL_VALUES();
|
||||
} else if (typeof chunk !== 'string' && !state.objectMode) {
|
||||
er = new ERR_INVALID_ARG_TYPE('chunk', ['string', 'Buffer'], chunk);
|
||||
}
|
||||
|
||||
if (er) {
|
||||
errorOrDestroy(stream, er);
|
||||
process.nextTick(cb, er);
|
||||
return false;
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
Writable.prototype.write = function (chunk, encoding, cb) {
|
||||
var state = this._writableState;
|
||||
var ret = false;
|
||||
|
||||
var isBuf = !state.objectMode && _isUint8Array(chunk);
|
||||
|
||||
if (isBuf && !Buffer.isBuffer(chunk)) {
|
||||
chunk = _uint8ArrayToBuffer(chunk);
|
||||
}
|
||||
|
||||
if (typeof encoding === 'function') {
|
||||
cb = encoding;
|
||||
encoding = null;
|
||||
}
|
||||
|
||||
if (isBuf) encoding = 'buffer';else if (!encoding) encoding = state.defaultEncoding;
|
||||
if (typeof cb !== 'function') cb = nop;
|
||||
if (state.ending) writeAfterEnd(this, cb);else if (isBuf || validChunk(this, state, chunk, cb)) {
|
||||
state.pendingcb++;
|
||||
ret = writeOrBuffer(this, state, isBuf, chunk, encoding, cb);
|
||||
}
|
||||
return ret;
|
||||
};
|
||||
|
||||
Writable.prototype.cork = function () {
|
||||
this._writableState.corked++;
|
||||
};
|
||||
|
||||
Writable.prototype.uncork = function () {
|
||||
var state = this._writableState;
|
||||
|
||||
if (state.corked) {
|
||||
state.corked--;
|
||||
if (!state.writing && !state.corked && !state.bufferProcessing && state.bufferedRequest) clearBuffer(this, state);
|
||||
}
|
||||
};
|
||||
|
||||
Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) {
|
||||
// node::ParseEncoding() requires lower case.
|
||||
if (typeof encoding === 'string') encoding = encoding.toLowerCase();
|
||||
if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64', 'ucs2', 'ucs-2', 'utf16le', 'utf-16le', 'raw'].indexOf((encoding + '').toLowerCase()) > -1)) throw new ERR_UNKNOWN_ENCODING(encoding);
|
||||
this._writableState.defaultEncoding = encoding;
|
||||
return this;
|
||||
};
|
||||
|
||||
Object.defineProperty(Writable.prototype, 'writableBuffer', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
return this._writableState && this._writableState.getBuffer();
|
||||
}
|
||||
});
|
||||
|
||||
function decodeChunk(state, chunk, encoding) {
|
||||
if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') {
|
||||
chunk = Buffer.from(chunk, encoding);
|
||||
}
|
||||
|
||||
return chunk;
|
||||
}
|
||||
|
||||
Object.defineProperty(Writable.prototype, 'writableHighWaterMark', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
return this._writableState.highWaterMark;
|
||||
}
|
||||
}); // if we're already writing something, then just put this
|
||||
// in the queue, and wait our turn. Otherwise, call _write
|
||||
// If we return false, then we need a drain event, so set that flag.
|
||||
|
||||
function writeOrBuffer(stream, state, isBuf, chunk, encoding, cb) {
|
||||
if (!isBuf) {
|
||||
var newChunk = decodeChunk(state, chunk, encoding);
|
||||
|
||||
if (chunk !== newChunk) {
|
||||
isBuf = true;
|
||||
encoding = 'buffer';
|
||||
chunk = newChunk;
|
||||
}
|
||||
}
|
||||
|
||||
var len = state.objectMode ? 1 : chunk.length;
|
||||
state.length += len;
|
||||
var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false.
|
||||
|
||||
if (!ret) state.needDrain = true;
|
||||
|
||||
if (state.writing || state.corked) {
|
||||
var last = state.lastBufferedRequest;
|
||||
state.lastBufferedRequest = {
|
||||
chunk: chunk,
|
||||
encoding: encoding,
|
||||
isBuf: isBuf,
|
||||
callback: cb,
|
||||
next: null
|
||||
};
|
||||
|
||||
if (last) {
|
||||
last.next = state.lastBufferedRequest;
|
||||
} else {
|
||||
state.bufferedRequest = state.lastBufferedRequest;
|
||||
}
|
||||
|
||||
state.bufferedRequestCount += 1;
|
||||
} else {
|
||||
doWrite(stream, state, false, len, chunk, encoding, cb);
|
||||
}
|
||||
|
||||
return ret;
|
||||
}
|
||||
|
||||
function doWrite(stream, state, writev, len, chunk, encoding, cb) {
|
||||
state.writelen = len;
|
||||
state.writecb = cb;
|
||||
state.writing = true;
|
||||
state.sync = true;
|
||||
if (state.destroyed) state.onwrite(new ERR_STREAM_DESTROYED('write'));else if (writev) stream._writev(chunk, state.onwrite);else stream._write(chunk, encoding, state.onwrite);
|
||||
state.sync = false;
|
||||
}
|
||||
|
||||
function onwriteError(stream, state, sync, er, cb) {
|
||||
--state.pendingcb;
|
||||
|
||||
if (sync) {
|
||||
// defer the callback if we are being called synchronously
|
||||
// to avoid piling up things on the stack
|
||||
process.nextTick(cb, er); // this can emit finish, and it will always happen
|
||||
// after error
|
||||
|
||||
process.nextTick(finishMaybe, stream, state);
|
||||
stream._writableState.errorEmitted = true;
|
||||
errorOrDestroy(stream, er);
|
||||
} else {
|
||||
// the caller expect this to happen before if
|
||||
// it is async
|
||||
cb(er);
|
||||
stream._writableState.errorEmitted = true;
|
||||
errorOrDestroy(stream, er); // this can emit finish, but finish must
|
||||
// always follow error
|
||||
|
||||
finishMaybe(stream, state);
|
||||
}
|
||||
}
|
||||
|
||||
function onwriteStateUpdate(state) {
|
||||
state.writing = false;
|
||||
state.writecb = null;
|
||||
state.length -= state.writelen;
|
||||
state.writelen = 0;
|
||||
}
|
||||
|
||||
function onwrite(stream, er) {
|
||||
var state = stream._writableState;
|
||||
var sync = state.sync;
|
||||
var cb = state.writecb;
|
||||
if (typeof cb !== 'function') throw new ERR_MULTIPLE_CALLBACK();
|
||||
onwriteStateUpdate(state);
|
||||
if (er) onwriteError(stream, state, sync, er, cb);else {
|
||||
// Check if we're actually ready to finish, but don't emit yet
|
||||
var finished = needFinish(state) || stream.destroyed;
|
||||
|
||||
if (!finished && !state.corked && !state.bufferProcessing && state.bufferedRequest) {
|
||||
clearBuffer(stream, state);
|
||||
}
|
||||
|
||||
if (sync) {
|
||||
process.nextTick(afterWrite, stream, state, finished, cb);
|
||||
} else {
|
||||
afterWrite(stream, state, finished, cb);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function afterWrite(stream, state, finished, cb) {
|
||||
if (!finished) onwriteDrain(stream, state);
|
||||
state.pendingcb--;
|
||||
cb();
|
||||
finishMaybe(stream, state);
|
||||
} // Must force callback to be called on nextTick, so that we don't
|
||||
// emit 'drain' before the write() consumer gets the 'false' return
|
||||
// value, and has a chance to attach a 'drain' listener.
|
||||
|
||||
|
||||
function onwriteDrain(stream, state) {
|
||||
if (state.length === 0 && state.needDrain) {
|
||||
state.needDrain = false;
|
||||
stream.emit('drain');
|
||||
}
|
||||
} // if there's something in the buffer waiting, then process it
|
||||
|
||||
|
||||
function clearBuffer(stream, state) {
|
||||
state.bufferProcessing = true;
|
||||
var entry = state.bufferedRequest;
|
||||
|
||||
if (stream._writev && entry && entry.next) {
|
||||
// Fast case, write everything using _writev()
|
||||
var l = state.bufferedRequestCount;
|
||||
var buffer = new Array(l);
|
||||
var holder = state.corkedRequestsFree;
|
||||
holder.entry = entry;
|
||||
var count = 0;
|
||||
var allBuffers = true;
|
||||
|
||||
while (entry) {
|
||||
buffer[count] = entry;
|
||||
if (!entry.isBuf) allBuffers = false;
|
||||
entry = entry.next;
|
||||
count += 1;
|
||||
}
|
||||
|
||||
buffer.allBuffers = allBuffers;
|
||||
doWrite(stream, state, true, state.length, buffer, '', holder.finish); // doWrite is almost always async, defer these to save a bit of time
|
||||
// as the hot path ends with doWrite
|
||||
|
||||
state.pendingcb++;
|
||||
state.lastBufferedRequest = null;
|
||||
|
||||
if (holder.next) {
|
||||
state.corkedRequestsFree = holder.next;
|
||||
holder.next = null;
|
||||
} else {
|
||||
state.corkedRequestsFree = new CorkedRequest(state);
|
||||
}
|
||||
|
||||
state.bufferedRequestCount = 0;
|
||||
} else {
|
||||
// Slow case, write chunks one-by-one
|
||||
while (entry) {
|
||||
var chunk = entry.chunk;
|
||||
var encoding = entry.encoding;
|
||||
var cb = entry.callback;
|
||||
var len = state.objectMode ? 1 : chunk.length;
|
||||
doWrite(stream, state, false, len, chunk, encoding, cb);
|
||||
entry = entry.next;
|
||||
state.bufferedRequestCount--; // if we didn't call the onwrite immediately, then
|
||||
// it means that we need to wait until it does.
|
||||
// also, that means that the chunk and cb are currently
|
||||
// being processed, so move the buffer counter past them.
|
||||
|
||||
if (state.writing) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (entry === null) state.lastBufferedRequest = null;
|
||||
}
|
||||
|
||||
state.bufferedRequest = entry;
|
||||
state.bufferProcessing = false;
|
||||
}
|
||||
|
||||
Writable.prototype._write = function (chunk, encoding, cb) {
|
||||
cb(new ERR_METHOD_NOT_IMPLEMENTED('_write()'));
|
||||
};
|
||||
|
||||
Writable.prototype._writev = null;
|
||||
|
||||
Writable.prototype.end = function (chunk, encoding, cb) {
|
||||
var state = this._writableState;
|
||||
|
||||
if (typeof chunk === 'function') {
|
||||
cb = chunk;
|
||||
chunk = null;
|
||||
encoding = null;
|
||||
} else if (typeof encoding === 'function') {
|
||||
cb = encoding;
|
||||
encoding = null;
|
||||
}
|
||||
|
||||
if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); // .end() fully uncorks
|
||||
|
||||
if (state.corked) {
|
||||
state.corked = 1;
|
||||
this.uncork();
|
||||
} // ignore unnecessary end() calls.
|
||||
|
||||
|
||||
if (!state.ending) endWritable(this, state, cb);
|
||||
return this;
|
||||
};
|
||||
|
||||
Object.defineProperty(Writable.prototype, 'writableLength', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
return this._writableState.length;
|
||||
}
|
||||
});
|
||||
|
||||
function needFinish(state) {
|
||||
return state.ending && state.length === 0 && state.bufferedRequest === null && !state.finished && !state.writing;
|
||||
}
|
||||
|
||||
function callFinal(stream, state) {
|
||||
stream._final(function (err) {
|
||||
state.pendingcb--;
|
||||
|
||||
if (err) {
|
||||
errorOrDestroy(stream, err);
|
||||
}
|
||||
|
||||
state.prefinished = true;
|
||||
stream.emit('prefinish');
|
||||
finishMaybe(stream, state);
|
||||
});
|
||||
}
|
||||
|
||||
function prefinish(stream, state) {
|
||||
if (!state.prefinished && !state.finalCalled) {
|
||||
if (typeof stream._final === 'function' && !state.destroyed) {
|
||||
state.pendingcb++;
|
||||
state.finalCalled = true;
|
||||
process.nextTick(callFinal, stream, state);
|
||||
} else {
|
||||
state.prefinished = true;
|
||||
stream.emit('prefinish');
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function finishMaybe(stream, state) {
|
||||
var need = needFinish(state);
|
||||
|
||||
if (need) {
|
||||
prefinish(stream, state);
|
||||
|
||||
if (state.pendingcb === 0) {
|
||||
state.finished = true;
|
||||
stream.emit('finish');
|
||||
|
||||
if (state.autoDestroy) {
|
||||
// In case of duplex streams we need a way to detect
|
||||
// if the readable side is ready for autoDestroy as well
|
||||
var rState = stream._readableState;
|
||||
|
||||
if (!rState || rState.autoDestroy && rState.endEmitted) {
|
||||
stream.destroy();
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return need;
|
||||
}
|
||||
|
||||
function endWritable(stream, state, cb) {
|
||||
state.ending = true;
|
||||
finishMaybe(stream, state);
|
||||
|
||||
if (cb) {
|
||||
if (state.finished) process.nextTick(cb);else stream.once('finish', cb);
|
||||
}
|
||||
|
||||
state.ended = true;
|
||||
stream.writable = false;
|
||||
}
|
||||
|
||||
function onCorkedFinish(corkReq, state, err) {
|
||||
var entry = corkReq.entry;
|
||||
corkReq.entry = null;
|
||||
|
||||
while (entry) {
|
||||
var cb = entry.callback;
|
||||
state.pendingcb--;
|
||||
cb(err);
|
||||
entry = entry.next;
|
||||
} // reuse the free corkReq.
|
||||
|
||||
|
||||
state.corkedRequestsFree.next = corkReq;
|
||||
}
|
||||
|
||||
Object.defineProperty(Writable.prototype, 'destroyed', {
|
||||
// making it explicit this property is not enumerable
|
||||
// because otherwise some prototype manipulation in
|
||||
// userland will fail
|
||||
enumerable: false,
|
||||
get: function get() {
|
||||
if (this._writableState === undefined) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return this._writableState.destroyed;
|
||||
},
|
||||
set: function set(value) {
|
||||
// we ignore the value if the stream
|
||||
// has not been initialized yet
|
||||
if (!this._writableState) {
|
||||
return;
|
||||
} // backward compatibility, the user is explicitly
|
||||
// managing destroyed
|
||||
|
||||
|
||||
this._writableState.destroyed = value;
|
||||
}
|
||||
});
|
||||
Writable.prototype.destroy = destroyImpl.destroy;
|
||||
Writable.prototype._undestroy = destroyImpl.undestroy;
|
||||
|
||||
Writable.prototype._destroy = function (err, cb) {
|
||||
cb(err);
|
||||
};
|
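Likewise, a minimal, hypothetical sketch of the Writable class above, showing the write/cork/uncork surface it implements (illustrative only, not part of the diff):

// Illustrative sketch only, not part of the vendored file.
const { Writable } = require('readable-stream');

const sink = new Writable({
  write(chunk, encoding, callback) {
    console.log('wrote %d bytes', chunk.length);
    callback(); // signal that this chunk has been consumed
  }
});

sink.cork();      // buffer subsequent writes
sink.write('hello ');
sink.write('world');
sink.uncork();    // flush the buffered writes
sink.end();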
207 node_modules/bl/node_modules/readable-stream/lib/internal/streams/async_iterator.js generated vendored Normal file
@@ -0,0 +1,207 @@
'use strict';
|
||||
|
||||
var _Object$setPrototypeO;
|
||||
|
||||
function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
|
||||
|
||||
var finished = require('./end-of-stream');
|
||||
|
||||
var kLastResolve = Symbol('lastResolve');
|
||||
var kLastReject = Symbol('lastReject');
|
||||
var kError = Symbol('error');
|
||||
var kEnded = Symbol('ended');
|
||||
var kLastPromise = Symbol('lastPromise');
|
||||
var kHandlePromise = Symbol('handlePromise');
|
||||
var kStream = Symbol('stream');
|
||||
|
||||
function createIterResult(value, done) {
|
||||
return {
|
||||
value: value,
|
||||
done: done
|
||||
};
|
||||
}
|
||||
|
||||
function readAndResolve(iter) {
|
||||
var resolve = iter[kLastResolve];
|
||||
|
||||
if (resolve !== null) {
|
||||
var data = iter[kStream].read(); // we defer if data is null
|
||||
// we can be expecting either 'end' or
|
||||
// 'error'
|
||||
|
||||
if (data !== null) {
|
||||
iter[kLastPromise] = null;
|
||||
iter[kLastResolve] = null;
|
||||
iter[kLastReject] = null;
|
||||
resolve(createIterResult(data, false));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function onReadable(iter) {
|
||||
// we wait for the next tick, because it might
|
||||
// emit an error with process.nextTick
|
||||
process.nextTick(readAndResolve, iter);
|
||||
}
|
||||
|
||||
function wrapForNext(lastPromise, iter) {
|
||||
return function (resolve, reject) {
|
||||
lastPromise.then(function () {
|
||||
if (iter[kEnded]) {
|
||||
resolve(createIterResult(undefined, true));
|
||||
return;
|
||||
}
|
||||
|
||||
iter[kHandlePromise](resolve, reject);
|
||||
}, reject);
|
||||
};
|
||||
}
|
||||
|
||||
var AsyncIteratorPrototype = Object.getPrototypeOf(function () {});
|
||||
var ReadableStreamAsyncIteratorPrototype = Object.setPrototypeOf((_Object$setPrototypeO = {
|
||||
get stream() {
|
||||
return this[kStream];
|
||||
},
|
||||
|
||||
next: function next() {
|
||||
var _this = this;
|
||||
|
||||
// if we have detected an error in the meanwhile
|
||||
// reject straight away
|
||||
var error = this[kError];
|
||||
|
||||
if (error !== null) {
|
||||
return Promise.reject(error);
|
||||
}
|
||||
|
||||
if (this[kEnded]) {
|
||||
return Promise.resolve(createIterResult(undefined, true));
|
||||
}
|
||||
|
||||
if (this[kStream].destroyed) {
|
||||
// We need to defer via nextTick because if .destroy(err) is
|
||||
// called, the error will be emitted via nextTick, and
|
||||
// we cannot guarantee that there is no error lingering around
|
||||
// waiting to be emitted.
|
||||
return new Promise(function (resolve, reject) {
|
||||
process.nextTick(function () {
|
||||
if (_this[kError]) {
|
||||
reject(_this[kError]);
|
||||
} else {
|
||||
resolve(createIterResult(undefined, true));
|
||||
}
|
||||
});
|
||||
});
|
||||
} // if we have multiple next() calls
|
||||
// we will wait for the previous Promise to finish
|
||||
// this logic is optimized to support for await loops,
|
||||
// where next() is only called once at a time
|
||||
|
||||
|
||||
var lastPromise = this[kLastPromise];
|
||||
var promise;
|
||||
|
||||
if (lastPromise) {
|
||||
promise = new Promise(wrapForNext(lastPromise, this));
|
||||
} else {
|
||||
// fast path needed to support multiple this.push()
|
||||
// without triggering the next() queue
|
||||
var data = this[kStream].read();
|
||||
|
||||
if (data !== null) {
|
||||
return Promise.resolve(createIterResult(data, false));
|
||||
}
|
||||
|
||||
promise = new Promise(this[kHandlePromise]);
|
||||
}
|
||||
|
||||
this[kLastPromise] = promise;
|
||||
return promise;
|
||||
}
|
||||
}, _defineProperty(_Object$setPrototypeO, Symbol.asyncIterator, function () {
|
||||
return this;
|
||||
}), _defineProperty(_Object$setPrototypeO, "return", function _return() {
|
||||
var _this2 = this;
|
||||
|
||||
// destroy(err, cb) is a private API
|
||||
// we can guarantee we have that here, because we control the
|
||||
// Readable class this is attached to
|
||||
return new Promise(function (resolve, reject) {
|
||||
_this2[kStream].destroy(null, function (err) {
|
||||
if (err) {
|
||||
reject(err);
|
||||
return;
|
||||
}
|
||||
|
||||
resolve(createIterResult(undefined, true));
|
||||
});
|
||||
});
|
||||
}), _Object$setPrototypeO), AsyncIteratorPrototype);
|
||||
|
||||
var createReadableStreamAsyncIterator = function createReadableStreamAsyncIterator(stream) {
|
||||
var _Object$create;
|
||||
|
||||
var iterator = Object.create(ReadableStreamAsyncIteratorPrototype, (_Object$create = {}, _defineProperty(_Object$create, kStream, {
|
||||
value: stream,
|
||||
writable: true
|
||||
}), _defineProperty(_Object$create, kLastResolve, {
|
||||
value: null,
|
||||
writable: true
|
||||
}), _defineProperty(_Object$create, kLastReject, {
|
||||
value: null,
|
||||
writable: true
|
||||
}), _defineProperty(_Object$create, kError, {
|
||||
value: null,
|
||||
writable: true
|
||||
}), _defineProperty(_Object$create, kEnded, {
|
||||
value: stream._readableState.endEmitted,
|
||||
writable: true
|
||||
}), _defineProperty(_Object$create, kHandlePromise, {
|
||||
value: function value(resolve, reject) {
|
||||
var data = iterator[kStream].read();
|
||||
|
||||
if (data) {
|
||||
iterator[kLastPromise] = null;
|
||||
iterator[kLastResolve] = null;
|
||||
iterator[kLastReject] = null;
|
||||
resolve(createIterResult(data, false));
|
||||
} else {
|
||||
iterator[kLastResolve] = resolve;
|
||||
iterator[kLastReject] = reject;
|
||||
}
|
||||
},
|
||||
writable: true
|
||||
}), _Object$create));
|
||||
iterator[kLastPromise] = null;
|
||||
finished(stream, function (err) {
|
||||
if (err && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
|
||||
var reject = iterator[kLastReject]; // reject if we are waiting for data in the Promise
|
||||
// returned by next() and store the error
|
||||
|
||||
if (reject !== null) {
|
||||
iterator[kLastPromise] = null;
|
||||
iterator[kLastResolve] = null;
|
||||
iterator[kLastReject] = null;
|
||||
reject(err);
|
||||
}
|
||||
|
||||
iterator[kError] = err;
|
||||
return;
|
||||
}
|
||||
|
||||
var resolve = iterator[kLastResolve];
|
||||
|
||||
if (resolve !== null) {
|
||||
iterator[kLastPromise] = null;
|
||||
iterator[kLastResolve] = null;
|
||||
iterator[kLastReject] = null;
|
||||
resolve(createIterResult(undefined, true));
|
||||
}
|
||||
|
||||
iterator[kEnded] = true;
|
||||
});
|
||||
stream.on('readable', onReadable.bind(null, iterator));
|
||||
return iterator;
|
||||
};
|
||||
|
||||
module.exports = createReadableStreamAsyncIterator;
|
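A brief, hypothetical sketch of the async iterator installed by the module above, consumed via for await (illustrative only, not part of the diff):

// Illustrative sketch only, not part of the vendored file.
const { Readable } = require('readable-stream');

async function main() {
  const source = Readable.from(['a', 'b', 'c']);
  for await (const chunk of source) {
    console.log(chunk); // each next() resolves with { value, done } as implemented above
  }
}

main().catch(console.error);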
210 node_modules/bl/node_modules/readable-stream/lib/internal/streams/buffer_list.js generated vendored Normal file
@@ -0,0 +1,210 @@
'use strict';
|
||||
|
||||
function ownKeys(object, enumerableOnly) { var keys = Object.keys(object); if (Object.getOwnPropertySymbols) { var symbols = Object.getOwnPropertySymbols(object); if (enumerableOnly) symbols = symbols.filter(function (sym) { return Object.getOwnPropertyDescriptor(object, sym).enumerable; }); keys.push.apply(keys, symbols); } return keys; }
|
||||
|
||||
function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i] != null ? arguments[i] : {}; if (i % 2) { ownKeys(Object(source), true).forEach(function (key) { _defineProperty(target, key, source[key]); }); } else if (Object.getOwnPropertyDescriptors) { Object.defineProperties(target, Object.getOwnPropertyDescriptors(source)); } else { ownKeys(Object(source)).forEach(function (key) { Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key)); }); } } return target; }
|
||||
|
||||
function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
|
||||
|
||||
function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } }
|
||||
|
||||
function _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }
|
||||
|
||||
function _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }
|
||||
|
||||
var _require = require('buffer'),
|
||||
Buffer = _require.Buffer;
|
||||
|
||||
var _require2 = require('util'),
|
||||
inspect = _require2.inspect;
|
||||
|
||||
var custom = inspect && inspect.custom || 'inspect';
|
||||
|
||||
function copyBuffer(src, target, offset) {
|
||||
Buffer.prototype.copy.call(src, target, offset);
|
||||
}
|
||||
|
||||
module.exports =
|
||||
/*#__PURE__*/
|
||||
function () {
|
||||
function BufferList() {
|
||||
_classCallCheck(this, BufferList);
|
||||
|
||||
this.head = null;
|
||||
this.tail = null;
|
||||
this.length = 0;
|
||||
}
|
||||
|
||||
_createClass(BufferList, [{
|
||||
key: "push",
|
||||
value: function push(v) {
|
||||
var entry = {
|
||||
data: v,
|
||||
next: null
|
||||
};
|
||||
if (this.length > 0) this.tail.next = entry;else this.head = entry;
|
||||
this.tail = entry;
|
||||
++this.length;
|
||||
}
|
||||
}, {
|
||||
key: "unshift",
|
||||
value: function unshift(v) {
|
||||
var entry = {
|
||||
data: v,
|
||||
next: this.head
|
||||
};
|
||||
if (this.length === 0) this.tail = entry;
|
||||
this.head = entry;
|
||||
++this.length;
|
||||
}
|
||||
}, {
|
||||
key: "shift",
|
||||
value: function shift() {
|
||||
if (this.length === 0) return;
|
||||
var ret = this.head.data;
|
||||
if (this.length === 1) this.head = this.tail = null;else this.head = this.head.next;
|
||||
--this.length;
|
||||
return ret;
|
||||
}
|
||||
}, {
|
||||
key: "clear",
|
||||
value: function clear() {
|
||||
this.head = this.tail = null;
|
||||
this.length = 0;
|
||||
}
|
||||
}, {
|
||||
key: "join",
|
||||
value: function join(s) {
|
||||
if (this.length === 0) return '';
|
||||
var p = this.head;
|
||||
var ret = '' + p.data;
|
||||
|
||||
while (p = p.next) {
|
||||
ret += s + p.data;
|
||||
}
|
||||
|
||||
return ret;
|
||||
}
|
||||
}, {
|
||||
key: "concat",
|
||||
value: function concat(n) {
|
||||
if (this.length === 0) return Buffer.alloc(0);
|
||||
var ret = Buffer.allocUnsafe(n >>> 0);
|
||||
var p = this.head;
|
||||
var i = 0;
|
||||
|
||||
while (p) {
|
||||
copyBuffer(p.data, ret, i);
|
||||
i += p.data.length;
|
||||
p = p.next;
|
||||
}
|
||||
|
||||
return ret;
|
||||
} // Consumes a specified amount of bytes or characters from the buffered data.
|
||||
|
||||
}, {
|
||||
key: "consume",
|
||||
value: function consume(n, hasStrings) {
|
||||
var ret;
|
||||
|
||||
if (n < this.head.data.length) {
|
||||
// `slice` is the same for buffers and strings.
|
||||
ret = this.head.data.slice(0, n);
|
||||
this.head.data = this.head.data.slice(n);
|
||||
} else if (n === this.head.data.length) {
|
||||
// First chunk is a perfect match.
|
||||
ret = this.shift();
|
||||
} else {
|
||||
// Result spans more than one buffer.
|
||||
ret = hasStrings ? this._getString(n) : this._getBuffer(n);
|
||||
}
|
||||
|
||||
return ret;
|
||||
}
|
||||
}, {
|
||||
key: "first",
|
||||
value: function first() {
|
||||
return this.head.data;
|
||||
} // Consumes a specified amount of characters from the buffered data.
|
||||
|
||||
}, {
|
||||
key: "_getString",
|
||||
value: function _getString(n) {
|
||||
var p = this.head;
|
||||
var c = 1;
|
||||
var ret = p.data;
|
||||
n -= ret.length;
|
||||
|
||||
while (p = p.next) {
|
||||
var str = p.data;
|
||||
var nb = n > str.length ? str.length : n;
|
||||
if (nb === str.length) ret += str;else ret += str.slice(0, n);
|
||||
n -= nb;
|
||||
|
||||
if (n === 0) {
|
||||
if (nb === str.length) {
|
||||
++c;
|
||||
if (p.next) this.head = p.next;else this.head = this.tail = null;
|
||||
} else {
|
||||
this.head = p;
|
||||
p.data = str.slice(nb);
|
||||
}
|
||||
|
||||
break;
|
||||
}
|
||||
|
||||
++c;
|
||||
}
|
||||
|
||||
this.length -= c;
|
||||
return ret;
|
||||
} // Consumes a specified amount of bytes from the buffered data.
|
||||
|
||||
}, {
|
||||
key: "_getBuffer",
|
||||
value: function _getBuffer(n) {
|
||||
var ret = Buffer.allocUnsafe(n);
|
||||
var p = this.head;
|
||||
var c = 1;
|
||||
p.data.copy(ret);
|
||||
n -= p.data.length;
|
||||
|
||||
while (p = p.next) {
|
||||
var buf = p.data;
|
||||
var nb = n > buf.length ? buf.length : n;
|
||||
buf.copy(ret, ret.length - n, 0, nb);
|
||||
n -= nb;
|
||||
|
||||
if (n === 0) {
|
||||
if (nb === buf.length) {
|
||||
++c;
|
||||
if (p.next) this.head = p.next;else this.head = this.tail = null;
|
||||
} else {
|
||||
this.head = p;
|
||||
p.data = buf.slice(nb);
|
||||
}
|
||||
|
||||
break;
|
||||
}
|
||||
|
||||
++c;
|
||||
}
|
||||
|
||||
this.length -= c;
|
||||
return ret;
|
||||
} // Make sure the linked list only shows the minimal necessary information.
|
||||
|
||||
}, {
|
||||
key: custom,
|
||||
value: function value(_, options) {
|
||||
return inspect(this, _objectSpread({}, options, {
|
||||
// Only inspect one level.
|
||||
depth: 0,
|
||||
// It should not recurse.
|
||||
customInspect: false
|
||||
}));
|
||||
}
|
||||
}]);
|
||||
|
||||
return BufferList;
|
||||
}();
|
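A short, hypothetical sketch of the BufferList class above, the internal FIFO that readable-stream keeps its read buffer in (illustrative only, not part of the diff):

// Illustrative sketch only, not part of the vendored file.
const BufferList = require('./buffer_list'); // the module defined above

const list = new BufferList();
list.push(Buffer.from('hello '));
list.push(Buffer.from('world'));
console.log(list.length);                 // 2 entries queued
console.log(list.concat(11).toString());  // 'hello world' (11 bytes total)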
105 node_modules/bl/node_modules/readable-stream/lib/internal/streams/destroy.js generated vendored Normal file
@@ -0,0 +1,105 @@
'use strict'; // undocumented cb() API, needed for core, not for public API
|
||||
|
||||
function destroy(err, cb) {
|
||||
var _this = this;
|
||||
|
||||
var readableDestroyed = this._readableState && this._readableState.destroyed;
|
||||
var writableDestroyed = this._writableState && this._writableState.destroyed;
|
||||
|
||||
if (readableDestroyed || writableDestroyed) {
|
||||
if (cb) {
|
||||
cb(err);
|
||||
} else if (err) {
|
||||
if (!this._writableState) {
|
||||
process.nextTick(emitErrorNT, this, err);
|
||||
} else if (!this._writableState.errorEmitted) {
|
||||
this._writableState.errorEmitted = true;
|
||||
process.nextTick(emitErrorNT, this, err);
|
||||
}
|
||||
}
|
||||
|
||||
return this;
|
||||
} // we set destroyed to true before firing error callbacks in order
|
||||
// to make it re-entrance safe in case destroy() is called within callbacks
|
||||
|
||||
|
||||
if (this._readableState) {
|
||||
this._readableState.destroyed = true;
|
||||
} // if this is a duplex stream mark the writable part as destroyed as well
|
||||
|
||||
|
||||
if (this._writableState) {
|
||||
this._writableState.destroyed = true;
|
||||
}
|
||||
|
||||
this._destroy(err || null, function (err) {
|
||||
if (!cb && err) {
|
||||
if (!_this._writableState) {
|
||||
process.nextTick(emitErrorAndCloseNT, _this, err);
|
||||
} else if (!_this._writableState.errorEmitted) {
|
||||
_this._writableState.errorEmitted = true;
|
||||
process.nextTick(emitErrorAndCloseNT, _this, err);
|
||||
} else {
|
||||
process.nextTick(emitCloseNT, _this);
|
||||
}
|
||||
} else if (cb) {
|
||||
process.nextTick(emitCloseNT, _this);
|
||||
cb(err);
|
||||
} else {
|
||||
process.nextTick(emitCloseNT, _this);
|
||||
}
|
||||
});
|
||||
|
||||
return this;
|
||||
}
|
||||
|
||||
function emitErrorAndCloseNT(self, err) {
|
||||
emitErrorNT(self, err);
|
||||
emitCloseNT(self);
|
||||
}
|
||||
|
||||
function emitCloseNT(self) {
|
||||
if (self._writableState && !self._writableState.emitClose) return;
|
||||
if (self._readableState && !self._readableState.emitClose) return;
|
||||
self.emit('close');
|
||||
}
|
||||
|
||||
function undestroy() {
|
||||
if (this._readableState) {
|
||||
this._readableState.destroyed = false;
|
||||
this._readableState.reading = false;
|
||||
this._readableState.ended = false;
|
||||
this._readableState.endEmitted = false;
|
||||
}
|
||||
|
||||
if (this._writableState) {
|
||||
this._writableState.destroyed = false;
|
||||
this._writableState.ended = false;
|
||||
this._writableState.ending = false;
|
||||
this._writableState.finalCalled = false;
|
||||
this._writableState.prefinished = false;
|
||||
this._writableState.finished = false;
|
||||
this._writableState.errorEmitted = false;
|
||||
}
|
||||
}
|
||||
|
||||
function emitErrorNT(self, err) {
|
||||
self.emit('error', err);
|
||||
}
|
||||
|
||||
function errorOrDestroy(stream, err) {
|
||||
// We have tests that rely on errors being emitted
|
||||
// in the same tick, so changing this is semver major.
|
||||
// For now when you opt-in to autoDestroy we allow
|
||||
// the error to be emitted nextTick. In a future
|
||||
// semver major update we should change the default to this.
|
||||
var rState = stream._readableState;
|
||||
var wState = stream._writableState;
|
||||
if (rState && rState.autoDestroy || wState && wState.autoDestroy) stream.destroy(err);else stream.emit('error', err);
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
destroy: destroy,
|
||||
undestroy: undestroy,
|
||||
errorOrDestroy: errorOrDestroy
|
||||
};
|
104 node_modules/bl/node_modules/readable-stream/lib/internal/streams/end-of-stream.js generated vendored Normal file
@@ -0,0 +1,104 @@
// Ported from https://github.com/mafintosh/end-of-stream with
// permission from the author, Mathias Buus (@mafintosh).
'use strict';

var ERR_STREAM_PREMATURE_CLOSE = require('../../../errors').codes.ERR_STREAM_PREMATURE_CLOSE;

function once(callback) {
  var called = false;
  return function () {
    if (called) return;
    called = true;

    for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) {
      args[_key] = arguments[_key];
    }

    callback.apply(this, args);
  };
}

function noop() {}

function isRequest(stream) {
  return stream.setHeader && typeof stream.abort === 'function';
}

function eos(stream, opts, callback) {
  if (typeof opts === 'function') return eos(stream, null, opts);
  if (!opts) opts = {};
  callback = once(callback || noop);
  var readable = opts.readable || opts.readable !== false && stream.readable;
  var writable = opts.writable || opts.writable !== false && stream.writable;

  var onlegacyfinish = function onlegacyfinish() {
    if (!stream.writable) onfinish();
  };

  var writableEnded = stream._writableState && stream._writableState.finished;

  var onfinish = function onfinish() {
    writable = false;
    writableEnded = true;
    if (!readable) callback.call(stream);
  };

  var readableEnded = stream._readableState && stream._readableState.endEmitted;

  var onend = function onend() {
    readable = false;
    readableEnded = true;
    if (!writable) callback.call(stream);
  };

  var onerror = function onerror(err) {
    callback.call(stream, err);
  };

  var onclose = function onclose() {
    var err;

    if (readable && !readableEnded) {
      if (!stream._readableState || !stream._readableState.ended) err = new ERR_STREAM_PREMATURE_CLOSE();
      return callback.call(stream, err);
    }

    if (writable && !writableEnded) {
      if (!stream._writableState || !stream._writableState.ended) err = new ERR_STREAM_PREMATURE_CLOSE();
      return callback.call(stream, err);
    }
  };

  var onrequest = function onrequest() {
    stream.req.on('finish', onfinish);
  };

  if (isRequest(stream)) {
    stream.on('complete', onfinish);
    stream.on('abort', onclose);
    if (stream.req) onrequest();else stream.on('request', onrequest);
  } else if (writable && !stream._writableState) {
    // legacy streams
    stream.on('end', onlegacyfinish);
    stream.on('close', onlegacyfinish);
  }

  stream.on('end', onend);
  stream.on('finish', onfinish);
  if (opts.error !== false) stream.on('error', onerror);
  stream.on('close', onclose);
  return function () {
    stream.removeListener('complete', onfinish);
    stream.removeListener('abort', onclose);
    stream.removeListener('request', onrequest);
    if (stream.req) stream.req.removeListener('finish', onfinish);
    stream.removeListener('end', onlegacyfinish);
    stream.removeListener('close', onlegacyfinish);
    stream.removeListener('finish', onfinish);
    stream.removeListener('end', onend);
    stream.removeListener('error', onerror);
    stream.removeListener('close', onclose);
  };
}

module.exports = eos;
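A minimal, hypothetical sketch of how this helper is consumed; pipeline.js further below loads it the same way with require('./end-of-stream') (illustrative only, not part of the diff):

// Illustrative sketch only, not part of the vendored file.
const fs = require('fs');
const eos = require('./end-of-stream'); // the module defined above

const read = fs.createReadStream(__filename);
eos(read, function (err) {
  if (err) return console.error('stream failed', err);
  console.log('stream finished cleanly');
});
read.resume(); // drain the stream so 'end' (and the callback) can fire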
3 node_modules/bl/node_modules/readable-stream/lib/internal/streams/from-browser.js generated vendored Normal file
@@ -0,0 +1,3 @@
module.exports = function () {
  throw new Error('Readable.from is not available in the browser')
};
64 node_modules/bl/node_modules/readable-stream/lib/internal/streams/from.js generated vendored Normal file
@@ -0,0 +1,64 @@
'use strict';
|
||||
|
||||
function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) { try { var info = gen[key](arg); var value = info.value; } catch (error) { reject(error); return; } if (info.done) { resolve(value); } else { Promise.resolve(value).then(_next, _throw); } }
|
||||
|
||||
function _asyncToGenerator(fn) { return function () { var self = this, args = arguments; return new Promise(function (resolve, reject) { var gen = fn.apply(self, args); function _next(value) { asyncGeneratorStep(gen, resolve, reject, _next, _throw, "next", value); } function _throw(err) { asyncGeneratorStep(gen, resolve, reject, _next, _throw, "throw", err); } _next(undefined); }); }; }
|
||||
|
||||
function ownKeys(object, enumerableOnly) { var keys = Object.keys(object); if (Object.getOwnPropertySymbols) { var symbols = Object.getOwnPropertySymbols(object); if (enumerableOnly) symbols = symbols.filter(function (sym) { return Object.getOwnPropertyDescriptor(object, sym).enumerable; }); keys.push.apply(keys, symbols); } return keys; }
|
||||
|
||||
function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i] != null ? arguments[i] : {}; if (i % 2) { ownKeys(Object(source), true).forEach(function (key) { _defineProperty(target, key, source[key]); }); } else if (Object.getOwnPropertyDescriptors) { Object.defineProperties(target, Object.getOwnPropertyDescriptors(source)); } else { ownKeys(Object(source)).forEach(function (key) { Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key)); }); } } return target; }
|
||||
|
||||
function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
|
||||
|
||||
var ERR_INVALID_ARG_TYPE = require('../../../errors').codes.ERR_INVALID_ARG_TYPE;
|
||||
|
||||
function from(Readable, iterable, opts) {
|
||||
var iterator;
|
||||
|
||||
if (iterable && typeof iterable.next === 'function') {
|
||||
iterator = iterable;
|
||||
} else if (iterable && iterable[Symbol.asyncIterator]) iterator = iterable[Symbol.asyncIterator]();else if (iterable && iterable[Symbol.iterator]) iterator = iterable[Symbol.iterator]();else throw new ERR_INVALID_ARG_TYPE('iterable', ['Iterable'], iterable);
|
||||
|
||||
var readable = new Readable(_objectSpread({
|
||||
objectMode: true
|
||||
}, opts)); // Reading boolean to protect against _read
|
||||
// being called before last iteration completion.
|
||||
|
||||
var reading = false;
|
||||
|
||||
readable._read = function () {
|
||||
if (!reading) {
|
||||
reading = true;
|
||||
next();
|
||||
}
|
||||
};
|
||||
|
||||
function next() {
|
||||
return _next2.apply(this, arguments);
|
||||
}
|
||||
|
||||
function _next2() {
|
||||
_next2 = _asyncToGenerator(function* () {
|
||||
try {
|
||||
var _ref = yield iterator.next(),
|
||||
value = _ref.value,
|
||||
done = _ref.done;
|
||||
|
||||
if (done) {
|
||||
readable.push(null);
|
||||
} else if (readable.push((yield value))) {
|
||||
next();
|
||||
} else {
|
||||
reading = false;
|
||||
}
|
||||
} catch (err) {
|
||||
readable.destroy(err);
|
||||
}
|
||||
});
|
||||
return _next2.apply(this, arguments);
|
||||
}
|
||||
|
||||
return readable;
|
||||
}
|
||||
|
||||
module.exports = from;
|
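A brief, hypothetical sketch of from(), which wraps any (async) iterable in an object-mode Readable and backs the public Readable.from (illustrative only, not part of the diff):

// Illustrative sketch only, not part of the vendored file.
const { Readable } = require('readable-stream');
const from = require('./from'); // the module defined above

const stream = from(Readable, ['one', 'two', 'three']);
stream.on('data', function (chunk) {
  console.log(chunk); // each element arrives as one object-mode chunk
});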
97
node_modules/bl/node_modules/readable-stream/lib/internal/streams/pipeline.js
generated
vendored
Normal file
@@ -0,0 +1,97 @@
// Ported from https://github.com/mafintosh/pump with
// permission from the author, Mathias Buus (@mafintosh).
'use strict';

var eos;

function once(callback) {
  var called = false;
  return function () {
    if (called) return;
    called = true;
    callback.apply(void 0, arguments);
  };
}

var _require$codes = require('../../../errors').codes,
    ERR_MISSING_ARGS = _require$codes.ERR_MISSING_ARGS,
    ERR_STREAM_DESTROYED = _require$codes.ERR_STREAM_DESTROYED;

function noop(err) {
  // Rethrow the error if it exists to avoid swallowing it
  if (err) throw err;
}

function isRequest(stream) {
  return stream.setHeader && typeof stream.abort === 'function';
}

function destroyer(stream, reading, writing, callback) {
  callback = once(callback);
  var closed = false;
  stream.on('close', function () {
    closed = true;
  });
  if (eos === undefined) eos = require('./end-of-stream');
  eos(stream, {
    readable: reading,
    writable: writing
  }, function (err) {
    if (err) return callback(err);
    closed = true;
    callback();
  });
  var destroyed = false;
  return function (err) {
    if (closed) return;
    if (destroyed) return;
    destroyed = true; // request.destroy just do .end - .abort is what we want

    if (isRequest(stream)) return stream.abort();
    if (typeof stream.destroy === 'function') return stream.destroy();
    callback(err || new ERR_STREAM_DESTROYED('pipe'));
  };
}

function call(fn) {
  fn();
}

function pipe(from, to) {
  return from.pipe(to);
}

function popCallback(streams) {
  if (!streams.length) return noop;
  if (typeof streams[streams.length - 1] !== 'function') return noop;
  return streams.pop();
}

function pipeline() {
  for (var _len = arguments.length, streams = new Array(_len), _key = 0; _key < _len; _key++) {
    streams[_key] = arguments[_key];
  }

  var callback = popCallback(streams);
  if (Array.isArray(streams[0])) streams = streams[0];

  if (streams.length < 2) {
    throw new ERR_MISSING_ARGS('streams');
  }

  var error;
  var destroys = streams.map(function (stream, i) {
    var reading = i < streams.length - 1;
    var writing = i > 0;
    return destroyer(stream, reading, writing, function (err) {
      if (!error) error = err;
      if (err) destroys.forEach(call);
      if (reading) return;
      destroys.forEach(call);
      callback(error);
    });
  });
  return streams.reduce(pipe);
}

module.exports = pipeline;
27
node_modules/bl/node_modules/readable-stream/lib/internal/streams/state.js
generated
vendored
Normal file
@@ -0,0 +1,27 @@
'use strict';

var ERR_INVALID_OPT_VALUE = require('../../../errors').codes.ERR_INVALID_OPT_VALUE;

function highWaterMarkFrom(options, isDuplex, duplexKey) {
  return options.highWaterMark != null ? options.highWaterMark : isDuplex ? options[duplexKey] : null;
}

function getHighWaterMark(state, options, duplexKey, isDuplex) {
  var hwm = highWaterMarkFrom(options, isDuplex, duplexKey);

  if (hwm != null) {
    if (!(isFinite(hwm) && Math.floor(hwm) === hwm) || hwm < 0) {
      var name = isDuplex ? duplexKey : 'highWaterMark';
      throw new ERR_INVALID_OPT_VALUE(name, hwm);
    }

    return Math.floor(hwm);
  } // Default value


  return state.objectMode ? 16 : 16 * 1024;
}

module.exports = {
  getHighWaterMark: getHighWaterMark
};
1
node_modules/bl/node_modules/readable-stream/lib/internal/streams/stream-browser.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('events').EventEmitter;
1
node_modules/bl/node_modules/readable-stream/lib/internal/streams/stream.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('stream');
68
node_modules/bl/node_modules/readable-stream/package.json
generated
vendored
Normal file
@@ -0,0 +1,68 @@
{
  "name": "readable-stream",
  "version": "3.6.0",
  "description": "Streams3, a user-land copy of the stream library from Node.js",
  "main": "readable.js",
  "engines": {
    "node": ">= 6"
  },
  "dependencies": {
    "inherits": "^2.0.3",
    "string_decoder": "^1.1.1",
    "util-deprecate": "^1.0.1"
  },
  "devDependencies": {
    "@babel/cli": "^7.2.0",
    "@babel/core": "^7.2.0",
    "@babel/polyfill": "^7.0.0",
    "@babel/preset-env": "^7.2.0",
    "airtap": "0.0.9",
    "assert": "^1.4.0",
    "bl": "^2.0.0",
    "deep-strict-equal": "^0.2.0",
    "events.once": "^2.0.2",
    "glob": "^7.1.2",
    "gunzip-maybe": "^1.4.1",
    "hyperquest": "^2.1.3",
    "lolex": "^2.6.0",
    "nyc": "^11.0.0",
    "pump": "^3.0.0",
    "rimraf": "^2.6.2",
    "tap": "^12.0.0",
    "tape": "^4.9.0",
    "tar-fs": "^1.16.2",
    "util-promisify": "^2.1.0"
  },
  "scripts": {
    "test": "tap -J --no-esm test/parallel/*.js test/ours/*.js",
    "ci": "TAP=1 tap --no-esm test/parallel/*.js test/ours/*.js | tee test.tap",
    "test-browsers": "airtap --sauce-connect --loopback airtap.local -- test/browser.js",
    "test-browser-local": "airtap --open --local -- test/browser.js",
    "cover": "nyc npm test",
    "report": "nyc report --reporter=lcov",
    "update-browser-errors": "babel -o errors-browser.js errors.js"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/nodejs/readable-stream"
  },
  "keywords": [
    "readable",
    "stream",
    "pipe"
  ],
  "browser": {
    "util": false,
    "worker_threads": false,
    "./errors": "./errors-browser.js",
    "./readable.js": "./readable-browser.js",
    "./lib/internal/streams/from.js": "./lib/internal/streams/from-browser.js",
    "./lib/internal/streams/stream.js": "./lib/internal/streams/stream-browser.js"
  },
  "nyc": {
    "include": [
      "lib/**.js"
    ]
  },
  "license": "MIT"
}
9
node_modules/bl/node_modules/readable-stream/readable-browser.js
generated
vendored
Normal file
@@ -0,0 +1,9 @@
exports = module.exports = require('./lib/_stream_readable.js');
exports.Stream = exports;
exports.Readable = exports;
exports.Writable = require('./lib/_stream_writable.js');
exports.Duplex = require('./lib/_stream_duplex.js');
exports.Transform = require('./lib/_stream_transform.js');
exports.PassThrough = require('./lib/_stream_passthrough.js');
exports.finished = require('./lib/internal/streams/end-of-stream.js');
exports.pipeline = require('./lib/internal/streams/pipeline.js');
16
node_modules/bl/node_modules/readable-stream/readable.js
generated
vendored
Normal file
@@ -0,0 +1,16 @@
var Stream = require('stream');
if (process.env.READABLE_STREAM === 'disable' && Stream) {
  module.exports = Stream.Readable;
  Object.assign(module.exports, Stream);
  module.exports.Stream = Stream;
} else {
  exports = module.exports = require('./lib/_stream_readable.js');
  exports.Stream = Stream || exports;
  exports.Readable = exports;
  exports.Writable = require('./lib/_stream_writable.js');
  exports.Duplex = require('./lib/_stream_duplex.js');
  exports.Transform = require('./lib/_stream_transform.js');
  exports.PassThrough = require('./lib/_stream_passthrough.js');
  exports.finished = require('./lib/internal/streams/end-of-stream.js');
  exports.pipeline = require('./lib/internal/streams/pipeline.js');
}
37
node_modules/bl/package.json
generated
vendored
Normal file
@@ -0,0 +1,37 @@
{
  "name": "bl",
  "version": "4.1.0",
  "description": "Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!",
  "license": "MIT",
  "main": "bl.js",
  "scripts": {
    "lint": "standard *.js test/*.js",
    "test": "npm run lint && node test/test.js | faucet"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/rvagg/bl.git"
  },
  "homepage": "https://github.com/rvagg/bl",
  "authors": [
    "Rod Vagg <rod@vagg.org> (https://github.com/rvagg)",
    "Matteo Collina <matteo.collina@gmail.com> (https://github.com/mcollina)",
    "Jarett Cruger <jcrugzz@gmail.com> (https://github.com/jcrugzz)"
  ],
  "keywords": [
    "buffer",
    "buffers",
    "stream",
    "awesomesauce"
  ],
  "dependencies": {
    "buffer": "^5.5.0",
    "inherits": "^2.0.4",
    "readable-stream": "^3.4.0"
  },
  "devDependencies": {
    "faucet": "~0.0.1",
    "standard": "^14.3.0",
    "tape": "^4.11.0"
  }
}
21
node_modules/bl/test/convert.js
generated
vendored
Normal file
@@ -0,0 +1,21 @@
'use strict'

const tape = require('tape')
const { BufferList, BufferListStream } = require('../')
const { Buffer } = require('buffer')

tape('convert from BufferList to BufferListStream', (t) => {
  const data = Buffer.from(`TEST-${Date.now()}`)
  const bl = new BufferList(data)
  const bls = new BufferListStream(bl)
  t.ok(bl.slice().equals(bls.slice()))
  t.end()
})

tape('convert from BufferListStream to BufferList', (t) => {
  const data = Buffer.from(`TEST-${Date.now()}`)
  const bls = new BufferListStream(data)
  const bl = new BufferList(bls)
  t.ok(bl.slice().equals(bls.slice()))
  t.end()
})
492
node_modules/bl/test/indexOf.js
generated
vendored
Normal file
@@ -0,0 +1,492 @@
'use strict'
|
||||
|
||||
const tape = require('tape')
|
||||
const BufferList = require('../')
|
||||
const { Buffer } = require('buffer')
|
||||
|
||||
tape('indexOf single byte needle', (t) => {
|
||||
const bl = new BufferList(['abcdefg', 'abcdefg', '12345'])
|
||||
|
||||
t.equal(bl.indexOf('e'), 4)
|
||||
t.equal(bl.indexOf('e', 5), 11)
|
||||
t.equal(bl.indexOf('e', 12), -1)
|
||||
t.equal(bl.indexOf('5'), 18)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf multiple byte needle', (t) => {
|
||||
const bl = new BufferList(['abcdefg', 'abcdefg'])
|
||||
|
||||
t.equal(bl.indexOf('ef'), 4)
|
||||
t.equal(bl.indexOf('ef', 5), 11)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf multiple byte needles across buffer boundaries', (t) => {
|
||||
const bl = new BufferList(['abcdefg', 'abcdefg'])
|
||||
|
||||
t.equal(bl.indexOf('fgabc'), 5)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf takes a Uint8Array search', (t) => {
|
||||
const bl = new BufferList(['abcdefg', 'abcdefg'])
|
||||
const search = new Uint8Array([102, 103, 97, 98, 99]) // fgabc
|
||||
|
||||
t.equal(bl.indexOf(search), 5)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf takes a buffer list search', (t) => {
|
||||
const bl = new BufferList(['abcdefg', 'abcdefg'])
|
||||
const search = new BufferList('fgabc')
|
||||
|
||||
t.equal(bl.indexOf(search), 5)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf a zero byte needle', (t) => {
|
||||
const b = new BufferList('abcdef')
|
||||
const bufEmpty = Buffer.from('')
|
||||
|
||||
t.equal(b.indexOf(''), 0)
|
||||
t.equal(b.indexOf('', 1), 1)
|
||||
t.equal(b.indexOf('', b.length + 1), b.length)
|
||||
t.equal(b.indexOf('', Infinity), b.length)
|
||||
t.equal(b.indexOf(bufEmpty), 0)
|
||||
t.equal(b.indexOf(bufEmpty, 1), 1)
|
||||
t.equal(b.indexOf(bufEmpty, b.length + 1), b.length)
|
||||
t.equal(b.indexOf(bufEmpty, Infinity), b.length)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf buffers smaller and larger than the needle', (t) => {
|
||||
const bl = new BufferList(['abcdefg', 'a', 'bcdefg', 'a', 'bcfgab'])
|
||||
|
||||
t.equal(bl.indexOf('fgabc'), 5)
|
||||
t.equal(bl.indexOf('fgabc', 6), 12)
|
||||
t.equal(bl.indexOf('fgabc', 13), -1)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
// only present in node 6+
|
||||
;(process.version.substr(1).split('.')[0] >= 6) && tape('indexOf latin1 and binary encoding', (t) => {
|
||||
const b = new BufferList('abcdef')
|
||||
|
||||
// test latin1 encoding
|
||||
t.equal(
|
||||
new BufferList(Buffer.from(b.toString('latin1'), 'latin1'))
|
||||
.indexOf('d', 0, 'latin1'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from(b.toString('latin1'), 'latin1'))
|
||||
.indexOf(Buffer.from('d', 'latin1'), 0, 'latin1'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from('aa\u00e8aa', 'latin1'))
|
||||
.indexOf('\u00e8', 'latin1'),
|
||||
2
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from('\u00e8', 'latin1'))
|
||||
.indexOf('\u00e8', 'latin1'),
|
||||
0
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from('\u00e8', 'latin1'))
|
||||
.indexOf(Buffer.from('\u00e8', 'latin1'), 'latin1'),
|
||||
0
|
||||
)
|
||||
|
||||
// test binary encoding
|
||||
t.equal(
|
||||
new BufferList(Buffer.from(b.toString('binary'), 'binary'))
|
||||
.indexOf('d', 0, 'binary'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from(b.toString('binary'), 'binary'))
|
||||
.indexOf(Buffer.from('d', 'binary'), 0, 'binary'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from('aa\u00e8aa', 'binary'))
|
||||
.indexOf('\u00e8', 'binary'),
|
||||
2
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from('\u00e8', 'binary'))
|
||||
.indexOf('\u00e8', 'binary'),
|
||||
0
|
||||
)
|
||||
t.equal(
|
||||
new BufferList(Buffer.from('\u00e8', 'binary'))
|
||||
.indexOf(Buffer.from('\u00e8', 'binary'), 'binary'),
|
||||
0
|
||||
)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('indexOf the entire nodejs10 buffer test suite', (t) => {
|
||||
const b = new BufferList('abcdef')
|
||||
const bufA = Buffer.from('a')
|
||||
const bufBc = Buffer.from('bc')
|
||||
const bufF = Buffer.from('f')
|
||||
const bufZ = Buffer.from('z')
|
||||
|
||||
const stringComparison = 'abcdef'
|
||||
|
||||
t.equal(b.indexOf('a'), 0)
|
||||
t.equal(b.indexOf('a', 1), -1)
|
||||
t.equal(b.indexOf('a', -1), -1)
|
||||
t.equal(b.indexOf('a', -4), -1)
|
||||
t.equal(b.indexOf('a', -b.length), 0)
|
||||
t.equal(b.indexOf('a', NaN), 0)
|
||||
t.equal(b.indexOf('a', -Infinity), 0)
|
||||
t.equal(b.indexOf('a', Infinity), -1)
|
||||
t.equal(b.indexOf('bc'), 1)
|
||||
t.equal(b.indexOf('bc', 2), -1)
|
||||
t.equal(b.indexOf('bc', -1), -1)
|
||||
t.equal(b.indexOf('bc', -3), -1)
|
||||
t.equal(b.indexOf('bc', -5), 1)
|
||||
t.equal(b.indexOf('bc', NaN), 1)
|
||||
t.equal(b.indexOf('bc', -Infinity), 1)
|
||||
t.equal(b.indexOf('bc', Infinity), -1)
|
||||
t.equal(b.indexOf('f'), b.length - 1)
|
||||
t.equal(b.indexOf('z'), -1)
|
||||
|
||||
// empty search tests
|
||||
t.equal(b.indexOf(bufA), 0)
|
||||
t.equal(b.indexOf(bufA, 1), -1)
|
||||
t.equal(b.indexOf(bufA, -1), -1)
|
||||
t.equal(b.indexOf(bufA, -4), -1)
|
||||
t.equal(b.indexOf(bufA, -b.length), 0)
|
||||
t.equal(b.indexOf(bufA, NaN), 0)
|
||||
t.equal(b.indexOf(bufA, -Infinity), 0)
|
||||
t.equal(b.indexOf(bufA, Infinity), -1)
|
||||
t.equal(b.indexOf(bufBc), 1)
|
||||
t.equal(b.indexOf(bufBc, 2), -1)
|
||||
t.equal(b.indexOf(bufBc, -1), -1)
|
||||
t.equal(b.indexOf(bufBc, -3), -1)
|
||||
t.equal(b.indexOf(bufBc, -5), 1)
|
||||
t.equal(b.indexOf(bufBc, NaN), 1)
|
||||
t.equal(b.indexOf(bufBc, -Infinity), 1)
|
||||
t.equal(b.indexOf(bufBc, Infinity), -1)
|
||||
t.equal(b.indexOf(bufF), b.length - 1)
|
||||
t.equal(b.indexOf(bufZ), -1)
|
||||
t.equal(b.indexOf(0x61), 0)
|
||||
t.equal(b.indexOf(0x61, 1), -1)
|
||||
t.equal(b.indexOf(0x61, -1), -1)
|
||||
t.equal(b.indexOf(0x61, -4), -1)
|
||||
t.equal(b.indexOf(0x61, -b.length), 0)
|
||||
t.equal(b.indexOf(0x61, NaN), 0)
|
||||
t.equal(b.indexOf(0x61, -Infinity), 0)
|
||||
t.equal(b.indexOf(0x61, Infinity), -1)
|
||||
t.equal(b.indexOf(0x0), -1)
|
||||
|
||||
// test offsets
|
||||
t.equal(b.indexOf('d', 2), 3)
|
||||
t.equal(b.indexOf('f', 5), 5)
|
||||
t.equal(b.indexOf('f', -1), 5)
|
||||
t.equal(b.indexOf('f', 6), -1)
|
||||
|
||||
t.equal(b.indexOf(Buffer.from('d'), 2), 3)
|
||||
t.equal(b.indexOf(Buffer.from('f'), 5), 5)
|
||||
t.equal(b.indexOf(Buffer.from('f'), -1), 5)
|
||||
t.equal(b.indexOf(Buffer.from('f'), 6), -1)
|
||||
|
||||
t.equal(Buffer.from('ff').indexOf(Buffer.from('f'), 1, 'ucs2'), -1)
|
||||
|
||||
// test invalid and uppercase encoding
|
||||
t.equal(b.indexOf('b', 'utf8'), 1)
|
||||
t.equal(b.indexOf('b', 'UTF8'), 1)
|
||||
t.equal(b.indexOf('62', 'HEX'), 1)
|
||||
t.throws(() => b.indexOf('bad', 'enc'), TypeError)
|
||||
|
||||
// test hex encoding
|
||||
t.equal(
|
||||
Buffer.from(b.toString('hex'), 'hex')
|
||||
.indexOf('64', 0, 'hex'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
Buffer.from(b.toString('hex'), 'hex')
|
||||
.indexOf(Buffer.from('64', 'hex'), 0, 'hex'),
|
||||
3
|
||||
)
|
||||
|
||||
// test base64 encoding
|
||||
t.equal(
|
||||
Buffer.from(b.toString('base64'), 'base64')
|
||||
.indexOf('ZA==', 0, 'base64'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
Buffer.from(b.toString('base64'), 'base64')
|
||||
.indexOf(Buffer.from('ZA==', 'base64'), 0, 'base64'),
|
||||
3
|
||||
)
|
||||
|
||||
// test ascii encoding
|
||||
t.equal(
|
||||
Buffer.from(b.toString('ascii'), 'ascii')
|
||||
.indexOf('d', 0, 'ascii'),
|
||||
3
|
||||
)
|
||||
t.equal(
|
||||
Buffer.from(b.toString('ascii'), 'ascii')
|
||||
.indexOf(Buffer.from('d', 'ascii'), 0, 'ascii'),
|
||||
3
|
||||
)
|
||||
|
||||
// test optional offset with passed encoding
|
||||
t.equal(Buffer.from('aaaa0').indexOf('30', 'hex'), 4)
|
||||
t.equal(Buffer.from('aaaa00a').indexOf('3030', 'hex'), 4)
|
||||
|
||||
{
|
||||
// test usc2 encoding
|
||||
const twoByteString = Buffer.from('\u039a\u0391\u03a3\u03a3\u0395', 'ucs2')
|
||||
|
||||
t.equal(8, twoByteString.indexOf('\u0395', 4, 'ucs2'))
|
||||
t.equal(6, twoByteString.indexOf('\u03a3', -4, 'ucs2'))
|
||||
t.equal(4, twoByteString.indexOf('\u03a3', -6, 'ucs2'))
|
||||
t.equal(4, twoByteString.indexOf(
|
||||
Buffer.from('\u03a3', 'ucs2'), -6, 'ucs2'))
|
||||
t.equal(-1, twoByteString.indexOf('\u03a3', -2, 'ucs2'))
|
||||
}
|
||||
|
||||
const mixedByteStringUcs2 =
|
||||
Buffer.from('\u039a\u0391abc\u03a3\u03a3\u0395', 'ucs2')
|
||||
|
||||
t.equal(6, mixedByteStringUcs2.indexOf('bc', 0, 'ucs2'))
|
||||
t.equal(10, mixedByteStringUcs2.indexOf('\u03a3', 0, 'ucs2'))
|
||||
t.equal(-1, mixedByteStringUcs2.indexOf('\u0396', 0, 'ucs2'))
|
||||
|
||||
t.equal(
|
||||
6, mixedByteStringUcs2.indexOf(Buffer.from('bc', 'ucs2'), 0, 'ucs2'))
|
||||
t.equal(
|
||||
10, mixedByteStringUcs2.indexOf(Buffer.from('\u03a3', 'ucs2'), 0, 'ucs2'))
|
||||
t.equal(
|
||||
-1, mixedByteStringUcs2.indexOf(Buffer.from('\u0396', 'ucs2'), 0, 'ucs2'))
|
||||
|
||||
{
|
||||
const twoByteString = Buffer.from('\u039a\u0391\u03a3\u03a3\u0395', 'ucs2')
|
||||
|
||||
// Test single char pattern
|
||||
t.equal(0, twoByteString.indexOf('\u039a', 0, 'ucs2'))
|
||||
let index = twoByteString.indexOf('\u0391', 0, 'ucs2')
|
||||
t.equal(2, index, `Alpha - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u03a3', 0, 'ucs2')
|
||||
t.equal(4, index, `First Sigma - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u03a3', 6, 'ucs2')
|
||||
t.equal(6, index, `Second Sigma - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u0395', 0, 'ucs2')
|
||||
t.equal(8, index, `Epsilon - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u0392', 0, 'ucs2')
|
||||
t.equal(-1, index, `Not beta - at index ${index}`)
|
||||
|
||||
// Test multi-char pattern
|
||||
index = twoByteString.indexOf('\u039a\u0391', 0, 'ucs2')
|
||||
t.equal(0, index, `Lambda Alpha - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u0391\u03a3', 0, 'ucs2')
|
||||
t.equal(2, index, `Alpha Sigma - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u03a3\u03a3', 0, 'ucs2')
|
||||
t.equal(4, index, `Sigma Sigma - at index ${index}`)
|
||||
index = twoByteString.indexOf('\u03a3\u0395', 0, 'ucs2')
|
||||
t.equal(6, index, `Sigma Epsilon - at index ${index}`)
|
||||
}
|
||||
|
||||
const mixedByteStringUtf8 = Buffer.from('\u039a\u0391abc\u03a3\u03a3\u0395')
|
||||
|
||||
t.equal(5, mixedByteStringUtf8.indexOf('bc'))
|
||||
t.equal(5, mixedByteStringUtf8.indexOf('bc', 5))
|
||||
t.equal(5, mixedByteStringUtf8.indexOf('bc', -8))
|
||||
t.equal(7, mixedByteStringUtf8.indexOf('\u03a3'))
|
||||
t.equal(-1, mixedByteStringUtf8.indexOf('\u0396'))
|
||||
|
||||
// Test complex string indexOf algorithms. Only trigger for long strings.
|
||||
// Long string that isn't a simple repeat of a shorter string.
|
||||
let longString = 'A'
|
||||
for (let i = 66; i < 76; i++) { // from 'B' to 'K'
|
||||
longString = longString + String.fromCharCode(i) + longString
|
||||
}
|
||||
|
||||
const longBufferString = Buffer.from(longString)
|
||||
|
||||
// pattern of 15 chars, repeated every 16 chars in long
|
||||
let pattern = 'ABACABADABACABA'
|
||||
for (let i = 0; i < longBufferString.length - pattern.length; i += 7) {
|
||||
const index = longBufferString.indexOf(pattern, i)
|
||||
t.equal((i + 15) & ~0xf, index,
|
||||
`Long ABACABA...-string at index ${i}`)
|
||||
}
|
||||
|
||||
let index = longBufferString.indexOf('AJABACA')
|
||||
t.equal(510, index, `Long AJABACA, First J - at index ${index}`)
|
||||
index = longBufferString.indexOf('AJABACA', 511)
|
||||
t.equal(1534, index, `Long AJABACA, Second J - at index ${index}`)
|
||||
|
||||
pattern = 'JABACABADABACABA'
|
||||
index = longBufferString.indexOf(pattern)
|
||||
t.equal(511, index, `Long JABACABA..., First J - at index ${index}`)
|
||||
index = longBufferString.indexOf(pattern, 512)
|
||||
t.equal(
|
||||
1535, index, `Long JABACABA..., Second J - at index ${index}`)
|
||||
|
||||
// Search for a non-ASCII string in a pure ASCII string.
|
||||
const asciiString = Buffer.from(
|
||||
'somethingnotatallsinisterwhichalsoworks')
|
||||
t.equal(-1, asciiString.indexOf('\x2061'))
|
||||
t.equal(3, asciiString.indexOf('eth', 0))
|
||||
|
||||
// Search in string containing many non-ASCII chars.
|
||||
const allCodePoints = []
|
||||
for (let i = 0; i < 65536; i++) {
|
||||
allCodePoints[i] = i
|
||||
}
|
||||
|
||||
const allCharsString = String.fromCharCode.apply(String, allCodePoints)
|
||||
const allCharsBufferUtf8 = Buffer.from(allCharsString)
|
||||
const allCharsBufferUcs2 = Buffer.from(allCharsString, 'ucs2')
|
||||
|
||||
// Search for string long enough to trigger complex search with ASCII pattern
|
||||
// and UC16 subject.
|
||||
t.equal(-1, allCharsBufferUtf8.indexOf('notfound'))
|
||||
t.equal(-1, allCharsBufferUcs2.indexOf('notfound'))
|
||||
|
||||
// Needle is longer than haystack, but only because it's encoded as UTF-16
|
||||
t.equal(Buffer.from('aaaa').indexOf('a'.repeat(4), 'ucs2'), -1)
|
||||
|
||||
t.equal(Buffer.from('aaaa').indexOf('a'.repeat(4), 'utf8'), 0)
|
||||
t.equal(Buffer.from('aaaa').indexOf('你好', 'ucs2'), -1)
|
||||
|
||||
// Haystack has odd length, but the needle is UCS2.
|
||||
t.equal(Buffer.from('aaaaa').indexOf('b', 'ucs2'), -1)
|
||||
|
||||
{
|
||||
// Find substrings in Utf8.
|
||||
const lengths = [1, 3, 15] // Single char, simple and complex.
|
||||
const indices = [0x5, 0x60, 0x400, 0x680, 0x7ee, 0xFF02, 0x16610, 0x2f77b]
|
||||
for (let lengthIndex = 0; lengthIndex < lengths.length; lengthIndex++) {
|
||||
for (let i = 0; i < indices.length; i++) {
|
||||
const index = indices[i]
|
||||
let length = lengths[lengthIndex]
|
||||
|
||||
if (index + length > 0x7F) {
|
||||
length = 2 * length
|
||||
}
|
||||
|
||||
if (index + length > 0x7FF) {
|
||||
length = 3 * length
|
||||
}
|
||||
|
||||
if (index + length > 0xFFFF) {
|
||||
length = 4 * length
|
||||
}
|
||||
|
||||
const patternBufferUtf8 = allCharsBufferUtf8.slice(index, index + length)
|
||||
t.equal(index, allCharsBufferUtf8.indexOf(patternBufferUtf8))
|
||||
|
||||
const patternStringUtf8 = patternBufferUtf8.toString()
|
||||
t.equal(index, allCharsBufferUtf8.indexOf(patternStringUtf8))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
{
|
||||
// Find substrings in Usc2.
|
||||
const lengths = [2, 4, 16] // Single char, simple and complex.
|
||||
const indices = [0x5, 0x65, 0x105, 0x205, 0x285, 0x2005, 0x2085, 0xfff0]
|
||||
|
||||
for (let lengthIndex = 0; lengthIndex < lengths.length; lengthIndex++) {
|
||||
for (let i = 0; i < indices.length; i++) {
|
||||
const index = indices[i] * 2
|
||||
const length = lengths[lengthIndex]
|
||||
|
||||
const patternBufferUcs2 =
|
||||
allCharsBufferUcs2.slice(index, index + length)
|
||||
t.equal(
|
||||
index, allCharsBufferUcs2.indexOf(patternBufferUcs2, 0, 'ucs2'))
|
||||
|
||||
const patternStringUcs2 = patternBufferUcs2.toString('ucs2')
|
||||
t.equal(
|
||||
index, allCharsBufferUcs2.indexOf(patternStringUcs2, 0, 'ucs2'))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
[
|
||||
() => {},
|
||||
{},
|
||||
[]
|
||||
].forEach((val) => {
|
||||
t.throws(() => b.indexOf(val), TypeError, `"${JSON.stringify(val)}" should throw`)
|
||||
})
|
||||
|
||||
// Test weird offset arguments.
|
||||
// The following offsets coerce to NaN or 0, searching the whole Buffer
|
||||
t.equal(b.indexOf('b', undefined), 1)
|
||||
t.equal(b.indexOf('b', {}), 1)
|
||||
t.equal(b.indexOf('b', 0), 1)
|
||||
t.equal(b.indexOf('b', null), 1)
|
||||
t.equal(b.indexOf('b', []), 1)
|
||||
|
||||
// The following offset coerces to 2, in other words +[2] === 2
|
||||
t.equal(b.indexOf('b', [2]), -1)
|
||||
|
||||
// Behavior should match String.indexOf()
|
||||
t.equal(
|
||||
b.indexOf('b', undefined),
|
||||
stringComparison.indexOf('b', undefined))
|
||||
t.equal(
|
||||
b.indexOf('b', {}),
|
||||
stringComparison.indexOf('b', {}))
|
||||
t.equal(
|
||||
b.indexOf('b', 0),
|
||||
stringComparison.indexOf('b', 0))
|
||||
t.equal(
|
||||
b.indexOf('b', null),
|
||||
stringComparison.indexOf('b', null))
|
||||
t.equal(
|
||||
b.indexOf('b', []),
|
||||
stringComparison.indexOf('b', []))
|
||||
t.equal(
|
||||
b.indexOf('b', [2]),
|
||||
stringComparison.indexOf('b', [2]))
|
||||
|
||||
// test truncation of Number arguments to uint8
|
||||
{
|
||||
const buf = Buffer.from('this is a test')
|
||||
|
||||
t.equal(buf.indexOf(0x6973), 3)
|
||||
t.equal(buf.indexOf(0x697320), 4)
|
||||
t.equal(buf.indexOf(0x69732069), 2)
|
||||
t.equal(buf.indexOf(0x697374657374), 0)
|
||||
t.equal(buf.indexOf(0x69737374), 0)
|
||||
t.equal(buf.indexOf(0x69737465), 11)
|
||||
t.equal(buf.indexOf(0x69737465), 11)
|
||||
t.equal(buf.indexOf(-140), 0)
|
||||
t.equal(buf.indexOf(-152), 1)
|
||||
t.equal(buf.indexOf(0xff), -1)
|
||||
t.equal(buf.indexOf(0xffff), -1)
|
||||
}
|
||||
|
||||
// Test that Uint8Array arguments are okay.
|
||||
{
|
||||
const needle = new Uint8Array([0x66, 0x6f, 0x6f])
|
||||
const haystack = new BufferList(Buffer.from('a foo b foo'))
|
||||
t.equal(haystack.indexOf(needle), 2)
|
||||
}
|
||||
|
||||
t.end()
|
||||
})
|
32
node_modules/bl/test/isBufferList.js
generated
vendored
Normal file
@@ -0,0 +1,32 @@
'use strict'

const tape = require('tape')
const { BufferList, BufferListStream } = require('../')
const { Buffer } = require('buffer')

tape('isBufferList positives', (t) => {
  t.ok(BufferList.isBufferList(new BufferList()))
  t.ok(BufferList.isBufferList(new BufferListStream()))

  t.end()
})

tape('isBufferList negatives', (t) => {
  const types = [
    null,
    undefined,
    NaN,
    true,
    false,
    {},
    [],
    Buffer.alloc(0),
    [Buffer.alloc(0)]
  ]

  for (const obj of types) {
    t.notOk(BufferList.isBufferList(obj))
  }

  t.end()
})
869
node_modules/bl/test/test.js
generated
vendored
Normal file
@@ -0,0 +1,869 @@
'use strict'
|
||||
|
||||
const tape = require('tape')
|
||||
const crypto = require('crypto')
|
||||
const fs = require('fs')
|
||||
const path = require('path')
|
||||
const BufferList = require('../')
|
||||
const { Buffer } = require('buffer')
|
||||
|
||||
const encodings =
|
||||
('hex utf8 utf-8 ascii binary base64' +
|
||||
(process.browser ? '' : ' ucs2 ucs-2 utf16le utf-16le')).split(' ')
|
||||
|
||||
require('./indexOf')
|
||||
require('./isBufferList')
|
||||
require('./convert')
|
||||
|
||||
tape('single bytes from single buffer', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
|
||||
t.equal(bl.length, 4)
|
||||
t.equal(bl.get(-1), undefined)
|
||||
t.equal(bl.get(0), 97)
|
||||
t.equal(bl.get(1), 98)
|
||||
t.equal(bl.get(2), 99)
|
||||
t.equal(bl.get(3), 100)
|
||||
t.equal(bl.get(4), undefined)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('single bytes from multiple buffers', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append(Buffer.from('hi'))
|
||||
bl.append(Buffer.from('j'))
|
||||
|
||||
t.equal(bl.length, 10)
|
||||
|
||||
t.equal(bl.get(0), 97)
|
||||
t.equal(bl.get(1), 98)
|
||||
t.equal(bl.get(2), 99)
|
||||
t.equal(bl.get(3), 100)
|
||||
t.equal(bl.get(4), 101)
|
||||
t.equal(bl.get(5), 102)
|
||||
t.equal(bl.get(6), 103)
|
||||
t.equal(bl.get(7), 104)
|
||||
t.equal(bl.get(8), 105)
|
||||
t.equal(bl.get(9), 106)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('multi bytes from single buffer', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
|
||||
t.equal(bl.length, 4)
|
||||
|
||||
t.equal(bl.slice(0, 4).toString('ascii'), 'abcd')
|
||||
t.equal(bl.slice(0, 3).toString('ascii'), 'abc')
|
||||
t.equal(bl.slice(1, 4).toString('ascii'), 'bcd')
|
||||
t.equal(bl.slice(-4, -1).toString('ascii'), 'abc')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('multi bytes from single buffer (negative indexes)', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('buffer'))
|
||||
|
||||
t.equal(bl.length, 6)
|
||||
|
||||
t.equal(bl.slice(-6, -1).toString('ascii'), 'buffe')
|
||||
t.equal(bl.slice(-6, -2).toString('ascii'), 'buff')
|
||||
t.equal(bl.slice(-5, -2).toString('ascii'), 'uff')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('multiple bytes from multiple buffers', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append(Buffer.from('hi'))
|
||||
bl.append(Buffer.from('j'))
|
||||
|
||||
t.equal(bl.length, 10)
|
||||
|
||||
t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij')
|
||||
t.equal(bl.slice(3, 10).toString('ascii'), 'defghij')
|
||||
t.equal(bl.slice(3, 6).toString('ascii'), 'def')
|
||||
t.equal(bl.slice(3, 8).toString('ascii'), 'defgh')
|
||||
t.equal(bl.slice(5, 10).toString('ascii'), 'fghij')
|
||||
t.equal(bl.slice(-7, -4).toString('ascii'), 'def')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('multiple bytes from multiple buffer lists', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(new BufferList([Buffer.from('abcd'), Buffer.from('efg')]))
|
||||
bl.append(new BufferList([Buffer.from('hi'), Buffer.from('j')]))
|
||||
|
||||
t.equal(bl.length, 10)
|
||||
|
||||
t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij')
|
||||
|
||||
t.equal(bl.slice(3, 10).toString('ascii'), 'defghij')
|
||||
t.equal(bl.slice(3, 6).toString('ascii'), 'def')
|
||||
t.equal(bl.slice(3, 8).toString('ascii'), 'defgh')
|
||||
t.equal(bl.slice(5, 10).toString('ascii'), 'fghij')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
// same data as previous test, just using nested constructors
|
||||
tape('multiple bytes from crazy nested buffer lists', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(new BufferList([
|
||||
new BufferList([
|
||||
new BufferList(Buffer.from('abc')),
|
||||
Buffer.from('d'),
|
||||
new BufferList(Buffer.from('efg'))
|
||||
]),
|
||||
new BufferList([Buffer.from('hi')]),
|
||||
new BufferList(Buffer.from('j'))
|
||||
]))
|
||||
|
||||
t.equal(bl.length, 10)
|
||||
|
||||
t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij')
|
||||
|
||||
t.equal(bl.slice(3, 10).toString('ascii'), 'defghij')
|
||||
t.equal(bl.slice(3, 6).toString('ascii'), 'def')
|
||||
t.equal(bl.slice(3, 8).toString('ascii'), 'defgh')
|
||||
t.equal(bl.slice(5, 10).toString('ascii'), 'fghij')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('append accepts arrays of Buffers', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abc'))
|
||||
bl.append([Buffer.from('def')])
|
||||
bl.append([Buffer.from('ghi'), Buffer.from('jkl')])
|
||||
bl.append([Buffer.from('mnop'), Buffer.from('qrstu'), Buffer.from('vwxyz')])
|
||||
t.equal(bl.length, 26)
|
||||
t.equal(bl.slice().toString('ascii'), 'abcdefghijklmnopqrstuvwxyz')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('append accepts arrays of Uint8Arrays', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(new Uint8Array([97, 98, 99]))
|
||||
bl.append([Uint8Array.from([100, 101, 102])])
|
||||
bl.append([new Uint8Array([103, 104, 105]), new Uint8Array([106, 107, 108])])
|
||||
bl.append([new Uint8Array([109, 110, 111, 112]), new Uint8Array([113, 114, 115, 116, 117]), new Uint8Array([118, 119, 120, 121, 122])])
|
||||
t.equal(bl.length, 26)
|
||||
t.equal(bl.slice().toString('ascii'), 'abcdefghijklmnopqrstuvwxyz')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('append accepts arrays of BufferLists', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abc'))
|
||||
bl.append([new BufferList('def')])
|
||||
bl.append(new BufferList([Buffer.from('ghi'), new BufferList('jkl')]))
|
||||
bl.append([Buffer.from('mnop'), new BufferList([Buffer.from('qrstu'), Buffer.from('vwxyz')])])
|
||||
t.equal(bl.length, 26)
|
||||
t.equal(bl.slice().toString('ascii'), 'abcdefghijklmnopqrstuvwxyz')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('append chainable', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
t.ok(bl.append(Buffer.from('abcd')) === bl)
|
||||
t.ok(bl.append([Buffer.from('abcd')]) === bl)
|
||||
t.ok(bl.append(new BufferList(Buffer.from('abcd'))) === bl)
|
||||
t.ok(bl.append([new BufferList(Buffer.from('abcd'))]) === bl)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('append chainable (test results)', function (t) {
|
||||
const bl = new BufferList('abc')
|
||||
.append([new BufferList('def')])
|
||||
.append(new BufferList([Buffer.from('ghi'), new BufferList('jkl')]))
|
||||
.append([Buffer.from('mnop'), new BufferList([Buffer.from('qrstu'), Buffer.from('vwxyz')])])
|
||||
|
||||
t.equal(bl.length, 26)
|
||||
t.equal(bl.slice().toString('ascii'), 'abcdefghijklmnopqrstuvwxyz')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('consuming from multiple buffers', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append(Buffer.from('hi'))
|
||||
bl.append(Buffer.from('j'))
|
||||
|
||||
t.equal(bl.length, 10)
|
||||
|
||||
t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij')
|
||||
|
||||
bl.consume(3)
|
||||
t.equal(bl.length, 7)
|
||||
t.equal(bl.slice(0, 7).toString('ascii'), 'defghij')
|
||||
|
||||
bl.consume(2)
|
||||
t.equal(bl.length, 5)
|
||||
t.equal(bl.slice(0, 5).toString('ascii'), 'fghij')
|
||||
|
||||
bl.consume(1)
|
||||
t.equal(bl.length, 4)
|
||||
t.equal(bl.slice(0, 4).toString('ascii'), 'ghij')
|
||||
|
||||
bl.consume(1)
|
||||
t.equal(bl.length, 3)
|
||||
t.equal(bl.slice(0, 3).toString('ascii'), 'hij')
|
||||
|
||||
bl.consume(2)
|
||||
t.equal(bl.length, 1)
|
||||
t.equal(bl.slice(0, 1).toString('ascii'), 'j')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('complete consumption', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('a'))
|
||||
bl.append(Buffer.from('b'))
|
||||
|
||||
bl.consume(2)
|
||||
|
||||
t.equal(bl.length, 0)
|
||||
t.equal(bl._bufs.length, 0)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test readUInt8 / readInt8', function (t) {
|
||||
const buf1 = Buffer.alloc(1)
|
||||
const buf2 = Buffer.alloc(3)
|
||||
const buf3 = Buffer.alloc(3)
|
||||
const bl = new BufferList()
|
||||
|
||||
buf1[0] = 0x1
|
||||
buf2[1] = 0x3
|
||||
buf2[2] = 0x4
|
||||
buf3[0] = 0x23
|
||||
buf3[1] = 0x42
|
||||
|
||||
bl.append(buf1)
|
||||
bl.append(buf2)
|
||||
bl.append(buf3)
|
||||
|
||||
t.equal(bl.readUInt8(), 0x1)
|
||||
t.equal(bl.readUInt8(2), 0x3)
|
||||
t.equal(bl.readInt8(2), 0x3)
|
||||
t.equal(bl.readUInt8(3), 0x4)
|
||||
t.equal(bl.readInt8(3), 0x4)
|
||||
t.equal(bl.readUInt8(4), 0x23)
|
||||
t.equal(bl.readInt8(4), 0x23)
|
||||
t.equal(bl.readUInt8(5), 0x42)
|
||||
t.equal(bl.readInt8(5), 0x42)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test readUInt16LE / readUInt16BE / readInt16LE / readInt16BE', function (t) {
|
||||
const buf1 = Buffer.alloc(1)
|
||||
const buf2 = Buffer.alloc(3)
|
||||
const buf3 = Buffer.alloc(3)
|
||||
const bl = new BufferList()
|
||||
|
||||
buf1[0] = 0x1
|
||||
buf2[1] = 0x3
|
||||
buf2[2] = 0x4
|
||||
buf3[0] = 0x23
|
||||
buf3[1] = 0x42
|
||||
|
||||
bl.append(buf1)
|
||||
bl.append(buf2)
|
||||
bl.append(buf3)
|
||||
|
||||
t.equal(bl.readUInt16BE(), 0x0100)
|
||||
t.equal(bl.readUInt16LE(), 0x0001)
|
||||
t.equal(bl.readUInt16BE(2), 0x0304)
|
||||
t.equal(bl.readUInt16LE(2), 0x0403)
|
||||
t.equal(bl.readInt16BE(2), 0x0304)
|
||||
t.equal(bl.readInt16LE(2), 0x0403)
|
||||
t.equal(bl.readUInt16BE(3), 0x0423)
|
||||
t.equal(bl.readUInt16LE(3), 0x2304)
|
||||
t.equal(bl.readInt16BE(3), 0x0423)
|
||||
t.equal(bl.readInt16LE(3), 0x2304)
|
||||
t.equal(bl.readUInt16BE(4), 0x2342)
|
||||
t.equal(bl.readUInt16LE(4), 0x4223)
|
||||
t.equal(bl.readInt16BE(4), 0x2342)
|
||||
t.equal(bl.readInt16LE(4), 0x4223)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test readUInt32LE / readUInt32BE / readInt32LE / readInt32BE', function (t) {
|
||||
const buf1 = Buffer.alloc(1)
|
||||
const buf2 = Buffer.alloc(3)
|
||||
const buf3 = Buffer.alloc(3)
|
||||
const bl = new BufferList()
|
||||
|
||||
buf1[0] = 0x1
|
||||
buf2[1] = 0x3
|
||||
buf2[2] = 0x4
|
||||
buf3[0] = 0x23
|
||||
buf3[1] = 0x42
|
||||
|
||||
bl.append(buf1)
|
||||
bl.append(buf2)
|
||||
bl.append(buf3)
|
||||
|
||||
t.equal(bl.readUInt32BE(), 0x01000304)
|
||||
t.equal(bl.readUInt32LE(), 0x04030001)
|
||||
t.equal(bl.readUInt32BE(2), 0x03042342)
|
||||
t.equal(bl.readUInt32LE(2), 0x42230403)
|
||||
t.equal(bl.readInt32BE(2), 0x03042342)
|
||||
t.equal(bl.readInt32LE(2), 0x42230403)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test readUIntLE / readUIntBE / readIntLE / readIntBE', function (t) {
|
||||
const buf1 = Buffer.alloc(1)
|
||||
const buf2 = Buffer.alloc(3)
|
||||
const buf3 = Buffer.alloc(3)
|
||||
const bl = new BufferList()
|
||||
|
||||
buf2[0] = 0x2
|
||||
buf2[1] = 0x3
|
||||
buf2[2] = 0x4
|
||||
buf3[0] = 0x23
|
||||
buf3[1] = 0x42
|
||||
buf3[2] = 0x61
|
||||
|
||||
bl.append(buf1)
|
||||
bl.append(buf2)
|
||||
bl.append(buf3)
|
||||
|
||||
t.equal(bl.readUIntBE(1, 1), 0x02)
|
||||
t.equal(bl.readUIntBE(1, 2), 0x0203)
|
||||
t.equal(bl.readUIntBE(1, 3), 0x020304)
|
||||
t.equal(bl.readUIntBE(1, 4), 0x02030423)
|
||||
t.equal(bl.readUIntBE(1, 5), 0x0203042342)
|
||||
t.equal(bl.readUIntBE(1, 6), 0x020304234261)
|
||||
t.equal(bl.readUIntLE(1, 1), 0x02)
|
||||
t.equal(bl.readUIntLE(1, 2), 0x0302)
|
||||
t.equal(bl.readUIntLE(1, 3), 0x040302)
|
||||
t.equal(bl.readUIntLE(1, 4), 0x23040302)
|
||||
t.equal(bl.readUIntLE(1, 5), 0x4223040302)
|
||||
t.equal(bl.readUIntLE(1, 6), 0x614223040302)
|
||||
t.equal(bl.readIntBE(1, 1), 0x02)
|
||||
t.equal(bl.readIntBE(1, 2), 0x0203)
|
||||
t.equal(bl.readIntBE(1, 3), 0x020304)
|
||||
t.equal(bl.readIntBE(1, 4), 0x02030423)
|
||||
t.equal(bl.readIntBE(1, 5), 0x0203042342)
|
||||
t.equal(bl.readIntBE(1, 6), 0x020304234261)
|
||||
t.equal(bl.readIntLE(1, 1), 0x02)
|
||||
t.equal(bl.readIntLE(1, 2), 0x0302)
|
||||
t.equal(bl.readIntLE(1, 3), 0x040302)
|
||||
t.equal(bl.readIntLE(1, 4), 0x23040302)
|
||||
t.equal(bl.readIntLE(1, 5), 0x4223040302)
|
||||
t.equal(bl.readIntLE(1, 6), 0x614223040302)
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test readFloatLE / readFloatBE', function (t) {
|
||||
const buf1 = Buffer.alloc(1)
|
||||
const buf2 = Buffer.alloc(3)
|
||||
const buf3 = Buffer.alloc(3)
|
||||
const bl = new BufferList()
|
||||
|
||||
buf1[0] = 0x01
|
||||
buf2[1] = 0x00
|
||||
buf2[2] = 0x00
|
||||
buf3[0] = 0x80
|
||||
buf3[1] = 0x3f
|
||||
|
||||
bl.append(buf1)
|
||||
bl.append(buf2)
|
||||
bl.append(buf3)
|
||||
|
||||
const canonical = Buffer.concat([buf1, buf2, buf3])
|
||||
t.equal(bl.readFloatLE(), canonical.readFloatLE())
|
||||
t.equal(bl.readFloatBE(), canonical.readFloatBE())
|
||||
t.equal(bl.readFloatLE(2), canonical.readFloatLE(2))
|
||||
t.equal(bl.readFloatBE(2), canonical.readFloatBE(2))
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test readDoubleLE / readDoubleBE', function (t) {
|
||||
const buf1 = Buffer.alloc(1)
|
||||
const buf2 = Buffer.alloc(3)
|
||||
const buf3 = Buffer.alloc(10)
|
||||
const bl = new BufferList()
|
||||
|
||||
buf1[0] = 0x01
|
||||
buf2[1] = 0x55
|
||||
buf2[2] = 0x55
|
||||
buf3[0] = 0x55
|
||||
buf3[1] = 0x55
|
||||
buf3[2] = 0x55
|
||||
buf3[3] = 0x55
|
||||
buf3[4] = 0xd5
|
||||
buf3[5] = 0x3f
|
||||
|
||||
bl.append(buf1)
|
||||
bl.append(buf2)
|
||||
bl.append(buf3)
|
||||
|
||||
const canonical = Buffer.concat([buf1, buf2, buf3])
|
||||
t.equal(bl.readDoubleBE(), canonical.readDoubleBE())
|
||||
t.equal(bl.readDoubleLE(), canonical.readDoubleLE())
|
||||
t.equal(bl.readDoubleBE(2), canonical.readDoubleBE(2))
|
||||
t.equal(bl.readDoubleLE(2), canonical.readDoubleLE(2))
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test toString', function (t) {
|
||||
const bl = new BufferList()
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append(Buffer.from('hi'))
|
||||
bl.append(Buffer.from('j'))
|
||||
|
||||
t.equal(bl.toString('ascii', 0, 10), 'abcdefghij')
|
||||
t.equal(bl.toString('ascii', 3, 10), 'defghij')
|
||||
t.equal(bl.toString('ascii', 3, 6), 'def')
|
||||
t.equal(bl.toString('ascii', 3, 8), 'defgh')
|
||||
t.equal(bl.toString('ascii', 5, 10), 'fghij')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test toString encoding', function (t) {
|
||||
const bl = new BufferList()
|
||||
const b = Buffer.from('abcdefghij\xff\x00')
|
||||
|
||||
bl.append(Buffer.from('abcd'))
|
||||
bl.append(Buffer.from('efg'))
|
||||
bl.append(Buffer.from('hi'))
|
||||
bl.append(Buffer.from('j'))
|
||||
bl.append(Buffer.from('\xff\x00'))
|
||||
|
||||
encodings.forEach(function (enc) {
|
||||
t.equal(bl.toString(enc), b.toString(enc), enc)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('uninitialized memory', function (t) {
|
||||
const secret = crypto.randomBytes(256)
|
||||
for (let i = 0; i < 1e6; i++) {
|
||||
const clone = Buffer.from(secret)
|
||||
const bl = new BufferList()
|
||||
bl.append(Buffer.from('a'))
|
||||
bl.consume(-1024)
|
||||
const buf = bl.slice(1)
|
||||
if (buf.indexOf(clone) !== -1) {
|
||||
t.fail(`Match (at ${i})`)
|
||||
break
|
||||
}
|
||||
}
|
||||
t.end()
|
||||
})
|
||||
|
||||
!process.browser && tape('test stream', function (t) {
|
||||
const random = crypto.randomBytes(65534)
|
||||
|
||||
const bl = new BufferList((err, buf) => {
|
||||
t.ok(Buffer.isBuffer(buf))
|
||||
t.ok(err === null)
|
||||
t.ok(random.equals(bl.slice()))
|
||||
t.ok(random.equals(buf.slice()))
|
||||
|
||||
bl.pipe(fs.createWriteStream('/tmp/bl_test_rnd_out.dat'))
|
||||
.on('close', function () {
|
||||
const rndhash = crypto.createHash('md5').update(random).digest('hex')
|
||||
const md5sum = crypto.createHash('md5')
|
||||
const s = fs.createReadStream('/tmp/bl_test_rnd_out.dat')
|
||||
|
||||
s.on('data', md5sum.update.bind(md5sum))
|
||||
s.on('end', function () {
|
||||
t.equal(rndhash, md5sum.digest('hex'), 'woohoo! correct hash!')
|
||||
t.end()
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
fs.writeFileSync('/tmp/bl_test_rnd.dat', random)
|
||||
fs.createReadStream('/tmp/bl_test_rnd.dat').pipe(bl)
|
||||
})
|
||||
|
||||
tape('instantiation with Buffer', function (t) {
|
||||
const buf = crypto.randomBytes(1024)
|
||||
const buf2 = crypto.randomBytes(1024)
|
||||
let b = BufferList(buf)
|
||||
|
||||
t.equal(buf.toString('hex'), b.slice().toString('hex'), 'same buffer')
|
||||
b = BufferList([buf, buf2])
|
||||
t.equal(b.slice().toString('hex'), Buffer.concat([buf, buf2]).toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test String appendage', function (t) {
|
||||
const bl = new BufferList()
|
||||
const b = Buffer.from('abcdefghij\xff\x00')
|
||||
|
||||
bl.append('abcd')
|
||||
bl.append('efg')
|
||||
bl.append('hi')
|
||||
bl.append('j')
|
||||
bl.append('\xff\x00')
|
||||
|
||||
encodings.forEach(function (enc) {
|
||||
t.equal(bl.toString(enc), b.toString(enc))
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('test Number appendage', function (t) {
|
||||
const bl = new BufferList()
|
||||
const b = Buffer.from('1234567890')
|
||||
|
||||
bl.append(1234)
|
||||
bl.append(567)
|
||||
bl.append(89)
|
||||
bl.append(0)
|
||||
|
||||
encodings.forEach(function (enc) {
|
||||
t.equal(bl.toString(enc), b.toString(enc))
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('write nothing, should get empty buffer', function (t) {
|
||||
t.plan(3)
|
||||
BufferList(function (err, data) {
|
||||
t.notOk(err, 'no error')
|
||||
t.ok(Buffer.isBuffer(data), 'got a buffer')
|
||||
t.equal(0, data.length, 'got a zero-length buffer')
|
||||
t.end()
|
||||
}).end()
|
||||
})
|
||||
|
||||
tape('unicode string', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const inp1 = '\u2600'
|
||||
const inp2 = '\u2603'
|
||||
const exp = inp1 + ' and ' + inp2
|
||||
const bl = BufferList()
|
||||
|
||||
bl.write(inp1)
|
||||
bl.write(' and ')
|
||||
bl.write(inp2)
|
||||
t.equal(exp, bl.toString())
|
||||
t.equal(Buffer.from(exp).toString('hex'), bl.toString('hex'))
|
||||
})
|
||||
|
||||
tape('should emit finish', function (t) {
|
||||
const source = BufferList()
|
||||
const dest = BufferList()
|
||||
|
||||
source.write('hello')
|
||||
source.pipe(dest)
|
||||
|
||||
dest.on('finish', function () {
|
||||
t.equal(dest.toString('utf8'), 'hello')
|
||||
t.end()
|
||||
})
|
||||
})
|
||||
|
||||
tape('basic copy', function (t) {
|
||||
const buf = crypto.randomBytes(1024)
|
||||
const buf2 = Buffer.alloc(1024)
|
||||
const b = BufferList(buf)
|
||||
|
||||
b.copy(buf2)
|
||||
t.equal(b.slice().toString('hex'), buf2.toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('copy after many appends', function (t) {
|
||||
const buf = crypto.randomBytes(512)
|
||||
const buf2 = Buffer.alloc(1024)
|
||||
const b = BufferList(buf)
|
||||
|
||||
b.append(buf)
|
||||
b.copy(buf2)
|
||||
t.equal(b.slice().toString('hex'), buf2.toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('copy at a precise position', function (t) {
|
||||
const buf = crypto.randomBytes(1004)
|
||||
const buf2 = Buffer.alloc(1024)
|
||||
const b = BufferList(buf)
|
||||
|
||||
b.copy(buf2, 20)
|
||||
t.equal(b.slice().toString('hex'), buf2.slice(20).toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('copy starting from a precise location', function (t) {
|
||||
const buf = crypto.randomBytes(10)
|
||||
const buf2 = Buffer.alloc(5)
|
||||
const b = BufferList(buf)
|
||||
|
||||
b.copy(buf2, 0, 5)
|
||||
t.equal(b.slice(5).toString('hex'), buf2.toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('copy in an interval', function (t) {
|
||||
const rnd = crypto.randomBytes(10)
|
||||
const b = BufferList(rnd) // put the random bytes there
|
||||
const actual = Buffer.alloc(3)
|
||||
const expected = Buffer.alloc(3)
|
||||
|
||||
rnd.copy(expected, 0, 5, 8)
|
||||
b.copy(actual, 0, 5, 8)
|
||||
|
||||
t.equal(actual.toString('hex'), expected.toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('copy an interval between two buffers', function (t) {
|
||||
const buf = crypto.randomBytes(10)
|
||||
const buf2 = Buffer.alloc(10)
|
||||
const b = BufferList(buf)
|
||||
|
||||
b.append(buf)
|
||||
b.copy(buf2, 0, 5, 15)
|
||||
|
||||
t.equal(b.slice(5, 15).toString('hex'), buf2.toString('hex'), 'same buffer')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('shallow slice across buffer boundaries', function (t) {
|
||||
const bl = new BufferList(['First', 'Second', 'Third'])
|
||||
|
||||
t.equal(bl.shallowSlice(3, 13).toString(), 'stSecondTh')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('shallow slice within single buffer', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const bl = new BufferList(['First', 'Second', 'Third'])
|
||||
|
||||
t.equal(bl.shallowSlice(5, 10).toString(), 'Secon')
|
||||
t.equal(bl.shallowSlice(7, 10).toString(), 'con')
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tape('shallow slice single buffer', function (t) {
|
||||
t.plan(3)
|
||||
|
||||
const bl = new BufferList(['First', 'Second', 'Third'])
|
||||
|
||||
t.equal(bl.shallowSlice(0, 5).toString(), 'First')
|
||||
t.equal(bl.shallowSlice(5, 11).toString(), 'Second')
|
||||
t.equal(bl.shallowSlice(11, 16).toString(), 'Third')
|
||||
})
|
||||
|
||||
tape('shallow slice with negative or omitted indices', function (t) {
|
||||
t.plan(4)
|
||||
|
||||
const bl = new BufferList(['First', 'Second', 'Third'])
|
||||
|
||||
t.equal(bl.shallowSlice().toString(), 'FirstSecondThird')
|
||||
t.equal(bl.shallowSlice(5).toString(), 'SecondThird')
|
||||
t.equal(bl.shallowSlice(5, -3).toString(), 'SecondTh')
|
||||
t.equal(bl.shallowSlice(-8).toString(), 'ondThird')
|
||||
})
|
||||
|
||||
tape('shallow slice does not make a copy', function (t) {
|
||||
t.plan(1)
|
||||
|
||||
const buffers = [Buffer.from('First'), Buffer.from('Second'), Buffer.from('Third')]
|
||||
const bl = (new BufferList(buffers)).shallowSlice(5, -3)
|
||||
|
||||
buffers[1].fill('h')
|
||||
buffers[2].fill('h')
|
||||
|
||||
t.equal(bl.toString(), 'hhhhhhhh')
|
||||
})
|
||||
|
||||
tape('shallow slice with 0 length', function (t) {
|
||||
t.plan(1)
|
||||
|
||||
const buffers = [Buffer.from('First'), Buffer.from('Second'), Buffer.from('Third')]
|
||||
const bl = (new BufferList(buffers)).shallowSlice(0, 0)
|
||||
|
||||
t.equal(bl.length, 0)
|
||||
})
|
||||
|
||||
tape('shallow slice with 0 length from middle', function (t) {
|
||||
t.plan(1)
|
||||
|
||||
const buffers = [Buffer.from('First'), Buffer.from('Second'), Buffer.from('Third')]
|
||||
const bl = (new BufferList(buffers)).shallowSlice(10, 10)
|
||||
|
||||
t.equal(bl.length, 0)
|
||||
})
|
||||
|
||||
tape('duplicate', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const bl = new BufferList('abcdefghij\xff\x00')
|
||||
const dup = bl.duplicate()
|
||||
|
||||
t.equal(bl.prototype, dup.prototype)
|
||||
t.equal(bl.toString('hex'), dup.toString('hex'))
|
||||
})
|
||||
|
||||
tape('destroy no pipe', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const bl = new BufferList('alsdkfja;lsdkfja;lsdk')
|
||||
|
||||
bl.destroy()
|
||||
|
||||
t.equal(bl._bufs.length, 0)
|
||||
t.equal(bl.length, 0)
|
||||
})
|
||||
|
||||
tape('destroy with error', function (t) {
|
||||
t.plan(3)
|
||||
|
||||
const bl = new BufferList('alsdkfja;lsdkfja;lsdk')
|
||||
const err = new Error('kaboom')
|
||||
|
||||
bl.destroy(err)
|
||||
bl.on('error', function (_err) {
|
||||
t.equal(_err, err)
|
||||
})
|
||||
|
||||
t.equal(bl._bufs.length, 0)
|
||||
t.equal(bl.length, 0)
|
||||
})
|
||||
|
||||
!process.browser && tape('destroy with pipe before read end', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const bl = new BufferList()
|
||||
fs.createReadStream(path.join(__dirname, '/test.js'))
|
||||
.pipe(bl)
|
||||
|
||||
bl.destroy()
|
||||
|
||||
t.equal(bl._bufs.length, 0)
|
||||
t.equal(bl.length, 0)
|
||||
})
|
||||
|
||||
!process.browser && tape('destroy with pipe before read end with race', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const bl = new BufferList()
|
||||
|
||||
fs.createReadStream(path.join(__dirname, '/test.js'))
|
||||
.pipe(bl)
|
||||
|
||||
setTimeout(function () {
|
||||
bl.destroy()
|
||||
setTimeout(function () {
|
||||
t.equal(bl._bufs.length, 0)
|
||||
t.equal(bl.length, 0)
|
||||
}, 500)
|
||||
}, 500)
|
||||
})
|
||||
|
||||
!process.browser && tape('destroy with pipe after read end', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
const bl = new BufferList()
|
||||
|
||||
fs.createReadStream(path.join(__dirname, '/test.js'))
|
||||
.on('end', onEnd)
|
||||
.pipe(bl)
|
||||
|
||||
function onEnd () {
|
||||
bl.destroy()
|
||||
|
||||
t.equal(bl._bufs.length, 0)
|
||||
t.equal(bl.length, 0)
|
||||
}
|
||||
})
|
||||
|
||||
!process.browser && tape('destroy with pipe while writing to a destination', function (t) {
|
||||
t.plan(4)
|
||||
|
||||
const bl = new BufferList()
|
||||
const ds = new BufferList()
|
||||
|
||||
fs.createReadStream(path.join(__dirname, '/test.js'))
|
||||
.on('end', onEnd)
|
||||
.pipe(bl)
|
||||
|
||||
function onEnd () {
|
||||
bl.pipe(ds)
|
||||
|
||||
setTimeout(function () {
|
||||
bl.destroy()
|
||||
|
||||
t.equals(bl._bufs.length, 0)
|
||||
t.equals(bl.length, 0)
|
||||
|
||||
ds.destroy()
|
||||
|
||||
t.equals(bl._bufs.length, 0)
|
||||
t.equals(bl.length, 0)
|
||||
}, 100)
|
||||
}
|
||||
})
|
||||
|
||||
!process.browser && tape('handle error', function (t) {
|
||||
t.plan(2)
|
||||
|
||||
fs.createReadStream('/does/not/exist').pipe(BufferList(function (err, data) {
|
||||
t.ok(err instanceof Error, 'has error')
|
||||
t.notOk(data, 'no data')
|
||||
}))
|
||||
})
|
70
node_modules/buffer/AUTHORS.md
generated
vendored
Normal file
@@ -0,0 +1,70 @@
# Authors

#### Ordered by first contribution.

- Romain Beauxis (toots@rastageeks.org)
- Tobias Koppers (tobias.koppers@googlemail.com)
- Janus (ysangkok@gmail.com)
- Rainer Dreyer (rdrey1@gmail.com)
- Tõnis Tiigi (tonistiigi@gmail.com)
- James Halliday (mail@substack.net)
- Michael Williamson (mike@zwobble.org)
- elliottcable (github@elliottcable.name)
- rafael (rvalle@livelens.net)
- Andrew Kelley (superjoe30@gmail.com)
- Andreas Madsen (amwebdk@gmail.com)
- Mike Brevoort (mike.brevoort@pearson.com)
- Brian White (mscdex@mscdex.net)
- Feross Aboukhadijeh (feross@feross.org)
- Ruben Verborgh (ruben@verborgh.org)
- eliang (eliang.cs@gmail.com)
- Jesse Tane (jesse.tane@gmail.com)
- Alfonso Boza (alfonso@cloud.com)
- Mathias Buus (mathiasbuus@gmail.com)
- Devon Govett (devongovett@gmail.com)
- Daniel Cousens (github@dcousens.com)
- Joseph Dykstra (josephdykstra@gmail.com)
- Parsha Pourkhomami (parshap+git@gmail.com)
- Damjan Košir (damjan.kosir@gmail.com)
- daverayment (dave.rayment@gmail.com)
- kawanet (u-suke@kawa.net)
- Linus Unnebäck (linus@folkdatorn.se)
- Nolan Lawson (nolan.lawson@gmail.com)
- Calvin Metcalf (calvin.metcalf@gmail.com)
- Koki Takahashi (hakatasiloving@gmail.com)
- Guy Bedford (guybedford@gmail.com)
- Jan Schär (jscissr@gmail.com)
- RaulTsc (tomescu.raul@gmail.com)
- Matthieu Monsch (monsch@alum.mit.edu)
- Dan Ehrenberg (littledan@chromium.org)
- Kirill Fomichev (fanatid@ya.ru)
- Yusuke Kawasaki (u-suke@kawa.net)
- DC (dcposch@dcpos.ch)
- John-David Dalton (john.david.dalton@gmail.com)
- adventure-yunfei (adventure030@gmail.com)
- Emil Bay (github@tixz.dk)
- Sam Sudar (sudar.sam@gmail.com)
- Volker Mische (volker.mische@gmail.com)
- David Walton (support@geekstocks.com)
- Сковорода Никита Андреевич (chalkerx@gmail.com)
- greenkeeper[bot] (greenkeeper[bot]@users.noreply.github.com)
- ukstv (sergey.ukustov@machinomy.com)
- Renée Kooi (renee@kooi.me)
- ranbochen (ranbochen@qq.com)
- Vladimir Borovik (bobahbdb@gmail.com)
- greenkeeper[bot] (23040076+greenkeeper[bot]@users.noreply.github.com)
- kumavis (aaron@kumavis.me)
- Sergey Ukustov (sergey.ukustov@machinomy.com)
- Fei Liu (liu.feiwood@gmail.com)
- Blaine Bublitz (blaine.bublitz@gmail.com)
- clement (clement@seald.io)
- Koushik Dutta (koushd@gmail.com)
- Jordan Harband (ljharb@gmail.com)
- Niklas Mischkulnig (mischnic@users.noreply.github.com)
- Nikolai Vavilov (vvnicholas@gmail.com)
- Fedor Nezhivoi (gyzerok@users.noreply.github.com)
- Peter Newman (peternewman@users.noreply.github.com)
- mathmakgakpak (44949126+mathmakgakpak@users.noreply.github.com)
- jkkang (jkkang@smartauth.kr)

#### Generated by bin/update-authors.sh.
21 node_modules/buffer/LICENSE generated vendored Normal file
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) Feross Aboukhadijeh, and other contributors.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
410 node_modules/buffer/README.md generated vendored Normal file
@@ -0,0 +1,410 @@
# buffer [![travis][travis-image]][travis-url] [![npm][npm-image]][npm-url] [![downloads][downloads-image]][downloads-url] [![javascript style guide][standard-image]][standard-url]

[travis-image]: https://img.shields.io/travis/feross/buffer/master.svg
[travis-url]: https://travis-ci.org/feross/buffer
[npm-image]: https://img.shields.io/npm/v/buffer.svg
[npm-url]: https://npmjs.org/package/buffer
[downloads-image]: https://img.shields.io/npm/dm/buffer.svg
[downloads-url]: https://npmjs.org/package/buffer
[standard-image]: https://img.shields.io/badge/code_style-standard-brightgreen.svg
[standard-url]: https://standardjs.com

#### The buffer module from [node.js](https://nodejs.org/), for the browser.

[![saucelabs][saucelabs-image]][saucelabs-url]

[saucelabs-image]: https://saucelabs.com/browser-matrix/buffer.svg
[saucelabs-url]: https://saucelabs.com/u/buffer

With [browserify](http://browserify.org), simply `require('buffer')` or use the `Buffer` global and you will get this module.

The goal is to provide an API that is 100% identical to
[node's Buffer API](https://nodejs.org/api/buffer.html). Read the
[official docs](https://nodejs.org/api/buffer.html) for the full list of properties,
instance methods, and class methods that are supported.

## features

- Manipulate binary data like a boss, in all browsers!
- Super fast. Backed by Typed Arrays (`Uint8Array`/`ArrayBuffer`, not `Object`)
- Extremely small bundle size (**6.75KB minified + gzipped**, 51.9KB with comments)
- Excellent browser support (Chrome, Firefox, Edge, Safari 9+, IE 11, iOS 9+, Android, etc.)
- Preserves Node API exactly, with one minor difference (see below)
- Square-bracket `buf[4]` notation works!
- Does not modify any browser prototypes or put anything on `window`
- Comprehensive test suite (including all buffer tests from node.js core)

## install

To use this module directly (without browserify), install it:

```bash
npm install buffer
```

This module was previously called **native-buffer-browserify**, but please use **buffer**
from now on.

If you do not use a bundler, you can use the [standalone script](https://bundle.run/buffer).

## usage

The module's API is identical to node's `Buffer` API. Read the
[official docs](https://nodejs.org/api/buffer.html) for the full list of properties,
instance methods, and class methods that are supported.

As mentioned above, `require('buffer')` or use the `Buffer` global with
[browserify](http://browserify.org) and this module will automatically be included
in your bundle. Almost any npm module will work in the browser, even if it assumes that
the node `Buffer` API will be available.

To depend on this module explicitly (without browserify), require it like this:

```js
var Buffer = require('buffer/').Buffer // note: the trailing slash is important!
```

To require this module explicitly, use `require('buffer/')` which tells the node.js module
lookup algorithm (also used by browserify) to use the **npm module** named `buffer`
instead of the **node.js core** module named `buffer`!


## how does it work?

The Buffer constructor returns instances of `Uint8Array` that have their prototype
changed to `Buffer.prototype`. Furthermore, `Buffer` is a subclass of `Uint8Array`,
so the returned instances will have all the node `Buffer` methods and the
`Uint8Array` methods. Square bracket notation works as expected -- it returns a
single octet.

The `Uint8Array` prototype remains unmodified.
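A quick sketch of the two behaviours described above (assuming the package is required as `buffer/`, as in the usage section):

```js
var Buffer = require('buffer/').Buffer

var buf = Buffer.from('abc')
console.log(buf instanceof Uint8Array)   // true – Buffer subclasses Uint8Array
console.log(buf[0])                      // 97  – bracket notation yields a single octet
console.log(Uint8Array.prototype.write)  // undefined – the Uint8Array prototype is untouched
```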
## tracking the latest node api

This module tracks the Buffer API in the latest (unstable) version of node.js. The Buffer
API is considered **stable** in the
[node stability index](https://nodejs.org/docs/latest/api/documentation.html#documentation_stability_index),
so it is unlikely that there will ever be breaking changes.
Nonetheless, when/if the Buffer API changes in node, this module's API will change
accordingly.

## related packages

- [`buffer-reverse`](https://www.npmjs.com/package/buffer-reverse) - Reverse a buffer
- [`buffer-xor`](https://www.npmjs.com/package/buffer-xor) - Bitwise xor a buffer
- [`is-buffer`](https://www.npmjs.com/package/is-buffer) - Determine if an object is a Buffer without including the whole `Buffer` package

## conversion packages

### convert typed array to buffer

Use [`typedarray-to-buffer`](https://www.npmjs.com/package/typedarray-to-buffer) to convert any kind of typed array to a `Buffer`. Does not perform a copy, so it's super fast.
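As a sketch of an equivalent zero-copy conversion using only this package's own API (the `Buffer.from(arrayBuffer, byteOffset, length)` overload declared in its `index.d.ts`); `typedarray-to-buffer` remains the dedicated helper:

```js
var Buffer = require('buffer/').Buffer

var arr = new Uint16Array([5000, 4000, 3000])

// View the same memory as a Buffer – no copy, mirroring what typedarray-to-buffer does.
var buf = Buffer.from(arr.buffer, arr.byteOffset, arr.byteLength)
console.log(buf.length) // 6 (three 16-bit elements)
```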
### convert buffer to typed array

`Buffer` is a subclass of `Uint8Array` (which is a typed array). So there is no need to explicitly convert to typed array. Just use the buffer as a `Uint8Array`.

### convert blob to buffer

Use [`blob-to-buffer`](https://www.npmjs.com/package/blob-to-buffer) to convert a `Blob` to a `Buffer`.

### convert buffer to blob

To convert a `Buffer` to a `Blob`, use the `Blob` constructor:

```js
var blob = new Blob([ buffer ])
```

Optionally, specify a mimetype:

```js
var blob = new Blob([ buffer ], { type: 'text/html' })
```

### convert arraybuffer to buffer

To convert an `ArrayBuffer` to a `Buffer`, use the `Buffer.from` function. Does not perform a copy, so it's super fast.

```js
var buffer = Buffer.from(arrayBuffer)
```

### convert buffer to arraybuffer

To convert a `Buffer` to an `ArrayBuffer`, use the `.buffer` property (which is present on all `Uint8Array` objects):

```js
var arrayBuffer = buffer.buffer.slice(
  buffer.byteOffset, buffer.byteOffset + buffer.byteLength
)
```

Alternatively, use the [`to-arraybuffer`](https://www.npmjs.com/package/to-arraybuffer) module.

## performance

See perf tests in `/perf`.

`BrowserBuffer` is the browser `buffer` module (this repo). `Uint8Array` is included as a
sanity check (since `BrowserBuffer` uses `Uint8Array` under the hood, `Uint8Array` will
always be at least a bit faster). Finally, `NodeBuffer` is the node.js buffer module,
which is included to compare against.

NOTE: Performance has improved since these benchmarks were taken. PR welcome to update the README.

### Chrome 38

| Method | Operations | Accuracy | Sampled | Fastest |
|:-------|:-----------|:---------|:--------|:-------:|
| BrowserBuffer#bracket-notation | 11,457,464 ops/sec | ±0.86% | 66 | ✓ |
| Uint8Array#bracket-notation | 10,824,332 ops/sec | ±0.74% | 65 | |
| | | | |
| BrowserBuffer#concat | 450,532 ops/sec | ±0.76% | 68 | |
| Uint8Array#concat | 1,368,911 ops/sec | ±1.50% | 62 | ✓ |
| | | | |
| BrowserBuffer#copy(16000) | 903,001 ops/sec | ±0.96% | 67 | |
| Uint8Array#copy(16000) | 1,422,441 ops/sec | ±1.04% | 66 | ✓ |
| | | | |
| BrowserBuffer#copy(16) | 11,431,358 ops/sec | ±0.46% | 69 | |
| Uint8Array#copy(16) | 13,944,163 ops/sec | ±1.12% | 68 | ✓ |
| | | | |
| BrowserBuffer#new(16000) | 106,329 ops/sec | ±6.70% | 44 | |
| Uint8Array#new(16000) | 131,001 ops/sec | ±2.85% | 31 | ✓ |
| | | | |
| BrowserBuffer#new(16) | 1,554,491 ops/sec | ±1.60% | 65 | |
| Uint8Array#new(16) | 6,623,930 ops/sec | ±1.66% | 65 | ✓ |
| | | | |
| BrowserBuffer#readDoubleBE | 112,830 ops/sec | ±0.51% | 69 | ✓ |
| DataView#getFloat64 | 93,500 ops/sec | ±0.57% | 68 | |
| | | | |
| BrowserBuffer#readFloatBE | 146,678 ops/sec | ±0.95% | 68 | ✓ |
| DataView#getFloat32 | 99,311 ops/sec | ±0.41% | 67 | |
| | | | |
| BrowserBuffer#readUInt32LE | 843,214 ops/sec | ±0.70% | 69 | ✓ |
| DataView#getUint32 | 103,024 ops/sec | ±0.64% | 67 | |
| | | | |
| BrowserBuffer#slice | 1,013,941 ops/sec | ±0.75% | 67 | |
| Uint8Array#subarray | 1,903,928 ops/sec | ±0.53% | 67 | ✓ |
| | | | |
| BrowserBuffer#writeFloatBE | 61,387 ops/sec | ±0.90% | 67 | |
| DataView#setFloat32 | 141,249 ops/sec | ±0.40% | 66 | ✓ |


### Firefox 33

| Method | Operations | Accuracy | Sampled | Fastest |
|:-------|:-----------|:---------|:--------|:-------:|
| BrowserBuffer#bracket-notation | 20,800,421 ops/sec | ±1.84% | 60 | |
| Uint8Array#bracket-notation | 20,826,235 ops/sec | ±2.02% | 61 | ✓ |
| | | | |
| BrowserBuffer#concat | 153,076 ops/sec | ±2.32% | 61 | |
| Uint8Array#concat | 1,255,674 ops/sec | ±8.65% | 52 | ✓ |
| | | | |
| BrowserBuffer#copy(16000) | 1,105,312 ops/sec | ±1.16% | 63 | |
| Uint8Array#copy(16000) | 1,615,911 ops/sec | ±0.55% | 66 | ✓ |
| | | | |
| BrowserBuffer#copy(16) | 16,357,599 ops/sec | ±0.73% | 68 | |
| Uint8Array#copy(16) | 31,436,281 ops/sec | ±1.05% | 68 | ✓ |
| | | | |
| BrowserBuffer#new(16000) | 52,995 ops/sec | ±6.01% | 35 | |
| Uint8Array#new(16000) | 87,686 ops/sec | ±5.68% | 45 | ✓ |
| | | | |
| BrowserBuffer#new(16) | 252,031 ops/sec | ±1.61% | 66 | |
| Uint8Array#new(16) | 8,477,026 ops/sec | ±0.49% | 68 | ✓ |
| | | | |
| BrowserBuffer#readDoubleBE | 99,871 ops/sec | ±0.41% | 69 | |
| DataView#getFloat64 | 285,663 ops/sec | ±0.70% | 68 | ✓ |
| | | | |
| BrowserBuffer#readFloatBE | 115,540 ops/sec | ±0.42% | 69 | |
| DataView#getFloat32 | 288,722 ops/sec | ±0.82% | 68 | ✓ |
| | | | |
| BrowserBuffer#readUInt32LE | 633,926 ops/sec | ±1.08% | 67 | ✓ |
| DataView#getUint32 | 294,808 ops/sec | ±0.79% | 64 | |
| | | | |
| BrowserBuffer#slice | 349,425 ops/sec | ±0.46% | 69 | |
| Uint8Array#subarray | 5,965,819 ops/sec | ±0.60% | 65 | ✓ |
| | | | |
| BrowserBuffer#writeFloatBE | 59,980 ops/sec | ±0.41% | 67 | |
| DataView#setFloat32 | 317,634 ops/sec | ±0.63% | 68 | ✓ |

### Safari 8

| Method | Operations | Accuracy | Sampled | Fastest |
|:-------|:-----------|:---------|:--------|:-------:|
| BrowserBuffer#bracket-notation | 10,279,729 ops/sec | ±2.25% | 56 | ✓ |
| Uint8Array#bracket-notation | 10,030,767 ops/sec | ±2.23% | 59 | |
| | | | |
| BrowserBuffer#concat | 144,138 ops/sec | ±1.38% | 65 | |
| Uint8Array#concat | 4,950,764 ops/sec | ±1.70% | 63 | ✓ |
| | | | |
| BrowserBuffer#copy(16000) | 1,058,548 ops/sec | ±1.51% | 64 | |
| Uint8Array#copy(16000) | 1,409,666 ops/sec | ±1.17% | 65 | ✓ |
| | | | |
| BrowserBuffer#copy(16) | 6,282,529 ops/sec | ±1.88% | 58 | |
| Uint8Array#copy(16) | 11,907,128 ops/sec | ±2.87% | 58 | ✓ |
| | | | |
| BrowserBuffer#new(16000) | 101,663 ops/sec | ±3.89% | 57 | |
| Uint8Array#new(16000) | 22,050,818 ops/sec | ±6.51% | 46 | ✓ |
| | | | |
| BrowserBuffer#new(16) | 176,072 ops/sec | ±2.13% | 64 | |
| Uint8Array#new(16) | 24,385,731 ops/sec | ±5.01% | 51 | ✓ |
| | | | |
| BrowserBuffer#readDoubleBE | 41,341 ops/sec | ±1.06% | 67 | |
| DataView#getFloat64 | 322,280 ops/sec | ±0.84% | 68 | ✓ |
| | | | |
| BrowserBuffer#readFloatBE | 46,141 ops/sec | ±1.06% | 65 | |
| DataView#getFloat32 | 337,025 ops/sec | ±0.43% | 69 | ✓ |
| | | | |
| BrowserBuffer#readUInt32LE | 151,551 ops/sec | ±1.02% | 66 | |
| DataView#getUint32 | 308,278 ops/sec | ±0.94% | 67 | ✓ |
| | | | |
| BrowserBuffer#slice | 197,365 ops/sec | ±0.95% | 66 | |
| Uint8Array#subarray | 9,558,024 ops/sec | ±3.08% | 58 | ✓ |
| | | | |
| BrowserBuffer#writeFloatBE | 17,518 ops/sec | ±1.03% | 63 | |
| DataView#setFloat32 | 319,751 ops/sec | ±0.48% | 68 | ✓ |


### Node 0.11.14

| Method | Operations | Accuracy | Sampled | Fastest |
|:-------|:-----------|:---------|:--------|:-------:|
| BrowserBuffer#bracket-notation | 10,489,828 ops/sec | ±3.25% | 90 | |
| Uint8Array#bracket-notation | 10,534,884 ops/sec | ±0.81% | 92 | ✓ |
| NodeBuffer#bracket-notation | 10,389,910 ops/sec | ±0.97% | 87 | |
| | | | |
| BrowserBuffer#concat | 487,830 ops/sec | ±2.58% | 88 | |
| Uint8Array#concat | 1,814,327 ops/sec | ±1.28% | 88 | ✓ |
| NodeBuffer#concat | 1,636,523 ops/sec | ±1.88% | 73 | |
| | | | |
| BrowserBuffer#copy(16000) | 1,073,665 ops/sec | ±0.77% | 90 | |
| Uint8Array#copy(16000) | 1,348,517 ops/sec | ±0.84% | 89 | ✓ |
| NodeBuffer#copy(16000) | 1,289,533 ops/sec | ±0.82% | 93 | |
| | | | |
| BrowserBuffer#copy(16) | 12,782,706 ops/sec | ±0.74% | 85 | |
| Uint8Array#copy(16) | 14,180,427 ops/sec | ±0.93% | 92 | ✓ |
| NodeBuffer#copy(16) | 11,083,134 ops/sec | ±1.06% | 89 | |
| | | | |
| BrowserBuffer#new(16000) | 141,678 ops/sec | ±3.30% | 67 | |
| Uint8Array#new(16000) | 161,491 ops/sec | ±2.96% | 60 | |
| NodeBuffer#new(16000) | 292,699 ops/sec | ±3.20% | 55 | ✓ |
| | | | |
| BrowserBuffer#new(16) | 1,655,466 ops/sec | ±2.41% | 82 | |
| Uint8Array#new(16) | 14,399,926 ops/sec | ±0.91% | 94 | ✓ |
| NodeBuffer#new(16) | 3,894,696 ops/sec | ±0.88% | 92 | |
| | | | |
| BrowserBuffer#readDoubleBE | 109,582 ops/sec | ±0.75% | 93 | ✓ |
| DataView#getFloat64 | 91,235 ops/sec | ±0.81% | 90 | |
| NodeBuffer#readDoubleBE | 88,593 ops/sec | ±0.96% | 81 | |
| | | | |
| BrowserBuffer#readFloatBE | 139,854 ops/sec | ±1.03% | 85 | ✓ |
| DataView#getFloat32 | 98,744 ops/sec | ±0.80% | 89 | |
| NodeBuffer#readFloatBE | 92,769 ops/sec | ±0.94% | 93 | |
| | | | |
| BrowserBuffer#readUInt32LE | 710,861 ops/sec | ±0.82% | 92 | |
| DataView#getUint32 | 117,893 ops/sec | ±0.84% | 91 | |
| NodeBuffer#readUInt32LE | 851,412 ops/sec | ±0.72% | 93 | ✓ |
| | | | |
| BrowserBuffer#slice | 1,673,877 ops/sec | ±0.73% | 94 | |
| Uint8Array#subarray | 6,919,243 ops/sec | ±0.67% | 90 | ✓ |
| NodeBuffer#slice | 4,617,604 ops/sec | ±0.79% | 93 | |
| | | | |
| BrowserBuffer#writeFloatBE | 66,011 ops/sec | ±0.75% | 93 | |
| DataView#setFloat32 | 127,760 ops/sec | ±0.72% | 93 | ✓ |
| NodeBuffer#writeFloatBE | 103,352 ops/sec | ±0.83% | 93 | |

### iojs 1.8.1

| Method | Operations | Accuracy | Sampled | Fastest |
|:-------|:-----------|:---------|:--------|:-------:|
| BrowserBuffer#bracket-notation | 10,990,488 ops/sec | ±1.11% | 91 | |
| Uint8Array#bracket-notation | 11,268,757 ops/sec | ±0.65% | 97 | |
| NodeBuffer#bracket-notation | 11,353,260 ops/sec | ±0.83% | 94 | ✓ |
| | | | |
| BrowserBuffer#concat | 378,954 ops/sec | ±0.74% | 94 | |
| Uint8Array#concat | 1,358,288 ops/sec | ±0.97% | 87 | |
| NodeBuffer#concat | 1,934,050 ops/sec | ±1.11% | 78 | ✓ |
| | | | |
| BrowserBuffer#copy(16000) | 894,538 ops/sec | ±0.56% | 84 | |
| Uint8Array#copy(16000) | 1,442,656 ops/sec | ±0.71% | 96 | |
| NodeBuffer#copy(16000) | 1,457,898 ops/sec | ±0.53% | 92 | ✓ |
| | | | |
| BrowserBuffer#copy(16) | 12,870,457 ops/sec | ±0.67% | 95 | |
| Uint8Array#copy(16) | 16,643,989 ops/sec | ±0.61% | 93 | ✓ |
| NodeBuffer#copy(16) | 14,885,848 ops/sec | ±0.74% | 94 | |
| | | | |
| BrowserBuffer#new(16000) | 109,264 ops/sec | ±4.21% | 63 | |
| Uint8Array#new(16000) | 138,916 ops/sec | ±1.87% | 61 | |
| NodeBuffer#new(16000) | 281,449 ops/sec | ±3.58% | 51 | ✓ |
| | | | |
| BrowserBuffer#new(16) | 1,362,935 ops/sec | ±0.56% | 99 | |
| Uint8Array#new(16) | 6,193,090 ops/sec | ±0.64% | 95 | ✓ |
| NodeBuffer#new(16) | 4,745,425 ops/sec | ±1.56% | 90 | |
| | | | |
| BrowserBuffer#readDoubleBE | 118,127 ops/sec | ±0.59% | 93 | ✓ |
| DataView#getFloat64 | 107,332 ops/sec | ±0.65% | 91 | |
| NodeBuffer#readDoubleBE | 116,274 ops/sec | ±0.94% | 95 | |
| | | | |
| BrowserBuffer#readFloatBE | 150,326 ops/sec | ±0.58% | 95 | ✓ |
| DataView#getFloat32 | 110,541 ops/sec | ±0.57% | 98 | |
| NodeBuffer#readFloatBE | 121,599 ops/sec | ±0.60% | 87 | |
| | | | |
| BrowserBuffer#readUInt32LE | 814,147 ops/sec | ±0.62% | 93 | |
| DataView#getUint32 | 137,592 ops/sec | ±0.64% | 90 | |
| NodeBuffer#readUInt32LE | 931,650 ops/sec | ±0.71% | 96 | ✓ |
| | | | |
| BrowserBuffer#slice | 878,590 ops/sec | ±0.68% | 93 | |
| Uint8Array#subarray | 2,843,308 ops/sec | ±1.02% | 90 | |
| NodeBuffer#slice | 4,998,316 ops/sec | ±0.68% | 90 | ✓ |
| | | | |
| BrowserBuffer#writeFloatBE | 65,927 ops/sec | ±0.74% | 93 | |
| DataView#setFloat32 | 139,823 ops/sec | ±0.97% | 89 | ✓ |
| NodeBuffer#writeFloatBE | 135,763 ops/sec | ±0.65% | 96 | |


## Testing the project

First, install the project:

    npm install

Then, to run tests in Node.js, run:

    npm run test-node

To test locally in a browser, you can run:

    npm run test-browser-es5-local # For ES5 browsers that don't support ES6
    npm run test-browser-es6-local # For ES6 compliant browsers

This will print out a URL that you can then open in a browser to run the tests, using [airtap](https://www.npmjs.com/package/airtap).

To run automated browser tests using Saucelabs, ensure that your `SAUCE_USERNAME` and `SAUCE_ACCESS_KEY` environment variables are set, then run:

    npm test

This is what's run in Travis, to check against various browsers. The list of browsers is kept in the `bin/airtap-es5.yml` and `bin/airtap-es6.yml` files.

## JavaScript Standard Style

This module uses [JavaScript Standard Style](https://github.com/feross/standard).

[![JavaScript Style Guide](https://cdn.rawgit.com/feross/standard/master/badge.svg)](https://github.com/feross/standard)

To test that the code conforms to the style, `npm install` and run:

    ./node_modules/.bin/standard

## credit

This was originally forked from [buffer-browserify](https://github.com/toots/buffer-browserify).

## Security Policies and Procedures

The `buffer` team and community take all security bugs in `buffer` seriously. Please see our [security policies and procedures](https://github.com/feross/security) document to learn how to report issues.

## license

MIT. Copyright (C) [Feross Aboukhadijeh](http://feross.org), and other contributors. Originally forked from an MIT-licensed module by Romain Beauxis.
186 node_modules/buffer/index.d.ts generated vendored Normal file
@@ -0,0 +1,186 @@
export class Buffer extends Uint8Array {
  length: number
  write(string: string, offset?: number, length?: number, encoding?: string): number;
  toString(encoding?: string, start?: number, end?: number): string;
  toJSON(): { type: 'Buffer', data: any[] };
  equals(otherBuffer: Buffer): boolean;
  compare(otherBuffer: Buffer, targetStart?: number, targetEnd?: number, sourceStart?: number, sourceEnd?: number): number;
  copy(targetBuffer: Buffer, targetStart?: number, sourceStart?: number, sourceEnd?: number): number;
  slice(start?: number, end?: number): Buffer;
  writeUIntLE(value: number, offset: number, byteLength: number, noAssert?: boolean): number;
  writeUIntBE(value: number, offset: number, byteLength: number, noAssert?: boolean): number;
  writeIntLE(value: number, offset: number, byteLength: number, noAssert?: boolean): number;
  writeIntBE(value: number, offset: number, byteLength: number, noAssert?: boolean): number;
  readUIntLE(offset: number, byteLength: number, noAssert?: boolean): number;
  readUIntBE(offset: number, byteLength: number, noAssert?: boolean): number;
  readIntLE(offset: number, byteLength: number, noAssert?: boolean): number;
  readIntBE(offset: number, byteLength: number, noAssert?: boolean): number;
  readUInt8(offset: number, noAssert?: boolean): number;
  readUInt16LE(offset: number, noAssert?: boolean): number;
  readUInt16BE(offset: number, noAssert?: boolean): number;
  readUInt32LE(offset: number, noAssert?: boolean): number;
  readUInt32BE(offset: number, noAssert?: boolean): number;
  readInt8(offset: number, noAssert?: boolean): number;
  readInt16LE(offset: number, noAssert?: boolean): number;
  readInt16BE(offset: number, noAssert?: boolean): number;
  readInt32LE(offset: number, noAssert?: boolean): number;
  readInt32BE(offset: number, noAssert?: boolean): number;
  readFloatLE(offset: number, noAssert?: boolean): number;
  readFloatBE(offset: number, noAssert?: boolean): number;
  readDoubleLE(offset: number, noAssert?: boolean): number;
  readDoubleBE(offset: number, noAssert?: boolean): number;
  reverse(): this;
  swap16(): Buffer;
  swap32(): Buffer;
  swap64(): Buffer;
  writeUInt8(value: number, offset: number, noAssert?: boolean): number;
  writeUInt16LE(value: number, offset: number, noAssert?: boolean): number;
  writeUInt16BE(value: number, offset: number, noAssert?: boolean): number;
  writeUInt32LE(value: number, offset: number, noAssert?: boolean): number;
  writeUInt32BE(value: number, offset: number, noAssert?: boolean): number;
  writeInt8(value: number, offset: number, noAssert?: boolean): number;
  writeInt16LE(value: number, offset: number, noAssert?: boolean): number;
  writeInt16BE(value: number, offset: number, noAssert?: boolean): number;
  writeInt32LE(value: number, offset: number, noAssert?: boolean): number;
  writeInt32BE(value: number, offset: number, noAssert?: boolean): number;
  writeFloatLE(value: number, offset: number, noAssert?: boolean): number;
  writeFloatBE(value: number, offset: number, noAssert?: boolean): number;
  writeDoubleLE(value: number, offset: number, noAssert?: boolean): number;
  writeDoubleBE(value: number, offset: number, noAssert?: boolean): number;
  fill(value: any, offset?: number, end?: number): this;
  indexOf(value: string | number | Buffer, byteOffset?: number, encoding?: string): number;
  lastIndexOf(value: string | number | Buffer, byteOffset?: number, encoding?: string): number;
  includes(value: string | number | Buffer, byteOffset?: number, encoding?: string): boolean;

  /**
   * Allocates a new buffer containing the given {str}.
   *
   * @param str String to store in buffer.
   * @param encoding encoding to use, optional. Default is 'utf8'
   */
  constructor (str: string, encoding?: string);
  /**
   * Allocates a new buffer of {size} octets.
   *
   * @param size count of octets to allocate.
   */
  constructor (size: number);
  /**
   * Allocates a new buffer containing the given {array} of octets.
   *
   * @param array The octets to store.
   */
  constructor (array: Uint8Array);
  /**
   * Produces a Buffer backed by the same allocated memory as
   * the given {ArrayBuffer}.
   *
   * @param arrayBuffer The ArrayBuffer with which to share memory.
   */
  constructor (arrayBuffer: ArrayBuffer);
  /**
   * Allocates a new buffer containing the given {array} of octets.
   *
   * @param array The octets to store.
   */
  constructor (array: any[]);
  /**
   * Copies the passed {buffer} data onto a new {Buffer} instance.
   *
   * @param buffer The buffer to copy.
   */
  constructor (buffer: Buffer);
  prototype: Buffer;
  /**
   * Allocates a new Buffer using an {array} of octets.
   *
   * @param array
   */
  static from(array: any[]): Buffer;
  /**
   * When passed a reference to the .buffer property of a TypedArray instance,
   * the newly created Buffer will share the same allocated memory as the TypedArray.
   * The optional {byteOffset} and {length} arguments specify a memory range
   * within the {arrayBuffer} that will be shared by the Buffer.
   *
   * @param arrayBuffer The .buffer property of a TypedArray or a new ArrayBuffer()
   * @param byteOffset
   * @param length
   */
  static from(arrayBuffer: ArrayBuffer, byteOffset?: number, length?: number): Buffer;
  /**
   * Copies the passed {buffer} data onto a new Buffer instance.
   *
   * @param buffer
   */
  static from(buffer: Buffer | Uint8Array): Buffer;
  /**
   * Creates a new Buffer containing the given JavaScript string {str}.
   * If provided, the {encoding} parameter identifies the character encoding.
   * If not provided, {encoding} defaults to 'utf8'.
   *
   * @param str
   */
  static from(str: string, encoding?: string): Buffer;
  /**
   * Returns true if {obj} is a Buffer
   *
   * @param obj object to test.
   */
  static isBuffer(obj: any): obj is Buffer;
  /**
   * Returns true if {encoding} is a valid encoding argument.
   * Valid string encodings in Node 0.12: 'ascii'|'utf8'|'utf16le'|'ucs2'(alias of 'utf16le')|'base64'|'binary'(deprecated)|'hex'
   *
   * @param encoding string to test.
   */
  static isEncoding(encoding: string): boolean;
  /**
   * Gives the actual byte length of a string. encoding defaults to 'utf8'.
   * This is not the same as String.prototype.length since that returns the number of characters in a string.
   *
   * @param string string to test.
   * @param encoding encoding used to evaluate (defaults to 'utf8')
   */
  static byteLength(string: string, encoding?: string): number;
  /**
   * Returns a buffer which is the result of concatenating all the buffers in the list together.
   *
   * If the list has no items, or if the totalLength is 0, then it returns a zero-length buffer.
   * If the list has exactly one item, then the first item of the list is returned.
   * If the list has more than one item, then a new Buffer is created.
   *
   * @param list An array of Buffer objects to concatenate
   * @param totalLength Total length of the buffers when concatenated.
   * If totalLength is not provided, it is read from the buffers in the list. However, this adds an additional loop to the function, so it is faster to provide the length explicitly.
   */
  static concat(list: Buffer[], totalLength?: number): Buffer;
  /**
   * The same as buf1.compare(buf2).
   */
  static compare(buf1: Buffer, buf2: Buffer): number;
  /**
   * Allocates a new buffer of {size} octets.
   *
   * @param size count of octets to allocate.
   * @param fill if specified, buffer will be initialized by calling buf.fill(fill).
   * If parameter is omitted, buffer will be filled with zeros.
   * @param encoding encoding used for call to buf.fill while initializing
   */
  static alloc(size: number, fill?: string | Buffer | number, encoding?: string): Buffer;
  /**
   * Allocates a new buffer of {size} octets, leaving memory not initialized, so the contents
   * of the newly created Buffer are unknown and may contain sensitive data.
   *
   * @param size count of octets to allocate
   */
  static allocUnsafe(size: number): Buffer;
  /**
   * Allocates a new non-pooled buffer of {size} octets, leaving memory not initialized, so the contents
   * of the newly created Buffer are unknown and may contain sensitive data.
   *
   * @param size count of octets to allocate
   */
  static allocUnsafeSlow(size: number): Buffer;
}
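A short usage sketch against the declarations above (method names and signatures taken from this `index.d.ts`; the `buffer/` require path follows the README):

```js
var Buffer = require('buffer/').Buffer

var buf = Buffer.alloc(8)          // zero-filled, 8 octets
buf.writeUInt32BE(0xdeadbeef, 0)   // big-endian write at offset 0
buf.writeUInt32LE(42, 4)           // little-endian write at offset 4

console.log(buf.readUInt32BE(0).toString(16)) // 'deadbeef'
console.log(buf.readUInt32LE(4))              // 42
console.log(Buffer.isBuffer(buf))             // true
console.log(buf.slice(0, 4).length)           // 4
```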
1817 node_modules/buffer/index.js generated vendored Normal file
File diff suppressed because it is too large
96 node_modules/buffer/package.json generated vendored Normal file
@@ -0,0 +1,96 @@
{
  "name": "buffer",
  "description": "Node.js Buffer API, for the browser",
  "version": "5.7.1",
  "author": {
    "name": "Feross Aboukhadijeh",
    "email": "feross@feross.org",
    "url": "https://feross.org"
  },
  "bugs": {
    "url": "https://github.com/feross/buffer/issues"
  },
  "contributors": [
    "Romain Beauxis <toots@rastageeks.org>",
    "James Halliday <mail@substack.net>"
  ],
  "dependencies": {
    "base64-js": "^1.3.1",
    "ieee754": "^1.1.13"
  },
  "devDependencies": {
    "airtap": "^3.0.0",
    "benchmark": "^2.1.4",
    "browserify": "^17.0.0",
    "concat-stream": "^2.0.0",
    "hyperquest": "^2.1.3",
    "is-buffer": "^2.0.4",
    "is-nan": "^1.3.0",
    "split": "^1.0.1",
    "standard": "*",
    "tape": "^5.0.1",
    "through2": "^4.0.2",
    "uglify-js": "^3.11.3"
  },
  "homepage": "https://github.com/feross/buffer",
  "jspm": {
    "map": {
      "./index.js": {
        "node": "@node/buffer"
      }
    }
  },
  "keywords": [
    "arraybuffer",
    "browser",
    "browserify",
    "buffer",
    "compatible",
    "dataview",
    "uint8array"
  ],
  "license": "MIT",
  "main": "index.js",
  "types": "index.d.ts",
  "repository": {
    "type": "git",
    "url": "git://github.com/feross/buffer.git"
  },
  "scripts": {
    "perf": "browserify --debug perf/bracket-notation.js > perf/bundle.js && open perf/index.html",
    "perf-node": "node perf/bracket-notation.js && node perf/concat.js && node perf/copy-big.js && node perf/copy.js && node perf/new-big.js && node perf/new.js && node perf/readDoubleBE.js && node perf/readFloatBE.js && node perf/readUInt32LE.js && node perf/slice.js && node perf/writeFloatBE.js",
    "size": "browserify -r ./ | uglifyjs -c -m | gzip | wc -c",
    "test": "standard && node ./bin/test.js",
    "test-browser-es5": "airtap -- test/*.js",
    "test-browser-es5-local": "airtap --local -- test/*.js",
    "test-browser-es6": "airtap -- test/*.js test/node/*.js",
    "test-browser-es6-local": "airtap --local -- test/*.js test/node/*.js",
    "test-node": "tape test/*.js test/node/*.js",
    "update-authors": "./bin/update-authors.sh"
  },
  "standard": {
    "ignore": [
      "test/node/**/*.js",
      "test/common.js",
      "test/_polyfill.js",
      "perf/**/*.js"
    ],
    "globals": [
      "SharedArrayBuffer"
    ]
  },
  "funding": [
    {
      "type": "github",
      "url": "https://github.com/sponsors/feross"
    },
    {
      "type": "patreon",
      "url": "https://www.patreon.com/feross"
    },
    {
      "type": "consulting",
      "url": "https://feross.org/support"
    }
  ]
}
45 node_modules/cli-cursor/index.d.ts generated vendored Normal file
@@ -0,0 +1,45 @@
/// <reference types="node"/>

/**
Show cursor.

@param stream - Default: `process.stderr`.

@example
```
import * as cliCursor from 'cli-cursor';

cliCursor.show();
```
*/
export function show(stream?: NodeJS.WritableStream): void;

/**
Hide cursor.

@param stream - Default: `process.stderr`.

@example
```
import * as cliCursor from 'cli-cursor';

cliCursor.hide();
```
*/
export function hide(stream?: NodeJS.WritableStream): void;

/**
Toggle cursor visibility.

@param force - Is useful to show or hide the cursor based on a boolean.
@param stream - Default: `process.stderr`.

@example
```
import * as cliCursor from 'cli-cursor';

const unicornsAreAwesome = true;
cliCursor.toggle(unicornsAreAwesome);
```
*/
export function toggle(force?: boolean, stream?: NodeJS.WritableStream): void;
32 node_modules/cli-cursor/index.js generated vendored
@@ -1,39 +1,35 @@
 'use strict';
 const restoreCursor = require('restore-cursor');
 
-let hidden = false;
+let isHidden = false;
 
-exports.show = stream => {
-	const s = stream || process.stderr;
-
-	if (!s.isTTY) {
+exports.show = (writableStream = process.stderr) => {
+	if (!writableStream.isTTY) {
 		return;
 	}
 
-	hidden = false;
-	s.write('\u001b[?25h');
+	isHidden = false;
+	writableStream.write('\u001B[?25h');
 };
 
-exports.hide = stream => {
-	const s = stream || process.stderr;
-
-	if (!s.isTTY) {
+exports.hide = (writableStream = process.stderr) => {
+	if (!writableStream.isTTY) {
 		return;
 	}
 
 	restoreCursor();
-	hidden = true;
-	s.write('\u001b[?25l');
+	isHidden = true;
+	writableStream.write('\u001B[?25l');
 };
 
-exports.toggle = (force, stream) => {
+exports.toggle = (force, writableStream) => {
 	if (force !== undefined) {
-		hidden = force;
+		isHidden = force;
 	}
 
-	if (hidden) {
-		exports.show(stream);
+	if (isHidden) {
+		exports.show(writableStream);
 	} else {
-		exports.hide(stream);
+		exports.hide(writableStream);
 	}
 };
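A brief usage sketch of the API as it stands after this change (the stream argument is optional and now defaults to `process.stderr` via a default parameter):

```js
const cliCursor = require('cli-cursor');

// Hide while rendering, then restore. Both calls write ANSI escape codes
// to process.stderr unless another writable TTY stream is passed.
cliCursor.hide();
process.stderr.write('rendering…\n');
cliCursor.show();

// toggle() flips visibility based on the module's internal hidden/shown state.
cliCursor.toggle();
```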
20 node_modules/cli-cursor/license generated vendored
@@ -1,21 +1,9 @@
-The MIT License (MIT)
+MIT License
 
 Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
 
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
 
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
+The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
 
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Some files were not shown because too many files have changed in this diff.