Initial Commit
This commit is contained in:
20
node_modules/@electron/asar/LICENSE.md
generated
vendored
Normal file
@@ -0,0 +1,20 @@
Copyright (c) 2014 GitHub Inc.

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
208
node_modules/@electron/asar/README.md
generated
vendored
Normal file
@@ -0,0 +1,208 @@
# @electron/asar - Electron Archive

[](https://github.com/electron/asar/actions/workflows/test.yml)
[](https://npmjs.org/package/@electron/asar)

Asar is a simple, extensible archive format. It works like `tar` in that it concatenates
all files together without compression, while still providing random access support.

## Features

* Supports random access
* Uses JSON to store files' information
* Very easy to write a parser for

## Command line utility

### Install

This module requires Node 10 or later.

```bash
$ npm install --engine-strict @electron/asar
```

### Usage

```bash
$ asar --help

  Usage: asar [options] [command]

  Commands:

    pack|p <dir> <output>
       create asar archive

    list|l <archive>
       list files of asar archive

    extract-file|ef <archive> <filename>
       extract one file from archive

    extract|e <archive> <dest>
       extract archive

  Options:

    -h, --help     output usage information
    -V, --version  output the version number

```

#### Excluding multiple resources from being packed

Given:
```
app
(a) ├── x1
(b) ├── x2
(c) ├── y3
(d) │   ├── x1
(e) │   └── z1
(f) │       └── x2
(g) └── z4
(h)     └── w1
```

Exclude: a, b
```bash
$ asar pack app app.asar --unpack-dir "{x1,x2}"
```

Exclude: a, b, d, f
```bash
$ asar pack app app.asar --unpack-dir "**/{x1,x2}"
```

Exclude: a, b, d, f, h
```bash
$ asar pack app app.asar --unpack-dir "{**/x1,**/x2,z4/w1}"
```

## Using programmatically

### Example

```javascript
const asar = require('@electron/asar');

const src = 'some/path/';
const dest = 'name.asar';

await asar.createPackage(src, dest);
console.log('done.');
```

Please note that there is currently **no** error handling provided!

### Transform

You can pass in a `transform` option: a function that returns either nothing
or a `stream.Transform`. The latter will be applied to files that will be in
the `.asar` file, to transform them (e.g. compress).

```javascript
const asar = require('@electron/asar');

const src = 'some/path/';
const dest = 'name.asar';

function transform (filename) {
  return new CustomTransformStream()
}

await asar.createPackageWithOptions(src, dest, { transform: transform });
console.log('done.');
```

## Format

Asar uses [Pickle][pickle] to safely serialize binary values to file.

The format of asar is very flat:

```
| UInt32: header_size | String: header | Bytes: file1 | ... | Bytes: file42 |
```

The `header_size` and `header` are serialized with the [Pickle][pickle] class, and
`header_size`'s [Pickle][pickle] object is 8 bytes.

The `header` is a JSON string, and the `header_size` is the size of `header`'s
`Pickle` object.

The structure of `header` is something like this:

```json
{
  "files": {
    "tmp": {
      "files": {}
    },
    "usr": {
      "files": {
        "bin": {
          "files": {
            "ls": {
              "offset": "0",
              "size": 100,
              "executable": true,
              "integrity": {
                "algorithm": "SHA256",
                "hash": "...",
                "blockSize": 1024,
                "blocks": ["...", "..."]
              }
            },
            "cd": {
              "offset": "100",
              "size": 100,
              "executable": true,
              "integrity": {
                "algorithm": "SHA256",
                "hash": "...",
                "blockSize": 1024,
                "blocks": ["...", "..."]
              }
            }
          }
        }
      }
    },
    "etc": {
      "files": {
        "hosts": {
          "offset": "200",
          "size": 32,
          "integrity": {
            "algorithm": "SHA256",
            "hash": "...",
            "blockSize": 1024,
            "blocks": ["...", "..."]
          }
        }
      }
    }
  }
}
```

`offset` and `size` record the information needed to read the file from the
archive. `offset` starts at 0, so you have to manually add the size of
`header_size` and `header` to the `offset` to get the real offset of the file.

`offset` is a UINT64 number represented as a string, because there is no way to
precisely represent UINT64 in a JavaScript `Number`. `size` is a JavaScript
`Number` that is no larger than `Number.MAX_SAFE_INTEGER`, which has a value of
`9007199254740991` and is about 8PB in size. We didn't store `size` as UINT64
because file sizes in Node.js are represented as `Number` and it is not safe to
convert `Number` to UINT64.
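The offset arithmetic described above can be sketched in a few lines (the constant 8 is the size of the `header_size` Pickle object stated earlier; `headerSize` and `entry` are illustrative names):

```javascript
// headerSize: size in bytes of the `header` Pickle object, read from the
// archive's first 8 bytes; entry: a file record from the header JSON.
function absoluteOffset (headerSize, entry) {
  // `offset` is stored as a string because it is conceptually a UINT64;
  // Number() is safe only while the archive fits in Number.MAX_SAFE_INTEGER.
  return 8 + headerSize + Number(entry.offset);
}
```

For example, with the header JSON above and a `header` Pickle of 100 bytes, `/usr/bin/cd` (`"offset": "100"`) would start at byte 8 + 100 + 100 = 208 of the archive.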

`integrity` is an object consisting of a few keys:
* A hashing `algorithm`; currently only `SHA256` is supported.
* A hex-encoded `hash` value representing the hash of the entire file.
* An array of hex-encoded hashes for the `blocks` of the file, i.e. for a `blockSize` of 4KB this array contains the hash of every block if you split the file into N 4KB blocks.
* An integer value `blockSize` representing the size in bytes of each block in the `blocks` hashes above.

[pickle]: https://chromium.googlesource.com/chromium/src/+/main/base/pickle.h
82
node_modules/@electron/asar/bin/asar.js
generated
vendored
Executable file
@@ -0,0 +1,82 @@
#!/usr/bin/env node

var packageJSON = require('../package.json')
var splitVersion = function (version) { return version.split('.').map(function (part) { return Number(part) }) }
var requiredNodeVersion = splitVersion(packageJSON.engines.node.slice(2))
var actualNodeVersion = splitVersion(process.versions.node)

if (actualNodeVersion[0] < requiredNodeVersion[0] || (actualNodeVersion[0] === requiredNodeVersion[0] && actualNodeVersion[1] < requiredNodeVersion[1])) {
  console.error('CANNOT RUN WITH NODE ' + process.versions.node)
  console.error('asar requires Node ' + packageJSON.engines.node + '.')
  process.exit(1)
}

// Not consts so that this file can load in Node < 4.0
var asar = require('../lib/asar')
var program = require('commander')

program.version('v' + packageJSON.version)
  .description('Manipulate asar archive files')

program.command('pack <dir> <output>')
  .alias('p')
  .description('create asar archive')
  .option('--ordering <file path>', 'path to a text file for ordering contents')
  .option('--unpack <expression>', 'do not pack files matching glob <expression>')
  .option('--unpack-dir <expression>', 'do not pack dirs matching glob <expression> or starting with literal <expression>')
  .option('--exclude-hidden', 'exclude hidden files')
  .action(function (dir, output, options) {
    options = {
      unpack: options.unpack,
      unpackDir: options.unpackDir,
      ordering: options.ordering,
      version: options.sv,
      arch: options.sa,
      builddir: options.sb,
      dot: !options.excludeHidden
    }
    asar.createPackageWithOptions(dir, output, options).catch(error => {
      console.error(error)
      process.exit(1)
    })
  })

program.command('list <archive>')
  .alias('l')
  .description('list files of asar archive')
  .option('-i, --is-pack', 'each file in the asar is pack or unpack')
  .action(function (archive, options) {
    options = {
      isPack: options.isPack
    }
    var files = asar.listPackage(archive, options)
    for (var i in files) {
      console.log(files[i])
    }
  })

program.command('extract-file <archive> <filename>')
  .alias('ef')
  .description('extract one file from archive')
  .action(function (archive, filename) {
    require('fs').writeFileSync(require('path').basename(filename),
      asar.extractFile(archive, filename))
  })

program.command('extract <archive> <dest>')
  .alias('e')
  .description('extract archive')
  .action(function (archive, dest) {
    asar.extractAll(archive, dest)
  })

program.command('*')
  .action(function (_cmd, args) {
    console.log('asar: \'%s\' is not an asar command. See \'asar --help\'.', args[0])
  })

program.parse(process.argv)

if (program.args.length === 0) {
  program.help()
}
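The version gate at the top of `bin/asar.js` compares only the `[major, minor]` components of the running Node version against `engines.node`. The comparison can be sketched and sanity-checked in isolation (function names here are illustrative, not part of the CLI):

```javascript
// Split '18.17.1' into [18, 17, 1].
function splitVersion (version) {
  return version.split('.').map(Number);
}

// True when `actual` meets or exceeds `required` on major/minor,
// mirroring the negated failure condition in the script above.
function satisfies (actual, required) {
  const [amaj, amin] = splitVersion(actual);
  const [rmaj, rmin] = splitVersion(required);
  return amaj > rmaj || (amaj === rmaj && amin >= rmin);
}
```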
96
node_modules/@electron/asar/lib/asar.d.ts
generated
vendored
Normal file
@@ -0,0 +1,96 @@
import { FilesystemDirectoryEntry, FilesystemEntry, FilesystemLinkEntry } from './filesystem';
import * as disk from './disk';
import { CrawledFileType } from './crawlfs';
import { IOptions } from './types/glob';
export declare function createPackage(src: string, dest: string): Promise<NodeJS.WritableStream>;
export type CreateOptions = {
    dot?: boolean;
    globOptions?: IOptions;
    /**
     * Path to a file containing the list of relative filepaths relative to `src` and the specific order they should be inserted into the asar.
     * Formats allowed below:
     * filepath
     * : filepath
     * <anything>:filepath
     */
    ordering?: string;
    pattern?: string;
    transform?: (filePath: string) => NodeJS.ReadWriteStream | void;
    unpack?: string;
    unpackDir?: string;
};
export declare function createPackageWithOptions(src: string, dest: string, options: CreateOptions): Promise<NodeJS.WritableStream>;
/**
 * Create an ASAR archive from a list of filenames.
 *
 * @param src - Base path. All files are relative to this.
 * @param dest - Archive filename (& path).
 * @param filenames - List of filenames relative to src.
 * @param [metadata] - Object with filenames as keys and {type='directory|file|link', stat: fs.stat} as values. (Optional)
 * @param [options] - Options passed to `createPackageWithOptions`.
 */
export declare function createPackageFromFiles(src: string, dest: string, filenames: string[], metadata?: disk.InputMetadata, options?: CreateOptions): Promise<NodeJS.WritableStream>;
export type AsarStream = {
    /**
     Relative path to the file or directory from within the archive
     */
    path: string;
    /**
     Function that returns a read stream for a file.
     Note: this is called multiple times per "file", so a new NodeJS.ReadableStream needs to be created each time
     */
    streamGenerator: () => NodeJS.ReadableStream;
    /**
     Whether the file/link should be unpacked
     */
    unpacked: boolean;
    stat: CrawledFileType['stat'];
};
export type AsarDirectory = Pick<AsarStream, 'path' | 'unpacked'> & {
    type: 'directory';
};
export type AsarSymlinkStream = AsarStream & {
    type: 'link';
    symlink: string;
};
export type AsarFileStream = AsarStream & {
    type: 'file';
};
export type AsarStreamType = AsarDirectory | AsarFileStream | AsarSymlinkStream;
/**
 * Create an ASAR archive from a list of streams.
 *
 * @param dest - Archive filename (& path).
 * @param streams - List of streams to be piped in-memory into asar filesystem. Insertion order is preserved.
 */
export declare function createPackageFromStreams(dest: string, streams: AsarStreamType[]): Promise<import("fs").WriteStream>;
export declare function statFile(archivePath: string, filename: string, followLinks?: boolean): FilesystemEntry;
export declare function getRawHeader(archivePath: string): disk.ArchiveHeader;
export interface ListOptions {
    isPack: boolean;
}
export declare function listPackage(archivePath: string, options: ListOptions): string[];
export declare function extractFile(archivePath: string, filename: string, followLinks?: boolean): Buffer;
export declare function extractAll(archivePath: string, dest: string): void;
export declare function uncache(archivePath: string): boolean;
export declare function uncacheAll(): void;
export { EntryMetadata } from './filesystem';
export { InputMetadata, DirectoryRecord, FileRecord, ArchiveHeader } from './disk';
export type InputMetadataType = 'directory' | 'file' | 'link';
export type DirectoryMetadata = FilesystemDirectoryEntry;
export type FileMetadata = FilesystemEntry;
export type LinkMetadata = FilesystemLinkEntry;
declare const _default: {
    createPackage: typeof createPackage;
    createPackageWithOptions: typeof createPackageWithOptions;
    createPackageFromFiles: typeof createPackageFromFiles;
    createPackageFromStreams: typeof createPackageFromStreams;
    statFile: typeof statFile;
    getRawHeader: typeof getRawHeader;
    listPackage: typeof listPackage;
    extractFile: typeof extractFile;
    extractAll: typeof extractAll;
    uncache: typeof uncache;
    uncacheAll: typeof uncacheAll;
};
export default _default;
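The `ordering` option in `CreateOptions` accepts lines in three shapes (`filepath`, `: filepath`, `<anything>:filepath`). A per-line normalizer matching the parsing done in `lib/asar.js` (the function name is illustrative) can be sketched as:

```javascript
function normalizeOrderingLine (line) {
  // Keep only the part after the last ':', then trim whitespace
  // and strip a single leading '/'.
  if (line.includes(':')) {
    line = line.split(':').pop();
  }
  line = line.trim();
  if (line.startsWith('/')) {
    line = line.slice(1);
  }
  return line;
}
```

This is why a prefix such as a line number or label before the colon is ignored: only the final path segment after the last `:` survives.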
337
node_modules/@electron/asar/lib/asar.js
generated
vendored
Normal file
@@ -0,0 +1,337 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.createPackage = createPackage;
exports.createPackageWithOptions = createPackageWithOptions;
exports.createPackageFromFiles = createPackageFromFiles;
exports.createPackageFromStreams = createPackageFromStreams;
exports.statFile = statFile;
exports.getRawHeader = getRawHeader;
exports.listPackage = listPackage;
exports.extractFile = extractFile;
exports.extractAll = extractAll;
exports.uncache = uncache;
exports.uncacheAll = uncacheAll;
const path = __importStar(require("path"));
const minimatch_1 = __importDefault(require("minimatch"));
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const filesystem_1 = require("./filesystem");
const disk = __importStar(require("./disk"));
const crawlfs_1 = require("./crawlfs");
/**
 * Whether a directory should be excluded from packing due to the `--unpack-dir` option.
 *
 * @param dirPath - directory path to check
 * @param pattern - literal prefix [for backward compatibility] or glob pattern
 * @param unpackDirs - Array of directory paths previously marked as unpacked
 */
function isUnpackedDir(dirPath, pattern, unpackDirs) {
    if (dirPath.startsWith(pattern) || (0, minimatch_1.default)(dirPath, pattern)) {
        if (!unpackDirs.includes(dirPath)) {
            unpackDirs.push(dirPath);
        }
        return true;
    }
    else {
        return unpackDirs.some((unpackDir) => dirPath.startsWith(unpackDir) && !path.relative(unpackDir, dirPath).startsWith('..'));
    }
}
async function createPackage(src, dest) {
    return createPackageWithOptions(src, dest, {});
}
async function createPackageWithOptions(src, dest, options) {
    const globOptions = options.globOptions ? options.globOptions : {};
    globOptions.dot = options.dot === undefined ? true : options.dot;
    const pattern = src + (options.pattern ? options.pattern : '/**/*');
    const [filenames, metadata] = await (0, crawlfs_1.crawl)(pattern, globOptions);
    return createPackageFromFiles(src, dest, filenames, metadata, options);
}
/**
 * Create an ASAR archive from a list of filenames.
 *
 * @param src - Base path. All files are relative to this.
 * @param dest - Archive filename (& path).
 * @param filenames - List of filenames relative to src.
 * @param [metadata] - Object with filenames as keys and {type='directory|file|link', stat: fs.stat} as values. (Optional)
 * @param [options] - Options passed to `createPackageWithOptions`.
 */
async function createPackageFromFiles(src, dest, filenames, metadata = {}, options = {}) {
    src = path.normalize(src);
    dest = path.normalize(dest);
    filenames = filenames.map(function (filename) {
        return path.normalize(filename);
    });
    const filesystem = new filesystem_1.Filesystem(src);
    const files = [];
    const links = [];
    const unpackDirs = [];
    let filenamesSorted = [];
    if (options.ordering) {
        const orderingFiles = (await wrapped_fs_1.default.readFile(options.ordering))
            .toString()
            .split('\n')
            .map((line) => {
            if (line.includes(':')) {
                line = line.split(':').pop();
            }
            line = line.trim();
            if (line.startsWith('/')) {
                line = line.slice(1);
            }
            return line;
        });
        const ordering = [];
        for (const file of orderingFiles) {
            const pathComponents = file.split(path.sep);
            let str = src;
            for (const pathComponent of pathComponents) {
                str = path.join(str, pathComponent);
                ordering.push(str);
            }
        }
        let missing = 0;
        const total = filenames.length;
        for (const file of ordering) {
            if (!filenamesSorted.includes(file) && filenames.includes(file)) {
                filenamesSorted.push(file);
            }
        }
        for (const file of filenames) {
            if (!filenamesSorted.includes(file)) {
                filenamesSorted.push(file);
                missing += 1;
            }
        }
        console.log(`Ordering file has ${((total - missing) / total) * 100}% coverage.`);
    }
    else {
        filenamesSorted = filenames;
    }
    const handleFile = async function (filename) {
        if (!metadata[filename]) {
            const fileType = await (0, crawlfs_1.determineFileType)(filename);
            if (!fileType) {
                throw new Error('Unknown file type for file: ' + filename);
            }
            metadata[filename] = fileType;
        }
        const file = metadata[filename];
        const shouldUnpackPath = function (relativePath, unpack, unpackDir) {
            let shouldUnpack = false;
            if (unpack) {
                shouldUnpack = (0, minimatch_1.default)(filename, unpack, { matchBase: true });
            }
            if (!shouldUnpack && unpackDir) {
                shouldUnpack = isUnpackedDir(relativePath, unpackDir, unpackDirs);
            }
            return shouldUnpack;
        };
        let shouldUnpack;
        switch (file.type) {
            case 'directory':
                shouldUnpack = shouldUnpackPath(path.relative(src, filename), undefined, options.unpackDir);
                filesystem.insertDirectory(filename, shouldUnpack);
                break;
            case 'file':
                shouldUnpack = shouldUnpackPath(path.relative(src, path.dirname(filename)), options.unpack, options.unpackDir);
                files.push({ filename, unpack: shouldUnpack });
                return filesystem.insertFile(filename, () => wrapped_fs_1.default.createReadStream(filename), shouldUnpack, file, options);
            case 'link':
                shouldUnpack = shouldUnpackPath(path.relative(src, filename), options.unpack, options.unpackDir);
                links.push({ filename, unpack: shouldUnpack });
                filesystem.insertLink(filename, shouldUnpack);
                break;
        }
        return Promise.resolve();
    };
    const insertsDone = async function () {
        await wrapped_fs_1.default.mkdirp(path.dirname(dest));
        return disk.writeFilesystem(dest, filesystem, { files, links }, metadata);
    };
    const names = filenamesSorted.slice();
    const next = async function (name) {
        if (!name) {
            return insertsDone();
        }
        await handleFile(name);
        return next(names.shift());
    };
    return next(names.shift());
}
/**
 * Create an ASAR archive from a list of streams.
 *
 * @param dest - Archive filename (& path).
 * @param streams - List of streams to be piped in-memory into asar filesystem. Insertion order is preserved.
 */
async function createPackageFromStreams(dest, streams) {
    // We use an ambiguous root `src` since we're piping directly from a stream and the `filePath` for the stream is already relative to the src/root
    const src = '.';
    const filesystem = new filesystem_1.Filesystem(src);
    const files = [];
    const links = [];
    const handleFile = async function (stream) {
        const { path: destinationPath, type } = stream;
        const filename = path.normalize(destinationPath);
        switch (type) {
            case 'directory':
                filesystem.insertDirectory(filename, stream.unpacked);
                break;
            case 'file':
                files.push({
                    filename,
                    streamGenerator: stream.streamGenerator,
                    link: undefined,
                    mode: stream.stat.mode,
                    unpack: stream.unpacked,
                });
                return filesystem.insertFile(filename, stream.streamGenerator, stream.unpacked, {
                    type: 'file',
                    stat: stream.stat,
                });
            case 'link':
                links.push({
                    filename,
                    streamGenerator: stream.streamGenerator,
                    link: stream.symlink,
                    mode: stream.stat.mode,
                    unpack: stream.unpacked,
                });
                filesystem.insertLink(filename, stream.unpacked, path.dirname(filename), stream.symlink, src);
                break;
        }
        return Promise.resolve();
    };
    const insertsDone = async function () {
        await wrapped_fs_1.default.mkdirp(path.dirname(dest));
        return disk.streamFilesystem(dest, filesystem, { files, links });
    };
    const streamQueue = streams.slice();
    const next = async function (stream) {
        if (!stream) {
            return insertsDone();
        }
        await handleFile(stream);
        return next(streamQueue.shift());
    };
    return next(streamQueue.shift());
}
function statFile(archivePath, filename, followLinks = true) {
    const filesystem = disk.readFilesystemSync(archivePath);
    return filesystem.getFile(filename, followLinks);
}
function getRawHeader(archivePath) {
    return disk.readArchiveHeaderSync(archivePath);
}
function listPackage(archivePath, options) {
    return disk.readFilesystemSync(archivePath).listFiles(options);
}
function extractFile(archivePath, filename, followLinks = true) {
    const filesystem = disk.readFilesystemSync(archivePath);
    const fileInfo = filesystem.getFile(filename, followLinks);
    if ('link' in fileInfo || 'files' in fileInfo) {
        throw new Error('Expected to find file at: ' + filename + ' but found a directory or link');
    }
    return disk.readFileSync(filesystem, filename, fileInfo);
}
function extractAll(archivePath, dest) {
    const filesystem = disk.readFilesystemSync(archivePath);
    const filenames = filesystem.listFiles();
    // under windows just extract links as regular files
    const followLinks = process.platform === 'win32';
    // create destination directory
    wrapped_fs_1.default.mkdirpSync(dest);
    const extractionErrors = [];
    for (const fullPath of filenames) {
        // Remove leading slash
        const filename = fullPath.substr(1);
        const destFilename = path.join(dest, filename);
        const file = filesystem.getFile(filename, followLinks);
        if (path.relative(dest, destFilename).startsWith('..')) {
            throw new Error(`${fullPath}: file "${destFilename}" writes out of the package`);
        }
        if ('files' in file) {
            // it's a directory, create it and continue with the next entry
            wrapped_fs_1.default.mkdirpSync(destFilename);
        }
        else if ('link' in file) {
            // it's a symlink, create a symlink
            const linkSrcPath = path.dirname(path.join(dest, file.link));
            const linkDestPath = path.dirname(destFilename);
            const relativePath = path.relative(linkDestPath, linkSrcPath);
            // try to delete output file, because we can't overwrite a link
            try {
                wrapped_fs_1.default.unlinkSync(destFilename);
            }
            catch (_a) { }
            const linkTo = path.join(relativePath, path.basename(file.link));
            if (path.relative(dest, linkSrcPath).startsWith('..')) {
                throw new Error(`${fullPath}: file "${file.link}" links out of the package to "${linkSrcPath}"`);
            }
            wrapped_fs_1.default.symlinkSync(linkTo, destFilename);
        }
        else {
            // it's a file, try to extract it
            try {
                const content = disk.readFileSync(filesystem, filename, file);
                wrapped_fs_1.default.writeFileSync(destFilename, content);
                if (file.executable) {
                    wrapped_fs_1.default.chmodSync(destFilename, '755');
                }
            }
            catch (e) {
                extractionErrors.push(e);
            }
        }
    }
    if (extractionErrors.length) {
        throw new Error('Unable to extract some files:\n\n' +
            extractionErrors.map((error) => error.stack).join('\n\n'));
    }
}
function uncache(archivePath) {
    return disk.uncacheFilesystem(archivePath);
}
function uncacheAll() {
    disk.uncacheAll();
}
// Export everything in default, too
exports.default = {
    createPackage,
    createPackageWithOptions,
    createPackageFromFiles,
    createPackageFromStreams,
    statFile,
    getRawHeader,
    listPackage,
    extractFile,
    extractAll,
    uncache,
    uncacheAll,
};
//# sourceMappingURL=asar.js.map
1
node_modules/@electron/asar/lib/asar.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
12
node_modules/@electron/asar/lib/crawlfs.d.ts
generated
vendored
Normal file
@@ -0,0 +1,12 @@
import { Stats } from 'fs';
import { IOptions } from './types/glob';
export type CrawledFileType = {
    type: 'file' | 'directory' | 'link';
    stat: Pick<Stats, 'mode' | 'size'>;
    transformed?: {
        path: string;
        stat: Stats;
    };
};
export declare function determineFileType(filename: string): Promise<CrawledFileType | null>;
export declare function crawl(dir: string, options: IOptions): Promise<readonly [string[], Record<string, CrawledFileType>]>;
79
node_modules/@electron/asar/lib/crawlfs.js
generated
vendored
Normal file
@@ -0,0 +1,79 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.determineFileType = determineFileType;
exports.crawl = crawl;
const util_1 = require("util");
const glob_1 = require("glob");
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const path = __importStar(require("path"));
const glob = (0, util_1.promisify)(glob_1.glob);
async function determineFileType(filename) {
    const stat = await wrapped_fs_1.default.lstat(filename);
    if (stat.isFile()) {
        return { type: 'file', stat };
    }
    else if (stat.isDirectory()) {
        return { type: 'directory', stat };
    }
    else if (stat.isSymbolicLink()) {
        return { type: 'link', stat };
    }
    return null;
}
async function crawl(dir, options) {
    const metadata = {};
    const crawled = await glob(dir, options);
    const results = await Promise.all(crawled.map(async (filename) => [filename, await determineFileType(filename)]));
    const links = [];
    const filenames = results
        .map(([filename, type]) => {
        if (type) {
            metadata[filename] = type;
            if (type.type === 'link')
                links.push(filename);
        }
        return filename;
    })
        .filter((filename) => {
        // Newer glob can return files inside symlinked directories, to avoid
        // those appearing in archives we need to manually exclude them here
        const exactLinkIndex = links.findIndex((link) => filename === link);
        return links.every((link, index) => {
            if (index === exactLinkIndex) {
                return true;
            }
            const isFileWithinSymlinkDir = filename.startsWith(link);
            // symlink may point outside the directory: https://github.com/electron/asar/issues/303
            const relativePath = path.relative(link, path.dirname(filename));
            return !isFileWithinSymlinkDir || relativePath.startsWith('..');
        });
    });
    return [filenames, metadata];
}
//# sourceMappingURL=crawlfs.js.map
1
node_modules/@electron/asar/lib/crawlfs.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"crawlfs.js","sourceRoot":"","sources":["../src/crawlfs.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;AAmBA,8CAUC;AAED,sBA8BC;AA7DD,+BAAiC;AACjC,+BAAqC;AAErC,8DAA8B;AAE9B,2CAA6B;AAG7B,MAAM,IAAI,GAAG,IAAA,gBAAS,EAAC,WAAK,CAAC,CAAC;AAWvB,KAAK,UAAU,iBAAiB,CAAC,QAAgB;IACtD,MAAM,IAAI,GAAG,MAAM,oBAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,CAAC;IACtC,IAAI,IAAI,CAAC,MAAM,EAAE,EAAE,CAAC;QAClB,OAAO,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,CAAC;IAChC,CAAC;SAAM,IAAI,IAAI,CAAC,WAAW,EAAE,EAAE,CAAC;QAC9B,OAAO,EAAE,IAAI,EAAE,WAAW,EAAE,IAAI,EAAE,CAAC;IACrC,CAAC;SAAM,IAAI,IAAI,CAAC,cAAc,EAAE,EAAE,CAAC;QACjC,OAAO,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,CAAC;IAChC,CAAC;IACD,OAAO,IAAI,CAAC;AACd,CAAC;AAEM,KAAK,UAAU,KAAK,CAAC,GAAW,EAAE,OAAiB;IACxD,MAAM,QAAQ,GAAoC,EAAE,CAAC;IACrD,MAAM,OAAO,GAAG,MAAM,IAAI,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;IACzC,MAAM,OAAO,GAAG,MAAM,OAAO,CAAC,GAAG,CAC/B,OAAO,CAAC,GAAG,CAAC,KAAK,EAAE,QAAQ,EAAE,EAAE,CAAC,CAAC,QAAQ,EAAE,MAAM,iBAAiB,CAAC,QAAQ,CAAC,CAAU,CAAC,CACxF,CAAC;IACF,MAAM,KAAK,GAAa,EAAE,CAAC;IAC3B,MAAM,SAAS,GAAG,OAAO;SACtB,GAAG,CAAC,CAAC,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE,EAAE;QACxB,IAAI,IAAI,EAAE,CAAC;YACT,QAAQ,CAAC,QAAQ,CAAC,GAAG,IAAI,CAAC;YAC1B,IAAI,IAAI,CAAC,IAAI,KAAK,MAAM;gBAAE,KAAK,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QACjD,CAAC;QACD,OAAO,QAAQ,CAAC;IAClB,CAAC,CAAC;SACD,MAAM,CAAC,CAAC,QAAQ,EAAE,EAAE;QACnB,qEAAqE;QACrE,qEAAqE;QACrE,MAAM,cAAc,GAAG,KAAK,CAAC,SAAS,CAAC,CAAC,IAAI,EAAE,EAAE,CAAC,QAAQ,KAAK,IAAI,CAAC,CAAC;QACpE,OAAO,KAAK,CAAC,KAAK,CAAC,CAAC,IAAI,EAAE,KAAK,EAAE,EAAE;YACjC,IAAI,KAAK,KAAK,cAAc,EAAE,CAAC;gBAC7B,OAAO,IAAI,CAAC;YACd,CAAC;YACD,MAAM,sBAAsB,GAAG,QAAQ,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;YACzD,uFAAuF;YACvF,MAAM,YAAY,GAAG,IAAI,CAAC,QAAQ,CAAC,IAAI,EAAE,IAAI,CAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,CAAC;YACjE,OAAO,CAAC,sBAAsB,IAAI,YAAY,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;QAClE,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;IACL,OAAO,CAAC,SAAS,EAAE,QAAQ,CAAU,CAAC;AACxC,CAAC"}
44
node_modules/@electron/asar/lib/disk.d.ts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { Filesystem, FilesystemFileEntry } from './filesystem';
import { CrawledFileType } from './crawlfs';
import { Stats } from 'fs';
export type InputMetadata = {
    [property: string]: CrawledFileType;
};
export type BasicFilesArray = {
    filename: string;
    unpack: boolean;
}[];
export type BasicStreamArray = {
    filename: string;
    streamGenerator: () => NodeJS.ReadableStream;
    mode: Stats['mode'];
    unpack: boolean;
    link: string | undefined;
}[];
export type FilesystemFilesAndLinks<T extends BasicFilesArray | BasicStreamArray> = {
    files: T;
    links: T;
};
export declare function writeFilesystem(dest: string, filesystem: Filesystem, lists: FilesystemFilesAndLinks<BasicFilesArray>, metadata: InputMetadata): Promise<NodeJS.WritableStream>;
export declare function streamFilesystem(dest: string, filesystem: Filesystem, lists: FilesystemFilesAndLinks<BasicStreamArray>): Promise<import("fs").WriteStream>;
export interface FileRecord extends FilesystemFileEntry {
    integrity: {
        hash: string;
        algorithm: 'SHA256';
        blocks: string[];
        blockSize: number;
    };
}
export type DirectoryRecord = {
    files: Record<string, DirectoryRecord | FileRecord>;
};
export type ArchiveHeader = {
    header: DirectoryRecord;
    headerString: string;
    headerSize: number;
};
export declare function readArchiveHeaderSync(archivePath: string): ArchiveHeader;
export declare function readFilesystemSync(archivePath: string): Filesystem;
export declare function uncacheFilesystem(archivePath: string): boolean;
export declare function uncacheAll(): void;
export declare function readFileSync(filesystem: Filesystem, filename: string, info: FilesystemFileEntry): Buffer;
219
node_modules/@electron/asar/lib/disk.js
generated
vendored
Normal file
@@ -0,0 +1,219 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __asyncValues = (this && this.__asyncValues) || function (o) {
    if (!Symbol.asyncIterator) throw new TypeError("Symbol.asyncIterator is not defined.");
    var m = o[Symbol.asyncIterator], i;
    return m ? m.call(o) : (o = typeof __values === "function" ? __values(o) : o[Symbol.iterator](), i = {}, verb("next"), verb("throw"), verb("return"), i[Symbol.asyncIterator] = function () { return this; }, i);
    function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }
    function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.writeFilesystem = writeFilesystem;
exports.streamFilesystem = streamFilesystem;
exports.readArchiveHeaderSync = readArchiveHeaderSync;
exports.readFilesystemSync = readFilesystemSync;
exports.uncacheFilesystem = uncacheFilesystem;
exports.uncacheAll = uncacheAll;
exports.readFileSync = readFileSync;
const path = __importStar(require("path"));
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const pickle_1 = require("./pickle");
const filesystem_1 = require("./filesystem");
const util_1 = require("util");
const stream = __importStar(require("stream"));
const pipeline = (0, util_1.promisify)(stream.pipeline);
let filesystemCache = Object.create(null);
async function copyFile(dest, src, filename) {
    const srcFile = path.join(src, filename);
    const targetFile = path.join(dest, filename);
    const [content, stats] = await Promise.all([
        wrapped_fs_1.default.readFile(srcFile),
        wrapped_fs_1.default.stat(srcFile),
        wrapped_fs_1.default.mkdirp(path.dirname(targetFile)),
    ]);
    return wrapped_fs_1.default.writeFile(targetFile, content, { mode: stats.mode });
}
async function streamTransformedFile(stream, outStream) {
    return new Promise((resolve, reject) => {
        stream.pipe(outStream, { end: false });
        stream.on('error', reject);
        stream.on('end', () => resolve());
    });
}
const writeFileListToStream = async function (dest, filesystem, out, lists, metadata) {
    const { files, links } = lists;
    for (const file of files) {
        if (file.unpack) {
            // the file should not be packed into archive
            const filename = path.relative(filesystem.getRootPath(), file.filename);
            await copyFile(`${dest}.unpacked`, filesystem.getRootPath(), filename);
        }
        else {
            const transformed = metadata[file.filename].transformed;
            const stream = wrapped_fs_1.default.createReadStream(transformed ? transformed.path : file.filename);
            await streamTransformedFile(stream, out);
        }
    }
    for (const file of links.filter((f) => f.unpack)) {
        // the symlink needs to be recreated outside in .unpacked
        const filename = path.relative(filesystem.getRootPath(), file.filename);
        const link = await wrapped_fs_1.default.readlink(file.filename);
        await createSymlink(dest, filename, link);
    }
    return out.end();
};
async function writeFilesystem(dest, filesystem, lists, metadata) {
    const out = await createFilesystemWriteStream(filesystem, dest);
    return writeFileListToStream(dest, filesystem, out, lists, metadata);
}
async function streamFilesystem(dest, filesystem, lists) {
    var _a, e_1, _b, _c;
    const out = await createFilesystemWriteStream(filesystem, dest);
    const { files, links } = lists;
    try {
        for (var _d = true, files_1 = __asyncValues(files), files_1_1; files_1_1 = await files_1.next(), _a = files_1_1.done, !_a; _d = true) {
            _c = files_1_1.value;
            _d = false;
            const file = _c;
            // the file should not be packed into archive
            if (file.unpack) {
                const targetFile = path.join(`${dest}.unpacked`, file.filename);
                await wrapped_fs_1.default.mkdirp(path.dirname(targetFile));
                const writeStream = wrapped_fs_1.default.createWriteStream(targetFile, { mode: file.mode });
                await pipeline(file.streamGenerator(), writeStream);
            }
            else {
                await streamTransformedFile(file.streamGenerator(), out);
            }
        }
    }
    catch (e_1_1) { e_1 = { error: e_1_1 }; }
    finally {
        try {
            if (!_d && !_a && (_b = files_1.return)) await _b.call(files_1);
        }
        finally { if (e_1) throw e_1.error; }
    }
    for (const file of links.filter((f) => f.unpack && f.link)) {
        // the symlink needs to be recreated outside in .unpacked
        await createSymlink(dest, file.filename, file.link);
    }
    return out.end();
}
function readArchiveHeaderSync(archivePath) {
    const fd = wrapped_fs_1.default.openSync(archivePath, 'r');
    let size;
    let headerBuf;
    try {
        const sizeBuf = Buffer.alloc(8);
        if (wrapped_fs_1.default.readSync(fd, sizeBuf, 0, 8, null) !== 8) {
            throw new Error('Unable to read header size');
        }
        const sizePickle = pickle_1.Pickle.createFromBuffer(sizeBuf);
        size = sizePickle.createIterator().readUInt32();
        headerBuf = Buffer.alloc(size);
        if (wrapped_fs_1.default.readSync(fd, headerBuf, 0, size, null) !== size) {
            throw new Error('Unable to read header');
        }
    }
    finally {
        wrapped_fs_1.default.closeSync(fd);
    }
    const headerPickle = pickle_1.Pickle.createFromBuffer(headerBuf);
    const header = headerPickle.createIterator().readString();
    return { headerString: header, header: JSON.parse(header), headerSize: size };
}
function readFilesystemSync(archivePath) {
    if (!filesystemCache[archivePath]) {
        const header = readArchiveHeaderSync(archivePath);
        const filesystem = new filesystem_1.Filesystem(archivePath);
        filesystem.setHeader(header.header, header.headerSize);
        filesystemCache[archivePath] = filesystem;
    }
    return filesystemCache[archivePath];
}
function uncacheFilesystem(archivePath) {
    if (filesystemCache[archivePath]) {
        filesystemCache[archivePath] = undefined;
        return true;
    }
    return false;
}
function uncacheAll() {
    filesystemCache = {};
}
function readFileSync(filesystem, filename, info) {
    let buffer = Buffer.alloc(info.size);
    if (info.size <= 0) {
        return buffer;
    }
    if (info.unpacked) {
        // it's an unpacked file, copy it.
        buffer = wrapped_fs_1.default.readFileSync(path.join(`${filesystem.getRootPath()}.unpacked`, filename));
    }
    else {
        // Node throws an exception when reading 0 bytes into a 0-size buffer,
        // so we short-circuit the read in this case.
        const fd = wrapped_fs_1.default.openSync(filesystem.getRootPath(), 'r');
        try {
            const offset = 8 + filesystem.getHeaderSize() + parseInt(info.offset);
            wrapped_fs_1.default.readSync(fd, buffer, 0, info.size, offset);
        }
        finally {
            wrapped_fs_1.default.closeSync(fd);
        }
    }
    return buffer;
}
async function createFilesystemWriteStream(filesystem, dest) {
    const headerPickle = pickle_1.Pickle.createEmpty();
    headerPickle.writeString(JSON.stringify(filesystem.getHeader()));
    const headerBuf = headerPickle.toBuffer();
    const sizePickle = pickle_1.Pickle.createEmpty();
    sizePickle.writeUInt32(headerBuf.length);
    const sizeBuf = sizePickle.toBuffer();
    const out = wrapped_fs_1.default.createWriteStream(dest);
    await new Promise((resolve, reject) => {
        out.on('error', reject);
        out.write(sizeBuf);
        return out.write(headerBuf, () => resolve());
    });
    return out;
}
async function createSymlink(dest, filepath, link) {
    // if symlink is within subdirectories, then we need to recreate dir structure
    await wrapped_fs_1.default.mkdirp(path.join(`${dest}.unpacked`, path.dirname(filepath)));
    // create symlink within unpacked dir
    await wrapped_fs_1.default.symlink(link, path.join(`${dest}.unpacked`, filepath)).catch(async (error) => {
        if (error.code === 'EPERM' && error.syscall === 'symlink') {
            throw new Error('Could not create symlinks for unpacked assets. On Windows, consider activating Developer Mode to allow non-admin users to create symlinks by following the instructions at https://docs.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development.');
        }
        throw error;
    });
}
//# sourceMappingURL=disk.js.map
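`readArchiveHeaderSync` and `createFilesystemWriteStream` above frame the archive with Chromium-style Pickles: the first 8 bytes are a Pickle whose 4-byte payload-size prefix is followed by a single UInt32 — the byte length of the JSON header pickle that comes next, after which file data begins at offset `8 + headerSize`. A minimal sketch of just that 8-byte size frame, assuming little-endian UInt32 layout as in the Pickle class (the function names here are illustrative):

```javascript
// Sketch: the leading 8-byte size Pickle of an asar archive.
// Bytes 0-3: pickle payload size (4), bytes 4-7: the header size value
// that readUInt32() returns in readArchiveHeaderSync().
function encodeSizePickle(headerSize) {
  const buf = Buffer.alloc(8);
  buf.writeUInt32LE(4, 0);          // payload size of this pickle
  buf.writeUInt32LE(headerSize, 4); // size of the JSON header pickle
  return buf;
}

function decodeSizePickle(buf) {
  return buf.readUInt32LE(4);
}

console.log(decodeSizePickle(encodeSizePickle(1234))); // 1234
```

This is why `readFileSync` computes file offsets as `8 + filesystem.getHeaderSize() + offset`: the 8 accounts for this size frame.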
1
node_modules/@electron/asar/lib/disk.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
44
node_modules/@electron/asar/lib/filesystem.d.ts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { FileIntegrity } from './integrity';
import { CrawledFileType } from './crawlfs';
export type EntryMetadata = {
    unpacked?: boolean;
};
export type FilesystemDirectoryEntry = {
    files: Record<string, FilesystemEntry>;
} & EntryMetadata;
export type FilesystemFileEntry = {
    unpacked: boolean;
    executable: boolean;
    offset: string;
    size: number;
    integrity: FileIntegrity;
} & EntryMetadata;
export type FilesystemLinkEntry = {
    link: string;
} & EntryMetadata;
export type FilesystemEntry = FilesystemDirectoryEntry | FilesystemFileEntry | FilesystemLinkEntry;
export declare class Filesystem {
    private src;
    private header;
    private headerSize;
    private offset;
    constructor(src: string);
    getRootPath(): string;
    getHeader(): FilesystemEntry;
    getHeaderSize(): number;
    setHeader(header: FilesystemEntry, headerSize: number): void;
    searchNodeFromDirectory(p: string): FilesystemEntry;
    searchNodeFromPath(p: string): FilesystemEntry;
    insertDirectory(p: string, shouldUnpack: boolean): Record<string, FilesystemEntry>;
    insertFile(p: string, streamGenerator: () => NodeJS.ReadableStream, shouldUnpack: boolean, file: CrawledFileType, options?: {
        transform?: (filePath: string) => NodeJS.ReadWriteStream | void;
    }): Promise<void>;
    insertLink(p: string, shouldUnpack: boolean, parentPath?: string, symlink?: string, // /var/tmp => /private/var
    src?: string): string;
    private resolveLink;
    listFiles(options?: {
        isPack: boolean;
    }): string[];
    getNode(p: string, followLinks?: boolean): FilesystemEntry;
    getFile(p: string, followLinks?: boolean): FilesystemEntry;
}
199
node_modules/@electron/asar/lib/filesystem.js
generated
vendored
Normal file
@@ -0,0 +1,199 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.Filesystem = void 0;
const os = __importStar(require("os"));
const path = __importStar(require("path"));
const util_1 = require("util");
const stream = __importStar(require("stream"));
const integrity_1 = require("./integrity");
const wrapped_fs_1 = __importDefault(require("./wrapped-fs"));
const UINT32_MAX = 2 ** 32 - 1;
const pipeline = (0, util_1.promisify)(stream.pipeline);
class Filesystem {
    constructor(src) {
        this.src = path.resolve(src);
        this.header = { files: Object.create(null) };
        this.headerSize = 0;
        this.offset = BigInt(0);
    }
    getRootPath() {
        return this.src;
    }
    getHeader() {
        return this.header;
    }
    getHeaderSize() {
        return this.headerSize;
    }
    setHeader(header, headerSize) {
        this.header = header;
        this.headerSize = headerSize;
    }
    searchNodeFromDirectory(p) {
        let json = this.header;
        const dirs = p.split(path.sep);
        for (const dir of dirs) {
            if (dir !== '.') {
                if ('files' in json) {
                    if (!json.files[dir]) {
                        json.files[dir] = { files: Object.create(null) };
                    }
                    json = json.files[dir];
                }
                else {
                    throw new Error('Unexpected directory state while traversing: ' + p);
                }
            }
        }
        return json;
    }
    searchNodeFromPath(p) {
        p = path.relative(this.src, p);
        if (!p) {
            return this.header;
        }
        const name = path.basename(p);
        const node = this.searchNodeFromDirectory(path.dirname(p));
        if (!node.files) {
            node.files = Object.create(null);
        }
        if (!node.files[name]) {
            node.files[name] = Object.create(null);
        }
        return node.files[name];
    }
    insertDirectory(p, shouldUnpack) {
        const node = this.searchNodeFromPath(p);
        if (shouldUnpack) {
            node.unpacked = shouldUnpack;
        }
        node.files = node.files || Object.create(null);
        return node.files;
    }
    async insertFile(p, streamGenerator, shouldUnpack, file, options = {}) {
        const dirNode = this.searchNodeFromPath(path.dirname(p));
        const node = this.searchNodeFromPath(p);
        if (shouldUnpack || dirNode.unpacked) {
            node.size = file.stat.size;
            node.unpacked = true;
            node.integrity = await (0, integrity_1.getFileIntegrity)(streamGenerator());
            return Promise.resolve();
        }
        let size;
        const transformed = options.transform && options.transform(p);
        if (transformed) {
            const tmpdir = await wrapped_fs_1.default.mkdtemp(path.join(os.tmpdir(), 'asar-'));
            const tmpfile = path.join(tmpdir, path.basename(p));
            const out = wrapped_fs_1.default.createWriteStream(tmpfile);
            await pipeline(streamGenerator(), transformed, out);
            file.transformed = {
                path: tmpfile,
                stat: await wrapped_fs_1.default.lstat(tmpfile),
            };
            size = file.transformed.stat.size;
        }
        else {
            size = file.stat.size;
        }
        // JavaScript cannot precisely represent integers >= UINT32_MAX.
        if (size > UINT32_MAX) {
            throw new Error(`${p}: file size cannot be larger than 4.2GB`);
        }
        node.size = size;
        node.offset = this.offset.toString();
        node.integrity = await (0, integrity_1.getFileIntegrity)(streamGenerator());
        if (process.platform !== 'win32' && file.stat.mode & 0o100) {
            node.executable = true;
        }
        this.offset += BigInt(size);
    }
    insertLink(p, shouldUnpack, parentPath = wrapped_fs_1.default.realpathSync(path.dirname(p)), symlink = wrapped_fs_1.default.readlinkSync(p), // /var/tmp => /private/var
    src = wrapped_fs_1.default.realpathSync(this.src)) {
        const link = this.resolveLink(src, parentPath, symlink);
        if (link.startsWith('..')) {
            throw new Error(`${p}: file "${link}" links out of the package`);
        }
        const node = this.searchNodeFromPath(p);
        const dirNode = this.searchNodeFromPath(path.dirname(p));
        if (shouldUnpack || dirNode.unpacked) {
            node.unpacked = true;
        }
        node.link = link;
        return link;
    }
    resolveLink(src, parentPath, symlink) {
        const target = path.join(parentPath, symlink);
        const link = path.relative(src, target);
        return link;
    }
    listFiles(options) {
        const files = [];
        const fillFilesFromMetadata = function (basePath, metadata) {
            if (!('files' in metadata)) {
                return;
            }
            for (const [childPath, childMetadata] of Object.entries(metadata.files)) {
                const fullPath = path.join(basePath, childPath);
                const packState = 'unpacked' in childMetadata && childMetadata.unpacked ? 'unpack' : 'pack ';
                files.push(options && options.isPack ? `${packState} : ${fullPath}` : fullPath);
                fillFilesFromMetadata(fullPath, childMetadata);
            }
        };
        fillFilesFromMetadata('/', this.header);
        return files;
    }
    getNode(p, followLinks = true) {
        const node = this.searchNodeFromDirectory(path.dirname(p));
        const name = path.basename(p);
        if ('link' in node && followLinks) {
            return this.getNode(path.join(node.link, name));
        }
        if (name) {
            return node.files[name];
        }
        else {
            return node;
        }
    }
    getFile(p, followLinks = true) {
        const info = this.getNode(p, followLinks);
        if (!info) {
            throw new Error(`"${p}" was not found in this archive`);
        }
        // if followLinks is false we don't resolve symlinks
        if ('link' in info && followLinks) {
            return this.getFile(info.link, followLinks);
        }
        else {
            return info;
        }
    }
}
exports.Filesystem = Filesystem;
//# sourceMappingURL=filesystem.js.map
1
node_modules/@electron/asar/lib/filesystem.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
7
node_modules/@electron/asar/lib/integrity.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
export type FileIntegrity = {
    algorithm: 'SHA256';
    hash: string;
    blockSize: number;
    blocks: string[];
};
export declare function getFileIntegrity(inputFileStream: NodeJS.ReadableStream): Promise<FileIntegrity>;
75
node_modules/@electron/asar/lib/integrity.js
generated
vendored
Normal file
@@ -0,0 +1,75 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getFileIntegrity = getFileIntegrity;
const crypto = __importStar(require("crypto"));
const stream = __importStar(require("stream"));
const util_1 = require("util");
const ALGORITHM = 'SHA256';
// 4MB default block size
const BLOCK_SIZE = 4 * 1024 * 1024;
const pipeline = (0, util_1.promisify)(stream.pipeline);
function hashBlock(block) {
    return crypto.createHash(ALGORITHM).update(block).digest('hex');
}
async function getFileIntegrity(inputFileStream) {
    const fileHash = crypto.createHash(ALGORITHM);
    const blockHashes = [];
    let currentBlockSize = 0;
    let currentBlock = [];
    await pipeline(inputFileStream, new stream.PassThrough({
        decodeStrings: false,
        transform(_chunk, encoding, callback) {
            fileHash.update(_chunk);
            function handleChunk(chunk) {
                const diffToSlice = Math.min(BLOCK_SIZE - currentBlockSize, chunk.byteLength);
                currentBlockSize += diffToSlice;
                currentBlock.push(chunk.slice(0, diffToSlice));
                if (currentBlockSize === BLOCK_SIZE) {
                    blockHashes.push(hashBlock(Buffer.concat(currentBlock)));
                    currentBlock = [];
                    currentBlockSize = 0;
                }
                if (diffToSlice < chunk.byteLength) {
                    handleChunk(chunk.slice(diffToSlice));
                }
            }
            handleChunk(_chunk);
            callback();
        },
        flush(callback) {
            blockHashes.push(hashBlock(Buffer.concat(currentBlock)));
            currentBlock = [];
            callback();
        },
    }));
    return {
        algorithm: ALGORITHM,
        hash: fileHash.digest('hex'),
        blockSize: BLOCK_SIZE,
        blocks: blockHashes,
    };
}
//# sourceMappingURL=integrity.js.map
1
node_modules/@electron/asar/lib/integrity.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"integrity.js","sourceRoot":"","sources":["../src/integrity.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;AAsBA,4CA8CC;AApED,+CAAiC;AAEjC,+CAAiC;AACjC,+BAAiC;AAEjC,MAAM,SAAS,GAAG,QAAQ,CAAC;AAC3B,yBAAyB;AACzB,MAAM,UAAU,GAAG,CAAC,GAAG,IAAI,GAAG,IAAI,CAAC;AAEnC,MAAM,QAAQ,GAAG,IAAA,gBAAS,EAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;AAE5C,SAAS,SAAS,CAAC,KAAa;IAC9B,OAAO,MAAM,CAAC,UAAU,CAAC,SAAS,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;AAClE,CAAC;AASM,KAAK,UAAU,gBAAgB,CACpC,eAAsC;IAEtC,MAAM,QAAQ,GAAG,MAAM,CAAC,UAAU,CAAC,SAAS,CAAC,CAAC;IAE9C,MAAM,WAAW,GAAa,EAAE,CAAC;IACjC,IAAI,gBAAgB,GAAG,CAAC,CAAC;IACzB,IAAI,YAAY,GAAa,EAAE,CAAC;IAEhC,MAAM,QAAQ,CACZ,eAAe,EACf,IAAI,MAAM,CAAC,WAAW,CAAC;QACrB,aAAa,EAAE,KAAK;QACpB,SAAS,CAAC,MAAc,EAAE,QAAQ,EAAE,QAAQ;YAC1C,QAAQ,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC;YAExB,SAAS,WAAW,CAAC,KAAa;gBAChC,MAAM,WAAW,GAAG,IAAI,CAAC,GAAG,CAAC,UAAU,GAAG,gBAAgB,EAAE,KAAK,CAAC,UAAU,CAAC,CAAC;gBAC9E,gBAAgB,IAAI,WAAW,CAAC;gBAChC,YAAY,CAAC,IAAI,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,EAAE,WAAW,CAAC,CAAC,CAAC;gBAC/C,IAAI,gBAAgB,KAAK,UAAU,EAAE,CAAC;oBACpC,WAAW,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC;oBACzD,YAAY,GAAG,EAAE,CAAC;oBAClB,gBAAgB,GAAG,CAAC,CAAC;gBACvB,CAAC;gBACD,IAAI,WAAW,GAAG,KAAK,CAAC,UAAU,EAAE,CAAC;oBACnC,WAAW,CAAC,KAAK,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC,CAAC;gBACxC,CAAC;YACH,CAAC;YACD,WAAW,CAAC,MAAM,CAAC,CAAC;YACpB,QAAQ,EAAE,CAAC;QACb,CAAC;QACD,KAAK,CAAC,QAAQ;YACZ,WAAW,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC;YACzD,YAAY,GAAG,EAAE,CAAC;YAClB,QAAQ,EAAE,CAAC;QACb,CAAC;KACF,CAAC,CACH,CAAC;IAEF,OAAO;QACL,SAAS,EAAE,SAAS;QACpB,IAAI,EAAE,QAAQ,CAAC,MAAM,CAAC,KAAK,CAAC;QAC5B,SAAS,EAAE,UAAU;QACrB,MAAM,EAAE,WAAW;KACpB,CAAC;AACJ,CAAC"}
46
node_modules/@electron/asar/lib/pickle.d.ts
generated
vendored
Normal file
@@ -0,0 +1,46 @@
declare class PickleIterator {
    private payload;
    private payloadOffset;
    private readIndex;
    private endIndex;
    constructor(pickle: Pickle);
    readBool(): boolean;
    readInt(): number;
    readUInt32(): number;
    readInt64(): bigint;
    readUInt64(): bigint;
    readFloat(): number;
    readDouble(): number;
    readString(): string;
    readBytes(length: number): Buffer;
    readBytes<R, F extends (...args: any[]) => R>(length: number, method: F): R;
    getReadPayloadOffsetAndAdvance(length: number): number;
    advance(size: number): void;
}
export declare class Pickle {
    private header;
    private headerSize;
    private capacityAfterHeader;
    private writeOffset;
    private constructor();
    static createEmpty(): Pickle;
    static createFromBuffer(buffer: Buffer): Pickle;
    getHeader(): Buffer;
    getHeaderSize(): number;
    createIterator(): PickleIterator;
    toBuffer(): Buffer;
    writeBool(value: boolean): boolean;
    writeInt(value: number): boolean;
    writeUInt32(value: number): boolean;
    writeInt64(value: number): boolean;
    writeUInt64(value: number): boolean;
    writeFloat(value: number): boolean;
    writeDouble(value: number): boolean;
    writeString(value: string): boolean;
    setPayloadSize(payloadSize: number): number;
    getPayloadSize(): number;
    writeBytes(data: string, length: number, method?: undefined): boolean;
    writeBytes(data: number | BigInt, length: number, method: Function): boolean;
    resize(newCapacity: number): void;
}
export {};
199
node_modules/@electron/asar/lib/pickle.js
generated
vendored
Normal file
@@ -0,0 +1,199 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Pickle = void 0;
// sizeof(T).
const SIZE_INT32 = 4;
const SIZE_UINT32 = 4;
const SIZE_INT64 = 8;
const SIZE_UINT64 = 8;
const SIZE_FLOAT = 4;
const SIZE_DOUBLE = 8;
// The allocation granularity of the payload.
const PAYLOAD_UNIT = 64;
// Largest JS number.
const CAPACITY_READ_ONLY = 9007199254740992;
// Aligns 'i' by rounding it up to the next multiple of 'alignment'.
const alignInt = function (i, alignment) {
    return i + ((alignment - (i % alignment)) % alignment);
};
// PickleIterator reads data from a Pickle. The Pickle object must remain valid
// while the PickleIterator object is in use.
class PickleIterator {
    constructor(pickle) {
        this.payload = pickle.getHeader();
        this.payloadOffset = pickle.getHeaderSize();
        this.readIndex = 0;
        this.endIndex = pickle.getPayloadSize();
    }
    readBool() {
        return this.readInt() !== 0;
    }
    readInt() {
        return this.readBytes(SIZE_INT32, Buffer.prototype.readInt32LE);
    }
    readUInt32() {
        return this.readBytes(SIZE_UINT32, Buffer.prototype.readUInt32LE);
    }
    readInt64() {
        return this.readBytes(SIZE_INT64, Buffer.prototype.readBigInt64LE);
    }
    readUInt64() {
        return this.readBytes(SIZE_UINT64, Buffer.prototype.readBigUInt64LE);
    }
    readFloat() {
        return this.readBytes(SIZE_FLOAT, Buffer.prototype.readFloatLE);
    }
    readDouble() {
        return this.readBytes(SIZE_DOUBLE, Buffer.prototype.readDoubleLE);
    }
    readString() {
        return this.readBytes(this.readInt()).toString();
    }
    readBytes(length, method) {
        const readPayloadOffset = this.getReadPayloadOffsetAndAdvance(length);
        if (method != null) {
            return method.call(this.payload, readPayloadOffset, length);
        }
        else {
            return this.payload.slice(readPayloadOffset, readPayloadOffset + length);
        }
    }
    getReadPayloadOffsetAndAdvance(length) {
        if (length > this.endIndex - this.readIndex) {
            this.readIndex = this.endIndex;
            throw new Error('Failed to read data with length of ' + length);
        }
        const readPayloadOffset = this.payloadOffset + this.readIndex;
        this.advance(length);
        return readPayloadOffset;
    }
    advance(size) {
        const alignedSize = alignInt(size, SIZE_UINT32);
        if (this.endIndex - this.readIndex < alignedSize) {
            this.readIndex = this.endIndex;
        }
        else {
            this.readIndex += alignedSize;
        }
    }
}
// This class provides facilities for basic binary value packing and unpacking.
//
// The Pickle class supports appending primitive values (ints, strings, etc.)
// to a pickle instance. The Pickle instance grows its internal memory buffer
// dynamically to hold the sequence of primitive values. The internal memory
// buffer is exposed as the "data" of the Pickle. This "data" can be passed
// to a Pickle object to initialize it for reading.
//
// When reading from a Pickle object, it is important for the consumer to know
// what value types to read and in what order to read them as the Pickle does
// not keep track of the type of data written to it.
//
// The Pickle's data has a header which contains the size of the Pickle's
// payload. It can optionally support additional space in the header. That
// space is controlled by the header_size parameter passed to the Pickle
// constructor.
class Pickle {
    constructor(buffer) {
        if (buffer) {
            this.header = buffer;
            this.headerSize = buffer.length - this.getPayloadSize();
            this.capacityAfterHeader = CAPACITY_READ_ONLY;
            this.writeOffset = 0;
            if (this.headerSize > buffer.length) {
                this.headerSize = 0;
            }
            if (this.headerSize !== alignInt(this.headerSize, SIZE_UINT32)) {
                this.headerSize = 0;
            }
            if (this.headerSize === 0) {
                this.header = Buffer.alloc(0);
            }
        }
        else {
            this.header = Buffer.alloc(0);
            this.headerSize = SIZE_UINT32;
            this.capacityAfterHeader = 0;
            this.writeOffset = 0;
            this.resize(PAYLOAD_UNIT);
            this.setPayloadSize(0);
        }
    }
    static createEmpty() {
        return new Pickle();
    }
    static createFromBuffer(buffer) {
        return new Pickle(buffer);
    }
    getHeader() {
        return this.header;
    }
    getHeaderSize() {
        return this.headerSize;
    }
    createIterator() {
        return new PickleIterator(this);
    }
    toBuffer() {
        return this.header.slice(0, this.headerSize + this.getPayloadSize());
    }
    writeBool(value) {
        return this.writeInt(value ? 1 : 0);
    }
    writeInt(value) {
        return this.writeBytes(value, SIZE_INT32, Buffer.prototype.writeInt32LE);
    }
    writeUInt32(value) {
        return this.writeBytes(value, SIZE_UINT32, Buffer.prototype.writeUInt32LE);
    }
    writeInt64(value) {
        return this.writeBytes(BigInt(value), SIZE_INT64, Buffer.prototype.writeBigInt64LE);
    }
    writeUInt64(value) {
        return this.writeBytes(BigInt(value), SIZE_UINT64, Buffer.prototype.writeBigUInt64LE);
    }
    writeFloat(value) {
        return this.writeBytes(value, SIZE_FLOAT, Buffer.prototype.writeFloatLE);
    }
    writeDouble(value) {
        return this.writeBytes(value, SIZE_DOUBLE, Buffer.prototype.writeDoubleLE);
    }
    writeString(value) {
        const length = Buffer.byteLength(value, 'utf8');
        if (!this.writeInt(length)) {
            return false;
        }
        return this.writeBytes(value, length);
    }
    setPayloadSize(payloadSize) {
        return this.header.writeUInt32LE(payloadSize, 0);
    }
    getPayloadSize() {
        return this.header.readUInt32LE(0);
    }
    writeBytes(data, length, method) {
        const dataLength = alignInt(length, SIZE_UINT32);
        const newSize = this.writeOffset + dataLength;
        if (newSize > this.capacityAfterHeader) {
            this.resize(Math.max(this.capacityAfterHeader * 2, newSize));
        }
        if (method) {
            method.call(this.header, data, this.headerSize + this.writeOffset);
        }
        else {
            this.header.write(data, this.headerSize + this.writeOffset, length);
        }
        const endOffset = this.headerSize + this.writeOffset + length;
        this.header.fill(0, endOffset, endOffset + dataLength - length);
        this.setPayloadSize(newSize);
        this.writeOffset = newSize;
        return true;
    }
    resize(newCapacity) {
        newCapacity = alignInt(newCapacity, PAYLOAD_UNIT);
        this.header = Buffer.concat([this.header, Buffer.alloc(newCapacity)]);
        this.capacityAfterHeader = newCapacity;
    }
}
exports.Pickle = Pickle;
//# sourceMappingURL=pickle.js.map
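The comment block in `pickle.js` explains the wire format: a uint32 header holding the payload size, followed by values padded to 4-byte alignment, with no type information stored. A minimal self-contained sketch of that layout for a single string, using plain Node `Buffer` calls (this is an illustration of the format, not the module itself):

```javascript
const SIZE_UINT32 = 4;
const alignInt = (i, alignment) => i + ((alignment - (i % alignment)) % alignment);

// Mirrors writeString(): uint32 payload-size header, int32 length prefix,
// then the utf8 bytes zero-padded to a 4-byte boundary.
function writeStringPickle(str) {
    const strLen = Buffer.byteLength(str, 'utf8');
    const payloadSize = SIZE_UINT32 + alignInt(strLen, SIZE_UINT32);
    const buf = Buffer.alloc(SIZE_UINT32 + payloadSize); // header + payload
    buf.writeUInt32LE(payloadSize, 0);       // header: payload size
    buf.writeInt32LE(strLen, SIZE_UINT32);   // length prefix
    buf.write(str, 2 * SIZE_UINT32, strLen); // utf8 bytes (padding stays zero)
    return buf;
}

// Reading requires knowing the order and types that were written,
// since the format stores no type tags.
function readStringPickle(buf) {
    const strLen = buf.readInt32LE(SIZE_UINT32);
    return buf.toString('utf8', 2 * SIZE_UINT32, 2 * SIZE_UINT32 + strLen);
}

console.log(readStringPickle(writeStringPickle('asar'))); // prints asar
```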
1
node_modules/@electron/asar/lib/pickle.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
157
node_modules/@electron/asar/lib/types/glob.d.ts
generated
vendored
Normal file
@@ -0,0 +1,157 @@
/**
 * TODO(erikian): remove this file once we upgrade to the latest `glob` version.
 * https://github.com/electron/asar/pull/332#issuecomment-2435407933
 */
interface IMinimatchOptions {
    /**
     * Dump a ton of stuff to stderr.
     *
     * @default false
     */
    debug?: boolean | undefined;
    /**
     * Do not expand `{a,b}` and `{1..3}` brace sets.
     *
     * @default false
     */
    nobrace?: boolean | undefined;
    /**
     * Disable `**` matching against multiple folder names.
     *
     * @default false
     */
    noglobstar?: boolean | undefined;
    /**
     * Allow patterns to match filenames starting with a period,
     * even if the pattern does not explicitly have a period in that spot.
     *
     * Note that by default, `'a/**' + '/b'` will **not** match `a/.d/b`, unless `dot` is set.
     *
     * @default false
     */
    dot?: boolean | undefined;
    /**
     * Disable "extglob" style patterns like `+(a|b)`.
     *
     * @default false
     */
    noext?: boolean | undefined;
    /**
     * Perform a case-insensitive match.
     *
     * @default false
     */
    nocase?: boolean | undefined;
    /**
     * When a match is not found by `minimatch.match`,
     * return a list containing the pattern itself if this option is set.
     * Otherwise, an empty list is returned if there are no matches.
     *
     * @default false
     */
    nonull?: boolean | undefined;
    /**
     * If set, then patterns without slashes will be matched
     * against the basename of the path if it contains slashes. For example,
     * `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`.
     *
     * @default false
     */
    matchBase?: boolean | undefined;
    /**
     * Suppress the behavior of treating `#` at the start of a pattern as a comment.
     *
     * @default false
     */
    nocomment?: boolean | undefined;
    /**
     * Suppress the behavior of treating a leading `!` character as negation.
     *
     * @default false
     */
    nonegate?: boolean | undefined;
    /**
     * Returns from negate expressions the same as if they were not negated.
     * (Ie, true on a hit, false on a miss.)
     *
     * @default false
     */
    flipNegate?: boolean | undefined;
    /**
     * Compare a partial path to a pattern. As long as the parts of the path that
     * are present are not contradicted by the pattern, it will be treated as a
     * match. This is useful in applications where you're walking through a
     * folder structure, and don't yet have the full path, but want to ensure that
     * you do not walk down paths that can never be a match.
     *
     * @default false
     *
     * @example
     * import minimatch = require("minimatch");
     *
     * minimatch('/a/b', '/a/*' + '/c/d', { partial: true }) // true, might be /a/b/c/d
     * minimatch('/a/b', '/**' + '/d', { partial: true }) // true, might be /a/b/.../d
     * minimatch('/x/y/z', '/a/**' + '/z', { partial: true }) // false, because x !== a
     */
    partial?: boolean;
    /**
     * Use `\\` as a path separator _only_, and _never_ as an escape
     * character. If set, all `\\` characters are replaced with `/` in
     * the pattern. Note that this makes it **impossible** to match
     * against paths containing literal glob pattern characters, but
     * allows matching with patterns constructed using `path.join()` and
     * `path.resolve()` on Windows platforms, mimicking the (buggy!)
     * behavior of earlier versions on Windows. Please use with
     * caution, and be mindful of the caveat about Windows paths
     *
     * For legacy reasons, this is also set if
     * `options.allowWindowsEscape` is set to the exact value `false`.
     *
     * @default false
     */
    windowsPathsNoEscape?: boolean;
}
export interface IOptions extends IMinimatchOptions {
    cwd?: string | undefined;
    root?: string | undefined;
    dot?: boolean | undefined;
    nomount?: boolean | undefined;
    mark?: boolean | undefined;
    nosort?: boolean | undefined;
    stat?: boolean | undefined;
    silent?: boolean | undefined;
    strict?: boolean | undefined;
    cache?: {
        [path: string]: boolean | 'DIR' | 'FILE' | ReadonlyArray<string>;
    } | undefined;
    statCache?: {
        [path: string]: false | {
            isDirectory(): boolean;
        } | undefined;
    } | undefined;
    symlinks?: {
        [path: string]: boolean | undefined;
    } | undefined;
    realpathCache?: {
        [path: string]: string;
    } | undefined;
    sync?: boolean | undefined;
    nounique?: boolean | undefined;
    nonull?: boolean | undefined;
    debug?: boolean | undefined;
    nobrace?: boolean | undefined;
    noglobstar?: boolean | undefined;
    noext?: boolean | undefined;
    nocase?: boolean | undefined;
    matchBase?: any;
    nodir?: boolean | undefined;
    ignore?: string | ReadonlyArray<string> | undefined;
    follow?: boolean | undefined;
    realpath?: boolean | undefined;
    nonegate?: boolean | undefined;
    nocomment?: boolean | undefined;
    absolute?: boolean | undefined;
    allowWindowsEscape?: boolean | undefined;
    fs?: typeof import('fs');
}
export {};
3
node_modules/@electron/asar/lib/types/glob.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=glob.js.map
1
node_modules/@electron/asar/lib/types/glob.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"glob.js","sourceRoot":"","sources":["../../src/types/glob.ts"],"names":[],"mappings":""}
13
node_modules/@electron/asar/lib/wrapped-fs.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
type AsarFS = typeof import('fs') & {
    mkdirp(dir: string): Promise<void>;
    mkdirpSync(dir: string): void;
    lstat: (typeof import('fs'))['promises']['lstat'];
    mkdtemp: (typeof import('fs'))['promises']['mkdtemp'];
    readFile: (typeof import('fs'))['promises']['readFile'];
    stat: (typeof import('fs'))['promises']['stat'];
    writeFile: (typeof import('fs'))['promises']['writeFile'];
    symlink: (typeof import('fs'))['promises']['symlink'];
    readlink: (typeof import('fs'))['promises']['readlink'];
};
declare const promisified: AsarFS;
export default promisified;
26
node_modules/@electron/asar/lib/wrapped-fs.js
generated
vendored
Normal file
@@ -0,0 +1,26 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const fs = 'electron' in process.versions ? require('original-fs') : require('fs');
const promisifiedMethods = [
    'lstat',
    'mkdtemp',
    'readFile',
    'stat',
    'writeFile',
    'symlink',
    'readlink',
];
const promisified = {};
for (const method of Object.keys(fs)) {
    if (promisifiedMethods.includes(method)) {
        promisified[method] = fs.promises[method];
    }
    else {
        promisified[method] = fs[method];
    }
}
// To make it more like fs-extra
promisified.mkdirp = (dir) => fs.promises.mkdir(dir, { recursive: true });
promisified.mkdirpSync = (dir) => fs.mkdirSync(dir, { recursive: true });
exports.default = promisified;
//# sourceMappingURL=wrapped-fs.js.map
1
node_modules/@electron/asar/lib/wrapped-fs.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"wrapped-fs.js","sourceRoot":"","sources":["../src/wrapped-fs.ts"],"names":[],"mappings":";;AAAA,MAAM,EAAE,GAAG,UAAU,IAAI,OAAO,CAAC,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;AAEnF,MAAM,kBAAkB,GAAG;IACzB,OAAO;IACP,SAAS;IACT,UAAU;IACV,MAAM;IACN,WAAW;IACX,SAAS;IACT,UAAU;CACX,CAAC;AAcF,MAAM,WAAW,GAAG,EAAY,CAAC;AAEjC,KAAK,MAAM,MAAM,IAAI,MAAM,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,CAAC;IACrC,IAAI,kBAAkB,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;QACvC,WAAmB,CAAC,MAAM,CAAC,GAAG,EAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC;IACrD,CAAC;SAAM,CAAC;QACL,WAAmB,CAAC,MAAM,CAAC,GAAG,EAAE,CAAC,MAAM,CAAC,CAAC;IAC5C,CAAC;AACH,CAAC;AACD,gCAAgC;AAChC,WAAW,CAAC,MAAM,GAAG,CAAC,GAAG,EAAE,EAAE,CAAC,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,GAAG,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;AAC1E,WAAW,CAAC,UAAU,GAAG,CAAC,GAAG,EAAE,EAAE,CAAC,EAAE,CAAC,SAAS,CAAC,GAAG,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC,CAAC;AAEzE,kBAAe,WAAW,CAAC"}
60
node_modules/@electron/asar/package.json
generated
vendored
Normal file
@@ -0,0 +1,60 @@
{
  "name": "@electron/asar",
  "description": "Creating Electron app packages",
  "version": "3.4.1",
  "main": "./lib/asar.js",
  "types": "./lib/asar.d.ts",
  "bin": {
    "asar": "./bin/asar.js"
  },
  "files": [
    "bin",
    "lib"
  ],
  "engines": {
    "node": ">=10.12.0"
  },
  "license": "MIT",
  "homepage": "https://github.com/electron/asar",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/electron/asar.git"
  },
  "bugs": {
    "url": "https://github.com/electron/asar/issues"
  },
  "publishConfig": {
    "provenance": true
  },
  "scripts": {
    "build": "tsc",
    "mocha": "xvfb-maybe electron-mocha && mocha",
    "mocha:update": "mocha --update",
    "mocha:watch": "mocha --watch",
    "test": "yarn lint && yarn mocha",
    "lint": "yarn prettier:check",
    "prettier": "prettier \"src/**/*.ts\" \"test/**/*.js\" \"*.js\"",
    "prettier:check": "yarn prettier --check",
    "prettier:write": "yarn prettier --write",
    "prepare": "tsc"
  },
  "dependencies": {
    "commander": "^5.0.0",
    "glob": "^7.1.6",
    "minimatch": "^3.0.4"
  },
  "devDependencies": {
    "@types/minimatch": "^3.0.5",
    "@types/node": "^12.0.0",
    "chai": "^4.5.0",
    "electron": "^22.0.0",
    "electron-mocha": "^13.0.1",
    "lodash": "^4.17.15",
    "mocha": "^10.1.0",
    "mocha-chai-jest-snapshot": "^1.1.6",
    "prettier": "^3.3.3",
    "rimraf": "^3.0.2",
    "typescript": "^5.5.4",
    "xvfb-maybe": "^0.2.1"
  }
}
21
node_modules/@electron/get/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) Contributors to the Electron project

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
138
node_modules/@electron/get/README.md
generated
vendored
Normal file
@@ -0,0 +1,138 @@
# @electron/get

> Download Electron release artifacts

[](https://circleci.com/gh/electron/get)
[](https://npm.im/@electron/get)

## Usage

### Simple: Downloading an Electron Binary ZIP

```typescript
import { download } from '@electron/get';

// NB: Use this syntax within an async function, Node does not have support for
// top-level await as of Node 12.
const zipFilePath = await download('4.0.4');
```

### Advanced: Downloading a macOS Electron Symbol File

```typescript
import { downloadArtifact } from '@electron/get';

// NB: Use this syntax within an async function, Node does not have support for
// top-level await as of Node 12.
const zipFilePath = await downloadArtifact({
  version: '4.0.4',
  platform: 'darwin',
  artifactName: 'electron',
  artifactSuffix: 'symbols',
  arch: 'x64',
});
```

### Specifying a mirror

To specify another location to download Electron assets from, the following options are
available:

* `mirrorOptions` Object
  * `mirror` String (optional) - The base URL of the mirror to download from.
  * `nightlyMirror` String (optional) - The Electron nightly-specific mirror URL.
  * `customDir` String (optional) - The name of the directory to download from, often scoped by version number.
  * `customFilename` String (optional) - The name of the asset to download.
  * `resolveAssetURL` Function (optional) - A function allowing customization of the URL used to download the asset.

Anatomy of a download URL, in terms of `mirrorOptions`:

```
https://github.com/electron/electron/releases/download/v4.0.4/electron-v4.0.4-linux-x64.zip
\_____________________________________________________/\____/ \___________________________/
                 mirror / nightlyMirror               customDir       customFilename
```

Example:

```typescript
import { download } from '@electron/get';

const zipFilePath = await download('4.0.4', {
  mirrorOptions: {
    mirror: 'https://mirror.example.com/electron/',
    customDir: 'custom',
    customFilename: 'unofficial-electron-linux.zip'
  }
});
// Will download from https://mirror.example.com/electron/custom/unofficial-electron-linux.zip

const nightlyZipFilePath = await download('8.0.0-nightly.20190901', {
  mirrorOptions: {
    nightlyMirror: 'https://nightly.example.com/',
    customDir: 'nightlies',
    customFilename: 'nightly-linux.zip'
  }
});
// Will download from https://nightly.example.com/nightlies/nightly-linux.zip
```

`customDir` can have the placeholder `{{ version }}`, which will be replaced by the version
specified (without the leading `v`). For example:

```javascript
const zipFilePath = await download('4.0.4', {
  mirrorOptions: {
    mirror: 'https://mirror.example.com/electron/',
    customDir: 'version-{{ version }}',
    platform: 'linux',
    arch: 'x64'
  }
});
// Will download from https://mirror.example.com/electron/version-4.0.4/electron-v4.0.4-linux-x64.zip
```

#### Using environment variables for mirror options

Mirror options can also be specified via the following environment variables:

* `ELECTRON_CUSTOM_DIR` - Specifies the custom directory to download from.
* `ELECTRON_CUSTOM_FILENAME` - Specifies the custom file name to download.
* `ELECTRON_MIRROR` - Specifies the URL of the server to download from if the version is not a nightly version.
* `ELECTRON_NIGHTLY_MIRROR` - Specifies the URL of the server to download from if the version is a nightly version.

### Overriding the version downloaded

The version downloaded can be overridden by setting the `ELECTRON_CUSTOM_VERSION` environment variable.
Setting this environment variable will override the version passed in to `download` or `downloadArtifact`.

## How It Works

This module downloads Electron to a known place on your system and caches it
so that future requests for that asset can be returned instantly. The cache
locations are:

* Linux: `$XDG_CACHE_HOME` or `~/.cache/electron/`
* MacOS: `~/Library/Caches/electron/`
* Windows: `%LOCALAPPDATA%/electron/Cache` or `~/AppData/Local/electron/Cache/`

By default, the module uses [`got`](https://github.com/sindresorhus/got) as the
downloader. As a result, you can use the same [options](https://github.com/sindresorhus/got#options)
via `downloadOptions`.

### Progress Bar

By default, a progress bar is shown when downloading an artifact for more than 30 seconds. To
disable, set the `ELECTRON_GET_NO_PROGRESS` environment variable to any non-empty value, or set
`quiet` to `true` in `downloadOptions`. If you need to monitor progress yourself via the API, set
`getProgressCallback` in `downloadOptions`, which has the same function signature as `got`'s
[`downloadProgress` event callback](https://github.com/sindresorhus/got#ondownloadprogress-progress).

### Proxies

Downstream packages should utilize the `initializeProxy` function to add HTTP(S) proxy support. If
the environment variable `ELECTRON_GET_USE_PROXY` is set, it is called automatically.
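The README's URL anatomy boils down to string composition: `mirror` (or `nightlyMirror`) plus `customDir` plus `customFilename`, with `{{ version }}` in `customDir` substituted by the version minus its leading `v`. A hedged sketch of that composition (the `assetUrl` helper is illustrative, not part of the package's API):

```javascript
// Compose a download URL from mirrorOptions-style fields, including the
// `{{ version }}` placeholder substitution described in the README.
function assetUrl({ mirror, customDir, customFilename }, version) {
    const dir = customDir.replace('{{ version }}', version.replace(/^v/, ''));
    return `${mirror}${dir}/${customFilename}`;
}

console.log(assetUrl({
    mirror: 'https://mirror.example.com/electron/',
    customDir: 'version-{{ version }}',
    customFilename: 'electron-v4.0.4-linux-x64.zip',
}, '4.0.4'));
// prints https://mirror.example.com/electron/version-4.0.4/electron-v4.0.4-linux-x64.zip
```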
8
node_modules/@electron/get/dist/cjs/Cache.d.ts
generated
vendored
Normal file
@@ -0,0 +1,8 @@
export declare class Cache {
    private cacheRoot;
    constructor(cacheRoot?: string);
    static getCacheDirectory(downloadUrl: string): string;
    getCachePath(downloadUrl: string, fileName: string): string;
    getPathForFileInCache(url: string, fileName: string): Promise<string | null>;
    putFileInCache(url: string, currentPath: string, fileName: string): Promise<string>;
}
60
node_modules/@electron/get/dist/cjs/Cache.js
generated
vendored
Normal file
@@ -0,0 +1,60 @@
"use strict";
var __rest = (this && this.__rest) || function (s, e) {
    var t = {};
    for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)
        t[p] = s[p];
    if (s != null && typeof Object.getOwnPropertySymbols === "function")
        for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {
            if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))
                t[p[i]] = s[p[i]];
        }
    return t;
};
Object.defineProperty(exports, "__esModule", { value: true });
const debug_1 = require("debug");
const env_paths_1 = require("env-paths");
const fs = require("fs-extra");
const path = require("path");
const url = require("url");
const crypto = require("crypto");
const d = debug_1.default('@electron/get:cache');
const defaultCacheRoot = env_paths_1.default('electron', {
    suffix: '',
}).cache;
class Cache {
    constructor(cacheRoot = defaultCacheRoot) {
        this.cacheRoot = cacheRoot;
    }
    static getCacheDirectory(downloadUrl) {
        const parsedDownloadUrl = url.parse(downloadUrl);
        // eslint-disable-next-line @typescript-eslint/no-unused-vars
        const { search, hash, pathname } = parsedDownloadUrl, rest = __rest(parsedDownloadUrl, ["search", "hash", "pathname"]);
        const strippedUrl = url.format(Object.assign(Object.assign({}, rest), { pathname: path.dirname(pathname || 'electron') }));
        return crypto
            .createHash('sha256')
            .update(strippedUrl)
            .digest('hex');
    }
    getCachePath(downloadUrl, fileName) {
        return path.resolve(this.cacheRoot, Cache.getCacheDirectory(downloadUrl), fileName);
    }
    async getPathForFileInCache(url, fileName) {
        const cachePath = this.getCachePath(url, fileName);
        if (await fs.pathExists(cachePath)) {
            return cachePath;
        }
        return null;
    }
    async putFileInCache(url, currentPath, fileName) {
        const cachePath = this.getCachePath(url, fileName);
        d(`Moving ${currentPath} to ${cachePath}`);
        if (await fs.pathExists(cachePath)) {
            d('* Replacing existing file');
            await fs.remove(cachePath);
        }
        await fs.move(currentPath, cachePath);
        return cachePath;
    }
}
exports.Cache = Cache;
//# sourceMappingURL=Cache.js.map
1
node_modules/@electron/get/dist/cjs/Cache.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Cache.js","sourceRoot":"","sources":["../../src/Cache.ts"],"names":[],"mappings":";;;;;;;;;;;;;AAAA,iCAA0B;AAC1B,yCAAiC;AACjC,+BAA+B;AAC/B,6BAA6B;AAC7B,2BAA2B;AAC3B,iCAAiC;AAEjC,MAAM,CAAC,GAAG,eAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC,MAAM,gBAAgB,GAAG,mBAAQ,CAAC,UAAU,EAAE;IAC5C,MAAM,EAAE,EAAE;CACX,CAAC,CAAC,KAAK,CAAC;AAET,MAAa,KAAK;IAChB,YAAoB,YAAY,gBAAgB;QAA5B,cAAS,GAAT,SAAS,CAAmB;IAAG,CAAC;IAE7C,MAAM,CAAC,iBAAiB,CAAC,WAAmB;QACjD,MAAM,iBAAiB,GAAG,GAAG,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;QACjD,6DAA6D;QAC7D,MAAM,EAAE,MAAM,EAAE,IAAI,EAAE,QAAQ,KAAc,iBAAiB,EAA7B,gEAA6B,CAAC;QAC9D,MAAM,WAAW,GAAG,GAAG,CAAC,MAAM,iCAAM,IAAI,KAAE,QAAQ,EAAE,IAAI,CAAC,OAAO,CAAC,QAAQ,IAAI,UAAU,CAAC,IAAG,CAAC;QAE5F,OAAO,MAAM;aACV,UAAU,CAAC,QAAQ,CAAC;aACpB,MAAM,CAAC,WAAW,CAAC;aACnB,MAAM,CAAC,KAAK,CAAC,CAAC;IACnB,CAAC;IAEM,YAAY,CAAC,WAAmB,EAAE,QAAgB;QACvD,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,SAAS,EAAE,KAAK,CAAC,iBAAiB,CAAC,WAAW,CAAC,EAAE,QAAQ,CAAC,CAAC;IACtF,CAAC;IAEM,KAAK,CAAC,qBAAqB,CAAC,GAAW,EAAE,QAAgB;QAC9D,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QACnD,IAAI,MAAM,EAAE,CAAC,UAAU,CAAC,SAAS,CAAC,EAAE;YAClC,OAAO,SAAS,CAAC;SAClB;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAEM,KAAK,CAAC,cAAc,CAAC,GAAW,EAAE,WAAmB,EAAE,QAAgB;QAC5E,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QACnD,CAAC,CAAC,UAAU,WAAW,OAAO,SAAS,EAAE,CAAC,CAAC;QAC3C,IAAI,MAAM,EAAE,CAAC,UAAU,CAAC,SAAS,CAAC,EAAE;YAClC,CAAC,CAAC,2BAA2B,CAAC,CAAC;YAC/B,MAAM,EAAE,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC;SAC5B;QAED,MAAM,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,SAAS,CAAC,CAAC;QAEtC,OAAO,SAAS,CAAC;IACnB,CAAC;CACF;AAxCD,sBAwCC"}
3
node_modules/@electron/get/dist/cjs/Downloader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,3 @@
export interface Downloader<T> {
    download(url: string, targetFilePath: string, options: T): Promise<void>;
}
3
node_modules/@electron/get/dist/cjs/Downloader.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=Downloader.js.map
1
node_modules/@electron/get/dist/cjs/Downloader.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Downloader.js","sourceRoot":"","sources":["../../src/Downloader.ts"],"names":[],"mappings":""}
21
node_modules/@electron/get/dist/cjs/GotDownloader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,21 @@
import { Progress as GotProgress, Options as GotOptions } from 'got';
import { Downloader } from './Downloader';
/**
 * See [`got#options`](https://github.com/sindresorhus/got#options) for possible keys/values.
 */
export declare type GotDownloaderOptions = (GotOptions & {
    isStream?: true;
}) & {
    /**
     * if defined, triggers every time `got`'s `downloadProgress` event callback is triggered.
     */
    getProgressCallback?: (progress: GotProgress) => Promise<void>;
    /**
     * if `true`, disables the console progress bar (setting the `ELECTRON_GET_NO_PROGRESS`
     * environment variable to a non-empty value also does this).
     */
    quiet?: boolean;
};
export declare class GotDownloader implements Downloader<GotDownloaderOptions> {
    download(url: string, targetFilePath: string, options?: GotDownloaderOptions): Promise<void>;
}
76
node_modules/@electron/get/dist/cjs/GotDownloader.js
generated
vendored
Normal file
@@ -0,0 +1,76 @@
"use strict";
var __rest = (this && this.__rest) || function (s, e) {
    var t = {};
    for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)
        t[p] = s[p];
    if (s != null && typeof Object.getOwnPropertySymbols === "function")
        for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {
            if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))
                t[p[i]] = s[p[i]];
        }
    return t;
};
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs-extra");
const got_1 = require("got");
const path = require("path");
const ProgressBar = require("progress");
const PROGRESS_BAR_DELAY_IN_SECONDS = 30;
class GotDownloader {
    async download(url, targetFilePath, options) {
        if (!options) {
            options = {};
        }
        const { quiet, getProgressCallback } = options, gotOptions = __rest(options, ["quiet", "getProgressCallback"]);
        let downloadCompleted = false;
        let bar;
        let progressPercent;
        let timeout = undefined;
        await fs.mkdirp(path.dirname(targetFilePath));
        const writeStream = fs.createWriteStream(targetFilePath);
        if (!quiet || !process.env.ELECTRON_GET_NO_PROGRESS) {
            const start = new Date();
            timeout = setTimeout(() => {
                if (!downloadCompleted) {
                    bar = new ProgressBar(`Downloading ${path.basename(url)}: [:bar] :percent ETA: :eta seconds `, {
                        curr: progressPercent,
                        total: 100,
                    });
                    // https://github.com/visionmedia/node-progress/issues/159
                    // eslint-disable-next-line @typescript-eslint/no-explicit-any
                    bar.start = start;
                }
            }, PROGRESS_BAR_DELAY_IN_SECONDS * 1000);
        }
        await new Promise((resolve, reject) => {
            const downloadStream = got_1.default.stream(url, gotOptions);
            downloadStream.on('downloadProgress', async (progress) => {
                progressPercent = progress.percent;
                if (bar) {
                    bar.update(progress.percent);
                }
                if (getProgressCallback) {
                    await getProgressCallback(progress);
                }
            });
            downloadStream.on('error', error => {
                if (error instanceof got_1.HTTPError && error.response.statusCode === 404) {
                    error.message += ` for ${error.response.url}`;
                }
                if (writeStream.destroy) {
                    writeStream.destroy(error);
                }
                reject(error);
            });
            writeStream.on('error', error => reject(error));
            writeStream.on('close', () => resolve());
            downloadStream.pipe(writeStream);
        });
        downloadCompleted = true;
        if (timeout) {
            clearTimeout(timeout);
        }
    }
}
exports.GotDownloader = GotDownloader;
//# sourceMappingURL=GotDownloader.js.map
1
node_modules/@electron/get/dist/cjs/GotDownloader.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"GotDownloader.js","sourceRoot":"","sources":["../../src/GotDownloader.ts"],"names":[],"mappings":";;;;;;;;;;;;;AAAA,+BAA+B;AAC/B,6BAAqF;AACrF,6BAA6B;AAC7B,wCAAwC;AAIxC,MAAM,6BAA6B,GAAG,EAAE,CAAC;AAiBzC,MAAa,aAAa;IACxB,KAAK,CAAC,QAAQ,CACZ,GAAW,EACX,cAAsB,EACtB,OAA8B;QAE9B,IAAI,CAAC,OAAO,EAAE;YACZ,OAAO,GAAG,EAAE,CAAC;SACd;QACD,MAAM,EAAE,KAAK,EAAE,mBAAmB,KAAoB,OAAO,EAAzB,8DAAyB,CAAC;QAC9D,IAAI,iBAAiB,GAAG,KAAK,CAAC;QAC9B,IAAI,GAA4B,CAAC;QACjC,IAAI,eAAuB,CAAC;QAC5B,IAAI,OAAO,GAA+B,SAAS,CAAC;QACpD,MAAM,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC;QAC9C,MAAM,WAAW,GAAG,EAAE,CAAC,iBAAiB,CAAC,cAAc,CAAC,CAAC;QAEzD,IAAI,CAAC,KAAK,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,wBAAwB,EAAE;YACnD,MAAM,KAAK,GAAG,IAAI,IAAI,EAAE,CAAC;YACzB,OAAO,GAAG,UAAU,CAAC,GAAG,EAAE;gBACxB,IAAI,CAAC,iBAAiB,EAAE;oBACtB,GAAG,GAAG,IAAI,WAAW,CACnB,eAAe,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,sCAAsC,EACvE;wBACE,IAAI,EAAE,eAAe;wBACrB,KAAK,EAAE,GAAG;qBACX,CACF,CAAC;oBACF,0DAA0D;oBAC1D,8DAA8D;oBAC7D,GAAW,CAAC,KAAK,GAAG,KAAK,CAAC;iBAC5B;YACH,CAAC,EAAE,6BAA6B,GAAG,IAAI,CAAC,CAAC;SAC1C;QACD,MAAM,IAAI,OAAO,CAAO,CAAC,OAAO,EAAE,MAAM,EAAE,EAAE;YAC1C,MAAM,cAAc,GAAG,aAAG,CAAC,MAAM,CAAC,GAAG,EAAE,UAAU,CAAC,CAAC;YACnD,cAAc,CAAC,EAAE,CAAC,kBAAkB,EAAE,KAAK,EAAC,QAAQ,EAAC,EAAE;gBACrD,eAAe,GAAG,QAAQ,CAAC,OAAO,CAAC;gBACnC,IAAI,GAAG,EAAE;oBACP,GAAG,CAAC,MAAM,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;iBAC9B;gBACD,IAAI,mBAAmB,EAAE;oBACvB,MAAM,mBAAmB,CAAC,QAAQ,CAAC,CAAC;iBACrC;YACH,CAAC,CAAC,CAAC;YACH,cAAc,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE;gBACjC,IAAI,KAAK,YAAY,eAAS,IAAI,KAAK,CAAC,QAAQ,CAAC,UAAU,KAAK,GAAG,EAAE;oBACnE,KAAK,CAAC,OAAO,IAAI,QAAQ,KAAK,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC;iBAC/C;gBACD,IAAI,WAAW,CAAC,OAAO,EAAE;oBACvB,WAAW,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;iBAC5B;gBAED,MAAM,CAAC,KAAK,CAAC,CAAC;YAChB,CAAC,CAAC,CAAC;YACH,WAAW,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC;YAChD,WAAW,CAAC,EAAE,CAAC,OAAO,EAAE,GAAG,EAAE,CAAC,OAAO,EAAE,CAAC,CAAC;YAEzC,cAAc,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;QACnC,CAAC,CAAC
,CAAC;QAEH,iBAAiB,GAAG,IAAI,CAAC;QACzB,IAAI,OAAO,EAAE;YACX,YAAY,CAAC,OAAO,CAAC,CAAC;SACvB;IACH,CAAC;CACF;AAlED,sCAkEC"}
4
node_modules/@electron/get/dist/cjs/artifact-utils.d.ts
generated
vendored
Normal file
@@ -0,0 +1,4 @@
import { ElectronArtifactDetails } from './types';
export declare function getArtifactFileName(details: ElectronArtifactDetails): string;
export declare function getArtifactRemoteURL(details: ElectronArtifactDetails): Promise<string>;
export declare function getArtifactVersion(details: ElectronArtifactDetails): string;
66
node_modules/@electron/get/dist/cjs/artifact-utils.js
generated
vendored
Normal file
@@ -0,0 +1,66 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const utils_1 = require("./utils");
const BASE_URL = 'https://github.com/electron/electron/releases/download/';
const NIGHTLY_BASE_URL = 'https://github.com/electron/nightlies/releases/download/';
function getArtifactFileName(details) {
    utils_1.ensureIsTruthyString(details, 'artifactName');
    if (details.isGeneric) {
        return details.artifactName;
    }
    utils_1.ensureIsTruthyString(details, 'arch');
    utils_1.ensureIsTruthyString(details, 'platform');
    utils_1.ensureIsTruthyString(details, 'version');
    return `${[
        details.artifactName,
        details.version,
        details.platform,
        details.arch,
        ...(details.artifactSuffix ? [details.artifactSuffix] : []),
    ].join('-')}.zip`;
}
exports.getArtifactFileName = getArtifactFileName;
function mirrorVar(name, options, defaultValue) {
    // Convert camelCase to camel_case for env var reading
    const snakeName = name.replace(/([a-z])([A-Z])/g, (_, a, b) => `${a}_${b}`).toLowerCase();
    return (
    // .npmrc
    process.env[`npm_config_electron_${name.toLowerCase()}`] ||
        process.env[`NPM_CONFIG_ELECTRON_${snakeName.toUpperCase()}`] ||
        process.env[`npm_config_electron_${snakeName}`] ||
        // package.json
        process.env[`npm_package_config_electron_${name}`] ||
        process.env[`npm_package_config_electron_${snakeName.toLowerCase()}`] ||
        // env
        process.env[`ELECTRON_${snakeName.toUpperCase()}`] ||
        options[name] ||
        defaultValue);
}
async function getArtifactRemoteURL(details) {
    const opts = details.mirrorOptions || {};
    let base = mirrorVar('mirror', opts, BASE_URL);
    if (details.version.includes('nightly')) {
        const nightlyDeprecated = mirrorVar('nightly_mirror', opts, '');
        if (nightlyDeprecated) {
            base = nightlyDeprecated;
            console.warn(`nightly_mirror is deprecated, please use nightlyMirror`);
        }
        else {
            base = mirrorVar('nightlyMirror', opts, NIGHTLY_BASE_URL);
        }
    }
    const path = mirrorVar('customDir', opts, details.version).replace('{{ version }}', details.version.replace(/^v/, ''));
    const file = mirrorVar('customFilename', opts, getArtifactFileName(details));
    // Allow customized download URL resolution.
    if (opts.resolveAssetURL) {
        const url = await opts.resolveAssetURL(details);
        return url;
    }
    return `${base}${path}/${file}`;
}
exports.getArtifactRemoteURL = getArtifactRemoteURL;
function getArtifactVersion(details) {
    return utils_1.normalizeVersion(mirrorVar('customVersion', details.mirrorOptions || {}, details.version));
}
exports.getArtifactVersion = getArtifactVersion;
//# sourceMappingURL=artifact-utils.js.map
1
node_modules/@electron/get/dist/cjs/artifact-utils.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"artifact-utils.js","sourceRoot":"","sources":["../../src/artifact-utils.ts"],"names":[],"mappings":";;AACA,mCAAiE;AAEjE,MAAM,QAAQ,GAAG,yDAAyD,CAAC;AAC3E,MAAM,gBAAgB,GAAG,0DAA0D,CAAC;AAEpF,SAAgB,mBAAmB,CAAC,OAAgC;IAClE,4BAAoB,CAAC,OAAO,EAAE,cAAc,CAAC,CAAC;IAE9C,IAAI,OAAO,CAAC,SAAS,EAAE;QACrB,OAAO,OAAO,CAAC,YAAY,CAAC;KAC7B;IAED,4BAAoB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;IACtC,4BAAoB,CAAC,OAAO,EAAE,UAAU,CAAC,CAAC;IAC1C,4BAAoB,CAAC,OAAO,EAAE,SAAS,CAAC,CAAC;IAEzC,OAAO,GAAG;QACR,OAAO,CAAC,YAAY;QACpB,OAAO,CAAC,OAAO;QACf,OAAO,CAAC,QAAQ;QAChB,OAAO,CAAC,IAAI;QACZ,GAAG,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;KAC5D,CAAC,IAAI,CAAC,GAAG,CAAC,MAAM,CAAC;AACpB,CAAC;AAlBD,kDAkBC;AAED,SAAS,SAAS,CAChB,IAAkD,EAClD,OAAsB,EACtB,YAAoB;IAEpB,sDAAsD;IACtD,MAAM,SAAS,GAAG,IAAI,CAAC,OAAO,CAAC,iBAAiB,EAAE,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,GAAG,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,WAAW,EAAE,CAAC;IAE1F,OAAO;IACL,SAAS;IACT,OAAO,CAAC,GAAG,CAAC,uBAAuB,IAAI,CAAC,WAAW,EAAE,EAAE,CAAC;QACxD,OAAO,CAAC,GAAG,CAAC,uBAAuB,SAAS,CAAC,WAAW,EAAE,EAAE,CAAC;QAC7D,OAAO,CAAC,GAAG,CAAC,uBAAuB,SAAS,EAAE,CAAC;QAC/C,eAAe;QACf,OAAO,CAAC,GAAG,CAAC,+BAA+B,IAAI,EAAE,CAAC;QAClD,OAAO,CAAC,GAAG,CAAC,+BAA+B,SAAS,CAAC,WAAW,EAAE,EAAE,CAAC;QACrE,MAAM;QACN,OAAO,CAAC,GAAG,CAAC,YAAY,SAAS,CAAC,WAAW,EAAE,EAAE,CAAC;QAClD,OAAO,CAAC,IAAI,CAAC;QACb,YAAY,CACb,CAAC;AACJ,CAAC;AAEM,KAAK,UAAU,oBAAoB,CAAC,OAAgC;IACzE,MAAM,IAAI,GAAkB,OAAO,CAAC,aAAa,IAAI,EAAE,CAAC;IACxD,IAAI,IAAI,GAAG,SAAS,CAAC,QAAQ,EAAE,IAAI,EAAE,QAAQ,CAAC,CAAC;IAC/C,IAAI,OAAO,CAAC,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE;QACvC,MAAM,iBAAiB,GAAG,SAAS,CAAC,gBAAgB,EAAE,IAAI,EAAE,EAAE,CAAC,CAAC;QAChE,IAAI,iBAAiB,EAAE;YACrB,IAAI,GAAG,iBAAiB,CAAC;YACzB,OAAO,CAAC,IAAI,CAAC,wDAAwD,CAAC,CAAC;SACxE;aAAM;YACL,IAAI,GAAG,SAAS,CAAC,eAAe,EAAE,IAAI,EAAE,gBAAgB,CAAC,CAAC;SAC3D;KACF;IACD,MAAM,IAAI,GAAG,SAAS,CAAC,WAAW,EAAE,IAAI,EAAE,OAAO,CAAC,OAAO,CAAC,CAAC,OAAO,CAChE,eAAe,EACf,OAAO,CAAC,OAAO,CAAC,OAAO,CAAC,IAAI,EAAE,EAAE,CAAC,CAClC,CAAC;IACF
,MAAM,IAAI,GAAG,SAAS,CAAC,gBAAgB,EAAE,IAAI,EAAE,mBAAmB,CAAC,OAAO,CAAC,CAAC,CAAC;IAE7E,4CAA4C;IAC5C,IAAI,IAAI,CAAC,eAAe,EAAE;QACxB,MAAM,GAAG,GAAG,MAAM,IAAI,CAAC,eAAe,CAAC,OAAO,CAAC,CAAC;QAChD,OAAO,GAAG,CAAC;KACZ;IAED,OAAO,GAAG,IAAI,GAAG,IAAI,IAAI,IAAI,EAAE,CAAC;AAClC,CAAC;AAzBD,oDAyBC;AAED,SAAgB,kBAAkB,CAAC,OAAgC;IACjE,OAAO,wBAAgB,CAAC,SAAS,CAAC,eAAe,EAAE,OAAO,CAAC,aAAa,IAAI,EAAE,EAAE,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC;AACpG,CAAC;AAFD,gDAEC"}
3
node_modules/@electron/get/dist/cjs/downloader-resolver.d.ts
generated
vendored
Normal file
@@ -0,0 +1,3 @@
import { DownloadOptions } from './types';
import { Downloader } from './Downloader';
export declare function getDownloaderForSystem(): Promise<Downloader<DownloadOptions>>;
12
node_modules/@electron/get/dist/cjs/downloader-resolver.js
generated
vendored
Normal file
@@ -0,0 +1,12 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
async function getDownloaderForSystem() {
    // TODO: Resolve the downloader or default to GotDownloader
    // Current thoughts are a dot-file traversal for something like
    // ".electron.downloader" which would be a text file with the name of the
    // npm module to import() and use as the downloader
    const { GotDownloader } = await Promise.resolve().then(() => require('./GotDownloader'));
    return new GotDownloader();
}
exports.getDownloaderForSystem = getDownloaderForSystem;
//# sourceMappingURL=downloader-resolver.js.map
1
node_modules/@electron/get/dist/cjs/downloader-resolver.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"downloader-resolver.js","sourceRoot":"","sources":["../../src/downloader-resolver.ts"],"names":[],"mappings":";;AAGO,KAAK,UAAU,sBAAsB;IAC1C,2DAA2D;IAC3D,+DAA+D;IAC/D,yEAAyE;IACzE,mDAAmD;IACnD,MAAM,EAAE,aAAa,EAAE,GAAG,2CAAa,iBAAiB,EAAC,CAAC;IAC1D,OAAO,IAAI,aAAa,EAAE,CAAC;AAC7B,CAAC;AAPD,wDAOC"}
18
node_modules/@electron/get/dist/cjs/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,18 @@
import { ElectronDownloadRequestOptions, ElectronPlatformArtifactDetailsWithDefaults } from './types';
export { getHostArch } from './utils';
export { initializeProxy } from './proxy';
export * from './types';
/**
 * Downloads an artifact from an Electron release and returns an absolute path
 * to the downloaded file.
 *
 * @param artifactDetails - The information required to download the artifact
 */
export declare function downloadArtifact(_artifactDetails: ElectronPlatformArtifactDetailsWithDefaults): Promise<string>;
/**
 * Downloads a specific version of Electron and returns an absolute path to a
 * ZIP file.
 *
 * @param version - The version of Electron you want to download
 */
export declare function download(version: string, options?: ElectronDownloadRequestOptions): Promise<string>;
140
node_modules/@electron/get/dist/cjs/index.js
generated
vendored
Normal file
@@ -0,0 +1,140 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const debug_1 = require("debug");
const fs = require("fs-extra");
const path = require("path");
const semver = require("semver");
const sumchecker = require("sumchecker");
const artifact_utils_1 = require("./artifact-utils");
const Cache_1 = require("./Cache");
const downloader_resolver_1 = require("./downloader-resolver");
const proxy_1 = require("./proxy");
const utils_1 = require("./utils");
var utils_2 = require("./utils");
exports.getHostArch = utils_2.getHostArch;
var proxy_2 = require("./proxy");
exports.initializeProxy = proxy_2.initializeProxy;
const d = debug_1.default('@electron/get:index');
if (process.env.ELECTRON_GET_USE_PROXY) {
    proxy_1.initializeProxy();
}
async function validateArtifact(artifactDetails, downloadedAssetPath, _downloadArtifact) {
    return await utils_1.withTempDirectoryIn(artifactDetails.tempDirectory, async (tempFolder) => {
        // Don't try to verify the hash of the hash file itself
        // and for older versions that don't have a SHASUMS256.txt
        if (!artifactDetails.artifactName.startsWith('SHASUMS256') &&
            !artifactDetails.unsafelyDisableChecksums &&
            semver.gte(artifactDetails.version, '1.3.2')) {
            let shasumPath;
            const checksums = artifactDetails.checksums;
            if (checksums) {
                shasumPath = path.resolve(tempFolder, 'SHASUMS256.txt');
                const fileNames = Object.keys(checksums);
                if (fileNames.length === 0) {
                    throw new Error('Provided "checksums" object is empty, cannot generate a valid SHASUMS256.txt');
                }
                const generatedChecksums = fileNames
                    .map(fileName => `${checksums[fileName]} *${fileName}`)
                    .join('\n');
                await fs.writeFile(shasumPath, generatedChecksums);
            }
            else {
                shasumPath = await _downloadArtifact({
                    isGeneric: true,
                    version: artifactDetails.version,
                    artifactName: 'SHASUMS256.txt',
                    force: artifactDetails.force,
                    downloadOptions: artifactDetails.downloadOptions,
                    cacheRoot: artifactDetails.cacheRoot,
                    downloader: artifactDetails.downloader,
                    mirrorOptions: artifactDetails.mirrorOptions,
                });
            }
            // For versions 1.3.2 - 1.3.4, need to overwrite the `defaultTextEncoding` option:
            // https://github.com/electron/electron/pull/6676#discussion_r75332120
            if (semver.satisfies(artifactDetails.version, '1.3.2 - 1.3.4')) {
                const validatorOptions = {};
                validatorOptions.defaultTextEncoding = 'binary';
                const checker = new sumchecker.ChecksumValidator('sha256', shasumPath, validatorOptions);
                await checker.validate(path.dirname(downloadedAssetPath), path.basename(downloadedAssetPath));
            }
            else {
                await sumchecker('sha256', shasumPath, path.dirname(downloadedAssetPath), [
                    path.basename(downloadedAssetPath),
                ]);
            }
        }
    });
}
/**
 * Downloads an artifact from an Electron release and returns an absolute path
 * to the downloaded file.
 *
 * @param artifactDetails - The information required to download the artifact
 */
async function downloadArtifact(_artifactDetails) {
    const artifactDetails = Object.assign({}, _artifactDetails);
    if (!_artifactDetails.isGeneric) {
        const platformArtifactDetails = artifactDetails;
        if (!platformArtifactDetails.platform) {
            d('No platform found, defaulting to the host platform');
            platformArtifactDetails.platform = process.platform;
        }
        if (platformArtifactDetails.arch) {
            platformArtifactDetails.arch = utils_1.getNodeArch(platformArtifactDetails.arch);
        }
        else {
            d('No arch found, defaulting to the host arch');
            platformArtifactDetails.arch = utils_1.getHostArch();
        }
    }
    utils_1.ensureIsTruthyString(artifactDetails, 'version');
    artifactDetails.version = artifact_utils_1.getArtifactVersion(artifactDetails);
    const fileName = artifact_utils_1.getArtifactFileName(artifactDetails);
    const url = await artifact_utils_1.getArtifactRemoteURL(artifactDetails);
    const cache = new Cache_1.Cache(artifactDetails.cacheRoot);
    // Do not check if the file exists in the cache when force === true
    if (!artifactDetails.force) {
        d(`Checking the cache (${artifactDetails.cacheRoot}) for ${fileName} (${url})`);
        const cachedPath = await cache.getPathForFileInCache(url, fileName);
        if (cachedPath === null) {
            d('Cache miss');
        }
        else {
            d('Cache hit');
            try {
                await validateArtifact(artifactDetails, cachedPath, downloadArtifact);
                return cachedPath;
            }
            catch (err) {
                d("Artifact in cache didn't match checksums", err);
                d('falling back to re-download');
            }
        }
    }
    if (!artifactDetails.isGeneric &&
        utils_1.isOfficialLinuxIA32Download(artifactDetails.platform, artifactDetails.arch, artifactDetails.version, artifactDetails.mirrorOptions)) {
        console.warn('Official Linux/ia32 support is deprecated.');
        console.warn('For more info: https://electronjs.org/blog/linux-32bit-support');
    }
    return await utils_1.withTempDirectoryIn(artifactDetails.tempDirectory, async (tempFolder) => {
        const tempDownloadPath = path.resolve(tempFolder, artifact_utils_1.getArtifactFileName(artifactDetails));
        const downloader = artifactDetails.downloader || (await downloader_resolver_1.getDownloaderForSystem());
        d(`Downloading ${url} to ${tempDownloadPath} with options: ${JSON.stringify(artifactDetails.downloadOptions)}`);
        await downloader.download(url, tempDownloadPath, artifactDetails.downloadOptions);
        await validateArtifact(artifactDetails, tempDownloadPath, downloadArtifact);
        return await cache.putFileInCache(url, tempDownloadPath, fileName);
    });
}
exports.downloadArtifact = downloadArtifact;
/**
 * Downloads a specific version of Electron and returns an absolute path to a
 * ZIP file.
 *
 * @param version - The version of Electron you want to download
 */
function download(version, options) {
    return downloadArtifact(Object.assign(Object.assign({}, options), { version, platform: process.platform, arch: process.arch, artifactName: 'electron' }));
}
exports.download = download;
//# sourceMappingURL=index.js.map
1
node_modules/@electron/get/dist/cjs/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;AAAA,iCAA0B;AAC1B,+BAA+B;AAC/B,6BAA6B;AAC7B,iCAAiC;AACjC,yCAAyC;AAEzC,qDAAiG;AAOjG,mCAAgC;AAChC,+DAA+D;AAC/D,mCAA0C;AAC1C,mCAOiB;AAEjB,iCAAsC;AAA7B,8BAAA,WAAW,CAAA;AACpB,iCAA0C;AAAjC,kCAAA,eAAe,CAAA;AAGxB,MAAM,CAAC,GAAG,eAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC,IAAI,OAAO,CAAC,GAAG,CAAC,sBAAsB,EAAE;IACtC,uBAAe,EAAE,CAAC;CACnB;AAMD,KAAK,UAAU,gBAAgB,CAC7B,eAAwC,EACxC,mBAA2B,EAC3B,iBAAqC;IAErC,OAAO,MAAM,2BAAmB,CAAC,eAAe,CAAC,aAAa,EAAE,KAAK,EAAC,UAAU,EAAC,EAAE;QACjF,uDAAuD;QACvD,0DAA0D;QAC1D,IACE,CAAC,eAAe,CAAC,YAAY,CAAC,UAAU,CAAC,YAAY,CAAC;YACtD,CAAC,eAAe,CAAC,wBAAwB;YACzC,MAAM,CAAC,GAAG,CAAC,eAAe,CAAC,OAAO,EAAE,OAAO,CAAC,EAC5C;YACA,IAAI,UAAkB,CAAC;YACvB,MAAM,SAAS,GAAG,eAAe,CAAC,SAAS,CAAC;YAC5C,IAAI,SAAS,EAAE;gBACb,UAAU,GAAG,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE,gBAAgB,CAAC,CAAC;gBACxD,MAAM,SAAS,GAAa,MAAM,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBACnD,IAAI,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;oBAC1B,MAAM,IAAI,KAAK,CACb,8EAA8E,CAC/E,CAAC;iBACH;gBACD,MAAM,kBAAkB,GAAG,SAAS;qBACjC,GAAG,CAAC,QAAQ,CAAC,EAAE,CAAC,GAAG,SAAS,CAAC,QAAQ,CAAC,KAAK,QAAQ,EAAE,CAAC;qBACtD,IAAI,CAAC,IAAI,CAAC,CAAC;gBACd,MAAM,EAAE,CAAC,SAAS,CAAC,UAAU,EAAE,kBAAkB,CAAC,CAAC;aACpD;iBAAM;gBACL,UAAU,GAAG,MAAM,iBAAiB,CAAC;oBACnC,SAAS,EAAE,IAAI;oBACf,OAAO,EAAE,eAAe,CAAC,OAAO;oBAChC,YAAY,EAAE,gBAAgB;oBAC9B,KAAK,EAAE,eAAe,CAAC,KAAK;oBAC5B,eAAe,EAAE,eAAe,CAAC,eAAe;oBAChD,SAAS,EAAE,eAAe,CAAC,SAAS;oBACpC,UAAU,EAAE,eAAe,CAAC,UAAU;oBACtC,aAAa,EAAE,eAAe,CAAC,aAAa;iBAC7C,CAAC,CAAC;aACJ;YAED,kFAAkF;YAClF,sEAAsE;YACtE,IAAI,MAAM,CAAC,SAAS,CAAC,eAAe,CAAC,OAAO,EAAE,eAAe,CAAC,EAAE;gBAC9D,MAAM,gBAAgB,GAA+B,EAAE,CAAC;gBACxD,gBAAgB,CAAC,mBAAmB,GAAG,QAAQ,CAAC;gBAChD,MAAM,OAAO,GAAG,IAAI,UAAU,CAAC,iBAAiB,CAAC,QAAQ,EAAE,UAAU,EAAE,gBAAgB,CAAC,CAAC;gBACzF,MAAM,OAAO,CAAC,QAAQ,CACpB,IAAI,CAAC,OAAO,CAAC,mBAAmB,CAAC,EACjC,IAAI,CAAC,QAAQ,CAAC,mBAAmB,CAAC,CACnC,CAAC;aACH;iBAAM;gBACL,MAAM,UAAU,CAAC,QAAQ,EAAE,UAAU,EAAE,IAAI,CAAC,OAAO,CAAC,mBAAmB,CAAC,EAAE;oBACxE,IAAI,CAAC
,QAAQ,CAAC,mBAAmB,CAAC;iBACnC,CAAC,CAAC;aACJ;SACF;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AAED;;;;;GAKG;AACI,KAAK,UAAU,gBAAgB,CACpC,gBAA6D;IAE7D,MAAM,eAAe,qBACf,gBAA4C,CACjD,CAAC;IACF,IAAI,CAAC,gBAAgB,CAAC,SAAS,EAAE;QAC/B,MAAM,uBAAuB,GAAG,eAAkD,CAAC;QACnF,IAAI,CAAC,uBAAuB,CAAC,QAAQ,EAAE;YACrC,CAAC,CAAC,oDAAoD,CAAC,CAAC;YACxD,uBAAuB,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC;SACrD;QACD,IAAI,uBAAuB,CAAC,IAAI,EAAE;YAChC,uBAAuB,CAAC,IAAI,GAAG,mBAAW,CAAC,uBAAuB,CAAC,IAAI,CAAC,CAAC;SAC1E;aAAM;YACL,CAAC,CAAC,4CAA4C,CAAC,CAAC;YAChD,uBAAuB,CAAC,IAAI,GAAG,mBAAW,EAAE,CAAC;SAC9C;KACF;IACD,4BAAoB,CAAC,eAAe,EAAE,SAAS,CAAC,CAAC;IAEjD,eAAe,CAAC,OAAO,GAAG,mCAAkB,CAAC,eAAe,CAAC,CAAC;IAC9D,MAAM,QAAQ,GAAG,oCAAmB,CAAC,eAAe,CAAC,CAAC;IACtD,MAAM,GAAG,GAAG,MAAM,qCAAoB,CAAC,eAAe,CAAC,CAAC;IACxD,MAAM,KAAK,GAAG,IAAI,aAAK,CAAC,eAAe,CAAC,SAAS,CAAC,CAAC;IAEnD,mEAAmE;IACnE,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE;QAC1B,CAAC,CAAC,uBAAuB,eAAe,CAAC,SAAS,SAAS,QAAQ,KAAK,GAAG,GAAG,CAAC,CAAC;QAChF,MAAM,UAAU,GAAG,MAAM,KAAK,CAAC,qBAAqB,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QAEpE,IAAI,UAAU,KAAK,IAAI,EAAE;YACvB,CAAC,CAAC,YAAY,CAAC,CAAC;SACjB;aAAM;YACL,CAAC,CAAC,WAAW,CAAC,CAAC;YACf,IAAI;gBACF,MAAM,gBAAgB,CAAC,eAAe,EAAE,UAAU,EAAE,gBAAgB,CAAC,CAAC;gBAEtE,OAAO,UAAU,CAAC;aACnB;YAAC,OAAO,GAAG,EAAE;gBACZ,CAAC,CAAC,0CAA0C,EAAE,GAAG,CAAC,CAAC;gBACnD,CAAC,CAAC,6BAA6B,CAAC,CAAC;aAClC;SACF;KACF;IAED,IACE,CAAC,eAAe,CAAC,SAAS;QAC1B,mCAA2B,CACzB,eAAe,CAAC,QAAQ,EACxB,eAAe,CAAC,IAAI,EACpB,eAAe,CAAC,OAAO,EACvB,eAAe,CAAC,aAAa,CAC9B,EACD;QACA,OAAO,CAAC,IAAI,CAAC,4CAA4C,CAAC,CAAC;QAC3D,OAAO,CAAC,IAAI,CAAC,gEAAgE,CAAC,CAAC;KAChF;IAED,OAAO,MAAM,2BAAmB,CAAC,eAAe,CAAC,aAAa,EAAE,KAAK,EAAC,UAAU,EAAC,EAAE;QACjF,MAAM,gBAAgB,GAAG,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE,oCAAmB,CAAC,eAAe,CAAC,CAAC,CAAC;QAExF,MAAM,UAAU,GAAG,eAAe,CAAC,UAAU,IAAI,CAAC,MAAM,4CAAsB,EAAE,CAAC,CAAC;QAClF,CAAC,CACC,eAAe,GAAG,OAAO,gBAAgB,kBAAkB,IAAI,CAAC,SAAS,CACvE,eAAe,CAAC,eAAe,CAChC,EAAE,CACJ,CAAC;QACF,MAAM,UAAU,CAAC,QAAQ,CAAC,GAAG,EAAE,gBAAgB,EAAE,eAAe,CAAC,eAAe,CAAC,CAAC;QAElF,MAAM,gBAAgB,CAAC,eAAe,EAAE,gB
AAgB,EAAE,gBAAgB,CAAC,CAAC;QAE5E,OAAO,MAAM,KAAK,CAAC,cAAc,CAAC,GAAG,EAAE,gBAAgB,EAAE,QAAQ,CAAC,CAAC;IACrE,CAAC,CAAC,CAAC;AACL,CAAC;AA1ED,4CA0EC;AAED;;;;;GAKG;AACH,SAAgB,QAAQ,CACtB,OAAe,EACf,OAAwC;IAExC,OAAO,gBAAgB,iCAClB,OAAO,KACV,OAAO,EACP,QAAQ,EAAE,OAAO,CAAC,QAAQ,EAC1B,IAAI,EAAE,OAAO,CAAC,IAAI,EAClB,YAAY,EAAE,UAAU,IACxB,CAAC;AACL,CAAC;AAXD,4BAWC"}
4
node_modules/@electron/get/dist/cjs/proxy.d.ts
generated
vendored
Normal file
@@ -0,0 +1,4 @@
/**
 * Initializes a third-party proxy module for HTTP(S) requests.
 */
export declare function initializeProxy(): void;
27
node_modules/@electron/get/dist/cjs/proxy.js
generated
vendored
Normal file
@@ -0,0 +1,27 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const debug = require("debug");
const utils_1 = require("./utils");
const d = debug('@electron/get:proxy');
/**
 * Initializes a third-party proxy module for HTTP(S) requests.
 */
function initializeProxy() {
    try {
        // See: https://github.com/electron/get/pull/214#discussion_r798845713
        const env = utils_1.getEnv('GLOBAL_AGENT_');
        utils_1.setEnv('GLOBAL_AGENT_HTTP_PROXY', env('HTTP_PROXY'));
        utils_1.setEnv('GLOBAL_AGENT_HTTPS_PROXY', env('HTTPS_PROXY'));
        utils_1.setEnv('GLOBAL_AGENT_NO_PROXY', env('NO_PROXY'));
        /**
         * TODO: replace global-agent with a hpagent. @BlackHole1
         * https://github.com/sindresorhus/got/blob/HEAD/documentation/tips.md#proxying
         */
        require('global-agent').bootstrap();
    }
    catch (e) {
        d('Could not load either proxy modules, built-in proxy support not available:', e);
    }
}
exports.initializeProxy = initializeProxy;
//# sourceMappingURL=proxy.js.map
1
node_modules/@electron/get/dist/cjs/proxy.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"proxy.js","sourceRoot":"","sources":["../../src/proxy.ts"],"names":[],"mappings":";;AAAA,+BAA+B;AAC/B,mCAAyC;AAEzC,MAAM,CAAC,GAAG,KAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC;;GAEG;AACH,SAAgB,eAAe;IAC7B,IAAI;QACF,sEAAsE;QACtE,MAAM,GAAG,GAAG,cAAM,CAAC,eAAe,CAAC,CAAC;QAEpC,cAAM,CAAC,yBAAyB,EAAE,GAAG,CAAC,YAAY,CAAC,CAAC,CAAC;QACrD,cAAM,CAAC,0BAA0B,EAAE,GAAG,CAAC,aAAa,CAAC,CAAC,CAAC;QACvD,cAAM,CAAC,uBAAuB,EAAE,GAAG,CAAC,UAAU,CAAC,CAAC,CAAC;QAEjD;;;WAGG;QACH,OAAO,CAAC,cAAc,CAAC,CAAC,SAAS,EAAE,CAAC;KACrC;IAAC,OAAO,CAAC,EAAE;QACV,CAAC,CAAC,4EAA4E,EAAE,CAAC,CAAC,CAAC;KACpF;AACH,CAAC;AAjBD,0CAiBC"}
129
node_modules/@electron/get/dist/cjs/types.d.ts
generated
vendored
Normal file
@@ -0,0 +1,129 @@
import { Downloader } from './Downloader';
export declare type DownloadOptions = any;
export interface MirrorOptions {
    /**
     * DEPRECATED - see nightlyMirror.
     */
    nightly_mirror?: string;
    /**
     * The Electron nightly-specific mirror URL.
     */
    nightlyMirror?: string;
    /**
     * The base URL of the mirror to download from,
     * e.g https://github.com/electron/electron/releases/download
     */
    mirror?: string;
    /**
     * The name of the directory to download from,
     * often scoped by version number e.g 'v4.0.4'
     */
    customDir?: string;
    /**
     * The name of the asset to download,
     * e.g 'electron-v4.0.4-linux-x64.zip'
     */
    customFilename?: string;
    /**
     * The version of the asset to download,
     * e.g '4.0.4'
     */
    customVersion?: string;
    /**
     * A function allowing customization of the url returned
     * from getArtifactRemoteURL().
     */
    resolveAssetURL?: (opts: DownloadOptions) => Promise<string>;
}
export interface ElectronDownloadRequest {
    /**
     * The version of Electron associated with the artifact.
     */
    version: string;
    /**
     * The type of artifact. For example:
     * * `electron`
     * * `ffmpeg`
     */
    artifactName: string;
}
export interface ElectronDownloadRequestOptions {
    /**
     * Whether to download an artifact regardless of whether it's in the cache directory.
     *
     * Defaults to `false`.
     */
    force?: boolean;
    /**
     * When set to `true`, disables checking that the artifact download completed successfully
     * with the correct payload.
     *
     * Defaults to `false`.
     */
    unsafelyDisableChecksums?: boolean;
    /**
     * Provides checksums for the artifact as strings.
     * Can be used if you already know the checksums of the Electron artifact
     * you are downloading and want to skip the checksum file download
     * without skipping the checksum validation.
     *
     * This should be an object whose keys are the file names of the artifacts and
     * the values are their respective SHA256 checksums.
     */
    checksums?: Record<string, string>;
    /**
     * The directory that caches Electron artifact downloads.
     *
     * The default value is dependent upon the host platform:
     *
     * * Linux: `$XDG_CACHE_HOME` or `~/.cache/electron/`
     * * MacOS: `~/Library/Caches/electron/`
     * * Windows: `%LOCALAPPDATA%/electron/Cache` or `~/AppData/Local/electron/Cache/`
     */
    cacheRoot?: string;
    /**
     * Options passed to the downloader module.
     */
    downloadOptions?: DownloadOptions;
    /**
     * Options related to specifying an artifact mirror.
     */
    mirrorOptions?: MirrorOptions;
    /**
     * The custom [[Downloader]] class used to download artifacts. Defaults to the
     * built-in [[GotDownloader]].
|
||||
*/
|
||||
downloader?: Downloader<DownloadOptions>;
|
||||
/**
|
||||
* A temporary directory for downloads.
|
||||
* It is used before artifacts are put into cache.
|
||||
*/
|
||||
tempDirectory?: string;
|
||||
}
|
||||
export declare type ElectronPlatformArtifactDetails = {
|
||||
/**
|
||||
* The target artifact platform. These are Node-style platform names, for example:
|
||||
* * `win32`
|
||||
* * `darwin`
|
||||
* * `linux`
|
||||
*/
|
||||
platform: string;
|
||||
/**
|
||||
* The target artifact architecture. These are Node-style architecture names, for example:
|
||||
* * `ia32`
|
||||
* * `x64`
|
||||
* * `armv7l`
|
||||
*/
|
||||
arch: string;
|
||||
artifactSuffix?: string;
|
||||
isGeneric?: false;
|
||||
} & ElectronDownloadRequest & ElectronDownloadRequestOptions;
|
||||
export declare type ElectronGenericArtifactDetails = {
|
||||
isGeneric: true;
|
||||
} & ElectronDownloadRequest & ElectronDownloadRequestOptions;
|
||||
export declare type ElectronArtifactDetails = ElectronPlatformArtifactDetails | ElectronGenericArtifactDetails;
|
||||
export declare type Omit<T, K> = Pick<T, Exclude<keyof T, K>>;
|
||||
export declare type ElectronPlatformArtifactDetailsWithDefaults = (Omit<ElectronPlatformArtifactDetails, 'platform' | 'arch'> & {
|
||||
platform?: string;
|
||||
arch?: string;
|
||||
}) | ElectronGenericArtifactDetails;
|
||||
3
node_modules/@electron/get/dist/cjs/types.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
//# sourceMappingURL=types.js.map
1
node_modules/@electron/get/dist/cjs/types.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"types.js","sourceRoot":"","sources":["../../src/types.ts"],"names":[],"mappings":""}
25
node_modules/@electron/get/dist/cjs/utils.d.ts
generated
vendored
Normal file
@@ -0,0 +1,25 @@
export declare function withTempDirectoryIn<T>(parentDirectory: string | undefined, fn: (directory: string) => Promise<T>): Promise<T>;
export declare function withTempDirectory<T>(fn: (directory: string) => Promise<T>): Promise<T>;
export declare function normalizeVersion(version: string): string;
/**
 * Runs the `uname` command and returns the trimmed output.
 */
export declare function uname(): string;
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name.
 */
export declare function getNodeArch(arch: string): string;
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name, from the `process` module information.
 */
export declare function getHostArch(): string;
export declare function ensureIsTruthyString<T, K extends keyof T>(obj: T, key: K): void;
export declare function isOfficialLinuxIA32Download(platform: string, arch: string, version: string, mirrorOptions?: object): boolean;
/**
 * Find the value of an environment variable which may or may not have the
 * prefix, in a case-insensitive manner.
 */
export declare function getEnv(prefix?: string): (name: string) => string | undefined;
export declare function setEnv(key: string, value: string | undefined): void;
107
node_modules/@electron/get/dist/cjs/utils.js
generated
vendored
Normal file
@@ -0,0 +1,107 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const childProcess = require("child_process");
const fs = require("fs-extra");
const os = require("os");
const path = require("path");
async function useAndRemoveDirectory(directory, fn) {
    let result;
    try {
        result = await fn(directory);
    }
    finally {
        await fs.remove(directory);
    }
    return result;
}
async function withTempDirectoryIn(parentDirectory = os.tmpdir(), fn) {
    const tempDirectoryPrefix = 'electron-download-';
    const tempDirectory = await fs.mkdtemp(path.resolve(parentDirectory, tempDirectoryPrefix));
    return useAndRemoveDirectory(tempDirectory, fn);
}
exports.withTempDirectoryIn = withTempDirectoryIn;
async function withTempDirectory(fn) {
    return withTempDirectoryIn(undefined, fn);
}
exports.withTempDirectory = withTempDirectory;
function normalizeVersion(version) {
    if (!version.startsWith('v')) {
        return `v${version}`;
    }
    return version;
}
exports.normalizeVersion = normalizeVersion;
/**
 * Runs the `uname` command and returns the trimmed output.
 */
function uname() {
    return childProcess
        .execSync('uname -m')
        .toString()
        .trim();
}
exports.uname = uname;
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name.
 */
function getNodeArch(arch) {
    if (arch === 'arm') {
        // eslint-disable-next-line @typescript-eslint/no-explicit-any
        switch (process.config.variables.arm_version) {
            case '6':
                return uname();
            case '7':
            default:
                return 'armv7l';
        }
    }
    return arch;
}
exports.getNodeArch = getNodeArch;
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name, from the `process` module information.
 */
function getHostArch() {
    return getNodeArch(process.arch);
}
exports.getHostArch = getHostArch;
function ensureIsTruthyString(obj, key) {
    if (!obj[key] || typeof obj[key] !== 'string') {
        throw new Error(`Expected property "${key}" to be provided as a string but it was not`);
    }
}
exports.ensureIsTruthyString = ensureIsTruthyString;
function isOfficialLinuxIA32Download(platform, arch, version, mirrorOptions) {
    return (platform === 'linux' &&
        arch === 'ia32' &&
        Number(version.slice(1).split('.')[0]) >= 4 &&
        typeof mirrorOptions === 'undefined');
}
exports.isOfficialLinuxIA32Download = isOfficialLinuxIA32Download;
/**
 * Find the value of an environment variable which may or may not have the
 * prefix, in a case-insensitive manner.
 */
function getEnv(prefix = '') {
    const envsLowerCase = {};
    for (const envKey in process.env) {
        envsLowerCase[envKey.toLowerCase()] = process.env[envKey];
    }
    return (name) => {
        return (envsLowerCase[`${prefix}${name}`.toLowerCase()] ||
            envsLowerCase[name.toLowerCase()] ||
            undefined);
    };
}
exports.getEnv = getEnv;
function setEnv(key, value) {
    // The `void` operator always returns `undefined`.
    // See: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/void
    if (value !== void 0) {
        process.env[key] = value;
    }
}
exports.setEnv = setEnv;
//# sourceMappingURL=utils.js.map
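For context, the `getEnv` helper in `utils.js` above performs a case-insensitive lookup that prefers a prefixed variable name over the bare name. A minimal standalone sketch of that behavior (reimplemented here for illustration; the environment variable values are hypothetical, not part of the vendored file):

```javascript
// Sketch of the case-insensitive, prefix-aware lookup that getEnv() above
// implements: the prefixed variable wins, the bare name is the fallback,
// and key casing is ignored for both.
function getEnv(prefix = '') {
  const envsLowerCase = {};
  for (const envKey in process.env) {
    envsLowerCase[envKey.toLowerCase()] = process.env[envKey];
  }
  return (name) =>
    envsLowerCase[`${prefix}${name}`.toLowerCase()] ||
    envsLowerCase[name.toLowerCase()] ||
    undefined;
}

// Hypothetical variables for demonstration only.
process.env.ELECTRON_GET_MIRROR = 'https://example.com/prefixed/';
process.env.MIRROR = 'https://example.com/bare/';

const lookup = getEnv('ELECTRON_GET_');
console.log(lookup('mirror'));           // prefixed value, despite the lowercase query
console.log(lookup('nonexistent_name')); // undefined
```

This is why, in `artifact-utils.js` below, settings such as the mirror can be supplied through several differently cased environment variable spellings.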
1
node_modules/@electron/get/dist/cjs/utils.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"utils.js","sourceRoot":"","sources":["../../src/utils.ts"],"names":[],"mappings":";;AAAA,8CAA8C;AAC9C,+BAA+B;AAC/B,yBAAyB;AACzB,6BAA6B;AAE7B,KAAK,UAAU,qBAAqB,CAClC,SAAiB,EACjB,EAAqC;IAErC,IAAI,MAAS,CAAC;IACd,IAAI;QACF,MAAM,GAAG,MAAM,EAAE,CAAC,SAAS,CAAC,CAAC;KAC9B;YAAS;QACR,MAAM,EAAE,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC;KAC5B;IACD,OAAO,MAAM,CAAC;AAChB,CAAC;AAEM,KAAK,UAAU,mBAAmB,CACvC,kBAA0B,EAAE,CAAC,MAAM,EAAE,EACrC,EAAqC;IAErC,MAAM,mBAAmB,GAAG,oBAAoB,CAAC;IACjD,MAAM,aAAa,GAAG,MAAM,EAAE,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC,eAAe,EAAE,mBAAmB,CAAC,CAAC,CAAC;IAC3F,OAAO,qBAAqB,CAAC,aAAa,EAAE,EAAE,CAAC,CAAC;AAClD,CAAC;AAPD,kDAOC;AAEM,KAAK,UAAU,iBAAiB,CAAI,EAAqC;IAC9E,OAAO,mBAAmB,CAAC,SAAS,EAAE,EAAE,CAAC,CAAC;AAC5C,CAAC;AAFD,8CAEC;AAED,SAAgB,gBAAgB,CAAC,OAAe;IAC9C,IAAI,CAAC,OAAO,CAAC,UAAU,CAAC,GAAG,CAAC,EAAE;QAC5B,OAAO,IAAI,OAAO,EAAE,CAAC;KACtB;IACD,OAAO,OAAO,CAAC;AACjB,CAAC;AALD,4CAKC;AAED;;GAEG;AACH,SAAgB,KAAK;IACnB,OAAO,YAAY;SAChB,QAAQ,CAAC,UAAU,CAAC;SACpB,QAAQ,EAAE;SACV,IAAI,EAAE,CAAC;AACZ,CAAC;AALD,sBAKC;AAED;;;GAGG;AACH,SAAgB,WAAW,CAAC,IAAY;IACtC,IAAI,IAAI,KAAK,KAAK,EAAE;QAClB,8DAA8D;QAC9D,QAAS,OAAO,CAAC,MAAM,CAAC,SAAiB,CAAC,WAAW,EAAE;YACrD,KAAK,GAAG;gBACN,OAAO,KAAK,EAAE,CAAC;YACjB,KAAK,GAAG,CAAC;YACT;gBACE,OAAO,QAAQ,CAAC;SACnB;KACF;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AAbD,kCAaC;AAED;;;GAGG;AACH,SAAgB,WAAW;IACzB,OAAO,WAAW,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;AACnC,CAAC;AAFD,kCAEC;AAED,SAAgB,oBAAoB,CAAuB,GAAM,EAAE,GAAM;IACvE,IAAI,CAAC,GAAG,CAAC,GAAG,CAAC,IAAI,OAAO,GAAG,CAAC,GAAG,CAAC,KAAK,QAAQ,EAAE;QAC7C,MAAM,IAAI,KAAK,CAAC,sBAAsB,GAAG,6CAA6C,CAAC,CAAC;KACzF;AACH,CAAC;AAJD,oDAIC;AAED,SAAgB,2BAA2B,CACzC,QAAgB,EAChB,IAAY,EACZ,OAAe,EACf,aAAsB;IAEtB,OAAO,CACL,QAAQ,KAAK,OAAO;QACpB,IAAI,KAAK,MAAM;QACf,MAAM,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QAC3C,OAAO,aAAa,KAAK,WAAW,CACrC,CAAC;AACJ,CAAC;AAZD,kEAYC;AAED;;;GAGG;AACH,SAAgB,MAAM,CAAC,MAAM,GAAG,EAAE;IAChC,MAAM,aAAa,GAAsB,EAAE,CAAC;IAE5C,KAAK,MAAM,MAAM,IAAI,OAAO,CAAC,GAAG,EAAE;
QAChC,aAAa,CAAC,MAAM,CAAC,WAAW,EAAE,CAAC,GAAG,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC;KAC3D;IAED,OAAO,CAAC,IAAY,EAAsB,EAAE;QAC1C,OAAO,CACL,aAAa,CAAC,GAAG,MAAM,GAAG,IAAI,EAAE,CAAC,WAAW,EAAE,CAAC;YAC/C,aAAa,CAAC,IAAI,CAAC,WAAW,EAAE,CAAC;YACjC,SAAS,CACV,CAAC;IACJ,CAAC,CAAC;AACJ,CAAC;AAdD,wBAcC;AAED,SAAgB,MAAM,CAAC,GAAW,EAAE,KAAyB;IAC3D,kDAAkD;IAClD,wFAAwF;IACxF,IAAI,KAAK,KAAK,KAAK,CAAC,EAAE;QACpB,OAAO,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;KAC1B;AACH,CAAC;AAND,wBAMC"}
8
node_modules/@electron/get/dist/esm/Cache.d.ts
generated
vendored
Normal file
@@ -0,0 +1,8 @@
export declare class Cache {
    private cacheRoot;
    constructor(cacheRoot?: string);
    static getCacheDirectory(downloadUrl: string): string;
    getCachePath(downloadUrl: string, fileName: string): string;
    getPathForFileInCache(url: string, fileName: string): Promise<string | null>;
    putFileInCache(url: string, currentPath: string, fileName: string): Promise<string>;
}
57
node_modules/@electron/get/dist/esm/Cache.js
generated
vendored
Normal file
@@ -0,0 +1,57 @@
var __rest = (this && this.__rest) || function (s, e) {
    var t = {};
    for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)
        t[p] = s[p];
    if (s != null && typeof Object.getOwnPropertySymbols === "function")
        for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {
            if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))
                t[p[i]] = s[p[i]];
        }
    return t;
};
import debug from 'debug';
import envPaths from 'env-paths';
import * as fs from 'fs-extra';
import * as path from 'path';
import * as url from 'url';
import * as crypto from 'crypto';
const d = debug('@electron/get:cache');
const defaultCacheRoot = envPaths('electron', {
    suffix: '',
}).cache;
export class Cache {
    constructor(cacheRoot = defaultCacheRoot) {
        this.cacheRoot = cacheRoot;
    }
    static getCacheDirectory(downloadUrl) {
        const parsedDownloadUrl = url.parse(downloadUrl);
        // eslint-disable-next-line @typescript-eslint/no-unused-vars
        const { search, hash, pathname } = parsedDownloadUrl, rest = __rest(parsedDownloadUrl, ["search", "hash", "pathname"]);
        const strippedUrl = url.format(Object.assign(Object.assign({}, rest), { pathname: path.dirname(pathname || 'electron') }));
        return crypto
            .createHash('sha256')
            .update(strippedUrl)
            .digest('hex');
    }
    getCachePath(downloadUrl, fileName) {
        return path.resolve(this.cacheRoot, Cache.getCacheDirectory(downloadUrl), fileName);
    }
    async getPathForFileInCache(url, fileName) {
        const cachePath = this.getCachePath(url, fileName);
        if (await fs.pathExists(cachePath)) {
            return cachePath;
        }
        return null;
    }
    async putFileInCache(url, currentPath, fileName) {
        const cachePath = this.getCachePath(url, fileName);
        d(`Moving ${currentPath} to ${cachePath}`);
        if (await fs.pathExists(cachePath)) {
            d('* Replacing existing file');
            await fs.remove(cachePath);
        }
        await fs.move(currentPath, cachePath);
        return cachePath;
    }
}
//# sourceMappingURL=Cache.js.map
1
node_modules/@electron/get/dist/esm/Cache.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Cache.js","sourceRoot":"","sources":["../../src/Cache.ts"],"names":[],"mappings":";;;;;;;;;;;AAAA,OAAO,KAAK,MAAM,OAAO,CAAC;AAC1B,OAAO,QAAQ,MAAM,WAAW,CAAC;AACjC,OAAO,KAAK,EAAE,MAAM,UAAU,CAAC;AAC/B,OAAO,KAAK,IAAI,MAAM,MAAM,CAAC;AAC7B,OAAO,KAAK,GAAG,MAAM,KAAK,CAAC;AAC3B,OAAO,KAAK,MAAM,MAAM,QAAQ,CAAC;AAEjC,MAAM,CAAC,GAAG,KAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC,MAAM,gBAAgB,GAAG,QAAQ,CAAC,UAAU,EAAE;IAC5C,MAAM,EAAE,EAAE;CACX,CAAC,CAAC,KAAK,CAAC;AAET,MAAM,OAAO,KAAK;IAChB,YAAoB,YAAY,gBAAgB;QAA5B,cAAS,GAAT,SAAS,CAAmB;IAAG,CAAC;IAE7C,MAAM,CAAC,iBAAiB,CAAC,WAAmB;QACjD,MAAM,iBAAiB,GAAG,GAAG,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;QACjD,6DAA6D;QAC7D,MAAM,EAAE,MAAM,EAAE,IAAI,EAAE,QAAQ,KAAc,iBAAiB,EAA7B,gEAA6B,CAAC;QAC9D,MAAM,WAAW,GAAG,GAAG,CAAC,MAAM,iCAAM,IAAI,KAAE,QAAQ,EAAE,IAAI,CAAC,OAAO,CAAC,QAAQ,IAAI,UAAU,CAAC,IAAG,CAAC;QAE5F,OAAO,MAAM;aACV,UAAU,CAAC,QAAQ,CAAC;aACpB,MAAM,CAAC,WAAW,CAAC;aACnB,MAAM,CAAC,KAAK,CAAC,CAAC;IACnB,CAAC;IAEM,YAAY,CAAC,WAAmB,EAAE,QAAgB;QACvD,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,SAAS,EAAE,KAAK,CAAC,iBAAiB,CAAC,WAAW,CAAC,EAAE,QAAQ,CAAC,CAAC;IACtF,CAAC;IAEM,KAAK,CAAC,qBAAqB,CAAC,GAAW,EAAE,QAAgB;QAC9D,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QACnD,IAAI,MAAM,EAAE,CAAC,UAAU,CAAC,SAAS,CAAC,EAAE;YAClC,OAAO,SAAS,CAAC;SAClB;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAEM,KAAK,CAAC,cAAc,CAAC,GAAW,EAAE,WAAmB,EAAE,QAAgB;QAC5E,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QACnD,CAAC,CAAC,UAAU,WAAW,OAAO,SAAS,EAAE,CAAC,CAAC;QAC3C,IAAI,MAAM,EAAE,CAAC,UAAU,CAAC,SAAS,CAAC,EAAE;YAClC,CAAC,CAAC,2BAA2B,CAAC,CAAC;YAC/B,MAAM,EAAE,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC;SAC5B;QAED,MAAM,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,SAAS,CAAC,CAAC;QAEtC,OAAO,SAAS,CAAC;IACnB,CAAC;CACF"}
3
node_modules/@electron/get/dist/esm/Downloader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,3 @@
export interface Downloader<T> {
    download(url: string, targetFilePath: string, options: T): Promise<void>;
}
1
node_modules/@electron/get/dist/esm/Downloader.js
generated
vendored
Normal file
@@ -0,0 +1 @@
//# sourceMappingURL=Downloader.js.map
1
node_modules/@electron/get/dist/esm/Downloader.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Downloader.js","sourceRoot":"","sources":["../../src/Downloader.ts"],"names":[],"mappings":""}
21
node_modules/@electron/get/dist/esm/GotDownloader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,21 @@
import { Progress as GotProgress, Options as GotOptions } from 'got';
import { Downloader } from './Downloader';
/**
 * See [`got#options`](https://github.com/sindresorhus/got#options) for possible keys/values.
 */
export declare type GotDownloaderOptions = (GotOptions & {
    isStream?: true;
}) & {
    /**
     * if defined, triggers every time `got`'s `downloadProgress` event callback is triggered.
     */
    getProgressCallback?: (progress: GotProgress) => Promise<void>;
    /**
     * if `true`, disables the console progress bar (setting the `ELECTRON_GET_NO_PROGRESS`
     * environment variable to a non-empty value also does this).
     */
    quiet?: boolean;
};
export declare class GotDownloader implements Downloader<GotDownloaderOptions> {
    download(url: string, targetFilePath: string, options?: GotDownloaderOptions): Promise<void>;
}
73
node_modules/@electron/get/dist/esm/GotDownloader.js
generated
vendored
Normal file
@@ -0,0 +1,73 @@
var __rest = (this && this.__rest) || function (s, e) {
    var t = {};
    for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)
        t[p] = s[p];
    if (s != null && typeof Object.getOwnPropertySymbols === "function")
        for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {
            if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))
                t[p[i]] = s[p[i]];
        }
    return t;
};
import * as fs from 'fs-extra';
import got, { HTTPError } from 'got';
import * as path from 'path';
import * as ProgressBar from 'progress';
const PROGRESS_BAR_DELAY_IN_SECONDS = 30;
export class GotDownloader {
    async download(url, targetFilePath, options) {
        if (!options) {
            options = {};
        }
        const { quiet, getProgressCallback } = options, gotOptions = __rest(options, ["quiet", "getProgressCallback"]);
        let downloadCompleted = false;
        let bar;
        let progressPercent;
        let timeout = undefined;
        await fs.mkdirp(path.dirname(targetFilePath));
        const writeStream = fs.createWriteStream(targetFilePath);
        if (!quiet || !process.env.ELECTRON_GET_NO_PROGRESS) {
            const start = new Date();
            timeout = setTimeout(() => {
                if (!downloadCompleted) {
                    bar = new ProgressBar(`Downloading ${path.basename(url)}: [:bar] :percent ETA: :eta seconds `, {
                        curr: progressPercent,
                        total: 100,
                    });
                    // https://github.com/visionmedia/node-progress/issues/159
                    // eslint-disable-next-line @typescript-eslint/no-explicit-any
                    bar.start = start;
                }
            }, PROGRESS_BAR_DELAY_IN_SECONDS * 1000);
        }
        await new Promise((resolve, reject) => {
            const downloadStream = got.stream(url, gotOptions);
            downloadStream.on('downloadProgress', async (progress) => {
                progressPercent = progress.percent;
                if (bar) {
                    bar.update(progress.percent);
                }
                if (getProgressCallback) {
                    await getProgressCallback(progress);
                }
            });
            downloadStream.on('error', error => {
                if (error instanceof HTTPError && error.response.statusCode === 404) {
                    error.message += ` for ${error.response.url}`;
                }
                if (writeStream.destroy) {
                    writeStream.destroy(error);
                }
                reject(error);
            });
            writeStream.on('error', error => reject(error));
            writeStream.on('close', () => resolve());
            downloadStream.pipe(writeStream);
        });
        downloadCompleted = true;
        if (timeout) {
            clearTimeout(timeout);
        }
    }
}
//# sourceMappingURL=GotDownloader.js.map
1
node_modules/@electron/get/dist/esm/GotDownloader.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"GotDownloader.js","sourceRoot":"","sources":["../../src/GotDownloader.ts"],"names":[],"mappings":";;;;;;;;;;;AAAA,OAAO,KAAK,EAAE,MAAM,UAAU,CAAC;AAC/B,OAAO,GAAG,EAAE,EAAE,SAAS,EAAkD,MAAM,KAAK,CAAC;AACrF,OAAO,KAAK,IAAI,MAAM,MAAM,CAAC;AAC7B,OAAO,KAAK,WAAW,MAAM,UAAU,CAAC;AAIxC,MAAM,6BAA6B,GAAG,EAAE,CAAC;AAiBzC,MAAM,OAAO,aAAa;IACxB,KAAK,CAAC,QAAQ,CACZ,GAAW,EACX,cAAsB,EACtB,OAA8B;QAE9B,IAAI,CAAC,OAAO,EAAE;YACZ,OAAO,GAAG,EAAE,CAAC;SACd;QACD,MAAM,EAAE,KAAK,EAAE,mBAAmB,KAAoB,OAAO,EAAzB,8DAAyB,CAAC;QAC9D,IAAI,iBAAiB,GAAG,KAAK,CAAC;QAC9B,IAAI,GAA4B,CAAC;QACjC,IAAI,eAAuB,CAAC;QAC5B,IAAI,OAAO,GAA+B,SAAS,CAAC;QACpD,MAAM,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC;QAC9C,MAAM,WAAW,GAAG,EAAE,CAAC,iBAAiB,CAAC,cAAc,CAAC,CAAC;QAEzD,IAAI,CAAC,KAAK,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,wBAAwB,EAAE;YACnD,MAAM,KAAK,GAAG,IAAI,IAAI,EAAE,CAAC;YACzB,OAAO,GAAG,UAAU,CAAC,GAAG,EAAE;gBACxB,IAAI,CAAC,iBAAiB,EAAE;oBACtB,GAAG,GAAG,IAAI,WAAW,CACnB,eAAe,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,sCAAsC,EACvE;wBACE,IAAI,EAAE,eAAe;wBACrB,KAAK,EAAE,GAAG;qBACX,CACF,CAAC;oBACF,0DAA0D;oBAC1D,8DAA8D;oBAC7D,GAAW,CAAC,KAAK,GAAG,KAAK,CAAC;iBAC5B;YACH,CAAC,EAAE,6BAA6B,GAAG,IAAI,CAAC,CAAC;SAC1C;QACD,MAAM,IAAI,OAAO,CAAO,CAAC,OAAO,EAAE,MAAM,EAAE,EAAE;YAC1C,MAAM,cAAc,GAAG,GAAG,CAAC,MAAM,CAAC,GAAG,EAAE,UAAU,CAAC,CAAC;YACnD,cAAc,CAAC,EAAE,CAAC,kBAAkB,EAAE,KAAK,EAAC,QAAQ,EAAC,EAAE;gBACrD,eAAe,GAAG,QAAQ,CAAC,OAAO,CAAC;gBACnC,IAAI,GAAG,EAAE;oBACP,GAAG,CAAC,MAAM,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC;iBAC9B;gBACD,IAAI,mBAAmB,EAAE;oBACvB,MAAM,mBAAmB,CAAC,QAAQ,CAAC,CAAC;iBACrC;YACH,CAAC,CAAC,CAAC;YACH,cAAc,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE;gBACjC,IAAI,KAAK,YAAY,SAAS,IAAI,KAAK,CAAC,QAAQ,CAAC,UAAU,KAAK,GAAG,EAAE;oBACnE,KAAK,CAAC,OAAO,IAAI,QAAQ,KAAK,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC;iBAC/C;gBACD,IAAI,WAAW,CAAC,OAAO,EAAE;oBACvB,WAAW,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;iBAC5B;gBAED,MAAM,CAAC,KAAK,CAAC,CAAC;YAChB,CAAC,CAAC,CAAC;YACH,WAAW,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC;YAChD,WAAW,CAAC,
EAAE,CAAC,OAAO,EAAE,GAAG,EAAE,CAAC,OAAO,EAAE,CAAC,CAAC;YAEzC,cAAc,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;QACnC,CAAC,CAAC,CAAC;QAEH,iBAAiB,GAAG,IAAI,CAAC;QACzB,IAAI,OAAO,EAAE;YACX,YAAY,CAAC,OAAO,CAAC,CAAC;SACvB;IACH,CAAC;CACF"}
4
node_modules/@electron/get/dist/esm/artifact-utils.d.ts
generated
vendored
Normal file
@@ -0,0 +1,4 @@
import { ElectronArtifactDetails } from './types';
export declare function getArtifactFileName(details: ElectronArtifactDetails): string;
export declare function getArtifactRemoteURL(details: ElectronArtifactDetails): Promise<string>;
export declare function getArtifactVersion(details: ElectronArtifactDetails): string;
61
node_modules/@electron/get/dist/esm/artifact-utils.js
generated
vendored
Normal file
@@ -0,0 +1,61 @@
import { ensureIsTruthyString, normalizeVersion } from './utils';
const BASE_URL = 'https://github.com/electron/electron/releases/download/';
const NIGHTLY_BASE_URL = 'https://github.com/electron/nightlies/releases/download/';
export function getArtifactFileName(details) {
    ensureIsTruthyString(details, 'artifactName');
    if (details.isGeneric) {
        return details.artifactName;
    }
    ensureIsTruthyString(details, 'arch');
    ensureIsTruthyString(details, 'platform');
    ensureIsTruthyString(details, 'version');
    return `${[
        details.artifactName,
        details.version,
        details.platform,
        details.arch,
        ...(details.artifactSuffix ? [details.artifactSuffix] : []),
    ].join('-')}.zip`;
}
function mirrorVar(name, options, defaultValue) {
    // Convert camelCase to camel_case for env var reading
    const snakeName = name.replace(/([a-z])([A-Z])/g, (_, a, b) => `${a}_${b}`).toLowerCase();
    return (
    // .npmrc
    process.env[`npm_config_electron_${name.toLowerCase()}`] ||
        process.env[`NPM_CONFIG_ELECTRON_${snakeName.toUpperCase()}`] ||
        process.env[`npm_config_electron_${snakeName}`] ||
        // package.json
        process.env[`npm_package_config_electron_${name}`] ||
        process.env[`npm_package_config_electron_${snakeName.toLowerCase()}`] ||
        // env
        process.env[`ELECTRON_${snakeName.toUpperCase()}`] ||
        options[name] ||
        defaultValue);
}
export async function getArtifactRemoteURL(details) {
    const opts = details.mirrorOptions || {};
    let base = mirrorVar('mirror', opts, BASE_URL);
    if (details.version.includes('nightly')) {
        const nightlyDeprecated = mirrorVar('nightly_mirror', opts, '');
        if (nightlyDeprecated) {
            base = nightlyDeprecated;
            console.warn(`nightly_mirror is deprecated, please use nightlyMirror`);
        }
        else {
            base = mirrorVar('nightlyMirror', opts, NIGHTLY_BASE_URL);
        }
    }
    const path = mirrorVar('customDir', opts, details.version).replace('{{ version }}', details.version.replace(/^v/, ''));
    const file = mirrorVar('customFilename', opts, getArtifactFileName(details));
    // Allow customized download URL resolution.
    if (opts.resolveAssetURL) {
        const url = await opts.resolveAssetURL(details);
        return url;
    }
    return `${base}${path}/${file}`;
}
export function getArtifactVersion(details) {
    return normalizeVersion(mirrorVar('customVersion', details.mirrorOptions || {}, details.version));
}
//# sourceMappingURL=artifact-utils.js.map
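For context, `getArtifactFileName` above joins the artifact name, version, platform, arch, and an optional suffix with hyphens. A minimal sketch of just that naming convention (reimplemented for illustration; the vendored function additionally validates that each field is a non-empty string):

```javascript
// Sketch of the file-name convention implemented by getArtifactFileName()
// above: generic artifacts keep their name as-is, platform artifacts become
// <name>-<version>-<platform>-<arch>[-<suffix>].zip.
function artifactFileName({ artifactName, version, platform, arch, artifactSuffix, isGeneric }) {
  if (isGeneric) {
    return artifactName;
  }
  return `${[
    artifactName,
    version,
    platform,
    arch,
    ...(artifactSuffix ? [artifactSuffix] : []),
  ].join('-')}.zip`;
}

console.log(artifactFileName({
  artifactName: 'electron', version: 'v4.0.4', platform: 'linux', arch: 'x64',
})); // electron-v4.0.4-linux-x64.zip
console.log(artifactFileName({ artifactName: 'SHASUMS256.txt', isGeneric: true }));
```

This file name is the last path segment of the URL that `getArtifactRemoteURL` builds, unless `customFilename` overrides it.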
1
node_modules/@electron/get/dist/esm/artifact-utils.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"artifact-utils.js","sourceRoot":"","sources":["../../src/artifact-utils.ts"],"names":[],"mappings":"AACA,OAAO,EAAE,oBAAoB,EAAE,gBAAgB,EAAE,MAAM,SAAS,CAAC;AAEjE,MAAM,QAAQ,GAAG,yDAAyD,CAAC;AAC3E,MAAM,gBAAgB,GAAG,0DAA0D,CAAC;AAEpF,MAAM,UAAU,mBAAmB,CAAC,OAAgC;IAClE,oBAAoB,CAAC,OAAO,EAAE,cAAc,CAAC,CAAC;IAE9C,IAAI,OAAO,CAAC,SAAS,EAAE;QACrB,OAAO,OAAO,CAAC,YAAY,CAAC;KAC7B;IAED,oBAAoB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;IACtC,oBAAoB,CAAC,OAAO,EAAE,UAAU,CAAC,CAAC;IAC1C,oBAAoB,CAAC,OAAO,EAAE,SAAS,CAAC,CAAC;IAEzC,OAAO,GAAG;QACR,OAAO,CAAC,YAAY;QACpB,OAAO,CAAC,OAAO;QACf,OAAO,CAAC,QAAQ;QAChB,OAAO,CAAC,IAAI;QACZ,GAAG,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;KAC5D,CAAC,IAAI,CAAC,GAAG,CAAC,MAAM,CAAC;AACpB,CAAC;AAED,SAAS,SAAS,CAChB,IAAkD,EAClD,OAAsB,EACtB,YAAoB;IAEpB,sDAAsD;IACtD,MAAM,SAAS,GAAG,IAAI,CAAC,OAAO,CAAC,iBAAiB,EAAE,CAAC,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,GAAG,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,WAAW,EAAE,CAAC;IAE1F,OAAO;IACL,SAAS;IACT,OAAO,CAAC,GAAG,CAAC,uBAAuB,IAAI,CAAC,WAAW,EAAE,EAAE,CAAC;QACxD,OAAO,CAAC,GAAG,CAAC,uBAAuB,SAAS,CAAC,WAAW,EAAE,EAAE,CAAC;QAC7D,OAAO,CAAC,GAAG,CAAC,uBAAuB,SAAS,EAAE,CAAC;QAC/C,eAAe;QACf,OAAO,CAAC,GAAG,CAAC,+BAA+B,IAAI,EAAE,CAAC;QAClD,OAAO,CAAC,GAAG,CAAC,+BAA+B,SAAS,CAAC,WAAW,EAAE,EAAE,CAAC;QACrE,MAAM;QACN,OAAO,CAAC,GAAG,CAAC,YAAY,SAAS,CAAC,WAAW,EAAE,EAAE,CAAC;QAClD,OAAO,CAAC,IAAI,CAAC;QACb,YAAY,CACb,CAAC;AACJ,CAAC;AAED,MAAM,CAAC,KAAK,UAAU,oBAAoB,CAAC,OAAgC;IACzE,MAAM,IAAI,GAAkB,OAAO,CAAC,aAAa,IAAI,EAAE,CAAC;IACxD,IAAI,IAAI,GAAG,SAAS,CAAC,QAAQ,EAAE,IAAI,EAAE,QAAQ,CAAC,CAAC;IAC/C,IAAI,OAAO,CAAC,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE;QACvC,MAAM,iBAAiB,GAAG,SAAS,CAAC,gBAAgB,EAAE,IAAI,EAAE,EAAE,CAAC,CAAC;QAChE,IAAI,iBAAiB,EAAE;YACrB,IAAI,GAAG,iBAAiB,CAAC;YACzB,OAAO,CAAC,IAAI,CAAC,wDAAwD,CAAC,CAAC;SACxE;aAAM;YACL,IAAI,GAAG,SAAS,CAAC,eAAe,EAAE,IAAI,EAAE,gBAAgB,CAAC,CAAC;SAC3D;KACF;IACD,MAAM,IAAI,GAAG,SAAS,CAAC,WAAW,EAAE,IAAI,EAAE,OAAO,CAAC,OAAO,CAAC,CAAC,OAAO,CAChE,eAAe,EACf,OAAO,CAAC,OAAO,CAAC,OAAO
,CAAC,IAAI,EAAE,EAAE,CAAC,CAClC,CAAC;IACF,MAAM,IAAI,GAAG,SAAS,CAAC,gBAAgB,EAAE,IAAI,EAAE,mBAAmB,CAAC,OAAO,CAAC,CAAC,CAAC;IAE7E,4CAA4C;IAC5C,IAAI,IAAI,CAAC,eAAe,EAAE;QACxB,MAAM,GAAG,GAAG,MAAM,IAAI,CAAC,eAAe,CAAC,OAAO,CAAC,CAAC;QAChD,OAAO,GAAG,CAAC;KACZ;IAED,OAAO,GAAG,IAAI,GAAG,IAAI,IAAI,IAAI,EAAE,CAAC;AAClC,CAAC;AAED,MAAM,UAAU,kBAAkB,CAAC,OAAgC;IACjE,OAAO,gBAAgB,CAAC,SAAS,CAAC,eAAe,EAAE,OAAO,CAAC,aAAa,IAAI,EAAE,EAAE,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC;AACpG,CAAC"}
3
node_modules/@electron/get/dist/esm/downloader-resolver.d.ts
generated
vendored
Normal file
@@ -0,0 +1,3 @@
import { DownloadOptions } from './types';
import { Downloader } from './Downloader';
export declare function getDownloaderForSystem(): Promise<Downloader<DownloadOptions>>;
9
node_modules/@electron/get/dist/esm/downloader-resolver.js
generated
vendored
Normal file
@@ -0,0 +1,9 @@
export async function getDownloaderForSystem() {
    // TODO: Resolve the downloader or default to GotDownloader
    // Current thoughts are a dot-file traversal for something like
    // ".electron.downloader" which would be a text file with the name of the
    // npm module to import() and use as the downloader
    const { GotDownloader } = await import('./GotDownloader');
    return new GotDownloader();
}
//# sourceMappingURL=downloader-resolver.js.map
1
node_modules/@electron/get/dist/esm/downloader-resolver.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"downloader-resolver.js","sourceRoot":"","sources":["../../src/downloader-resolver.ts"],"names":[],"mappings":"AAGA,MAAM,CAAC,KAAK,UAAU,sBAAsB;IAC1C,2DAA2D;IAC3D,+DAA+D;IAC/D,yEAAyE;IACzE,mDAAmD;IACnD,MAAM,EAAE,aAAa,EAAE,GAAG,MAAM,MAAM,CAAC,iBAAiB,CAAC,CAAC;IAC1D,OAAO,IAAI,aAAa,EAAE,CAAC;AAC7B,CAAC"}
18
node_modules/@electron/get/dist/esm/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,18 @@
import { ElectronDownloadRequestOptions, ElectronPlatformArtifactDetailsWithDefaults } from './types';
export { getHostArch } from './utils';
export { initializeProxy } from './proxy';
export * from './types';
/**
 * Downloads an artifact from an Electron release and returns an absolute path
 * to the downloaded file.
 *
 * @param artifactDetails - The information required to download the artifact
 */
export declare function downloadArtifact(_artifactDetails: ElectronPlatformArtifactDetailsWithDefaults): Promise<string>;
/**
 * Downloads a specific version of Electron and returns an absolute path to a
 * ZIP file.
 *
 * @param version - The version of Electron you want to download
 */
export declare function download(version: string, options?: ElectronDownloadRequestOptions): Promise<string>;
134
node_modules/@electron/get/dist/esm/index.js
generated
vendored
Normal file
@@ -0,0 +1,134 @@
import debug from 'debug';
import * as fs from 'fs-extra';
import * as path from 'path';
import * as semver from 'semver';
import * as sumchecker from 'sumchecker';
import { getArtifactFileName, getArtifactRemoteURL, getArtifactVersion } from './artifact-utils';
import { Cache } from './Cache';
import { getDownloaderForSystem } from './downloader-resolver';
import { initializeProxy } from './proxy';
import { withTempDirectoryIn, getHostArch, getNodeArch, ensureIsTruthyString, isOfficialLinuxIA32Download, } from './utils';
export { getHostArch } from './utils';
export { initializeProxy } from './proxy';
const d = debug('@electron/get:index');
if (process.env.ELECTRON_GET_USE_PROXY) {
    initializeProxy();
}
async function validateArtifact(artifactDetails, downloadedAssetPath, _downloadArtifact) {
    return await withTempDirectoryIn(artifactDetails.tempDirectory, async (tempFolder) => {
        // Don't try to verify the hash of the hash file itself
        // and for older versions that don't have a SHASUMS256.txt
        if (!artifactDetails.artifactName.startsWith('SHASUMS256') &&
            !artifactDetails.unsafelyDisableChecksums &&
            semver.gte(artifactDetails.version, '1.3.2')) {
            let shasumPath;
            const checksums = artifactDetails.checksums;
            if (checksums) {
                shasumPath = path.resolve(tempFolder, 'SHASUMS256.txt');
                const fileNames = Object.keys(checksums);
                if (fileNames.length === 0) {
                    throw new Error('Provided "checksums" object is empty, cannot generate a valid SHASUMS256.txt');
                }
                const generatedChecksums = fileNames
                    .map(fileName => `${checksums[fileName]} *${fileName}`)
                    .join('\n');
                await fs.writeFile(shasumPath, generatedChecksums);
            }
            else {
                shasumPath = await _downloadArtifact({
                    isGeneric: true,
                    version: artifactDetails.version,
                    artifactName: 'SHASUMS256.txt',
                    force: artifactDetails.force,
                    downloadOptions: artifactDetails.downloadOptions,
                    cacheRoot: artifactDetails.cacheRoot,
                    downloader: artifactDetails.downloader,
                    mirrorOptions: artifactDetails.mirrorOptions,
                });
            }
            // For versions 1.3.2 - 1.3.4, need to overwrite the `defaultTextEncoding` option:
            // https://github.com/electron/electron/pull/6676#discussion_r75332120
            if (semver.satisfies(artifactDetails.version, '1.3.2 - 1.3.4')) {
                const validatorOptions = {};
                validatorOptions.defaultTextEncoding = 'binary';
                const checker = new sumchecker.ChecksumValidator('sha256', shasumPath, validatorOptions);
                await checker.validate(path.dirname(downloadedAssetPath), path.basename(downloadedAssetPath));
            }
            else {
                await sumchecker('sha256', shasumPath, path.dirname(downloadedAssetPath), [
                    path.basename(downloadedAssetPath),
                ]);
            }
        }
    });
}
/**
 * Downloads an artifact from an Electron release and returns an absolute path
 * to the downloaded file.
 *
 * @param artifactDetails - The information required to download the artifact
 */
export async function downloadArtifact(_artifactDetails) {
    const artifactDetails = Object.assign({}, _artifactDetails);
    if (!_artifactDetails.isGeneric) {
        const platformArtifactDetails = artifactDetails;
        if (!platformArtifactDetails.platform) {
            d('No platform found, defaulting to the host platform');
            platformArtifactDetails.platform = process.platform;
        }
        if (platformArtifactDetails.arch) {
            platformArtifactDetails.arch = getNodeArch(platformArtifactDetails.arch);
        }
        else {
            d('No arch found, defaulting to the host arch');
            platformArtifactDetails.arch = getHostArch();
        }
    }
    ensureIsTruthyString(artifactDetails, 'version');
    artifactDetails.version = getArtifactVersion(artifactDetails);
    const fileName = getArtifactFileName(artifactDetails);
    const url = await getArtifactRemoteURL(artifactDetails);
    const cache = new Cache(artifactDetails.cacheRoot);
    // Do not check if the file exists in the cache when force === true
    if (!artifactDetails.force) {
        d(`Checking the cache (${artifactDetails.cacheRoot}) for ${fileName} (${url})`);
        const cachedPath = await cache.getPathForFileInCache(url, fileName);
        if (cachedPath === null) {
            d('Cache miss');
        }
        else {
            d('Cache hit');
            try {
                await validateArtifact(artifactDetails, cachedPath, downloadArtifact);
                return cachedPath;
            }
            catch (err) {
                d("Artifact in cache didn't match checksums", err);
                d('falling back to re-download');
            }
        }
    }
    if (!artifactDetails.isGeneric &&
        isOfficialLinuxIA32Download(artifactDetails.platform, artifactDetails.arch, artifactDetails.version, artifactDetails.mirrorOptions)) {
        console.warn('Official Linux/ia32 support is deprecated.');
        console.warn('For more info: https://electronjs.org/blog/linux-32bit-support');
    }
    return await withTempDirectoryIn(artifactDetails.tempDirectory, async (tempFolder) => {
        const tempDownloadPath = path.resolve(tempFolder, getArtifactFileName(artifactDetails));
        const downloader = artifactDetails.downloader || (await getDownloaderForSystem());
        d(`Downloading ${url} to ${tempDownloadPath} with options: ${JSON.stringify(artifactDetails.downloadOptions)}`);
        await downloader.download(url, tempDownloadPath, artifactDetails.downloadOptions);
        await validateArtifact(artifactDetails, tempDownloadPath, downloadArtifact);
        return await cache.putFileInCache(url, tempDownloadPath, fileName);
    });
}
/**
 * Downloads a specific version of Electron and returns an absolute path to a
 * ZIP file.
 *
 * @param version - The version of Electron you want to download
 */
export function download(version, options) {
    return downloadArtifact(Object.assign(Object.assign({}, options), { version, platform: process.platform, arch: process.arch, artifactName: 'electron' }));
}
//# sourceMappingURL=index.js.map
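The `checksums` branch of `validateArtifact` above synthesizes a `SHASUMS256.txt` from a user-supplied map instead of downloading one. Extracted as a standalone sketch (the function name is ours; the line format and empty-object error match the code above):

```javascript
// Emit one "<sha256> *<filename>" line per artifact, matching the
// SHASUMS256.txt format that sumchecker validates against.
function generateShasumFile(checksums) {
  const fileNames = Object.keys(checksums);
  if (fileNames.length === 0) {
    throw new Error('Provided "checksums" object is empty, cannot generate a valid SHASUMS256.txt');
  }
  return fileNames.map(fileName => `${checksums[fileName]} *${fileName}`).join('\n');
}
```

The `*` prefix marks the entry as a binary-mode checksum, the same convention `shasum -c` understands.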
1
node_modules/@electron/get/dist/esm/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,MAAM,OAAO,CAAC;AAC1B,OAAO,KAAK,EAAE,MAAM,UAAU,CAAC;AAC/B,OAAO,KAAK,IAAI,MAAM,MAAM,CAAC;AAC7B,OAAO,KAAK,MAAM,MAAM,QAAQ,CAAC;AACjC,OAAO,KAAK,UAAU,MAAM,YAAY,CAAC;AAEzC,OAAO,EAAE,mBAAmB,EAAE,oBAAoB,EAAE,kBAAkB,EAAE,MAAM,kBAAkB,CAAC;AAOjG,OAAO,EAAE,KAAK,EAAE,MAAM,SAAS,CAAC;AAChC,OAAO,EAAE,sBAAsB,EAAE,MAAM,uBAAuB,CAAC;AAC/D,OAAO,EAAE,eAAe,EAAE,MAAM,SAAS,CAAC;AAC1C,OAAO,EACL,mBAAmB,EACnB,WAAW,EACX,WAAW,EACX,oBAAoB,EACpB,2BAA2B,GAE5B,MAAM,SAAS,CAAC;AAEjB,OAAO,EAAE,WAAW,EAAE,MAAM,SAAS,CAAC;AACtC,OAAO,EAAE,eAAe,EAAE,MAAM,SAAS,CAAC;AAG1C,MAAM,CAAC,GAAG,KAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC,IAAI,OAAO,CAAC,GAAG,CAAC,sBAAsB,EAAE;IACtC,eAAe,EAAE,CAAC;CACnB;AAMD,KAAK,UAAU,gBAAgB,CAC7B,eAAwC,EACxC,mBAA2B,EAC3B,iBAAqC;IAErC,OAAO,MAAM,mBAAmB,CAAC,eAAe,CAAC,aAAa,EAAE,KAAK,EAAC,UAAU,EAAC,EAAE;QACjF,uDAAuD;QACvD,0DAA0D;QAC1D,IACE,CAAC,eAAe,CAAC,YAAY,CAAC,UAAU,CAAC,YAAY,CAAC;YACtD,CAAC,eAAe,CAAC,wBAAwB;YACzC,MAAM,CAAC,GAAG,CAAC,eAAe,CAAC,OAAO,EAAE,OAAO,CAAC,EAC5C;YACA,IAAI,UAAkB,CAAC;YACvB,MAAM,SAAS,GAAG,eAAe,CAAC,SAAS,CAAC;YAC5C,IAAI,SAAS,EAAE;gBACb,UAAU,GAAG,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE,gBAAgB,CAAC,CAAC;gBACxD,MAAM,SAAS,GAAa,MAAM,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBACnD,IAAI,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;oBAC1B,MAAM,IAAI,KAAK,CACb,8EAA8E,CAC/E,CAAC;iBACH;gBACD,MAAM,kBAAkB,GAAG,SAAS;qBACjC,GAAG,CAAC,QAAQ,CAAC,EAAE,CAAC,GAAG,SAAS,CAAC,QAAQ,CAAC,KAAK,QAAQ,EAAE,CAAC;qBACtD,IAAI,CAAC,IAAI,CAAC,CAAC;gBACd,MAAM,EAAE,CAAC,SAAS,CAAC,UAAU,EAAE,kBAAkB,CAAC,CAAC;aACpD;iBAAM;gBACL,UAAU,GAAG,MAAM,iBAAiB,CAAC;oBACnC,SAAS,EAAE,IAAI;oBACf,OAAO,EAAE,eAAe,CAAC,OAAO;oBAChC,YAAY,EAAE,gBAAgB;oBAC9B,KAAK,EAAE,eAAe,CAAC,KAAK;oBAC5B,eAAe,EAAE,eAAe,CAAC,eAAe;oBAChD,SAAS,EAAE,eAAe,CAAC,SAAS;oBACpC,UAAU,EAAE,eAAe,CAAC,UAAU;oBACtC,aAAa,EAAE,eAAe,CAAC,aAAa;iBAC7C,CAAC,CAAC;aACJ;YAED,kFAAkF;YAClF,sEAAsE;YACtE,IAAI,MAAM,CAAC,SAAS,CAAC,eAAe,CAAC,OAAO,EAAE,eAAe,CAAC,EAAE;gBAC9D,MAAM,gBAAgB,GAA+B,EAAE,CAAC;gBACxD,gBAAgB,CAAC,mBAAmB,GAAG,QAAQ,CAAC;gBAChD,MAAM,OAAO,GAAG,IAAI,UAAU,CAAC,iBAAiB,CAAC,QAAQ,EAAE,UAAU,EAAE,gBAAgB,CAAC,CAAC;gBACzF,MAAM,OAAO,CAAC,QAAQ,CACpB,IAAI,CAAC,OAAO,CAAC,mBAAmB,CAAC,EACjC,IAAI,CAAC,QAAQ,CAAC,mBAAmB,CAAC,CACnC,CAAC;aACH;iBAAM;gBACL,MAAM,UAAU,CAAC,QAAQ,EAAE,UAAU,EAAE,IAAI,CAAC,OAAO,CAAC,mBAAmB,CAAC,EAAE;oBACxE,IAAI,CAAC,QAAQ,CAAC,mBAAmB,CAAC;iBACnC,CAAC,CAAC;aACJ;SACF;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AAED;;;;;GAKG;AACH,MAAM,CAAC,KAAK,UAAU,gBAAgB,CACpC,gBAA6D;IAE7D,MAAM,eAAe,qBACf,gBAA4C,CACjD,CAAC;IACF,IAAI,CAAC,gBAAgB,CAAC,SAAS,EAAE;QAC/B,MAAM,uBAAuB,GAAG,eAAkD,CAAC;QACnF,IAAI,CAAC,uBAAuB,CAAC,QAAQ,EAAE;YACrC,CAAC,CAAC,oDAAoD,CAAC,CAAC;YACxD,uBAAuB,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC;SACrD;QACD,IAAI,uBAAuB,CAAC,IAAI,EAAE;YAChC,uBAAuB,CAAC,IAAI,GAAG,WAAW,CAAC,uBAAuB,CAAC,IAAI,CAAC,CAAC;SAC1E;aAAM;YACL,CAAC,CAAC,4CAA4C,CAAC,CAAC;YAChD,uBAAuB,CAAC,IAAI,GAAG,WAAW,EAAE,CAAC;SAC9C;KACF;IACD,oBAAoB,CAAC,eAAe,EAAE,SAAS,CAAC,CAAC;IAEjD,eAAe,CAAC,OAAO,GAAG,kBAAkB,CAAC,eAAe,CAAC,CAAC;IAC9D,MAAM,QAAQ,GAAG,mBAAmB,CAAC,eAAe,CAAC,CAAC;IACtD,MAAM,GAAG,GAAG,MAAM,oBAAoB,CAAC,eAAe,CAAC,CAAC;IACxD,MAAM,KAAK,GAAG,IAAI,KAAK,CAAC,eAAe,CAAC,SAAS,CAAC,CAAC;IAEnD,mEAAmE;IACnE,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE;QAC1B,CAAC,CAAC,uBAAuB,eAAe,CAAC,SAAS,SAAS,QAAQ,KAAK,GAAG,GAAG,CAAC,CAAC;QAChF,MAAM,UAAU,GAAG,MAAM,KAAK,CAAC,qBAAqB,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QAEpE,IAAI,UAAU,KAAK,IAAI,EAAE;YACvB,CAAC,CAAC,YAAY,CAAC,CAAC;SACjB;aAAM;YACL,CAAC,CAAC,WAAW,CAAC,CAAC;YACf,IAAI;gBACF,MAAM,gBAAgB,CAAC,eAAe,EAAE,UAAU,EAAE,gBAAgB,CAAC,CAAC;gBAEtE,OAAO,UAAU,CAAC;aACnB;YAAC,OAAO,GAAG,EAAE;gBACZ,CAAC,CAAC,0CAA0C,EAAE,GAAG,CAAC,CAAC;gBACnD,CAAC,CAAC,6BAA6B,CAAC,CAAC;aAClC;SACF;KACF;IAED,IACE,CAAC,eAAe,CAAC,SAAS;QAC1B,2BAA2B,CACzB,eAAe,CAAC,QAAQ,EACxB,eAAe,CAAC,IAAI,EACpB,eAAe,CAAC,OAAO,EACvB,eAAe,CAAC,aAAa,CAC9B,EACD;QACA,OAAO,CAAC,IAAI,CAAC,4CAA4C,CAAC,CAAC;QAC3D,OAAO,CAAC,IAAI,CAAC,gEAAgE,CAAC,CAAC;KAChF;IAED,OAAO,MAAM,mBAAmB,CAAC,eAAe,CAAC,aAAa,EAAE,KAAK,EAAC,UAAU,EAAC,EAAE;QACjF,MAAM,gBAAgB,GAAG,IAAI,CAAC,OAAO,CAAC,UAAU,EAAE,mBAAmB,CAAC,eAAe,CAAC,CAAC,CAAC;QAExF,MAAM,UAAU,GAAG,eAAe,CAAC,UAAU,IAAI,CAAC,MAAM,sBAAsB,EAAE,CAAC,CAAC;QAClF,CAAC,CACC,eAAe,GAAG,OAAO,gBAAgB,kBAAkB,IAAI,CAAC,SAAS,CACvE,eAAe,CAAC,eAAe,CAChC,EAAE,CACJ,CAAC;QACF,MAAM,UAAU,CAAC,QAAQ,CAAC,GAAG,EAAE,gBAAgB,EAAE,eAAe,CAAC,eAAe,CAAC,CAAC;QAElF,MAAM,gBAAgB,CAAC,eAAe,EAAE,gBAAgB,EAAE,gBAAgB,CAAC,CAAC;QAE5E,OAAO,MAAM,KAAK,CAAC,cAAc,CAAC,GAAG,EAAE,gBAAgB,EAAE,QAAQ,CAAC,CAAC;IACrE,CAAC,CAAC,CAAC;AACL,CAAC;AAED;;;;;GAKG;AACH,MAAM,UAAU,QAAQ,CACtB,OAAe,EACf,OAAwC;IAExC,OAAO,gBAAgB,iCAClB,OAAO,KACV,OAAO,EACP,QAAQ,EAAE,OAAO,CAAC,QAAQ,EAC1B,IAAI,EAAE,OAAO,CAAC,IAAI,EAClB,YAAY,EAAE,UAAU,IACxB,CAAC;AACL,CAAC"}
4
node_modules/@electron/get/dist/esm/proxy.d.ts
generated
vendored
Normal file
@@ -0,0 +1,4 @@
/**
 * Initializes a third-party proxy module for HTTP(S) requests.
 */
export declare function initializeProxy(): void;
24
node_modules/@electron/get/dist/esm/proxy.js
generated
vendored
Normal file
@@ -0,0 +1,24 @@
import * as debug from 'debug';
import { getEnv, setEnv } from './utils';
const d = debug('@electron/get:proxy');
/**
 * Initializes a third-party proxy module for HTTP(S) requests.
 */
export function initializeProxy() {
    try {
        // See: https://github.com/electron/get/pull/214#discussion_r798845713
        const env = getEnv('GLOBAL_AGENT_');
        setEnv('GLOBAL_AGENT_HTTP_PROXY', env('HTTP_PROXY'));
        setEnv('GLOBAL_AGENT_HTTPS_PROXY', env('HTTPS_PROXY'));
        setEnv('GLOBAL_AGENT_NO_PROXY', env('NO_PROXY'));
        /**
         * TODO: replace global-agent with a hpagent. @BlackHole1
         * https://github.com/sindresorhus/got/blob/HEAD/documentation/tips.md#proxying
         */
        require('global-agent').bootstrap();
    }
    catch (e) {
        d('Could not load either proxy modules, built-in proxy support not available:', e);
    }
}
//# sourceMappingURL=proxy.js.map
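`initializeProxy` above copies any `HTTP_PROXY`/`HTTPS_PROXY`/`NO_PROXY` variables (matched case-insensitively) into the `GLOBAL_AGENT_`-prefixed names that `global-agent` reads. This is not the module's actual code, but a re-statement of its `getEnv`/`setEnv` pattern parameterized over a plain env object so it runs standalone:

```javascript
// Case-insensitive lookup, preferring the prefixed variable, mirroring
// getEnv from ./utils but over a caller-supplied env object.
function makeGetEnv(env, prefix = '') {
  const lower = {};
  for (const key of Object.keys(env)) {
    lower[key.toLowerCase()] = env[key];
  }
  return name =>
    lower[`${prefix}${name}`.toLowerCase()] || lower[name.toLowerCase()] || undefined;
}

// Mirror setEnv: assignments are skipped for unset variables so that
// "GLOBAL_AGENT_X" never becomes the string "undefined".
function setEnv(env, key, value) {
  if (value !== undefined) {
    env[key] = value;
  }
}

// The mapping initializeProxy performs, applied to a plain object.
function mapProxyEnv(env) {
  const get = makeGetEnv(env, 'GLOBAL_AGENT_');
  setEnv(env, 'GLOBAL_AGENT_HTTP_PROXY', get('HTTP_PROXY'));
  setEnv(env, 'GLOBAL_AGENT_HTTPS_PROXY', get('HTTPS_PROXY'));
  setEnv(env, 'GLOBAL_AGENT_NO_PROXY', get('NO_PROXY'));
  return env;
}
```

Because the lookup is case-insensitive, a lowercase `http_proxy` (common on Linux) is picked up just as well as `HTTP_PROXY`.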
1
node_modules/@electron/get/dist/esm/proxy.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"proxy.js","sourceRoot":"","sources":["../../src/proxy.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,KAAK,MAAM,OAAO,CAAC;AAC/B,OAAO,EAAE,MAAM,EAAE,MAAM,EAAE,MAAM,SAAS,CAAC;AAEzC,MAAM,CAAC,GAAG,KAAK,CAAC,qBAAqB,CAAC,CAAC;AAEvC;;GAEG;AACH,MAAM,UAAU,eAAe;IAC7B,IAAI;QACF,sEAAsE;QACtE,MAAM,GAAG,GAAG,MAAM,CAAC,eAAe,CAAC,CAAC;QAEpC,MAAM,CAAC,yBAAyB,EAAE,GAAG,CAAC,YAAY,CAAC,CAAC,CAAC;QACrD,MAAM,CAAC,0BAA0B,EAAE,GAAG,CAAC,aAAa,CAAC,CAAC,CAAC;QACvD,MAAM,CAAC,uBAAuB,EAAE,GAAG,CAAC,UAAU,CAAC,CAAC,CAAC;QAEjD;;;WAGG;QACH,OAAO,CAAC,cAAc,CAAC,CAAC,SAAS,EAAE,CAAC;KACrC;IAAC,OAAO,CAAC,EAAE;QACV,CAAC,CAAC,4EAA4E,EAAE,CAAC,CAAC,CAAC;KACpF;AACH,CAAC"}
129
node_modules/@electron/get/dist/esm/types.d.ts
generated
vendored
Normal file
@@ -0,0 +1,129 @@
import { Downloader } from './Downloader';
export declare type DownloadOptions = any;
export interface MirrorOptions {
    /**
     * DEPRECATED - see nightlyMirror.
     */
    nightly_mirror?: string;
    /**
     * The Electron nightly-specific mirror URL.
     */
    nightlyMirror?: string;
    /**
     * The base URL of the mirror to download from,
     * e.g https://github.com/electron/electron/releases/download
     */
    mirror?: string;
    /**
     * The name of the directory to download from,
     * often scoped by version number e.g 'v4.0.4'
     */
    customDir?: string;
    /**
     * The name of the asset to download,
     * e.g 'electron-v4.0.4-linux-x64.zip'
     */
    customFilename?: string;
    /**
     * The version of the asset to download,
     * e.g '4.0.4'
     */
    customVersion?: string;
    /**
     * A function allowing customization of the url returned
     * from getArtifactRemoteURL().
     */
    resolveAssetURL?: (opts: DownloadOptions) => Promise<string>;
}
export interface ElectronDownloadRequest {
    /**
     * The version of Electron associated with the artifact.
     */
    version: string;
    /**
     * The type of artifact. For example:
     * * `electron`
     * * `ffmpeg`
     */
    artifactName: string;
}
export interface ElectronDownloadRequestOptions {
    /**
     * Whether to download an artifact regardless of whether it's in the cache directory.
     *
     * Defaults to `false`.
     */
    force?: boolean;
    /**
     * When set to `true`, disables checking that the artifact download completed successfully
     * with the correct payload.
     *
     * Defaults to `false`.
     */
    unsafelyDisableChecksums?: boolean;
    /**
     * Provides checksums for the artifact as strings.
     * Can be used if you already know the checksums of the Electron artifact
     * you are downloading and want to skip the checksum file download
     * without skipping the checksum validation.
     *
     * This should be an object whose keys are the file names of the artifacts and
     * the values are their respective SHA256 checksums.
     */
    checksums?: Record<string, string>;
    /**
     * The directory that caches Electron artifact downloads.
     *
     * The default value is dependent upon the host platform:
     *
     * * Linux: `$XDG_CACHE_HOME` or `~/.cache/electron/`
     * * MacOS: `~/Library/Caches/electron/`
     * * Windows: `%LOCALAPPDATA%/electron/Cache` or `~/AppData/Local/electron/Cache/`
     */
    cacheRoot?: string;
    /**
     * Options passed to the downloader module.
     */
    downloadOptions?: DownloadOptions;
    /**
     * Options related to specifying an artifact mirror.
     */
    mirrorOptions?: MirrorOptions;
    /**
     * The custom [[Downloader]] class used to download artifacts. Defaults to the
     * built-in [[GotDownloader]].
     */
    downloader?: Downloader<DownloadOptions>;
    /**
     * A temporary directory for downloads.
     * It is used before artifacts are put into cache.
     */
    tempDirectory?: string;
}
export declare type ElectronPlatformArtifactDetails = {
    /**
     * The target artifact platform. These are Node-style platform names, for example:
     * * `win32`
     * * `darwin`
     * * `linux`
     */
    platform: string;
    /**
     * The target artifact architecture. These are Node-style architecture names, for example:
     * * `ia32`
     * * `x64`
     * * `armv7l`
     */
    arch: string;
    artifactSuffix?: string;
    isGeneric?: false;
} & ElectronDownloadRequest & ElectronDownloadRequestOptions;
export declare type ElectronGenericArtifactDetails = {
    isGeneric: true;
} & ElectronDownloadRequest & ElectronDownloadRequestOptions;
export declare type ElectronArtifactDetails = ElectronPlatformArtifactDetails | ElectronGenericArtifactDetails;
export declare type Omit<T, K> = Pick<T, Exclude<keyof T, K>>;
export declare type ElectronPlatformArtifactDetailsWithDefaults = (Omit<ElectronPlatformArtifactDetails, 'platform' | 'arch'> & {
    platform?: string;
    arch?: string;
}) | ElectronGenericArtifactDetails;
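Tying the types above together, a hypothetical `ElectronDownloadRequestOptions`-shaped object using the `checksums` and `mirrorOptions` fields might look as follows (the checksum value is a placeholder, not a real digest, and the mirror URL is just the example given in the `MirrorOptions.mirror` doc comment):

```javascript
// Placeholder values only: keys of `checksums` are artifact file names,
// values are their SHA256 digests, per the interface docs above.
const options = {
  force: false,
  checksums: {
    'electron-v4.0.4-linux-x64.zip':
      '0000000000000000000000000000000000000000000000000000000000000000',
  },
  mirrorOptions: {
    mirror: 'https://github.com/electron/electron/releases/download',
  },
};
```

Supplying `checksums` lets the library skip the `SHASUMS256.txt` download while still performing checksum validation.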
1
node_modules/@electron/get/dist/esm/types.js
generated
vendored
Normal file
@@ -0,0 +1 @@
//# sourceMappingURL=types.js.map
1
node_modules/@electron/get/dist/esm/types.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"types.js","sourceRoot":"","sources":["../../src/types.ts"],"names":[],"mappings":""}
25
node_modules/@electron/get/dist/esm/utils.d.ts
generated
vendored
Normal file
@@ -0,0 +1,25 @@
export declare function withTempDirectoryIn<T>(parentDirectory: string | undefined, fn: (directory: string) => Promise<T>): Promise<T>;
export declare function withTempDirectory<T>(fn: (directory: string) => Promise<T>): Promise<T>;
export declare function normalizeVersion(version: string): string;
/**
 * Runs the `uname` command and returns the trimmed output.
 */
export declare function uname(): string;
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name.
 */
export declare function getNodeArch(arch: string): string;
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name, from the `process` module information.
 */
export declare function getHostArch(): string;
export declare function ensureIsTruthyString<T, K extends keyof T>(obj: T, key: K): void;
export declare function isOfficialLinuxIA32Download(platform: string, arch: string, version: string, mirrorOptions?: object): boolean;
/**
 * Find the value of an environment variable which may or may not have the
 * prefix, in a case-insensitive manner.
 */
export declare function getEnv(prefix?: string): (name: string) => string | undefined;
export declare function setEnv(key: string, value: string | undefined): void;
95
node_modules/@electron/get/dist/esm/utils.js
generated
vendored
Normal file
@@ -0,0 +1,95 @@
import * as childProcess from 'child_process';
import * as fs from 'fs-extra';
import * as os from 'os';
import * as path from 'path';
async function useAndRemoveDirectory(directory, fn) {
    let result;
    try {
        result = await fn(directory);
    }
    finally {
        await fs.remove(directory);
    }
    return result;
}
export async function withTempDirectoryIn(parentDirectory = os.tmpdir(), fn) {
    const tempDirectoryPrefix = 'electron-download-';
    const tempDirectory = await fs.mkdtemp(path.resolve(parentDirectory, tempDirectoryPrefix));
    return useAndRemoveDirectory(tempDirectory, fn);
}
export async function withTempDirectory(fn) {
    return withTempDirectoryIn(undefined, fn);
}
export function normalizeVersion(version) {
    if (!version.startsWith('v')) {
        return `v${version}`;
    }
    return version;
}
/**
 * Runs the `uname` command and returns the trimmed output.
 */
export function uname() {
    return childProcess
        .execSync('uname -m')
        .toString()
        .trim();
}
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name.
 */
export function getNodeArch(arch) {
    if (arch === 'arm') {
        // eslint-disable-next-line @typescript-eslint/no-explicit-any
        switch (process.config.variables.arm_version) {
            case '6':
                return uname();
            case '7':
            default:
                return 'armv7l';
        }
    }
    return arch;
}
/**
 * Generates an architecture name that would be used in an Electron or Node.js
 * download file name, from the `process` module information.
 */
export function getHostArch() {
    return getNodeArch(process.arch);
}
export function ensureIsTruthyString(obj, key) {
    if (!obj[key] || typeof obj[key] !== 'string') {
        throw new Error(`Expected property "${key}" to be provided as a string but it was not`);
    }
}
export function isOfficialLinuxIA32Download(platform, arch, version, mirrorOptions) {
    return (platform === 'linux' &&
        arch === 'ia32' &&
        Number(version.slice(1).split('.')[0]) >= 4 &&
        typeof mirrorOptions === 'undefined');
}
/**
 * Find the value of an environment variable which may or may not have the
 * prefix, in a case-insensitive manner.
 */
export function getEnv(prefix = '') {
    const envsLowerCase = {};
    for (const envKey in process.env) {
        envsLowerCase[envKey.toLowerCase()] = process.env[envKey];
    }
    return (name) => {
        return (envsLowerCase[`${prefix}${name}`.toLowerCase()] ||
            envsLowerCase[name.toLowerCase()] ||
            undefined);
    };
}
export function setEnv(key, value) {
    // The `void` operator always returns `undefined`.
    // See: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/void
    if (value !== void 0) {
        process.env[key] = value;
    }
}
//# sourceMappingURL=utils.js.map
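Note that `isOfficialLinuxIA32Download` above expects the version already normalized to a leading `v` (hence the `version.slice(1)` before parsing the major number). Restated here so the deprecation predicate can be exercised directly:

```javascript
// Copied from the utils above so it runs standalone: the Linux/ia32
// deprecation warning fires only for official (no mirrorOptions)
// downloads of Electron >= 4, with the version in "vX.Y.Z" form.
function isOfficialLinuxIA32Download(platform, arch, version, mirrorOptions) {
  return (platform === 'linux' &&
    arch === 'ia32' &&
    Number(version.slice(1).split('.')[0]) >= 4 &&
    typeof mirrorOptions === 'undefined');
}
```

Passing any `mirrorOptions` object, even an empty one, marks the download as unofficial and suppresses the warning.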
1
node_modules/@electron/get/dist/esm/utils.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"utils.js","sourceRoot":"","sources":["../../src/utils.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,YAAY,MAAM,eAAe,CAAC;AAC9C,OAAO,KAAK,EAAE,MAAM,UAAU,CAAC;AAC/B,OAAO,KAAK,EAAE,MAAM,IAAI,CAAC;AACzB,OAAO,KAAK,IAAI,MAAM,MAAM,CAAC;AAE7B,KAAK,UAAU,qBAAqB,CAClC,SAAiB,EACjB,EAAqC;IAErC,IAAI,MAAS,CAAC;IACd,IAAI;QACF,MAAM,GAAG,MAAM,EAAE,CAAC,SAAS,CAAC,CAAC;KAC9B;YAAS;QACR,MAAM,EAAE,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC;KAC5B;IACD,OAAO,MAAM,CAAC;AAChB,CAAC;AAED,MAAM,CAAC,KAAK,UAAU,mBAAmB,CACvC,kBAA0B,EAAE,CAAC,MAAM,EAAE,EACrC,EAAqC;IAErC,MAAM,mBAAmB,GAAG,oBAAoB,CAAC;IACjD,MAAM,aAAa,GAAG,MAAM,EAAE,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC,eAAe,EAAE,mBAAmB,CAAC,CAAC,CAAC;IAC3F,OAAO,qBAAqB,CAAC,aAAa,EAAE,EAAE,CAAC,CAAC;AAClD,CAAC;AAED,MAAM,CAAC,KAAK,UAAU,iBAAiB,CAAI,EAAqC;IAC9E,OAAO,mBAAmB,CAAC,SAAS,EAAE,EAAE,CAAC,CAAC;AAC5C,CAAC;AAED,MAAM,UAAU,gBAAgB,CAAC,OAAe;IAC9C,IAAI,CAAC,OAAO,CAAC,UAAU,CAAC,GAAG,CAAC,EAAE;QAC5B,OAAO,IAAI,OAAO,EAAE,CAAC;KACtB;IACD,OAAO,OAAO,CAAC;AACjB,CAAC;AAED;;GAEG;AACH,MAAM,UAAU,KAAK;IACnB,OAAO,YAAY;SAChB,QAAQ,CAAC,UAAU,CAAC;SACpB,QAAQ,EAAE;SACV,IAAI,EAAE,CAAC;AACZ,CAAC;AAED;;;GAGG;AACH,MAAM,UAAU,WAAW,CAAC,IAAY;IACtC,IAAI,IAAI,KAAK,KAAK,EAAE;QAClB,8DAA8D;QAC9D,QAAS,OAAO,CAAC,MAAM,CAAC,SAAiB,CAAC,WAAW,EAAE;YACrD,KAAK,GAAG;gBACN,OAAO,KAAK,EAAE,CAAC;YACjB,KAAK,GAAG,CAAC;YACT;gBACE,OAAO,QAAQ,CAAC;SACnB;KACF;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AAED;;;GAGG;AACH,MAAM,UAAU,WAAW;IACzB,OAAO,WAAW,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;AACnC,CAAC;AAED,MAAM,UAAU,oBAAoB,CAAuB,GAAM,EAAE,GAAM;IACvE,IAAI,CAAC,GAAG,CAAC,GAAG,CAAC,IAAI,OAAO,GAAG,CAAC,GAAG,CAAC,KAAK,QAAQ,EAAE;QAC7C,MAAM,IAAI,KAAK,CAAC,sBAAsB,GAAG,6CAA6C,CAAC,CAAC;KACzF;AACH,CAAC;AAED,MAAM,UAAU,2BAA2B,CACzC,QAAgB,EAChB,IAAY,EACZ,OAAe,EACf,aAAsB;IAEtB,OAAO,CACL,QAAQ,KAAK,OAAO;QACpB,IAAI,KAAK,MAAM;QACf,MAAM,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QAC3C,OAAO,aAAa,KAAK,WAAW,CACrC,CAAC;AACJ,CAAC;AAED;;;GAGG;AACH,MAAM,UAAU,MAAM,CAAC,MAAM,GAAG,EAAE;IAChC,MAAM,aAAa,GAAsB,EAAE,CAAC;IAE5C,KAAK,MAAM,MAAM,IAAI,OAAO,CAAC,GAAG,EAAE;QAChC,aAAa,CAAC,MAAM,CAAC,WAAW,EAAE,CAAC,GAAG,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC;KAC3D;IAED,OAAO,CAAC,IAAY,EAAsB,EAAE;QAC1C,OAAO,CACL,aAAa,CAAC,GAAG,MAAM,GAAG,IAAI,EAAE,CAAC,WAAW,EAAE,CAAC;YAC/C,aAAa,CAAC,IAAI,CAAC,WAAW,EAAE,CAAC;YACjC,SAAS,CACV,CAAC;IACJ,CAAC,CAAC;AACJ,CAAC;AAED,MAAM,UAAU,MAAM,CAAC,GAAW,EAAE,KAAyB;IAC3D,kDAAkD;IAClD,wFAAwF;IACxF,IAAI,KAAK,KAAK,KAAK,CAAC,EAAE;QACpB,OAAO,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;KAC1B;AACH,CAAC"}
1
node_modules/@electron/get/node_modules/.bin/semver
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../semver/bin/semver.js
15
node_modules/@electron/get/node_modules/semver/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,15 @@
The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
443
node_modules/@electron/get/node_modules/semver/README.md
generated
vendored
Normal file
443
node_modules/@electron/get/node_modules/semver/README.md
generated
vendored
Normal file
@@ -0,0 +1,443 @@
|
||||
semver(1) -- The semantic versioner for npm
|
||||
===========================================
|
||||
|
||||
## Install
|
||||
|
||||
```bash
|
||||
npm install semver
|
||||
````
|
||||
|
||||
## Usage
|
||||
|
||||
As a node module:
|
||||
|
||||
```js
|
||||
const semver = require('semver')
|
||||
|
||||
semver.valid('1.2.3') // '1.2.3'
|
||||
semver.valid('a.b.c') // null
|
||||
semver.clean(' =v1.2.3 ') // '1.2.3'
|
||||
semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true
|
||||
semver.gt('1.2.3', '9.8.7') // false
|
||||
semver.lt('1.2.3', '9.8.7') // true
|
||||
semver.minVersion('>=1.0.0') // '1.0.0'
|
||||
semver.valid(semver.coerce('v2')) // '2.0.0'
|
||||
semver.valid(semver.coerce('42.6.7.9.3-alpha')) // '42.6.7'
|
||||
```

As a command-line utility:

```
$ semver -h

A JavaScript implementation of the https://semver.org/ specification
Copyright Isaac Z. Schlueter

Usage: semver [options] <version> [<version> [...]]
Prints valid versions sorted by SemVer precedence

Options:
-r --range <range>
        Print versions that match the specified range.

-i --increment [<level>]
        Increment a version by the specified level.  Level can
        be one of: major, minor, patch, premajor, preminor,
        prepatch, or prerelease.  Default level is 'patch'.
        Only one version may be specified.

--preid <identifier>
        Identifier to be used to prefix premajor, preminor,
        prepatch or prerelease version increments.

-l --loose
        Interpret versions and ranges loosely

-p --include-prerelease
        Always include prerelease versions in range matching

-c --coerce
        Coerce a string into SemVer if possible
        (does not imply --loose)

--rtl
        Coerce version strings right to left

--ltr
        Coerce version strings left to right (default)

Program exits successfully if any valid version satisfies
all supplied ranges, and prints all satisfying versions.

If no satisfying versions are found, then exits failure.

Versions are printed in ascending order, so supplying
multiple versions to the utility will just sort them.
```

## Versions

A "version" is described by the `v2.0.0` specification found at
<https://semver.org/>.

A leading `"="` or `"v"` character is stripped off and ignored.

## Ranges

A `version range` is a set of `comparators` which specify versions
that satisfy the range.

A `comparator` is composed of an `operator` and a `version`. The set
of primitive `operators` is:

* `<` Less than
* `<=` Less than or equal to
* `>` Greater than
* `>=` Greater than or equal to
* `=` Equal. If no operator is specified, then equality is assumed,
  so this operator is optional, but MAY be included.

For example, the comparator `>=1.2.7` would match the versions
`1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6`
or `1.1.0`.

Comparators can be joined by whitespace to form a `comparator set`,
which is satisfied by the **intersection** of all of the comparators
it includes.

A range is composed of one or more comparator sets, joined by `||`. A
version matches a range if and only if every comparator in at least
one of the `||`-separated comparator sets is satisfied by the version.

For example, the range `>=1.2.7 <1.3.0` would match the versions
`1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`,
or `1.1.0`.

The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`,
`1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`.
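
To make the set semantics concrete, here is a small self-contained sketch of how comparator sets and `||` combine. This is *not* the library's implementation — `compareBare`, `satisfiesSet`, and `satisfiesRange` are illustrative names, and the sketch only handles bare `x.y.z` versions with the primitive operators:

```javascript
// Compare two bare x.y.z versions numerically: -1, 0, or 1.
function compareBare (a, b) {
  const pa = a.split('.').map(Number)
  const pb = b.split('.').map(Number)
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] < pb[i] ? -1 : 1
  }
  return 0
}

// A whitespace-joined comparator set is an intersection: every
// comparator must hold.
function satisfiesSet (version, set) {
  return set.split(/\s+/).every(comp => {
    const m = /^(>=|<=|>|<|=?)(\d+\.\d+\.\d+)$/.exec(comp)
    if (!m) return false
    const [, op, target] = m
    const c = compareBare(version, target)
    switch (op) {
      case '>': return c > 0
      case '>=': return c >= 0
      case '<': return c < 0
      case '<=': return c <= 0
      default: return c === 0 // '=' or bare version means equality
    }
  })
}

// A `||`-joined range is a union: any one set is enough.
function satisfiesRange (version, range) {
  return range.split('||').some(set => satisfiesSet(version, set.trim()))
}

console.log(satisfiesRange('1.2.8', '>=1.2.7 <1.3.0')) // true
console.log(satisfiesRange('1.3.0', '>=1.2.7 <1.3.0')) // false
console.log(satisfiesRange('1.2.7', '1.2.7 || >=1.2.9 <2.0.0')) // true
console.log(satisfiesRange('1.2.8', '1.2.7 || >=1.2.9 <2.0.0')) // false
```

Whitespace between comparators acts as an AND (intersection); `||` between sets acts as an OR.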
### Prerelease Tags

If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then
it will only be allowed to satisfy comparator sets if at least one
comparator with the same `[major, minor, patch]` tuple also has a
prerelease tag.

For example, the range `>1.2.3-alpha.3` would be allowed to match the
version `1.2.3-alpha.7`, but it would *not* be satisfied by
`3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater
than" `1.2.3-alpha.3` according to the SemVer sort rules. The version
range only accepts prerelease tags on the `1.2.3` version. The
version `3.4.5` *would* satisfy the range, because it does not have a
prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`.

The purpose of this behavior is twofold. First, prerelease versions
are frequently updated very quickly, and contain many breaking changes
that are (by the author's design) not yet fit for public consumption.
Therefore, by default, they are excluded from range matching
semantics.

Second, a user who has opted into using a prerelease version has
clearly indicated the intent to use *that specific* set of
alpha/beta/rc versions. By including a prerelease tag in the range,
the user is indicating that they are aware of the risk. However, it
is still not appropriate to assume that they have opted into taking a
similar risk on the *next* set of prerelease versions.

Note that this behavior can be suppressed (treating all prerelease
versions as if they were normal versions, for the purpose of range
matching) by setting the `includePrerelease` flag on the options
object to any
[functions](https://github.com/npm/node-semver#functions) that do
range matching.
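
The ordering rules that make `1.2.3-alpha.7` "less than" `1.2.3` (and `alpha.7` greater than `alpha.3`) come from item 11 of the SemVer spec. The following sketch — an illustration, not the library's code — compares just the prerelease arrays under the assumption that the `[major, minor, patch]` tuples are equal:

```javascript
// Compare single prerelease identifiers: numeric ones compare
// numerically and sort below alphanumeric ones; the rest compare
// lexically (SemVer spec item 11).
function cmpIds (a, b) {
  const an = /^\d+$/.test(a)
  const bn = /^\d+$/.test(b)
  if (an && bn) return Number(a) - Number(b)
  if (an) return -1
  if (bn) return 1
  return a < b ? -1 : a > b ? 1 : 0
}

// Compare prerelease arrays (e.g. ['alpha', 3]); positive means the
// first argument has higher precedence. An empty array means "no
// prerelease", which outranks any prerelease of the same version.
function comparePrerelease (a, b) {
  if (!a.length && !b.length) return 0
  if (!a.length) return 1
  if (!b.length) return -1
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    if (a[i] === undefined) return -1 // shorter set sorts first
    if (b[i] === undefined) return 1
    const c = cmpIds(String(a[i]), String(b[i]))
    if (c) return c
  }
  return 0
}

console.log(comparePrerelease(['alpha', '7'], ['alpha', '3']) > 0) // true
console.log(comparePrerelease([], ['alpha', '7']) > 0) // true: 1.2.3 > 1.2.3-alpha.7
```

So `3.4.5-alpha.9` really does sort above `1.2.3-alpha.3` — the default exclusion described above is a matching policy, not a precedence rule.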
#### Prerelease Identifiers

The method `.inc` takes an additional `identifier` string argument that
will append the value of the string as a prerelease identifier:

```javascript
semver.inc('1.2.3', 'prerelease', 'beta')
// '1.2.4-beta.0'
```

command-line example:

```bash
$ semver 1.2.3 -i prerelease --preid beta
1.2.4-beta.0
```

Which then can be used to increment further:

```bash
$ semver 1.2.4-beta.0 -i prerelease
1.2.4-beta.1
```
### Advanced Range Syntax

Advanced range syntax desugars to primitive comparators in
deterministic ways.

Advanced ranges may be combined in the same way as primitive
comparators using white space or `||`.

#### Hyphen Ranges `X.Y.Z - A.B.C`

Specifies an inclusive set.

* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4`

If a partial version is provided as the first version in the inclusive
range, then the missing pieces are replaced with zeroes.

* `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4`

If a partial version is provided as the second version in the
inclusive range, then all versions that start with the supplied parts
of the tuple are accepted, but nothing that would be greater than the
provided tuple parts.

* `1.2.3 - 2.3` := `>=1.2.3 <2.4.0`
* `1.2.3 - 2` := `>=1.2.3 <3.0.0`
#### X-Ranges `1.2.x` `1.X` `1.2.*` `*`

Any of `X`, `x`, or `*` may be used to "stand in" for one of the
numeric values in the `[major, minor, patch]` tuple.

* `*` := `>=0.0.0` (Any version satisfies)
* `1.x` := `>=1.0.0 <2.0.0` (Matching major version)
* `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions)

A partial version range is treated as an X-Range, so the special
character is in fact optional.

* `""` (empty string) := `*` := `>=0.0.0`
* `1` := `1.x.x` := `>=1.0.0 <2.0.0`
* `1.2` := `1.2.x` := `>=1.2.0 <1.3.0`
#### Tilde Ranges `~1.2.3` `~1.2` `~1`

Allows patch-level changes if a minor version is specified on the
comparator. Allows minor-level changes if not.

* `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0`
* `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`)
* `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`)
* `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0`
* `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`)
* `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`)
* `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in
  the `1.2.3` version will be allowed, if they are greater than or
  equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but
  `1.2.4-beta.2` would not, because it is a prerelease of a
  different `[major, minor, patch]` tuple.
#### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4`

Allows changes that do not modify the left-most non-zero element in the
`[major, minor, patch]` tuple. In other words, this allows patch and
minor updates for versions `1.0.0` and above, patch updates for
versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`.

Many authors treat a `0.x` version as if the `x` were the major
"breaking-change" indicator.

Caret ranges are ideal when an author may make breaking changes
between `0.2.4` and `0.3.0` releases, which is a common practice.
However, it presumes that there will *not* be breaking changes between
`0.2.4` and `0.2.5`. It allows for changes that are presumed to be
additive (but non-breaking), according to commonly observed practices.

* `^1.2.3` := `>=1.2.3 <2.0.0`
* `^0.2.3` := `>=0.2.3 <0.3.0`
* `^0.0.3` := `>=0.0.3 <0.0.4`
* `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in
  the `1.2.3` version will be allowed, if they are greater than or
  equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but
  `1.2.4-beta.2` would not, because it is a prerelease of a
  different `[major, minor, patch]` tuple.
* `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the
  `0.0.3` version *only* will be allowed, if they are greater than or
  equal to `beta`. So, `0.0.3-pr.2` would be allowed.

When parsing caret ranges, a missing `patch` value desugars to the
number `0`, but will allow flexibility within that value, even if the
major and minor versions are both `0`.

* `^1.2.x` := `>=1.2.0 <2.0.0`
* `^0.0.x` := `>=0.0.0 <0.1.0`
* `^0.0` := `>=0.0.0 <0.1.0`

Missing `minor` and `patch` values desugar to zero, but also allow
flexibility within those values, even if the major version is zero.

* `^1.x` := `>=1.0.0 <2.0.0`
* `^0.x` := `>=0.0.0 <1.0.0`
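
The tilde and caret desugaring rules above can be summarized in a few lines of code. This is a rough sketch for full `x.y.z` inputs only (the real parser also handles partials and prerelease tags); `desugarTilde` and `desugarCaret` are illustrative names:

```javascript
// ~x.y.z allows patch-level changes: >=x.y.z <x.(y+1).0
function desugarTilde (v) {
  const [maj, min, pat] = v.split('.').map(Number)
  return `>=${maj}.${min}.${pat} <${maj}.${min + 1}.0`
}

// ^x.y.z pins the left-most non-zero element.
function desugarCaret (v) {
  const [maj, min, pat] = v.split('.').map(Number)
  if (maj > 0) return `>=${v} <${maj + 1}.0.0` // minor + patch may float
  if (min > 0) return `>=${v} <0.${min + 1}.0` // only patch may float
  return `>=${v} <0.0.${pat + 1}` // 0.0.x: no updates allowed
}

console.log(desugarTilde('1.2.3')) // '>=1.2.3 <1.3.0'
console.log(desugarCaret('0.2.3')) // '>=0.2.3 <0.3.0'
console.log(desugarCaret('0.0.3')) // '>=0.0.3 <0.0.4'
```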
### Range Grammar

Putting all this together, here is a Backus-Naur grammar for ranges,
for the benefit of parser authors:

```bnf
range-set  ::= range ( logical-or range ) *
logical-or ::= ( ' ' ) * '||' ( ' ' ) *
range      ::= hyphen | simple ( ' ' simple ) * | ''
hyphen     ::= partial ' - ' partial
simple     ::= primitive | partial | tilde | caret
primitive  ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial
partial    ::= xr ( '.' xr ( '.' xr qualifier ? )? )?
xr         ::= 'x' | 'X' | '*' | nr
nr         ::= '0' | ['1'-'9'] ( ['0'-'9'] ) *
tilde      ::= '~' partial
caret      ::= '^' partial
qualifier  ::= ( '-' pre )? ( '+' build )?
pre        ::= parts
build      ::= parts
parts      ::= part ( '.' part ) *
part       ::= nr | [-0-9A-Za-z]+
```
## Functions

All methods and classes take a final `options` object argument. All
options in this object are `false` by default. The options supported
are:

- `loose` Be more forgiving about not-quite-valid semver strings.
  (Any resulting output will always be 100% strict compliant, of
  course.) For backwards compatibility reasons, if the `options`
  argument is a boolean value instead of an object, it is interpreted
  to be the `loose` param.
- `includePrerelease` Set to suppress the [default
  behavior](https://github.com/npm/node-semver#prerelease-tags) of
  excluding prerelease tagged versions from ranges unless they are
  explicitly opted into.

Strict-mode Comparators and Ranges will be strict about the SemVer
strings that they parse.

* `valid(v)`: Return the parsed version, or null if it's not valid.
* `inc(v, release)`: Return the version incremented by the release
  type (`major`, `premajor`, `minor`, `preminor`, `patch`,
  `prepatch`, or `prerelease`), or null if it's not valid.
  * `premajor` in one call will bump the version up to the next major
    version and down to a prerelease of that major version.
    `preminor` and `prepatch` work the same way.
  * If called from a non-prerelease version, `prerelease` works the
    same as `prepatch`. It increments the patch version, then makes a
    prerelease. If the input version is already a prerelease it simply
    increments it.
* `prerelease(v)`: Returns an array of prerelease components, or null
  if none exist. Example: `prerelease('1.2.3-alpha.1') -> ['alpha', 1]`
* `major(v)`: Return the major version number.
* `minor(v)`: Return the minor version number.
* `patch(v)`: Return the patch version number.
* `intersects(r1, r2, loose)`: Return true if the two supplied ranges
  or comparators intersect.
* `parse(v)`: Attempt to parse a string as a semantic version, returning either
  a `SemVer` object or `null`.
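
The core of the plain `major`/`minor`/`patch` increment behavior is simple to sketch. This is an illustration for bare `x.y.z` input only, not the library's `inc` (which also handles the `pre*` types, prerelease identifiers, and validation); `incBare` is a hypothetical name:

```javascript
// Bump a bare x.y.z version by release type; lower positions reset to 0.
function incBare (v, release) {
  let [maj, min, pat] = v.split('.').map(Number)
  if (release === 'major') { maj++; min = 0; pat = 0 }
  else if (release === 'minor') { min++; pat = 0 }
  else if (release === 'patch') { pat++ }
  else return null // unknown release type
  return `${maj}.${min}.${pat}`
}

console.log(incBare('1.2.3', 'minor')) // '1.3.0'
console.log(incBare('1.2.3', 'major')) // '2.0.0'
```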
### Comparison

* `gt(v1, v2)`: `v1 > v2`
* `gte(v1, v2)`: `v1 >= v2`
* `lt(v1, v2)`: `v1 < v2`
* `lte(v1, v2)`: `v1 <= v2`
* `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent,
  even if they're not the exact same string. You already know how to
  compare strings.
* `neq(v1, v2)`: `v1 != v2` The opposite of `eq`.
* `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call
  the corresponding function above. `"==="` and `"!=="` do simple
  string comparison, but are included for completeness. Throws if an
  invalid comparison string is provided.
* `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if
  `v2` is greater. Sorts in ascending order if passed to `Array.sort()`.
* `rcompare(v1, v2)`: The reverse of `compare`. Sorts an array of versions
  in descending order when passed to `Array.sort()`.
* `compareBuild(v1, v2)`: The same as `compare` but considers `build` when two versions
  are equal. Sorts in ascending order if passed to `Array.sort()`.
* `diff(v1, v2)`: Returns the difference between two versions by the release type
  (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`),
  or null if the versions are the same.
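
The point of `compare`/`rcompare` is that versions must be sorted numerically per tuple position, not lexically (a string sort would put `1.10.1` before `1.2.3`). A minimal sketch of that comparator shape for bare `x.y.z` versions — not the library's implementation, which also handles prereleases and loose parsing:

```javascript
// compare-style comparator: -1 / 0 / 1, numeric per tuple position.
function compare (a, b) {
  const pa = a.split('.').map(Number)
  const pb = b.split('.').map(Number)
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] < pb[i] ? -1 : 1
  }
  return 0
}

// rcompare pattern: swap the arguments to sort descending.
const rcompare = (a, b) => compare(b, a)

console.log(['1.10.1', '1.2.3', '0.9.0'].sort(compare))  // ascending: 0.9.0, 1.2.3, 1.10.1
console.log(['1.10.1', '1.2.3', '0.9.0'].sort(rcompare)) // descending: 1.10.1, 1.2.3, 0.9.0
```

Note that `1.10.1` correctly sorts above `1.2.3`, which a plain string sort would get wrong.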
### Comparators

* `intersects(comparator)`: Return true if the comparators intersect
### Ranges

* `validRange(range)`: Return the valid range or null if it's not valid
* `satisfies(version, range)`: Return true if the version satisfies the
  range.
* `maxSatisfying(versions, range)`: Return the highest version in the list
  that satisfies the range, or `null` if none of them do.
* `minSatisfying(versions, range)`: Return the lowest version in the list
  that satisfies the range, or `null` if none of them do.
* `minVersion(range)`: Return the lowest version that can possibly match
  the given range.
* `gtr(version, range)`: Return `true` if version is greater than all the
  versions possible in the range.
* `ltr(version, range)`: Return `true` if version is less than all the
  versions possible in the range.
* `outside(version, range, hilo)`: Return true if the version is outside
  the bounds of the range in either the high or low direction. The
  `hilo` argument must be either the string `'>'` or `'<'`. (This is
  the function called by `gtr` and `ltr`.)
* `intersects(range)`: Return true if any of the range's comparators intersect

Note that, since ranges may be non-contiguous, a version might not be
greater than a range, less than a range, *or* satisfy a range! For
example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9`
until `2.0.0`, so the version `1.2.10` would not be greater than the
range (because `2.0.1` satisfies, which is higher), nor less than the
range (since `1.2.8` satisfies, which is lower), and it also does not
satisfy the range.

If you want to know if a version satisfies or does not satisfy a
range, use the `satisfies(version, range)` function.
### Coercion

* `coerce(version, options)`: Coerces a string to semver if possible

This aims to provide a very forgiving translation of a non-semver string to
semver. It looks for the first digit in a string, and consumes all
remaining characters which satisfy at least a partial semver (e.g., `1`,
`1.2`, `1.2.3`) up to the max permitted length (256 characters). Longer
versions are simply truncated (`4.6.3.9.2-alpha2` becomes `4.6.3`). All
surrounding text is simply ignored (`v3.4 replaces v3.3.1` becomes
`3.4.0`). Only text which lacks digits will fail coercion (`version one`
is not valid). The maximum length for any semver component considered for
coercion is 16 characters; longer components will be ignored
(`10000000000000000.4.7.4` becomes `4.7.4`). The maximum value for any
semver component is `Number.MAX_SAFE_INTEGER || (2**53 - 1)`; higher value
components are invalid (`9999999999999999.4.7.4` is likely invalid).

If the `options.rtl` flag is set, then `coerce` will return the right-most
coercible tuple that does not share an ending index with a longer coercible
tuple. For example, `1.2.3.4` will return `2.3.4` in rtl mode, not
`4.0.0`. `1.2.3/4` will return `4.0.0`, because the `4` is not a part of
any other overlapping SemVer tuple.
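
The left-to-right coercion idea can be sketched with a single regular expression: grab the first run of digits and up to two following dot-separated runs, then zero-fill the rest. This is an illustration only (`coerceLoose` is a hypothetical name); the real `coerce` additionally enforces the 16-character and `Number.MAX_SAFE_INTEGER` limits and supports `rtl` mode:

```javascript
// Find the first partial semver (1, 1.2, or 1.2.3) in a string and
// zero-fill missing positions; return null when no digits exist.
function coerceLoose (s) {
  const m = /(\d+)(?:\.(\d+))?(?:\.(\d+))?/.exec(s)
  if (!m) return null
  return `${m[1]}.${m[2] || 0}.${m[3] || 0}`
}

console.log(coerceLoose('v2'))                   // '2.0.0'
console.log(coerceLoose('42.6.7.9.3-alpha'))     // '42.6.7'
console.log(coerceLoose('v3.4 replaces v3.3.1')) // '3.4.0'
console.log(coerceLoose('version one'))          // null
```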
### Clean

* `clean(version)`: Clean a string to be a valid semver if possible

This will return a cleaned and trimmed semver version. If the provided
version is not valid, `null` will be returned. This does not work for
ranges.

ex.
* `s.clean(' = v 2.1.5foo')`: `null`
* `s.clean(' = v 2.1.5foo', { loose: true })`: `'2.1.5-foo'`
* `s.clean(' = v 2.1.5-foo')`: `null`
* `s.clean(' = v 2.1.5-foo', { loose: true })`: `'2.1.5-foo'`
* `s.clean('=v2.1.5')`: `'2.1.5'`
* `s.clean(' =v2.1.5')`: `'2.1.5'`
* `s.clean(' 2.1.5 ')`: `'2.1.5'`
* `s.clean('~1.0.0')`: `null`

174
node_modules/@electron/get/node_modules/semver/bin/semver.js
generated
vendored
Executable file
@@ -0,0 +1,174 @@
#!/usr/bin/env node
// Standalone semver comparison program.
// Exits successfully and prints matching version(s) if
// any supplied version is valid and passes all tests.

var argv = process.argv.slice(2)

var versions = []

var range = []

var inc = null

var version = require('../package.json').version

var loose = false

var includePrerelease = false

var coerce = false

var rtl = false

var identifier

var semver = require('../semver')

var reverse = false

var options = {}

main()

function main () {
  if (!argv.length) return help()
  while (argv.length) {
    var a = argv.shift()
    var indexOfEqualSign = a.indexOf('=')
    if (indexOfEqualSign !== -1) {
      // Split "--opt=value" into "--opt" and "value"; capture the value
      // before truncating `a`, or it would be lost.
      argv.unshift(a.slice(indexOfEqualSign + 1))
      a = a.slice(0, indexOfEqualSign)
    }
    switch (a) {
      case '-rv': case '-rev': case '--rev': case '--reverse':
        reverse = true
        break
      case '-l': case '--loose':
        loose = true
        break
      case '-p': case '--include-prerelease':
        includePrerelease = true
        break
      case '-v': case '--version':
        versions.push(argv.shift())
        break
      case '-i': case '--inc': case '--increment':
        switch (argv[0]) {
          case 'major': case 'minor': case 'patch': case 'prerelease':
          case 'premajor': case 'preminor': case 'prepatch':
            inc = argv.shift()
            break
          default:
            inc = 'patch'
            break
        }
        break
      case '--preid':
        identifier = argv.shift()
        break
      case '-r': case '--range':
        range.push(argv.shift())
        break
      case '-c': case '--coerce':
        coerce = true
        break
      case '--rtl':
        rtl = true
        break
      case '--ltr':
        rtl = false
        break
      case '-h': case '--help': case '-?':
        return help()
      default:
        versions.push(a)
        break
    }
  }

  var options = { loose: loose, includePrerelease: includePrerelease, rtl: rtl }

  versions = versions.map(function (v) {
    return coerce ? (semver.coerce(v, options) || { version: v }).version : v
  }).filter(function (v) {
    return semver.valid(v)
  })
  if (!versions.length) return fail()
  if (inc && (versions.length !== 1 || range.length)) { return failInc() }

  for (var i = 0, l = range.length; i < l; i++) {
    versions = versions.filter(function (v) {
      return semver.satisfies(v, range[i], options)
    })
    if (!versions.length) return fail()
  }
  return success(versions)
}

function failInc () {
  console.error('--inc can only be used on a single version with no range')
  fail()
}

function fail () { process.exit(1) }

function success () {
  var compare = reverse ? 'rcompare' : 'compare'
  versions.sort(function (a, b) {
    return semver[compare](a, b, options)
  }).map(function (v) {
    return semver.clean(v, options)
  }).map(function (v) {
    return inc ? semver.inc(v, inc, options, identifier) : v
  }).forEach(function (v, i, _) { console.log(v) })
}

function help () {
  console.log(['SemVer ' + version,
    '',
    'A JavaScript implementation of the https://semver.org/ specification',
    'Copyright Isaac Z. Schlueter',
    '',
    'Usage: semver [options] <version> [<version> [...]]',
    'Prints valid versions sorted by SemVer precedence',
    '',
    'Options:',
    '-r --range <range>',
    '        Print versions that match the specified range.',
    '',
    '-i --increment [<level>]',
    '        Increment a version by the specified level.  Level can',
    '        be one of: major, minor, patch, premajor, preminor,',
    "        prepatch, or prerelease.  Default level is 'patch'.",
    '        Only one version may be specified.',
    '',
    '--preid <identifier>',
    '        Identifier to be used to prefix premajor, preminor,',
    '        prepatch or prerelease version increments.',
    '',
    '-l --loose',
    '        Interpret versions and ranges loosely',
    '',
    '-p --include-prerelease',
    '        Always include prerelease versions in range matching',
    '',
    '-c --coerce',
    '        Coerce a string into SemVer if possible',
    '        (does not imply --loose)',
    '',
    '--rtl',
    '        Coerce version strings right to left',
    '',
    '--ltr',
    '        Coerce version strings left to right (default)',
    '',
    'Program exits successfully if any valid version satisfies',
    'all supplied ranges, and prints all satisfying versions.',
    '',
    'If no satisfying versions are found, then exits failure.',
    '',
    'Versions are printed in ascending order, so supplying',
    'multiple versions to the utility will just sort them.'
  ].join('\n'))
}

38
node_modules/@electron/get/node_modules/semver/package.json
generated
vendored
Normal file
@@ -0,0 +1,38 @@
{
  "name": "semver",
  "version": "6.3.1",
  "description": "The semantic version parser used by npm.",
  "main": "semver.js",
  "scripts": {
    "test": "tap test/ --100 --timeout=30",
    "lint": "echo linting disabled",
    "postlint": "template-oss-check",
    "template-oss-apply": "template-oss-apply --force",
    "lintfix": "npm run lint -- --fix",
    "snap": "tap test/ --100 --timeout=30",
    "posttest": "npm run lint"
  },
  "devDependencies": {
    "@npmcli/template-oss": "4.17.0",
    "tap": "^12.7.0"
  },
  "license": "ISC",
  "repository": {
    "type": "git",
    "url": "https://github.com/npm/node-semver.git"
  },
  "bin": {
    "semver": "./bin/semver.js"
  },
  "files": [
    "bin",
    "range.bnf",
    "semver.js"
  ],
  "author": "GitHub Inc.",
  "templateOSS": {
    "//@npmcli/template-oss": "This file is partially managed by @npmcli/template-oss. Edits may be overwritten.",
    "content": "./scripts/template-oss",
    "version": "4.17.0"
  }
}

16
node_modules/@electron/get/node_modules/semver/range.bnf
generated
vendored
Normal file
@@ -0,0 +1,16 @@
range-set  ::= range ( logical-or range ) *
logical-or ::= ( ' ' ) * '||' ( ' ' ) *
range      ::= hyphen | simple ( ' ' simple ) * | ''
hyphen     ::= partial ' - ' partial
simple     ::= primitive | partial | tilde | caret
primitive  ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial
partial    ::= xr ( '.' xr ( '.' xr qualifier ? )? )?
xr         ::= 'x' | 'X' | '*' | nr
nr         ::= '0' | [1-9] ( [0-9] ) *
tilde      ::= '~' partial
caret      ::= '^' partial
qualifier  ::= ( '-' pre )? ( '+' build )?
pre        ::= parts
build      ::= parts
parts      ::= part ( '.' part ) *
part       ::= nr | [-0-9A-Za-z]+

1643
node_modules/@electron/get/node_modules/semver/semver.js
generated
vendored
Normal file
File diff suppressed because it is too large

100
node_modules/@electron/get/package.json
generated
vendored
Normal file
@@ -0,0 +1,100 @@
{
  "name": "@electron/get",
  "version": "2.0.3",
  "description": "Utility for downloading artifacts from different versions of Electron",
  "main": "dist/cjs/index.js",
  "module": "dist/esm/index.js",
  "repository": "https://github.com/electron/get",
  "author": "Samuel Attard",
  "license": "MIT",
  "scripts": {
    "build": "tsc && tsc -p tsconfig.esm.json",
    "build:docs": "typedoc --out docs",
    "eslint": "eslint --ext .ts src test",
    "jest": "jest --coverage",
    "lint": "npm run prettier && npm run eslint",
    "prettier": "prettier --check \"src/**/*.ts\" \"test/**/*.ts\"",
    "prepublishOnly": "npm run build",
    "test": "npm run lint && npm run jest",
    "test:nonetwork": "npm run lint && npm run jest -- --testPathIgnorePatterns network.spec"
  },
  "files": [
    "dist/*",
    "README.md"
  ],
  "engines": {
    "node": ">=12"
  },
  "dependencies": {
    "debug": "^4.1.1",
    "env-paths": "^2.2.0",
    "fs-extra": "^8.1.0",
    "got": "^11.8.5",
    "progress": "^2.0.3",
    "semver": "^6.2.0",
    "sumchecker": "^3.0.1"
  },
  "devDependencies": {
    "@continuous-auth/semantic-release-npm": "^3.0.0",
    "@types/debug": "^4.1.4",
    "@types/fs-extra": "^8.0.0",
    "@types/jest": "^24.0.13",
    "@types/node": "^12.20.55",
    "@types/progress": "^2.0.3",
    "@types/semver": "^6.2.0",
    "@typescript-eslint/eslint-plugin": "^2.34.0",
    "@typescript-eslint/parser": "^2.34.0",
    "eslint": "^6.8.0",
    "eslint-config-prettier": "^6.15.0",
    "eslint-plugin-import": "^2.22.1",
    "eslint-plugin-jest": "< 24.0.0",
    "husky": "^2.3.0",
    "jest": "^24.8.0",
    "lint-staged": "^8.1.7",
    "prettier": "^1.17.1",
    "ts-jest": "^24.0.0",
    "typedoc": "^0.17.2",
    "typescript": "^3.8.0"
  },
  "eslintConfig": {
    "parser": "@typescript-eslint/parser",
    "extends": [
      "eslint:recommended",
      "plugin:@typescript-eslint/eslint-recommended",
      "plugin:@typescript-eslint/recommended",
      "plugin:jest/recommended",
      "plugin:import/errors",
      "plugin:import/warnings",
      "plugin:import/typescript",
      "prettier",
      "prettier/@typescript-eslint"
    ]
  },
  "husky": {
    "hooks": {
      "pre-commit": "lint-staged"
    }
  },
  "lint-staged": {
    "*.ts": [
      "eslint --fix",
      "prettier --write",
      "git add"
    ]
  },
  "keywords": [
    "electron",
    "download",
    "prebuild",
    "get",
    "artifact",
    "release"
  ],
  "optionalDependencies": {
    "global-agent": "^3.0.0"
  },
  "resolutions": {
    "eslint/inquirer": "< 7.3.0",
    "**/@typescript-eslint/typescript-estree/semver": "^6.3.0"
  }
}

7
node_modules/@electron/notarize/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,7 @@
Copyright 2018 Samuel Attard and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

143
node_modules/@electron/notarize/README.md
generated
vendored
Normal file
@@ -0,0 +1,143 @@
Electron Notarize
|
||||
-----------
|
||||
|
||||
> Notarize your Electron apps seamlessly for macOS
|
||||
|
||||
[](https://circleci.com/gh/electron/notarize)
|
||||
[](https://npm.im/@electron/notarize)
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
# npm
|
||||
npm install @electron/notarize --save-dev
|
||||
|
||||
# yarn
|
||||
yarn add @electron/notarize --dev
|
||||
```
|
||||
|
||||
## What is app "notarization"?
|
||||
|
||||
From Apple's docs in XCode:
|
||||
|
||||
> A notarized app is a macOS app that was uploaded to Apple for processing before it was distributed. When you export a notarized app from Xcode, it code signs the app with a Developer ID certificate and staples a ticket from Apple to the app. The ticket confirms that you previously uploaded the app to Apple.
|
||||
|
||||
> On macOS 10.14 and later, the user can launch notarized apps when Gatekeeper is enabled. When the user first launches a notarized app, Gatekeeper looks for the app’s ticket online. If the user is offline, Gatekeeper looks for the ticket that was stapled to the app.
|
||||
|
||||
Apple has made this a hard requirement as of 10.15 (Catalina).
|
||||
|
||||
## Prerequisites
|
||||
|
||||
For notarization, you need the following things:
|
||||
|
||||
1. Xcode 10 or later installed on your Mac.
|
||||
2. An [Apple Developer](https://developer.apple.com/) account.
|
||||
3. [An app-specific password for your ADC account’s Apple ID](https://support.apple.com/HT204397).
|
||||
4. Your app may need to be signed with `hardened-runtime`, including the following entitlement:
|
||||
1. `com.apple.security.cs.allow-jit`
|
||||
|
||||
If you are using Electron 11 or below, you must add the `com.apple.security.cs.allow-unsigned-executable-memory` entitlement too.
|
||||
When using version 12+, this entitlement should not be applied as it increases your app's attack surface.
|
||||
|
||||
## API

### Method: `notarize(opts): Promise<void>`

* `options` Object
  * `tool` String - The notarization tool to use; the default is `notarytool`. Can be `legacy` or `notarytool`. `notarytool` is substantially (10x) faster, and `legacy` is deprecated and will **stop working** on November 1st, 2023.
  * `appPath` String - The absolute path to your `.app` file.
  * The remaining options differ per tool. For `notarytool`:
    * There are three authentication methods available: user name with password:
      * `appleId` String - The username of your Apple Developer account.
      * `appleIdPassword` String - The [app-specific password](https://support.apple.com/HT204397) (not your Apple ID password).
      * `teamId` String - The team ID you want to notarize under.
    * ... or apiKey with apiIssuer:
      * `appleApiKey` String - Absolute path to the `.p8` file containing the key. Required for JWT authentication. See the note on JWT authentication below.
      * `appleApiKeyId` String - App Store Connect API key ID, for example, `T9GPZ92M7K`. Required for JWT authentication. See the note on JWT authentication below.
      * `appleApiIssuer` String - Your App Store Connect API key issuer, for example, `c055ca8c-e5a8-4836-b61d-aa5794eeb3f4`. Required if `appleApiKey` is specified.
    * ... or keychain with keychainProfile:
      * `keychain` String (optional) - The name of, or path to, the keychain in which you stored notarization credentials. If omitted, the iCloud keychain is used by default.
      * `keychainProfile` String - The name of the profile you provided when storing notarization credentials.
  * ... or for `legacy`:
    * `appBundleId` String - The bundle identifier your Electron app is using, e.g. `com.github.electron`.
    * `ascProvider` String (optional) - Your [Team Short Name](#notes-on-your-team-short-name).
    * There are two authentication methods available: user name with password:
      * `appleId` String - The username of your Apple Developer account.
      * `appleIdPassword` String - The [app-specific password](https://support.apple.com/HT204397) (not your Apple ID password).
    * ... or apiKey with apiIssuer:
      * `appleApiKey` String - Required for JWT authentication. See the note on JWT authentication below.
      * `appleApiIssuer` String - Issuer ID. Required if `appleApiKey` is specified.
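
To make the three `notarytool` authentication shapes above concrete, here is a sketch of the options objects you could pass to `notarize`. All credential values below are placeholders of this example, not real values:

```javascript
// Three alternative credential shapes accepted by the notarytool path.
// Every value here is a placeholder, not a real credential.
const appPath = '/path/to/MyApp.app';

// 1. Apple ID + app-specific password + team ID
const withPassword = {
  appPath,
  appleId: 'dev@example.com',
  appleIdPassword: process.env.APPLE_APP_SPECIFIC_PASSWORD || '<app-specific-password>',
  teamId: 'ABCDE12345',
};

// 2. App Store Connect API key (JWT)
const withApiKey = {
  appPath,
  appleApiKey: '/path/to/AuthKey_T9GPZ92M7K.p8',
  appleApiKeyId: 'T9GPZ92M7K',
  appleApiIssuer: 'c055ca8c-e5a8-4836-b61d-aa5794eeb3f4',
};

// 3. Credentials previously stored in the keychain
const withKeychain = {
  appPath,
  keychainProfile: 'my-notarytool-profile', // name you chose when storing credentials
};
```

Exactly one of these shapes should be supplied per call; mixing them is not meaningful.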
## Safety when using `appleIdPassword`

1. Never hard-code your password into your packaging scripts; use an environment variable at a minimum.
2. It is possible to provide a keychain reference instead of your actual password (assuming that you have already logged into the Application Loader from Xcode). For example:

```javascript
const password = `@keychain:"Application Loader: ${appleId}"`;
```

Another option is to add a new keychain item, using either the Keychain Access app or the `security` utility on the command line:

```bash
security add-generic-password -a "AC_USERNAME" -w <app_specific_password> -s "AC_PASSWORD"
```

where `AC_USERNAME` should be replaced with your Apple ID. Then, in your code, you can use:

```javascript
const password = `@keychain:AC_PASSWORD`;
```
## Notes on JWT authentication

You can obtain an API key from [App Store Connect](https://appstoreconnect.apple.com/access/api). Create a key with _App Manager_ access. Note down the Issuer ID and download the `.p8` file. This file is your API key and comes with the name `AuthKey_<appleApiKeyId>.p8`; the `<appleApiKeyId>` portion of the filename is the string you supply as `appleApiKeyId` when calling `notarize`.
Based on the `ApiKey`, the legacy `altool` will look for the key file in the following places:

* `./private_keys`
* `~/private_keys`
* `~/.private_keys`
* `~/.appstoreconnect/private_keys`

`notarytool` will not search for the key; you must instead provide its path as the `appleApiKey` argument.
## Notes on your Team Short Name

If you are a member of multiple teams or organizations, you have to tell Apple on behalf of which organization you're uploading. To find your [team's short name](https://forums.developer.apple.com/thread/113798), you can ask `iTMSTransporter`, which is part of the now-deprecated `Application Loader` as well as the newer [`Transporter`](https://apps.apple.com/us/app/transporter/id1450874784?mt=12).

With `Transporter` installed, run:

```sh
/Applications/Transporter.app/Contents/itms/bin/iTMSTransporter -m provider -u APPLE_DEV_ACCOUNT -p APP_PASSWORD
```

Alternatively, with older versions of Xcode, run:

```sh
"/Applications/Xcode.app/Contents/Applications/Application Loader.app/Contents/itms/bin/iTMSTransporter" -m provider -u APPLE_DEV_ACCOUNT -p APP_PASSWORD
```
## Notes on your teamId

If you use the new Notary Tool method with `appleId`/`appleIdPassword`, you will need to set the `teamId` option. To get this ID, go to your [Apple Developer Account](https://developer.apple.com/account), then click on "Membership details"; there you will find your Team ID. This link should get you there directly: https://developer.apple.com/account#MembershipDetailsCard
## Debug

[`debug`](https://www.npmjs.com/package/debug) is used to display logs and messages. You can use `export DEBUG=electron-notarize*` to log additional debug information from this module.
## Example Usage

```javascript
import { notarize } from '@electron/notarize';

async function packageTask () {
  // Package your app here, and code sign with hardened runtime
  await notarize({
    appBundleId,
    appPath,
    appleId,
    appleIdPassword,
    ascProvider, // This parameter is optional
  });
}
```
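
The example above uses the legacy tool's options. A sketch of the equivalent `notarytool` call, reading credentials from environment variables, might look like the following. The environment variable names here are this example's own convention, not something fixed by the library:

```javascript
// Sketch: a notarytool counterpart to the legacy example above.
// Env var names (APPLE_ID, etc.) are this example's convention only.
function notarytoolOptions(appPath, env = process.env) {
  return {
    tool: 'notarytool', // the default, per the API notes above
    appPath,
    appleId: env.APPLE_ID,
    appleIdPassword: env.APPLE_APP_SPECIFIC_PASSWORD,
    teamId: env.APPLE_TEAM_ID,
  };
}

async function packageTask(appPath) {
  // Package your app and code sign with hardened runtime first, then:
  const { notarize } = await import('@electron/notarize');
  await notarize(notarytoolOptions(appPath));
}

console.log(notarytoolOptions('/tmp/MyApp.app', { APPLE_ID: 'dev@example.com' }).appleId); // → dev@example.com
```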
2
node_modules/@electron/notarize/lib/check-signature.d.ts
generated
vendored
Normal file
@@ -0,0 +1,2 @@
import { NotarizeStapleOptions } from './types';
export declare function checkSignatures(opts: NotarizeStapleOptions): Promise<void>;
75
node_modules/@electron/notarize/lib/check-signature.js
generated
vendored
Normal file
@@ -0,0 +1,75 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
    function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
    return new (P || (P = Promise))(function (resolve, reject) {
        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
        function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
        step((generator = generator.apply(thisArg, _arguments || [])).next());
    });
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.checkSignatures = void 0;
const path = __importStar(require("path"));
const spawn_1 = require("./spawn");
const debug_1 = __importDefault(require("debug"));
const d = (0, debug_1.default)('electron-notarize');
const codesignDisplay = (opts) => __awaiter(void 0, void 0, void 0, function* () {
    const result = yield (0, spawn_1.spawn)('codesign', ['-dv', '-vvvv', '--deep', path.basename(opts.appPath)], {
        cwd: path.dirname(opts.appPath),
    });
    return result;
});
const codesign = (opts) => __awaiter(void 0, void 0, void 0, function* () {
    d('attempting to check codesign of app:', opts.appPath);
    const result = yield (0, spawn_1.spawn)('codesign', ['-vvv', '--deep', '--strict', path.basename(opts.appPath)], {
        cwd: path.dirname(opts.appPath),
    });
    return result;
});
function checkSignatures(opts) {
    return __awaiter(this, void 0, void 0, function* () {
        const [codesignResult, codesignInfo] = yield Promise.all([codesign(opts), codesignDisplay(opts)]);
        let error = '';
        if (codesignInfo.code !== 0) {
            d('codesignInfo failed');
            error = `Failed to display codesign info on your application with code: ${codesignInfo.code}\n\n${codesignInfo.output}\n`;
        }
        if (codesignResult.code !== 0) {
            d('codesign check failed');
            error += `Failed to codesign your application with code: ${codesignResult.code}\n\n${codesignResult.output}\n\n${codesignInfo.output}`;
        }
        if (error) {
            throw new Error(error);
        }
        d('codesign assess succeeded');
    });
}
exports.checkSignatures = checkSignatures;
//# sourceMappingURL=check-signature.js.map
1
node_modules/@electron/notarize/lib/check-signature.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"check-signature.js","sourceRoot":"","sources":["../src/check-signature.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,2CAA6B;AAE7B,mCAAgC;AAEhC,kDAA0B;AAC1B,MAAM,CAAC,GAAG,IAAA,eAAK,EAAC,mBAAmB,CAAC,CAAC;AAErC,MAAM,eAAe,GAAG,CAAO,IAA2B,EAAE,EAAE;IAC5D,MAAM,MAAM,GAAG,MAAM,IAAA,aAAK,EAAC,UAAU,EAAE,CAAC,KAAK,EAAE,OAAO,EAAE,QAAQ,EAAE,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,EAAE;QAC9F,GAAG,EAAE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC;KAChC,CAAC,CAAC;IACH,OAAO,MAAM,CAAC;AAChB,CAAC,CAAA,CAAC;AAEF,MAAM,QAAQ,GAAG,CAAO,IAA2B,EAAE,EAAE;IACrD,CAAC,CAAC,sCAAsC,EAAE,IAAI,CAAC,OAAO,CAAC,CAAC;IACxD,MAAM,MAAM,GAAG,MAAM,IAAA,aAAK,EACxB,UAAU,EACV,CAAC,MAAM,EAAE,QAAQ,EAAE,UAAU,EAAE,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,EAC3D;QACE,GAAG,EAAE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC;KAChC,CACF,CAAC;IAEF,OAAO,MAAM,CAAC;AAChB,CAAC,CAAA,CAAC;AACF,SAAsB,eAAe,CAAC,IAA2B;;QAC/D,MAAM,CAAC,cAAc,EAAE,YAAY,CAAC,GAAG,MAAM,OAAO,CAAC,GAAG,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,EAAE,eAAe,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;QAClG,IAAI,KAAK,GAAG,EAAE,CAAC;QAEf,IAAI,YAAY,CAAC,IAAI,KAAK,CAAC,EAAE;YAC3B,CAAC,CAAC,qBAAqB,CAAC,CAAC;YACzB,KAAK,GAAG,kEAAkE,YAAY,CAAC,IAAI,OAAO,YAAY,CAAC,MAAM,IAAI,CAAC;SAC3H;QACD,IAAI,cAAc,CAAC,IAAI,KAAK,CAAC,EAAE;YAC7B,CAAC,CAAC,uBAAuB,CAAC,CAAC;YAC3B,KAAK,IAAI,kDAAkD,cAAc,CAAC,IAAI,OAAO,cAAc,CAAC,MAAM,OAAO,YAAY,CAAC,MAAM,EAAE,CAAC;SACxI;QAED,IAAI,KAAK,EAAE;YACT,MAAM,IAAI,KAAK,CAAC,KAAK,CAAC,CAAC;SACxB;QACD,CAAC,CAAC,2BAA2B,CAAC,CAAC;IACjC,CAAC;CAAA;AAjBD,0CAiBC"}
13
node_modules/@electron/notarize/lib/helpers.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
export declare function withTempDir<T>(fn: (dir: string) => Promise<T>): Promise<T>;
export declare function makeSecret(s: string): string;
export declare function isSecret(s: string): boolean;
export interface NotarizationInfo {
    uuid: string;
    date: Date;
    status: 'invalid' | 'in progress' | 'success';
    logFileUrl: string | null;
    statusCode?: 0 | 2;
    statusMessage?: string;
}
export declare function parseNotarizationInfo(info: string): NotarizationInfo;
export declare function delay(ms: number): Promise<void>;
106
node_modules/@electron/notarize/lib/helpers.js
generated
vendored
Normal file
@@ -0,0 +1,106 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
      desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
    function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
    return new (P || (P = Promise))(function (resolve, reject) {
        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
        function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
        step((generator = generator.apply(thisArg, _arguments || [])).next());
    });
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.delay = exports.parseNotarizationInfo = exports.isSecret = exports.makeSecret = exports.withTempDir = void 0;
const debug_1 = __importDefault(require("debug"));
const fs = __importStar(require("fs-extra"));
const os = __importStar(require("os"));
const path = __importStar(require("path"));
const d = (0, debug_1.default)('electron-notarize:helpers');
function withTempDir(fn) {
    return __awaiter(this, void 0, void 0, function* () {
        const dir = yield fs.mkdtemp(path.resolve(os.tmpdir(), 'electron-notarize-'));
        d('doing work inside temp dir:', dir);
        let result;
        try {
            result = yield fn(dir);
        }
        catch (err) {
            d('work failed');
            yield fs.remove(dir);
            throw err;
        }
        d('work succeeded');
        yield fs.remove(dir);
        return result;
    });
}
exports.withTempDir = withTempDir;
class Secret {
    constructor(value) {
        this.value = value;
    }
    toString() {
        return this.value;
    }
    inspect() {
        return '******';
    }
}
function makeSecret(s) {
    return new Secret(s);
}
exports.makeSecret = makeSecret;
function isSecret(s) {
    return s instanceof Secret;
}
exports.isSecret = isSecret;
function parseNotarizationInfo(info) {
    const out = {};
    const matchToProperty = (key, r, modifier) => {
        const exec = r.exec(info);
        if (exec) {
            out[key] = modifier ? modifier(exec[1]) : exec[1];
        }
    };
    matchToProperty('uuid', /\n *RequestUUID: (.+?)\n/);
    matchToProperty('date', /\n *Date: (.+?)\n/, d => new Date(d));
    matchToProperty('status', /\n *Status: (.+?)\n/);
    matchToProperty('logFileUrl', /\n *LogFileURL: (.+?)\n/);
    matchToProperty('statusCode', /\n *Status Code: (.+?)\n/, n => parseInt(n, 10));
    matchToProperty('statusMessage', /\n *Status Message: (.+?)\n/);
    if (out.logFileUrl === '(null)') {
        out.logFileUrl = null;
    }
    return out;
}
exports.parseNotarizationInfo = parseNotarizationInfo;
function delay(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}
exports.delay = delay;
//# sourceMappingURL=helpers.js.map
1
node_modules/@electron/notarize/lib/helpers.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"helpers.js","sourceRoot":"","sources":["../src/helpers.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA,kDAA0B;AAC1B,6CAA+B;AAC/B,uCAAyB;AACzB,2CAA6B;AAE7B,MAAM,CAAC,GAAG,IAAA,eAAK,EAAC,2BAA2B,CAAC,CAAC;AAE7C,SAAsB,WAAW,CAAI,EAA+B;;QAClE,MAAM,GAAG,GAAG,MAAM,EAAE,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC,EAAE,CAAC,MAAM,EAAE,EAAE,oBAAoB,CAAC,CAAC,CAAC;QAC9E,CAAC,CAAC,6BAA6B,EAAE,GAAG,CAAC,CAAC;QACtC,IAAI,MAAS,CAAC;QACd,IAAI;YACF,MAAM,GAAG,MAAM,EAAE,CAAC,GAAG,CAAC,CAAC;SACxB;QAAC,OAAO,GAAG,EAAE;YACZ,CAAC,CAAC,aAAa,CAAC,CAAC;YACjB,MAAM,EAAE,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;YACrB,MAAM,GAAG,CAAC;SACX;QACD,CAAC,CAAC,gBAAgB,CAAC,CAAC;QACpB,MAAM,EAAE,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;QACrB,OAAO,MAAM,CAAC;IAChB,CAAC;CAAA;AAdD,kCAcC;AAED,MAAM,MAAM;IACV,YAAoB,KAAa;QAAb,UAAK,GAAL,KAAK,CAAQ;IAAG,CAAC;IAErC,QAAQ;QACN,OAAO,IAAI,CAAC,KAAK,CAAC;IACpB,CAAC;IACD,OAAO;QACL,OAAO,QAAQ,CAAC;IAClB,CAAC;CACF;AAED,SAAgB,UAAU,CAAC,CAAS;IAClC,OAAQ,IAAI,MAAM,CAAC,CAAC,CAAmB,CAAC;AAC1C,CAAC;AAFD,gCAEC;AAED,SAAgB,QAAQ,CAAC,CAAS;IAChC,OAAQ,CAAS,YAAY,MAAM,CAAC;AACtC,CAAC;AAFD,4BAEC;AAYD,SAAgB,qBAAqB,CAAC,IAAY;IAChD,MAAM,GAAG,GAAG,EAAS,CAAC;IACtB,MAAM,eAAe,GAAG,CACtB,GAAM,EACN,CAAS,EACT,QAA6C,EAC7C,EAAE;QACF,MAAM,IAAI,GAAG,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,IAAI,EAAE;YACR,GAAG,CAAC,GAAG,CAAC,GAAG,QAAQ,CAAC,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;SACnD;IACH,CAAC,CAAC;IACF,eAAe,CAAC,MAAM,EAAE,0BAA0B,CAAC,CAAC;IACpD,eAAe,CAAC,MAAM,EAAE,mBAAmB,EAAE,CAAC,CAAC,EAAE,CAAC,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC;IAC/D,eAAe,CAAC,QAAQ,EAAE,qBAAqB,CAAC,CAAC;IACjD,eAAe,CAAC,YAAY,EAAE,yBAAyB,CAAC,CAAC;IACzD,eAAe,CAAC,YAAY,EAAE,0BAA0B,EAAE,CAAC,CAAC,EAAE,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,CAAQ,CAAC,CAAC;IACvF,eAAe,CAAC,eAAe,EAAE,6BAA6B,CAAC,CAAC;IAEhE,IAAI,GAAG,CAAC,UAAU,KAAK,QAAQ,EAAE;QAC/B,GAAG,CAAC,UAAU,GAAG,IAAI,CAAC;KACvB;IAED,OAAO,GAAG,CAAC;AACb,CAAC;AAxBD,sDAwBC;AAED,SAAgB,KAAK,CAAC,EAAU;IAC9B,OAAO,IAAI,OAAO,CAAC,OAAO,CAAC,EAAE,CAAC,U
AAU,CAAC,OAAO,EAAE,EAAE,CAAC,CAAC,CAAC;AACzD,CAAC;AAFD,sBAEC"}