Last commit July 5th

This commit is contained in:
2024-07-05 13:46:23 +02:00
parent dad0d86e8c
commit b0e4dfbb76
24982 changed files with 2621219 additions and 413 deletions

13
spa/node_modules/atomically/.editorconfig generated vendored Normal file
View File

@@ -0,0 +1,13 @@
root = true
[*]
charset = utf-8
end_of_line = lf
indent_size = 2
indent_style = space
insert_final_newline = true
trim_trailing_whitespace = true
[*.md]
trim_trailing_whitespace = false

1
spa/node_modules/atomically/.nvmrc generated vendored Normal file

@@ -0,0 +1 @@
v10.12.0

21
spa/node_modules/atomically/LICENSE generated vendored Normal file

@@ -0,0 +1,21 @@
The MIT License (MIT)
Copyright (c) 2020-present Fabio Spampinato
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.

147
spa/node_modules/atomically/README.md generated vendored Normal file

@@ -0,0 +1,147 @@
# Atomically
Read and write files atomically and reliably.
## Features
- Overview:
- This library is a rewrite of [`write-file-atomic`](https://github.com/npm/write-file-atomic) with some important enhancements on top; you can largely use it as a drop-in replacement.
- This library is written in TypeScript, so types aren't an afterthought but ship with the library.
- This library is slightly faster than [`write-file-atomic`](https://github.com/npm/write-file-atomic), and it can be up to 10x faster, while remaining essentially just as safe, by using the `fsyncWait` option.
- This library has no dependencies, so there's less code to vet, and the entire thing is roughly 20% smaller than [`write-file-atomic`](https://github.com/npm/write-file-atomic).
- This library tries harder than [`write-file-atomic`](https://github.com/npm/write-file-atomic) to write files to disk, by default retrying some failed operations and handling more error cases.
- Reliability:
- Reads are retried, when appropriate, until they succeed or the timeout is reached.
- Writes are atomic: first a temporary file containing the new content is written, then that file is renamed to the final path, so it's impossible to end up with a corrupt or partially-written file.
- Writes happening to the same path are queued, ensuring they don't interfere with each other.
- Temporary files can be configured not to be purged from disk if the write operation fails, which is useful when keeping the temporary file is better than just losing the data.
- Any missing parent folders will be created automatically.
- Symlinks are resolved automatically.
- `ENOSYS` errors on `chmod`/`chown` operations are ignored.
- `EINVAL`/`EPERM` errors on `chmod`/`chown` operations, in POSIX systems where the user is not root, are ignored.
- `EMFILE`/`ENFILE`/`EAGAIN`/`EBUSY`/`EACCESS`/`EACCS`/`EPERM` errors happening during necessary operations are caught and the operations are retried until they succeed or the timeout is reached.
- `ENAMETOOLONG` errors, whether caused by the final path or the temporary path, are worked around by smartly truncating paths.
- Temporary files:
- By default they are purged automatically once the write operation is completed or if the process exits (cleanly or not).
- By default they are created by appending a `.tmp-[timestamp][randomness]` suffix to destination paths:
- The `tmp-` part gives users a hint about the nature of these files, if they happen to see them.
- The `[timestamp]` part consists of the 10 least significant digits of a milliseconds-precise timestamp, making it likely that if more than one of these files are kept on disk the user will see them in chronological order.
- The `[randomness]` part consists of 6 random hex characters.
- In the unlikely event of a suffix collision, another suffix is generated.
- Custom options:
- `chown`: it allows you to specify custom group and user ids:
- by default the old file's ids are copied over.
- if custom ids are provided they will be used.
- if `false`, no ids are set and the process defaults are kept.
- `encoding`: it allows you to specify the encoding of the file content:
- when reading, no encoding is specified by default and a raw buffer is returned.
- when writing, `utf8` is used by default.
- `fsync`: it allows you to control whether the `fsync` syscall is triggered right after writing the file or not:
- by default the syscall is triggered immediately after writing the file, increasing the chances that the file will actually be written to disk in case of imminent catastrophic failures, like power outages.
- if `false` the syscall won't be triggered.
- `fsyncWait`: it allows you to control whether the triggered `fsync` is waited or not:
- by default the syscall is awaited.
- if `false` the syscall will still be triggered but not awaited.
- this can increase performance 10x in some cases, and often there's no plan B if `fsync` fails anyway.
- `mode`: it allows you to specify the mode for the file:
- by default the old file's mode is copied over.
- if `false` then `0o666` is used.
- `schedule`: a function that returns a promise resolving to a disposer function. It allows you to provide custom queueing logic for the write operation, for example wiring `atomically` into your app's main filesystem job scheduler:
- even when a custom `schedule` function is provided, write operations will still be queued internally by the library.
- `timeout`: it allows you to specify the maximum number of milliseconds for which the library will retry some failed operations:
- when writing asynchronously it will keep retrying for 5000 milliseconds by default.
- when writing synchronously it will keep retrying for 100 milliseconds by default.
- if `0` or `-1`, failed operations won't be retried.
- any other number will be used as the retry timeout.
- `tmpCreate`: it's a function that will be used to create the custom temporary file path in place of the default one:
- even when a custom function is provided, the final temporary path will still be truncated if the library thinks it may lead to `ENAMETOOLONG` errors.
- by default, paths are truncated in a way that preserves any existing leading dot and trailing extension.
- `tmpCreated`: it's a function that will be called with the newly created temporary file path.
- `tmpPurge`: it allows you to control whether the temporary file will be purged from the filesystem or not if the write fails:
- by default it will be purged.
- if `false` it will be kept on disk.
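The default temporary-path scheme described above can be sketched roughly as follows. This is a simplified illustration, not the library's actual implementation; `tempPath` is a hypothetical helper name, and the collision re-check is omitted:

```typescript
import { randomBytes } from 'crypto';

// Sketch of the default scheme: append `.tmp-[timestamp][randomness]`
// to the destination path. `[timestamp]` is the 10 least significant
// digits of a milliseconds-precise timestamp, `[randomness]` is 6
// random hex characters.
const tempPath = ( filePath: string ): string => {
  const timestamp = String ( Date.now () ).slice ( -10 );
  const randomness = randomBytes ( 3 ).toString ( 'hex' ); // 3 bytes -> 6 hex chars
  return `${filePath}.tmp-${timestamp}${randomness}`;
};
```

The real implementation additionally regenerates the suffix on collision and truncates paths that would trigger `ENAMETOOLONG` errors.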
## Install
```sh
npm install --save atomically
```
## Usage
This is the shape of the optional options object:
```ts
type Disposer = () => void;
type ReadOptions = string | {
  encoding?: string | null,
  mode?: string | number | false,
  timeout?: number
};
type WriteOptions = string | {
  chown?: { gid: number, uid: number } | false,
  encoding?: string | null,
  fsync?: boolean,
  fsyncWait?: boolean,
  mode?: string | number | false,
  schedule?: ( filePath: string ) => Promise<Disposer>,
  timeout?: number,
  tmpCreate?: ( filePath: string ) => string,
  tmpCreated?: ( filePath: string ) => any,
  tmpPurge?: boolean
};
```
This is the shape of the provided functions:
```ts
function readFile ( filePath: string, options?: ReadOptions ): Promise<Buffer | string>;
function readFileSync ( filePath: string, options?: ReadOptions ): Buffer | string;
function writeFile ( filePath: string, data: Buffer | string | undefined, options?: WriteOptions ): Promise<void>;
function writeFileSync ( filePath: string, data: Buffer | string | undefined, options?: WriteOptions ): void;
```
This is how to use the library:
```ts
import {readFile, readFileSync, writeFile, writeFileSync} from 'atomically';
// Asynchronous read with default options
const buffer = await readFile ( '/foo.txt' );
// Synchronous read assuming the encoding is "utf8"
const string = readFileSync ( '/foo.txt', 'utf8' );
// Asynchronous write with default options
await writeFile ( '/foo.txt', 'my_data' );
// Asynchronous write that doesn't prod the old file for a stat object at all
await writeFile ( '/foo.txt', 'my_data', { chown: false, mode: false } );
// 10x faster asynchronous write that's less resilient against imminent catastrophes
await writeFile ( '/foo.txt', 'my_data', { fsync: false } );
// 10x faster asynchronous write that's essentially still as resilient against imminent catastrophes
await writeFile ( '/foo.txt', 'my_data', { fsyncWait: false } );
// Asynchronous write with a custom schedule function
await writeFile ( '/foo.txt', 'my_data', {
  schedule: filePath => {
    return new Promise ( resolve => { // When this returned promise resolves, the write operation will begin
      MyScheduler.schedule ( filePath, () => { // Hypothetical scheduler function that will eventually tell us to proceed with this write operation
        const disposer = () => {}; // Hypothetical function containing any clean-up logic; it will be called after the write operation has completed (successfully or not)
        resolve ( disposer ); // Resolving the promise with a disposer, beginning the write operation
      })
    });
  }
});
// Synchronous write with default options
writeFileSync ( '/foo.txt', 'my_data' );
```
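Under the hood, the write path described above boils down to: write a sibling temporary file, `fsync` it, then rename it over the destination. A minimal self-contained sketch of that core sequence, using only Node's `fs` module (`writeAtomicSync` is a hypothetical helper; no retries, queueing, or `chown`/`mode` handling):

```typescript
import * as fs from 'fs';

// Minimal atomic write: readers either see the old file or the new
// one, never a partially-written file, because rename within the same
// filesystem is atomic.
const writeAtomicSync = ( filePath: string, data: string ): void => {
  const tempPath = `${filePath}.tmp-${Date.now ()}${Math.random ().toString ( 16 ).slice ( 2, 8 )}`;
  const fd = fs.openSync ( tempPath, 'w' );
  try {
    fs.writeSync ( fd, data );
    fs.fsyncSync ( fd ); // flush to disk before the rename
  } finally {
    fs.closeSync ( fd );
  }
  fs.renameSync ( tempPath, filePath ); // atomically replace the destination
};
```

`atomically` layers retries, per-path queueing, symlink resolution, and ownership/mode preservation on top of this core sequence.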
## License
MIT © Fabio Spampinato

13
spa/node_modules/atomically/dist/consts.d.ts generated vendored Normal file

@@ -0,0 +1,13 @@
declare const DEFAULT_ENCODING = "utf8";
declare const DEFAULT_FILE_MODE = 438;
declare const DEFAULT_FOLDER_MODE = 511;
declare const DEFAULT_READ_OPTIONS: {};
declare const DEFAULT_WRITE_OPTIONS: {};
declare const DEFAULT_TIMEOUT_ASYNC = 5000;
declare const DEFAULT_TIMEOUT_SYNC = 100;
declare const IS_POSIX = true;
declare const IS_USER_ROOT: boolean;
declare const LIMIT_BASENAME_LENGTH = 128;
declare const LIMIT_FILES_DESCRIPTORS = 10000;
declare const NOOP: () => void;
export { DEFAULT_ENCODING, DEFAULT_FILE_MODE, DEFAULT_FOLDER_MODE, DEFAULT_READ_OPTIONS, DEFAULT_WRITE_OPTIONS, DEFAULT_TIMEOUT_ASYNC, DEFAULT_TIMEOUT_SYNC, IS_POSIX, IS_USER_ROOT, LIMIT_BASENAME_LENGTH, LIMIT_FILES_DESCRIPTORS, NOOP };

28
spa/node_modules/atomically/dist/consts.js generated vendored Normal file

@@ -0,0 +1,28 @@
"use strict";
/* CONSTS */
Object.defineProperty(exports, "__esModule", { value: true });
exports.NOOP = exports.LIMIT_FILES_DESCRIPTORS = exports.LIMIT_BASENAME_LENGTH = exports.IS_USER_ROOT = exports.IS_POSIX = exports.DEFAULT_TIMEOUT_SYNC = exports.DEFAULT_TIMEOUT_ASYNC = exports.DEFAULT_WRITE_OPTIONS = exports.DEFAULT_READ_OPTIONS = exports.DEFAULT_FOLDER_MODE = exports.DEFAULT_FILE_MODE = exports.DEFAULT_ENCODING = void 0;
const DEFAULT_ENCODING = 'utf8';
exports.DEFAULT_ENCODING = DEFAULT_ENCODING;
const DEFAULT_FILE_MODE = 0o666;
exports.DEFAULT_FILE_MODE = DEFAULT_FILE_MODE;
const DEFAULT_FOLDER_MODE = 0o777;
exports.DEFAULT_FOLDER_MODE = DEFAULT_FOLDER_MODE;
const DEFAULT_READ_OPTIONS = {};
exports.DEFAULT_READ_OPTIONS = DEFAULT_READ_OPTIONS;
const DEFAULT_WRITE_OPTIONS = {};
exports.DEFAULT_WRITE_OPTIONS = DEFAULT_WRITE_OPTIONS;
const DEFAULT_TIMEOUT_ASYNC = 5000;
exports.DEFAULT_TIMEOUT_ASYNC = DEFAULT_TIMEOUT_ASYNC;
const DEFAULT_TIMEOUT_SYNC = 100;
exports.DEFAULT_TIMEOUT_SYNC = DEFAULT_TIMEOUT_SYNC;
const IS_POSIX = !!process.getuid;
exports.IS_POSIX = IS_POSIX;
const IS_USER_ROOT = process.getuid ? !process.getuid() : false;
exports.IS_USER_ROOT = IS_USER_ROOT;
const LIMIT_BASENAME_LENGTH = 128; //TODO: fetch the real limit from the filesystem //TODO: fetch the whole-path length limit too
exports.LIMIT_BASENAME_LENGTH = LIMIT_BASENAME_LENGTH;
const LIMIT_FILES_DESCRIPTORS = 10000; //TODO: fetch the real limit from the filesystem
exports.LIMIT_FILES_DESCRIPTORS = LIMIT_FILES_DESCRIPTORS;
const NOOP = () => { };
exports.NOOP = NOOP;

13
spa/node_modules/atomically/dist/index.d.ts generated vendored Normal file

@@ -0,0 +1,13 @@
/// <reference types="node" />
import { Callback, Data, Path, ReadOptions, WriteOptions } from './types';
declare function readFile(filePath: Path, options: string | ReadOptions & {
encoding: string;
}): Promise<string>;
declare function readFile(filePath: Path, options?: ReadOptions): Promise<Buffer>;
declare function readFileSync(filePath: Path, options: string | ReadOptions & {
encoding: string;
}): string;
declare function readFileSync(filePath: Path, options?: ReadOptions): Buffer;
declare const writeFile: (filePath: Path, data: Data, options?: string | WriteOptions | Callback | undefined, callback?: Callback | undefined) => Promise<void>;
declare const writeFileSync: (filePath: Path, data: Data, options?: string | WriteOptions) => void;
export { readFile, readFileSync, writeFile, writeFileSync };

177
spa/node_modules/atomically/dist/index.js generated vendored Normal file

@@ -0,0 +1,177 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
exports.writeFileSync = exports.writeFile = exports.readFileSync = exports.readFile = void 0;
const path = require("path");
const consts_1 = require("./consts");
const fs_1 = require("./utils/fs");
const lang_1 = require("./utils/lang");
const scheduler_1 = require("./utils/scheduler");
const temp_1 = require("./utils/temp");
function readFile(filePath, options = consts_1.DEFAULT_READ_OPTIONS) {
var _a;
if (lang_1.default.isString(options))
return readFile(filePath, { encoding: options });
const timeout = Date.now() + ((_a = options.timeout) !== null && _a !== void 0 ? _a : consts_1.DEFAULT_TIMEOUT_ASYNC);
return fs_1.default.readFileRetry(timeout)(filePath, options);
}
exports.readFile = readFile;
;
function readFileSync(filePath, options = consts_1.DEFAULT_READ_OPTIONS) {
var _a;
if (lang_1.default.isString(options))
return readFileSync(filePath, { encoding: options });
const timeout = Date.now() + ((_a = options.timeout) !== null && _a !== void 0 ? _a : consts_1.DEFAULT_TIMEOUT_SYNC);
return fs_1.default.readFileSyncRetry(timeout)(filePath, options);
}
exports.readFileSync = readFileSync;
;
const writeFile = (filePath, data, options, callback) => {
if (lang_1.default.isFunction(options))
return writeFile(filePath, data, consts_1.DEFAULT_WRITE_OPTIONS, options);
const promise = writeFileAsync(filePath, data, options);
if (callback)
promise.then(callback, callback);
return promise;
};
exports.writeFile = writeFile;
const writeFileAsync = async (filePath, data, options = consts_1.DEFAULT_WRITE_OPTIONS) => {
var _a;
if (lang_1.default.isString(options))
return writeFileAsync(filePath, data, { encoding: options });
const timeout = Date.now() + ((_a = options.timeout) !== null && _a !== void 0 ? _a : consts_1.DEFAULT_TIMEOUT_ASYNC);
let schedulerCustomDisposer = null, schedulerDisposer = null, tempDisposer = null, tempPath = null, fd = null;
try {
if (options.schedule)
schedulerCustomDisposer = await options.schedule(filePath);
schedulerDisposer = await scheduler_1.default.schedule(filePath);
filePath = await fs_1.default.realpathAttempt(filePath) || filePath;
[tempPath, tempDisposer] = temp_1.default.get(filePath, options.tmpCreate || temp_1.default.create, !(options.tmpPurge === false));
const useStatChown = consts_1.IS_POSIX && lang_1.default.isUndefined(options.chown), useStatMode = lang_1.default.isUndefined(options.mode);
if (useStatChown || useStatMode) {
const stat = await fs_1.default.statAttempt(filePath);
if (stat) {
options = { ...options };
if (useStatChown)
options.chown = { uid: stat.uid, gid: stat.gid };
if (useStatMode)
options.mode = stat.mode;
}
}
const parentPath = path.dirname(filePath);
await fs_1.default.mkdirAttempt(parentPath, {
mode: consts_1.DEFAULT_FOLDER_MODE,
recursive: true
});
fd = await fs_1.default.openRetry(timeout)(tempPath, 'w', options.mode || consts_1.DEFAULT_FILE_MODE);
if (options.tmpCreated)
options.tmpCreated(tempPath);
if (lang_1.default.isString(data)) {
await fs_1.default.writeRetry(timeout)(fd, data, 0, options.encoding || consts_1.DEFAULT_ENCODING);
}
else if (!lang_1.default.isUndefined(data)) {
await fs_1.default.writeRetry(timeout)(fd, data, 0, data.length, 0);
}
if (options.fsync !== false) {
if (options.fsyncWait !== false) {
await fs_1.default.fsyncRetry(timeout)(fd);
}
else {
fs_1.default.fsyncAttempt(fd);
}
}
await fs_1.default.closeRetry(timeout)(fd);
fd = null;
if (options.chown)
await fs_1.default.chownAttempt(tempPath, options.chown.uid, options.chown.gid);
if (options.mode)
await fs_1.default.chmodAttempt(tempPath, options.mode);
try {
await fs_1.default.renameRetry(timeout)(tempPath, filePath);
}
catch (error) {
if (error.code !== 'ENAMETOOLONG')
throw error;
await fs_1.default.renameRetry(timeout)(tempPath, temp_1.default.truncate(filePath));
}
tempDisposer();
tempPath = null;
}
finally {
if (fd)
await fs_1.default.closeAttempt(fd);
if (tempPath)
temp_1.default.purge(tempPath);
if (schedulerCustomDisposer)
schedulerCustomDisposer();
if (schedulerDisposer)
schedulerDisposer();
}
};
const writeFileSync = (filePath, data, options = consts_1.DEFAULT_WRITE_OPTIONS) => {
var _a;
if (lang_1.default.isString(options))
return writeFileSync(filePath, data, { encoding: options });
const timeout = Date.now() + ((_a = options.timeout) !== null && _a !== void 0 ? _a : consts_1.DEFAULT_TIMEOUT_SYNC);
let tempDisposer = null, tempPath = null, fd = null;
try {
filePath = fs_1.default.realpathSyncAttempt(filePath) || filePath;
[tempPath, tempDisposer] = temp_1.default.get(filePath, options.tmpCreate || temp_1.default.create, !(options.tmpPurge === false));
const useStatChown = consts_1.IS_POSIX && lang_1.default.isUndefined(options.chown), useStatMode = lang_1.default.isUndefined(options.mode);
if (useStatChown || useStatMode) {
const stat = fs_1.default.statSyncAttempt(filePath);
if (stat) {
options = { ...options };
if (useStatChown)
options.chown = { uid: stat.uid, gid: stat.gid };
if (useStatMode)
options.mode = stat.mode;
}
}
const parentPath = path.dirname(filePath);
fs_1.default.mkdirSyncAttempt(parentPath, {
mode: consts_1.DEFAULT_FOLDER_MODE,
recursive: true
});
fd = fs_1.default.openSyncRetry(timeout)(tempPath, 'w', options.mode || consts_1.DEFAULT_FILE_MODE);
if (options.tmpCreated)
options.tmpCreated(tempPath);
if (lang_1.default.isString(data)) {
fs_1.default.writeSyncRetry(timeout)(fd, data, 0, options.encoding || consts_1.DEFAULT_ENCODING);
}
else if (!lang_1.default.isUndefined(data)) {
fs_1.default.writeSyncRetry(timeout)(fd, data, 0, data.length, 0);
}
if (options.fsync !== false) {
if (options.fsyncWait !== false) {
fs_1.default.fsyncSyncRetry(timeout)(fd);
}
else {
fs_1.default.fsyncAttempt(fd);
}
}
fs_1.default.closeSyncRetry(timeout)(fd);
fd = null;
if (options.chown)
fs_1.default.chownSyncAttempt(tempPath, options.chown.uid, options.chown.gid);
if (options.mode)
fs_1.default.chmodSyncAttempt(tempPath, options.mode);
try {
fs_1.default.renameSyncRetry(timeout)(tempPath, filePath);
}
catch (error) {
if (error.code !== 'ENAMETOOLONG')
throw error;
fs_1.default.renameSyncRetry(timeout)(tempPath, temp_1.default.truncate(filePath));
}
tempDisposer();
tempPath = null;
}
finally {
if (fd)
fs_1.default.closeSyncAttempt(fd);
if (tempPath)
temp_1.default.purge(tempPath);
}
};
exports.writeFileSync = writeFileSync;

28
spa/node_modules/atomically/dist/types.d.ts generated vendored Normal file

@@ -0,0 +1,28 @@
/// <reference types="node" />
declare type Callback = (error: Exception | void) => any;
declare type Data = Buffer | string | undefined;
declare type Disposer = () => void;
declare type Exception = NodeJS.ErrnoException;
declare type FN<Arguments extends any[] = any[], Return = any> = (...args: Arguments) => Return;
declare type Path = string;
declare type ReadOptions = {
encoding?: string | null;
mode?: string | number | false;
timeout?: number;
};
declare type WriteOptions = {
chown?: {
gid: number;
uid: number;
} | false;
encoding?: string | null;
fsync?: boolean;
fsyncWait?: boolean;
mode?: string | number | false;
schedule?: (filePath: string) => Promise<Disposer>;
timeout?: number;
tmpCreate?: (filePath: string) => string;
tmpCreated?: (filePath: string) => any;
tmpPurge?: boolean;
};
export { Callback, Data, Disposer, Exception, FN, Path, ReadOptions, WriteOptions };

3
spa/node_modules/atomically/dist/types.js generated vendored Normal file

@@ -0,0 +1,3 @@
"use strict";
/* TYPES */
Object.defineProperty(exports, "__esModule", { value: true });

4
spa/node_modules/atomically/dist/utils/attemptify.d.ts generated vendored Normal file

@@ -0,0 +1,4 @@
import { Exception, FN } from '../types';
declare const attemptifyAsync: <T extends FN<any[], any>>(fn: T, onError?: FN<[Exception]>) => T;
declare const attemptifySync: <T extends FN<any[], any>>(fn: T, onError?: FN<[Exception]>) => T;
export { attemptifyAsync, attemptifySync };

25
spa/node_modules/atomically/dist/utils/attemptify.js generated vendored Normal file

@@ -0,0 +1,25 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
exports.attemptifySync = exports.attemptifyAsync = void 0;
const consts_1 = require("../consts");
/* ATTEMPTIFY */
//TODO: Maybe publish this as a standalone package
//FIXME: The type castings here aren't exactly correct
const attemptifyAsync = (fn, onError = consts_1.NOOP) => {
return function () {
return fn.apply(undefined, arguments).catch(onError);
};
};
exports.attemptifyAsync = attemptifyAsync;
const attemptifySync = (fn, onError = consts_1.NOOP) => {
return function () {
try {
return fn.apply(undefined, arguments);
}
catch (error) {
return onError(error);
}
};
};
exports.attemptifySync = attemptifySync;

34
spa/node_modules/atomically/dist/utils/fs.d.ts generated vendored Normal file

@@ -0,0 +1,34 @@
/// <reference types="node" />
import * as fs from 'fs';
declare const FS: {
chmodAttempt: typeof fs.chmod.__promisify__;
chownAttempt: typeof fs.chown.__promisify__;
closeAttempt: typeof fs.close.__promisify__;
fsyncAttempt: typeof fs.fsync.__promisify__;
mkdirAttempt: typeof fs.mkdir.__promisify__;
realpathAttempt: typeof fs.realpath.__promisify__;
statAttempt: typeof fs.stat.__promisify__;
unlinkAttempt: typeof fs.unlink.__promisify__;
closeRetry: import("../types").FN<[number], typeof fs.close.__promisify__>;
fsyncRetry: import("../types").FN<[number], typeof fs.fsync.__promisify__>;
openRetry: import("../types").FN<[number], typeof fs.open.__promisify__>;
readFileRetry: import("../types").FN<[number], typeof fs.readFile.__promisify__>;
renameRetry: import("../types").FN<[number], typeof fs.rename.__promisify__>;
statRetry: import("../types").FN<[number], typeof fs.stat.__promisify__>;
writeRetry: import("../types").FN<[number], typeof fs.write.__promisify__>;
chmodSyncAttempt: typeof fs.chmodSync;
chownSyncAttempt: typeof fs.chownSync;
closeSyncAttempt: typeof fs.closeSync;
mkdirSyncAttempt: typeof fs.mkdirSync;
realpathSyncAttempt: typeof fs.realpathSync;
statSyncAttempt: typeof fs.statSync;
unlinkSyncAttempt: typeof fs.unlinkSync;
closeSyncRetry: import("../types").FN<[number], typeof fs.closeSync>;
fsyncSyncRetry: import("../types").FN<[number], typeof fs.fsyncSync>;
openSyncRetry: import("../types").FN<[number], typeof fs.openSync>;
readFileSyncRetry: import("../types").FN<[number], typeof fs.readFileSync>;
renameSyncRetry: import("../types").FN<[number], typeof fs.renameSync>;
statSyncRetry: import("../types").FN<[number], typeof fs.statSync>;
writeSyncRetry: import("../types").FN<[number], typeof fs.writeSync>;
};
export default FS;

42
spa/node_modules/atomically/dist/utils/fs.js generated vendored Normal file

@@ -0,0 +1,42 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const util_1 = require("util");
const attemptify_1 = require("./attemptify");
const fs_handlers_1 = require("./fs_handlers");
const retryify_1 = require("./retryify");
/* FS */
const FS = {
chmodAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.chmod), fs_handlers_1.default.onChangeError),
chownAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.chown), fs_handlers_1.default.onChangeError),
closeAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.close)),
fsyncAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.fsync)),
mkdirAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.mkdir)),
realpathAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.realpath)),
statAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.stat)),
unlinkAttempt: attemptify_1.attemptifyAsync(util_1.promisify(fs.unlink)),
closeRetry: retryify_1.retryifyAsync(util_1.promisify(fs.close), fs_handlers_1.default.isRetriableError),
fsyncRetry: retryify_1.retryifyAsync(util_1.promisify(fs.fsync), fs_handlers_1.default.isRetriableError),
openRetry: retryify_1.retryifyAsync(util_1.promisify(fs.open), fs_handlers_1.default.isRetriableError),
readFileRetry: retryify_1.retryifyAsync(util_1.promisify(fs.readFile), fs_handlers_1.default.isRetriableError),
renameRetry: retryify_1.retryifyAsync(util_1.promisify(fs.rename), fs_handlers_1.default.isRetriableError),
statRetry: retryify_1.retryifyAsync(util_1.promisify(fs.stat), fs_handlers_1.default.isRetriableError),
writeRetry: retryify_1.retryifyAsync(util_1.promisify(fs.write), fs_handlers_1.default.isRetriableError),
chmodSyncAttempt: attemptify_1.attemptifySync(fs.chmodSync, fs_handlers_1.default.onChangeError),
chownSyncAttempt: attemptify_1.attemptifySync(fs.chownSync, fs_handlers_1.default.onChangeError),
closeSyncAttempt: attemptify_1.attemptifySync(fs.closeSync),
mkdirSyncAttempt: attemptify_1.attemptifySync(fs.mkdirSync),
realpathSyncAttempt: attemptify_1.attemptifySync(fs.realpathSync),
statSyncAttempt: attemptify_1.attemptifySync(fs.statSync),
unlinkSyncAttempt: attemptify_1.attemptifySync(fs.unlinkSync),
closeSyncRetry: retryify_1.retryifySync(fs.closeSync, fs_handlers_1.default.isRetriableError),
fsyncSyncRetry: retryify_1.retryifySync(fs.fsyncSync, fs_handlers_1.default.isRetriableError),
openSyncRetry: retryify_1.retryifySync(fs.openSync, fs_handlers_1.default.isRetriableError),
readFileSyncRetry: retryify_1.retryifySync(fs.readFileSync, fs_handlers_1.default.isRetriableError),
renameSyncRetry: retryify_1.retryifySync(fs.renameSync, fs_handlers_1.default.isRetriableError),
statSyncRetry: retryify_1.retryifySync(fs.statSync, fs_handlers_1.default.isRetriableError),
writeSyncRetry: retryify_1.retryifySync(fs.writeSync, fs_handlers_1.default.isRetriableError)
};
/* EXPORT */
exports.default = FS;

7
spa/node_modules/atomically/dist/utils/fs_handlers.d.ts generated vendored Normal file

@@ -0,0 +1,7 @@
import { Exception } from '../types';
declare const Handlers: {
isChangeErrorOk: (error: Exception) => boolean;
isRetriableError: (error: Exception) => boolean;
onChangeError: (error: Exception) => void;
};
export default Handlers;

28
spa/node_modules/atomically/dist/utils/fs_handlers.js generated vendored Normal file

@@ -0,0 +1,28 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
const consts_1 = require("../consts");
/* FS HANDLERS */
const Handlers = {
isChangeErrorOk: (error) => {
const { code } = error;
if (code === 'ENOSYS')
return true;
if (!consts_1.IS_USER_ROOT && (code === 'EINVAL' || code === 'EPERM'))
return true;
return false;
},
isRetriableError: (error) => {
const { code } = error;
if (code === 'EMFILE' || code === 'ENFILE' || code === 'EAGAIN' || code === 'EBUSY' || code === 'EACCESS' || code === 'EACCS' || code === 'EPERM')
return true;
return false;
},
onChangeError: (error) => {
if (Handlers.isChangeErrorOk(error))
return;
throw error;
}
};
/* EXPORT */
exports.default = Handlers;

6
spa/node_modules/atomically/dist/utils/lang.d.ts generated vendored Normal file

@@ -0,0 +1,6 @@
declare const Lang: {
isFunction: (x: any) => x is Function;
isString: (x: any) => x is string;
isUndefined: (x: any) => x is undefined;
};
export default Lang;

16
spa/node_modules/atomically/dist/utils/lang.js generated vendored Normal file

@@ -0,0 +1,16 @@
"use strict";
/* LANG */
Object.defineProperty(exports, "__esModule", { value: true });
const Lang = {
isFunction: (x) => {
return typeof x === 'function';
},
isString: (x) => {
return typeof x === 'string';
},
isUndefined: (x) => {
return typeof x === 'undefined';
}
};
/* EXPORT */
exports.default = Lang;

4
spa/node_modules/atomically/dist/utils/retryify.d.ts generated vendored Normal file

@@ -0,0 +1,4 @@
import { Exception, FN } from '../types';
declare const retryifyAsync: <T extends FN<any[], any>>(fn: T, isRetriableError: FN<[Exception], boolean | void>) => FN<[number], T>;
declare const retryifySync: <T extends FN<any[], any>>(fn: T, isRetriableError: FN<[Exception], boolean | void>) => FN<[number], T>;
export { retryifyAsync, retryifySync };

45
spa/node_modules/atomically/dist/utils/retryify.js generated vendored Normal file

@@ -0,0 +1,45 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
exports.retryifySync = exports.retryifyAsync = void 0;
const retryify_queue_1 = require("./retryify_queue");
/* RETRYIFY */
const retryifyAsync = (fn, isRetriableError) => {
return function (timestamp) {
return function attempt() {
return retryify_queue_1.default.schedule().then(cleanup => {
return fn.apply(undefined, arguments).then(result => {
cleanup();
return result;
}, error => {
cleanup();
if (Date.now() >= timestamp)
throw error;
if (isRetriableError(error)) {
const delay = Math.round(100 + (400 * Math.random())), delayPromise = new Promise(resolve => setTimeout(resolve, delay));
return delayPromise.then(() => attempt.apply(undefined, arguments));
}
throw error;
});
});
};
};
};
exports.retryifyAsync = retryifyAsync;
const retryifySync = (fn, isRetriableError) => {
return function (timestamp) {
return function attempt() {
try {
return fn.apply(undefined, arguments);
}
catch (error) {
if (Date.now() > timestamp)
throw error;
if (isRetriableError(error))
return attempt.apply(undefined, arguments);
throw error;
}
};
};
};
exports.retryifySync = retryifySync;

15
spa/node_modules/atomically/dist/utils/retryify_queue.d.ts generated vendored Normal file

@@ -0,0 +1,15 @@
/// <reference types="node" />
declare const RetryfyQueue: {
interval: number;
intervalId: NodeJS.Timeout | undefined;
limit: number;
queueActive: Set<Function>;
queueWaiting: Set<Function>;
init: () => void;
reset: () => void;
add: (fn: Function) => void;
remove: (fn: Function) => void;
schedule: () => Promise<Function>;
tick: () => void;
};
export default RetryfyQueue;

58
spa/node_modules/atomically/dist/utils/retryify_queue.js generated vendored Normal file

@@ -0,0 +1,58 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
const consts_1 = require("../consts");
/* RETRYIFY QUEUE */
const RetryfyQueue = {
interval: 25,
intervalId: undefined,
limit: consts_1.LIMIT_FILES_DESCRIPTORS,
queueActive: new Set(),
queueWaiting: new Set(),
init: () => {
if (RetryfyQueue.intervalId)
return;
RetryfyQueue.intervalId = setInterval(RetryfyQueue.tick, RetryfyQueue.interval);
},
reset: () => {
if (!RetryfyQueue.intervalId)
return;
clearInterval(RetryfyQueue.intervalId);
delete RetryfyQueue.intervalId;
},
add: (fn) => {
RetryfyQueue.queueWaiting.add(fn);
if (RetryfyQueue.queueActive.size < (RetryfyQueue.limit / 2)) { // Active queue not under pressure, executing immediately
RetryfyQueue.tick();
}
else {
RetryfyQueue.init();
}
},
remove: (fn) => {
RetryfyQueue.queueWaiting.delete(fn);
RetryfyQueue.queueActive.delete(fn);
},
schedule: () => {
return new Promise(resolve => {
const cleanup = () => RetryfyQueue.remove(resolver);
const resolver = () => resolve(cleanup);
RetryfyQueue.add(resolver);
});
},
tick: () => {
if (RetryfyQueue.queueActive.size >= RetryfyQueue.limit)
return;
if (!RetryfyQueue.queueWaiting.size)
return RetryfyQueue.reset();
for (const fn of RetryfyQueue.queueWaiting) {
if (RetryfyQueue.queueActive.size >= RetryfyQueue.limit)
break;
RetryfyQueue.queueWaiting.delete(fn);
RetryfyQueue.queueActive.add(fn);
fn();
}
}
};
/* EXPORT */
exports.default = RetryfyQueue;

@@ -0,0 +1,6 @@
import { Disposer } from '../types';
declare const Scheduler: {
next: (id: string) => void;
schedule: (id: string) => Promise<Disposer>;
};
export default Scheduler;

35
spa/node_modules/atomically/dist/utils/scheduler.js generated vendored Normal file
@@ -0,0 +1,35 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
/* VARIABLES */
const Queues = {};
/* SCHEDULER */
//TODO: Maybe publish this as a standalone package
const Scheduler = {
next: (id) => {
const queue = Queues[id];
if (!queue)
return;
queue.shift();
const job = queue[0];
if (job) {
job(() => Scheduler.next(id));
}
else {
delete Queues[id];
}
},
schedule: (id) => {
return new Promise(resolve => {
let queue = Queues[id];
if (!queue)
queue = Queues[id] = [];
queue.push(resolve);
if (queue.length > 1)
return;
resolve(() => Scheduler.next(id));
});
}
};
/* EXPORT */
exports.default = Scheduler;

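The Scheduler above keeps one FIFO promise queue per file path, so concurrent writes to the same file run strictly one at a time. A minimal standalone sketch of the same pattern (the names `queues`, `schedule`, and `worker` are illustrative, not the package's exports):

```javascript
// Minimal per-key promise queue, modeled on the Scheduler above.
// schedule(id) resolves with a disposer; calling the disposer starts the next job.
const queues = {};

const next = (id) => {
  const queue = queues[id];
  if (!queue) return;
  queue.shift();
  const job = queue[0];
  if (job) job(() => next(id));
  else delete queues[id];
};

const schedule = (id) => {
  return new Promise((resolve) => {
    const queue = queues[id] || (queues[id] = []);
    queue.push(resolve);
    if (queue.length > 1) return; // Another job holds the lock, wait our turn
    resolve(() => next(id));
  });
};

// Usage: two writers to the same path run strictly one after the other.
const order = [];
const worker = async (label) => {
  const done = await schedule('/tmp/some-file');
  order.push(`${label}:start`);
  await new Promise((r) => setTimeout(r, 10));
  order.push(`${label}:end`);
  done();
};

const run = Promise.all([worker('a'), worker('b')]);
```

Without the queue, the two workers would interleave (`a:start,b:start,…`); with it, `b` only starts after `a` calls its disposer.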
11
spa/node_modules/atomically/dist/utils/temp.d.ts generated vendored Normal file
@@ -0,0 +1,11 @@
import { Disposer } from '../types';
declare const Temp: {
store: Record<string, boolean>;
create: (filePath: string) => string;
get: (filePath: string, creator: (filePath: string) => string, purge?: boolean) => [string, Disposer];
purge: (filePath: string) => void;
purgeSync: (filePath: string) => void;
purgeSyncAll: () => void;
truncate: (filePath: string) => string;
};
export default Temp;

56
spa/node_modules/atomically/dist/utils/temp.js generated vendored Normal file
@@ -0,0 +1,56 @@
"use strict";
/* IMPORT */
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const consts_1 = require("../consts");
const fs_1 = require("./fs");
/* TEMP */
//TODO: Maybe publish this as a standalone package
const Temp = {
store: {},
create: (filePath) => {
const randomness = `000000${Math.floor(Math.random() * 16777215).toString(16)}`.slice(-6), // 6 random-enough hex characters
timestamp = Date.now().toString().slice(-10), // 10 precise timestamp digits
prefix = 'tmp-', suffix = `.${prefix}${timestamp}${randomness}`, tempPath = `${filePath}${suffix}`;
return tempPath;
},
get: (filePath, creator, purge = true) => {
const tempPath = Temp.truncate(creator(filePath));
if (tempPath in Temp.store)
return Temp.get(filePath, creator, purge); // Collision found, try again
Temp.store[tempPath] = purge;
const disposer = () => delete Temp.store[tempPath];
return [tempPath, disposer];
},
purge: (filePath) => {
if (!Temp.store[filePath])
return;
delete Temp.store[filePath];
fs_1.default.unlinkAttempt(filePath);
},
purgeSync: (filePath) => {
if (!Temp.store[filePath])
return;
delete Temp.store[filePath];
fs_1.default.unlinkSyncAttempt(filePath);
},
purgeSyncAll: () => {
for (const filePath in Temp.store) {
Temp.purgeSync(filePath);
}
},
truncate: (filePath) => {
const basename = path.basename(filePath);
if (basename.length <= consts_1.LIMIT_BASENAME_LENGTH)
return filePath; //FIXME: Rough and quick attempt at detecting ok lengths
const truncable = /^(\.?)(.*?)((?:\.[^.]+)?(?:\.tmp-\d{10}[a-f0-9]{6})?)$/.exec(basename);
if (!truncable)
return filePath; //FIXME: No truncable part detected, can't really do much without also changing the parent path, which is unsafe, hoping for the best here
const truncationLength = basename.length - consts_1.LIMIT_BASENAME_LENGTH;
return `${filePath.slice(0, -basename.length)}${truncable[1]}${truncable[2].slice(0, -truncationLength)}${truncable[3]}`; //FIXME: The truncable part might be shorter than needed here
}
};
/* INIT */
process.on('exit', Temp.purgeSyncAll); // Ensuring purgeable temp files are purged on exit
/* EXPORT */
exports.default = Temp;

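The Temp module above creates temp files next to the target, with a `.tmp-` suffix built from the timestamp and some randomness, so the final `rename()` stays on the same filesystem. A sketch of just the naming scheme (`createTempPath` is an illustrative name, not the module's export):

```javascript
// Sketch of the temp-path naming scheme used above: the temp file lives next to
// the target so the final rename() stays on the same device (atomic on POSIX).
const createTempPath = (filePath) => {
  const randomness = `000000${Math.floor(Math.random() * 16777215).toString(16)}`.slice(-6); // 6 hex chars
  const timestamp = Date.now().toString().slice(-10); // last 10 digits of the ms timestamp
  return `${filePath}.tmp-${timestamp}${randomness}`;
};

const tempPath = createTempPath('/data/config.json');
console.log(tempPath); // e.g. /data/config.json.tmp-1718000000abc123
```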
51
spa/node_modules/atomically/package.json generated vendored Executable file
@@ -0,0 +1,51 @@
{
"name": "atomically",
"description": "Read and write files atomically and reliably.",
"version": "1.7.0",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"scripts": {
"benchmark": "node ./tasks/benchmark.js",
"clean": "rimraf dist",
"compile": "tsc --skipLibCheck && tstei",
"compile:watch": "tsc --skipLibCheck --watch",
"test": "tap --no-coverage-report",
"test:watch": "tap --no-coverage-report --watch",
"prepublishOnly": "npm run clean && npm run compile && npm run test"
},
"bugs": {
"url": "https://github.com/fabiospampinato/atomically/issues"
},
"license": "MIT",
"author": {
"name": "Fabio Spampinato",
"email": "spampinabio@gmail.com"
},
"repository": {
"type": "git",
"url": "https://github.com/fabiospampinato/atomically.git"
},
"keywords": [
"atomic",
"read",
"write",
"file",
"reliable"
],
"engines": {
"node": ">=10.12.0"
},
"dependencies": {},
"devDependencies": {
"@types/node": "^12.7.2",
"lodash": "^4.17.19",
"mkdirp": "^1.0.4",
"promise-resolve-timeout": "^1.2.1",
"require-inject": "^1.4.4",
"rimraf": "^3.0.2",
"tap": "^14.10.7",
"typescript": "^3.5.3",
"typescript-transform-export-interop": "^1.0.2",
"write-file-atomic": "^3.0.3"
}
}

30
spa/node_modules/atomically/src/consts.ts generated vendored Normal file
@@ -0,0 +1,30 @@
/* CONSTS */
const DEFAULT_ENCODING = 'utf8';
const DEFAULT_FILE_MODE = 0o666;
const DEFAULT_FOLDER_MODE = 0o777;
const DEFAULT_READ_OPTIONS = {};
const DEFAULT_WRITE_OPTIONS = {};
const DEFAULT_TIMEOUT_ASYNC = 5000;
const DEFAULT_TIMEOUT_SYNC = 100;
const IS_POSIX = !!process.getuid;
const IS_USER_ROOT = process.getuid ? !process.getuid () : false;
const LIMIT_BASENAME_LENGTH = 128; //TODO: fetch the real limit from the filesystem //TODO: fetch the whole-path length limit too
const LIMIT_FILES_DESCRIPTORS = 10000; //TODO: fetch the real limit from the filesystem
const NOOP = () => {};
/* EXPORT */
export {DEFAULT_ENCODING, DEFAULT_FILE_MODE, DEFAULT_FOLDER_MODE, DEFAULT_READ_OPTIONS, DEFAULT_WRITE_OPTIONS, DEFAULT_TIMEOUT_ASYNC, DEFAULT_TIMEOUT_SYNC, IS_POSIX, IS_USER_ROOT, LIMIT_BASENAME_LENGTH, LIMIT_FILES_DESCRIPTORS, NOOP};

270
spa/node_modules/atomically/src/index.ts generated vendored Executable file
@@ -0,0 +1,270 @@
/* IMPORT */
import * as path from 'path';
import {DEFAULT_ENCODING, DEFAULT_FILE_MODE, DEFAULT_FOLDER_MODE, DEFAULT_READ_OPTIONS, DEFAULT_WRITE_OPTIONS, DEFAULT_TIMEOUT_ASYNC, DEFAULT_TIMEOUT_SYNC, IS_POSIX} from './consts';
import FS from './utils/fs';
import Lang from './utils/lang';
import Scheduler from './utils/scheduler';
import Temp from './utils/temp';
import {Callback, Data, Disposer, Path, ReadOptions, WriteOptions} from './types';
/* ATOMICALLY */
function readFile ( filePath: Path, options: string | ReadOptions & { encoding: string } ): Promise<string>;
function readFile ( filePath: Path, options?: ReadOptions ): Promise<Buffer>;
function readFile ( filePath: Path, options: string | ReadOptions = DEFAULT_READ_OPTIONS ): Promise<Buffer | string> {
if ( Lang.isString ( options ) ) return readFile ( filePath, { encoding: options } );
const timeout = Date.now () + ( options.timeout ?? DEFAULT_TIMEOUT_ASYNC );
return FS.readFileRetry ( timeout )( filePath, options );
};
function readFileSync ( filePath: Path, options: string | ReadOptions & { encoding: string } ): string;
function readFileSync ( filePath: Path, options?: ReadOptions ): Buffer;
function readFileSync ( filePath: Path, options: string | ReadOptions = DEFAULT_READ_OPTIONS ): Buffer | string {
if ( Lang.isString ( options ) ) return readFileSync ( filePath, { encoding: options } );
const timeout = Date.now () + ( options.timeout ?? DEFAULT_TIMEOUT_SYNC );
return FS.readFileSyncRetry ( timeout )( filePath, options );
};
const writeFile = ( filePath: Path, data: Data, options?: string | WriteOptions | Callback, callback?: Callback ): Promise<void> => {
if ( Lang.isFunction ( options ) ) return writeFile ( filePath, data, DEFAULT_WRITE_OPTIONS, options );
const promise = writeFileAsync ( filePath, data, options );
if ( callback ) promise.then ( callback, callback );
return promise;
};
const writeFileAsync = async ( filePath: Path, data: Data, options: string | WriteOptions = DEFAULT_WRITE_OPTIONS ): Promise<void> => {
if ( Lang.isString ( options ) ) return writeFileAsync ( filePath, data, { encoding: options } );
const timeout = Date.now () + ( options.timeout ?? DEFAULT_TIMEOUT_ASYNC );
let schedulerCustomDisposer: Disposer | null = null,
schedulerDisposer: Disposer | null = null,
tempDisposer: Disposer | null = null,
tempPath: string | null = null,
fd: number | null = null;
try {
if ( options.schedule ) schedulerCustomDisposer = await options.schedule ( filePath );
schedulerDisposer = await Scheduler.schedule ( filePath );
filePath = await FS.realpathAttempt ( filePath ) || filePath;
[tempPath, tempDisposer] = Temp.get ( filePath, options.tmpCreate || Temp.create, !( options.tmpPurge === false ) );
const useStatChown = IS_POSIX && Lang.isUndefined ( options.chown ),
useStatMode = Lang.isUndefined ( options.mode );
if ( useStatChown || useStatMode ) {
const stat = await FS.statAttempt ( filePath );
if ( stat ) {
options = { ...options };
if ( useStatChown ) options.chown = { uid: stat.uid, gid: stat.gid };
if ( useStatMode ) options.mode = stat.mode;
}
}
const parentPath = path.dirname ( filePath );
await FS.mkdirAttempt ( parentPath, {
mode: DEFAULT_FOLDER_MODE,
recursive: true
});
fd = await FS.openRetry ( timeout )( tempPath, 'w', options.mode || DEFAULT_FILE_MODE );
if ( options.tmpCreated ) options.tmpCreated ( tempPath );
if ( Lang.isString ( data ) ) {
await FS.writeRetry ( timeout )( fd, data, 0, options.encoding || DEFAULT_ENCODING );
} else if ( !Lang.isUndefined ( data ) ) {
await FS.writeRetry ( timeout )( fd, data, 0, data.length, 0 );
}
if ( options.fsync !== false ) {
if ( options.fsyncWait !== false ) {
await FS.fsyncRetry ( timeout )( fd );
} else {
FS.fsyncAttempt ( fd );
}
}
await FS.closeRetry ( timeout )( fd );
fd = null;
if ( options.chown ) await FS.chownAttempt ( tempPath, options.chown.uid, options.chown.gid );
if ( options.mode ) await FS.chmodAttempt ( tempPath, options.mode );
try {
await FS.renameRetry ( timeout )( tempPath, filePath );
} catch ( error ) {
if ( error.code !== 'ENAMETOOLONG' ) throw error;
await FS.renameRetry ( timeout )( tempPath, Temp.truncate ( filePath ) );
}
tempDisposer ();
tempPath = null;
} finally {
if ( fd ) await FS.closeAttempt ( fd );
if ( tempPath ) Temp.purge ( tempPath );
if ( schedulerCustomDisposer ) schedulerCustomDisposer ();
if ( schedulerDisposer ) schedulerDisposer ();
}
};
const writeFileSync = ( filePath: Path, data: Data, options: string | WriteOptions = DEFAULT_WRITE_OPTIONS ): void => {
if ( Lang.isString ( options ) ) return writeFileSync ( filePath, data, { encoding: options } );
const timeout = Date.now () + ( options.timeout ?? DEFAULT_TIMEOUT_SYNC );
let tempDisposer: Disposer | null = null,
tempPath: string | null = null,
fd: number | null = null;
try {
filePath = FS.realpathSyncAttempt ( filePath ) || filePath;
[tempPath, tempDisposer] = Temp.get ( filePath, options.tmpCreate || Temp.create, !( options.tmpPurge === false ) );
const useStatChown = IS_POSIX && Lang.isUndefined ( options.chown ),
useStatMode = Lang.isUndefined ( options.mode );
if ( useStatChown || useStatMode ) {
const stat = FS.statSyncAttempt ( filePath );
if ( stat ) {
options = { ...options };
if ( useStatChown ) options.chown = { uid: stat.uid, gid: stat.gid };
if ( useStatMode ) options.mode = stat.mode;
}
}
const parentPath = path.dirname ( filePath );
FS.mkdirSyncAttempt ( parentPath, {
mode: DEFAULT_FOLDER_MODE,
recursive: true
});
fd = FS.openSyncRetry ( timeout )( tempPath, 'w', options.mode || DEFAULT_FILE_MODE );
if ( options.tmpCreated ) options.tmpCreated ( tempPath );
if ( Lang.isString ( data ) ) {
FS.writeSyncRetry ( timeout )( fd, data, 0, options.encoding || DEFAULT_ENCODING );
} else if ( !Lang.isUndefined ( data ) ) {
FS.writeSyncRetry ( timeout )( fd, data, 0, data.length, 0 );
}
if ( options.fsync !== false ) {
if ( options.fsyncWait !== false ) {
FS.fsyncSyncRetry ( timeout )( fd );
} else {
FS.fsyncAttempt ( fd );
}
}
FS.closeSyncRetry ( timeout )( fd );
fd = null;
if ( options.chown ) FS.chownSyncAttempt ( tempPath, options.chown.uid, options.chown.gid );
if ( options.mode ) FS.chmodSyncAttempt ( tempPath, options.mode );
try {
FS.renameSyncRetry ( timeout )( tempPath, filePath );
} catch ( error ) {
if ( error.code !== 'ENAMETOOLONG' ) throw error;
FS.renameSyncRetry ( timeout )( tempPath, Temp.truncate ( filePath ) );
}
tempDisposer ();
tempPath = null;
} finally {
if ( fd ) FS.closeSyncAttempt ( fd );
if ( tempPath ) Temp.purge ( tempPath );
}
};
/* EXPORT */
export {readFile, readFileSync, writeFile, writeFileSync};

37
spa/node_modules/atomically/src/types.ts generated vendored Normal file
@@ -0,0 +1,37 @@
/* TYPES */
type Callback = ( error: Exception | void ) => any;
type Data = Buffer | string | undefined;
type Disposer = () => void;
type Exception = NodeJS.ErrnoException;
type FN<Arguments extends any[] = any[], Return = any> = ( ...args: Arguments ) => Return;
type Path = string;
type ReadOptions = {
encoding?: string | null,
mode?: string | number | false,
timeout?: number
};
type WriteOptions = {
chown?: { gid: number, uid: number } | false,
encoding?: string | null,
fsync?: boolean,
fsyncWait?: boolean,
mode?: string | number | false,
schedule?: ( filePath: string ) => Promise<Disposer>,
timeout?: number,
tmpCreate?: ( filePath: string ) => string,
tmpCreated?: ( filePath: string ) => any,
tmpPurge?: boolean
};
/* EXPORT */
export {Callback, Data, Disposer, Exception, FN, Path, ReadOptions, WriteOptions};

42
spa/node_modules/atomically/src/utils/attemptify.ts generated vendored Normal file
@@ -0,0 +1,42 @@
/* IMPORT */
import {NOOP} from '../consts';
import {Exception, FN} from '../types';
/* ATTEMPTIFY */
//TODO: Maybe publish this as a standalone package
//FIXME: The type castings here aren't exactly correct
const attemptifyAsync = <T extends FN> ( fn: T, onError: FN<[Exception]> = NOOP ): T => {
return function () {
return fn.apply ( undefined, arguments ).catch ( onError );
} as T;
};
const attemptifySync = <T extends FN> ( fn: T, onError: FN<[Exception]> = NOOP ): T => {
return function () {
try {
return fn.apply ( undefined, arguments );
} catch ( error ) {
return onError ( error );
}
} as T;
};
/* EXPORT */
export {attemptifyAsync, attemptifySync};

51
spa/node_modules/atomically/src/utils/fs.ts generated vendored Normal file
@@ -0,0 +1,51 @@
/* IMPORT */
import * as fs from 'fs';
import {promisify} from 'util';
import {attemptifyAsync, attemptifySync} from './attemptify';
import Handlers from './fs_handlers';
import {retryifyAsync, retryifySync} from './retryify';
/* FS */
const FS = {
chmodAttempt: attemptifyAsync ( promisify ( fs.chmod ), Handlers.onChangeError ),
chownAttempt: attemptifyAsync ( promisify ( fs.chown ), Handlers.onChangeError ),
closeAttempt: attemptifyAsync ( promisify ( fs.close ) ),
fsyncAttempt: attemptifyAsync ( promisify ( fs.fsync ) ),
mkdirAttempt: attemptifyAsync ( promisify ( fs.mkdir ) ),
realpathAttempt: attemptifyAsync ( promisify ( fs.realpath ) ),
statAttempt: attemptifyAsync ( promisify ( fs.stat ) ),
unlinkAttempt: attemptifyAsync ( promisify ( fs.unlink ) ),
closeRetry: retryifyAsync ( promisify ( fs.close ), Handlers.isRetriableError ),
fsyncRetry: retryifyAsync ( promisify ( fs.fsync ), Handlers.isRetriableError ),
openRetry: retryifyAsync ( promisify ( fs.open ), Handlers.isRetriableError ),
readFileRetry: retryifyAsync ( promisify ( fs.readFile ), Handlers.isRetriableError ),
renameRetry: retryifyAsync ( promisify ( fs.rename ), Handlers.isRetriableError ),
statRetry: retryifyAsync ( promisify ( fs.stat ), Handlers.isRetriableError ),
writeRetry: retryifyAsync ( promisify ( fs.write ), Handlers.isRetriableError ),
chmodSyncAttempt: attemptifySync ( fs.chmodSync, Handlers.onChangeError ),
chownSyncAttempt: attemptifySync ( fs.chownSync, Handlers.onChangeError ),
closeSyncAttempt: attemptifySync ( fs.closeSync ),
mkdirSyncAttempt: attemptifySync ( fs.mkdirSync ),
realpathSyncAttempt: attemptifySync ( fs.realpathSync ),
statSyncAttempt: attemptifySync ( fs.statSync ),
unlinkSyncAttempt: attemptifySync ( fs.unlinkSync ),
closeSyncRetry: retryifySync ( fs.closeSync, Handlers.isRetriableError ),
fsyncSyncRetry: retryifySync ( fs.fsyncSync, Handlers.isRetriableError ),
openSyncRetry: retryifySync ( fs.openSync, Handlers.isRetriableError ),
readFileSyncRetry: retryifySync ( fs.readFileSync, Handlers.isRetriableError ),
renameSyncRetry: retryifySync ( fs.renameSync, Handlers.isRetriableError ),
statSyncRetry: retryifySync ( fs.statSync, Handlers.isRetriableError ),
writeSyncRetry: retryifySync ( fs.writeSync, Handlers.isRetriableError )
};
/* EXPORT */
export default FS;

45
spa/node_modules/atomically/src/utils/fs_handlers.ts generated vendored Normal file
@@ -0,0 +1,45 @@
/* IMPORT */
import {IS_USER_ROOT} from '../consts';
import {Exception} from '../types';
/* FS HANDLERS */
const Handlers = {
isChangeErrorOk: ( error: Exception ): boolean => { //URL: https://github.com/isaacs/node-graceful-fs/blob/master/polyfills.js#L315-L342
const {code} = error;
if ( code === 'ENOSYS' ) return true;
if ( !IS_USER_ROOT && ( code === 'EINVAL' || code === 'EPERM' ) ) return true;
return false;
},
isRetriableError: ( error: Exception ): boolean => {
const {code} = error;
    if ( code === 'EMFILE' || code === 'ENFILE' || code === 'EAGAIN' || code === 'EBUSY' || code === 'EACCESS' || code === 'EACCES' || code === 'EPERM' ) return true;
return false;
},
onChangeError: ( error: Exception ): void => {
if ( Handlers.isChangeErrorOk ( error ) ) return;
throw error;
}
};
/* EXPORT */
export default Handlers;

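The handlers above classify transient errno codes (file-descriptor exhaustion, busy files, permission races) as worth retrying, while everything else propagates immediately. A tiny sketch of that check (this sketch uses the standard `EACCES` spelling and a reduced set of codes, not the exact list from the source):

```javascript
// Sketch of the retriable-error check above: transient codes like EMFILE
// (too many open file descriptors) are worth retrying, others are not.
const RETRIABLE_CODES = new Set(['EMFILE', 'ENFILE', 'EAGAIN', 'EBUSY', 'EACCES', 'EPERM']);

const isRetriableError = (error) => RETRIABLE_CODES.has(error.code);

const emfile = Object.assign(new Error('EMFILE'), { code: 'EMFILE' });
const enoent = Object.assign(new Error('ENOENT'), { code: 'ENOENT' });
console.log(isRetriableError(emfile)); // true
console.log(isRetriableError(enoent)); // false — a missing file won't fix itself
```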
28
spa/node_modules/atomically/src/utils/lang.ts generated vendored Normal file
@@ -0,0 +1,28 @@
/* LANG */
const Lang = {
isFunction: ( x: any ): x is Function => {
return typeof x === 'function';
},
isString: ( x: any ): x is string => {
return typeof x === 'string';
},
isUndefined: ( x: any ): x is undefined => {
return typeof x === 'undefined';
}
};
/* EXPORT */
export default Lang;

78
spa/node_modules/atomically/src/utils/retryify.ts generated vendored Normal file
@@ -0,0 +1,78 @@
/* IMPORT */
import {Exception, FN} from '../types';
import RetryfyQueue from './retryify_queue';
/* RETRYIFY */
const retryifyAsync = <T extends FN> ( fn: T, isRetriableError: FN<[Exception], boolean | void> ): FN<[number], T> => {
return function ( timestamp: number ) {
return function attempt () {
return RetryfyQueue.schedule ().then ( cleanup => {
return fn.apply ( undefined, arguments ).then ( result => {
cleanup ();
return result;
}, error => {
cleanup ();
if ( Date.now () >= timestamp ) throw error;
if ( isRetriableError ( error ) ) {
const delay = Math.round ( 100 + ( 400 * Math.random () ) ),
delayPromise = new Promise ( resolve => setTimeout ( resolve, delay ) );
return delayPromise.then ( () => attempt.apply ( undefined, arguments ) );
}
throw error;
});
});
} as T;
};
};
const retryifySync = <T extends FN> ( fn: T, isRetriableError: FN<[Exception], boolean | void> ): FN<[number], T> => {
return function ( timestamp: number ) {
return function attempt () {
try {
return fn.apply ( undefined, arguments );
} catch ( error ) {
if ( Date.now () > timestamp ) throw error;
if ( isRetriableError ( error ) ) return attempt.apply ( undefined, arguments );
throw error;
}
} as T;
};
};
/* EXPORT */
export {retryifyAsync, retryifySync};

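`retryifySync` above wraps a function so that retriable failures are retried in a tight loop until an absolute deadline passes. A standalone, self-contained sketch of the same pattern (the `flaky` function and its error code are made up for illustration):

```javascript
// Standalone sketch of the retryifySync pattern above: retry a throwing
// function until it succeeds or a deadline (absolute timestamp) passes.
const retryifySync = (fn, isRetriableError) => (timestamp) => {
  const attempt = (...args) => {
    try {
      return fn(...args);
    } catch (error) {
      if (Date.now() > timestamp) throw error; // Deadline passed, give up
      if (isRetriableError(error)) return attempt(...args);
      throw error; // Non-retriable errors propagate immediately
    }
  };
  return attempt;
};

// Usage: a function that fails twice with a retriable error, then succeeds.
let calls = 0;
const flaky = () => {
  calls += 1;
  if (calls < 3) throw Object.assign(new Error('EBUSY'), { code: 'EBUSY' });
  return 'ok';
};

const result = retryifySync(flaky, (error) => error.code === 'EBUSY')(Date.now() + 1000)();
console.log(result, calls); // ok 3
```

The async variant works the same way, except each retry is delayed by a randomized backoff and throttled through the RetryfyQueue.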
@@ -0,0 +1,95 @@
/* IMPORT */
import {LIMIT_FILES_DESCRIPTORS} from '../consts';
/* RETRYIFY QUEUE */
const RetryfyQueue = {
interval: 25,
intervalId: <NodeJS.Timeout | undefined> undefined,
limit: LIMIT_FILES_DESCRIPTORS,
queueActive: new Set<Function> (),
queueWaiting: new Set<Function> (),
init: (): void => {
if ( RetryfyQueue.intervalId ) return;
RetryfyQueue.intervalId = setInterval ( RetryfyQueue.tick, RetryfyQueue.interval );
},
reset: (): void => {
if ( !RetryfyQueue.intervalId ) return;
clearInterval ( RetryfyQueue.intervalId );
delete RetryfyQueue.intervalId;
},
add: ( fn: Function ): void => {
RetryfyQueue.queueWaiting.add ( fn );
    if ( RetryfyQueue.queueActive.size < ( RetryfyQueue.limit / 2 ) ) { // Active queue not under pressure, executing immediately
RetryfyQueue.tick ();
} else {
RetryfyQueue.init ();
}
},
remove: ( fn: Function ): void => {
RetryfyQueue.queueWaiting.delete ( fn );
RetryfyQueue.queueActive.delete ( fn );
},
schedule: (): Promise<Function> => {
return new Promise ( resolve => {
const cleanup = () => RetryfyQueue.remove ( resolver );
const resolver = () => resolve ( cleanup );
RetryfyQueue.add ( resolver );
});
},
tick: (): void => {
if ( RetryfyQueue.queueActive.size >= RetryfyQueue.limit ) return;
if ( !RetryfyQueue.queueWaiting.size ) return RetryfyQueue.reset ();
for ( const fn of RetryfyQueue.queueWaiting ) {
if ( RetryfyQueue.queueActive.size >= RetryfyQueue.limit ) break;
RetryfyQueue.queueWaiting.delete ( fn );
RetryfyQueue.queueActive.add ( fn );
fn ();
}
}
};
/* EXPORT */
export default RetryfyQueue;

60
spa/node_modules/atomically/src/utils/scheduler.ts generated vendored Normal file
@@ -0,0 +1,60 @@
/* IMPORT */
import {Disposer} from '../types';
/* VARIABLES */
const Queues: Record<string, Function[] | undefined> = {};
/* SCHEDULER */
//TODO: Maybe publish this as a standalone package
const Scheduler = {
next: ( id: string ): void => {
const queue = Queues[id];
if ( !queue ) return;
queue.shift ();
const job = queue[0];
if ( job ) {
job ( () => Scheduler.next ( id ) );
} else {
delete Queues[id];
}
},
schedule: ( id: string ): Promise<Disposer> => {
return new Promise ( resolve => {
let queue = Queues[id];
if ( !queue ) queue = Queues[id] = [];
queue.push ( resolve );
if ( queue.length > 1 ) return;
resolve ( () => Scheduler.next ( id ) );
});
}
};
/* EXPORT */
export default Scheduler;

97
spa/node_modules/atomically/src/utils/temp.ts generated vendored Normal file
@@ -0,0 +1,97 @@
/* IMPORT */
import * as path from 'path';
import {LIMIT_BASENAME_LENGTH} from '../consts';
import {Disposer} from '../types';
import FS from './fs';
/* TEMP */
//TODO: Maybe publish this as a standalone package
const Temp = {
store: <Record<string, boolean>> {}, // filePath => purge
create: ( filePath: string ): string => {
const randomness = `000000${Math.floor ( Math.random () * 16777215 ).toString ( 16 )}`.slice ( -6 ), // 6 random-enough hex characters
timestamp = Date.now ().toString ().slice ( -10 ), // 10 precise timestamp digits
prefix = 'tmp-',
suffix = `.${prefix}${timestamp}${randomness}`,
tempPath = `${filePath}${suffix}`;
return tempPath;
},
get: ( filePath: string, creator: ( filePath: string ) => string, purge: boolean = true ): [string, Disposer] => {
const tempPath = Temp.truncate ( creator ( filePath ) );
if ( tempPath in Temp.store ) return Temp.get ( filePath, creator, purge ); // Collision found, try again
Temp.store[tempPath] = purge;
const disposer = () => delete Temp.store[tempPath];
return [tempPath, disposer];
},
purge: ( filePath: string ): void => {
if ( !Temp.store[filePath] ) return;
delete Temp.store[filePath];
FS.unlinkAttempt ( filePath );
},
purgeSync: ( filePath: string ): void => {
if ( !Temp.store[filePath] ) return;
delete Temp.store[filePath];
FS.unlinkSyncAttempt ( filePath );
},
purgeSyncAll: (): void => {
for ( const filePath in Temp.store ) {
Temp.purgeSync ( filePath );
}
},
truncate: ( filePath: string ): string => { // Truncating paths to avoid getting an "ENAMETOOLONG" error //FIXME: This doesn't really always work, the actual filesystem limits must be detected for this to be implemented correctly
const basename = path.basename ( filePath );
if ( basename.length <= LIMIT_BASENAME_LENGTH ) return filePath; //FIXME: Rough and quick attempt at detecting ok lengths
const truncable = /^(\.?)(.*?)((?:\.[^.]+)?(?:\.tmp-\d{10}[a-f0-9]{6})?)$/.exec ( basename );
if ( !truncable ) return filePath; //FIXME: No truncable part detected, can't really do much without also changing the parent path, which is unsafe, hoping for the best here
const truncationLength = basename.length - LIMIT_BASENAME_LENGTH;
return `${filePath.slice ( 0, - basename.length )}${truncable[1]}${truncable[2].slice ( 0, - truncationLength )}${truncable[3]}`; //FIXME: The truncable part might be shorter than needed here
}
};
/* INIT */
process.on ( 'exit', Temp.purgeSyncAll ); // Ensuring purgeable temp files are purged on exit
/* EXPORT */
export default Temp;

72
spa/node_modules/atomically/tasks/benchmark.js generated vendored Normal file
@@ -0,0 +1,72 @@
/* IMPORT */
const fs = require ( 'fs' ),
os = require ( 'os' ),
path = require ( 'path' ),
delay = require ( 'promise-resolve-timeout' ),
writeFileAtomic = require ( 'write-file-atomic' ),
{writeFile, writeFileSync} = require ( '../dist' );
/* BENCHMARK */
const TEMP = os.tmpdir (),
DST = i => path.join ( TEMP, `atomically-temp-${i}.txt` ),
ITERATIONS = 250;
const runSingleAsync = async ( name, fn, buffer, options ) => {
console.time ( name );
for ( let i = 0; i < ITERATIONS; i++ ) {
await fn ( DST ( i ), buffer, options );
}
console.timeEnd ( name );
await delay ( 1000 );
};
const runSingleSync = async ( name, fn, buffer, options ) => {
console.time ( name );
for ( let i = 0; i < ITERATIONS; i++ ) {
fn ( DST ( i ), buffer, options );
}
console.timeEnd ( name );
await delay ( 1000 );
};
const runAllDummy = () => { // Preparation run
runSingleSync ( 'dummy', fs.writeFileSync, '' );
};
const runAllAsync = async ( name, buffer ) => {
await runSingleAsync ( `${name} -> async -> write-file-atomic`, writeFileAtomic, buffer );
await runSingleAsync ( `${name} -> async -> write-file-atomic (fastest)`, writeFileAtomic, buffer, { fsync: false } );
await runSingleAsync ( `${name} -> async -> atomically`, writeFile, buffer );
await runSingleAsync ( `${name} -> async -> atomically (faster)`, writeFile, buffer, { mode: false, chown: false, fsyncWait: false } );
await runSingleAsync ( `${name} -> async -> atomically (fastest)`, writeFile, buffer, { mode: false, chown: false, fsync: false } );
};
const runAllSync = ( name, buffer ) => {
runSingleSync ( `${name} -> sync -> write-file-atomic`, writeFileAtomic.sync, buffer );
runSingleSync ( `${name} -> sync -> write-file-atomic (fastest)`, writeFileAtomic.sync, buffer, { fsync: false } );
runSingleSync ( `${name} -> sync -> atomically`, writeFileSync, buffer );
runSingleSync ( `${name} -> sync -> atomically (faster)`, writeFileSync, buffer, { mode: false, chown: false, fsyncWait: false } );
runSingleSync ( `${name} -> sync -> atomically (fastest)`, writeFileSync, buffer, { mode: false, chown: false, fsync: false } );
};
const runAll = async ( name, buffer ) => {
await runAllAsync ( name, buffer );
console.log ( '-------------------' );
runAllSync ( name, buffer );
};
const run = async () => {
runAllDummy ();
console.log ( '===================' );
await runAll ( '100kb', Buffer.allocUnsafe ( 100 * 1024 ) );
console.log ( '===================' );
await runAll ( '10kb', Buffer.allocUnsafe ( 10 * 1024 ) );
console.log ( '===================' );
await runAll ( '1kb', Buffer.allocUnsafe ( 1024 ) );
console.log ( '===================' );
};
run ();

510
spa/node_modules/atomically/test/basic.js generated vendored Executable file
@@ -0,0 +1,510 @@
'use strict'
process.setMaxListeners(1000000);
const _ = require('lodash')
const fs = require('fs')
const os = require('os')
const path = require('path')
const {test} = require('tap')
const requireInject = require('require-inject')
let expectClose = 0
let closeCalled = 0
let expectCloseSync = 0
let closeSyncCalled = 0
const createErr = code => Object.assign(new Error(code), { code })
let unlinked = []
const fsMock = Object.assign ( {}, fs, {
/* ASYNC */
mkdir (filename, opts, cb) {
return cb(null);
},
realpath (filename, cb) {
return cb(null, filename)
},
open (tmpfile, options, mode, cb) {
if (/noopen/.test(tmpfile)) return cb(createErr('ENOOPEN'))
expectClose++
cb(null, tmpfile)
},
write (fd) {
const cb = arguments[arguments.length - 1]
if (/nowrite/.test(fd)) return cb(createErr('ENOWRITE'))
cb()
},
fsync (fd, cb) {
if (/nofsync/.test(fd)) return cb(createErr('ENOFSYNC'))
cb()
},
close (fd, cb) {
closeCalled++
cb()
},
chown (tmpfile, uid, gid, cb) {
if (/nochown/.test(tmpfile)) return cb(createErr('ENOCHOWN'))
if (/enosys/.test(tmpfile)) return cb(createErr('ENOSYS'))
if (/einval/.test(tmpfile)) return cb(createErr('EINVAL'))
if (/eperm/.test(tmpfile)) return cb(createErr('EPERM'))
cb()
},
chmod (tmpfile, mode, cb) {
if (/nochmod/.test(tmpfile)) return cb(createErr('ENOCHMOD'))
if (/enosys/.test(tmpfile)) return cb(createErr('ENOSYS'))
if (/eperm/.test(tmpfile)) return cb(createErr('EPERM'))
if (/einval/.test(tmpfile)) return cb(createErr('EINVAL'))
cb()
},
rename (tmpfile, filename, cb) {
if (/norename/.test(tmpfile)) return cb(createErr('ENORENAME'))
cb()
},
unlink (tmpfile, cb) {
if (/nounlink/.test(tmpfile)) return cb(createErr('ENOUNLINK'))
cb()
},
stat (tmpfile, cb) {
if (/nostat/.test(tmpfile)) return cb(createErr('ENOSTAT'))
if (/statful/.test(tmpfile)) return cb(null, fs.statSync('/'));
cb()
},
/* SYNC */
mkdirSync (filename) {},
realpathSync (filename, cb) {
return filename
},
openSync (tmpfile, options) {
if (/noopen/.test(tmpfile)) throw createErr('ENOOPEN')
expectCloseSync++
return tmpfile
},
writeSync (fd) {
if (/nowrite/.test(fd)) throw createErr('ENOWRITE')
},
fsyncSync (fd) {
if (/nofsync/.test(fd)) throw createErr('ENOFSYNC')
},
closeSync (fd) {
closeSyncCalled++
},
chownSync (tmpfile, uid, gid) {
if (/nochown/.test(tmpfile)) throw createErr('ENOCHOWN')
if (/enosys/.test(tmpfile)) throw createErr('ENOSYS')
if (/einval/.test(tmpfile)) throw createErr('EINVAL')
if (/eperm/.test(tmpfile)) throw createErr('EPERM')
},
chmodSync (tmpfile, mode) {
if (/nochmod/.test(tmpfile)) throw createErr('ENOCHMOD')
if (/enosys/.test(tmpfile)) throw createErr('ENOSYS')
if (/einval/.test(tmpfile)) throw createErr('EINVAL')
if (/eperm/.test(tmpfile)) throw createErr('EPERM')
},
renameSync (tmpfile, filename) {
if (/norename/.test(tmpfile)) throw createErr('ENORENAME')
},
unlinkSync (tmpfile) {
if (/nounlink/.test(tmpfile)) throw createErr('ENOUNLINK')
unlinked.push(tmpfile)
},
statSync (tmpfile) {
if (/nostat/.test(tmpfile)) throw createErr('ENOSTAT')
if (/statful/.test(tmpfile)) return fs.statSync('/');
}
});
const makeUnstableAsyncFn = function () {
return function () {
if ( Math.random () <= .75 ) {
const code = _.shuffle ([ 'EMFILE', 'ENFILE', 'EAGAIN', 'EBUSY', 'EACCESS', 'EPERM' ])[0];
throw createErr ( code );
}
return arguments[arguments.length -1](null, arguments[0]);
};
};
const makeUnstableSyncFn = function ( fn ) {
return function () {
if ( Math.random () <= .75 ) {
const code = _.shuffle ([ 'EMFILE', 'ENFILE', 'EAGAIN', 'EBUSY', 'EACCESS', 'EPERM' ])[0];
throw createErr ( code );
}
return fn.apply(undefined, arguments)
};
};
const fsMockUnstable = Object.assign ( {}, fsMock, {
open: makeUnstableAsyncFn (),
write: makeUnstableAsyncFn (),
fsync: makeUnstableAsyncFn (),
close: makeUnstableAsyncFn (),
rename: makeUnstableAsyncFn (),
openSync: makeUnstableSyncFn ( _.identity ),
writeSync: makeUnstableSyncFn ( _.noop ),
fsyncSync: makeUnstableSyncFn ( _.noop ),
closeSync: makeUnstableSyncFn ( _.noop ),
renameSync: makeUnstableSyncFn ( _.noop )
});
const {writeFile: writeFileAtomic, writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs: fsMock });
test('async tests', t => {
t.plan(2)
expectClose = 0
closeCalled = 0
t.teardown(() => {
t.parent.equal(closeCalled, expectClose, 'async tests closed all files')
expectClose = 0
closeCalled = 0
})
t.test('non-root tests', t => {
t.plan(28)
writeFileAtomic('good', 'test', { mode: '0777' }, err => {
t.notOk(err, 'No errors occur when passing in options')
})
writeFileAtomic('good', 'test', 'utf8', err => {
t.notOk(err, 'No errors occur when passing in options as string')
})
writeFileAtomic('good', 'test', undefined, err => {
t.notOk(err, 'No errors occur when NOT passing in options')
})
writeFileAtomic('good', 'test', err => {
t.notOk(err)
})
writeFileAtomic('noopen', 'test', err => {
t.is(err && err.message, 'ENOOPEN', 'fs.open failures propagate')
})
writeFileAtomic('nowrite', 'test', err => {
t.is(err && err.message, 'ENOWRITE', 'fs.write failures propagate')
})
writeFileAtomic('nowrite', Buffer.from('test', 'utf8'), err => {
t.is(err && err.message, 'ENOWRITE', 'fs.write failures propagate for buffers')
})
writeFileAtomic('nochown', 'test', { chown: { uid: 100, gid: 100 } }, err => {
t.is(err && err.message, 'ENOCHOWN', 'Chown failures propagate')
})
writeFileAtomic('nochown', 'test', err => {
t.notOk(err, 'No attempt to chown when no uid/gid passed in')
})
writeFileAtomic('nochmod', 'test', { mode: parseInt('741', 8) }, err => {
t.is(err && err.message, 'ENOCHMOD', 'Chmod failures propagate')
})
writeFileAtomic('nofsyncopt', 'test', { fsync: false }, err => {
t.notOk(err, 'fsync skipped if options.fsync is false')
})
writeFileAtomic('norename', 'test', err => {
t.is(err && err.message, 'ENORENAME', 'Rename errors propagate')
})
writeFileAtomic('norename nounlink', 'test', err => {
t.is(err && err.message, 'ENORENAME', 'Failure to unlink the temp file does not clobber the original error')
})
writeFileAtomic('nofsync', 'test', err => {
t.is(err && err.message, 'ENOFSYNC', 'Fsync failures propagate')
})
writeFileAtomic('enosys', 'test', err => {
t.notOk(err, 'No errors on ENOSYS')
})
writeFileAtomic('einval', 'test', { mode: 0o741 }, err => {
t.notOk(err, 'No errors on EINVAL for non root')
})
writeFileAtomic('eperm', 'test', { mode: 0o741 }, err => {
t.notOk(err, 'No errors on EPERM for non root')
})
writeFileAtomic('einval', 'test', { chown: { uid: 100, gid: 100 } }, err => {
t.notOk(err, 'No errors on EINVAL for non root')
})
writeFileAtomic('eperm', 'test', { chown: { uid: 100, gid: 100 } }, err => {
t.notOk(err, 'No errors on EPERM for non root')
})
const optionsImmutable = {};
writeFileAtomic('statful', 'test', optionsImmutable, err => {
t.notOk(err);
t.deepEquals(optionsImmutable, {});
});
const schedule = filePath => {
t.is(filePath, 'good');
return new Promise ( resolve => {
resolve ( () => {
t.pass('schedule resolver called');
});
});
};
writeFileAtomic('good','test', {schedule}, err => {
t.notOk(err);
});
const tmpCreate = filePath => `.${filePath}.custom`;
const tmpCreated = filePath => t.is ( filePath, '.good.custom' );
writeFileAtomic('good','test', {tmpCreate, tmpCreated}, err => {
t.notOk(err)
})
const longPath = path.join(os.tmpdir(),'.012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789.txt');
const {writeFile: writeFileAtomicNative} = requireInject('../dist', { fs });
writeFileAtomicNative(longPath,'test', err => {
t.notOk(err)
})
const pathMissingFolders = path.join(os.tmpdir(),String(Math.random()),String(Math.random()),String(Math.random()),'foo.txt');
writeFileAtomicNative(pathMissingFolders,'test', err => {
t.notOk(err)
})
})
t.test('errors for root', t => {
const { getuid } = process
process.getuid = () => 0
t.teardown(() => {
process.getuid = getuid
})
const {writeFile: writeFileAtomic} = requireInject('../dist', { fs: fsMock });
t.plan(2)
writeFileAtomic('einval', 'test', { chown: { uid: 100, gid: 100 } }, err => {
t.match(err, { code: 'EINVAL' })
})
writeFileAtomic('einval', 'test', { mode: 0o741 }, err => {
t.match(err, { code: 'EINVAL' })
})
})
})
test('unstable async tests', t => {
t.plan(2);
const {writeFile: writeFileAtomic} = requireInject('../dist', { fs: fsMockUnstable });
writeFileAtomic('good', 'test', err => {
t.notOk(err, 'No errors occur when retryable errors are thrown')
})
writeFileAtomic('good', 'test', { timeout: 0 }, err => {
t.is(!!err.code, true, 'Retrying can be disabled')
})
});
test('sync tests', t => {
t.plan(2)
closeSyncCalled = 0
expectCloseSync = 0
t.teardown(() => {
t.parent.equal(closeSyncCalled, expectCloseSync, 'sync closed all files')
expectCloseSync = 0
closeSyncCalled = 0
})
const throws = function (t, shouldthrow, msg, todo) {
let err
try { todo() } catch (e) { err = e }
t.is(err && err.message, shouldthrow, msg)
}
const noexception = function (t, msg, todo) {
let err
try { todo() } catch (e) { err = e }
t.ifError(err, msg)
}
let tmpfile
t.test('non-root', t => {
t.plan(38)
noexception(t, 'No errors occur when passing in options', () => {
writeFileAtomicSync('good', 'test', { mode: '0777' })
})
noexception(t, 'No errors occur when passing in options as string', () => {
writeFileAtomicSync('good', 'test', 'utf8')
})
noexception(t, 'No errors occur when NOT passing in options', () => {
writeFileAtomicSync('good', 'test')
})
noexception(t, 'fsync never called if options.fsync is falsy', () => {
writeFileAtomicSync('good', 'test', { fsync: false })
})
noexception(t, 'tmpCreated is called on success', () => {
writeFileAtomicSync('good', 'test', {
tmpCreated (gottmpfile) {
tmpfile = gottmpfile
}
})
t.match(tmpfile, /^good\.tmp-\w+$/, 'tmpCreated called for success')
t.match(tmpfile, /^good\.tmp-\d{10}[a-f0-9]{6}$/, 'tmpCreated format')
})
tmpfile = undefined
throws(t, 'ENOOPEN', 'fs.openSync failures propagate', () => {
writeFileAtomicSync('noopen', 'test', {
tmpCreated (gottmpfile) {
tmpfile = gottmpfile
}
})
})
t.is(tmpfile, undefined, 'tmpCreated not called for open failure')
throws(t, 'ENOWRITE', 'fs.writeSync failures propagate', () => {
writeFileAtomicSync('nowrite', 'test', {
tmpCreated (gottmpfile) {
tmpfile = gottmpfile
}
})
})
t.match(tmpfile, /^nowrite\.tmp-\w+$/, 'tmpCreated called for failure after open')
throws(t, 'ENOCHOWN', 'Chown failures propagate', () => {
writeFileAtomicSync('nochown', 'test', { chown: { uid: 100, gid: 100 } })
})
noexception(t, 'No attempt to chown when false passed in', () => {
writeFileAtomicSync('nochown', 'test', { chown: false })
})
noexception(t, 'No errors occurred when chown is undefined and original file owner used', () => {
writeFileAtomicSync('chowncopy', 'test', { chown: undefined })
})
throws(t, 'ENORENAME', 'Rename errors propagate', () => {
writeFileAtomicSync('norename', 'test')
})
throws(t, 'ENORENAME', 'Failure to unlink the temp file does not clobber the original error', () => {
writeFileAtomicSync('norename nounlink', 'test')
})
throws(t, 'ENOFSYNC', 'Fsync errors propagate', () => {
writeFileAtomicSync('nofsync', 'test')
})
noexception(t, 'No errors on ENOSYS', () => {
writeFileAtomicSync('enosys', 'test', { chown: { uid: 100, gid: 100 } })
})
noexception(t, 'No errors on EINVAL for non root', () => {
writeFileAtomicSync('einval', 'test', { chown: { uid: 100, gid: 100 } })
})
noexception(t, 'No errors on EPERM for non root', () => {
writeFileAtomicSync('eperm', 'test', { chown: { uid: 100, gid: 100 } })
})
throws(t, 'ENOCHMOD', 'Chmod failures propagate', () => {
writeFileAtomicSync('nochmod', 'test', { mode: 0o741 })
})
noexception(t, 'No errors on EPERM for non root', () => {
writeFileAtomicSync('eperm', 'test', { mode: 0o741 })
})
noexception(t, 'No attempt to chmod when no mode provided', () => {
writeFileAtomicSync('nochmod', 'test', { mode: false })
})
const optionsImmutable = {};
noexception(t, 'options are immutable', () => {
writeFileAtomicSync('statful', 'test', optionsImmutable)
})
t.deepEquals(optionsImmutable, {});
const tmpCreate = filePath => `.${filePath}.custom`;
const tmpCreated = filePath => t.is ( filePath, '.good.custom' );
noexception(t, 'custom temp creator', () => {
writeFileAtomicSync('good', 'test', {tmpCreate, tmpCreated})
})
const path0 = path.join(os.tmpdir(),'atomically-test-0');
const tmpPath0 = path0 + '.temp';
noexception(t, 'temp files are purged on success', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs });
writeFileAtomicSync(path0, 'test', {tmpCreate: () => tmpPath0})
})
t.is(fs.existsSync(path0), true);
t.is(fs.existsSync(tmpPath0), false);
const path1 = path.join(os.tmpdir(),'atomically-test-norename-1');
const tmpPath1 = path1 + '.temp';
throws(t, 'ENORENAME', 'temp files are purged on error', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs: Object.assign ( {}, fs, { renameSync: fsMock.renameSync })});
writeFileAtomicSync(path1, 'test', {tmpCreate: () => tmpPath1})
})
t.is(fs.existsSync(path1), false);
t.is(fs.existsSync(tmpPath1), false);
const path2 = path.join(os.tmpdir(),'atomically-test-norename-2');
const tmpPath2 = path2 + '.temp';
throws(t, 'ENORENAME', 'temp files can also not be purged on error', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs: Object.assign ( {}, fs, { renameSync: fsMock.renameSync })});
writeFileAtomicSync(path2, 'test', {tmpCreate: () => tmpPath2,tmpPurge: false})
})
t.is(fs.existsSync(path2), false);
t.is(fs.existsSync(tmpPath2), true);
const longPath = path.join(os.tmpdir(),'.012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789.txt');
noexception(t, 'temp files are truncated', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs });
writeFileAtomicSync(longPath, 'test')
})
const pathMissingFolders = path.join(os.tmpdir(),String(Math.random()),String(Math.random()),String(Math.random()),'foo.txt');
noexception(t, 'parent folders are created', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs });
writeFileAtomicSync(pathMissingFolders, 'test')
})
})
t.test('errors for root', t => {
const { getuid } = process
process.getuid = () => 0
t.teardown(() => {
process.getuid = getuid
})
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs: fsMock });
t.plan(2)
throws(t, 'EINVAL', 'Chown error as root user', () => {
writeFileAtomicSync('einval', 'test', { chown: { uid: 100, gid: 100 } })
})
throws(t, 'EINVAL', 'Chmod error as root user', () => {
writeFileAtomicSync('einval', 'test', { mode: 0o741 })
})
})
})
test('unstable sync tests', t => {
t.plan(2);
const throws = function (t, msg, todo) {
let err
try { todo() } catch (e) { err = e }
t.is(!!err.code, true, msg)
}
const noexception = function (t, msg, todo) {
let err
try { todo() } catch (e) { err = e }
t.ifError(err, msg)
}
noexception(t, 'No errors occur when retryable errors are thrown', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs: fsMockUnstable });
writeFileAtomicSync('good', 'test')
})
throws(t, 'retrying can be disabled', () => {
const {writeFileSync: writeFileAtomicSync} = requireInject('../dist', { fs: fsMockUnstable });
writeFileAtomicSync('good', 'test', { timeout: 0 })
})
});
test('promises', async t => {
let tmpfile
closeCalled = 0
expectClose = 0
t.teardown(() => {
t.parent.equal(closeCalled, expectClose, 'promises closed all files')
closeCalled = 0
expectClose = 0
})
await writeFileAtomic('good', 'test', {
tmpCreated (gottmpfile) {
tmpfile = gottmpfile
}
})
t.match(tmpfile, /^good\.tmp-\w+$/, 'tmpCreated is called for success')
await writeFileAtomic('good', 'test', {
tmpCreated (gottmpfile) {
return Promise.resolve()
}
})
tmpfile = undefined
await t.rejects(writeFileAtomic('noopen', 'test', {
tmpCreated (gottmpfile) {
tmpfile = gottmpfile
}
}))
t.is(tmpfile, undefined, 'tmpCreated is not called on open failure')
await t.rejects(writeFileAtomic('nowrite', 'test', {
tmpCreated (gottmpfile) {
tmpfile = gottmpfile
}
}))
t.match(tmpfile, /^nowrite\.tmp-\w+$/, 'tmpCreated is called if failure is after open')
})

153
spa/node_modules/atomically/test/concurrency.js generated vendored Executable file

@@ -0,0 +1,153 @@
'use strict'
process.setMaxListeners(1000000);
const fs = require('fs')
const {test} = require('tap')
const requireInject = require('require-inject')
// defining mock for fs so its functions can be modified
const fsMock = Object.assign ( {}, fs, {
/* ASYNC */
mkdir (filename, opts, cb) {
return cb(null);
},
realpath (filename, cb) {
return cb(null, filename)
},
open (tmpfile, options, mode, cb) {
if (/noopen/.test(tmpfile)) return cb(new Error('ENOOPEN'))
cb(null, tmpfile)
},
write (fd) {
const cb = arguments[arguments.length - 1]
if (/nowrite/.test(fd)) return cb(new Error('ENOWRITE'))
cb()
},
fsync (fd, cb) {
if (/nofsync/.test(fd)) return cb(new Error('ENOFSYNC'))
cb()
},
close (fd, cb) {
cb()
},
chown (tmpfile, uid, gid, cb) {
if (/nochown/.test(tmpfile)) return cb(new Error('ENOCHOWN'))
cb()
},
chmod (tmpfile, mode, cb) {
if (/nochmod/.test(tmpfile)) return cb(new Error('ENOCHMOD'))
cb()
},
rename (tmpfile, filename, cb) {
if (/norename/.test(tmpfile)) return cb(new Error('ENORENAME'))
cb()
},
unlink (tmpfile, cb) {
if (/nounlink/.test(tmpfile)) return cb(new Error('ENOUNLINK'))
cb()
},
stat (tmpfile, cb) {
if (/nostat/.test(tmpfile)) return cb(new Error('ENOSTAT'))
cb()
},
/* SYNC */
mkdirSync (filename) {},
realpathSync (filename) {
return filename
},
openSync (tmpfile, options) {
if (/noopen/.test(tmpfile)) throw new Error('ENOOPEN')
return tmpfile
},
writeSync (fd) {
if (/nowrite/.test(fd)) throw new Error('ENOWRITE')
},
fsyncSync (fd) {
if (/nofsync/.test(fd)) throw new Error('ENOFSYNC')
},
closeSync () {},
chownSync (tmpfile, uid, gid) {
if (/nochown/.test(tmpfile)) throw new Error('ENOCHOWN')
},
chmodSync (tmpfile, mode) {
if (/nochmod/.test(tmpfile)) throw new Error('ENOCHMOD')
},
renameSync (tmpfile, filename) {
if (/norename/.test(tmpfile)) throw new Error('ENORENAME')
},
unlinkSync (tmpfile) {
if (/nounlink/.test(tmpfile)) throw new Error('ENOUNLINK')
},
statSync (tmpfile) {
if (/nostat/.test(tmpfile)) throw new Error('ENOSTAT')
}
})
const {writeFile: writeFileAtomic} = requireInject('../dist', { fs: fsMock });
// preserve original functions
const oldRealPath = fsMock.realpath
const oldRename = fsMock.rename
test('ensure writes to the same file are serial', t => {
let fileInUse = false
const ops = 5 // count for how many concurrent write ops to request
t.plan(ops * 3 + 3)
fsMock.realpath = (...args) => {
t.false(fileInUse, 'file not in use')
fileInUse = true
oldRealPath(...args)
}
fsMock.rename = (...args) => {
t.true(fileInUse, 'file in use')
fileInUse = false
oldRename(...args)
}
const {writeFile: writeFileAtomic} = requireInject('../dist', { fs: fsMock });
for (let i = 0; i < ops; i++) {
writeFileAtomic('test', 'test', err => {
if (err) t.fail(err)
else t.pass('wrote without error')
})
}
setTimeout(() => {
writeFileAtomic('test', 'test', err => {
if (err) t.fail(err)
else t.pass('successive writes after delay')
})
}, 500)
})
test('allow write to multiple files in parallel, but same file writes are serial', t => {
const filesInUse = []
const ops = 5
let wasParallel = false
fsMock.realpath = (filename, ...args) => {
filesInUse.push(filename)
const firstOccurrence = filesInUse.indexOf(filename)
t.equal(filesInUse.indexOf(filename, firstOccurrence + 1), -1, 'serial writes') // check for another occurrence after the first
if (filesInUse.length > 1) wasParallel = true // remember that a parallel operation took place
oldRealPath(filename, ...args)
}
fsMock.rename = (filename, ...args) => {
filesInUse.splice(filesInUse.indexOf(filename), 1)
oldRename(filename, ...args)
}
const {writeFile: writeFileAtomic} = requireInject('../dist', { fs: fsMock });
t.plan(ops * 2 * 2 + 1)
let opCount = 0
for (let i = 0; i < ops; i++) {
writeFileAtomic('test', 'test', err => {
if (err) t.fail(err, 'wrote without error')
else t.pass('wrote without error')
})
writeFileAtomic('test2', 'test', err => {
opCount++
if (opCount === ops) t.true(wasParallel, 'parallel writes')
if (err) t.fail(err, 'wrote without error')
else t.pass('wrote without error')
})
}
})

291
spa/node_modules/atomically/test/integration.js generated vendored Executable file

@@ -0,0 +1,291 @@
'use strict'
process.setMaxListeners(1000000);
const fs = require('fs')
const path = require('path')
const {test} = require('tap')
const rimraf = require('rimraf')
const requireInject = require('require-inject')
const workdir = path.join(__dirname, path.basename(__filename, '.js'))
let testfiles = 0
function tmpFile () {
return path.join(workdir, 'test-' + (++testfiles))
}
function readFile (path) {
return fs.readFileSync(path).toString()
}
function didWriteFileAtomic (t, expected, filename, data, options, callback) {
if (options instanceof Function) {
callback = options
options = null
}
if (!options) options = {}
const actual = {}
const {writeFile: writeFileAtomic} = requireInject('../dist', {
fs: Object.assign({}, fs, {
chown (filename, uid, gid, cb) {
actual.uid = uid
actual.gid = gid
process.nextTick(cb)
},
stat (filename, cb) {
fs.stat(filename, (err, stats) => {
if (err) return cb(err)
cb(null, Object.assign(stats, expected || {}))
})
}
})
})
return writeFileAtomic(filename, data, options, err => {
t.isDeeply(actual, expected, 'ownership is as expected')
callback(err)
})
}
function didWriteFileAtomicSync (t, expected, filename, data, options) {
const actual = {}
const {writeFileSync} = requireInject('../dist', {
fs: Object.assign({}, fs, {
chownSync (filename, uid, gid) {
actual.uid = uid
actual.gid = gid
},
statSync (filename) {
const stats = fs.statSync(filename)
return Object.assign(stats, expected || {})
}
})
})
writeFileSync(filename, data, options)
t.isDeeply(actual, expected)
}
function currentUser () {
return {
uid: process.getuid(),
gid: process.getgid()
}
}
test('setup', t => {
rimraf.sync(workdir)
fs.mkdirSync(workdir, {recursive: true})
t.done()
})
test('writes simple file (async)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, '42', err => {
t.ifError(err, 'no error')
t.is(readFile(file), '42', 'content ok')
})
})
test('writes simple file with encoding (async)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, 'foo', 'utf16le', err => {
t.ifError(err, 'no error')
t.is(readFile(file), 'f\u0000o\u0000o\u0000', 'content ok')
})
})
test('writes buffers to simple file (async)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, Buffer.from('42'), err => {
t.ifError(err, 'no error')
t.is(readFile(file), '42', 'content ok')
})
})
test('writes undefined to simple file (async)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, undefined, err => {
t.ifError(err, 'no error')
t.is(readFile(file), '', 'content ok')
})
})
test('writes to symlinks without clobbering (async)', t => {
t.plan(5)
const file = tmpFile()
const link = tmpFile()
fs.writeFileSync(file, '42')
fs.symlinkSync(file, link)
didWriteFileAtomic(t, currentUser(), link, '43', err => {
t.ifError(err, 'no error')
t.is(readFile(file), '43', 'target content ok')
t.is(readFile(link), '43', 'link content ok')
t.ok(fs.lstatSync(link).isSymbolicLink(), 'link is link')
})
})
test('runs chown on given file (async)', t => {
const file = tmpFile()
didWriteFileAtomic(t, { uid: 42, gid: 43 }, file, '42', { chown: { uid: 42, gid: 43 } }, err => {
t.ifError(err, 'no error')
t.is(readFile(file), '42', 'content ok')
t.done()
})
})
test('writes simple file with no chown (async)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, '42', { chown: false }, err => {
t.ifError(err, 'no error')
t.is(readFile(file), '42', 'content ok')
t.done()
})
})
test('runs chmod on given file (async)', t => {
t.plan(5)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, '42', { mode: parseInt('741', 8) }, err => {
t.ifError(err, 'no error')
const stat = fs.statSync(file)
t.is(stat.mode, parseInt('100741', 8))
didWriteFileAtomic(t, { uid: 42, gid: 43 }, file, '23', { chown: { uid: 42, gid: 43 } }, err => {
t.ifError(err, 'no error')
})
})
})
test('run chmod AND chown (async)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomic(t, { uid: 42, gid: 43 }, file, '42', { mode: parseInt('741', 8), chown: { uid: 42, gid: 43 } }, err => {
t.ifError(err, 'no error')
const stat = fs.statSync(file)
t.is(stat.mode, parseInt('100741', 8))
})
})
test('does not change chmod by default (async)', t => {
t.plan(5)
const file = tmpFile()
didWriteFileAtomic(t, {}, file, '42', { mode: parseInt('741', 8) }, err => {
t.ifError(err, 'no error')
didWriteFileAtomic(t, currentUser(), file, '43', err => {
t.ifError(err, 'no error')
const stat = fs.statSync(file)
t.is(stat.mode, parseInt('100741', 8))
})
})
})
test('does not change chown by default (async)', t => {
t.plan(6)
const file = tmpFile()
didWriteFileAtomic(t, { uid: 42, gid: 43 }, file, '42', { chown: { uid: 42, gid: 43 } }, _setModeOnly)
function _setModeOnly (err) {
t.ifError(err, 'no error')
didWriteFileAtomic(t, { uid: 42, gid: 43 }, file, '43', { mode: parseInt('741', 8) }, _allDefault)
}
function _allDefault (err) {
t.ifError(err, 'no error')
didWriteFileAtomic(t, { uid: 42, gid: 43 }, file, '43', _noError)
}
function _noError (err) {
t.ifError(err, 'no error')
}
})
test('writes simple file (sync)', t => {
t.plan(2)
const file = tmpFile()
didWriteFileAtomicSync(t, {}, file, '42')
t.is(readFile(file), '42')
})
test('writes simple file with encoding (sync)', t => {
t.plan(2)
const file = tmpFile()
didWriteFileAtomicSync(t, {}, file, 'foo', 'utf16le')
t.is(readFile(file), 'f\u0000o\u0000o\u0000')
})
test('writes simple buffer file (sync)', t => {
t.plan(2)
const file = tmpFile()
didWriteFileAtomicSync(t, {}, file, Buffer.from('42'))
t.is(readFile(file), '42')
})
test('writes undefined file (sync)', t => {
t.plan(2)
const file = tmpFile()
didWriteFileAtomicSync(t, {}, file, undefined)
t.is(readFile(file), '')
})
test('writes to symlinks without clobbering (sync)', t => {
t.plan(4)
const file = tmpFile()
const link = tmpFile()
fs.writeFileSync(file, '42')
fs.symlinkSync(file, link)
didWriteFileAtomicSync(t, currentUser(), link, '43')
t.is(readFile(file), '43', 'target content ok')
t.is(readFile(link), '43', 'link content ok')
t.ok(fs.lstatSync(link).isSymbolicLink(), 'link is link')
})
test('runs chown on given file (sync)', t => {
t.plan(1)
const file = tmpFile()
didWriteFileAtomicSync(t, { uid: 42, gid: 43 }, file, '42', { chown: { uid: 42, gid: 43 } })
})
test('runs chmod on given file (sync)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomicSync(t, {}, file, '42', { mode: parseInt('741', 8) })
const stat = fs.statSync(file)
t.is(stat.mode, parseInt('100741', 8))
didWriteFileAtomicSync(t, { uid: 42, gid: 43 }, file, '23', { chown: { uid: 42, gid: 43 } })
})
test('runs chown and chmod (sync)', t => {
t.plan(2)
const file = tmpFile()
didWriteFileAtomicSync(t, { uid: 42, gid: 43 }, file, '42', { mode: parseInt('741', 8), chown: { uid: 42, gid: 43 } })
const stat = fs.statSync(file)
t.is(stat.mode, parseInt('100741', 8))
})
test('does not change chmod by default (sync)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomicSync(t, {}, file, '42', { mode: parseInt('741', 8) })
didWriteFileAtomicSync(t, currentUser(), file, '43')
const stat = fs.statSync(file)
t.is(stat.mode, parseInt('100741', 8))
})
test('does not change chown by default (sync)', t => {
t.plan(3)
const file = tmpFile()
didWriteFileAtomicSync(t, { uid: 42, gid: 43 }, file, '42', { chown: { uid: 42, gid: 43 } })
didWriteFileAtomicSync(t, { uid: 42, gid: 43 }, file, '43', { mode: parseInt('741', 8) })
didWriteFileAtomicSync(t, { uid: 42, gid: 43 }, file, '44')
})
test('cleanup', t => {
rimraf.sync(workdir)
t.done()
})

28
spa/node_modules/atomically/tsconfig.json generated vendored Executable file

@@ -0,0 +1,28 @@
{
"compilerOptions": {
"alwaysStrict": true,
"declaration": true,
"emitDecoratorMetadata": true,
"experimentalDecorators": true,
"forceConsistentCasingInFileNames": true,
"inlineSourceMap": false,
"jsx": "react",
"lib": ["dom", "scripthost", "es2015", "es2016", "es2017", "es2018", "es2019", "es2020"],
"module": "commonjs",
"moduleResolution": "node",
"newLine": "LF",
"noFallthroughCasesInSwitch": true,
"noUnusedLocals": true,
"noUnusedParameters": false,
"outDir": "dist",
"pretty": true,
"strictNullChecks": true,
"target": "es2018"
},
"include": [
"src"
],
"exclude": [
"node_modules"
]
}