# Compare commits

4 commits

| Author | SHA1 | Date |
| --- | --- | --- |
| | 59827b0687 | |
| | 1be5907e51 | |
| | 7514287dbc | |
| | 4474307392 | |
1849 changed files with 301414 additions and 60094 deletions
**Deleted file** (`@@ -1,8 +0,0 @@`; filename not preserved in this view — a dev container configuration):

```json
{
  "name": "Holy Unblocker LTS",
  "image": "mcr.microsoft.com/devcontainers/javascript-node:0-20",
  "features": {
    "node": "20"
  },
  "postCreateCommand": "npm install"
}
```
**`.gitattributes`** (vendored, new file, `@@ -0,0 +1,2 @@`)

```gitattributes
# Auto detect text files and perform LF normalization
* text=auto
```
**`.github/workflows/ci-windows.yml`** (vendored, deleted, `@@ -1,35 +0,0 @@`)

```yaml
name: CI-Win

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]

jobs:
  build-windows:
    runs-on: windows-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '20.8.0'

      - name: Install dependencies
        run: npm run fresh-install

      - name: Build libraries
        run: npm run build

      - name: Start server
        run: npm run workflow-test

      - name: Test server response
        run: npm test

      - name: Stop server after testing
        run: npm stop
```
**`.github/workflows/ci.yml`** (vendored, deleted, `@@ -1,40 +0,0 @@`)

```yaml
name: CI-Production

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '20.8.0'

      - name: Install dependencies
        run: npm run fresh-install

      - name: Build libraries
        run: npm run build

      - name: Allow AppArmor sandbox to work on Ubuntu
        run: echo 0 | sudo tee /proc/sys/kernel/apparmor_restrict_unprivileged_userns
        # A safer option to fix sandboxes if Chrome is installed on a local client:
        # echo -e "\nexport CHROME_DEVEL_SANDBOX=/opt/google/chrome/chrome-sandbox" >> ~/.bashrc

      - name: Start server
        run: npm run workflow-test

      - name: Test server response
        run: npm test

      - name: Stop server after testing
        run: npm stop
```
**`.gitignore`** (vendored, deleted, `@@ -1,19 +0,0 @@`)

```gitignore
node_modules
.gitpod.yml
.vscode
.gitattributes
.well-known
views/archive
ngrok.exe
pnpm-lock.yaml
debug.log
lib/rammerhead/sessions/*
!lib/rammerhead/sessions/.gitkeep
lib/rammerhead/cache-js/*
!lib/rammerhead/cache-js/.gitkeep
views/dist/*
src/.shutdown
lib/rammerhead/src/client/hammerhead.min.js
lib/rammerhead/src/client/rammerhead.min.js
views/archive/gfiles/rarch/roms
.idea
```
**Deleted file** (`@@ -1,3 +0,0 @@`; filename not preserved in this view — an ignore list):

```
/lib
/views/assets/js/*.js
/views/dist
```
**`.replit`** (deleted, `@@ -1,3 +0,0 @@`; the middle line appears truncated in this view)

```toml
language = "nodejs"
repl.sh"
run = "bash repl.sh"
```
**Deleted file** (`@@ -1,58 +0,0 @@`; filename not preserved in this view — the contributing guide):

```markdown
# Contributing to Holy Unblocker

Thank you for considering contributing to Holy Unblocker! Your contributions help us improve and provide better functionality. Please follow the guidelines below to ensure a smooth process.

## Commit Message Guidelines

When making commits, please use the following format for your commit messages:

- **Feature Commits**:
  Use the prefix `feat:` for new features.
  Example: `feat: implement user signup functionality`

- **Fix Commits**:
  Use the prefix `fix:` for bug fixes.
  Example: `fix: correct typo in README.md`

- **Refactor Commits**:
  Use the prefix `refactor:` for code improvements that do not change functionality.
  Example: `refactor: improve performance of search algorithm`

- **Chore Commits**:
  Use the prefix `chore:` for routine tasks or maintenance.
  Example: `chore: clean up old tests`

- **Test Commits**:
  Use the prefix `test:` for commits related to adding or modifying tests.
  Example: `test: add tests for user authentication`

- **Additions**:
  Use the prefix `add:` for commits that include new code additions, such as functions, classes, or modules that enhance the project.
  Example: `add: create user profile component`

- **Updates**:
  Use the prefix `update:` for commits that modify existing functionality, improve performance, or make changes that are not new features but enhance the current implementation.
  Example: `update: enhance user profile loading performance`

- **Version Changes**:
  Use the format `v6.x.x` for version bumps, and list the changes made.
  Example: `v6.3.9 - Added Contributing.md, Testing Out Deployment Fixes`

## Prettier Configuration

We use Prettier for code formatting. Please ensure that you have Prettier installed with the provided configuration, and run Prettier before committing your changes to maintain consistent formatting. This is done via `Alt + Shift + F` if you are using Visual Studio Code.

## Version Bumping

When updating the version in `package.json`, please follow the format `6.x.xx`. Make sure to also update the corresponding version in `src/data.json` and `app.json`.

## Pull Request Guidelines

1. **Fork the Repository**: Start by forking the repository to your own GitHub account.
2. **Create a Branch**: Create a new branch for your feature or bug fix. Use a descriptive name that reflects the work you are doing.
3. **Make Your Changes**: Implement your changes and ensure they follow the commit message guidelines above.
4. **Test Your Changes**: If applicable, run tests to confirm your changes work as expected.
5. **Push Your Changes**: Push your changes to your forked repository.
6. **Create a Pull Request**: Submit a pull request to the main repository, providing a clear description of the changes and why they are necessary.

Thank you for contributing to Holy Unblocker! We appreciate your help in making our project better.
```
**`Dockerfile`** (deleted, `@@ -1,23 +0,0 @@`)

```dockerfile
FROM node:20-alpine

WORKDIR /app

LABEL org.opencontainers.image.title="Holy Unblocker LTS" \
      org.opencontainers.image.description="An effective, privacy-focused web proxy service" \
      org.opencontainers.image.version="6.9.3" \
      org.opencontainers.image.authors="Holy Unblocker Team" \
      org.opencontainers.image.source="https://github.com/QuiteAFancyEmerald/Holy-Unblocker/"

RUN apk add --no-cache tor bash

COPY . .

RUN npm run fresh-install
RUN npm run build

EXPOSE 8080 9050 9051

COPY serve.sh /serve.sh
RUN chmod +x /serve.sh

CMD ["/serve.sh"]
```
**`Procfile`** (deleted, `@@ -1 +0,0 @@`)

```
web: npm run deployment --max_old_space_size=2560
```
**`README.md`** (`@@ -1,526 +1,73 @@`; removed and added lines are interleaved below, as the diff markers were not preserved in this view)
<img align="center" src="https://raw.githubusercontent.com/titaniumnetwork-dev/Holy-Unblocker/master/views/assets/img/github_banner.png"></img>

# Holy Unblocker

A website that can be used to bypass web filters, both extension- and firewall-based. This is the public source code for Holy Unblocker. It works on a large number of sites, including YouTube (full quality support), Discord, CoolMathGames and more!

<img align="left" width="40px" src="https://raw.githubusercontent.com/titaniumnetwork-dev/Holy-Unblocker/master/views/assets/img/logo_github.png"></img>

Official Site: https://www.holyubofficial.ml/

# Holy Unblocker LTS (v6.x.x)

Be sure to join Titanium Network's Discord for more official site links: https://discord.com/invite/tgT48PH

[](https://hub.docker.com/r/quiteafancyemerald/holy-unblocker)
[](https://hub.docker.com/r/quiteafancyemerald/holy-unblocker)
<a href="https://heroku.com/deploy?template=https://github.com/QuiteAFancyEmerald/HolyUnblockerPublic" title="Deploy to Heroku"><img alt="Deploy to Heroku" src="https://raw.githubusercontent.com/QuiteAFancyEmerald/HolyUnblockerPublic/master/public/assets/img/heroku.svg?raw" width="140" height="30"></a>
Holy Unblocker LTS is an experimental web proxy service that can bypass web filters or "blockers" regardless of whether the method of censorship is client-side or network-based. This includes the ability to bypass content blockers from governments, Chrome extensions, localized client firewalls, and network-related filters. The project even makes it possible to browse Tor/Onion sites in any browser (even Chromium), all through a website!

## How to Install

## You can support Holy Unblocker by starring the repository!

Either use the button above to deploy to Heroku or do the below:

This project serves mostly as a proof of concept for the ideal clientless solution to bypassing censorship. A good use case of this project would be if you ever needed a clientless solution to use Tor or to leave minimal traces of device activity. Simply host this project on any domain and have an alternative to a VPN without needing to download anything on said device. Being a secure web proxy service, it supports numerous sites while being updated frequently and concentrating on being easy to self-host. Holy Unblocker LTS works with a large number of sites, including YouTube, Discord, GeForce NOW and more!

It also has a good amount of locally hosted games featured on the site.

`git clone https://github.com/QuiteAFancyEmerald/HolyUnblockerPublic.git`

### Over 30M+ users since 2020. Thank you so much for the support; I could never have imagined how massive the web proxy community has become.

`cd HolyUnblockerPublic`

#### Current Branch: Latest

`npm install`

<details><summary>Branch Types</summary>

`npm start`

- Latest (master; built for FOSS and SEO)
- Beta (pending changes; changes that may break things)
- Production (v4, v5, v6; stable versions of Holy Unblocker LTS, with changes for self-hosting in production settings: maximum filtering evasion and request handling)

</details>

The default address for the proxy when it's started is `http://localhost:8080`, but you can change it if needed in `config.json`.

#### Consider switching branches for self-hosting to a production branch!

This website has been hosted locally on Alloy Proxy. For more information, go to the Alloy Proxy repo below.

View the <a href="#deploy-holy-unblocker">self-deployment options</a> if you wish to self-host this project. Can't deploy using any of the free options? Check out Railway or look into cheap, paid VPS hosting solutions. If you don't wish to self-host, join the Discord for more official instance links that are restocked frequently.

**Be sure to join Titanium Network's Discord for more official site links:** <a href="https://discord.gg/unblock">https://discord.gg/unblock</a>

<br>

> [!CAUTION]
> If you are going to self-host Holy Unblocker LTS, please switch to the PRODUCTION branch for filter evasion features. The master branch is intended for development work and a highly readable source for developers. You can select a production branch (v6.x_production) via the branches dropdown.

> [!TIP]
> Holy Unblocker LTS is optimized for self-hosting to provide you with maximum privacy control! Fork this repository and consider starring it. You can self-host using either free or paid deployment options, or set it up on a dedicated instance (VPS) for enhanced performance.
| **Supported Sites** | **Features** |
| -------------------------- | ------------------------------------------------------------------------------------------------------------------------------------- |
| YouTube | Built-in variety of open-source web proxies with a focus on speed and/or security |
| Reddit | Features Source Randomization and DOM Masquerading to circumvent major filters effectively, along with randomization of proxy globals |
| Discord | Tab title + icon customization using the Settings menu for improved browsing-history stealth |
| Instagram | Adblocking support across all websites while surfing, and low-latency DNS on official servers |
| Reddit.com | SOCKS5 and Onion routing support with Tor within the Settings menu. Use Tor/Onion sites in any browser! |
| GeForce NOW | Game library with moderately decent titles and open-source emulation projects |
| Spotify | Bypass regional proxy blocks by swapping regions or enabling Tor |
| And essentially all sites! | Built for intensive production loads and ease of setup |

<img src="https://raw.githubusercontent.com/titaniumnetwork-dev/Holy-Unblocker/master/views/assets/img/preview/hu-v6.4.3-preview.png"></img>
<img src="https://raw.githubusercontent.com/titaniumnetwork-dev/Holy-Unblocker/master/views/assets/img/preview/hu-v6.3.0-preview-settings.png"></img>

## Deploy Holy Unblocker

### Free Deployments

[](https://app.koyeb.com/deploy?name=holy-unblocker&type=git&repository=QuiteAFancyEmerald%2FHoly-Unblocker&branch=v6.3_production&builder=buildpack&env%5B%5D=&ports=8080%3Bhttp%3B%2F)
[](https://cloud.oracle.com/resourcemanager/stacks/create?zipUrl=https://github.com/BinBashBanana/deploy-buttons/archive/refs/heads/main.zip)

<details><summary>Alternative Free Sources</summary>

[](https://app.cyclic.sh/api/app/deploy/shuttlenetwork/shuttle)
[](https://render.com/deploy?repo=https://github.com/QuiteAFancyEmerald/Holy-Unblocker)
[](https://fly.io/launch?repo=https://github.com/QuiteAFancyEmerald/Holy-Unblocker)

</details>
### Production Paid/Free Options (Requires Payment Info)

[](https://deploy.azure.com/?repository=https://github.com/QuiteAFancyEmerald/Holy-Unblocker)
[](https://cloud.ibm.com/devops/setup/deploy?repository=https://github.com/QuiteAFancyEmerald/Holy-Unblocker)
[](https://console.aws.amazon.com/amplify/home#/deploy?repo=https://github.com/QuiteAFancyEmerald/Holy-Unblocker)
[](https://deploy.cloud.run/?git_repo=https://github.com/QuiteAFancyEmerald/Holy-Unblocker)

#### What happened to Replit/Heroku deployment?

Replit is no longer free, and Heroku has a set policy against web proxies. Try GitHub Codespaces or Gitpod instead for development on the cloud, or Koyeb for free hosting.

### GitHub Codespaces

<details><summary>Setup Instructions</summary>

- Fork (and star!) this repository to your GitHub account
- Head to the official <a href="https://github.com/codespaces">Codespaces</a> website (ensure you have a GitHub account already made)
- Select **New Codespace** and look for _[USERNAME]/Holy-Unblocker_ on your account
- Ensure the branch is set to `master` and the dev container configuration is set to **Holy Unblocker LTS**
- Select **Create Codespace** and allow the container to set up
- Type `npm run fresh-install` and `npm start` in the terminal
- Click "Make public" on the application popup, then access the deployed website via the Ports tab.

</details>
## Table of contents:

- [Setup](#how-to-setup)
- [Terminal](#terminal)
- [Project Configuration](#configuration)
- [Server Configuration](#server-configuration-setup)
- [TOR Routing](#toronion-routing-setup)
- [Proxy](#proxy-configuration)
- [Client Navigation](#client-navigation-configuration)
- [Games Management](#games-management)
- [Structure](#structure)
- [Structure Information](#structure-information)
- [Static Files](#details-of-views)
- [Scripts](#scripts-located-in-viewsassetsjs)
- [Future Additions](#future-additions)
- [Beginner's Explanation](#vague-explanation-for-beginners-with-external-proxies-and-hosting)
- [Hosting Providers](#list-of-some-good-hosting-options)
- [Domain Setup](#freenomdomain-steps)
- [Cloudflare Setup](#cloudflare-steps)
- [Workspace Configurations](#workspace-configurations)
- [Detailed FAQ](#detailed-faq)
- [More Information](#more-information)
## How to Setup

#### It is highly recommended that you switch branches via your IDE to a production release branch. The master branch often contains unstable or WIP changes.

#### Example: v6.x_production instead of master

### Terminal

Either use one of the deployment options above or type the commands below on a dedicated server.

**THIS PROJECT REQUIRES NGINX, NOT CADDY.**

Please ensure you are using Node 20.x as well:

```bash
git clone https://github.com/QuiteAFancyEmerald/Holy-Unblocker.git

cd Holy-Unblocker

# Edit config.js and set production to true if you want to use pm2 (allows for easier VPS hosting)
npm run fresh-install
npm start

# Or on subsequent uses...
npm restart

# For killing any production processes made with pm2
npm run kill

# For clearing the respective Rammerhead cache
npm run clean

# If you encounter any build errors...
npm run build

# If you encounter any service errors...
npm run test
```

This website is hosted locally with Scramjet, Ultraviolet (Wisp, Bare-Mux, EpoxyTransport, CurlTransport) and Rammerhead built in.

### For security reasons, when hosting with a reverse proxy PLEASE use NGINX, not Caddy. This is due to wisp-js using loopbacks.
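Because Wisp runs over WebSockets, the reverse proxy must forward connection-upgrade headers or the transports will fail behind it. A minimal sketch of such an NGINX server block, assuming the default `localhost:8080` upstream from above (the domain, TLS setup and paths are illustrative; see the NGINX guide linked later in this README for a vetted configuration):

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    location / {
        proxy_pass http://localhost:8080;
        proxy_http_version 1.1;
        # Wisp runs over WebSockets, so the upgrade headers are required
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```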
#### Detailed Setup (Ubuntu Example)

You will need Node.js 20.x and Git installed; below is an example for a Debian/Ubuntu setup.

<details>

For simplicity's sake, you can join the TN Discord at discord.gg/unblock and request mirror site links (which are restocked and unblocked).

### Hosting

If you wish to self-host, however, you will first need a VPS or hosting provider:

- https://docs.titaniumnetwork.org/guides/vps-hosting/
- https://github.com/QuiteAFancyEmerald/Holy-Unblocker#deploy-holy-unblocker
- https://docs.titaniumnetwork.org/guides/dns-setup/

### Dependencies

You will then need to set up Git, NGINX (or Caddy) and Node.js. Here is an example for Ubuntu LTS:

```
sudo apt update
sudo apt upgrade
sudo apt install curl git nginx

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash

export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

nvm install 20
nvm use 20
```

https://github.com/nvm-sh/nvm
https://docs.titaniumnetwork.org/guides/nginx/
### Tor Support (Optional)

https://github.com/QuiteAFancyEmerald/Holy-Unblocker#toronionsocks5-routing-setup

### Configuring Holy Unblocker

The most important options are `production`, along with the obfuscation and DOM masquerading techniques.

From there, just configure as needed: https://github.com/QuiteAFancyEmerald/Holy-Unblocker#configuration

### Cloning and Running Holy Unblocker

Then run the respective process; if you have `production` set to true in the configuration, pm2 will be enabled automatically with our own workers/cache system.

```
git clone https://github.com/QuiteAFancyEmerald/Holy-Unblocker.git
cd Holy-Unblocker

npm run fresh-start
```

Then of course, if you used NGINX (or Caddy), please restart/reload it:

```
sudo systemctl restart nginx
sudo systemctl restart tor
```

</details>
Resources for self-hosting:

- https://github.com/nvm-sh/nvm
- https://docs.titaniumnetwork.org/guides/nginx/
- https://docs.titaniumnetwork.org/guides/vps-hosting/
- https://docs.titaniumnetwork.org/guides/dns-setup/

### Configuration

#### Server Configuration Setup

The default PORT for the proxy when started is `http://localhost:8080`. You can change the PORT and other production metrics if needed in `./ecosystem.config.js`.

The default PORT for Rammerhead is `3000`. You can change this <a href="https://github.com/QuiteAFancyEmerald/Holy-Unblocker/blob/8f6dcfedb71439a43a19cc0a015ee6ca7e29fd11/lib/rammerhead/holy-config.js#L9">here</a>.

All other localized settings for source randomization, auto-minify, etc. are located in `./config.json`.

**config.json**

- `minifyScripts`: Automatically minify the respective static assets upon starting the server.
- `randomizeIdentifiers`: Enable experimental proxy global randomization for Ultraviolet. This reduces the chances of UV being detected by extension-based filters.
- `production`: Utilize a pre-configured production setup for server hosting. Automatically sets up cache control, session jobs for Rammerhead, and source rewrites.
- `disguiseFiles`: Enable DOM masquerading, which obfuscates the real content fetches for HU LTS. This is done by disguising requests, decompressing, and then reconstructing the DOM tree.
- `usingSEO`: Enable Source Randomization, which randomizes the source by swapping chunks of data specified in `./src/data.json`. Highly useful for masking keywords that would automatically flag or block Holy Unblocker LTS, as well as preventing source blocks.
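Putting the options above together, a self-hosted production `config.json` might look like the following (the keys are the ones documented above; the values are illustrative rather than the repository's defaults):

```json
{
  "minifyScripts": true,
  "randomizeIdentifiers": true,
  "production": true,
  "disguiseFiles": true,
  "usingSEO": true
}
```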
#### Tor/Onion/SOCKS5 Routing Setup

You need to set up Tor (no GUI needed, though the GUI is alright; with the GUI, replace port 9050 with 9150) in order for the Onion Routing setting to work!

Simply host Tor using this guide: https://tb-manual.torproject.org/installation/

Alternative guide (for CLI): https://community.torproject.org/onion-services/setup/install/

If you are hosting Holy Unblocker LTS on a VPS running Ubuntu, consider attaching Tor to systemctl for easier production management. Once Tor is up and running on either Linux or Windows, it will work automatically with Holy Unblocker LTS when enabled by the user via the Settings menu.
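For reference, a minimal `torrc` matching the ports this README assumes (9050 for SOCKS and 9051 for the control port, the same ports the Dockerfile exposes) might look like this; it is an illustration, not a file shipped with the repository:

```
SocksPort 9050
ControlPort 9051
```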
If you wish to use a custom HTTP/HTTPS/SOCKS5 proxy to route all traffic through for Scramjet and Ultraviolet, this is handled in `./views/assets/js/register-sw.js`. Modify `proxyUrl` with the respective protocol and address. This is done via the proxy option for Wisp. You can change the cases as needed.

```js
proxyUrl = {
  tor: 'socks5h://localhost:9050',
  eu: 'socks5h://localhost:7000',
  jp: 'socks5h://localhost:7001',
}
```
#### Proxy Configuration

The primary location for tweaking any web-proxy-related settings assigned via the Settings menu is `./views/assets/js/register-sw.js`. Here you can modify the provided transport options set locally via a cookie, swap out SOCKS5 proxies, change Onion routing ports, specify a blacklist, and more.

- `stockSW`: The default service worker configuration file for Ultraviolet. For Holy Unblocker, however, adblocking is enabled automatically, so this is not used by default.
- `blacklistSW`: A modified version of Ultraviolet that allows for blacklisting domains and adblocking.
- `proxyUrl`: Specifies a SOCKS5/HTTPS/HTTP protocol URL, defaulting to the default Tor proxy port. This can be swapped out with any valid port or SOCKS5 proxy. This is done via the proxy option for both epoxy and libcurl.
- `transports`: Specifies any provided ports to be swapped via Bare-Mux and utilize Wisp.
- `wispUrl`: Modify the pathname or URL handling for Wisp.
- `defaultMode`: Specify the default transport used globally (users can still swap it via the Settings menu).
- `ScramjetController`: This constructor allows you to swap out the prefix used for Scramjet dynamically and specify file locations. Note that you may need to edit `./views/scram/scramjet.sw` when changing file names.
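To make the `wispUrl` option above concrete: a Wisp endpoint is a WebSocket URL derived from the page origin, `wss://` on HTTPS pages and `ws://` otherwise. The helper below is a hypothetical sketch of that pattern, not the repository's actual code:

```javascript
// Hypothetical helper: derive a Wisp WebSocket endpoint from a page origin.
// Secure (https) pages need wss://, plain http pages use ws://.
function wispEndpoint(origin) {
  const scheme = origin.startsWith('https://') ? 'wss://' : 'ws://';
  const host = origin.replace(/^https?:\/\//, '');
  return scheme + host + '/wisp/';
}

// Illustrative default in the spirit of the options documented above.
const defaultMode = 'epoxy'; // users can still switch transports in the Settings menu

console.log(wispEndpoint('https://example.com'), defaultMode);
```

Swapping `defaultMode` between the available transports (e.g. epoxy or libcurl) is then just a matter of changing the string, since users can still override it per-session in the Settings menu.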
#### Client Navigation Configuration

The primary location for any client-side navigation scripts is `./views/assets/js/common.js`. This file is primarily used for Omnibox (search engine) functionality, swapping proxy options, and linking games.

- `getDomain`: This constant is used for specifying any subdomains to remove when appending a URL into the omnibox.
- `goFrame`: This specifies the stealth frame used for Holy Unblocker LTS.
- `sx`: This constant specifies the search engine to be proxied whenever a user types something that isn't a URL.
- `search/uvUrl/sjUrl`: These functions specify and parse the queries used for submitted URLs.
- `RammerheadEncode`: This constant is a dependency for Rammerhead parsing and querying.
- `urlHandler/asyncUrlHandler`: Used to set functions for the goProx object.
- `goProx`: This constant allows for the mapping of URL handling for specific proxies, games, or links that need to fall under a web proxy.

```js
const goProx = Object.freeze({
  ultraviolet: urlHandler(uvUrl),

  scramjet: urlHandler(sjUrl),

  rammerhead: asyncUrlHandler(
    async (url) => location.origin + (await RammerheadEncode(search(url)))
  ),

  // `location.protocol + "//" + getDomain()` is roughly equivalent to `location.origin`

  examplepath: urlHandler(location.protocol + `//c.${getDomain()}/example/`),

  examplesubdomain: urlHandler(location.protocol + '//c.' + getDomain()),

  example: urlHandler(sjUrl('https://example.com')),
});
```
- `prSet`: Attaches event listeners using goProx for any buttons or inputs needed.

```js
// prSet function code here....

prSet('pr-uv', 'ultraviolet');
prSet('pr-sj', 'scramjet');
prSet('pr-rh', 'rammerhead');
prSet('pr-yt', 'youtube');
prSet('pr-example', 'example');
```

- `huLinks/navLists`: Automatically takes paths stated in `./views/assets/json` and appends them depending on the page and usage. This is used for hiding links that would lead to filter blocks, and creates an easier system for adding games.
#### Games Management

As stated above, all game links that need to be appended to a page (including images and descriptions) are managed via the nav files in `./views/assets/json`.

Download the latest release <a href="https://github.com/QuiteAFancyEmerald/Holy-Unblocker/blob/master/views/GAMES.md">here</a> and extract it into a folder called `/views/archive`.

- `views/archive/g`: Contains any local or external HTML5/web games.
- `views/archive/gfiles/flash`: Contains Ruffle (an Adobe Flash emulator) and a collection of Flash games linked to an external CDN.
- `views/archive/gfiles/rarch`: Contains webretro, a project that ports RetroArch to WASM. Supports many systems such as GBA, N64, etc.; ROMs are NOT INCLUDED.
## Structure

<details><summary>Web Pages</summary>

### Structure Information

- `/views/`: The physical site base of Holy Unblocker goes here, where static assets are served.
- `/src/`: For future implementation of obfuscation and keyword-removal features.

#### Details of `/views/`

- `/dist/` is used for minified files. Created on build.
- `/pages/` is used for the HTML for the site.
- `/assets/` is used for storing various CSS, JS, image, and JSON files.
- `/scram/` contains the respective local Scramjet implementation. Some files are overridden by the node module.
- `/uv/` contains the UV implementation.

#### Scripts located in `/views/assets/js/`

- `card.js` adds a fancy visual effect to the box cards displayed on the welcome screen.
- `common.js` is used on all pages and allows most site features to function, such as autocomplete.
- `csel.js` manages the settings menu, the omnibox function, and other additional features.
- `loader.js` is used as an asset for DOM masquerading.
- `register-sw.js` creates and manages the service workers that allow Ultraviolet to function, and also uses bare transport.

</details>
- `index.html`: The official homepage of the site.
- `z.html`: Surf Freely page; offers redirection to either Alloy or Node.
- `a.html`: Alloy Proxy page, configured as recommended with Alloy Proxy.
- `b.html`: Links to a subdomain for Node Unblocker. Left in just in case you would like to set up the site differently.
- `p.html`: Links to a subdomain for Powermouse. Left in just in case you would like to set up the site differently.
- `g.html`: Games page, with help from @BinBashBanana and @kinglalu.
- `info.html`: WIP documentation.
- `d.html`: Links to an external subdomain with proxied Discord. May need a refresh.
- `gold.html`: Games page; credits to @BinBashBanana and Titanium Network for its assets.
- `i.html`: Information regarding the Settings menu page. Added for standard users.
- `t.html`: Terms of Service, AUP and Privacy Policy page.
- `k.html`: An iframe version of Krunker. Can be removed if not needed.
- `yt.html`: An iframe of YouTube running off of the locally hosted Alloy Proxy.
## Future Additions

- Cookie Authorization
- Filters

<a href="https://github.com/QuiteAFancyEmerald/Holy-Unblocker/blob/master/TODO.md">This</a> is our non-exhaustive todo list for Holy Unblocker LTS v6.x.x and above. Release for production will be v7.x.x and above.

## Vague Explanation for Beginners With External Proxies and Hosting

You will first want to host your proxies locally or externally.

Some good hosting options (both free and paid):

- <a href="https://heroku.com">Heroku</a> (Free)
- <a href="https://nodeclusters.com">NodeClusters</a> (Paid)
- <a href="https://glitch.com">Glitch</a> (Free)
- <a href="https://repl.it">Repl.it</a> (Free)
- <a href="https://azure.microsoft.com/en-us/">Azure</a> (Free and Paid)
|
||||
## Vague Explanation for Beginners With External Proxies and Hosting

You will first want to host your proxies locally or externally.

#### List of some good hosting options:

- <a href="https://crunchbits.com/">Crunchbits</a> (Current Hosting Provider)
- <a href="https://greencloudvps.com">Greencloud</a> (Paid)
- <a href="https://www.oracle.com/cloud">Oracle Cloud</a> (Free, Paid, Dedicated)
- <a href="https://azure.microsoft.com">Azure</a> (Free and Paid)

Out of the listed hosting providers, Crunchbits and Azure rank first as a preference. You may also self-host.

After you have selected a decent VPS, use Cloudflare for the DNS records for both the site and the subdomains for the proxies.

This is an example of DNS records. Self-hosting will preferably require `A records`.

<img src="https://raw.githubusercontent.com/titaniumnetwork-dev/Holy-Unblocker/master/views/assets/img/dnssetup.png" width="500"></img>

- `@` and `www.example.com` are being used for Holy Unblocker LTS.
- `a.example.com` is being used for other instances like Libreddit, Invidious or web-ported games, depending on what the site maintainer needs.

As stated previously, Holy Unblocker is hosted locally with Scramjet, Ultraviolet and Rammerhead out of the box. No need for external instances.

#### Domain Steps

- If you prefer to obtain premium domains (TLDs), use <a href="https://porkbun.com">Porkbun</a>, which offers domains for great prices; a `.org` domain normally costs around $5 for the first year.

#### Cloudflare Steps

- Use Cloudflare (make an account), add your site, and then add your various DNS targets to Cloudflare. Make sure you switch to Cloudflare's nameservers, which are provided when you add your site.

Records may be either CNAME or A records; try to follow this structure:

**Type | Name | Target**

`A | @ | VPS IP GOES HERE`
`A | www | VPS IP GOES HERE`
`A | a | VPS IP GOES HERE`

Make sure HTTPS is forced, and have SSL set to Flexible (if you don't use Let's Encrypt). Otherwise you can have SSL set to Full.

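The record layout above can also be written as a BIND-style zone snippet. This is only an illustration — `203.0.113.10` is a placeholder documentation IP, and in practice Cloudflare's dashboard manages these records for you:

```
example.com.      300  IN  A  203.0.113.10
www.example.com.  300  IN  A  203.0.113.10
a.example.com.    300  IN  A  203.0.113.10
```
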
#### Workspace Configurations

If you have your own device, use Visual Studio Code; it is arguably the best option you can get, though this is an opinion. Also make sure you have <a href="https://nodejs.org/">Node.js</a> installed on your machine.

Without going too in depth: first fork this repository, then clone it locally through a terminal of some sort, depending on what OS you are on. Make sure you navigate to the folder you want to set this up in.

```
git clone https://github.com/QuiteAFancyEmerald/Holy-Unblocker.git
cd Holy-Unblocker
npm run fresh-install

# If you wish to start the project
npm start

# For testing endpoints and errors
npm run test
```

Now simply open the folder you cloned this repo into in VS Code, then run `npm install`. If you are releasing this publicly on GitHub, I recommend adding a `.gitignore` in your root directory with the following exclusions:

```
node_modules
```

Your workspace environment is now set up. How to deploy it depends on your hosting provider, so look up their documentation.

For an online IDE that you can use on a school computer and/or Chromebook, use Gitpod — essentially Visual Studio Code with in-browser support.

- Make an account: `https://gitpod.io/`
- Fork this repo and enter this URL to set up your workspace: `https://gitpod.io#https://github.com/YourNameHere/Holy-Unblocker/`

Use the same steps as above by running `npm install` in your repository and adding a `.gitignore` in your root directory that excludes `node_modules`.

## Detailed FAQ

<details>
<summary>Quick FAQ</summary>

#### Where can I find the games for this repo? (404 errors, etc.)

Due to piracy concerns, size, etc., the games have been moved over <a href="https://github.com/QuiteAFancyEmerald/HU-Archive">here</a>. EmuLibrary is not featured in the public version.

**Why is the site I am on not working correctly or having CAPTCHA errors?**

CAPTCHA support is sadly spotty across all of the current proxies; it is primarily supported by Scramjet. As a result, some sites may not work with certain proxies.

**I am getting 502 errors. What do I do?**

When this happens you may either switch sites to fix the error or wait a bit. Sometimes clearing your cache can help.

If you still have any questions, feel free to ask them in the Discord linked here.

</details>

### Why are official domains now numbered? Is this project maintained again?

Yes, this project is active again for LTS support! However, the approach is now much simpler to ensure functionality: domain restocks as needed and a highly maintained source. More than ever, this project serves as a proof of concept for the brave souls willing to innovate in the web proxy service space.

<details><summary>Former Closing Message (Original - 2022)</summary>

This isn’t the greatest announcement sorry. After lots of thought and severe hesitation I’m shutting down Holy Unblocker and leaving TN. It's just been something that I’ve been super conflicted with for months hence the lack of updates and the massive gaps that happened last year. I just didn’t want to throw away a project that I passionately enjoyed and spent time on while making amazing friends and meeting epic devs here. I could go on forever for who these people are but ima like leave it here. They know who they are :D

The main change of thought is that I’m finally just putting an end right now due to 1) the lack of motivation 2) the community is NOT the greatest and not the nicest at times (have to put that out here) 3) the future doesn’t look so good for HU/TN as a project.

Some things I’ll be keeping secret since there are more reasons to this choice unless otherwise for those who don’t find this enough information. Good friends here will know that I’ve been super stressed about this choice for months now. Also regardless a good motivator for this choice is the fact that I’ll be graduating soon.

It’s possible that I may continue/come back for this in the future or keep it on GitHub only. I leave this here because even now I am still doubting myself about this change. But for now I’d check out other proxy sites like Incognito (Duce DOES a ton of updates frequently and he is the creator/developer of Ultraviolet so give him some love) :yayy_hopi:

Check out his Patreon also! For current HU patrons you will not be billed next month and the HU Patreon will be archived so head over to Duce’s patron so he can purchase more domains for Incognito.

With love <3
Emerald :HuTaoHype:

</details>

## More Information

This project is maintained by the Holy Unblocker LTS team and is an official flagship Titanium Network web proxy site. Credits also go to Titanium Network and all of its developers, as this project would not be possible without them.

- <a href="https://github.com/titaniumnetwork-dev/">https://github.com/titaniumnetwork-dev/</a>
- <a href="https://titaniumnetwork.org/">https://titaniumnetwork.org/</a>

View the official website for more detail and credits.

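The legacy Alloy proxy included in this diff (`app.js`) encodes each target URL as a base64 origin plus a plain path, and decodes it again when a proxied request arrives. Here is a minimal Node sketch of that scheme; the helper names are illustrative, not part of the actual source.

```javascript
// Encode: 'https://example.com/page' -> base64('https://example.com') + '/page'.
// Decode reverses it. Note: base64 output can itself contain '/', a known
// quirk of this scheme that the original code shares.
const b64encode = (s) => Buffer.from(s).toString('base64');
const b64decode = (s) => Buffer.from(s, 'base64').toString('utf-8');

function encodeProxyUrl(url) {
  const origin = b64encode(url.split('/').slice(0, 3).join('/'));
  const path = '/' + url.split('/').slice(3).join('/');
  return path === '/' ? origin : origin + path;
}

function decodeProxyUrl(encoded) {
  const origin = b64decode(encoded.split('/')[0]);
  const path = '/' + encoded.split('/').slice(1).join('/');
  return path === '/' ? origin : origin + path;
}

console.log(encodeProxyUrl('https://example.com/page'));
console.log(decodeProxyUrl(encodeProxyUrl('https://example.com/page')));
```

A request for the encoded form is what the `config.prefix` route in `app.js` receives and decodes before fetching the real destination.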
### Web Proxy Sources:

This project currently uses Scramjet and Ultraviolet as web proxies adhering to the Wisp protocol. Bare-Mux is utilized for swapping the transport systems used with Wisp. The included transport systems are EpoxyTransport and libcurl-transport. Rammerhead is also provided as an additional web proxy option.

- <a href="https://github.com/MercuryWorkshop/scramjet">Scramjet</a>
- <a href="https://github.com/titaniumnetwork-dev/Ultraviolet">Ultraviolet</a>
- <a href="https://github.com/MercuryWorkshop/wisp-server-node">Wisp-Server-Node</a>
- <a href="https://github.com/MercuryWorkshop/wisp-server-python">Wisp-Server-Python</a>
- <a href="https://github.com/MercuryWorkshop/EpoxyTransport">EpoxyTransport</a>
- <a href="https://github.com/MercuryWorkshop/CurlTransport">libcurl-transport</a>
- <a href="https://github.com/MercuryWorkshop/bare-mux">Bare-Mux</a>
- <a href="https://github.com/binary-person/rammerhead">Rammerhead</a>
- <a href="https://gist.github.com/BinBashBanana/a1fd7345e2d86e69d5a532f16cbdbdaa">DetectorDetector</a>

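Conceptually, Bare-Mux lets the service worker swap the underlying transport (EpoxyTransport or libcurl-transport) at runtime without the proxy code changing. The sketch below illustrates that idea in plain Node; it is an analogy only, not the real Bare-Mux API.

```javascript
// Illustrative transport multiplexer: both "transports" expose the same
// request() interface, so the consumer never cares which one is active.
class TransportMux {
  constructor() { this.transport = null; }
  setTransport(transport) { this.transport = transport; }
  fetch(url) {
    if (!this.transport) throw new Error('no transport set');
    return this.transport.request(url);
  }
}

// Stand-ins for Epoxy/libcurl, which would really carry traffic over Wisp.
const epoxyLike = { request: (url) => `epoxy:${url}` };
const libcurlLike = { request: (url) => `libcurl:${url}` };

const mux = new TransportMux();
mux.setTransport(epoxyLike);
console.log(mux.fetch('https://example.com'));
mux.setTransport(libcurlLike); // swapped at runtime, same interface
console.log(mux.fetch('https://example.com'));
```

In Holy Unblocker, this swap is what the settings menu triggers when the user changes transports.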
### Other Dependencies:

- <a href="https://github.com/tsparticles/tsparticles">tsparticles</a>
- <a href="https://github.com/fastify/fastify">fastify</a>
- <a href="https://github.com/fastify/fastify-helmet">@fastify/helmet</a>
- <a href="https://github.com/fastify/fastify-static">@fastify/static</a>
- <a href="https://github.com/DerpmanDev/modal">Modal</a>
- <a href="https://github.com/BinBashBanana/webretro">webretro</a>
- <a href="https://ruffle.rs/">Ruffle</a>
- <a href="https://github.com/michalsnik/aos">AOS</a>
- <a href="https://github.com/nordtheme">Nord Theme</a>
- <a href="https://fontawesome.com/">Font Awesome</a>

### Notable Mentions:

- <a href="https://crunchbits.com/">Crunchbits</a> (Hosting Provider)
- https://github.com/titaniumnetwork-dev/alloyproxy
- https://github.com/nfriedly/node-unblocker
- https://nodeclusters.com

119
TODO.md
@@ -1,119 +0,0 @@
This will be our non-exhaustive to-do list for Holy Unblocker LTS v6.x.x and above. Release for production will be v8.x.x and above.

## Proxy/Site Functionality

- [ ] Update to use scramjetFrame instead of our own window handling
- [ ] Implement wisp python in the project instead of the unreliable wisp-server-node
- [ ] Add bookmark menu (source-wise already present, pretty much)
- [ ] Add Chii + ensure users can access devtools while browsing - partial
- [ ] Setting to open multiple stealth frames; basically about:blank but using our system. Pops out in another tab
- [ ] Omnibox should state the current site the user is on, like a proper URL bar
- [ ] Improve adblocking functions on site using Workerware + a pre-bundled uBlock Origin
- [ ] Add a "website self-destruct" button to the settings menu
- [ ] Transport Options Swapping on Frame (Settings Menu doesn't swap)
- [ ] Implement advanced data URI system
- [ ] Allow custom Wisp URLs from the settings menu (not config side)
- [x] Swap to wisp-js over wisp-server-node for security and performance - done
- [x] Fix keyword/descriptor randomisation - done
- [x] Adopt the Wisp protocol, replacing bare, which is very insecure - done
- [x] Improved error handling for proxy errors - done
- [x] Ensure Ultraviolet is updated to support bare-mux and wisp - done
- [x] Ensure Scramjet is added and works together with UV's implementation - done
- [x] Adopt Scramjet as the main proxy for the project - done
- [x] Refactor register-sw.js - done
- [x] Add Rammerhead support - done
- [x] Fix slow Ultraviolet speeds despite being local; something on the backend? - done
- [x] Fix Ultraviolet on Firefox - (partial/needs work)
- [x] Adapt Applications page to use Scramjet (for Reddit, YouTube, Discord) - done
- [x] Added libcurl transport and epoxy transport to meet standards of SJ + Wisp - done
- [x] SOCKS5/Tor routing option that can be configured as a settings menu option - done
- [x] SOCKS5 regional proxy implementation - done
- [x] Update Applications page to reflect modern fast links (use examples from the modern web proxy base) - done; can be expanded later
- [x] Update settings menu again to make more room for more features - done
- [x] Update csel.js (after Settings menu redesign) to support custom transports, icon swap, routing - done
- [x] Flesh out and rework the UV / Scramjet / bare client error page - done
- [x] Update sw.js to support workerware (https://github.com/MercuryWorkshop/workerware) - not done as such, but we have our own middleware system implemented for adblocking, etc.
- [x] Omnibox autoupdate script (for the Google/Bing-style auto-suggest feature) - done
- [x] Omnibar functionality (back and forward navigation, settings menu and create new stealth page with URL) - done
- [x] Games library will feature new games - done
- [x] Servers now utilise NextDNS w/ ads and malware blocked; anycast + low latency - done
- [x] Revamp the Stealth Frame with a slight animation (ease in, then the loading wheel with a gradient fading away once it's loaded or the error page shows), a loading wheel/page, and lastly an omnibox widget. It will have nav buttons, some of the settings from the settings menu, a home button, a button that brings up the Settings menu, and be in a designed position. The intent is to reduce the back-and-forth users currently have to do, which makes the site tedious to use. - partial; needs functionality

## Code Cleanup

- [x] Optimize the JS. This time it won't be in one line and will be somewhat thoroughly commented.
- [x] Ensure all the original submodules get added back to HU-Archive
- [x] SEO overhaul adapted from the v3 SEO Guide format - partial
- [x] Optimize the stylesheets and the HTML layout. Add more proper commenting and redivide the code so that it's less hard on the eyes.
- [x] Remove all current obfuscation in the source code. It needs to be dynamically obfuscated if anything, or not obfuscated at all. This option will be a config option on the server side before rendering with Fastify for a performance focus. Meta elements will have an additional attribute indicating if they should be moved. This is to ensure that either an SEO-focused source or a source focused on pure censorship evasion can be served by config.
- [x] Restructure navigation scripts to ensure updated proxy functionality is sanitized and effective - done
- [x] Particles.js automatically adjusting per display size - done
- [x] Fix routes.mjs throwing with incorrect paths - done
- [x] Create CI testing script - done
- [x] XSS and fingerprinting protection (may need updates) - done
- [x] Greatly improved native source rewrites and routing - done
- [x] Update games navigation JS and page/change to JSON object system - done
- [x] Mobile support - (partial)
- [x] Fastify routes modified to ensure perfect SEO. This means absolute paths such as /example instead of ?z - done
- [x] Randomize the \_\_uv$config global, and optionally randomize the UV prefix and URL encoding via cookies

## Site Redesign

- [x] Documentation on-site + Getting Started information updated (Tor, etc.) - partial; good enough
- [x] Update colors + add themes - done
- [x] Landing Cards - done
- [x] Change fonts to cleaner look
- [x] Add more AOS interactions on scroll or hover
- [x] Add subtle noise to background elements
- [x] Toggle elements
- [x] Other card options
- [x] Radial blur elements
- [x] Code standard examples - in source
- [x] Horizontal/general movement on scroll with AOS
- [x] Showcase dev dependencies
- [x] Update branding and icons
- [x] Landing Page
- [x] Settings Menu - partial; I want to fix some colours
- [x] More Dropdown Menu
- [x] Web Proxies page
- [x] Application page
- [x] Games Library page
- [x] Emulators Library page
- [x] Emu Library page
- [x] Web Games page
- [x] Flash Games page
- [x] FAQ page
- [x] Credits page
- [x] TOS page
- [x] Footer Design
- [x] Header Design

## Community Requests

- [x] Add [Quake WASM](https://github.com/GMH-Code/Quake-WASM)
- [x] Celeste WASM

## Changelog (Old; too lazy to type it all out now)

- Added wisp support
- Fixed AD config setting being opt-out; ads are not implemented in the project, however
- Added Rammerhead support (locally)
- Drastically updated visuals across the service and refactored stylesheets
- Bumped games page functionality
- Updated randomization scripts to ES6 syntax and implemented the alternative to RegEx string replacement
- Helmet for Express implemented into the backend
- Improved component handling via templates.mjs, along with deletion of obsolete files that previously handled this standard in a poor format
- Fixed oddly slow speeds with Ultraviolet (as well as a general version bump to support epoxy-tls and bare-mux)
- Implemented testing scripts for an improved GitHub Actions workflow by doing a quick test on proxy + site functionality
- Greatly optimized client-side scripts across the site with a new standard, and generally reworked to no longer leave global variables
- Changes to server.mjs with path logic and error handling
- Updated standards for common scripts
- libcurl and bare-as-module support added
- Deleted 5 JS scripts and moved lots of data into JSON files. Big reorganization. Games menu core scripts now nested inside of common.js utilizing a JSON system
- Massive updates to the Settings menu, visually and functionality-wise; added Bare-Mux support for swapping transports to work with Ultraviolet, default icons and selective adblocking + Tor on any proxy instances
- CSS has been partially restructured for mobile support, and is now properly arranged into clearly labeled sections (for the most part)
- Incorporated makeshift domain-blacklisting functionality into Ultraviolet, currently used for blocking ads if ads are disabled in settings
- Fleshed out the SEO with more descriptions and better labeling
- Switched to Fastify for serving content from the backend; a separate Express backend file is kept in case it's still needed
- Rammerhead is now locally built into the HU LTS repository
- Simplified the HU LTS setup process and added more default npm commands

309
app.js
Normal file
@@ -0,0 +1,309 @@
const express = require('express'),
    app = express(),
    http = require('http'),
    https = require('https'),
    fs = require('fs'),
    querystring = require('querystring'),
    session = require('express-session'),
    sanitizer = require('sanitizer'),
    fetch = require('node-fetch');

const config = JSON.parse(fs.readFileSync('./config.json', { encoding: 'utf8' }));
if (!config.prefix.startsWith('/')) {
    config.prefix = `/${config.prefix}`;
}

if (!config.prefix.endsWith('/')) {
    config.prefix = `${config.prefix}/`;
}

let server;
let server_protocol;
const server_options = {
    key: fs.readFileSync('./ssl/default.key'),
    cert: fs.readFileSync('./ssl/default.crt')
};
if (config.ssl == true) {
    server = https.createServer(server_options, app);
    server_protocol = 'https://';
} else {
    server = http.createServer(app);
    server_protocol = 'http://';
}

var login = require('./auth');

console.log(`Alloy Proxy now running on ${server_protocol}0.0.0.0:${config.port}! Proxy prefix is "${config.prefix}"!`);
server.listen(process.env.PORT || config.port);

// Base64 helpers (Buffer.from is a factory function, not a constructor).
btoa = (str) => {
    return Buffer.from(str).toString('base64');
};

atob = (str) => {
    return Buffer.from(str, 'base64').toString('utf-8');
};

rewrite_url = (dataURL, option) => {
    var websiteURL;
    var websitePath;
    if (option == 'decode') {
        websiteURL = atob(dataURL.split('/').splice(0, 1).join('/'));
        websitePath = '/' + dataURL.split('/').splice(1).join('/');
    } else {
        websiteURL = btoa(dataURL.split('/').splice(0, 3).join('/'));
        websitePath = '/' + dataURL.split('/').splice(3).join('/');
    }
    if (websitePath == '/') { return `${websiteURL}`; } else return `${websiteURL}${websitePath}`;
};

app.use(session({
    secret: 'alloy',
    cookie: { sameSite: 'none', secure: 'true' },
    saveUninitialized: true,
    resave: true
}));

// We made our own version of body-parser instead, due to issues.
app.use((req, res, next) => {
    if (req.method == 'POST') {
        req.raw_body = '';
        req.on('data', chunk => {
            req.raw_body += chunk.toString(); // convert Buffer to string
        });
        req.on('end', () => {
            req.str_body = req.raw_body;
            try {
                req.body = JSON.parse(req.raw_body);
            } catch (err) {
                req.body = {};
            }
            next();
        });
    } else return next();
});

app.use(`${config.prefix}utils/`, async (req, res, next) => {
    // Return here so a matching asset request cannot fall through and respond twice.
    if (req.url.startsWith('/assets/')) { return res.sendFile(__dirname + '/utils' + req.url); }
    if (req.query.url) {
        let url = atob(req.query.url);
        if (url.startsWith('//')) {
            url = 'http:' + url;
        } else if (!url.startsWith('https://') && !url.startsWith('http://')) {
            url = 'http://' + url;
        }
        return res.redirect(307, config.prefix + rewrite_url(url));
    }
});

app.post(`${config.prefix}session/`, async (req, res, next) => {
    /* var cookies = request.cookies;
    console.log(cookies);
    if ('session_id' in cookies) {
        var sid = cookies['session_id'];
        if (login.isLoggedIn(sid)) {
            response.setHeader('Set-Cookie', 'session_id=' + sid);
            response.end(login.hello(sid));
        } else {
            response.end("Invalid session_id! Please login again\n");
        }
    } else {
        response.end("Please login via HTTP POST\n");
    } */
    let url = querystring.parse(req.raw_body).url;
    if (url.startsWith('//')) { url = 'http:' + url; } else if (!url.startsWith('https://') && !url.startsWith('http://')) { url = 'http://' + url; }
    return res.redirect(config.prefix + rewrite_url(url));
});

app.use(config.prefix, async(req, res, next) => {
|
||||
var proxy = {};
|
||||
proxy.url = rewrite_url(req.url.slice(1), 'decode');
|
||||
proxy.url = {
|
||||
href: proxy.url,
|
||||
hostname: proxy.url.split('/').splice(2).splice(0, 1).join('/'),
|
||||
origin: proxy.url.split('/').splice(0, 3).join('/'),
|
||||
encoded_origin: btoa(proxy.url.split('/').splice(0, 3).join('/')),
|
||||
path: '/' + proxy.url.split('/').splice(3).join('/'),
|
||||
protocol: proxy.url.split('\:').splice(0, 1).join(''),
|
||||
}
|
||||
|
||||
proxy.url.encoded_origin = btoa(proxy.url.origin);
|
||||
|
||||
proxy.requestHeaders = req.headers;
|
||||
proxy.requestHeaders['host'] = proxy.url.hostname;
|
||||
if (proxy.requestHeaders['referer']) {
|
||||
let referer = '/' + String(proxy.requestHeaders['referer']).split('/').splice(3).join('/');
|
||||
|
||||
referer = rewrite_url(referer.replace(config.prefix, ''), 'decode');
|
||||
|
||||
if (referer.startsWith('https://') || referer.startsWith('http://')) {
|
||||
referer = referer;
|
||||
|
||||
} else referer = proxy.url.href;
|
||||
|
||||
proxy.requestHeaders['referer'] = referer;
|
||||
}
|
||||
|
||||
|
||||
if (proxy.requestHeaders['origin']) {
|
||||
let origin = '/' + String(proxy.requestHeaders['origin']).split('/').splice(3).join('/');
|
||||
|
||||
origin = rewrite_url(origin.replace(config.prefix, ''), 'decode');
|
||||
|
||||
if (origin.startsWith('https://') || origin.startsWith('http://')) {
|
||||
|
||||
origin = origin.split('/').splice(0, 3).join('/');
|
||||
|
||||
} else origin = proxy.url.origin;
|
||||
|
||||
proxy.requestHeaders['origin'] = origin;
|
||||
}
|
||||
|
||||
if (proxy.requestHeaders.cookie) {
|
||||
delete proxy.requestHeaders.cookie;
|
||||
}
|
||||
const httpAgent = new http.Agent({
|
||||
keepAlive: true
|
||||
});
|
||||
const httpsAgent = new https.Agent({
|
||||
keepAlive: true
|
||||
});
|
||||
proxy.options = {
|
||||
method: req.method,
|
||||
headers: proxy.requestHeaders,
|
||||
redirect: 'manual',
|
||||
agent: function(_parsedURL) {
|
||||
if (_parsedURL.protocol == 'http:') {
|
||||
return httpAgent;
|
||||
} else {
|
||||
return httpsAgent;
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
if (req.method == 'POST') {
|
||||
proxy.options.body = req.str_body;
|
||||
}
|
||||
if (proxy.url.hostname == 'discord.com' && proxy.url.path == '/') { return res.redirect(307, config.prefix + rewrite_url('https://discord.com/login')); };
|
||||
|
||||
if (proxy.url.hostname == 'www.reddit.com') { return res.redirect(307, config.prefix + rewrite_url('https://old.reddit.com')); };
|
||||
|
||||
if (!req.url.slice(1).startsWith(`${proxy.url.encoded_origin}/`)) { return res.redirect(307, config.prefix + proxy.url.encoded_origin + '/'); };
|
||||
|
||||
proxy.response = await fetch(proxy.url.href, proxy.options).catch(err => res.send(fs.readFileSync('./utils/error/error.html', 'utf8').toString().replace('%ERROR%', `Error 400: Could not make request to '${sanitizer.sanitize(proxy.url.href)}'!`)));
|
||||
|
||||
if (typeof proxy.response.buffer != 'function') return;
|
||||
|
||||
proxy.buffer = await proxy.response.buffer();
|
||||
|
||||
proxy.content_type = 'text/plain';
|
||||
|
||||
proxy.response.headers.forEach((e, i, a) => {
|
||||
if (i == 'content-type') proxy.content_type = e;
|
||||
});
|
||||
if (proxy.content_type == null || typeof proxy.content_type == 'undefined') proxy.content_type = 'text/html';
|
||||
|
||||
proxy.sendResponse = proxy.buffer;
|
||||
|
||||
// Parsing the headers from the response to remove square brackets so we can set them as the response headers.
|
||||
proxy.headers = Object.fromEntries(
|
||||
Object.entries(JSON.parse(JSON.stringify(proxy.response.headers.raw())))
|
||||
.map(([key, val]) => [key, val[0]])
|
||||
);
|
||||
|
||||
// Parsing all the headers to remove all of the bad headers that could affect proxies performance.
|
||||
Object.entries(proxy.headers).forEach(([header_name, header_value]) => {
|
||||
if (header_name.startsWith('content-encoding') || header_name.startsWith('x-') || header_name.startsWith('cf-') || header_name.startsWith('strict-transport-security') || header_name.startsWith('content-security-policy')) {
|
||||
delete proxy.headers[header_name];
|
||||
}
|
||||
});
|
||||
|
||||
// If theres a location for a redirect in the response, then the proxy will get the response location then redirect you to the proxied version of the url.
|
||||
if (proxy.response.headers.get('location')) {
|
||||
return res.redirect(307, config.prefix + rewrite_url(String(proxy.response.headers.get('location'))));
|
||||
}
|
||||
|
||||
res.status(proxy.response.status);
|
||||
res.set(proxy.headers);
|
||||
res.contentType(proxy.content_type);
|
||||
if (proxy.content_type.startsWith('text/html')) {
|
||||
req.session.url = proxy.url.origin;
|
||||
proxy.sendResponse = proxy.sendResponse.toString()
|
||||
.replace(/integrity="(.*?)"/gi, '')
|
||||
.replace(/nonce="(.*?)"/gi, '')
|
||||
.replace(/(href|src|poster|data|action|srcset)="\/\/(.*?)"/gi, `$1` + `="http://` + `$2` + `"`)
|
||||
.replace(/(href|src|poster|data|action|srcset)='\/\/(.*?)'/gi, `$1` + `='http://` + `$2` + `'`)
|
||||
.replace(/(href|src|poster|data|action|srcset)="\/(.*?)"/gi, `$1` + `="${config.prefix}${proxy.url.encoded_origin}/` + `$2` + `"`)
|
||||
.replace(/(href|src|poster|data|action|srcset)='\/(.*?)'/gi, `$1` + `='${config.prefix}${proxy.url.encoded_origin}/` + `$2` + `'`)
|
||||
.replace(/'(https:\/\/|http:\/\/)(.*?)'/gi, function(str) {
|
||||
str = str.split(`'`).slice(1).slice(0, -1).join(``);
|
||||
return `'${config.prefix}${rewrite_url(str)}'`
|
||||
})
|
||||
.replace(/"(https:\/\/|http:\/\/)(.*?)"/gi, function(str) {
|
||||
str = str.split(`"`).slice(1).slice(0, -1).join(``);
|
||||
return `"${config.prefix}${rewrite_url(str)}"`
|
||||
})
|
||||
.replace(/(window|document).location.href/gi, `"${proxy.url.href}"`)
|
||||
.replace(/(window|document).location.hostname/gi, `"${proxy.url.hostname}"`)
|
||||
.replace(/(window|document).location.pathname/gi, `"${proxy.url.path}"`)
|
||||
.replace(/location.href/gi, `"${proxy.url.href}"`)
|
||||
.replace(/location.hostname/gi, `"${proxy.url.hostname}"`)
|
||||
.replace(/location.pathname/gi, `"${proxy.url.path}"`)
|
||||
.replace(/<html(.*?)>/gi, `<html` + '$1' + `><script src="${config.prefix}utils/assets/inject.js" id="_alloy_data" prefix="${config.prefix}" url="${btoa(proxy.url.href)}"></script>`);
|
||||
|
||||
// Temp hotfix for Youtube search bar until my script injection can fix it.
|
||||
|
||||
if (proxy.url.hostname == 'www.youtube.com') { proxy.sendResponse = proxy.sendResponse.replace(/\/results/gi, `${config.prefix}${proxy.url.encoded_origin}/results`); };
|
||||
} else if (proxy.content_type.startsWith('text/css')) {
    proxy.sendResponse = proxy.sendResponse.toString()
        .replace(/url\("\/\/(.*?)"\)/gi, `url("http://` + `$1` + `")`)
        .replace(/url\('\/\/(.*?)'\)/gi, `url('http://` + `$1` + `')`)
        .replace(/url\(\/\/(.*?)\)/gi, `url(http://` + `$1` + `)`)
        .replace(/url\("\/(.*?)"\)/gi, `url("${config.prefix}${proxy.url.encoded_origin}/` + `$1` + `")`)
        .replace(/url\('\/(.*?)'\)/gi, `url('${config.prefix}${proxy.url.encoded_origin}/` + `$1` + `')`)
        .replace(/url\(\/(.*?)\)/gi, `url(${config.prefix}${proxy.url.encoded_origin}/` + `$1` + `)`)
        .replace(/"(https:\/\/|http:\/\/)(.*?)"/gi, function(str) {
            str = str.split(`"`).slice(1).slice(0, -1).join(``);
            return `"${config.prefix}${rewrite_url(str)}"`;
        })
        .replace(/'(https:\/\/|http:\/\/)(.*?)'/gi, function(str) {
            str = str.split(`'`).slice(1).slice(0, -1).join(``);
            return `'${config.prefix}${rewrite_url(str)}'`;
        })
        .replace(/\((https:\/\/|http:\/\/)(.*?)\)/gi, function(str) {
            str = str.split(`(`).slice(1).join(``).split(')').slice(0, -1).join('');
            return `(${config.prefix}${rewrite_url(str)})`;
        });
};
// Send the rewritten response back to the client.
res.send(proxy.sendResponse);
});

app.use('/', express.static('public'));

app.use(async (req, res, next) => {
    if (req.headers['referer']) {
        let referer = '/' + String(req.headers['referer']).split('/').splice(3).join('/');

        referer = rewrite_url(referer.replace(config.prefix, ''), 'decode').split('/').splice(0, 3).join('/');

        if (referer.startsWith('https://') || referer.startsWith('http://')) {
            res.redirect(307, config.prefix + btoa(referer) + req.url);
        } else if (req.session.url) {
            res.redirect(307, config.prefix + btoa(req.session.url) + req.url);
        } else return next();
    } else if (req.session.url) {
        res.redirect(307, config.prefix + btoa(req.session.url) + req.url);
    } else return next();
});
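The rewriting above leans on a `rewrite_url` helper and a `config.prefix` + base64-encoded-origin URL scheme (`btoa(proxy.url.href)`, `proxy.url.encoded_origin`). `rewrite_url` itself is not part of this hunk, so the following is only a minimal sketch of that scheme under the assumption that only the origin is base64-encoded while the path and query stay readable; the helper names here are illustrative, not from the source:

```javascript
// Minimal sketch of the prefix + base64-origin URL scheme (assumed from how
// btoa(proxy.url.href) and proxy.url.encoded_origin are used above).
const btoa = (str) => Buffer.from(str, 'binary').toString('base64');
const atob = (b64) => Buffer.from(b64, 'base64').toString('binary');

const prefix = '/home/'; // stands in for config.prefix

function toProxyPath(targetUrl) {
  const url = new URL(targetUrl);
  // Encode only the origin; keep the path and query readable.
  return prefix + btoa(url.origin) + url.pathname + url.search;
}

function fromProxyPath(proxyPath) {
  const [encodedOrigin, ...rest] = proxyPath.slice(prefix.length).split('/');
  return atob(encodedOrigin) + '/' + rest.join('/');
}
```

A round trip through both helpers recovers the original target URL, which is the property the referer-fixing middleware above depends on.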
app.json
@@ -1,18 +1,7 @@
 {
-  "name": "Holy Unblocker v6.8.7",
-  "description": "Holy Unblocker is a secure web proxy service supporting numerous sites while concentrating on detail with design, mechanics, and features. Bypass web filters regardless of whether it is an extension or network-based.",
-  "repository": "https://github.com/QuiteAFancyEmerald/Holy-Unblocker",
-  "logo": "https://raw.githubusercontent.com/QuiteAFancyEmerald/Holy-Unblocker/master/views/assets/img/icon.png",
-  "keywords": [
-    "holyunblocker",
-    "rammerhead",
-    "scramjet",
-    "ultraviolet",
-    "titaniumnetwork",
-    "node",
-    "proxy",
-    "unblocker",
-    "webproxy",
-    "games"
-  ]
-}
+  "name": "Holy Unblocker",
+  "description": "A website that can be used to bypass web filters; both extension and firewall. Alloy Proxy hosted locally. (Can be used as a template.)",
+  "repository": "https://github.com/QuiteAFancyEmerald/HolyUnblockerPublic/",
+  "logo": "https://www.holyubofficial.ml/assets/img/i.png",
+  "keywords": ["node", "proxy", "unblocker", "webproxy", "games", "holyunblocker", "alloy"]
+}
auth.js (new file)
@@ -0,0 +1,62 @@
#!/usr/bin/env node

var crypto = require('crypto');
var os = require('os');
var querystring = require('querystring');
var url = require('url');

var Cookies = require('cookies');

var settings = {
    hashes: [],
    redirect: '/',
};

module.exports = function(env) {
    for (var k in env) {
        settings[k] = env[k];
    }
    return module.exports;
};

module.exports.auth = function(req, res, next) {
    // Allow use with Express as well as socket.io.
    next = next || res;
    var cookies = new Cookies(req);
    var hash = cookies.get('session') ?
        module.exports.hash(cookies.get('session')) : '';
    if (settings.hashes.indexOf(hash) >= 0) {
        next();
    } else {
        next(new Error('Bad session key.'));
    }
};

module.exports.sign = function(req, res, next) {
    var cookies = new Cookies(req, res);
    var query = url.parse(req.url, true).query;
    cookies.set('session', query.key ? module.exports.hash(query.key) : null);
    res.writeHead(302, { location: query.path ? query.path : settings.redirect });
    res.end();
};

module.exports.generate = function() {
    var key = crypto.randomBytes(24).toString('base64');
    var hash = module.exports.hash(module.exports.hash(key));
    settings.hashes.push(hash);
    return { key: key, hash: hash };
};

module.exports.hash = function(key) {
    var hmac = crypto.createHmac('SHA256', key);
    hmac.update(key);
    return hmac.digest('base64');
};

if (require.main === module) {
    var pair = module.exports.generate();
    console.log('Call authlink.generate() for a keypair or add\n' +
        'authlink({hashes:[\'' + pair.hash + '\']})\n' +
        'and then authenticate on authlink.sign with the querystring\n?' +
        querystring.stringify({ key: pair.key }));
}
@@ -1,3 +0,0 @@
(async () => {
    await import('./src/server.mjs');
})();
config.json
@@ -1,10 +1,5 @@
 {
-    "title": "HU LTS",
-    "host": "0.0.0.0",
-    "pathname": "/",
-    "minifyScripts": true,
-    "randomizeIdentifiers": true,
-    "production": false,
-    "disguiseFiles": true,
-    "usingSEO": false
-}
+    "port": "8080",
+    "prefix": "/home/",
+    "ssl": false
+}
@@ -1,12 +0,0 @@
version: "3.9"

services:
  holy-unblocker:
    image: quiteafancyemerald/holy-unblocker:6.9.4
    build: .
    container_name: holy-unblocker
    ports:
      - "8080:8080"
    restart: unless-stopped
    environment:
      NODE_ENV: production
@@ -1,60 +0,0 @@
module.exports = {
  apps: [
    {
      name: 'HolyUBLTS',
      script: './backend.js',
      env: {
        PORT: 8080,
        NODE_ENV: 'development',
      },
      env_production: {
        PORT: 8080,
        NODE_ENV: 'production',
      },
      instances: '1',
      exec_interpreter: 'babel-node',
      exec_mode: 'fork',
      autorestart: true,
      exp_backoff_restart_delay: 100,
      cron_restart: '*/10 * * * *',
      kill_timeout: 3000,
      watch: false,
    },
    {
      name: 'HolyUBLTS-src-refresh',
      script: './run-command.mjs',
      args: 'build',
      env: {
        NODE_ENV: 'development',
      },
      env_production: {
        NODE_ENV: 'production',
      },
      instances: '1',
      exec_interpreter: 'babel-node',
      exec_mode: 'fork',
      autorestart: true,
      restart_delay: 1000 * 60 * 10,
      kill_timeout: 3000,
      watch: false,
    },
    {
      name: 'HolyUBLTS-cache-clean',
      script: './run-command.mjs',
      args: 'clean',
      env: {
        NODE_ENV: 'development',
      },
      env_production: {
        NODE_ENV: 'production',
      },
      instances: '1',
      exec_interpreter: 'babel-node',
      exec_mode: 'fork',
      autorestart: true,
      restart_delay: 1000 * 60 * 60 * 24 * 7,
      kill_timeout: 3000,
      watch: false,
    },
  ],
};
@@ -1,144 +0,0 @@
## v1.2.41

- handle `.get()` returning undefined in removeStaleSessions (from corrupted session files)

## v1.2.4

- fix crashes from corrupted sessions

## v1.2.3

- fix memory usage issues when downloading huge files
- fix iframing the cross-origin proxy

## v1.2.2

- add a disk cache option for processed JS files. Fixes huge server memory usage and lets workers share the same cache
- update `testcafe-hammerhead` to `v24.5.18`. Fixes huge server slowdowns, as the brotli compression level is now adjusted to a much more reasonable value

## v1.2.11

- fix huge spikes of memory usage by replacing the localStorage system with a custom one
- more fixes for iframing

## v1.2.01

- avoid using the unstable API `fs.cpSync` in build.js

## v1.2.0

- added multithreading support

## v1.1.34

- convert hooks to a stackable rewrite system

## v1.1.33

- delete hooks only after all fix function calls

## v1.1.32

- fix localStorage communication between windows by forcing them to read/write from realLocalStorage on every (get/set)Item call

## v1.1.31

- add an argument for ignoring files in `addStaticFilesToProxy`
- fix parseProxyUrl().proxy.port for 443 and 80 urls

## v1.1.3

- add an option to restrict a session to one IP

## v1.1.21

- fix rewriting only non-websocket server headers
- fix errors when calling focus()/click()... on a closed iframe
- don't strip headers (hook onto res.writeHead) if the connection is a websocket

## v1.1.2

- build to rammerhead.js and rammerhead.min.js
- fix same-domain iframes
- add jsdoc definitions for rammerhead store classes
- fix the http proxy setting not deleting correctly

## v1.1.1

- fix uncatchable connection crash errors
- avoid shuffling percent encodings
- prevent forwarding the localStorage endpoint to the site via referrer
- fix (un)shuffle for location.hash and location.search

## v1.1.0

- add url encoding
- handle ECONNRESET manually
- bring back the MemoryStore class for module exports
- add a server option to disable localStorage syncing
- fix `RammerheadSessionFileCache` not saving the cache to disk correctly

## v1.0.8

- handle the websocket EPIPE error
- replace hammerhead's connection reset guard with rammerhead's non-crashing reset guard
- add the missing element attr getter unrewrite
- fix url rewriting for ports 80 and 443

## v1.0.7

- disable http2 support (for proxy to destination sites) because the error handling is too complicated
- removed the server headers `report-to` (to avoid a proxy url leak) and `cross-origin-embedder-policy` (which fixes reCAPTCHA v3)

## v1.0.61

- fix logger.error being undefined (caused by not fully updating arguments for httpResponse.badRequest)

## v1.0.6

- expose more utils for the npm package
- show a password box if needed for the html demo

## v1.0.5

- expose more modules for the npm package
- add support for .env files
- add the `deleteUnused` config option
- fix the default 3-day session delete

## v1.0.43

- revert "revert fix for fix npm package"

## v1.0.42

- add entrypoint index.js for the rammerhead package
- add package-lock.json to source control

## v1.0.41

- update the demo link
- fix the npm package

## v1.0.4

- add support for the environment variable `DEVELOPMENT`
- fix a crash when fetching /deletesession with a non-existent session id

## v1.0.3

- fix stability issues with websockets

## v1.0.2

- update `testcafe-hammerhead` to `v24.5.13`

## v1.0.1

- removed multi-worker and rate-limiting support to defer the complexity to more suitable platforms like Docker. See [this commit](https://github.com/binary-person/rammerhead/tree/31ac3d23f30487f0dcd14323dc029f4ceb3b235a) if you wish to see the original attempt at this.
- removed unused session cleanup (as traversing the session list forces the cache into memory)
- lots of cleanup

## v1.0.0

- Initial commit
@@ -1,22 +0,0 @@
# Rammerhead

This fork is built around running Rammerhead behind a reverse proxy, and it is easy to run with Docker.

> [!IMPORTANT]
> This is designed to run under the directory `/rammer/`; any other route won't work.

Examples using this fork:

- [Ruby](https://ruby.rubynetwork.co)
- [Github](https://github.com/ruby-network/ruby)

Other forks:

- https://github.com/nebulaservices/rammerhead

(The fork that most other forks are based off of is below)

- https://github.com/holy-unblocker/rammerhead

Original:

https://github.com/binary-person/rammerhead
@@ -1,71 +0,0 @@
'use strict';

const cookie = require('cookie');

module.exports = {
    //// HOSTING CONFIGURATION ////

    bindingAddress: '0.0.0.0',
    port: process.env.PORT || 3000,
    crossDomainPort: null,
    publicDir: null,

    ssl: null,

    // This function's return object determines how client URL rewriting works.
    // Set these differently from bindingAddress and port if rammerhead is being
    // served from behind a reverse proxy.
    getServerInfo: (req) => {
        const { origin_proxy } = cookie.parse(req.headers.cookie || '');

        let origin;

        try {
            origin = new URL(origin_proxy);
        } catch (error) {
            origin = new URL(`${req.socket.encrypted ? 'https:' : 'http:'}//${req.headers.host}`);
        }

        const { hostname, port, protocol } = origin;

        return {
            hostname,
            port,
            crossDomainPort: port,
            protocol
        };
    },

    password: null,

    // disable or enable localStorage syncing (turn off if clients send huge
    // localStorage payloads, resulting in huge memory usage)
    disableLocalStorageSync: false,

    // restrict each session to a single IP
    restrictSessionToIP: false,

    //// REWRITE HEADER CONFIGURATION ////

    stripClientHeaders: [
        'cf-ipcountry',
        'cf-ray',
        'x-forwarded-proto',
        'cf-visitor',
        'cf-connecting-ip',
        'cdn-loop',
        'x-forwarded-for'
    ],
    rewriteServerHeaders: {
        // you can also specify a function to modify/add the header using the original value (undefined if adding the header)
        // 'x-frame-options': (originalHeaderValue) => '',
        'x-frame-options': null // set to null to tell rammerhead to delete it
    },

    //// LOGGING CONFIGURATION ////

    // valid values: 'disabled', 'debug', 'traffic', 'info', 'warn', 'error'
    generatePrefix: (level) => `[${new Date().toISOString()}] [${level.toUpperCase()}] `,

    // the logger depends on this value
    getIP: (req) => (req.headers['x-forwarded-for'] || req.connection.remoteAddress || '').split(',')[0].trim()
};
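The `getServerInfo` above derives the rewrite origin from an `origin_proxy` cookie when one is present and parsable, and otherwise falls back to rebuilding it from the request's own protocol and `Host` header. That fallback can be sketched on its own; for brevity this sketch takes a pre-parsed `req.cookies` object rather than parsing the `Cookie` header with the `cookie` package as the real config does:

```javascript
// Sketch of getServerInfo's origin fallback: prefer the origin_proxy cookie,
// fall back to the request's own protocol and Host header. `req` here is a
// plain object standing in for http.IncomingMessage.
function originFromRequest(req) {
  try {
    // Throws when the cookie is missing or not a valid URL.
    return new URL(req.cookies.origin_proxy);
  } catch (error) {
    return new URL(`${req.socket.encrypted ? 'https:' : 'http:'}//${req.headers.host}`);
  }
}
```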
lib/rammerhead/package-lock.json (generated)
File diff suppressed because it is too large.

@@ -1,56 +0,0 @@
{
    "name": "@rubynetwork/rammerhead",
    "version": "1.2.41-ruby.2",
    "description": "User friendly web proxy powered by testcafe-hammerhead",
    "main": "src/index.js",
    "private": false,
    "scripts": {
        "start": "node server.js",
        "build": "node src/build.js",
        "bstart": "npm run build && npm run start",
        "test": "npm run format && npm run lint && npm run build",
        "lint": "eslint -c .eslintrc.json --ext .js src",
        "format": "prettier --write 'src/**/*.js'",
        "clientes5": "es-check es5 src/client/*.js public/**/*.js"
    },
    "repository": {
        "type": "git",
        "url": "git+https://github.com/binary-person/rammerhead.git"
    },
    "author": "Simon Cheng <simoncheng559@gmail.com> (https://github.com/binary-person)",
    "license": "MIT",
    "bugs": {
        "url": "https://github.com/binary-person/rammerhead/issues"
    },
    "homepage": "https://github.com/binary-person/rammerhead#readme",
    "dependencies": {
        "async-exit-hook": "^2.0.1",
        "cookie": "^0.5.0",
        "fastify": "^4.26.2",
        "keyv-lru-files": "github:holy-unblocker/keyv-lru-files",
        "mime": "^2.5.2",
        "testcafe-hammerhead": "24.5.18",
        "uglify-js": "^3.15.3",
        "uuid": "^8.3.2",
        "ws": "^8.2.0"
    },
    "devDependencies": {
        "babel-eslint": "^10.1.0",
        "dotenv-flow": "^3.2.0",
        "eslint": "^7.32.0",
        "npm-force-resolutions": "0.0.10",
        "prettier": "^2.3.2"
    },
    "resolutions": {
        "tmp": "0.2.1"
    },
    "files": [
        "src/*",
        "holy-config.js",
        "CHANGELOG.md",
        "package.json",
        "README.md",
        "sessions/.gitkeep",
        "cache-js/.gitkeep"
    ]
}
@@ -1,75 +0,0 @@
require('dotenv-flow').config();

const path = require('path');
const fs = require('fs');
const UglifyJS = require('uglify-js');

// modify unmodifiable items that cannot be hooked in rammerhead.js
fs.writeFileSync(
    path.join(__dirname, './client/hammerhead.js'),
    // part of the fix for the iframing issue
    'window["%is-hammerhead%"] = true;\n' +
        fs
            .readFileSync(path.join(__dirname, '../node_modules/testcafe-hammerhead/lib/client/hammerhead.js'), 'utf8')
            // fix the iframing proxy issue
            .replace(
                /window === window\.top/g,
                '((window.parent === window.top && !window.top["%hammerhead%"]) || window === window.top)'
            )
            .replace(
                'isCrossDomainParent = parentLocationWrapper === parentWindow.location',
                'isCrossDomainParent = parentLocationWrapper === parentWindow.location || !parentWindow["%hammerhead%"]'
            )
            .replace(
                '!sameOriginCheck(window1Location, window2Location)',
                '!(sameOriginCheck(window1Location, window2Location) && (!!window1["%is-hammerhead%"] === !!window2["%is-hammerhead%"]))'
            )
            // return false when unable to convert properties on other windows to booleans (!)
            .replace(
                /!(parent|parentWindow|window1|window2|window\.top)\[("%(?:is-)?hammerhead%")]/g,
                '!(() => { try{ return $1[$2]; }catch(error){ return true } })()'
            )

            // disable saving to localStorage as we are using a completely different implementation
            .replace('saveToNativeStorage = function () {', 'saveToNativeStorage = function () {return;')

            // prevent calls to elements on a closed iframe
            .replace('dispatchEvent: function () {', '$& if (!window) return null;')
            .replace('click: function () {', '$& if (!window) return null;')
            .replace('setSelectionRange: function () {', '$& if (!window) return null;')
            .replace('select: function () {', '$& if (!window) return null;')
            .replace('focus: function () {', '$& if (!window) return null;')
            .replace('blur: function () {', '$& if (!window) return null;')
            .replace('preventDefault: function () {', '$& if (!window) return null;')

            // expose hooks for rammerhead.js
            .replace(
                'function parseProxyUrl$1',
                'window.overrideParseProxyUrl = function(rewrite) {parseProxyUrl$$1 = rewrite(parseProxyUrl$$1)}; $&'
            )
            .replace(
                'function getProxyUrl$1',
                'window.overrideGetProxyUrl = function(rewrite) {getProxyUrl$$1 = rewrite(getProxyUrl$$1)}; $&'
            )
            .replace('return window.location.search;', 'return (new URL(get$$2())).search;')
            .replace('return window.location.hash;', 'return (new URL(get$$2())).hash;')
            .replace(
                'setter: function (search) {',
                '$& var url = new URL(get$$2()); url.search = search; window.location = convertToProxyUrl(url.href); return search;'
            )
            .replace(
                'setter: function (hash) {',
                '$& var url = new URL(get$$2()); url.hash = hash; window.location.hash = (new URL(convertToProxyUrl(url.href))).hash; return hash;'
            )
);

const minify = (fileName, newFileName) => {
    const minified = UglifyJS.minify(fs.readFileSync(path.join(__dirname, './client', fileName), 'utf8'));
    if (minified.error) {
        throw minified.error;
    }
    fs.writeFileSync(path.join(__dirname, './client', newFileName), minified.code, 'utf8');
};

minify('rammerhead.js', 'rammerhead.min.js');
minify('hammerhead.js', 'hammerhead.min.js');
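The patch style used throughout build.js is plain `String.prototype.replace` with `'$&'` (the whole matched text) in the replacement, which splices a guard in right after the match without repeating it. A toy demonstration of that pattern on a stand-in source string:

```javascript
// '$&' in a replacement string expands to the matched text itself, so the
// guard is inserted immediately after the match.
const source = 'click: function () { doClick(); }';
const patched = source.replace('click: function () {', '$& if (!window) return null;');
console.log(patched);
// → click: function () { if (!window) return null; doClick(); }
```

Note that with a string (not regex) first argument, `replace` rewrites only the first occurrence, which is what build.js relies on for these one-off patches.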
@@ -1,80 +0,0 @@
/**
 * @typedef {'disabled'|'debug'|'traffic'|'info'|'warn'|'error'} LoggingLevels
 */

const LOG_LEVELS = ['disabled', 'debug', 'traffic', 'info', 'warn', 'error'];

function defaultGeneratePrefix(level) {
    return `[${new Date().toISOString()}] [${level.toUpperCase()}] `;
}

class RammerheadLogging {
    /**
     * @param {object} options
     * @param {LoggingLevels} options.logLevel - logLevel to initialize the logger with
     * @param {(data: string) => void} options.logger - the logger is expected to add a newline automatically, just like console.log
     * @param {*} options.loggerThis - logger will be called with loggerThis bound
     * @param {(level: LoggingLevels) => string} options.generatePrefix - generates a prefix before every log. set to null to disable
     */
    constructor({
        logLevel = 'info',
        logger = console.log,
        loggerThis = console,
        generatePrefix = defaultGeneratePrefix
    } = {}) {
        this.logger = logger;
        this.loggerThis = loggerThis;
        this.generatePrefix = generatePrefix;

        /**
         * @private
         */
        this._logRank = this._getLogRank(logLevel);
    }

    get logLevel() {
        return LOG_LEVELS[this._logRank];
    }
    /**
     * logger() will be called based on this log level
     * @param {LoggingLevels} level
     */
    set logLevel(level) {
        this._logRank = this._getLogRank(level);
    }
    callLogger(data) {
        this.logger.call(this.loggerThis, data);
    }
    /**
     * @param {LoggingLevels} level
     * @param {string} data
     */
    log(level, data) {
        const rank = this._getLogRank(level);
        // the higher the log level, the more important it is.
        // ensure it's not disabled
        if (rank && this._logRank <= rank) {
            this.callLogger((this.generatePrefix ? this.generatePrefix(level) : '') + data);
        }
    }
    debug = (data) => this.log('debug', data);
    traffic = (data) => this.log('traffic', data);
    info = (data) => this.log('info', data);
    warn = (data) => this.log('warn', data);
    error = (data) => this.log('error', data);

    /**
     * @private
     * @param {LoggingLevels} level
     * @returns {number}
     */
    _getLogRank(level) {
        const index = LOG_LEVELS.indexOf(level);
        if (index === -1) {
            throw new TypeError(`Invalid log level '${level}'. Valid log levels: ${LOG_LEVELS.join(', ')}`);
        }
        return index;
    }
}

module.exports = RammerheadLogging;
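The gating in `RammerheadLogging.log()` reduces to a pure function over the level ranks: a message is emitted when its own level is not `'disabled'` (rank 0) and the configured rank does not exceed the message's rank. The function name below is illustrative, not from the source:

```javascript
// Pure-function sketch of the rank check in RammerheadLogging.log():
// `if (rank && this._logRank <= rank)`.
const LOG_LEVELS = ['disabled', 'debug', 'traffic', 'info', 'warn', 'error'];

function shouldLog(configuredLevel, messageLevel) {
  const configuredRank = LOG_LEVELS.indexOf(configuredLevel);
  const messageRank = LOG_LEVELS.indexOf(messageLevel);
  if (configuredRank === -1 || messageRank === -1) throw new TypeError('invalid level');
  // Messages at rank 0 ('disabled') are never emitted; otherwise the
  // message must be at least as important as the configured level.
  return messageRank > 0 && configuredRank <= messageRank;
}
```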
@@ -1,107 +0,0 @@
const RammerheadLogging = require('./RammerheadLogging');
const RammerheadSession = require('./RammerheadSession');
const RammerheadSessionAbstractStore = require('./RammerheadSessionAbstractStore');

class RammerheadSessionMemoryStore extends RammerheadSessionAbstractStore {
    /**
     * @param {object} options
     * @param {RammerheadLogging|undefined} options.logger
     * @param {number|null} options.staleTimeout - if inactivity goes beyond this, the session is deleted. null to disable
     * @param {number|null} options.maxToLive - if now - createdAt surpasses maxToLive, the session is deleted. null to disable
     * @param {number} options.cleanupInterval - a cleanup check runs every cleanupInterval ms
     */
    constructor({
        logger = new RammerheadLogging({ logLevel: 'disabled' }),
        staleTimeout = 1000 * 60 * 30, // 30 minutes
        maxToLive = 1000 * 60 * 60 * 4, // 4 hours
        cleanupInterval = 1000 * 60 * 1 // 1 minute
    } = {}) {
        super();
        this.logger = logger;
        this.mapStore = new Map();
        setInterval(() => this._cleanupRun(staleTimeout, maxToLive), cleanupInterval).unref();
    }

    /**
     * @returns {string[]} - list of session ids in the store
     */
    keys() {
        return Array.from(this.mapStore.keys());
    }
    /**
     * @param {string} id
     * @returns {boolean}
     */
    has(id) {
        const exists = this.mapStore.has(id);
        this.logger.debug(`(MemoryStore.has) ${id} ${exists}`);
        return exists;
    }
    /**
     * @param {string} id
     * @param {boolean} updateActiveTimestamp
     * @returns {RammerheadSession|undefined}
     */
    get(id, updateActiveTimestamp = true) {
        if (!this.has(id)) return;
        this.logger.debug(`(MemoryStore.get) ${id} ${updateActiveTimestamp}`);

        const session = this.mapStore.get(id);
        if (updateActiveTimestamp) session.updateLastUsed();

        return session;
    }
    /**
     * @param {string} id
     * @returns {RammerheadSession}
     */
    add(id) {
        if (this.has(id)) throw new Error('the following session already exists: ' + id);
        this.logger.debug(`(MemoryStore.add) ${id}`);
        const session = new RammerheadSession({ id });
        this.mapStore.set(id, session);
        return session;
    }
    /**
     * @param {string} id
     * @returns {boolean} - returns true when a delete operation is performed
     */
    delete(id) {
        return this.mapStore.delete(id);
    }
    /**
     * @param {string} id
     * @param {string} serializedSession
     */
    addSerializedSession(id, serializedSession) {
        this.logger.debug(`(MemoryStore.addSerializedSession) adding serialized session id ${id} to store`);
        const session = RammerheadSession.DeserializeSession(id, serializedSession);
        session.updateLastUsed();
        this.mapStore.set(id, session);
        this.logger.debug(`(MemoryStore.addSerializedSession) added ${id}`);
    }

    /**
     * @private
     * @param {number|null} staleTimeout
     * @param {number|null} maxToLive
     */
    _cleanupRun(staleTimeout, maxToLive) {
        this.logger.debug(`(MemoryStore._cleanupRun) cleanup run. Need to go through ${this.mapStore.size} sessions`);

        const now = Date.now();
        for (const [sessionId, session] of this.mapStore) {
            if (
                (staleTimeout && now - session.lastUsed > staleTimeout) ||
                (maxToLive && now - session.createdAt > maxToLive)
            ) {
                this.mapStore.delete(sessionId);
                this.logger.debug(`(MemoryStore._cleanupRun) delete ${sessionId}`);
            }
        }

        this.logger.debug('(MemoryStore._cleanupRun) finished cleanup run');
    }
}

module.exports = RammerheadSessionMemoryStore;
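The eviction rule inside `_cleanupRun()` can be isolated into a pure predicate: a session is deleted when it has been idle longer than `staleTimeout` or alive longer than `maxToLive`, with either limit disableable via null. The function name below is illustrative:

```javascript
// Pure-predicate sketch of the eviction check in
// RammerheadSessionMemoryStore._cleanupRun().
function shouldEvict(session, now, staleTimeout, maxToLive) {
  return (
    // idle longer than staleTimeout (null disables this check)
    (staleTimeout !== null && now - session.lastUsed > staleTimeout) ||
    // alive longer than maxToLive (null disables this check)
    (maxToLive !== null && now - session.createdAt > maxToLive)
  );
}
```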
@@ -1,574 +0,0 @@
const http = require('http');
const https = require('https');
const stream = require('stream');
const fs = require('fs');
const path = require('path');
const { getPathname } = require('testcafe-hammerhead/lib/utils/url');
const { Proxy } = require('testcafe-hammerhead');
const WebSocket = require('ws');
const httpResponse = require('../util/httpResponse');
const streamToString = require('../util/streamToString');
const URLPath = require('../util/URLPath');
const RammerheadLogging = require('../classes/RammerheadLogging');

require('../util/fixCorsHeader');
require('../util/fixWebsocket');
require('../util/addMoreErrorGuards');
require('../util/addUrlShuffling');
require('../util/patchAsyncResourceProcessor');
let addJSDiskCache = function (path, size) {
    require('../util/addJSDiskCache')(path, size);
    // the modification only works once
    addJSDiskCache = () => {};
};

/**
 * taken directly from
 * https://github.com/DevExpress/testcafe-hammerhead/blob/a9fbf7746ff347f7bdafe1f80cf7135eeac21e34/src/typings/proxy.d.ts#L1
 * @typedef {object} ServerInfo
 * @property {string} hostname
 * @property {number} port
 * @property {number} crossDomainPort
 * @property {string} protocol
 * @property {string} domain
 * @property {boolean} cacheRequests
 */

/**
 * @typedef {object} RammerheadServerInfo
 * @property {string} hostname
 * @property {number} port
 * @property {'https:'|'http:'} protocol
 */

/**
 * @private
 * @typedef {import('./RammerheadSession')} RammerheadSession
 */

/**
 * Wrapper for hammerhead's Proxy.
 */
class RammerheadProxy extends Proxy {
    /**
     * @param {object} options
     * @param {RammerheadLogging|undefined} options.logger
     * @param {(req: http.IncomingMessage) => string} options.loggerGetIP - use custom logic to get the IP, either from headers or directly
     * @param {string} options.bindingAddress - hostname for the proxy to bind to
     * @param {number} options.port - port for the proxy to listen on
     * @param {number|null} options.crossDomainPort - cross-domain port to simulate cross-origin requests. Set to null
     * to disable. Disabling this is highly discouraged because it breaks sites that check the origin header
     * @param {boolean} options.dontListen - avoid calling http.listen() if you need to use sticky-session to load balance
     * @param {http.ServerOptions} options.ssl - set to null to disable ssl
     * @param {(req: http.IncomingMessage) => RammerheadServerInfo} options.getServerInfo - force hammerhead to rewrite using the specified
     * server info (hostname, port, and protocol). Useful for a reverse-proxy setup like nginx where you
     * need to rewrite the hostname/port/protocol
     * @param {boolean} options.disableLocalStorageSync - disables localStorage syncing (default: false)
     * @param {string} options.diskJsCachePath - set to null to disable the disk cache and use memory instead (disabled by default)
     * @param {number} options.jsCacheSize - in bytes. default: 50mb
     */
    constructor({
        loggerGetIP = (req) => req.socket.remoteAddress,
        logger = new RammerheadLogging({ logLevel: 'disabled' }),
        bindingAddress = '127.0.0.1',
        port = 8080,
        crossDomainPort = 8081,
        dontListen = false,
        ssl = null,
        getServerInfo = (req) => {
            const { hostname, port } = new URL('http://' + req.headers.host);
            return {
                hostname,
                port,
                protocol: req.socket.encrypted ? 'https:' : 'http:'
            };
        },
        disableLocalStorageSync = false,
        diskJsCachePath = null,
        jsCacheSize = 50 * 1024 * 1024
    } = {}) {
        if (!crossDomainPort) {
            const httpOrHttps = ssl ? https : http;
            const proxyHttpOrHttps = http;
            const originalProxyCreateServer = proxyHttpOrHttps.createServer;
            const originalCreateServer = httpOrHttps.createServer; // handle the recursion case if proxyHttpOrHttps and httpOrHttps are the same
            let onlyOneHttpServer = null;

            // a hack to force testcafe-hammerhead's proxy library into using only one http port.
            // a downside to using only one proxy server is that cross-domain requests
            // will not be simulated correctly
            proxyHttpOrHttps.createServer = function (...args) {
                const emptyFunc = () => {};
                if (onlyOneHttpServer) {
                    // createServer for server1 was already called. now we return a mock http server for server2
                    return { on: emptyFunc, listen: emptyFunc, close: emptyFunc };
                }
                if (args.length !== 2) throw new Error('unexpected argument length coming from hammerhead');
                return (onlyOneHttpServer = originalCreateServer(...args));
            };

            // now, we force the server to listen on a specific port and binding address, regardless of what
            // hammerhead passes to server.listen()
            const originalListen = http.Server.prototype.listen;
            http.Server.prototype.listen = function (_proxyPort) {
                if (dontListen) return;
                originalListen.call(this, port, bindingAddress);
            };

            // actual proxy initialization.
            // the values don't matter (except for developmentMode), since we'll be rewriting serverInfo anyway
            super('hostname', 'port', 'port', {
                ssl,
                developmentMode: true,
                cache: true
            });

            // restore hooked functions to their original state
            proxyHttpOrHttps.createServer = originalProxyCreateServer;
            http.Server.prototype.listen = originalListen;
        } else {
            // just initialize the proxy as usual, since we don't need to do hacky stuff like the above.
            // we still need to make sure the proxy binds to the correct address though
            const originalListen = http.Server.prototype.listen;
            http.Server.prototype.listen = function (portArg) {
                if (dontListen) return;
                originalListen.call(this, portArg, bindingAddress);
            };
            super('doesntmatter', port, crossDomainPort, {
                ssl,
                developmentMode: true,
                cache: true
            });
            this.crossDomainPort = crossDomainPort;
            http.Server.prototype.listen = originalListen;
        }

        this._setupRammerheadServiceRoutes();
        this._setupLocalStorageServiceRoutes(disableLocalStorageSync);

        this.onRequestPipeline = [];
        this.onUpgradePipeline = [];
        this.websocketRoutes = [];
        this.rewriteServerHeaders = {
            'permissions-policy': (headerValue) => headerValue && headerValue.replace(/sync-xhr/g, 'sync-yes'),
            'feature-policy': (headerValue) => headerValue && headerValue.replace(/sync-xhr/g, 'sync-yes'),
|
||||
'referrer-policy': () => 'no-referrer-when-downgrade',
|
||||
'report-to': () => undefined,
|
||||
'cross-origin-embedder-policy': () => undefined
|
||||
};
|
||||
|
||||
this.getServerInfo = getServerInfo;
|
||||
this.serverInfo1 = null; // make sure no one uses these serverInfo
|
||||
this.serverInfo2 = null;
|
||||
|
||||
this.loggerGetIP = loggerGetIP;
|
||||
this.logger = logger;
|
||||
|
||||
addJSDiskCache(diskJsCachePath, jsCacheSize);
|
||||
}
|
||||
|
||||
// add WS routing
|
||||
/**
|
||||
* since we have .GET and .POST, why not add in a .WS also
|
||||
* @param {string|RegExp} route - can be '/route/to/things' or /^\\/route\\/(this)|(that)\\/things$/
|
||||
* @param {(ws: WebSocket, req: http.IncomingMessage) => WebSocket} handler - ws is the connection between the client and the server
|
||||
* @param {object} websocketOptions - read https://www.npmjs.com/package/ws for a list of Websocket.Server options. Note that
|
||||
* the { noServer: true } will always be applied
|
||||
* @returns {WebSocket.Server}
|
||||
*/
|
||||
WS(route, handler, websocketOptions = {}) {
|
||||
if (this.checkIsRoute(route)) {
|
||||
throw new TypeError('WS route already exists');
|
||||
}
|
||||
|
||||
const wsServer = new WebSocket.Server({
|
||||
...websocketOptions,
|
||||
noServer: true
|
||||
});
|
||||
this.websocketRoutes.push({ route, handler, wsServer });
|
||||
|
||||
return wsServer;
|
||||
}
|
||||
unregisterWS(route) {
|
||||
if (!this.getWSRoute(route, true)) {
|
||||
throw new TypeError('websocket route does not exist');
|
||||
}
|
||||
}
|
||||
/**
|
||||
* @param {string} path
|
||||
* @returns {{ route: string|RegExp, handler: (ws: WebSocket, req: http.IncomingMessage) => WebSocket, wsServer: WebSocket.Server}|null}
|
||||
*/
|
||||
getWSRoute(path, doDelete = false) {
|
||||
for (let i = 0; i < this.websocketRoutes.length; i++) {
|
||||
if (
|
||||
(typeof this.websocketRoutes[i].route === 'string' && this.websocketRoutes[i].route === path) ||
|
||||
(this.websocketRoutes[i] instanceof RegExp && this.websocketRoutes[i].route.test(path))
|
||||
) {
|
||||
const route = this.websocketRoutes[i];
|
||||
if (doDelete) {
|
||||
this.websocketRoutes.splice(i, 1);
|
||||
i--;
|
||||
}
|
||||
return route;
|
||||
}
|
||||
}
|
||||
return null;
|
||||
}
|
||||
/**
|
||||
* @private
|
||||
*/
|
||||
_WSRouteHandler(req, socket, head) {
|
||||
const route = this.getWSRoute(req.url);
|
||||
if (route) {
|
||||
// RH stands for rammerhead. RHROUTE is a custom implementation by rammerhead that is
|
||||
// unrelated to hammerhead
|
||||
this.logger.traffic(`WSROUTE UPGRADE ${this.loggerGetIP(req)} ${req.url}`);
|
||||
route.wsServer.handleUpgrade(req, socket, head, (client, req) => {
|
||||
this.logger.traffic(`WSROUTE OPEN ${this.loggerGetIP(req)} ${req.url}`);
|
||||
client.once('close', () => {
|
||||
this.logger.traffic(`WSROUTE CLOSE ${this.loggerGetIP(req)} ${req.url}`);
|
||||
});
|
||||
route.handler(client, req);
|
||||
});
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
// manage pipelines //
|
||||
/**
|
||||
* @param {(req: http.IncomingMessage,
|
||||
* res: http.ServerResponse,
|
||||
* serverInfo: ServerInfo,
|
||||
* isRoute: boolean,
|
||||
* isWebsocket: boolean) => Promise<boolean>} onRequest - return true to terminate handoff to proxy.
|
||||
* There is an isWebsocket even though there is an onUpgrade pipeline already. This is because hammerhead
|
||||
* processes the onUpgrade and then passes it directly to onRequest, but without the "head" Buffer argument.
|
||||
* The onUpgrade pipeline is to solve that lack of the "head" argument issue in case one needs it.
|
||||
* @param {boolean} beginning - whether to add it to the beginning of the pipeline
|
||||
*/
|
||||
addToOnRequestPipeline(onRequest, beginning = false) {
|
||||
if (beginning) {
|
||||
this.onRequestPipeline.push(onRequest);
|
||||
} else {
|
||||
this.onRequestPipeline.unshift(onRequest);
|
||||
}
|
||||
}
|
||||
/**
|
||||
* @param {(req: http.IncomingMessage,
|
||||
* socket: stream.Duplex,
|
||||
* head: Buffer,
|
||||
* serverInfo: ServerInfo,
|
||||
* isRoute: boolean) => Promise<boolean>} onUpgrade - return true to terminate handoff to proxy
|
||||
* @param {boolean} beginning - whether to add it to the beginning of the pipeline
|
||||
*/
|
||||
addToOnUpgradePipeline(onUpgrade, beginning = false) {
|
||||
if (beginning) {
|
||||
this.onUpgradePipeline.push(onUpgrade);
|
||||
} else {
|
||||
this.onUpgradePipeline.unshift(onUpgrade);
|
||||
}
|
||||
}
|
||||
|
||||
// override hammerhead's proxy functions to use the pipeline //
|
||||
checkIsRoute(req) {
|
||||
if (req instanceof RegExp) {
|
||||
return !!this.getWSRoute(req);
|
||||
}
|
||||
// code modified from
|
||||
// https://github.com/DevExpress/testcafe-hammerhead/blob/879d6ae205bb711dfba8c1c88db635e8803b8840/src/proxy/router.ts#L95
|
||||
const routerQuery = `${req.method} ${getPathname(req.url || '')}`;
|
||||
const route = this.routes.get(routerQuery);
|
||||
if (route) {
|
||||
return true;
|
||||
}
|
||||
for (const routeWithParams of this.routesWithParams) {
|
||||
const routeMatch = routerQuery.match(routeWithParams.re);
|
||||
if (routeMatch) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return !!this.getWSRoute(req.url);
|
||||
}
|
||||
/**
|
||||
* @param {http.IncomingMessage} req
|
||||
* @param {http.ServerResponse} res
|
||||
* @param {ServerInfo} serverInfo
|
||||
*/
|
||||
async _onRequest(req, res, serverInfo) {
|
||||
serverInfo = this._rewriteServerInfo(req);
|
||||
|
||||
const isWebsocket = res instanceof stream.Duplex;
|
||||
|
||||
if (!isWebsocket) {
|
||||
// strip server headers
|
||||
const originalWriteHead = res.writeHead;
|
||||
const self = this;
|
||||
res.writeHead = function (statusCode, statusMessage, headers) {
|
||||
if (!headers) {
|
||||
headers = statusMessage;
|
||||
statusMessage = undefined;
|
||||
}
|
||||
|
||||
if (headers) {
|
||||
const alreadyRewrittenHeaders = [];
|
||||
if (Array.isArray(headers)) {
|
||||
// [content-type, text/html, headerKey, headerValue, ...]
|
||||
for (let i = 0; i < headers.length - 1; i += 2) {
|
||||
const header = headers[i].toLowerCase();
|
||||
if (header in self.rewriteServerHeaders) {
|
||||
alreadyRewrittenHeaders.push(header);
|
||||
headers[i + 1] =
|
||||
self.rewriteServerHeaders[header] &&
|
||||
self.rewriteServerHeaders[header](headers[i + 1]);
|
||||
if (!headers[i + 1]) {
|
||||
headers.splice(i, 2);
|
||||
i -= 2;
|
||||
}
|
||||
}
|
||||
}
|
||||
for (const header in self.rewriteServerHeaders) {
|
||||
if (alreadyRewrittenHeaders.includes(header)) continue;
|
||||
// if user wants to add headers, they can do that here
|
||||
const value = self.rewriteServerHeaders[header] && self.rewriteServerHeaders[header]();
|
||||
if (value) {
|
||||
headers.push(header, value);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
for (const header in headers) {
|
||||
if (header in self.rewriteServerHeaders) {
|
||||
alreadyRewrittenHeaders.push(header);
|
||||
headers[header] =
|
||||
self.rewriteServerHeaders[header] && self.rewriteServerHeaders[header]();
|
||||
if (!headers[header]) {
|
||||
delete headers[header];
|
||||
}
|
||||
}
|
||||
}
|
||||
for (const header in self.rewriteServerHeaders) {
|
||||
if (alreadyRewrittenHeaders.includes(header)) continue;
|
||||
const value = self.rewriteServerHeaders[header] && self.rewriteServerHeaders[header]();
|
||||
if (value) {
|
||||
headers[header] = value;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (statusMessage) {
|
||||
originalWriteHead.call(this, statusCode, statusMessage, headers);
|
||||
} else {
|
||||
originalWriteHead.call(this, statusCode, headers);
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
const isRoute = this.checkIsRoute(req);
|
||||
const ip = this.loggerGetIP(req);
|
||||
|
||||
this.logger.traffic(`${isRoute ? 'ROUTE ' : ''}${ip} ${req.url}`);
|
||||
for (const handler of this.onRequestPipeline) {
|
||||
if ((await handler.call(this, req, res, serverInfo, isRoute, isWebsocket)) === true) {
|
||||
return;
|
||||
}
|
||||
}
|
||||
// hammerhead's routing does not support websockets. Allowing it
|
||||
// will result in an error thrown
|
||||
if (isRoute && isWebsocket) {
|
||||
httpResponse.badRequest(this.logger, req, res, ip, 'Rejected unsupported websocket request');
|
||||
return;
|
||||
}
|
||||
super._onRequest(req, res, serverInfo);
|
||||
}
|
||||
/**
|
||||
* @param {http.IncomingMessage} req
|
||||
* @param {stream.Duplex} socket
|
||||
* @param {Buffer} head
|
||||
* @param {ServerInfo} serverInfo
|
||||
*/
|
||||
async _onUpgradeRequest(req, socket, head, serverInfo) {
|
||||
serverInfo = this._rewriteServerInfo(req);
|
||||
for (const handler of this.onUpgradePipeline) {
|
||||
const isRoute = this.checkIsRoute(req);
|
||||
if ((await handler.call(this, req, socket, head, serverInfo, isRoute)) === true) {
|
||||
return;
|
||||
}
|
||||
}
|
||||
if (this._WSRouteHandler(req, socket, head)) return;
|
||||
super._onUpgradeRequest(req, socket, head, serverInfo);
|
||||
}
|
||||
|
||||
/**
|
||||
* @private
|
||||
* @param {http.IncomingMessage} req
|
||||
* @returns {ServerInfo}
|
||||
*/
|
||||
_rewriteServerInfo(req) {
|
||||
const serverInfo = this.getServerInfo(req);
|
||||
return {
|
||||
hostname: serverInfo.hostname,
|
||||
port: serverInfo.port,
|
||||
crossDomainPort: serverInfo.crossDomainPort || this.crossDomainPort || serverInfo.port,
|
||||
protocol: serverInfo.protocol,
|
||||
domain: `${serverInfo.protocol}//${serverInfo.hostname}:${serverInfo.port}`,
|
||||
cacheRequests: false
|
||||
};
|
||||
}
|
||||
/**
|
||||
* @private
|
||||
*/
|
||||
_setupRammerheadServiceRoutes() {
|
||||
this.GET('/rammerhead.js', {
|
||||
content: fs.readFileSync(
|
||||
path.join(__dirname, '../client/rammerhead' + (process.env.DEVELOPMENT ? '.js' : '.min.js'))
|
||||
),
|
||||
contentType: 'application/x-javascript'
|
||||
});
|
||||
this.GET('/api/shuffleDict', (req, res) => {
|
||||
const { id } = new URLPath(req.url).getParams();
|
||||
if (!id || !this.openSessions.has(id)) {
|
||||
return httpResponse.badRequest(this.logger, req, res, this.loggerGetIP(req), 'Invalid session id');
|
||||
}
|
||||
res.end(JSON.stringify(this.openSessions.get(id).shuffleDict) || '');
|
||||
});
|
||||
}
|
||||
/**
|
||||
* @private
|
||||
*/
|
||||
_setupLocalStorageServiceRoutes(disableSync) {
|
||||
this.POST('/syncLocalStorage', async (req, res) => {
|
||||
if (disableSync) {
|
||||
res.writeHead(404);
|
||||
res.end('server disabled localStorage sync');
|
||||
return;
|
||||
}
|
||||
const badRequest = (msg) => httpResponse.badRequest(this.logger, req, res, this.loggerGetIP(req), msg);
|
||||
const respondJson = (obj) => res.end(JSON.stringify(obj));
|
||||
const { sessionId, origin } = new URLPath(req.url).getParams();
|
||||
|
||||
if (!sessionId || !this.openSessions.has(sessionId)) {
|
||||
return badRequest('Invalid session id');
|
||||
}
|
||||
if (!origin) {
|
||||
return badRequest('Invalid origin');
|
||||
}
|
||||
|
||||
let parsed;
|
||||
try {
|
||||
parsed = JSON.parse(await streamToString(req));
|
||||
} catch (e) {
|
||||
return badRequest('bad client body');
|
||||
}
|
||||
|
||||
const now = Date.now();
|
||||
const session = this.openSessions.get(sessionId, false);
|
||||
if (!session.data.localStorage) session.data.localStorage = {};
|
||||
|
||||
switch (parsed.type) {
|
||||
case 'sync':
|
||||
if (parsed.fetch) {
|
||||
// client is syncing for the first time
|
||||
if (!session.data.localStorage[origin]) {
|
||||
// server does not have any data on origin, so create an empty record
|
||||
// and send an empty object back
|
||||
session.data.localStorage[origin] = { data: {}, timestamp: now };
|
||||
return respondJson({
|
||||
timestamp: now,
|
||||
data: {}
|
||||
});
|
||||
} else {
|
||||
// server does have data, so send data back
|
||||
return respondJson({
|
||||
timestamp: session.data.localStorage[origin].timestamp,
|
||||
data: session.data.localStorage[origin].data
|
||||
});
|
||||
}
|
||||
} else {
|
||||
// sync server and client localStorage
|
||||
|
||||
parsed.timestamp = parseInt(parsed.timestamp);
|
||||
if (isNaN(parsed.timestamp)) return badRequest('must specify valid timestamp');
|
||||
if (parsed.timestamp > now) return badRequest('cannot specify timestamp in the future');
|
||||
if (!parsed.data || typeof parsed.data !== 'object')
|
||||
return badRequest('data must be an object');
|
||||
|
||||
for (const prop in parsed.data) {
|
||||
if (typeof parsed.data[prop] !== 'string') {
|
||||
return badRequest('data[prop] must be a string');
|
||||
}
|
||||
}
|
||||
|
||||
if (!session.data.localStorage[origin]) {
|
||||
// server does not have data, so use client's
|
||||
session.data.localStorage[origin] = { data: parsed.data, timestamp: now };
|
||||
return respondJson({});
|
||||
} else if (session.data.localStorage[origin].timestamp <= parsed.timestamp) {
|
||||
// server data is either the same as client or outdated, but we
|
||||
// sync even if timestamps are the same in case the client changed the localStorage
|
||||
// without updating
|
||||
session.data.localStorage[origin].data = parsed.data;
|
||||
session.data.localStorage[origin].timestamp = parsed.timestamp;
|
||||
return respondJson({});
|
||||
} else {
|
||||
// client data is stale
|
||||
return respondJson({
|
||||
timestamp: session.data.localStorage[origin].timestamp,
|
||||
data: session.data.localStorage[origin].data
|
||||
});
|
||||
}
|
||||
}
|
||||
case 'update':
|
||||
if (!session.data.localStorage[origin])
|
||||
return badRequest('must perform sync first on a new origin');
|
||||
if (!parsed.updateData || typeof parsed.updateData !== 'object')
|
||||
return badRequest('updateData must be an object');
|
||||
for (const prop in parsed.updateData) {
|
||||
if (!parsed.updateData[prop] || typeof parsed.updateData[prop] !== 'string')
|
||||
return badRequest('updateData[prop] must be a non-empty string');
|
||||
}
|
||||
for (const prop in parsed.updateData) {
|
||||
session.data.localStorage[origin].data[prop] = parsed.updateData[prop];
|
||||
}
|
||||
session.data.localStorage[origin].timestamp = now;
|
||||
return respondJson({
|
||||
timestamp: now
|
||||
});
|
||||
default:
|
||||
return badRequest('unknown type ' + parsed.type);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
openSession() {
|
||||
throw new TypeError('unimplemented. please use a RammerheadSessionStore and use their .add() method');
|
||||
}
|
||||
close() {
|
||||
super.close();
|
||||
this.openSessions.close();
|
||||
}
|
||||
|
||||
/**
|
||||
* @param {string} route
|
||||
* @param {StaticContent | (req: http.IncomingMessage, res: http.ServerResponse) => void} handler
|
||||
*/
|
||||
GET(route, handler) {
|
||||
if (route === '/hammerhead.js') {
|
||||
handler.content = fs.readFileSync(
|
||||
path.join(__dirname, '../client/hammerhead' + (process.env.DEVELOPMENT ? '.js' : '.min.js'))
|
||||
);
|
||||
}
|
||||
super.GET(route, handler);
|
||||
}
|
||||
|
||||
// the following is to fix hamerhead's typescript definitions
|
||||
/**
|
||||
* @param {string} route
|
||||
* @param {StaticContent | (req: http.IncomingMessage, res: http.ServerResponse) => void} handler
|
||||
*/
|
||||
POST(route, handler) {
|
||||
super.POST(route, handler);
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = RammerheadProxy;
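The header-rewrite pass inside `_onRequest` above can be exercised in isolation. The sketch below mirrors the plain-object branch: a handler returning a falsy value strips the header, a truthy return replaces it, and handlers for headers absent from the response get a chance to add them. The `rewriteHeaders` function name is hypothetical, introduced only for this illustration; the handler map entries are copied from `rewriteServerHeaders` above.

```javascript
// Standalone sketch of the plain-object header-rewrite branch in _onRequest.
// `rewriteHeaders` is an illustrative name, not part of the module.
const rewriteServerHeaders = {
    'permissions-policy': (v) => v && v.replace(/sync-xhr/g, 'sync-yes'),
    'referrer-policy': () => 'no-referrer-when-downgrade',
    'report-to': () => undefined
};

function rewriteHeaders(headers) {
    const alreadyRewrittenHeaders = [];
    for (const header in headers) {
        if (header in rewriteServerHeaders) {
            alreadyRewrittenHeaders.push(header);
            headers[header] = rewriteServerHeaders[header] && rewriteServerHeaders[header](headers[header]);
            if (!headers[header]) delete headers[header]; // falsy result strips the header
        }
    }
    for (const header in rewriteServerHeaders) {
        if (alreadyRewrittenHeaders.includes(header)) continue;
        const value = rewriteServerHeaders[header] && rewriteServerHeaders[header]();
        if (value) headers[header] = value; // handlers can also add missing headers
    }
    return headers;
}

const out = rewriteHeaders({
    'permissions-policy': 'sync-xhr=(self)',
    'report-to': '{"group":"default"}',
    'content-type': 'text/html'
});
```

Here `permissions-policy` has its `sync-xhr` directive renamed, `report-to` is stripped, `content-type` passes through untouched, and `referrer-policy` is added.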
@@ -1,128 +0,0 @@
const { Session } = require('testcafe-hammerhead');
const UploadStorage = require('testcafe-hammerhead/lib/upload/storage');
const generateId = require('../util/generateId');
const StrShuffler = require('../util/StrShuffler');

// disable UploadStorage, a testcafe testing feature we do not need
const emptyFunc = () => {};
UploadStorage.prototype.copy = emptyFunc;
UploadStorage.prototype.get = emptyFunc;
UploadStorage.prototype.store = emptyFunc;

/**
 * wrapper for initializing Session with saving capabilities
 */
class RammerheadSession extends Session {
    data = {};
    createdAt = Date.now();
    lastUsed = Date.now();
    // ID used to generate the global autocomplete session.
    // This must be a string containing 32 alphanumerical characters.
    static autocompleteId = 'collectsearchautocompleteresults';

    /**
     * @param {object} options
     * @param {string} options.id
     * @param {boolean} options.dontConnectToData - used when we want to connect to data later (or simply don't want to)
     * @param {boolean} options.disableShuffling
     * @param {string[]} options.prependScripts
     */
    constructor({ id = generateId(), dontConnectToData = false, disableShuffling = false, prependScripts = [] } = {}) {
        super(['blah/blah'], {
            allowMultipleWindows: true,
            disablePageCaching: false
        });

        // necessary abstract methods for Session
        this.getIframePayloadScript = async () => '';
        this.getPayloadScript = async () => '';
        this.getAuthCredentials = () => ({});
        this.handleFileDownload = () => void 0;
        this.handlePageError = () => void 0;
        this.handleAttachment = () => void 0;
        // this.handlePageError = (ctx, err) => {
        //     console.error(ctx.req.url);
        //     console.error(err);
        // };

        // intellisense //
        /**
         * @type {{ host: string, hostname: string, bypassRules?: string[], port?: string, proxyAuth?: string, authHeader?: string } | null}
         */
        this.externalProxySettings = null;

        // disable http2. error handling from http2 proxy client to non-http2 user is too complicated to handle
        // (status code 0, for example, will crash rammerhead)
        this.isHttp2Disabled = () => true;
        if (id !== RammerheadSession.autocompleteId) {
            this.injectable.scripts.push(...prependScripts);
            this.injectable.scripts.push('/rammer/rammerhead.js');
        } else {
            this.injectable.scripts.length = 0;
        }

        this.id = id;
        this.shuffleDict = disableShuffling ? null : StrShuffler.generateDictionary();
        if (!dontConnectToData) {
            this.connectHammerheadToData();
        }
    }
    /**
     * @param {boolean} dontCookie - set this to true if the store is using a more reliable approach to
     * saving the cookies (like in serializeSession)
     */
    connectHammerheadToData(dontCookie = false) {
        this._connectObjectToHook(this, 'createdAt');
        this._connectObjectToHook(this, 'lastUsed');
        this._connectObjectToHook(this, 'injectable');
        this._connectObjectToHook(this, 'externalProxySettings');
        this._connectObjectToHook(this, 'shuffleDict');
        if (!dontCookie) this._connectObjectToHook(this.cookies._cookieJar.store, 'idx', 'cookies');
    }

    updateLastUsed() {
        this.lastUsed = Date.now();
    }
    serializeSession() {
        return JSON.stringify({
            data: this.data,
            serializedCookieJar: this.cookies.serializeJar()
        });
    }
    // hook system and serializing are for two different store systems
    static DeserializeSession(id, serializedSession) {
        const parsed = JSON.parse(serializedSession);
        if (!parsed.data) throw new Error('expected serializedSession to contain data object');
        if (!parsed.serializedCookieJar)
            throw new Error('expected serializedSession to contain serializedCookieJar object');

        const session = new RammerheadSession({ id, dontConnectToData: true });
        session.data = parsed.data;
        session.connectHammerheadToData(true);
        session.cookies.setJar(parsed.serializedCookieJar);
        return session;
    }

    hasRequestEventListeners() {
        // force forceProxySrcForImage to be true
        // see https://github.com/DevExpress/testcafe-hammerhead/blob/a9fbf7746ff347f7bdafe1f80cf7135eeac21e34/src/session/index.ts#L180
        return true;
    }
    /**
     * @private
     */
    _connectObjectToHook(obj, prop, dataProp = prop) {
        const originalValue = obj[prop];
        Object.defineProperty(obj, prop, {
            get: () => this.data[dataProp],
            set: (value) => {
                this.data[dataProp] = value;
            }
        });
        if (!(dataProp in this.data)) {
            this.data[dataProp] = originalValue;
        }
    }
}

module.exports = RammerheadSession;
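The `_connectObjectToHook` pattern above redirects a property into the central `data` object via `Object.defineProperty`, so that serializing `data` captures every hooked field. A free-function rendition of the same technique (the `connectObjectToHook` name and sample objects are illustrative, not the module's API):

```javascript
// Free-function sketch of the _connectObjectToHook pattern used by RammerheadSession.
function connectObjectToHook(data, obj, prop, dataProp = prop) {
    const originalValue = obj[prop];
    // reads and writes of obj[prop] now go through the shared data store
    Object.defineProperty(obj, prop, {
        get: () => data[dataProp],
        set: (value) => {
            data[dataProp] = value;
        }
    });
    // seed the store with the pre-hook value so nothing is lost
    if (!(dataProp in data)) {
        data[dataProp] = originalValue;
    }
}

const data = {};
const session = { lastUsed: 123 };
connectObjectToHook(data, session, 'lastUsed');

session.lastUsed = 456; // the write lands in `data`, not on `session` itself
const serialized = JSON.stringify(data);
```

Serializing `data` alone is then enough to persist the hooked field, which is exactly why `serializeSession` above only stringifies `this.data` plus the cookie jar.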
@@ -1,101 +0,0 @@
/* eslint-disable no-unused-vars */

/**
 * @private
 * @typedef {import("./RammerheadSession")} RammerheadSession
 */

/**
 * this is the minimum needed in order to have a fully working, versatile session store. Though it is an abstract
 * class and should be treated as such, it includes default functions deemed necessary that are not
 * particular to any one implementation
 * @abstract
 */
class RammerheadSessionAbstractStore {
    constructor() {
        if (this.constructor === RammerheadSessionAbstractStore) {
            throw new Error('abstract classes cannot be instantiated');
        }
    }

    /**
     * @param {import('./RammerheadProxy')} proxy - this will overwrite proxy.openSessions with this class instance and
     * add a request handler that calls loadSessionToMemory
     * @param {boolean} removeExistingSessions - whether to remove all sessions before overwriting proxy.openSessions
     */
    attachToProxy(proxy, removeExistingSessions = true) {
        if (proxy.openSessions === this) throw new TypeError('already attached to proxy');

        if (removeExistingSessions) {
            for (const [, session] of proxy.openSessions.entries()) {
                proxy.closeSession(session);
            }
        }
        proxy.openSessions = this;
    }

    /**
     * @private
     */
    _mustImplement() {
        throw new Error('must be implemented');
    }

    /**
     * @abstract
     * @returns {string[]} - list of session ids in store
     */
    keys() {
        this._mustImplement();
    }
    /**
     * @abstract
     * @param {string} id
     * @returns {boolean}
     */
    has(id) {
        this._mustImplement();
    }
    /**
     * @abstract
     * @param {string} id
     * @param {boolean} updateActiveTimestamp
     * @returns {RammerheadSession|undefined}
     */
    get(id, updateActiveTimestamp = true) {
        this._mustImplement();
    }
    /**
     * the implementing method will use the dataOperation option in RammerheadSession however it
     * sees fit
     * @abstract
     * @param {string} id
     * @returns {RammerheadSession}
     */
    add(id) {
        this._mustImplement();
    }
    /**
     * @abstract
     * @param {string} id
     * @returns {boolean} - returns true when a delete operation is performed
     */
    delete(id) {
        this._mustImplement();
    }
    /**
     * @abstract
     * @param {string} id
     * @param {string} serializedSession
     */
    addSerializedSession(id, serializedSession) {
        this._mustImplement();
    }
    /**
     * optional abstract method
     */
    close() {}
}

module.exports = RammerheadSessionAbstractStore;
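A concrete store only needs to subclass this and fill in the abstract methods; the constructor guard above is what keeps the base class uninstantiable. A minimal Map-backed sketch, with the guard re-declared locally so the example is self-contained (the `AbstractStore` and `MemoryStore` names are hypothetical, not part of the repo):

```javascript
// Illustrative re-declaration of the abstract-class guard plus a minimal
// Map-backed store; names here are hypothetical, not the repo's API.
class AbstractStore {
    constructor() {
        // same technique as RammerheadSessionAbstractStore above
        if (this.constructor === AbstractStore) {
            throw new Error('abstract classes cannot be instantiated');
        }
    }
    keys() {
        throw new Error('must be implemented');
    }
}

class MemoryStore extends AbstractStore {
    constructor() {
        super();
        this.sessions = new Map();
    }
    keys() {
        return [...this.sessions.keys()];
    }
    has(id) {
        return this.sessions.has(id);
    }
    add(id, session) {
        this.sessions.set(id, session);
        return session;
    }
    delete(id) {
        return this.sessions.delete(id);
    }
}

let guardThrew = false;
try {
    new AbstractStore(); // the guard fires only for the base class itself
} catch (e) {
    guardThrew = true;
}
const store = new MemoryStore(); // subclasses pass the guard
store.add('abc', { id: 'abc' });
```

The `this.constructor === Base` check is what lets subclass constructors call `super()` freely while still rejecting direct `new AbstractStore()` calls.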
@@ -1,238 +0,0 @@
const fs = require('fs');
const path = require('path');
const RammerheadSessionAbstractStore = require('./RammerheadSessionAbstractStore');
const RammerheadSession = require('./RammerheadSession');
const RammerheadLogging = require('../classes/RammerheadLogging');

// rh = rammerhead. extra f to distinguish between rhsession (folder) and rhfsession (file)
const sessionFileExtension = '.rhfsession';

class RammerheadSessionFileCache extends RammerheadSessionAbstractStore {
    /**
     * @param {object} options
     * @param {string} options.saveDirectory - sessions evicted from the memory cache after cacheTimeout will be saved in this folder
     * to avoid storing all the sessions in memory.
     * @param {RammerheadLogging|undefined} options.logger
     * @param {number} options.cacheTimeout - timeout before saving a cached session to disk and deleting it from the cache
     * @param {number} options.cacheCheckInterval
     * @param {boolean} options.deleteUnused - (default: true) if set to true, deletes unused sessions when saving the cache to disk
     * @param {boolean} options.deleteCorruptedSessions - (default: true) if set to true, auto-deletes session files that
     * give a parse error (happens when nodejs exits abruptly while serializing a session to disk)
     * @param {object|null} options.staleCleanupOptions - set to null to disable cleaning up stale sessions
     * @param {number|null} options.staleCleanupOptions.staleTimeout - stale sessions inside saveDirectory that go over
     * this timeout will be deleted. Set to null to disable.
     * @param {number|null} options.staleCleanupOptions.maxToLive - any created sessions older than this will be deleted no matter the usage.
     * Set to null to disable.
     * @param {number} options.staleCleanupOptions.staleCheckInterval
     */
    constructor({
        saveDirectory = path.join(__dirname, '../../sessions'),
        logger = new RammerheadLogging({ logLevel: 'disabled' }),
        cacheTimeout = 1000 * 60 * 20, // 20 minutes
        cacheCheckInterval = 1000 * 60 * 10, // 10 minutes
        deleteUnused = true,
        deleteCorruptedSessions = true,
        staleCleanupOptions = {
            staleTimeout: 1000 * 60 * 60 * 24 * 1, // 1 day
            maxToLive: 1000 * 60 * 60 * 24 * 4, // four days
            staleCheckInterval: 1000 * 60 * 60 * 1 // 1 hour
        }
    } = {}) {
        super();
        this.saveDirectory = saveDirectory;
        this.logger = logger;
        this.deleteUnused = deleteUnused;
        this.cacheTimeout = cacheTimeout;
        this.deleteCorruptedSessions = deleteCorruptedSessions;
        /**
         * @type {Map.<string, RammerheadSession>}
         */
        this.cachedSessions = new Map();
        setInterval(() => this._saveCacheToDisk(), cacheCheckInterval).unref();
        if (staleCleanupOptions) {
            this._removeStaleSessions(staleCleanupOptions.staleTimeout, staleCleanupOptions.maxToLive);
            setInterval(
                () => this._removeStaleSessions(staleCleanupOptions.staleTimeout, staleCleanupOptions.maxToLive),
                staleCleanupOptions.staleCheckInterval
            ).unref();
        }
    }

    keysStore() {
        return fs
            .readdirSync(this.saveDirectory)
            .filter((file) => file.endsWith(sessionFileExtension))
            .map((file) => file.slice(0, -sessionFileExtension.length));
    }
    /**
     * @returns {string[]} - list of session ids in store
     */
    keys() {
        const arr = this.keysStore();
        for (const id of this.cachedSessions.keys()) {
            if (!arr.includes(id)) arr.push(id);
        }
        return arr;
    }
    /**
     * @param {string} id
     * @returns {boolean}
     */
    has(id) {
        return this.cachedSessions.has(id) || fs.existsSync(this._getSessionFilePath(id));
    }
    /**
     * @param {string} id
     * @param {boolean} updateActiveTimestamp
     * @returns {RammerheadSession|undefined}
     */
    get(id, updateActiveTimestamp = true, cacheToMemory = true) {
        if (!this.has(id)) {
            this.logger.debug(`(FileCache.get) ${id} does not exist`);
            return;
        }

        this.logger.debug(`(FileCache.get) ${id}`);
        if (this.cachedSessions.has(id)) {
            this.logger.debug(`(FileCache.get) returning memory cached session ${id}`);
            return this.cachedSessions.get(id);
        }

        let session;
        try {
            session = RammerheadSession.DeserializeSession(id, fs.readFileSync(this._getSessionFilePath(id)));
        } catch (e) {
            if (e.name === 'SyntaxError' && e.message.includes('JSON')) {
                this.logger.warn(`(FileCache.get) ${id} bad JSON`);
                if (this.deleteCorruptedSessions) {
                    this.delete(id);
                    this.logger.warn(`(FileCache.get) ${id} deleted because of bad JSON`);
                }
                return;
            }
            // rethrow anything that is not a corrupted-session parse error,
            // so we never continue below with an undefined session
            throw e;
        }

        if (updateActiveTimestamp) {
            this.logger.debug(`(FileCache.get) ${id} update active timestamp`);
            session.updateLastUsed();
        }

        if (cacheToMemory) {
            this.cachedSessions.set(id, session);
            this.logger.debug(`(FileCache.get) saved ${id} into cache memory`);
        }

        return session;
    }
    /**
     * @param {string} id
     * @returns {RammerheadSession}
     */
    add(id) {
        if (this.has(id)) throw new Error(`session ${id} already exists`);

        fs.writeFileSync(this._getSessionFilePath(id), new RammerheadSession().serializeSession());

        this.logger.debug(`FileCache.add ${id}`);

        return this.get(id);
    }
    /**
     * @param {string} id
     * @returns {boolean} - returns true when a delete operation is performed
     */
    delete(id) {
        if (id === RammerheadSession.autocompleteId) return false;
        this.logger.debug(`(FileCache.delete) deleting ${id}`);
        if (this.has(id)) {
            fs.unlinkSync(this._getSessionFilePath(id));
            this.cachedSessions.delete(id);
            this.logger.debug(`(FileCache.delete) deleted ${id}`);
            return true;
        }
        this.logger.debug(`(FileCache.delete) ${id} does not exist`);
        return false;
    }
    /**
     * @param {string} id
     * @param {string} serializedSession
     */
    addSerializedSession(id, serializedSession) {
        this.logger.debug(`(FileCache.addSerializedSession) adding serialized session id ${id} to store`);
        const session = RammerheadSession.DeserializeSession(id, serializedSession);
        fs.writeFileSync(this._getSessionFilePath(id), session.serializeSession());
        this.logger.debug(`(FileCache.addSerializedSession) added ${id} to cache`);
    }
    close() {
        this.logger.debug(`(FileCache.close) calling _saveCacheToDisk`);
        this._saveCacheToDisk(true);
    }

    /**
     * @private
     * @param {string} id
     * @returns {string} - generated file path to session
     */
    _getSessionFilePath(id) {
        return path.join(this.saveDirectory, id.replace(/\/|\\/g, '') + sessionFileExtension);
    }
    /**
     * @private
     * @param {number|null} staleTimeout
     * @param {number|null} maxToLive
     */
    _removeStaleSessions(staleTimeout, maxToLive) {
        const sessionIds = this.keysStore();
        let deleteCount = 0;
        this.logger.debug(`(FileCache._removeStaleSessions) Need to go through ${sessionIds.length} sessions in store`);

        const now = Date.now();
        for (const id of sessionIds) {
            const session = this.get(id, false, false);
            if (!session) {
                this.logger.debug(`(FileCache._removeStaleSessions) skipping ${id} as .get() returned undefined`);
                continue;
            }
            if (
                (staleTimeout && now - session.lastUsed > staleTimeout) ||
                (maxToLive && now - session.createdAt > maxToLive)
            ) {
                this.delete(id);
                deleteCount++;
                this.logger.debug(`(FileCache._removeStaleSessions) deleted ${id}`);
            }
        }

        this.logger.debug(`(FileCache._removeStaleSessions) Deleted ${deleteCount} sessions from store`);
    }
    /**
     * @private
     */
    _saveCacheToDisk(forceSave) {
        let deleteCount = 0;
        this.logger.debug(`(FileCache._saveCacheToDisk) need to go through ${this.cachedSessions.size} sessions`);

        const now = Date.now();
        for (const [sessionId, session] of this.cachedSessions) {
|
||||
if (forceSave || now - session.lastUsed > this.cacheTimeout) {
|
||||
if (session.lastUsed === session.createdAt && this.deleteUnused) {
|
||||
this.cachedSessions.delete(sessionId);
|
||||
deleteCount++;
|
||||
this.logger.debug(`(FileCache._saveCacheToDisk) deleted unused ${sessionId} from memory`);
|
||||
} else {
|
||||
fs.writeFileSync(this._getSessionFilePath(sessionId), session.serializeSession());
|
||||
this.cachedSessions.delete(sessionId);
|
||||
deleteCount++;
|
||||
this.logger.debug(
|
||||
`(FileCache._saveCacheToDisk) removed ${sessionId} from memory and saved to store`
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
this.logger.debug(`(FileCache._saveCacheToDisk) Removed ${deleteCount} sessions from memory`);
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = RammerheadSessionFileCache;
|
||||
File diff suppressed because it is too large
@@ -1,497 +0,0 @@
(function () {
    var hammerhead = window['%hammerhead%'];
    if (!hammerhead) throw new Error('hammerhead not loaded yet');
    if (hammerhead.settings._settings.sessionId) {
        // task.js already loaded. this will likely never happen though, since this file loads before task.js
        console.warn('unexpected: task.js loaded before rammerhead.js. url shuffling cannot be used');
        main();
    } else {
        // wait for task.js to load
        hookHammerheadStartOnce(main);
        // before task.js, we need to add url shuffling
        addUrlShuffling();
    }

    function main() {
        fixUrlRewrite();
        fixElementGetter();
        fixCrossWindowLocalStorage();

        delete window.overrideGetProxyUrl;
        delete window.overrideParseProxyUrl;
        delete window.overrideIsCrossDomainWindows;

        // other code if they want to also hook onto hammerhead start //
        if (window.rammerheadStartListeners) {
            for (const eachListener of window.rammerheadStartListeners) {
                try {
                    eachListener();
                } catch (e) {
                    console.error(e);
                }
            }
            delete window.rammerheadStartListeners;
        }

        // sync localStorage code //
        // disable if other code wants to implement their own localStorage site wrapper
        if (window.rammerheadDisableLocalStorageImplementation) {
            delete window.rammerheadDisableLocalStorageImplementation;
            return;
        }
        // consts
        var timestampKey = 'rammerhead_synctimestamp';
        var updateInterval = 5000;
        var isSyncing = false;

        var proxiedLocalStorage = localStorage;
        var realLocalStorage = proxiedLocalStorage.internal.nativeStorage;
        var sessionId = hammerhead.settings._settings.sessionId;
        var origin = window.__get$(window, 'location').origin;
        var keyChanges = [];

        try {
            syncLocalStorage();
        } catch (e) {
            if (e.message !== 'server wants to disable localStorage syncing') {
                throw e;
            }
            return;
        }
        proxiedLocalStorage.addChangeEventListener(function (event) {
            if (isSyncing) return;
            if (keyChanges.indexOf(event.key) === -1) keyChanges.push(event.key);
        });
        setInterval(function () {
            var update = compileUpdate();
            if (!update) return;
            localStorageRequest({ type: 'update', updateData: update }, function (data) {
                updateTimestamp(data.timestamp);
            });

            keyChanges = [];
        }, updateInterval);
        document.addEventListener('visibilitychange', function () {
            if (document.visibilityState === 'hidden') {
                var update = compileUpdate();
                if (update) {
                    // even though we'll never get the timestamp, it's fine. this way,
                    // the data is safer
                    hammerhead.nativeMethods.sendBeacon.call(
                        window.navigator,
                        getSyncStorageEndpoint(),
                        JSON.stringify({
                            type: 'update',
                            updateData: update
                        })
                    );
                }
            }
        });

        function syncLocalStorage() {
            isSyncing = true;
            var timestamp = getTimestamp();
            var response;
            if (!timestamp) {
                // first time syncing
                response = localStorageRequest({ type: 'sync', fetch: true });
                if (response.timestamp) {
                    updateTimestamp(response.timestamp);
                    overwriteLocalStorage(response.data);
                }
            } else {
                // resync
                response = localStorageRequest({ type: 'sync', timestamp: timestamp, data: proxiedLocalStorage });
                if (response.timestamp) {
                    updateTimestamp(response.timestamp);
                    overwriteLocalStorage(response.data);
                }
            }
            isSyncing = false;

            function overwriteLocalStorage(data) {
                if (!data || typeof data !== 'object') throw new TypeError('data must be an object');
                proxiedLocalStorage.clear();
                for (var prop in data) {
                    proxiedLocalStorage[prop] = data[prop];
                }
            }
        }
        function updateTimestamp(timestamp) {
            if (!timestamp) throw new TypeError('timestamp must be defined');
            if (isNaN(parseInt(timestamp))) throw new TypeError('timestamp must be a number. received ' + timestamp);
            realLocalStorage[timestampKey] = timestamp;
        }
        function getTimestamp() {
            var rawTimestamp = realLocalStorage[timestampKey];
            var timestamp = parseInt(rawTimestamp);
            if (isNaN(timestamp)) {
                if (rawTimestamp) {
                    console.warn('invalid timestamp retrieved from storage: ' + rawTimestamp);
                }
                return null;
            }
            return timestamp;
        }
        function getSyncStorageEndpoint() {
            return (
                '/syncLocalStorage?sessionId=' + encodeURIComponent(sessionId) + '&origin=' + encodeURIComponent(origin)
            );
        }
        function localStorageRequest(data, callback) {
            if (!data || typeof data !== 'object') throw new TypeError('data must be an object');

            var request = hammerhead.createNativeXHR();
            // make synchronous if there is no callback
            request.open('POST', getSyncStorageEndpoint(), !!callback);
            request.setRequestHeader('content-type', 'application/json');
            request.send(JSON.stringify(data));
            function check() {
                if (request.status === 404) {
                    throw new Error('server wants to disable localStorage syncing');
                }
                if (request.status !== 200)
                    throw new Error(
                        'server sent a non 200 code. got ' + request.status + '. Response: ' + request.responseText
                    );
            }
            if (!callback) {
                check();
                return JSON.parse(request.responseText);
            } else {
                request.onload = function () {
                    check();
                    callback(JSON.parse(request.responseText));
                };
            }
        }
        function compileUpdate() {
            if (!keyChanges.length) return null;

            var updates = {};
            for (var i = 0; i < keyChanges.length; i++) {
                updates[keyChanges[i]] = proxiedLocalStorage[keyChanges[i]];
            }

            keyChanges = [];
            return updates;
        }
    }

    var noShuffling = false;
    function addUrlShuffling() {
        const request = new XMLHttpRequest();
        const sessionId = (location.pathname.slice(1).match(/^[a-z0-9]+/i) || [])[0];
        if (!sessionId) {
            console.warn('cannot get session id from url');
            return;
        }
        request.open('GET', '/rammer/api/shuffleDict?id=' + sessionId, false);
        request.send();
        if (request.status !== 200) {
            console.warn(
                `received a non 200 status code while trying to fetch shuffleDict:\nstatus: ${request.status}\nresponse: ${request.responseText}`
            );
            return;
        }
        const shuffleDict = JSON.parse(request.responseText);
        if (!shuffleDict) return;

        // pasting entire thing here "because lazy" - m28
        const mod = (n, m) => ((n % m) + m) % m;
        const baseDictionary = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz~-';
        const shuffledIndicator = '_rhs';
        const generateDictionary = function () {
            let str = '';
            const split = baseDictionary.split('');
            while (split.length > 0) {
                str += split.splice(Math.floor(Math.random() * split.length), 1)[0];
            }
            return str;
        };
        class StrShuffler {
            constructor(dictionary = generateDictionary()) {
                this.dictionary = dictionary;
            }
            shuffle(str) {
                if (str.startsWith(shuffledIndicator)) {
                    return str;
                }
                let shuffledStr = '';
                for (let i = 0; i < str.length; i++) {
                    const char = str.charAt(i);
                    const idx = baseDictionary.indexOf(char);
                    if (char === '%' && str.length - i >= 3) {
                        shuffledStr += char;
                        shuffledStr += str.charAt(++i);
                        shuffledStr += str.charAt(++i);
                    } else if (idx === -1) {
                        shuffledStr += char;
                    } else {
                        shuffledStr += this.dictionary.charAt(mod(idx + i, baseDictionary.length));
                    }
                }
                return shuffledIndicator + shuffledStr;
            }
            unshuffle(str) {
                if (!str.startsWith(shuffledIndicator)) {
                    return str;
                }

                str = str.slice(shuffledIndicator.length);

                let unshuffledStr = '';
                for (let i = 0; i < str.length; i++) {
                    const char = str.charAt(i);
                    const idx = this.dictionary.indexOf(char);
                    if (char === '%' && str.length - i >= 3) {
                        unshuffledStr += char;
                        unshuffledStr += str.charAt(++i);
                        unshuffledStr += str.charAt(++i);
                    } else if (idx === -1) {
                        unshuffledStr += char;
                    } else {
                        unshuffledStr += baseDictionary.charAt(mod(idx - i, baseDictionary.length));
                    }
                }
                return unshuffledStr;
            }
        }

        function patch(url) {
            // url = _rhsEPrcb://bqhQko.tHR/
            // remove slash
            return url.replace(/(^.*?:\/)\//, '$1');
        }

        function unpatch(url) {
            // url = _rhsEPrcb:/bqhQko.tHR/
            // restore slash
            return url.replace(/^.*?:\/(?!\/)/, '$&/');
        }

        const replaceUrl = (url, replacer) => {
            // regex: https://google.com/ sessionid/ url
            return (url || '').replace(/^((?:[a-z0-9]+:\/\/[^/]+)?(?:\/[^/]+\/))([^]+)/i, function (_, g1, g2) {
                return g1 + replacer(g2);
            });
        };

        const shuffler = new StrShuffler(shuffleDict);

        // shuffle current url if it isn't already shuffled (unshuffled urls likely come from user input)
        const oldUrl = location.href;
        const newUrl = replaceUrl(location.href, (url) => shuffler.shuffle(url));
        if (oldUrl !== newUrl) {
            history.replaceState(null, null, newUrl);
        }

        const getProxyUrl = hammerhead.utils.url.getProxyUrl;
        const parseProxyUrl = hammerhead.utils.url.parseProxyUrl;
        hammerhead.utils.url.overrideGetProxyUrl(function (url, opts) {
            if (noShuffling) {
                return getProxyUrl(url, opts);
            }
            return replaceUrl(getProxyUrl(url, opts), (u) => patch(shuffler.shuffle(u)));
        });
        hammerhead.utils.url.overrideParseProxyUrl(function (url) {
            return parseProxyUrl(replaceUrl(url, (u) => shuffler.unshuffle(unpatch(u))));
        });
        // manual hooks //
        window.overrideGetProxyUrl(
            (getProxyUrl$1) =>
                function (url, opts) {
                    if (noShuffling) {
                        return getProxyUrl$1(url, opts);
                    }
                    return replaceUrl(getProxyUrl$1(url, opts), (u) => patch(shuffler.shuffle(u)));
                }
        );
        window.overrideParseProxyUrl(
            (parseProxyUrl$1) =>
                function (url) {
                    return parseProxyUrl$1(replaceUrl(url, (u) => shuffler.unshuffle(unpatch(u))));
                }
        );
    }
    function fixUrlRewrite() {
        const port = location.port || (location.protocol === 'https:' ? '443' : '80');
        const getProxyUrl = hammerhead.utils.url.getProxyUrl;
        hammerhead.utils.url.overrideGetProxyUrl(function (url, opts = {}) {
            if (!opts.proxyPort) {
                opts.proxyPort = port;
            }
            return getProxyUrl(url, opts);
        });
        window.overrideParseProxyUrl(
            (parseProxyUrl$1) =>
                function (url) {
                    const parsed = parseProxyUrl$1(url);
                    if (!parsed || !parsed.proxy) return parsed;
                    if (!parsed.proxy.port) {
                        parsed.proxy.port = port;
                    }
                    return parsed;
                }
        );
    }
    function fixElementGetter() {
        const fixList = {
            HTMLAnchorElement: ['href'],
            HTMLAreaElement: ['href'],
            HTMLBaseElement: ['href'],
            HTMLEmbedElement: ['src'],
            HTMLFormElement: ['action'],
            HTMLFrameElement: ['src'],
            HTMLIFrameElement: ['src'],
            HTMLImageElement: ['src'],
            HTMLInputElement: ['src'],
            HTMLLinkElement: ['href'],
            HTMLMediaElement: ['src'],
            HTMLModElement: ['cite'],
            HTMLObjectElement: ['data'],
            HTMLQuoteElement: ['cite'],
            HTMLScriptElement: ['src'],
            HTMLSourceElement: ['src'],
            HTMLTrackElement: ['src']
        };
        const urlRewrite = (url) => (hammerhead.utils.url.parseProxyUrl(url) || {}).destUrl || url;
        for (const ElementClass in fixList) {
            for (const attr of fixList[ElementClass]) {
                if (!window[ElementClass]) {
                    console.warn('unexpected unsupported element class ' + ElementClass);
                    continue;
                }
                const desc = Object.getOwnPropertyDescriptor(window[ElementClass].prototype, attr);
                const originalGet = desc.get;
                desc.get = function () {
                    return urlRewrite(originalGet.call(this));
                };
                if (attr === 'action') {
                    const originalSet = desc.set;
                    // don't shuffle form action urls
                    desc.set = function (value) {
                        noShuffling = true;
                        try {
                            var returnVal = originalSet.call(this, value);
                        } catch (e) {
                            noShuffling = false;
                            throw e;
                        }
                        noShuffling = false;
                        return returnVal;
                    };
                }
                Object.defineProperty(window[ElementClass].prototype, attr, desc);
            }
        }
    }
    function fixCrossWindowLocalStorage() {
        // completely replace hammerhead's implementation, as calling restore() and save() on every
        // call is just not viable (mainly memory issues, as the garbage collector is sometimes not fast enough)

        const prefix = `rammerhead|storage-wrapper|${hammerhead.settings._settings.sessionId}|${
            window.__get$(window, 'location').host
        }|`;
        const toRealStorageKey = (key = '') => prefix + key;
        const fromRealStorageKey = (key = '') => {
            if (!key.startsWith(prefix)) return null;
            return key.slice(prefix.length);
        };

        const replaceStorageInstance = (storageProp, realStorage) => {
            const reservedProps = ['internal', 'clear', 'key', 'getItem', 'setItem', 'removeItem', 'length'];
            Object.defineProperty(window, storageProp, {
                // define a value-based instead of getter-based property, since with this localStorage implementation,
                // we don't need to rely on sharing a single memory-based storage across frames, unlike hammerhead
                configurable: true,
                writable: true,
                // still use window[storageProp] as basis to allow scripts to access localStorage.internal
                value: new Proxy(window[storageProp], {
                    get(target, prop, receiver) {
                        if (reservedProps.includes(prop) && prop !== 'length') {
                            return Reflect.get(target, prop, receiver);
                        } else if (prop === 'length') {
                            let len = 0;
                            for (const [key] of Object.entries(realStorage)) {
                                if (fromRealStorageKey(key)) len++;
                            }
                            return len;
                        } else {
                            return realStorage[toRealStorageKey(prop)];
                        }
                    },
                    set(_, prop, value) {
                        if (!reservedProps.includes(prop)) {
                            realStorage[toRealStorageKey(prop)] = value;
                        }
                        return true;
                    },
                    deleteProperty(_, prop) {
                        delete realStorage[toRealStorageKey(prop)];
                        return true;
                    },
                    has(target, prop) {
                        return toRealStorageKey(prop) in realStorage || prop in target;
                    },
                    ownKeys() {
                        const list = [];
                        for (const [key] of Object.entries(realStorage)) {
                            const proxyKey = fromRealStorageKey(key);
                            if (proxyKey && !reservedProps.includes(proxyKey)) list.push(proxyKey);
                        }
                        return list;
                    },
                    getOwnPropertyDescriptor(_, prop) {
                        return Object.getOwnPropertyDescriptor(realStorage, toRealStorageKey(prop));
                    },
                    defineProperty(_, prop, desc) {
                        if (!reservedProps.includes(prop)) {
                            Object.defineProperty(realStorage, toRealStorageKey(prop), desc);
                        }
                        return true;
                    }
                })
            });
        };
        const rewriteFunction = (prop, newFunc) => {
            Storage.prototype[prop] = new Proxy(Storage.prototype[prop], {
                apply(_, thisArg, args) {
                    return newFunc.apply(thisArg, args);
                }
            });
        };

        replaceStorageInstance('localStorage', hammerhead.storages.localStorageProxy.internal.nativeStorage);
        replaceStorageInstance('sessionStorage', hammerhead.storages.sessionStorageProxy.internal.nativeStorage);
        rewriteFunction('clear', function () {
            for (const [key] of Object.entries(this)) {
                delete this[key];
            }
        });
        rewriteFunction('key', function (keyNum) {
            return (Object.entries(this)[keyNum] || [])[0] || null;
        });
        rewriteFunction('getItem', function (key) {
            return this.internal.nativeStorage[toRealStorageKey(key)] || null;
        });
        rewriteFunction('setItem', function (key, value) {
            if (key) {
                this.internal.nativeStorage[toRealStorageKey(key)] = value;
            }
        });
        rewriteFunction('removeItem', function (key) {
            delete this.internal.nativeStorage[toRealStorageKey(key)];
        });
    }

    function hookHammerheadStartOnce(callback) {
        var originalStart = hammerhead.__proto__.start;
        hammerhead.__proto__.start = function () {
            originalStart.apply(this, arguments);
            hammerhead.__proto__.start = originalStart;
            callback();
        };
    }
})();
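The shuffling scheme in the client script above is a position-dependent substitution cipher over `baseDictionary`, with `%XX` escapes and out-of-dictionary characters passed through. The key invariant is that `unshuffle(shuffle(url))` returns the original url for any permutation of the dictionary. The sketch below restates only `shuffle`/`unshuffle` with a deterministic demo dictionary (a rotation, chosen for illustration; the real code uses a random permutation):

```javascript
// Round-trip check of the StrShuffler scheme from rammerhead.js (client).
const mod = (n, m) => ((n % m) + m) % m;
const baseDictionary = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz~-';
const shuffledIndicator = '_rhs';

// deterministic "shuffled" dictionary for the demo: rotate by one
const dictionary = baseDictionary.slice(1) + baseDictionary[0];

function shuffle(str) {
    if (str.startsWith(shuffledIndicator)) return str; // never double-shuffle
    let out = '';
    for (let i = 0; i < str.length; i++) {
        const char = str.charAt(i);
        const idx = baseDictionary.indexOf(char);
        if (char === '%' && str.length - i >= 3) {
            out += char + str.charAt(++i) + str.charAt(++i); // keep %XX escapes intact
        } else if (idx === -1) {
            out += char; // ':' '/' '.' '?' '=' etc. pass through
        } else {
            out += dictionary.charAt(mod(idx + i, baseDictionary.length));
        }
    }
    return shuffledIndicator + out;
}

function unshuffle(str) {
    if (!str.startsWith(shuffledIndicator)) return str;
    str = str.slice(shuffledIndicator.length);
    let out = '';
    for (let i = 0; i < str.length; i++) {
        const char = str.charAt(i);
        const idx = dictionary.indexOf(char);
        if (char === '%' && str.length - i >= 3) {
            out += char + str.charAt(++i) + str.charAt(++i);
        } else if (idx === -1) {
            out += char;
        } else {
            out += baseDictionary.charAt(mod(idx - i, baseDictionary.length));
        }
    }
    return out;
}

const url = 'https://example.com/some%20path?q=1';
console.log(shuffle(url).startsWith('_rhs')); // true
console.log(unshuffle(shuffle(url)) === url); // true
```

Because the result carries the `_rhs` prefix, shuffling is idempotent: `shuffle(shuffle(url)) === shuffle(url)`.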
@@ -1,110 +0,0 @@
const cookie = require('cookie');
const path = require('path');
const fs = require('fs');
const os = require('os');

module.exports = {
    //// HOSTING CONFIGURATION ////

    bindingAddress: '127.0.0.1',
    port: process.env.PORT,
    crossDomainPort: null,
    publicDir: path.join(__dirname, '../public'), // set to null to disable

    // if workers is null or 1, multithreading is disabled
    workers: os.cpus().length,

    // ssl object is either null or { key: fs.readFileSync('path/to/key'), cert: fs.readFileSync('path/to/cert') }
    // for more info, see https://nodejs.org/api/https.html#https_https_createserver_options_requestlistener
    ssl: null,

    // this function's return object will determine how the client url rewriting will work.
    // set them differently from bindingAddress and port if rammerhead is being served
    // from a reverse proxy.
    getServerInfo: (req) => {
        const { origin_proxy } = cookie.parse(req.headers.cookie || '');

        let origin;

        try {
            origin = new URL(origin_proxy);
        } catch (error) {
            console.log(error, req.headers.cookie);
            origin = new URL(`${req.socket.encrypted ? 'https:' : 'http:'}//${req.headers.host}`);
        }

        const { hostname, port, protocol } = origin;

        return {
            hostname,
            port,
            crossDomainPort: port,
            protocol
        };
    },
    // example of non-hard-coding the hostname header
    // getServerInfo: (req) => {
    //     return { hostname: new URL('http://' + req.headers.host).hostname, port: 443, crossDomainPort: 8443, protocol: 'https:' };
    // },

    // enforce a password for creating new sessions. set to null to disable
    password: null,

    // disable or enable localStorage sync (turn off if clients send over huge localStorage data, resulting in huge memory usage)
    disableLocalStorageSync: false,

    // restrict sessions to be used only from the same IP
    restrictSessionToIP: true,

    // use disk for caching js rewrites. set to null to use memory instead (not recommended for HDD disks)
    diskJsCachePath: path.join(__dirname, '../cache-js'),
    jsCacheSize: 5 * 1024 * 1024 * 1024, // recommended: 50mb for memory, 5gb for disk

    //// REWRITE HEADER CONFIGURATION ////

    // removes reverse proxy headers
    // cloudflare example:
    // stripClientHeaders: ['cf-ipcountry', 'cf-ray', 'x-forwarded-proto', 'cf-visitor', 'cf-connecting-ip', 'cdn-loop', 'x-forwarded-for'],
    stripClientHeaders: [],
    // if you want to modify response headers, like removing the x-frame-options header, do it like so:
    // rewriteServerHeaders: {
    //     // you can also specify a function to modify/add the header using the original value (undefined if adding the header)
    //     // 'x-frame-options': (originalHeaderValue) => '',
    //     'x-frame-options': null, // set to null to tell rammerhead that you want to delete it
    // },
    rewriteServerHeaders: {
        // you can also specify a function to modify/add the header using the original value (undefined if adding the header)
        // 'x-frame-options': (originalHeaderValue) => '',
        'x-frame-options': null // set to null to tell rammerhead that you want to delete it
    },

    //// SESSION STORE CONFIG ////

    // see src/classes/RammerheadSessionFileCache.js for more details and options
    fileCacheSessionConfig: {
        saveDirectory: path.join(__dirname, '../sessions'),
        cacheTimeout: 1000 * 60 * 20, // 20 minutes
        cacheCheckInterval: 1000 * 60 * 10, // 10 minutes
        deleteUnused: true,
        staleCleanupOptions: {
            staleTimeout: 1000 * 60 * 60 * 24 * 3, // 3 days
            maxToLive: null,
            staleCheckInterval: 1000 * 60 * 60 * 6 // 6 hours
        },
        // corrupted session files happen when nodejs exits abruptly while serializing the JSON sessions to disk
        deleteCorruptedSessions: true
    },

    //// LOGGING CONFIGURATION ////

    // valid values: 'disabled', 'debug', 'traffic', 'info', 'warn', 'error'
    logLevel: process.env.DEVELOPMENT ? 'debug' : 'info',
    generatePrefix: (level) => `[${new Date().toISOString()}] [${level.toUpperCase()}] `,

    // logger depends on this value. this form also works when rammerhead is
    // sitting behind a reverse proxy like nginx, since it prefers x-forwarded-for
    getIP: (req) => (req.headers['x-forwarded-for'] || req.connection.remoteAddress || '').split(',')[0].trim()
};

if (fs.existsSync(path.join(__dirname, '../holy-config.js'))) Object.assign(module.exports, require('../holy-config'));
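The `getIP` setting above takes the first entry of `x-forwarded-for` (the original client as reported by the outermost proxy) and falls back to the socket's remote address. A quick check of that logic with mocked request objects (the IPs are made-up documentation addresses):

```javascript
// Mocked-request check of the config.js getIP logic.
const getIP = (req) =>
    (req.headers['x-forwarded-for'] || req.connection.remoteAddress || '').split(',')[0].trim();

const behindProxy = {
    headers: { 'x-forwarded-for': '203.0.113.7, 10.0.0.1' }, // client, then proxy hop
    connection: { remoteAddress: '10.0.0.1' }
};
const direct = { headers: {}, connection: { remoteAddress: '192.0.2.5' } };

console.log(getIP(behindProxy)); // '203.0.113.7'
console.log(getIP(direct)); // '192.0.2.5'
```

Note this trusts `x-forwarded-for` as-is, which is only safe when the header is set by a reverse proxy you control (otherwise clients can spoof it and bypass `restrictSessionToIP`).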
@@ -1,23 +0,0 @@
const RammerheadProxy = require('./classes/RammerheadProxy');
const RammerheadLogging = require('./classes/RammerheadLogging');
const RammerheadSession = require('./classes/RammerheadSession');
const RammerheadSessionAbstractStore = require('./classes/RammerheadSessionAbstractStore');
const RammerheadSessionFileCache = require('./classes/RammerheadSessionFileCache');
const generateId = require('./util/generateId');
const addStaticFilesToProxy = require('./util/addStaticDirToProxy');
const RammerheadSessionMemoryStore = require('./classes/RammerheadMemoryStore');
const StrShuffler = require('./util/StrShuffler');
const URLPath = require('./util/URLPath');

module.exports = {
    RammerheadProxy,
    RammerheadLogging,
    RammerheadSession,
    RammerheadSessionAbstractStore,
    RammerheadSessionMemoryStore,
    RammerheadSessionFileCache,
    StrShuffler,
    generateId,
    addStaticFilesToProxy,
    URLPath
};
@@ -1 +0,0 @@
require('./server/index.js');
@@ -1,53 +0,0 @@
const exitHook = require('async-exit-hook');
const RammerheadProxy = require('../classes/RammerheadProxy');
const addStaticDirToProxy = require('../util/addStaticDirToProxy');
const RammerheadSessionFileCache = require('../classes/RammerheadSessionFileCache');
const config = require('../config');
const setupRoutes = require('./setupRoutes');
const setupPipeline = require('./setupPipeline');
const RammerheadLogging = require('../classes/RammerheadLogging');

/**
 * @returns {import('node:http').Server}
 */
function createRammerhead() {
    require.main = module;

    const logger = new RammerheadLogging({
        logLevel: config.logLevel,
        generatePrefix: (level) => config.generatePrefix(level)
    });

    const proxyServer = new RammerheadProxy({
        logger,
        loggerGetIP: config.getIP,
        bindingAddress: config.bindingAddress,
        port: config.port,
        crossDomainPort: null,
        dontListen: true,
        ssl: config.ssl,
        getServerInfo: config.getServerInfo,
        disableLocalStorageSync: config.disableLocalStorageSync,
        diskJsCachePath: config.diskJsCachePath,
        jsCacheSize: config.jsCacheSize
    });

    if (config.publicDir) addStaticDirToProxy(proxyServer, config.publicDir);

    const fileCacheOptions = { logger, ...config.fileCacheSessionConfig };
    const sessionStore = new RammerheadSessionFileCache(fileCacheOptions);
    sessionStore.attachToProxy(proxyServer);

    setupPipeline(proxyServer, sessionStore);
    setupRoutes(proxyServer, sessionStore, logger);

    // nicely close proxy server and save sessions to store before we exit
    exitHook(() => {
        proxyServer.close();
    });

    return proxyServer.server1;
}

module.exports = createRammerhead;
@@ -1,29 +0,0 @@
const config = require('../config');
const getSessionId = require('../util/getSessionId');

/**
 * @param {import('../classes/RammerheadProxy')} proxyServer
 * @param {import('../classes/RammerheadSessionAbstractStore')} sessionStore
 */
module.exports = function setupPipeline(proxyServer, sessionStore) {
    // remove headers defined in config.js
    proxyServer.addToOnRequestPipeline((req, res, _serverInfo, isRoute) => {
        if (isRoute) return; // only strip headers on requests going to the proxy destination website

        // restrict session to IP if enabled
        if (config.restrictSessionToIP) {
            const sessionId = getSessionId(req.url);
            const session = sessionId && sessionStore.get(sessionId);
            if (session && session.data.restrictIP && session.data.restrictIP !== config.getIP(req)) {
                res.writeHead(403);
                res.end('Sessions must come from the same IP');
                return true;
            }
        }

        for (const eachHeader of config.stripClientHeaders) {
            delete req.headers[eachHeader];
        }
    });
    Object.assign(proxyServer.rewriteServerHeaders, config.rewriteServerHeaders);
};
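The header-stripping loop in the pipeline above mutates the incoming request in place before it is forwarded upstream. A self-contained sketch of that effect, using a shortened version of the cloudflare header list from config.js and a made-up request object:

```javascript
// Sketch of the stripClientHeaders pass: configured headers are deleted
// from req.headers in place before the request reaches the destination site.
const stripClientHeaders = ['cf-ipcountry', 'cf-ray', 'x-forwarded-for'];

const req = {
    headers: {
        host: 'example.com',
        'cf-ray': 'abc123', // hypothetical values
        'cf-ipcountry': 'US',
        'x-forwarded-for': '203.0.113.7'
    }
};

for (const eachHeader of stripClientHeaders) {
    delete req.headers[eachHeader];
}

console.log(Object.keys(req.headers)); // [ 'host' ]
```

Deleting rather than blanking the headers matters: the destination site then cannot tell that the request ever passed through a reverse proxy.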
@@ -1,115 +0,0 @@
const generateId = require('../util/generateId');
|
||||
const URLPath = require('../util/URLPath');
|
||||
const httpResponse = require('../util/httpResponse');
const config = require('../config');
const StrShuffler = require('../util/StrShuffler');
const RammerheadSession = require('../classes/RammerheadSession');

/**
 *
 * @param {import('../classes/RammerheadProxy')} proxyServer
 * @param {import('../classes/RammerheadSessionAbstractStore')} sessionStore
 * @param {import('../classes/RammerheadLogging')} logger
 */
module.exports = function setupRoutes(proxyServer, sessionStore, logger) {
    const isNotAuthorized = (req, res) => {
        if (!config.password) return;
        const { pwd } = new URLPath(req.url).getParams();
        if (config.password !== pwd) {
            httpResponse.accessForbidden(logger, req, res, config.getIP(req), 'bad password');
            return true;
        }
        return false;
    };
    if (process.env.DEVELOPMENT) {
        proxyServer.GET('/garbageCollect', (req, res) => {
            global.gc();
            res.end('Ok');
        });
    }
    proxyServer.GET('/needpassword', (req, res) => {
        res.end(config.password ? 'true' : 'false');
    });
    proxyServer.GET('/newsession', (req, res) => {
        if (isNotAuthorized(req, res)) return;

        const id = generateId();
        const session = new RammerheadSession();
        session.data.restrictIP = config.getIP(req);

        // workaround for saving the modified session to disk
        sessionStore.addSerializedSession(id, session.serializeSession());
        res.end(id);
    });
    proxyServer.GET('/editsession', (req, res) => {
        if (isNotAuthorized(req, res)) return;

        let { id, httpProxy, enableShuffling } = new URLPath(req.url).getParams();

        if (!id || !sessionStore.has(id)) {
            return httpResponse.badRequest(logger, req, res, config.getIP(req), 'not found');
        }

        const session = sessionStore.get(id);

        if (httpProxy) {
            if (httpProxy.startsWith('http://')) {
                httpProxy = httpProxy.slice(7);
            }
            session.setExternalProxySettings(httpProxy);
        } else {
            session.externalProxySettings = null;
        }
        if (enableShuffling === '1' && !session.shuffleDict) {
            session.shuffleDict = StrShuffler.generateDictionary();
        }
        if (enableShuffling === '0') {
            session.shuffleDict = null;
        }

        res.end('Success');
    });
    proxyServer.GET('/deletesession', (req, res) => {
        if (isNotAuthorized(req, res)) return;

        const { id } = new URLPath(req.url).getParams();

        if (!id || !sessionStore.has(id)) {
            res.end('not found');
            return;
        }

        if (id === RammerheadSession.autocompleteId) {
            res.end('Failure');
            return;
        }

        sessionStore.delete(id);
        res.end('Success');
    });
    proxyServer.GET('/sessionexists', (req, res) => {
        const id = new URLPath(req.url).get('id');
        if (!id) {
            httpResponse.badRequest(logger, req, res, config.getIP(req), 'Must specify id parameter');
        } else {
            res.end(sessionStore.has(id) ? 'exists' : 'not found');
        }
    });
    proxyServer.GET('/mainport', (req, res) => {
        const serverInfo = config.getServerInfo(req);
        res.end((serverInfo.port || '').toString());
    });

    // Generate the global autocomplete session if it does not already exist.
    if (!sessionStore.has(RammerheadSession.autocompleteId)) {
        const session = new RammerheadSession({
            id: RammerheadSession.autocompleteId,
            dontConnectToData: true,
        });
        // workaround for saving the modified session to disk
        sessionStore.addSerializedSession(
            RammerheadSession.autocompleteId,
            session.serializeSession()
        );
    }
};
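The handlers above form a small query-string API: `/newsession` returns a fresh session id, `/editsession?id=…&enableShuffling=1` toggles URL shuffling, and `/deletesession?id=…` removes a session, all guarded by an optional `pwd` parameter. A hypothetical client-side helper for building those request URLs (the helper name and base URL are illustrative; the routes and parameter names come from the handlers above):

```javascript
// Hypothetical helper: builds a request URL for one of the session routes above.
function sessionUrl(base, route, params = {}) {
    const qs = new URLSearchParams(params).toString();
    return `${base}${route}${qs ? '?' + qs : ''}`;
}

// e.g. create a session, then enable URL shuffling on it:
console.log(sessionUrl('http://localhost:8080', '/newsession'));
// http://localhost:8080/newsession
console.log(sessionUrl('http://localhost:8080', '/editsession', { id: 'abc', enableShuffling: '1' }));
// http://localhost:8080/editsession?id=abc&enableShuffling=1
```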
@ -1,77 +0,0 @@
|
|||
/*
|
||||
|
||||
baseDictionary originally generated with (certain characters was removed to avoid breaking pages):
|
||||
|
||||
let str = '';
|
||||
for (let i = 32; i <= 126; i++) {
|
||||
let c = String.fromCharCode(i);
|
||||
if (c !== '/' && c !== '_' && encodeURI(c).length === 1) str += c;
|
||||
}
|
||||
|
||||
*/
|
||||
|
||||
const mod = (n, m) => ((n % m) + m) % m;
|
||||
const baseDictionary = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz~-';
|
||||
const shuffledIndicator = '_rhs';
|
||||
const generateDictionary = function () {
|
||||
let str = '';
|
||||
const split = baseDictionary.split('');
|
||||
while (split.length > 0) {
|
||||
str += split.splice(Math.floor(Math.random() * split.length), 1)[0];
|
||||
}
|
||||
return str;
|
||||
};
|
||||
class StrShuffler {
|
||||
constructor(dictionary = generateDictionary()) {
|
||||
this.dictionary = dictionary;
|
||||
}
|
||||
shuffle(str) {
|
||||
if (str.startsWith(shuffledIndicator)) {
|
||||
return str;
|
||||
}
|
||||
let shuffledStr = '';
|
||||
for (let i = 0; i < str.length; i++) {
|
||||
const char = str.charAt(i);
|
||||
const idx = baseDictionary.indexOf(char);
|
||||
if (char === '%' && str.length - i >= 3) {
|
||||
shuffledStr += char;
|
||||
shuffledStr += str.charAt(++i);
|
||||
shuffledStr += str.charAt(++i);
|
||||
} else if (idx === -1) {
|
||||
shuffledStr += char;
|
||||
} else {
|
||||
shuffledStr += this.dictionary.charAt(mod(idx + i, baseDictionary.length));
|
||||
}
|
||||
}
|
||||
return shuffledIndicator + shuffledStr;
|
||||
}
|
||||
unshuffle(str) {
|
||||
if (!str.startsWith(shuffledIndicator)) {
|
||||
return str;
|
||||
}
|
||||
|
||||
str = str.slice(shuffledIndicator.length);
|
||||
|
||||
let unshuffledStr = '';
|
||||
for (let i = 0; i < str.length; i++) {
|
||||
const char = str.charAt(i);
|
||||
const idx = this.dictionary.indexOf(char);
|
||||
if (char === '%' && str.length - i >= 3) {
|
||||
unshuffledStr += char;
|
||||
unshuffledStr += str.charAt(++i);
|
||||
unshuffledStr += str.charAt(++i);
|
||||
} else if (idx === -1) {
|
||||
unshuffledStr += char;
|
||||
} else {
|
||||
unshuffledStr += baseDictionary.charAt(mod(idx - i, baseDictionary.length));
|
||||
}
|
||||
}
|
||||
return unshuffledStr;
|
||||
}
|
||||
}
|
||||
|
||||
StrShuffler.baseDictionary = baseDictionary;
|
||||
StrShuffler.shuffledIndicator = shuffledIndicator;
|
||||
StrShuffler.generateDictionary = generateDictionary;
|
||||
|
||||
module.exports = StrShuffler;
|
||||
|
|
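Since `shuffle` offsets each character's dictionary index by its position (`idx + i`) and `unshuffle` subtracts the same offset (`idx - i`), the two are exact inverses for any dictionary, with `%xx` escapes and out-of-dictionary characters passed through. A standalone round-trip check (re-implemented here so it runs without the module):

```javascript
// Minimal re-implementation of the shuffler above; names mirror the source.
const mod = (n, m) => ((n % m) + m) % m;
const baseDictionary = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz~-';
const shuffledIndicator = '_rhs';

function generateDictionary() {
    let str = '';
    const split = baseDictionary.split('');
    while (split.length > 0) {
        str += split.splice(Math.floor(Math.random() * split.length), 1)[0];
    }
    return str;
}

// walk(str, lookupIn, lookupOut, sign) applies the positional substitution
// in either direction; %xx escapes and unknown characters are copied as-is.
function walk(str, lookupIn, lookupOut, sign) {
    let out = '';
    for (let i = 0; i < str.length; i++) {
        const char = str.charAt(i);
        const idx = lookupIn.indexOf(char);
        if (char === '%' && str.length - i >= 3) {
            out += char + str.charAt(++i) + str.charAt(++i);
        } else if (idx === -1) {
            out += char;
        } else {
            out += lookupOut.charAt(mod(idx + sign * i, baseDictionary.length));
        }
    }
    return out;
}

const shuffle = (dict, str) =>
    str.startsWith(shuffledIndicator) ? str : shuffledIndicator + walk(str, baseDictionary, dict, +1);
const unshuffle = (dict, str) =>
    str.startsWith(shuffledIndicator)
        ? walk(str.slice(shuffledIndicator.length), dict, baseDictionary, -1)
        : str;

const dict = generateDictionary();
const url = 'https://example.com/path%20x';
console.log(unshuffle(dict, shuffle(dict, url)) === url); // true
```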
@ -1,24 +0,0 @@
/**
 * for lazy people who don't want to type out `new URL('http://blah' + req.url).searchParams.get(ugh)` all the time
 */
module.exports = class URLPath extends URL {
    /**
     * @param {string} path - /site/path
     */
    constructor(path) {
        super(path, 'http://foobar');
    }
    /**
     * @param {string} param - ?param=value
     * @returns {string|null}
     */
    get(param) {
        return this.searchParams.get(param);
    }
    /**
     * @returns {{[param: string]: string}}
     */
    getParams() {
        return Object.fromEntries(this.searchParams);
    }
};
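The dummy base (`http://foobar`) is what lets a bare request path like `req.url` be parsed by the WHATWG `URL` class at all. A quick demonstration (class body copied from the module above):

```javascript
// URLPath as defined above: a URL parsed against a throwaway base so
// bare request paths can be queried directly.
class URLPath extends URL {
    constructor(path) {
        super(path, 'http://foobar');
    }
    get(param) {
        return this.searchParams.get(param);
    }
    getParams() {
        return Object.fromEntries(this.searchParams);
    }
}

const p = new URLPath('/editsession?id=abc&enableShuffling=1');
console.log(p.get('id'));   // abc
console.log(p.getParams()); // { id: 'abc', enableShuffling: '1' }
```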
@ -1,65 +0,0 @@
const LRUCache = require('lru-cache');
const LRUFiles = require('keyv-lru-files');
const crypto = require('crypto');
const fs = require('fs');

let cacheGet = async (_key) => {
    throw new TypeError('cannot cache get: must initialize cache settings first');
};
let cacheSet = async (_key, _value) => {
    throw new TypeError('cannot cache set: must initialize cache settings first');
};

module.exports = async function (diskJsCachePath, jsCacheSize) {
    const md5 = (data) => crypto.createHash('md5').update(data).digest('hex');

    if (!diskJsCachePath) {
        const jsLRUMemCache = new LRUCache({
            max: jsCacheSize,
            length: (n) => n.length
        });
        cacheGet = (key) => jsLRUMemCache.get(md5(key));
        cacheSet = (key, value) => jsLRUMemCache.set(md5(key), value);
    } else {
        if (!fs.existsSync(diskJsCachePath)) {
            throw new TypeError('disk cache folder does not exist: ' + diskJsCachePath);
        }
        if (!fs.lstatSync(diskJsCachePath).isDirectory()) {
            throw new TypeError('disk cache folder must be a directory: ' + diskJsCachePath);
        }
        const jsLRUFileCache = new LRUFiles({
            dir: diskJsCachePath,
            size: jsCacheSize
        });
        await jsLRUFileCache.open_sqlite();
        cacheGet = async (key) => (await jsLRUFileCache.get(md5(key)))?.toString('utf8');
        cacheSet = async (key, value) => await jsLRUFileCache.set(md5(key), value);
    }
};

// patch ScriptResourceProcessor
// https://github.com/DevExpress/testcafe-hammerhead/blob/7f80940225bc1c615517455dc7d30452b0365243/src/processing/resources/script.ts#L21

const scriptProcessor = require('testcafe-hammerhead/lib/processing/resources/script');
const { processScript } = require('testcafe-hammerhead/lib/processing/script');
const { updateScriptImportUrls } = require('testcafe-hammerhead/lib/utils/url');
const BUILTIN_HEADERS = require('testcafe-hammerhead/lib/request-pipeline/builtin-header-names');

scriptProcessor.__proto__.processResource = async function processResource(script, ctx, _charset, urlReplacer) {
    if (!script) return script;

    let processedScript = await cacheGet(script);

    if (!processedScript) {
        processedScript = processScript(
            script,
            true,
            false,
            urlReplacer,
            ctx.destRes.headers[BUILTIN_HEADERS.serviceWorkerAllowed]
        );
        await cacheSet(script, processedScript);
    } else processedScript = updateScriptImportUrls(processedScript, ctx.serverInfo, ctx.session.id, ctx.windowId);

    return processedScript;
};
@ -1,36 +0,0 @@
// handle the additional errors: ERR_INVALID_PROTOCOL and ETIMEDOUT
// hammerhead handled errors: ECONNRESET, EPIPE (or ECONNABORTED for windows)

const hGuard = require('testcafe-hammerhead/lib/request-pipeline/connection-reset-guard');
const isConnectionResetError = hGuard.isConnectionResetError;
hGuard.isConnectionResetError = function (err) {
    // for some reason, ECONNRESET isn't handled correctly
    if (
        isConnectionResetError(err) ||
        err.code === 'ERR_INVALID_PROTOCOL' ||
        err.code === 'ETIMEDOUT' ||
        err.code === 'ECONNRESET' ||
        err.code === 'EPIPE'
    ) {
        return true;
    }
    console.error('Unknown crash-inducing error:', err);
    // never return false, so as to avoid crashing the server
    return true;
};

process.on('uncaughtException', (err) => {
    // for some reason, the above never catches all of the errors. this is a last-resort failsafe
    if (
        err.message.includes('ECONN') ||
        err.message.includes('EPIPE') ||
        err.message.includes('ETIMEDOUT') ||
        err.message.includes('ERR_INVALID_')
    ) {
        // crash avoided!
        console.error('Avoided crash: ' + err.message);
    } else {
        // probably a TypeError or something important
        console.error('Something broke...', err);
    }
});
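The failsafe above decides by substring-matching `err.message`, since error objects thrown from deep inside the pipeline do not always carry a usable `err.code`. The same check, expressed as a standalone predicate (the predicate name is illustrative; the substrings are the ones from the handler):

```javascript
// Standalone version of the message check in the uncaughtException failsafe above.
const isNetworkNoise = (err) =>
    ['ECONN', 'EPIPE', 'ETIMEDOUT', 'ERR_INVALID_'].some((code) => err.message.includes(code));

console.log(isNetworkNoise(new Error('read ECONNRESET')));         // true
console.log(isNetworkNoise(new TypeError('x is not a function'))); // false
```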
@ -1,65 +0,0 @@
const mime = require('mime');
const fs = require('fs');
const path = require('path');

// these routes are reserved by hammerhead and rammerhead
const forbiddenRoutes = [
    '/rammerhead.js',
    '/hammerhead.js',
    '/task.js',
    '/iframe-task.js',
    '/messaging',
    '/transport-worker.js',
    '/worker-hammerhead.js'
];

const isDirectory = (dir) => fs.lstatSync(dir).isDirectory();

/**
 *
 * @param {import('testcafe-hammerhead').Proxy} proxy
 * @param {string} staticDir - all of the files and folders in the specified directory will be served
 * publicly. /index.html will automatically link to /
 * @param {string} rootPath - the path under which all of the files will be served
 */
function addStaticFilesToProxy(proxy, staticDir, rootPath = '/', shouldIgnoreFile = (_file, _dir) => false) {
    if (!isDirectory(staticDir)) {
        throw new TypeError('specified folder path is not a directory');
    }

    if (!rootPath.endsWith('/')) rootPath = rootPath + '/';
    if (!rootPath.startsWith('/')) rootPath = '/' + rootPath;

    const files = fs.readdirSync(staticDir);

    files.map((file) => {
        if (isDirectory(path.join(staticDir, file))) {
            addStaticFilesToProxy(proxy, path.join(staticDir, file), rootPath + file + '/', shouldIgnoreFile);
            return;
        }

        if (shouldIgnoreFile(file, staticDir)) {
            return;
        }

        const pathToFile = path.join(staticDir, file);
        const staticContent = {
            content: fs.readFileSync(pathToFile),
            contentType: mime.getType(file)
        };
        const route = rootPath + file;

        if (forbiddenRoutes.includes(route)) {
            throw new TypeError(
                `route clashes with hammerhead. problematic route: ${route}. problematic static file: ${pathToFile}`
            );
        }

        proxy.GET(rootPath + file, staticContent);
        if (file === 'index.html') {
            proxy.GET(rootPath, staticContent);
        }
    });
}

module.exports = addStaticFilesToProxy;
@ -1,88 +0,0 @@
const RequestPipelineContext = require('testcafe-hammerhead/lib/request-pipeline/context');
const StrShuffler = require('./StrShuffler');
const getSessionId = require('./getSessionId');

const replaceUrl = (url, replacer) => {
    // regex: https://google.com/ sessionid/ url
    return (url || '').replace(/^((?:[a-z0-9]+:\/\/[^/]+)?(?:\/[^/]+\/))([^]+)/i, function (_, g1, g2) {
        return g1 + replacer(g2);
    });
};

function patch(url) {
    // url = _rhsEPrcb://bqhQko.tHR/
    // remove slash
    return url.replace(/(^.*?:\/)\//, '$1');
}

function unpatch(url) {
    // url = _rhsEPrcb:/bqhQko.tHR/
    // restore slash
    return url.replace(/^.*?:\/(?!\/)/, '$&/');
}

// unshuffle incoming url //
const BUILTIN_HEADERS = require('testcafe-hammerhead/lib/request-pipeline/builtin-header-names');
const _dispatch = RequestPipelineContext.prototype.dispatch;
RequestPipelineContext.prototype.dispatch = function (openSessions) {
    let sessionId = getSessionId(this.req.url);
    let session = sessionId && openSessions.get(sessionId);
    if (!session) {
        sessionId = getSessionId(this.req.headers[BUILTIN_HEADERS.referer]);
        session = sessionId && openSessions.get(sessionId);
    }
    if (session && session.shuffleDict) {
        const shuffler = new StrShuffler(session.shuffleDict);
        this.req.url = replaceUrl(this.req.url, (url) => shuffler.unshuffle(unpatch(url)));
        if (getSessionId(this.req.headers[BUILTIN_HEADERS.referer]) === sessionId) {
            this.req.headers[BUILTIN_HEADERS.referer] = replaceUrl(this.req.headers[BUILTIN_HEADERS.referer], (url) =>
                shuffler.unshuffle(unpatch(url))
            );
        }
    }

    return _dispatch.call(this, openSessions);
};

// shuffle rewritten proxy urls //
let disableShuffling = false; // for later use
const _toProxyUrl = RequestPipelineContext.prototype.toProxyUrl;
RequestPipelineContext.prototype.toProxyUrl = function (...args) {
    const proxyUrl = _toProxyUrl.apply(this, args);

    if (!this.session.shuffleDict || disableShuffling) return proxyUrl;

    const shuffler = new StrShuffler(this.session.shuffleDict);
    return replaceUrl(proxyUrl, (url) => patch(shuffler.shuffle(url)));
};

// unshuffle task.js referer header
const Proxy = require('testcafe-hammerhead/lib/proxy/index');
const __onTaskScriptRequest = Proxy.prototype._onTaskScriptRequest;
Proxy.prototype._onTaskScriptRequest = async function _onTaskScriptRequest(req, ...args) {
    const referer = req.headers[BUILTIN_HEADERS.referer];

    const sessionId = getSessionId(referer);
    const session = sessionId && this.openSessions.get(sessionId);
    if (session && session.shuffleDict) {
        const shuffler = new StrShuffler(session.shuffleDict);
        req.headers[BUILTIN_HEADERS.referer] = replaceUrl(req.headers[BUILTIN_HEADERS.referer], (url) =>
            shuffler.unshuffle(unpatch(url))
        );
    }
    return __onTaskScriptRequest.call(this, req, ...args);
};

// don't shuffle action urls (because we don't get to control the rewriting when the user submits the form)
const DomProcessor = require('testcafe-hammerhead/lib/processing/dom/index');
const __processUrlAttrs = DomProcessor.prototype._processUrlAttrs;
DomProcessor.prototype._processUrlAttrs = function _processUrlAttrs(el, urlReplacer, pattern) {
    try {
        disableShuffling = pattern.urlAttr?.toLowerCase() === 'action';
        __processUrlAttrs.call(this, el, urlReplacer, pattern);
        disableShuffling = false;
    } catch (e) {
        disableShuffling = false;
        throw e;
    }
};
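`patch` and `unpatch` above are exact inverses on a shuffled URL: one collapses the `://` that shuffling can produce into `:/`, the other restores it, so the shuffled scheme-like prefix survives path handling (the sample string comes from the comments above):

```javascript
// patch/unpatch from the module above, demonstrated on the sample from its comments.
const patch = (url) => url.replace(/(^.*?:\/)\//, '$1');     // remove the second slash
const unpatch = (url) => url.replace(/^.*?:\/(?!\/)/, '$&/'); // restore it

const shuffled = '_rhsEPrcb://bqhQko.tHR/';
console.log(patch(shuffled));          // _rhsEPrcb:/bqhQko.tHR/
console.log(unpatch(patch(shuffled))); // _rhsEPrcb://bqhQko.tHR/
```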
@ -1,33 +0,0 @@
const urlUtils = require('testcafe-hammerhead/lib/utils/url');
const RequestPipelineContext = require('testcafe-hammerhead/lib/request-pipeline/context');

/**
 * if a non-crossdomain origin makes a request to a crossdomain port, the ports are flipped. this is to fix that issue.
 * there is also another issue with https://domain and https://domain:443 not matching. port 443/80 are automatically
 * removed if https and 443, and http and 80.
 * original: https://github.com/DevExpress/testcafe-hammerhead/blob/f5b0508d10614bf39a75c772dc6bd01c24f29417/src/request-pipeline/context.ts#L436
 */
RequestPipelineContext.prototype.getProxyOrigin = function getProxyOrigin(isCrossDomain = false) {
    // if we receive a request that has a proxy origin header (ctx.getProxyOrigin(!!ctx.dest.reqOrigin),
    // https://github.com/DevExpress/testcafe-hammerhead/blob/f5b0508d10614bf39a75c772dc6bd01c24f29417/src/request-pipeline/header-transforms/transforms.ts#L128),
    // then we must return the other port. however, the issue with this is we don't know whether the incoming request actually arrived on a
    // crossdomain port (a simple check for reqOrigin cannot suffice, as a request from a non-crossdomain origin to a crossdomain port and
    // vice versa can happen),
    // so this will fix the issue from non-crossdomain port to crossdomain port but will NOT fix crossdomain port to non-crossdomain port.
    // However, the latter case will never happen because hammerhead made all client rewriting of cross-domain requests always use the
    // cross-domain ports, even if the origin is from a cross-domain port.
    const port = isCrossDomain ? this.serverInfo.port : this.serverInfo.crossDomainPort;

    // don't add a port if port is 443 and protocol is https:, and don't add a port if port is 80 and protocol is http:.
    // note that this isn't supported by the client rewriting, so client hammerhead's port.toString() will throw an error
    const hostPort =
        (this.serverInfo.protocol == 'https:' && port == 443) || (this.serverInfo.protocol == 'http:' && port == 80)
            ? null
            : port;

    return urlUtils.getDomain({
        protocol: this.serverInfo.protocol,
        // use host instead of hostname so we can manually add in the port
        host: this.serverInfo.hostname + (hostPort ? ':' + hostPort : '')
    });
};
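The default-port rule used above (omit `:443` for `https:` and `:80` for `http:`) can be isolated as a small function for illustration (the function name and signature are assumptions, not part of the module):

```javascript
// Illustrative: the host-string construction from getProxyOrigin above,
// dropping the port when it is the default for the scheme.
function hostWithPort(protocol, hostname, port) {
    const isDefault =
        (protocol === 'https:' && port === 443) || (protocol === 'http:' && port === 80);
    return hostname + (isDefault ? '' : ':' + port);
}

console.log(hostWithPort('https:', 'example.com', 443));  // example.com
console.log(hostWithPort('http:', 'example.com', 8080));  // example.com:8080
```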
@ -1,21 +0,0 @@
// fixes unpipe error and crashes resulting from http requests to websocket proxy endpoint

const stages = require('testcafe-hammerhead/lib/request-pipeline/stages');
const { Duplex } = require('stream');

stages.unshift(function fixWebsocket(ctx) {
    ctx.isWebSocket = ctx.res instanceof Duplex;
});

// fixes EPIPE error when trying to write head to a closed socket
const hammerheadWS = require('testcafe-hammerhead/lib/request-pipeline/websocket');
const respondOnWebSocket = hammerheadWS.respondOnWebSocket;
hammerheadWS.respondOnWebSocket = function (ctx) {
    ctx.res.on('error', (err) => {
        if (err.code !== 'EPIPE') {
            console.error('Unknown crash-inducing error:', err);
        }
        // cleanup will automatically be handled by the 'end' listener
    });
    respondOnWebSocket(ctx);
};
@ -1,3 +0,0 @@
const uuid = require('uuid').v4;

module.exports = () => uuid().replace(/-/g, '');
@ -1 +0,0 @@
module.exports = (reqPath) => ((reqPath || '').match(/^(?:[a-z0-9]+:\/\/[^/]+)?\/([a-z0-9]{32})/i) || [])[1];
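The one-liner above extracts the 32-character session id that gets prefixed onto every proxied path (a UUID with dashes stripped, per `generateId` above), tolerating an optional scheme-and-host prefix and returning `undefined` when nothing matches:

```javascript
// getSessionId as defined above, exercised on a proxied path.
const getSessionId = (reqPath) =>
    ((reqPath || '').match(/^(?:[a-z0-9]+:\/\/[^/]+)?\/([a-z0-9]{32})/i) || [])[1];

console.log(getSessionId('/abcdef0123456789abcdef0123456789/https://example.com/'));
// abcdef0123456789abcdef0123456789
console.log(getSessionId('/too-short')); // undefined
```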
@ -1,19 +0,0 @@
/**
 * @typedef {'badRequest'|'accessForbidden'} httpResponseTypes
 */

/**
 * @type {{[key in httpResponseTypes]: (logger: import('../classes/RammerheadLogging'), req: import('http').IncomingMessage, res: import('http').ServerResponse, ip: string, msg: string) => void}}
 */
module.exports = {
    badRequest: (logger, req, res, ip, msg) => {
        logger.error(`(httpResponse.badRequest) ${ip} ${req.url} ${msg}`);
        res.writeHead(400);
        res.end(msg);
    },
    accessForbidden: (logger, req, res, ip, msg) => {
        logger.error(`(httpResponse.accessForbidden) ${ip} ${req.url} ${msg}`);
        res.writeHead(403);
        res.end(msg);
    }
};
@ -1,64 +0,0 @@
// https://github.com/DevExpress/testcafe-hammerhead/blob/7f80940225bc1c615517455dc7d30452b0365243/src/processing/resources/index.ts

const url = require('url');
const pageProcessor = require('testcafe-hammerhead/lib/processing/resources/page');
const manifestProcessor = require('testcafe-hammerhead/lib/processing/resources/manifest');
const scriptProcessor = require('testcafe-hammerhead/lib/processing/resources/script');
const stylesheetProcessor = require('testcafe-hammerhead/lib/processing/resources/stylesheet');
const urlUtil = require('testcafe-hammerhead/lib/utils/url');
const { encodeContent, decodeContent } = require('testcafe-hammerhead/lib/processing/encoding');
const { platform } = require('os');

const IS_WIN = platform() === 'win32';
const DISK_RE = /^[A-Za-z]:/;
const RESOURCE_PROCESSORS = [pageProcessor, manifestProcessor, scriptProcessor, stylesheetProcessor];

function getResourceUrlReplacer(ctx) {
    return function (resourceUrl, resourceType, charsetAttrValue, baseUrl, isCrossDomain) {
        if (!urlUtil.isSupportedProtocol(resourceUrl) && !urlUtil.isSpecialPage(resourceUrl)) return resourceUrl;

        if (IS_WIN && ctx.dest.protocol === 'file:' && DISK_RE.test(resourceUrl)) resourceUrl = '/' + resourceUrl;

        // NOTE: Resolves base URLs without a protocol ('//google.com/path' for example).
        baseUrl = baseUrl ? url.resolve(ctx.dest.url, baseUrl) : '';
        resourceUrl = urlUtil.processSpecialChars(resourceUrl);

        let resolvedUrl = url.resolve(baseUrl || ctx.dest.url, resourceUrl);

        if (!urlUtil.isValidUrl(resolvedUrl)) return resourceUrl;

        // NOTE: Script or <link rel='preload' as='script'>
        const isScriptLike = urlUtil.parseResourceType(resourceType).isScript;
        const charsetStr = charsetAttrValue || (isScriptLike && ctx.contentInfo.charset.get());

        resolvedUrl = urlUtil.ensureTrailingSlash(resourceUrl, resolvedUrl);

        if (!urlUtil.isValidUrl(resolvedUrl)) return resolvedUrl;

        return ctx.toProxyUrl(resolvedUrl, isCrossDomain, resourceType, charsetStr);
    };
}

require('testcafe-hammerhead/lib/processing/resources/index').process = async function process(ctx) {
    const { destResBody, contentInfo } = ctx;
    const { encoding, charset } = contentInfo;

    for (const processor of RESOURCE_PROCESSORS) {
        if (!processor.shouldProcessResource(ctx)) continue;

        const urlReplacer = getResourceUrlReplacer(ctx);

        if (pageProcessor === processor) await ctx.prepareInjectableUserScripts();

        const decoded = await decodeContent(destResBody, encoding, charset);

        // @ts-ignore: Cannot invoke an expression whose type lacks a call signature
        const processed = await processor.processResource(decoded, ctx, charset, urlReplacer); // <-- add async support

        if (processed === pageProcessor.RESTART_PROCESSING) return await process(ctx);

        return await encodeContent(processed, encoding, charset);
    }

    return destResBody;
};
@ -1,8 +0,0 @@
module.exports = function streamToString(stream) {
    const chunks = [];
    return new Promise((resolve, reject) => {
        stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
        stream.on('error', (err) => reject(err));
        stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
    });
};
16 node_modules/.bin/mime generated vendored Normal file
@ -0,0 +1,16 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*)
        if command -v cygpath > /dev/null 2>&1; then
            basedir=`cygpath -w "$basedir"`
        fi
    ;;
esac

if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../mime/cli.js" "$@"
else
  exec node "$basedir/../mime/cli.js" "$@"
fi
17 node_modules/.bin/mime.cmd generated vendored Normal file
@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\mime\cli.js" %*
28 node_modules/.bin/mime.ps1 generated vendored Normal file
@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" "$basedir/../mime/cli.js" $args
  } else {
    & "$basedir/node$exe" "$basedir/../mime/cli.js" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" "$basedir/../mime/cli.js" $args
  } else {
    & "node$exe" "$basedir/../mime/cli.js" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
16 node_modules/.bin/vows generated vendored Normal file
@ -0,0 +1,16 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*)
        if command -v cygpath > /dev/null 2>&1; then
            basedir=`cygpath -w "$basedir"`
        fi
    ;;
esac

if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../vows/bin/vows" "$@"
else
  exec node "$basedir/../vows/bin/vows" "$@"
fi
17 node_modules/.bin/vows.cmd generated vendored Normal file
@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\vows\bin\vows" %*
28 node_modules/.bin/vows.ps1 generated vendored Normal file
@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" "$basedir/../vows/bin/vows" $args
  } else {
    & "$basedir/node$exe" "$basedir/../vows/bin/vows" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" "$basedir/../vows/bin/vows" $args
  } else {
    & "node$exe" "$basedir/../vows/bin/vows" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
1045 node_modules/.package-lock.json generated vendored Normal file
File diff suppressed because it is too large. Load diff
243 node_modules/accepts/HISTORY.md generated vendored Normal file
@ -0,0 +1,243 @@
1.3.8 / 2022-02-02
==================

  * deps: mime-types@~2.1.34
    - deps: mime-db@~1.51.0
  * deps: negotiator@0.6.3

1.3.7 / 2019-04-29
==================

  * deps: negotiator@0.6.2
    - Fix sorting charset, encoding, and language with extra parameters

1.3.6 / 2019-04-28
==================

  * deps: mime-types@~2.1.24
    - deps: mime-db@~1.40.0

1.3.5 / 2018-02-28
==================

  * deps: mime-types@~2.1.18
    - deps: mime-db@~1.33.0

1.3.4 / 2017-08-22
==================

  * deps: mime-types@~2.1.16
    - deps: mime-db@~1.29.0

1.3.3 / 2016-05-02
==================

  * deps: mime-types@~2.1.11
    - deps: mime-db@~1.23.0
  * deps: negotiator@0.6.1
    - perf: improve `Accept` parsing speed
    - perf: improve `Accept-Charset` parsing speed
    - perf: improve `Accept-Encoding` parsing speed
    - perf: improve `Accept-Language` parsing speed

1.3.2 / 2016-03-08
==================

  * deps: mime-types@~2.1.10
    - Fix extension of `application/dash+xml`
    - Update primary extension for `audio/mp4`
    - deps: mime-db@~1.22.0

1.3.1 / 2016-01-19
==================

  * deps: mime-types@~2.1.9
    - deps: mime-db@~1.21.0

1.3.0 / 2015-09-29
==================

  * deps: mime-types@~2.1.7
    - deps: mime-db@~1.19.0
  * deps: negotiator@0.6.0
    - Fix including type extensions in parameters in `Accept` parsing
    - Fix parsing `Accept` parameters with quoted equals
    - Fix parsing `Accept` parameters with quoted semicolons
    - Lazy-load modules from main entry point
    - perf: delay type concatenation until needed
    - perf: enable strict mode
    - perf: hoist regular expressions
    - perf: remove closures getting spec properties
    - perf: remove a closure from media type parsing
    - perf: remove property delete from media type parsing

1.2.13 / 2015-09-06
===================

  * deps: mime-types@~2.1.6
    - deps: mime-db@~1.18.0

1.2.12 / 2015-07-30
===================

  * deps: mime-types@~2.1.4
    - deps: mime-db@~1.16.0

1.2.11 / 2015-07-16
===================

  * deps: mime-types@~2.1.3
    - deps: mime-db@~1.15.0

1.2.10 / 2015-07-01
===================

  * deps: mime-types@~2.1.2
    - deps: mime-db@~1.14.0

1.2.9 / 2015-06-08
==================

  * deps: mime-types@~2.1.1
    - perf: fix deopt during mapping

1.2.8 / 2015-06-07
==================

  * deps: mime-types@~2.1.0
    - deps: mime-db@~1.13.0
  * perf: avoid argument reassignment & argument slice
  * perf: avoid negotiator recursive construction
  * perf: enable strict mode
  * perf: remove unnecessary bitwise operator

1.2.7 / 2015-05-10
==================

  * deps: negotiator@0.5.3
    - Fix media type parameter matching to be case-insensitive

1.2.6 / 2015-05-07
==================

  * deps: mime-types@~2.0.11
    - deps: mime-db@~1.9.1
  * deps: negotiator@0.5.2
    - Fix comparing media types with quoted values
    - Fix splitting media types with quoted commas

1.2.5 / 2015-03-13
==================

  * deps: mime-types@~2.0.10
    - deps: mime-db@~1.8.0

1.2.4 / 2015-02-14
==================

  * Support Node.js 0.6
  * deps: mime-types@~2.0.9
    - deps: mime-db@~1.7.0
  * deps: negotiator@0.5.1
    - Fix preference sorting to be stable for long acceptable lists

1.2.3 / 2015-01-31
==================

  * deps: mime-types@~2.0.8
    - deps: mime-db@~1.6.0

1.2.2 / 2014-12-30
==================

  * deps: mime-types@~2.0.7
    - deps: mime-db@~1.5.0

1.2.1 / 2014-12-30
==================

  * deps: mime-types@~2.0.5
    - deps: mime-db@~1.3.1

1.2.0 / 2014-12-19
==================

  * deps: negotiator@0.5.0
    - Fix list return order when large accepted list
    - Fix missing identity encoding when q=0 exists
    - Remove dynamic building of Negotiator class

1.1.4 / 2014-12-10
==================

  * deps: mime-types@~2.0.4
    - deps: mime-db@~1.3.0

1.1.3 / 2014-11-09
==================

  * deps: mime-types@~2.0.3
    - deps: mime-db@~1.2.0

1.1.2 / 2014-10-14
==================

  * deps: negotiator@0.4.9
    - Fix error when media type has invalid parameter

1.1.1 / 2014-09-28
==================

  * deps: mime-types@~2.0.2
    - deps: mime-db@~1.1.0
  * deps: negotiator@0.4.8
    - Fix all negotiations to be case-insensitive
    - Stable sort preferences of same quality according to client order

1.1.0 / 2014-09-02
==================

  * update `mime-types`

1.0.7 / 2014-07-04
==================

  * Fix wrong type returned from `type` when match after unknown extension

1.0.6 / 2014-06-24
==================
|
||||
|
||||
* deps: negotiator@0.4.7
|
||||
|
||||
1.0.5 / 2014-06-20
|
||||
==================
|
||||
|
||||
* fix crash when unknown extension given
|
||||
|
||||
1.0.4 / 2014-06-19
|
||||
==================
|
||||
|
||||
* use `mime-types`
|
||||
|
||||
1.0.3 / 2014-06-11
|
||||
==================
|
||||
|
||||
* deps: negotiator@0.4.6
|
||||
- Order by specificity when quality is the same
|
||||
|
||||
1.0.2 / 2014-05-29
|
||||
==================
|
||||
|
||||
* Fix interpretation when header not in request
|
||||
* deps: pin negotiator@0.4.5
|
||||
|
||||
1.0.1 / 2014-01-18
|
||||
==================
|
||||
|
||||
* Identity encoding isn't always acceptable
|
||||
* deps: negotiator@~0.4.0
|
||||
|
||||
1.0.0 / 2013-12-27
|
||||
==================
|
||||
|
||||
* Genesis
|
||||
23 node_modules/accepts/LICENSE generated vendored Normal file

@ -0,0 +1,23 @@

(The MIT License)

Copyright (c) 2014 Jonathan Ong <me@jongleberry.com>
Copyright (c) 2015 Douglas Christopher Wilson <doug@somethingdoug.com>

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
140 node_modules/accepts/README.md generated vendored Normal file

@ -0,0 +1,140 @@

# accepts

[![NPM Version][npm-version-image]][npm-url]
[![NPM Downloads][npm-downloads-image]][npm-url]
[![Node.js Version][node-version-image]][node-version-url]
[![Build Status][github-actions-ci-image]][github-actions-ci-url]
[![Test Coverage][coveralls-image]][coveralls-url]

Higher level content negotiation based on [negotiator](https://www.npmjs.com/package/negotiator).
Extracted from [koa](https://www.npmjs.com/package/koa) for general use.

In addition to negotiator, it allows:

- Allows types as an array or arguments list, ie `(['text/html', 'application/json'])`
  as well as `('text/html', 'application/json')`.
- Allows type shorthands such as `json`.
- Returns `false` when no types match
- Treats non-existent headers as `*`

## Installation

This is a [Node.js](https://nodejs.org/en/) module available through the
[npm registry](https://www.npmjs.com/). Installation is done using the
[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally):

```sh
$ npm install accepts
```

## API

```js
var accepts = require('accepts')
```

### accepts(req)

Create a new `Accepts` object for the given `req`.

#### .charset(charsets)

Return the first accepted charset. If nothing in `charsets` is accepted,
then `false` is returned.

#### .charsets()

Return the charsets that the request accepts, in the order of the client's
preference (most preferred first).

#### .encoding(encodings)

Return the first accepted encoding. If nothing in `encodings` is accepted,
then `false` is returned.

#### .encodings()

Return the encodings that the request accepts, in the order of the client's
preference (most preferred first).

#### .language(languages)

Return the first accepted language. If nothing in `languages` is accepted,
then `false` is returned.

#### .languages()

Return the languages that the request accepts, in the order of the client's
preference (most preferred first).

#### .type(types)

Return the first accepted type (and it is returned as the same text as what
appears in the `types` array). If nothing in `types` is accepted, then `false`
is returned.

The `types` array can contain full MIME types or file extensions. Any value
that is not a full MIME type is passed to `require('mime-types').lookup`.

#### .types()

Return the types that the request accepts, in the order of the client's
preference (most preferred first).

## Examples

### Simple type negotiation

This simple example shows how to use `accepts` to return a differently typed
response body based on what the client wants to accept. The server lists its
preferences in order and will get back the best match between the client and
server.

```js
var accepts = require('accepts')
var http = require('http')

function app (req, res) {
  var accept = accepts(req)

  // the order of this list is significant; should be server preferred order
  switch (accept.type(['json', 'html'])) {
    case 'json':
      res.setHeader('Content-Type', 'application/json')
      res.write('{"hello":"world!"}')
      break
    case 'html':
      res.setHeader('Content-Type', 'text/html')
      res.write('<b>hello, world!</b>')
      break
    default:
      // the fallback is text/plain, so no need to specify it above
      res.setHeader('Content-Type', 'text/plain')
      res.write('hello, world!')
      break
  }

  res.end()
}

http.createServer(app).listen(3000)
```

You can test this out with the cURL program:
```sh
curl -I -H'Accept: text/html' http://localhost:3000/
```

## License

[MIT](LICENSE)

[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/accepts/master
[coveralls-url]: https://coveralls.io/r/jshttp/accepts?branch=master
[github-actions-ci-image]: https://badgen.net/github/checks/jshttp/accepts/master?label=ci
[github-actions-ci-url]: https://github.com/jshttp/accepts/actions/workflows/ci.yml
[node-version-image]: https://badgen.net/npm/node/accepts
[node-version-url]: https://nodejs.org/en/download
[npm-downloads-image]: https://badgen.net/npm/dm/accepts
[npm-url]: https://npmjs.org/package/accepts
[npm-version-image]: https://badgen.net/npm/v/accepts
238 node_modules/accepts/index.js generated vendored Normal file

@ -0,0 +1,238 @@

/*!
 * accepts
 * Copyright(c) 2014 Jonathan Ong
 * Copyright(c) 2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 * @private
 */

var Negotiator = require('negotiator')
var mime = require('mime-types')

/**
 * Module exports.
 * @public
 */

module.exports = Accepts

/**
 * Create a new Accepts object for the given req.
 *
 * @param {object} req
 * @public
 */

function Accepts (req) {
  if (!(this instanceof Accepts)) {
    return new Accepts(req)
  }

  this.headers = req.headers
  this.negotiator = new Negotiator(req)
}

/**
 * Check if the given `type(s)` is acceptable, returning
 * the best match when true, otherwise `undefined`, in which
 * case you should respond with 406 "Not Acceptable".
 *
 * The `type` value may be a single mime type string
 * such as "application/json", the extension name
 * such as "json" or an array `["json", "html", "text/plain"]`. When a list
 * or array is given, the _best_ match, if any, is returned.
 *
 * Examples:
 *
 *     // Accept: text/html
 *     this.types('html');
 *     // => "html"
 *
 *     // Accept: text/*, application/json
 *     this.types('html');
 *     // => "html"
 *     this.types('text/html');
 *     // => "text/html"
 *     this.types('json', 'text');
 *     // => "json"
 *     this.types('application/json');
 *     // => "application/json"
 *
 *     // Accept: text/*, application/json
 *     this.types('image/png');
 *     this.types('png');
 *     // => undefined
 *
 *     // Accept: text/*;q=.5, application/json
 *     this.types(['html', 'json']);
 *     this.types('html', 'json');
 *     // => "json"
 *
 * @param {String|Array} types...
 * @return {String|Array|Boolean}
 * @public
 */

Accepts.prototype.type =
Accepts.prototype.types = function (types_) {
  var types = types_

  // support flattened arguments
  if (types && !Array.isArray(types)) {
    types = new Array(arguments.length)
    for (var i = 0; i < types.length; i++) {
      types[i] = arguments[i]
    }
  }

  // no types, return all requested types
  if (!types || types.length === 0) {
    return this.negotiator.mediaTypes()
  }

  // no accept header, return first given type
  if (!this.headers.accept) {
    return types[0]
  }

  var mimes = types.map(extToMime)
  var accepts = this.negotiator.mediaTypes(mimes.filter(validMime))
  var first = accepts[0]

  return first
    ? types[mimes.indexOf(first)]
    : false
}

/**
 * Return accepted encodings or best fit based on `encodings`.
 *
 * Given `Accept-Encoding: gzip, deflate`
 * an array sorted by quality is returned:
 *
 *     ['gzip', 'deflate']
 *
 * @param {String|Array} encodings...
 * @return {String|Array}
 * @public
 */

Accepts.prototype.encoding =
Accepts.prototype.encodings = function (encodings_) {
  var encodings = encodings_

  // support flattened arguments
  if (encodings && !Array.isArray(encodings)) {
    encodings = new Array(arguments.length)
    for (var i = 0; i < encodings.length; i++) {
      encodings[i] = arguments[i]
    }
  }

  // no encodings, return all requested encodings
  if (!encodings || encodings.length === 0) {
    return this.negotiator.encodings()
  }

  return this.negotiator.encodings(encodings)[0] || false
}

/**
 * Return accepted charsets or best fit based on `charsets`.
 *
 * Given `Accept-Charset: utf-8, iso-8859-1;q=0.2, utf-7;q=0.5`
 * an array sorted by quality is returned:
 *
 *     ['utf-8', 'utf-7', 'iso-8859-1']
 *
 * @param {String|Array} charsets...
 * @return {String|Array}
 * @public
 */

Accepts.prototype.charset =
Accepts.prototype.charsets = function (charsets_) {
  var charsets = charsets_

  // support flattened arguments
  if (charsets && !Array.isArray(charsets)) {
    charsets = new Array(arguments.length)
    for (var i = 0; i < charsets.length; i++) {
      charsets[i] = arguments[i]
    }
  }

  // no charsets, return all requested charsets
  if (!charsets || charsets.length === 0) {
    return this.negotiator.charsets()
  }

  return this.negotiator.charsets(charsets)[0] || false
}

/**
 * Return accepted languages or best fit based on `langs`.
 *
 * Given `Accept-Language: en;q=0.8, es, pt`
 * an array sorted by quality is returned:
 *
 *     ['es', 'pt', 'en']
 *
 * @param {String|Array} langs...
 * @return {Array|String}
 * @public
 */

Accepts.prototype.lang =
Accepts.prototype.langs =
Accepts.prototype.language =
Accepts.prototype.languages = function (languages_) {
  var languages = languages_

  // support flattened arguments
  if (languages && !Array.isArray(languages)) {
    languages = new Array(arguments.length)
    for (var i = 0; i < languages.length; i++) {
      languages[i] = arguments[i]
    }
  }

  // no languages, return all requested languages
  if (!languages || languages.length === 0) {
    return this.negotiator.languages()
  }

  return this.negotiator.languages(languages)[0] || false
}

/**
 * Convert extnames to mime.
 *
 * @param {String} type
 * @return {String}
 * @private
 */

function extToMime (type) {
  return type.indexOf('/') === -1
    ? mime.lookup(type)
    : type
}

/**
 * Check if mime is valid.
 *
 * @param {String} type
 * @return {String}
 * @private
 */

function validMime (type) {
  return typeof type === 'string'
}
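All four negotiation methods above begin with the same "support flattened arguments" prologue, which accepts either a real array or a plain argument list. A standalone copy of just that pattern (illustrative only; the name `toArray` is not part of the accepts module) shows what it normalizes:

```javascript
// Standalone sketch of the "support flattened arguments" pattern used by
// types(), encodings(), charsets() and languages() above.
// `toArray` is a hypothetical name for illustration, not part of accepts.
function toArray (first) {
  var values = first

  // callers may pass one array, or spread values as separate arguments
  if (values && !Array.isArray(values)) {
    values = new Array(arguments.length)
    for (var i = 0; i < values.length; i++) {
      values[i] = arguments[i]
    }
  }

  return values || []
}

console.log(toArray('json', 'html'))   // → [ 'json', 'html' ]
console.log(toArray(['json', 'html'])) // → [ 'json', 'html' ]
console.log(toArray())                 // → []
```

This is why `accept.type('json', 'html')` and `accept.type(['json', 'html'])` behave identically.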
47 node_modules/accepts/package.json generated vendored Normal file

@ -0,0 +1,47 @@

{
  "name": "accepts",
  "description": "Higher-level content negotiation",
  "version": "1.3.8",
  "contributors": [
    "Douglas Christopher Wilson <doug@somethingdoug.com>",
    "Jonathan Ong <me@jongleberry.com> (http://jongleberry.com)"
  ],
  "license": "MIT",
  "repository": "jshttp/accepts",
  "dependencies": {
    "mime-types": "~2.1.34",
    "negotiator": "0.6.3"
  },
  "devDependencies": {
    "deep-equal": "1.0.1",
    "eslint": "7.32.0",
    "eslint-config-standard": "14.1.1",
    "eslint-plugin-import": "2.25.4",
    "eslint-plugin-markdown": "2.2.1",
    "eslint-plugin-node": "11.1.0",
    "eslint-plugin-promise": "4.3.1",
    "eslint-plugin-standard": "4.1.0",
    "mocha": "9.2.0",
    "nyc": "15.1.0"
  },
  "files": [
    "LICENSE",
    "HISTORY.md",
    "index.js"
  ],
  "engines": {
    "node": ">= 0.6"
  },
  "scripts": {
    "lint": "eslint .",
    "test": "mocha --reporter spec --check-leaks --bail test/",
    "test-ci": "nyc --reporter=lcov --reporter=text npm test",
    "test-cov": "nyc --reporter=html --reporter=text npm test"
  },
  "keywords": [
    "content",
    "negotiation",
    "accept",
    "accepts"
  ]
}
21 node_modules/array-flatten/LICENSE generated vendored Normal file

@ -0,0 +1,21 @@

The MIT License (MIT)

Copyright (c) 2014 Blake Embrey (hello@blakeembrey.com)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
43 node_modules/array-flatten/README.md generated vendored Normal file

@ -0,0 +1,43 @@

# Array Flatten

[![NPM version][npm-image]][npm-url]
[![NPM downloads][downloads-image]][downloads-url]
[![Build status][travis-image]][travis-url]
[![Test coverage][coveralls-image]][coveralls-url]

> Flatten an array of nested arrays into a single flat array. Accepts an optional depth.

## Installation

```
npm install array-flatten --save
```

## Usage

```javascript
var flatten = require('array-flatten')

flatten([1, [2, [3, [4, [5], 6], 7], 8], 9])
//=> [1, 2, 3, 4, 5, 6, 7, 8, 9]

flatten([1, [2, [3, [4, [5], 6], 7], 8], 9], 2)
//=> [1, 2, 3, [4, [5], 6], 7, 8, 9]

(function () {
  flatten(arguments) //=> [1, 2, 3]
})(1, [2, 3])
```

## License

MIT

[npm-image]: https://img.shields.io/npm/v/array-flatten.svg?style=flat
[npm-url]: https://npmjs.org/package/array-flatten
[downloads-image]: https://img.shields.io/npm/dm/array-flatten.svg?style=flat
[downloads-url]: https://npmjs.org/package/array-flatten
[travis-image]: https://img.shields.io/travis/blakeembrey/array-flatten.svg?style=flat
[travis-url]: https://travis-ci.org/blakeembrey/array-flatten
[coveralls-image]: https://img.shields.io/coveralls/blakeembrey/array-flatten.svg?style=flat
[coveralls-url]: https://coveralls.io/r/blakeembrey/array-flatten?branch=master
64 node_modules/array-flatten/array-flatten.js generated vendored Normal file

@ -0,0 +1,64 @@

'use strict'

/**
 * Expose `arrayFlatten`.
 */
module.exports = arrayFlatten

/**
 * Recursive flatten function with depth.
 *
 * @param {Array} array
 * @param {Array} result
 * @param {Number} depth
 * @return {Array}
 */
function flattenWithDepth (array, result, depth) {
  for (var i = 0; i < array.length; i++) {
    var value = array[i]

    if (depth > 0 && Array.isArray(value)) {
      flattenWithDepth(value, result, depth - 1)
    } else {
      result.push(value)
    }
  }

  return result
}

/**
 * Recursive flatten function. Omitting depth is slightly faster.
 *
 * @param {Array} array
 * @param {Array} result
 * @return {Array}
 */
function flattenForever (array, result) {
  for (var i = 0; i < array.length; i++) {
    var value = array[i]

    if (Array.isArray(value)) {
      flattenForever(value, result)
    } else {
      result.push(value)
    }
  }

  return result
}

/**
 * Flatten an array, with the ability to define a depth.
 *
 * @param {Array} array
 * @param {Number} depth
 * @return {Array}
 */
function arrayFlatten (array, depth) {
  if (depth == null) {
    return flattenForever(array, [])
  }

  return flattenWithDepth(array, [], depth)
}
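On modern Node.js (v11+), the built-in `Array.prototype.flat` covers the same two cases: `flat(Infinity)` mirrors the depth-less `flattenForever`, and `flat(n)` mirrors `flattenWithDepth` with depth `n`. Shown for comparison only; this module predates `flat()` and targets much older runtimes.

```javascript
// Comparison sketch: the native Array.prototype.flat (Node.js v11+)
// reproduces the README's two flatten() examples.
var nested = [1, [2, [3, [4, [5], 6], 7], 8], 9]

// full flatten, like flattenForever / flatten(array)
console.log(nested.flat(Infinity))
// → [1, 2, 3, 4, 5, 6, 7, 8, 9]

// depth-limited flatten, like flattenWithDepth / flatten(array, 2)
console.log(nested.flat(2))
// → [1, 2, 3, [4, [5], 6], 7, 8, 9]
```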
39 node_modules/array-flatten/package.json generated vendored Normal file

@ -0,0 +1,39 @@

{
  "name": "array-flatten",
  "version": "1.1.1",
  "description": "Flatten an array of nested arrays into a single flat array",
  "main": "array-flatten.js",
  "files": [
    "array-flatten.js",
    "LICENSE"
  ],
  "scripts": {
    "test": "istanbul cover _mocha -- -R spec"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/blakeembrey/array-flatten.git"
  },
  "keywords": [
    "array",
    "flatten",
    "arguments",
    "depth"
  ],
  "author": {
    "name": "Blake Embrey",
    "email": "hello@blakeembrey.com",
    "url": "http://blakeembrey.me"
  },
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/blakeembrey/array-flatten/issues"
  },
  "homepage": "https://github.com/blakeembrey/array-flatten",
  "devDependencies": {
    "istanbul": "^0.3.13",
    "mocha": "^2.2.4",
    "pre-commit": "^1.0.7",
    "standard": "^3.7.3"
  }
}
2 node_modules/balanced-match/.github/FUNDING.yml generated vendored Normal file

@ -0,0 +1,2 @@

tidelift: "npm/balanced-match"
patreon: juliangruber
21 node_modules/balanced-match/LICENSE.md generated vendored Normal file

@ -0,0 +1,21 @@

(MIT)

Copyright (c) 2013 Julian Gruber <julian@juliangruber.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
97 node_modules/balanced-match/README.md generated vendored Normal file

@ -0,0 +1,97 @@

# balanced-match

Match balanced string pairs, like `{` and `}` or `<b>` and `</b>`. Supports regular expressions as well!

[](http://travis-ci.org/juliangruber/balanced-match)
[](https://www.npmjs.org/package/balanced-match)

[](https://ci.testling.com/juliangruber/balanced-match)

## Example

Get the first matching pair of braces:

```js
var balanced = require('balanced-match');

console.log(balanced('{', '}', 'pre{in{nested}}post'));
console.log(balanced('{', '}', 'pre{first}between{second}post'));
console.log(balanced(/\s+\{\s+/, /\s+\}\s+/, 'pre { in{nest} } post'));
```

The matches are:

```bash
$ node example.js
{ start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' }
{ start: 3,
  end: 9,
  pre: 'pre',
  body: 'first',
  post: 'between{second}post' }
{ start: 3, end: 17, pre: 'pre', body: 'in{nest}', post: 'post' }
```

## API

### var m = balanced(a, b, str)

For the first non-nested matching pair of `a` and `b` in `str`, return an
object with those keys:

* **start** the index of the first match of `a`
* **end** the index of the matching `b`
* **pre** the preamble, `a` and `b` not included
* **body** the match, `a` and `b` not included
* **post** the postscript, `a` and `b` not included

If there's no match, `undefined` will be returned.

If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `['{', 'a', '']` and `{a}}` will match `['', 'a', '}']`.

### var r = balanced.range(a, b, str)

For the first non-nested matching pair of `a` and `b` in `str`, return an
array with indexes: `[ <a index>, <b index> ]`.

If there's no match, `undefined` will be returned.

If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `[ 1, 3 ]` and `{a}}` will match `[0, 2]`.

## Installation

With [npm](https://npmjs.org) do:

```bash
npm install balanced-match
```

## Security contact information

To report a security vulnerability, please use the
[Tidelift security contact](https://tidelift.com/security).
Tidelift will coordinate the fix and disclosure.

## License

(MIT)

Copyright (c) 2013 Julian Gruber <julian@juliangruber.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
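The nested-brace example above can be reproduced with a minimal depth-counter sketch for single-character delimiters. This is purely illustrative (the function name `firstBalanced` is hypothetical, and it is not the module's own algorithm): it handles well-balanced input only, whereas the real module additionally resolves unmatched pairs as documented above.

```javascript
// Minimal sketch (NOT balanced-match's implementation): find the first
// balanced pair of single-character delimiters with a depth counter.
// Handles fully balanced input only; unmatched-pair cases like '{{a}'
// return undefined here, unlike the real module.
function firstBalanced (open, close, str) {
  var depth = 0
  var start = -1
  for (var i = 0; i < str.length; i++) {
    if (str[i] === open) {
      if (depth === 0) start = i
      depth++
    } else if (str[i] === close && depth > 0) {
      depth--
      if (depth === 0) {
        return {
          start: start,
          end: i,
          pre: str.slice(0, start),
          body: str.slice(start + 1, i),
          post: str.slice(i + 1)
        }
      }
    }
  }
  return undefined
}

console.log(firstBalanced('{', '}', 'pre{in{nested}}post'))
// → { start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' }
```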
62 node_modules/balanced-match/index.js generated vendored Normal file

@ -0,0 +1,62 @@

'use strict';
module.exports = balanced;
function balanced(a, b, str) {
  if (a instanceof RegExp) a = maybeMatch(a, str);
  if (b instanceof RegExp) b = maybeMatch(b, str);

  var r = range(a, b, str);

  return r && {
    start: r[0],
    end: r[1],
    pre: str.slice(0, r[0]),
    body: str.slice(r[0] + a.length, r[1]),
    post: str.slice(r[1] + b.length)
  };
}

function maybeMatch(reg, str) {
  var m = str.match(reg);
  return m ? m[0] : null;
}

balanced.range = range;
function range(a, b, str) {
  var begs, beg, left, right, result;
  var ai = str.indexOf(a);
  var bi = str.indexOf(b, ai + 1);
  var i = ai;

  if (ai >= 0 && bi > 0) {
    if (a === b) {
      return [ai, bi];
    }
    begs = [];
    left = str.length;

    while (i >= 0 && !result) {
      if (i == ai) {
        begs.push(i);
        ai = str.indexOf(a, i + 1);
      } else if (begs.length == 1) {
        result = [ begs.pop(), bi ];
      } else {
        beg = begs.pop();
        if (beg < left) {
          left = beg;
          right = bi;
        }

        bi = str.indexOf(b, i + 1);
      }

      i = ai < bi && ai >= 0 ? ai : bi;
    }

    if (begs.length) {
      result = [ left, right ];
    }
  }

  return result;
}
48 node_modules/balanced-match/package.json generated vendored Normal file

@ -0,0 +1,48 @@

{
  "name": "balanced-match",
  "description": "Match balanced character pairs, like \"{\" and \"}\"",
  "version": "1.0.2",
  "repository": {
    "type": "git",
    "url": "git://github.com/juliangruber/balanced-match.git"
  },
  "homepage": "https://github.com/juliangruber/balanced-match",
  "main": "index.js",
  "scripts": {
    "test": "tape test/test.js",
    "bench": "matcha test/bench.js"
  },
  "devDependencies": {
    "matcha": "^0.7.0",
    "tape": "^4.6.0"
  },
  "keywords": [
    "match",
    "regexp",
    "test",
    "balanced",
    "parse"
  ],
  "author": {
    "name": "Julian Gruber",
    "email": "mail@juliangruber.com",
    "url": "http://juliangruber.com"
  },
  "license": "MIT",
  "testling": {
    "files": "test/*.js",
    "browsers": [
      "ie/8..latest",
      "firefox/20..latest",
      "firefox/nightly",
      "chrome/25..latest",
      "chrome/canary",
      "opera/12..latest",
      "opera/next",
      "safari/5.1..latest",
      "ipad/6.0..latest",
      "iphone/6.0..latest",
      "android-browser/4.2..latest"
    ]
  }
}
665 node_modules/body-parser/HISTORY.md generated vendored Normal file
@@ -0,0 +1,665 @@
1.20.2 / 2023-02-21
===================

  * Fix strict json error message on Node.js 19+
  * deps: content-type@~1.0.5
    - perf: skip value escaping when unnecessary
  * deps: raw-body@2.5.2

1.20.1 / 2022-10-06
===================

  * deps: qs@6.11.0
  * perf: remove unnecessary object clone

1.20.0 / 2022-04-02
===================

  * Fix error message for json parse whitespace in `strict`
  * Fix internal error when inflated body exceeds limit
  * Prevent loss of async hooks context
  * Prevent hanging when request already read
  * deps: depd@2.0.0
    - Replace internal `eval` usage with `Function` constructor
    - Use instance methods on `process` to check for listeners
  * deps: http-errors@2.0.0
    - deps: depd@2.0.0
    - deps: statuses@2.0.1
  * deps: on-finished@2.4.1
  * deps: qs@6.10.3
  * deps: raw-body@2.5.1
    - deps: http-errors@2.0.0

1.19.2 / 2022-02-15
===================

  * deps: bytes@3.1.2
  * deps: qs@6.9.7
    * Fix handling of `__proto__` keys
  * deps: raw-body@2.4.3
    - deps: bytes@3.1.2

1.19.1 / 2021-12-10
===================

  * deps: bytes@3.1.1
  * deps: http-errors@1.8.1
    - deps: inherits@2.0.4
    - deps: toidentifier@1.0.1
    - deps: setprototypeof@1.2.0
  * deps: qs@6.9.6
  * deps: raw-body@2.4.2
    - deps: bytes@3.1.1
    - deps: http-errors@1.8.1
  * deps: safe-buffer@5.2.1
  * deps: type-is@~1.6.18

1.19.0 / 2019-04-25
===================

  * deps: bytes@3.1.0
    - Add petabyte (`pb`) support
  * deps: http-errors@1.7.2
    - Set constructor name when possible
    - deps: setprototypeof@1.1.1
    - deps: statuses@'>= 1.5.0 < 2'
  * deps: iconv-lite@0.4.24
    - Added encoding MIK
  * deps: qs@6.7.0
    - Fix parsing array brackets after index
  * deps: raw-body@2.4.0
    - deps: bytes@3.1.0
    - deps: http-errors@1.7.2
    - deps: iconv-lite@0.4.24
  * deps: type-is@~1.6.17
    - deps: mime-types@~2.1.24
    - perf: prevent internal `throw` on invalid type

1.18.3 / 2018-05-14
===================

  * Fix stack trace for strict json parse error
  * deps: depd@~1.1.2
    - perf: remove argument reassignment
  * deps: http-errors@~1.6.3
    - deps: depd@~1.1.2
    - deps: setprototypeof@1.1.0
    - deps: statuses@'>= 1.3.1 < 2'
  * deps: iconv-lite@0.4.23
    - Fix loading encoding with year appended
    - Fix deprecation warnings on Node.js 10+
  * deps: qs@6.5.2
  * deps: raw-body@2.3.3
    - deps: http-errors@1.6.3
    - deps: iconv-lite@0.4.23
  * deps: type-is@~1.6.16
    - deps: mime-types@~2.1.18

1.18.2 / 2017-09-22
===================

  * deps: debug@2.6.9
  * perf: remove argument reassignment

1.18.1 / 2017-09-12
===================

  * deps: content-type@~1.0.4
    - perf: remove argument reassignment
    - perf: skip parameter parsing when no parameters
  * deps: iconv-lite@0.4.19
    - Fix ISO-8859-1 regression
    - Update Windows-1255
  * deps: qs@6.5.1
    - Fix parsing & compacting very deep objects
  * deps: raw-body@2.3.2
    - deps: iconv-lite@0.4.19

1.18.0 / 2017-09-08
===================

  * Fix JSON strict violation error to match native parse error
  * Include the `body` property on verify errors
  * Include the `type` property on all generated errors
  * Use `http-errors` to set status code on errors
  * deps: bytes@3.0.0
  * deps: debug@2.6.8
  * deps: depd@~1.1.1
    - Remove unnecessary `Buffer` loading
  * deps: http-errors@~1.6.2
    - deps: depd@1.1.1
  * deps: iconv-lite@0.4.18
    - Add support for React Native
    - Add a warning if not loaded as utf-8
    - Fix CESU-8 decoding in Node.js 8
    - Improve speed of ISO-8859-1 encoding
  * deps: qs@6.5.0
  * deps: raw-body@2.3.1
    - Use `http-errors` for standard emitted errors
    - deps: bytes@3.0.0
    - deps: iconv-lite@0.4.18
    - perf: skip buffer decoding on overage chunk
  * perf: prevent internal `throw` when missing charset

1.17.2 / 2017-05-17
===================

  * deps: debug@2.6.7
    - Fix `DEBUG_MAX_ARRAY_LENGTH`
    - deps: ms@2.0.0
  * deps: type-is@~1.6.15
    - deps: mime-types@~2.1.15

1.17.1 / 2017-03-06
===================

  * deps: qs@6.4.0
    - Fix regression parsing keys starting with `[`

1.17.0 / 2017-03-01
===================

  * deps: http-errors@~1.6.1
    - Make `message` property enumerable for `HttpError`s
    - deps: setprototypeof@1.0.3
  * deps: qs@6.3.1
    - Fix compacting nested arrays

1.16.1 / 2017-02-10
===================

  * deps: debug@2.6.1
    - Fix deprecation messages in WebStorm and other editors
    - Undeprecate `DEBUG_FD` set to `1` or `2`

1.16.0 / 2017-01-17
===================

  * deps: debug@2.6.0
    - Allow colors in workers
    - Deprecated `DEBUG_FD` environment variable
    - Fix error when running under React Native
    - Use same color for same namespace
    - deps: ms@0.7.2
  * deps: http-errors@~1.5.1
    - deps: inherits@2.0.3
    - deps: setprototypeof@1.0.2
    - deps: statuses@'>= 1.3.1 < 2'
  * deps: iconv-lite@0.4.15
    - Added encoding MS-31J
    - Added encoding MS-932
    - Added encoding MS-936
    - Added encoding MS-949
    - Added encoding MS-950
    - Fix GBK/GB18030 handling of Euro character
  * deps: qs@6.2.1
    - Fix array parsing from skipping empty values
  * deps: raw-body@~2.2.0
    - deps: iconv-lite@0.4.15
  * deps: type-is@~1.6.14
    - deps: mime-types@~2.1.13

1.15.2 / 2016-06-19
===================

  * deps: bytes@2.4.0
  * deps: content-type@~1.0.2
    - perf: enable strict mode
  * deps: http-errors@~1.5.0
    - Use `setprototypeof` module to replace `__proto__` setting
    - deps: statuses@'>= 1.3.0 < 2'
    - perf: enable strict mode
  * deps: qs@6.2.0
  * deps: raw-body@~2.1.7
    - deps: bytes@2.4.0
    - perf: remove double-cleanup on happy path
  * deps: type-is@~1.6.13
    - deps: mime-types@~2.1.11

1.15.1 / 2016-05-05
===================

  * deps: bytes@2.3.0
    - Drop partial bytes on all parsed units
    - Fix parsing byte string that looks like hex
  * deps: raw-body@~2.1.6
    - deps: bytes@2.3.0
  * deps: type-is@~1.6.12
    - deps: mime-types@~2.1.10

1.15.0 / 2016-02-10
===================

  * deps: http-errors@~1.4.0
    - Add `HttpError` export, for `err instanceof createError.HttpError`
    - deps: inherits@2.0.1
    - deps: statuses@'>= 1.2.1 < 2'
  * deps: qs@6.1.0
  * deps: type-is@~1.6.11
    - deps: mime-types@~2.1.9

1.14.2 / 2015-12-16
===================

  * deps: bytes@2.2.0
  * deps: iconv-lite@0.4.13
  * deps: qs@5.2.0
  * deps: raw-body@~2.1.5
    - deps: bytes@2.2.0
    - deps: iconv-lite@0.4.13
  * deps: type-is@~1.6.10
    - deps: mime-types@~2.1.8

1.14.1 / 2015-09-27
===================

  * Fix issue where invalid charset results in 400 when `verify` used
  * deps: iconv-lite@0.4.12
    - Fix CESU-8 decoding in Node.js 4.x
  * deps: raw-body@~2.1.4
    - Fix masking critical errors from `iconv-lite`
    - deps: iconv-lite@0.4.12
  * deps: type-is@~1.6.9
    - deps: mime-types@~2.1.7

1.14.0 / 2015-09-16
===================

  * Fix JSON strict parse error to match syntax errors
  * Provide static `require` analysis in `urlencoded` parser
  * deps: depd@~1.1.0
    - Support web browser loading
  * deps: qs@5.1.0
  * deps: raw-body@~2.1.3
    - Fix sync callback when attaching data listener causes sync read
  * deps: type-is@~1.6.8
    - Fix type error when given invalid type to match against
    - deps: mime-types@~2.1.6

1.13.3 / 2015-07-31
===================

  * deps: type-is@~1.6.6
    - deps: mime-types@~2.1.4

1.13.2 / 2015-07-05
===================

  * deps: iconv-lite@0.4.11
  * deps: qs@4.0.0
    - Fix dropping parameters like `hasOwnProperty`
    - Fix user-visible incompatibilities from 3.1.0
    - Fix various parsing edge cases
  * deps: raw-body@~2.1.2
    - Fix error stack traces to skip `makeError`
    - deps: iconv-lite@0.4.11
  * deps: type-is@~1.6.4
    - deps: mime-types@~2.1.2
    - perf: enable strict mode
    - perf: remove argument reassignment

1.13.1 / 2015-06-16
===================

  * deps: qs@2.4.2
    - Downgraded from 3.1.0 because of user-visible incompatibilities

1.13.0 / 2015-06-14
===================

  * Add `statusCode` property on `Error`s, in addition to `status`
  * Change `type` default to `application/json` for JSON parser
  * Change `type` default to `application/x-www-form-urlencoded` for urlencoded parser
  * Provide static `require` analysis
  * Use the `http-errors` module to generate errors
  * deps: bytes@2.1.0
    - Slight optimizations
  * deps: iconv-lite@0.4.10
    - The encoding UTF-16 without BOM now defaults to UTF-16LE when detection fails
    - Leading BOM is now removed when decoding
  * deps: on-finished@~2.3.0
    - Add defined behavior for HTTP `CONNECT` requests
    - Add defined behavior for HTTP `Upgrade` requests
    - deps: ee-first@1.1.1
  * deps: qs@3.1.0
    - Fix dropping parameters like `hasOwnProperty`
    - Fix various parsing edge cases
    - Parsed object now has `null` prototype
  * deps: raw-body@~2.1.1
    - Use `unpipe` module for unpiping requests
    - deps: iconv-lite@0.4.10
  * deps: type-is@~1.6.3
    - deps: mime-types@~2.1.1
    - perf: reduce try block size
    - perf: remove bitwise operations
  * perf: enable strict mode
  * perf: remove argument reassignment
  * perf: remove delete call

1.12.4 / 2015-05-10
===================

  * deps: debug@~2.2.0
  * deps: qs@2.4.2
    - Fix allowing parameters like `constructor`
  * deps: on-finished@~2.2.1
  * deps: raw-body@~2.0.1
    - Fix a false-positive when unpiping in Node.js 0.8
    - deps: bytes@2.0.1
  * deps: type-is@~1.6.2
    - deps: mime-types@~2.0.11

1.12.3 / 2015-04-15
===================

  * Slight efficiency improvement when not debugging
  * deps: depd@~1.0.1
  * deps: iconv-lite@0.4.8
    - Add encoding alias UNICODE-1-1-UTF-7
  * deps: raw-body@1.3.4
    - Fix hanging callback if request aborts during read
    - deps: iconv-lite@0.4.8

1.12.2 / 2015-03-16
===================

  * deps: qs@2.4.1
    - Fix error when parameter `hasOwnProperty` is present

1.12.1 / 2015-03-15
===================

  * deps: debug@~2.1.3
    - Fix high intensity foreground color for bold
    - deps: ms@0.7.0
  * deps: type-is@~1.6.1
    - deps: mime-types@~2.0.10

1.12.0 / 2015-02-13
===================

  * add `debug` messages
  * accept a function for the `type` option
  * use `content-type` to parse `Content-Type` headers
  * deps: iconv-lite@0.4.7
    - Gracefully support enumerables on `Object.prototype`
  * deps: raw-body@1.3.3
    - deps: iconv-lite@0.4.7
  * deps: type-is@~1.6.0
    - fix argument reassignment
    - fix false-positives in `hasBody` `Transfer-Encoding` check
    - support wildcard for both type and subtype (`*/*`)
    - deps: mime-types@~2.0.9

1.11.0 / 2015-01-30
===================

  * make internal `extended: true` depth limit infinity
  * deps: type-is@~1.5.6
    - deps: mime-types@~2.0.8

1.10.2 / 2015-01-20
===================

  * deps: iconv-lite@0.4.6
    - Fix rare aliases of single-byte encodings
  * deps: raw-body@1.3.2
    - deps: iconv-lite@0.4.6

1.10.1 / 2015-01-01
===================

  * deps: on-finished@~2.2.0
  * deps: type-is@~1.5.5
    - deps: mime-types@~2.0.7

1.10.0 / 2014-12-02
===================

  * make internal `extended: true` array limit dynamic

1.9.3 / 2014-11-21
==================

  * deps: iconv-lite@0.4.5
    - Fix Windows-31J and X-SJIS encoding support
  * deps: qs@2.3.3
    - Fix `arrayLimit` behavior
  * deps: raw-body@1.3.1
    - deps: iconv-lite@0.4.5
  * deps: type-is@~1.5.3
    - deps: mime-types@~2.0.3

1.9.2 / 2014-10-27
==================

  * deps: qs@2.3.2
    - Fix parsing of mixed objects and values

1.9.1 / 2014-10-22
==================

  * deps: on-finished@~2.1.1
    - Fix handling of pipelined requests
  * deps: qs@2.3.0
    - Fix parsing of mixed implicit and explicit arrays
  * deps: type-is@~1.5.2
    - deps: mime-types@~2.0.2

1.9.0 / 2014-09-24
==================

  * include the charset in "unsupported charset" error message
  * include the encoding in "unsupported content encoding" error message
  * deps: depd@~1.0.0

1.8.4 / 2014-09-23
==================

  * fix content encoding to be case-insensitive

1.8.3 / 2014-09-19
==================

  * deps: qs@2.2.4
    - Fix issue with object keys starting with numbers truncated

1.8.2 / 2014-09-15
==================

  * deps: depd@0.4.5

1.8.1 / 2014-09-07
==================

  * deps: media-typer@0.3.0
  * deps: type-is@~1.5.1

1.8.0 / 2014-09-05
==================

  * make empty-body-handling consistent between chunked requests
    - empty `json` produces `{}`
    - empty `raw` produces `new Buffer(0)`
    - empty `text` produces `''`
    - empty `urlencoded` produces `{}`
  * deps: qs@2.2.3
    - Fix issue where first empty value in array is discarded
  * deps: type-is@~1.5.0
    - fix `hasbody` to be true for `content-length: 0`

1.7.0 / 2014-09-01
==================

  * add `parameterLimit` option to `urlencoded` parser
  * change `urlencoded` extended array limit to 100
  * respond with 413 when over `parameterLimit` in `urlencoded`

1.6.7 / 2014-08-29
==================

  * deps: qs@2.2.2
    - Remove unnecessary cloning

1.6.6 / 2014-08-27
==================

  * deps: qs@2.2.0
    - Array parsing fix
    - Performance improvements

1.6.5 / 2014-08-16
==================

  * deps: on-finished@2.1.0

1.6.4 / 2014-08-14
==================

  * deps: qs@1.2.2

1.6.3 / 2014-08-10
==================

  * deps: qs@1.2.1

1.6.2 / 2014-08-07
==================

  * deps: qs@1.2.0
    - Fix parsing array of objects

1.6.1 / 2014-08-06
==================

  * deps: qs@1.1.0
    - Accept urlencoded square brackets
    - Accept empty values in implicit array notation

1.6.0 / 2014-08-05
==================

  * deps: qs@1.0.2
    - Complete rewrite
    - Limits array length to 20
    - Limits object depth to 5
    - Limits parameters to 1,000

1.5.2 / 2014-07-27
==================

  * deps: depd@0.4.4
    - Work-around v8 generating empty stack traces

1.5.1 / 2014-07-26
==================

  * deps: depd@0.4.3
    - Fix exception when global `Error.stackTraceLimit` is too low

1.5.0 / 2014-07-20
==================

  * deps: depd@0.4.2
    - Add `TRACE_DEPRECATION` environment variable
    - Remove non-standard grey color from color output
    - Support `--no-deprecation` argument
    - Support `--trace-deprecation` argument
  * deps: iconv-lite@0.4.4
    - Added encoding UTF-7
  * deps: raw-body@1.3.0
    - deps: iconv-lite@0.4.4
    - Added encoding UTF-7
    - Fix `Cannot switch to old mode now` error on Node.js 0.10+
  * deps: type-is@~1.3.2

1.4.3 / 2014-06-19
==================

  * deps: type-is@1.3.1
    - fix global variable leak

1.4.2 / 2014-06-19
==================

  * deps: type-is@1.3.0
    - improve type parsing

1.4.1 / 2014-06-19
==================

  * fix urlencoded extended deprecation message

1.4.0 / 2014-06-19
==================

  * add `text` parser
  * add `raw` parser
  * check accepted charset in content-type (accepts utf-8)
  * check accepted encoding in content-encoding (accepts identity)
  * deprecate `bodyParser()` middleware; use `.json()` and `.urlencoded()` as needed
  * deprecate `urlencoded()` without provided `extended` option
  * lazy-load urlencoded parsers
  * parsers split into files for reduced mem usage
  * support gzip and deflate bodies
    - set `inflate: false` to turn off
  * deps: raw-body@1.2.2
    - Support all encodings from `iconv-lite`

1.3.1 / 2014-06-11
==================

  * deps: type-is@1.2.1
    - Switch dependency from mime to mime-types@1.0.0

1.3.0 / 2014-05-31
==================

  * add `extended` option to urlencoded parser

1.2.2 / 2014-05-27
==================

  * deps: raw-body@1.1.6
    - assert stream encoding on node.js 0.8
    - assert stream encoding on node.js < 0.10.6
    - deps: bytes@1

1.2.1 / 2014-05-26
==================

  * invoke `next(err)` after request fully read
    - prevents hung responses and socket hang ups

1.2.0 / 2014-05-11
==================

  * add `verify` option
  * deps: type-is@1.2.0
    - support suffix matching

1.1.2 / 2014-05-11
==================

  * improve json parser speed

1.1.1 / 2014-05-11
==================

  * fix repeated limit parsing with every request

1.1.0 / 2014-05-10
==================

  * add `type` option
  * deps: pin for safety and consistency

1.0.2 / 2014-04-14
==================

  * use `type-is` module

1.0.1 / 2014-03-20
==================

  * lower default limits to 100kb
23 node_modules/body-parser/LICENSE generated vendored Normal file
@@ -0,0 +1,23 @@
(The MIT License)

Copyright (c) 2014 Jonathan Ong <me@jongleberry.com>
Copyright (c) 2014-2015 Douglas Christopher Wilson <doug@somethingdoug.com>

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
465 node_modules/body-parser/README.md generated vendored Normal file
@@ -0,0 +1,465 @@
# body-parser

[![NPM Version][npm-version-image]][npm-url]
[![NPM Downloads][npm-downloads-image]][npm-url]
[![Build Status][ci-image]][ci-url]
[![Test Coverage][coveralls-image]][coveralls-url]

Node.js body parsing middleware.

Parse incoming request bodies in a middleware before your handlers, available under the `req.body` property.

**Note** As `req.body`'s shape is based on user-controlled input, all properties and values in this object are untrusted and should be validated before trusting. For example, `req.body.foo.toString()` may fail in multiple ways, for example the `foo` property may not be there or may not be a string, and `toString` may not be a function and instead a string or other user input.

[Learn about the anatomy of an HTTP transaction in Node.js](https://nodejs.org/en/docs/guides/anatomy-of-an-http-transaction/).

_This does not handle multipart bodies_, due to their complex and typically large nature. For multipart bodies, you may be interested in the following modules:

* [busboy](https://www.npmjs.org/package/busboy#readme) and
  [connect-busboy](https://www.npmjs.org/package/connect-busboy#readme)
* [multiparty](https://www.npmjs.org/package/multiparty#readme) and
  [connect-multiparty](https://www.npmjs.org/package/connect-multiparty#readme)
* [formidable](https://www.npmjs.org/package/formidable#readme)
* [multer](https://www.npmjs.org/package/multer#readme)

This module provides the following parsers:

* [JSON body parser](#bodyparserjsonoptions)
* [Raw body parser](#bodyparserrawoptions)
* [Text body parser](#bodyparsertextoptions)
* [URL-encoded form body parser](#bodyparserurlencodedoptions)

Other body parsers you might be interested in:

- [body](https://www.npmjs.org/package/body#readme)
- [co-body](https://www.npmjs.org/package/co-body#readme)

## Installation

```sh
$ npm install body-parser
```

## API

```js
var bodyParser = require('body-parser')
```

The `bodyParser` object exposes various factories to create middlewares. All middlewares will populate the `req.body` property with the parsed body when the `Content-Type` request header matches the `type` option, or an empty object (`{}`) if there was no body to parse, the `Content-Type` was not matched, or an error occurred.

The various errors returned by this module are described in the [errors section](#errors).

### bodyParser.json([options])

Returns middleware that only parses `json` and only looks at requests where the `Content-Type` header matches the `type` option. This parser accepts any Unicode encoding of the body and supports automatic inflation of `gzip` and `deflate` encodings.

A new `body` object containing the parsed data is populated on the `request` object after the middleware (i.e. `req.body`).

#### Options

The `json` function takes an optional `options` object that may contain any of the following keys:

##### inflate

When set to `true`, then deflated (compressed) bodies will be inflated; when `false`, deflated bodies are rejected. Defaults to `true`.

##### limit

Controls the maximum request body size. If this is a number, then the value specifies the number of bytes; if it is a string, the value is passed to the [bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults to `'100kb'`.
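The string form of `limit` follows the `bytes` package's unit convention (binary units, so `kb` is 1024 bytes). A rough sketch of that mapping, with a hypothetical `parseLimit` helper (the real `bytes` library also handles `tb`/`pb`, whitespace variants, and formatting in the other direction):

```javascript
// Hypothetical helper illustrating the limit-string convention used by the
// `limit` option; body-parser itself delegates this to the `bytes` package.
function parseLimit(limit) {
  if (typeof limit === 'number') return limit; // numbers are taken as byte counts
  var m = /^(\d+(?:\.\d+)?)\s*(b|kb|mb|gb)$/i.exec(String(limit));
  if (!m) return null;                         // `bytes` returns null on unparseable input
  var units = { b: 1, kb: 1024, mb: 1024 * 1024, gb: 1024 * 1024 * 1024 };
  return Math.floor(parseFloat(m[1]) * units[m[2].toLowerCase()]);
}

console.log(parseLimit('100kb')); // 102400  (the default limit)
console.log(parseLimit('1mb'));   // 1048576
console.log(parseLimit(2048));    // 2048
```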
##### reviver

The `reviver` option is passed directly to `JSON.parse` as the second argument. You can find more information on this argument [in the MDN documentation about JSON.parse](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse#Example.3A_Using_the_reviver_parameter).

##### strict

When set to `true`, will only accept arrays and objects; when `false` will accept anything `JSON.parse` accepts. Defaults to `true`.

##### type

The `type` option is used to determine what media type the middleware will parse. This option can be a string, array of strings, or a function. If not a function, `type` option is passed directly to the [type-is](https://www.npmjs.org/package/type-is#readme) library and this can be an extension name (like `json`), a mime type (like `application/json`), or a mime type with a wildcard (like `*/*` or `*/json`). If a function, the `type` option is called as `fn(req)` and the request is parsed if it returns a truthy value. Defaults to `application/json`.

##### verify

The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`, where `buf` is a `Buffer` of the raw request body and `encoding` is the encoding of the request. The parsing can be aborted by throwing an error.
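The `strict` and `reviver` options above can be sketched as a simplified stand-alone function (an assumption-laden sketch: the real parser also handles charsets, inflation, the `limit` check, and normalized error reporting):

```javascript
// Simplified sketch of the json parser's strict/reviver semantics.
// `parseJsonBody` is a hypothetical name, not a body-parser export.
function parseJsonBody(body, opts) {
  opts = opts || {};
  var strict = opts.strict !== false; // defaults to true, as documented above
  if (strict) {
    // In strict mode only top-level objects and arrays are accepted.
    var first = body.trim()[0];
    if (first !== '{' && first !== '[') {
      throw new SyntaxError('strict mode: JSON body must be an object or array');
    }
  }
  // The reviver is forwarded straight to JSON.parse.
  return JSON.parse(body, opts.reviver);
}

console.log(parseJsonBody('{"a": 1}').a);                        // 1
// parseJsonBody('"just a string"') throws under the default strict: true
console.log(parseJsonBody('"just a string"', { strict: false })); // just a string
```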
### bodyParser.raw([options])
|
||||
|
||||
Returns middleware that parses all bodies as a `Buffer` and only looks at
|
||||
requests where the `Content-Type` header matches the `type` option. This
|
||||
parser supports automatic inflation of `gzip` and `deflate` encodings.
|
||||
|
||||
A new `body` object containing the parsed data is populated on the `request`
|
||||
object after the middleware (i.e. `req.body`). This will be a `Buffer` object
|
||||
of the body.
|
||||
|
||||
#### Options
|
||||
|
||||
The `raw` function takes an optional `options` object that may contain any of
|
||||
the following keys:
|
||||
|
||||
##### inflate
|
||||
|
||||
When set to `true`, then deflated (compressed) bodies will be inflated; when
|
||||
`false`, deflated bodies are rejected. Defaults to `true`.
|
||||
|
||||
##### limit
|
||||
|
||||
Controls the maximum request body size. If this is a number, then the value
|
||||
specifies the number of bytes; if it is a string, the value is passed to the
|
||||
[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults
|
||||
to `'100kb'`.
|
||||
|
||||
##### type
|
||||
|
||||
The `type` option is used to determine what media type the middleware will
|
||||
parse. This option can be a string, array of strings, or a function.
|
||||
If not a function, `type` option is passed directly to the
|
||||
[type-is](https://www.npmjs.org/package/type-is#readme) library and this
|
||||
can be an extension name (like `bin`), a mime type (like
|
||||
`application/octet-stream`), or a mime type with a wildcard (like `*/*` or
|
||||
`application/*`). If a function, the `type` option is called as `fn(req)`
|
||||
and the request is parsed if it returns a truthy value. Defaults to
|
||||
`application/octet-stream`.
|
||||
|
||||
##### verify
|
||||
|
||||
The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`,
|
||||
where `buf` is a `Buffer` of the raw request body and `encoding` is the
|
||||
encoding of the request. The parsing can be aborted by throwing an error.
|
||||
|
||||
### bodyParser.text([options])
|
||||
|
||||
Returns middleware that parses all bodies as a string and only looks at
|
||||
requests where the `Content-Type` header matches the `type` option. This
|
||||
parser supports automatic inflation of `gzip` and `deflate` encodings.
|
||||
|
||||
A new `body` string containing the parsed data is populated on the `request`
|
||||
object after the middleware (i.e. `req.body`). This will be a string of the
|
||||
body.
|
||||
|
||||
#### Options
|
||||
|
||||
The `text` function takes an optional `options` object that may contain any of
|
||||
the following keys:
|
||||
|
||||
##### defaultCharset
|
||||
|
||||
Specify the default character set for the text content if the charset is not
|
||||
specified in the `Content-Type` header of the request. Defaults to `utf-8`.
|
||||
|
||||
##### inflate
|
||||
|
||||
When set to `true`, then deflated (compressed) bodies will be inflated; when
|
||||
`false`, deflated bodies are rejected. Defaults to `true`.
|
||||
|
||||
##### limit
|
||||
|
||||
Controls the maximum request body size. If this is a number, then the value
|
||||
specifies the number of bytes; if it is a string, the value is passed to the
|
||||
[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults
|
||||
to `'100kb'`.
|
||||
|
||||
##### type
|
||||
|
||||
The `type` option is used to determine what media type the middleware will
|
||||
parse. This option can be a string, array of strings, or a function. If not
|
||||
a function, `type` option is passed directly to the
|
||||
[type-is](https://www.npmjs.org/package/type-is#readme) library and this can
|
||||
be an extension name (like `txt`), a mime type (like `text/plain`), or a mime
|
||||
type with a wildcard (like `*/*` or `text/*`). If a function, the `type`
|
||||
option is called as `fn(req)` and the request is parsed if it returns a
|
||||
truthy value. Defaults to `text/plain`.
|
||||
|
||||
##### verify
|
||||
|
||||
The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`,
|
||||
where `buf` is a `Buffer` of the raw request body and `encoding` is the
|
||||
encoding of the request. The parsing can be aborted by throwing an error.

### bodyParser.urlencoded([options])

Returns middleware that only parses `urlencoded` bodies and only looks at
requests where the `Content-Type` header matches the `type` option. This
parser accepts only UTF-8 encoding of the body and supports automatic
inflation of `gzip` and `deflate` encodings.

A new `body` object containing the parsed data is populated on the `request`
object after the middleware (i.e. `req.body`). This object will contain
key-value pairs, where the value can be a string or array (when `extended` is
`false`), or any type (when `extended` is `true`).

#### Options

The `urlencoded` function takes an optional `options` object that may contain
any of the following keys:

##### extended

The `extended` option allows you to choose between parsing the URL-encoded data
with the `querystring` library (when `false`) or the `qs` library (when
`true`). The "extended" syntax allows rich objects and arrays to be
encoded into the URL-encoded format, allowing for a JSON-like experience
with URL-encoding. For more information, please
[see the qs library](https://www.npmjs.org/package/qs#readme).

Defaults to `true`, but using the default has been deprecated. Please
research the difference between `qs` and `querystring` and choose the
appropriate setting.

##### inflate

When set to `true`, then deflated (compressed) bodies will be inflated; when
`false`, deflated bodies are rejected. Defaults to `true`.

##### limit

Controls the maximum request body size. If this is a number, then the value
specifies the number of bytes; if it is a string, the value is passed to the
[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults
to `'100kb'`.

##### parameterLimit

The `parameterLimit` option controls the maximum number of parameters that
are allowed in the URL-encoded data. If a request contains more parameters
than this value, a 413 will be returned to the client. Defaults to `1000`.

##### type

The `type` option is used to determine what media type the middleware will
parse. This option can be a string, array of strings, or a function. If not
a function, the `type` option is passed directly to the
[type-is](https://www.npmjs.org/package/type-is#readme) library and this can
be an extension name (like `urlencoded`), a mime type (like
`application/x-www-form-urlencoded`), or a mime type with a wildcard (like
`*/x-www-form-urlencoded`). If a function, the `type` option is called as
`fn(req)` and the request is parsed if it returns a truthy value. Defaults
to `application/x-www-form-urlencoded`.

##### verify

The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`,
where `buf` is a `Buffer` of the raw request body and `encoding` is the
encoding of the request. The parsing can be aborted by throwing an error.

## Errors

The middlewares provided by this module create errors using the
[`http-errors` module](https://www.npmjs.com/package/http-errors). The errors
will typically have a `status`/`statusCode` property that contains the suggested
HTTP response code, an `expose` property to determine if the `message` property
should be displayed to the client, a `type` property to determine the type of
error without matching against the `message`, and a `body` property containing
the read body, if available.

The following are the common errors created, though any error can come through
for various reasons.
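
For example, an Express-style error handler can branch on these properties.
This is a sketch: the `type` strings match the errors documented below, but
the response wording and the `handleBodyError` name are made up:

```javascript
// branch on the properties http-errors attaches; written as a plain
// function so it can also be exercised without an Express app
function handleBodyError (err, req, res, next) {
  if (err.type === 'entity.too.large') {
    res.statusCode = 413
    res.end('body exceeded ' + err.limit + ' bytes')
  } else if (err.status) {
    res.statusCode = err.status
    res.end(err.expose ? err.message : 'invalid request body')
  } else {
    next(err) // not a body-parser error; let the framework handle it
  }
}

// app.use(handleBodyError) // registered after the body-parsing middleware
```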

### content encoding unsupported

This error will occur when the request had a `Content-Encoding` header that
contained an encoding but the "inflation" option was set to `false`. The
`status` property is set to `415`, the `type` property is set to
`'encoding.unsupported'`, and the `encoding` property is set to the
encoding that is unsupported.

### entity parse failed

This error will occur when the request contained an entity that could not be
parsed by the middleware. The `status` property is set to `400`, the `type`
property is set to `'entity.parse.failed'`, and the `body` property is set to
the entity value that failed parsing.

### entity verify failed

This error will occur when the request contained an entity that failed
verification by the defined `verify` option. The `status` property is
set to `403`, the `type` property is set to `'entity.verify.failed'`, and the
`body` property is set to the entity value that failed verification.

### request aborted

This error will occur when the request is aborted by the client before reading
the body has finished. The `received` property will be set to the number of
bytes received before the request was aborted and the `expected` property is
set to the number of expected bytes. The `status` property is set to `400`
and the `type` property is set to `'request.aborted'`.

### request entity too large

This error will occur when the request body's size is larger than the "limit"
option. The `limit` property will be set to the byte limit and the `length`
property will be set to the request body's length. The `status` property is
set to `413` and the `type` property is set to `'entity.too.large'`.

### request size did not match content length

This error will occur when the request's length did not match the length from
the `Content-Length` header. This typically occurs when the request is malformed,
usually because the `Content-Length` header was calculated based on characters
instead of bytes. The `status` property is set to `400` and the `type` property
is set to `'request.size.invalid'`.

### stream encoding should not be set

This error will occur when something called the `req.setEncoding` method prior
to this middleware. This module operates directly on bytes only and you cannot
call `req.setEncoding` when using this module. The `status` property is set to
`500` and the `type` property is set to `'stream.encoding.set'`.

### stream is not readable

This error will occur when the request is no longer readable when this middleware
attempts to read it. This typically means something other than a middleware from
this module read the request body already and the middleware was also configured to
read the same request. The `status` property is set to `500` and the `type`
property is set to `'stream.not.readable'`.

### too many parameters

This error will occur when the content of the request exceeds the configured
`parameterLimit` for the `urlencoded` parser. The `status` property is set to
`413` and the `type` property is set to `'parameters.too.many'`.

### unsupported charset "BOGUS"

This error will occur when the request had a charset parameter in the
`Content-Type` header, but the `iconv-lite` module does not support it OR the
parser does not support it. The charset is contained in the message as well
as in the `charset` property. The `status` property is set to `415`, the
`type` property is set to `'charset.unsupported'`, and the `charset` property
is set to the charset that is unsupported.

### unsupported content encoding "bogus"

This error will occur when the request had a `Content-Encoding` header that
contained an unsupported encoding. The encoding is contained in the message
as well as in the `encoding` property. The `status` property is set to `415`,
the `type` property is set to `'encoding.unsupported'`, and the `encoding`
property is set to the encoding that is unsupported.

## Examples

### Express/Connect top-level generic

This example demonstrates adding a generic JSON and URL-encoded parser as a
top-level middleware, which will parse the bodies of all incoming requests.
This is the simplest setup.

```js
var express = require('express')
var bodyParser = require('body-parser')

var app = express()

// parse application/x-www-form-urlencoded
app.use(bodyParser.urlencoded({ extended: false }))

// parse application/json
app.use(bodyParser.json())

app.use(function (req, res) {
  res.setHeader('Content-Type', 'text/plain')
  res.write('you posted:\n')
  res.end(JSON.stringify(req.body, null, 2))
})
```

### Express route-specific

This example demonstrates adding body parsers specifically to the routes that
need them. In general, this is the most recommended way to use body-parser with
Express.

```js
var express = require('express')
var bodyParser = require('body-parser')

var app = express()

// create application/json parser
var jsonParser = bodyParser.json()

// create application/x-www-form-urlencoded parser
var urlencodedParser = bodyParser.urlencoded({ extended: false })

// POST /login gets urlencoded bodies
app.post('/login', urlencodedParser, function (req, res) {
  res.send('welcome, ' + req.body.username)
})

// POST /api/users gets JSON bodies
app.post('/api/users', jsonParser, function (req, res) {
  // create user in req.body
})
```

### Change accepted type for parsers

All the parsers accept a `type` option which allows you to change the
`Content-Type` that the middleware will parse.

```js
var express = require('express')
var bodyParser = require('body-parser')

var app = express()

// parse various different custom JSON types as JSON
app.use(bodyParser.json({ type: 'application/*+json' }))

// parse some custom thing into a Buffer
app.use(bodyParser.raw({ type: 'application/vnd.custom-type' }))

// parse an HTML body into a string
app.use(bodyParser.text({ type: 'text/html' }))
```

## License

[MIT](LICENSE)

[ci-image]: https://badgen.net/github/checks/expressjs/body-parser/master?label=ci
[ci-url]: https://github.com/expressjs/body-parser/actions/workflows/ci.yml
[coveralls-image]: https://badgen.net/coveralls/c/github/expressjs/body-parser/master
[coveralls-url]: https://coveralls.io/r/expressjs/body-parser?branch=master
[node-version-image]: https://badgen.net/npm/node/body-parser
[node-version-url]: https://nodejs.org/en/download
[npm-downloads-image]: https://badgen.net/npm/dm/body-parser
[npm-url]: https://npmjs.org/package/body-parser
[npm-version-image]: https://badgen.net/npm/v/body-parser
25 node_modules/body-parser/SECURITY.md generated vendored Normal file

@ -0,0 +1,25 @@
# Security Policies and Procedures

## Reporting a Bug

The Express team and community take all security bugs seriously. Thank you
for improving the security of Express. We appreciate your efforts and
responsible disclosure and will make every effort to acknowledge your
contributions.

Report security bugs by emailing the current owner(s) of `body-parser`. This
information can be found in the npm registry using the command
`npm owner ls body-parser`.
If unsure or unable to get the information from the above, open an issue
in the [project issue tracker](https://github.com/expressjs/body-parser/issues)
asking for the current contact information.

To ensure a timely response to your report, please ensure that the entirety
of the report is contained within the email body and not solely behind a web
link or an attachment.

At least one owner will acknowledge your email within 48 hours, and will send a
more detailed response within 48 hours indicating the next steps in handling
your report. After the initial reply to your report, the owners will
endeavor to keep you informed of the progress towards a fix and full
announcement, and may ask for additional information or guidance.
156 node_modules/body-parser/index.js generated vendored Normal file

@ -0,0 +1,156 @@
/*!
 * body-parser
 * Copyright(c) 2014-2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 * @private
 */

var deprecate = require('depd')('body-parser')

/**
 * Cache of loaded parsers.
 * @private
 */

var parsers = Object.create(null)

/**
 * @typedef Parsers
 * @type {function}
 * @property {function} json
 * @property {function} raw
 * @property {function} text
 * @property {function} urlencoded
 */

/**
 * Module exports.
 * @type {Parsers}
 */

exports = module.exports = deprecate.function(bodyParser,
  'bodyParser: use individual json/urlencoded middlewares')

/**
 * JSON parser.
 * @public
 */

Object.defineProperty(exports, 'json', {
  configurable: true,
  enumerable: true,
  get: createParserGetter('json')
})

/**
 * Raw parser.
 * @public
 */

Object.defineProperty(exports, 'raw', {
  configurable: true,
  enumerable: true,
  get: createParserGetter('raw')
})

/**
 * Text parser.
 * @public
 */

Object.defineProperty(exports, 'text', {
  configurable: true,
  enumerable: true,
  get: createParserGetter('text')
})

/**
 * URL-encoded parser.
 * @public
 */

Object.defineProperty(exports, 'urlencoded', {
  configurable: true,
  enumerable: true,
  get: createParserGetter('urlencoded')
})

/**
 * Create a middleware to parse json and urlencoded bodies.
 *
 * @param {object} [options]
 * @return {function}
 * @deprecated
 * @public
 */

function bodyParser (options) {
  // use default type for parsers
  var opts = Object.create(options || null, {
    type: {
      configurable: true,
      enumerable: true,
      value: undefined,
      writable: true
    }
  })

  var _urlencoded = exports.urlencoded(opts)
  var _json = exports.json(opts)

  return function bodyParser (req, res, next) {
    _json(req, res, function (err) {
      if (err) return next(err)
      _urlencoded(req, res, next)
    })
  }
}

/**
 * Create a getter for loading a parser.
 * @private
 */

function createParserGetter (name) {
  return function get () {
    return loadParser(name)
  }
}

/**
 * Load a parser module.
 * @private
 */

function loadParser (parserName) {
  var parser = parsers[parserName]

  if (parser !== undefined) {
    return parser
  }

  // this uses a switch for static require analysis
  switch (parserName) {
    case 'json':
      parser = require('./lib/types/json')
      break
    case 'raw':
      parser = require('./lib/types/raw')
      break
    case 'text':
      parser = require('./lib/types/text')
      break
    case 'urlencoded':
      parser = require('./lib/types/urlencoded')
      break
  }

  // store to prevent invoking require()
  return (parsers[parserName] = parser)
}
205 node_modules/body-parser/lib/read.js generated vendored Normal file

@ -0,0 +1,205 @@
/*!
 * body-parser
 * Copyright(c) 2014-2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 * @private
 */

var createError = require('http-errors')
var destroy = require('destroy')
var getBody = require('raw-body')
var iconv = require('iconv-lite')
var onFinished = require('on-finished')
var unpipe = require('unpipe')
var zlib = require('zlib')

/**
 * Module exports.
 */

module.exports = read

/**
 * Read a request into a buffer and parse.
 *
 * @param {object} req
 * @param {object} res
 * @param {function} next
 * @param {function} parse
 * @param {function} debug
 * @param {object} options
 * @private
 */

function read (req, res, next, parse, debug, options) {
  var length
  var opts = options
  var stream

  // flag as parsed
  req._body = true

  // read options
  var encoding = opts.encoding !== null
    ? opts.encoding
    : null
  var verify = opts.verify

  try {
    // get the content stream
    stream = contentstream(req, debug, opts.inflate)
    length = stream.length
    stream.length = undefined
  } catch (err) {
    return next(err)
  }

  // set raw-body options
  opts.length = length
  opts.encoding = verify
    ? null
    : encoding

  // assert charset is supported
  if (opts.encoding === null && encoding !== null && !iconv.encodingExists(encoding)) {
    return next(createError(415, 'unsupported charset "' + encoding.toUpperCase() + '"', {
      charset: encoding.toLowerCase(),
      type: 'charset.unsupported'
    }))
  }

  // read body
  debug('read body')
  getBody(stream, opts, function (error, body) {
    if (error) {
      var _error

      if (error.type === 'encoding.unsupported') {
        // echo back charset
        _error = createError(415, 'unsupported charset "' + encoding.toUpperCase() + '"', {
          charset: encoding.toLowerCase(),
          type: 'charset.unsupported'
        })
      } else {
        // set status code on error
        _error = createError(400, error)
      }

      // unpipe from stream and destroy
      if (stream !== req) {
        unpipe(req)
        destroy(stream, true)
      }

      // read off entire request
      dump(req, function onfinished () {
        next(createError(400, _error))
      })
      return
    }

    // verify
    if (verify) {
      try {
        debug('verify body')
        verify(req, res, body, encoding)
      } catch (err) {
        next(createError(403, err, {
          body: body,
          type: err.type || 'entity.verify.failed'
        }))
        return
      }
    }

    // parse
    var str = body
    try {
      debug('parse body')
      str = typeof body !== 'string' && encoding !== null
        ? iconv.decode(body, encoding)
        : body
      req.body = parse(str)
    } catch (err) {
      next(createError(400, err, {
        body: str,
        type: err.type || 'entity.parse.failed'
      }))
      return
    }

    next()
  })
}

/**
 * Get the content stream of the request.
 *
 * @param {object} req
 * @param {function} debug
 * @param {boolean} [inflate=true]
 * @return {object}
 * @api private
 */

function contentstream (req, debug, inflate) {
  var encoding = (req.headers['content-encoding'] || 'identity').toLowerCase()
  var length = req.headers['content-length']
  var stream

  debug('content-encoding "%s"', encoding)

  if (inflate === false && encoding !== 'identity') {
    throw createError(415, 'content encoding unsupported', {
      encoding: encoding,
      type: 'encoding.unsupported'
    })
  }

  switch (encoding) {
    case 'deflate':
      stream = zlib.createInflate()
      debug('inflate body')
      req.pipe(stream)
      break
    case 'gzip':
      stream = zlib.createGunzip()
      debug('gunzip body')
      req.pipe(stream)
      break
    case 'identity':
      stream = req
      stream.length = length
      break
    default:
      throw createError(415, 'unsupported content encoding "' + encoding + '"', {
        encoding: encoding,
        type: 'encoding.unsupported'
      })
  }

  return stream
}

/**
 * Dump the contents of a request.
 *
 * @param {object} req
 * @param {function} callback
 * @api private
 */

function dump (req, callback) {
  if (onFinished.isFinished(req)) {
    callback(null)
  } else {
    onFinished(req, callback)
    req.resume()
  }
}
247 node_modules/body-parser/lib/types/json.js generated vendored Normal file

@ -0,0 +1,247 @@
/*!
 * body-parser
 * Copyright(c) 2014 Jonathan Ong
 * Copyright(c) 2014-2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 * @private
 */

var bytes = require('bytes')
var contentType = require('content-type')
var createError = require('http-errors')
var debug = require('debug')('body-parser:json')
var read = require('../read')
var typeis = require('type-is')

/**
 * Module exports.
 */

module.exports = json

/**
 * RegExp to match the first non-space in a string.
 *
 * Allowed whitespace is defined in RFC 7159:
 *
 *    ws = *(
 *            %x20 /              ; Space
 *            %x09 /              ; Horizontal tab
 *            %x0A /              ; Line feed or New line
 *            %x0D )              ; Carriage return
 */

var FIRST_CHAR_REGEXP = /^[\x20\x09\x0a\x0d]*([^\x20\x09\x0a\x0d])/ // eslint-disable-line no-control-regex

var JSON_SYNTAX_CHAR = '#'
var JSON_SYNTAX_REGEXP = /#+/g

/**
 * Create a middleware to parse JSON bodies.
 *
 * @param {object} [options]
 * @return {function}
 * @public
 */

function json (options) {
  var opts = options || {}

  var limit = typeof opts.limit !== 'number'
    ? bytes.parse(opts.limit || '100kb')
    : opts.limit
  var inflate = opts.inflate !== false
  var reviver = opts.reviver
  var strict = opts.strict !== false
  var type = opts.type || 'application/json'
  var verify = opts.verify || false

  if (verify !== false && typeof verify !== 'function') {
    throw new TypeError('option verify must be function')
  }

  // create the appropriate type checking function
  var shouldParse = typeof type !== 'function'
    ? typeChecker(type)
    : type

  function parse (body) {
    if (body.length === 0) {
      // special-case empty json body, as it's a common client-side mistake
      // TODO: maybe make this configurable or part of "strict" option
      return {}
    }

    if (strict) {
      var first = firstchar(body)

      if (first !== '{' && first !== '[') {
        debug('strict violation')
        throw createStrictSyntaxError(body, first)
      }
    }

    try {
      debug('parse json')
      return JSON.parse(body, reviver)
    } catch (e) {
      throw normalizeJsonSyntaxError(e, {
        message: e.message,
        stack: e.stack
      })
    }
  }

  return function jsonParser (req, res, next) {
    if (req._body) {
      debug('body already parsed')
      next()
      return
    }

    req.body = req.body || {}

    // skip requests without bodies
    if (!typeis.hasBody(req)) {
      debug('skip empty body')
      next()
      return
    }

    debug('content-type %j', req.headers['content-type'])

    // determine if request should be parsed
    if (!shouldParse(req)) {
      debug('skip parsing')
      next()
      return
    }

    // assert charset per RFC 7159 sec 8.1
    var charset = getCharset(req) || 'utf-8'
    if (charset.slice(0, 4) !== 'utf-') {
      debug('invalid charset')
      next(createError(415, 'unsupported charset "' + charset.toUpperCase() + '"', {
        charset: charset,
        type: 'charset.unsupported'
      }))
      return
    }

    // read
    read(req, res, next, parse, debug, {
      encoding: charset,
      inflate: inflate,
      limit: limit,
      verify: verify
    })
  }
}

/**
 * Create strict violation syntax error matching native error.
 *
 * @param {string} str
 * @param {string} char
 * @return {Error}
 * @private
 */

function createStrictSyntaxError (str, char) {
  var index = str.indexOf(char)
  var partial = ''

  if (index !== -1) {
    partial = str.substring(0, index) + JSON_SYNTAX_CHAR

    for (var i = index + 1; i < str.length; i++) {
      partial += JSON_SYNTAX_CHAR
    }
  }

  try {
    JSON.parse(partial); /* istanbul ignore next */ throw new SyntaxError('strict violation')
  } catch (e) {
    return normalizeJsonSyntaxError(e, {
      message: e.message.replace(JSON_SYNTAX_REGEXP, function (placeholder) {
        return str.substring(index, index + placeholder.length)
      }),
      stack: e.stack
    })
  }
}

/**
 * Get the first non-whitespace character in a string.
 *
 * @param {string} str
 * @return {function}
 * @private
 */

function firstchar (str) {
  var match = FIRST_CHAR_REGEXP.exec(str)

  return match
    ? match[1]
    : undefined
}

/**
 * Get the charset of a request.
 *
 * @param {object} req
 * @api private
 */

function getCharset (req) {
  try {
    return (contentType.parse(req).parameters.charset || '').toLowerCase()
  } catch (e) {
    return undefined
  }
}

/**
 * Normalize a SyntaxError for JSON.parse.
 *
 * @param {SyntaxError} error
 * @param {object} obj
 * @return {SyntaxError}
 */

function normalizeJsonSyntaxError (error, obj) {
  var keys = Object.getOwnPropertyNames(error)

  for (var i = 0; i < keys.length; i++) {
    var key = keys[i]
    if (key !== 'stack' && key !== 'message') {
      delete error[key]
    }
  }

  // replace stack before message for Node.js 0.10 and below
  error.stack = obj.stack.replace(error.message, obj.message)
  error.message = obj.message

  return error
}

/**
 * Get the simple type checker.
 *
 * @param {string} type
 * @return {function}
 */

function typeChecker (type) {
  return function checkType (req) {
    return Boolean(typeis(req, type))
  }
}
101 node_modules/body-parser/lib/types/raw.js generated vendored Normal file

@ -0,0 +1,101 @@
/*!
 * body-parser
 * Copyright(c) 2014-2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 */

var bytes = require('bytes')
var debug = require('debug')('body-parser:raw')
var read = require('../read')
var typeis = require('type-is')

/**
 * Module exports.
 */

module.exports = raw

/**
 * Create a middleware to parse raw bodies.
 *
 * @param {object} [options]
 * @return {function}
 * @api public
 */

function raw (options) {
  var opts = options || {}

  var inflate = opts.inflate !== false
  var limit = typeof opts.limit !== 'number'
    ? bytes.parse(opts.limit || '100kb')
    : opts.limit
  var type = opts.type || 'application/octet-stream'
  var verify = opts.verify || false

  if (verify !== false && typeof verify !== 'function') {
    throw new TypeError('option verify must be function')
  }

  // create the appropriate type checking function
  var shouldParse = typeof type !== 'function'
    ? typeChecker(type)
    : type

  function parse (buf) {
    return buf
  }

  return function rawParser (req, res, next) {
    if (req._body) {
      debug('body already parsed')
      next()
      return
    }

    req.body = req.body || {}

    // skip requests without bodies
    if (!typeis.hasBody(req)) {
      debug('skip empty body')
      next()
      return
    }

    debug('content-type %j', req.headers['content-type'])

    // determine if request should be parsed
    if (!shouldParse(req)) {
      debug('skip parsing')
      next()
      return
    }

    // read
    read(req, res, next, parse, debug, {
      encoding: null,
      inflate: inflate,
      limit: limit,
      verify: verify
    })
  }
}

/**
 * Get the simple type checker.
 *
 * @param {string} type
 * @return {function}
 */

function typeChecker (type) {
  return function checkType (req) {
    return Boolean(typeis(req, type))
  }
}
121 node_modules/body-parser/lib/types/text.js generated vendored Normal file

@ -0,0 +1,121 @@
/*!
 * body-parser
 * Copyright(c) 2014-2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 */

var bytes = require('bytes')
var contentType = require('content-type')
var debug = require('debug')('body-parser:text')
var read = require('../read')
var typeis = require('type-is')

/**
 * Module exports.
 */

module.exports = text

/**
 * Create a middleware to parse text bodies.
 *
 * @param {object} [options]
 * @return {function}
 * @api public
 */

function text (options) {
  var opts = options || {}

  var defaultCharset = opts.defaultCharset || 'utf-8'
  var inflate = opts.inflate !== false
  var limit = typeof opts.limit !== 'number'
    ? bytes.parse(opts.limit || '100kb')
    : opts.limit
  var type = opts.type || 'text/plain'
  var verify = opts.verify || false

  if (verify !== false && typeof verify !== 'function') {
    throw new TypeError('option verify must be function')
  }

  // create the appropriate type checking function
  var shouldParse = typeof type !== 'function'
    ? typeChecker(type)
    : type

  function parse (buf) {
    return buf
  }

  return function textParser (req, res, next) {
    if (req._body) {
      debug('body already parsed')
      next()
      return
    }

    req.body = req.body || {}

    // skip requests without bodies
    if (!typeis.hasBody(req)) {
      debug('skip empty body')
      next()
      return
    }

    debug('content-type %j', req.headers['content-type'])

    // determine if request should be parsed
    if (!shouldParse(req)) {
      debug('skip parsing')
      next()
      return
    }

    // get charset
    var charset = getCharset(req) || defaultCharset

    // read
    read(req, res, next, parse, debug, {
      encoding: charset,
      inflate: inflate,
      limit: limit,
      verify: verify
    })
  }
}

/**
 * Get the charset of a request.
 *
 * @param {object} req
 * @api private
 */

function getCharset (req) {
  try {
    return (contentType.parse(req).parameters.charset || '').toLowerCase()
  } catch (e) {
    return undefined
  }
}

/**
 * Get the simple type checker.
 *
 * @param {string} type
 * @return {function}
 */

function typeChecker (type) {
  return function checkType (req) {
    return Boolean(typeis(req, type))
  }
}
284 node_modules/body-parser/lib/types/urlencoded.js generated vendored Normal file
@@ -0,0 +1,284 @@
/*!
 * body-parser
 * Copyright(c) 2014 Jonathan Ong
 * Copyright(c) 2014-2015 Douglas Christopher Wilson
 * MIT Licensed
 */

'use strict'

/**
 * Module dependencies.
 * @private
 */

var bytes = require('bytes')
var contentType = require('content-type')
var createError = require('http-errors')
var debug = require('debug')('body-parser:urlencoded')
var deprecate = require('depd')('body-parser')
var read = require('../read')
var typeis = require('type-is')

/**
 * Module exports.
 */

module.exports = urlencoded

/**
 * Cache of parser modules.
 */

var parsers = Object.create(null)

/**
 * Create a middleware to parse urlencoded bodies.
 *
 * @param {object} [options]
 * @return {function}
 * @public
 */

function urlencoded (options) {
  var opts = options || {}

  // notice because option default will flip in next major
  if (opts.extended === undefined) {
    deprecate('undefined extended: provide extended option')
  }

  var extended = opts.extended !== false
  var inflate = opts.inflate !== false
  var limit = typeof opts.limit !== 'number'
    ? bytes.parse(opts.limit || '100kb')
    : opts.limit
  var type = opts.type || 'application/x-www-form-urlencoded'
  var verify = opts.verify || false

  if (verify !== false && typeof verify !== 'function') {
    throw new TypeError('option verify must be function')
  }

  // create the appropriate query parser
  var queryparse = extended
    ? extendedparser(opts)
    : simpleparser(opts)

  // create the appropriate type checking function
  var shouldParse = typeof type !== 'function'
    ? typeChecker(type)
    : type

  function parse (body) {
    return body.length
      ? queryparse(body)
      : {}
  }

  return function urlencodedParser (req, res, next) {
    if (req._body) {
      debug('body already parsed')
      next()
      return
    }

    req.body = req.body || {}

    // skip requests without bodies
    if (!typeis.hasBody(req)) {
      debug('skip empty body')
      next()
      return
    }

    debug('content-type %j', req.headers['content-type'])

    // determine if request should be parsed
    if (!shouldParse(req)) {
      debug('skip parsing')
      next()
      return
    }

    // assert charset
    var charset = getCharset(req) || 'utf-8'
    if (charset !== 'utf-8') {
      debug('invalid charset')
      next(createError(415, 'unsupported charset "' + charset.toUpperCase() + '"', {
        charset: charset,
        type: 'charset.unsupported'
      }))
      return
    }

    // read
    read(req, res, next, parse, debug, {
      debug: debug,
      encoding: charset,
      inflate: inflate,
      limit: limit,
      verify: verify
    })
  }
}

/**
 * Get the extended query parser.
 *
 * @param {object} options
 */

function extendedparser (options) {
  var parameterLimit = options.parameterLimit !== undefined
    ? options.parameterLimit
    : 1000
  var parse = parser('qs')

  if (isNaN(parameterLimit) || parameterLimit < 1) {
    throw new TypeError('option parameterLimit must be a positive number')
  }

  if (isFinite(parameterLimit)) {
    parameterLimit = parameterLimit | 0
  }

  return function queryparse (body) {
    var paramCount = parameterCount(body, parameterLimit)

    if (paramCount === undefined) {
      debug('too many parameters')
      throw createError(413, 'too many parameters', {
        type: 'parameters.too.many'
      })
    }

    var arrayLimit = Math.max(100, paramCount)

    debug('parse extended urlencoding')
    return parse(body, {
      allowPrototypes: true,
      arrayLimit: arrayLimit,
      depth: Infinity,
      parameterLimit: parameterLimit
    })
  }
}

/**
 * Get the charset of a request.
 *
 * @param {object} req
 * @api private
 */

function getCharset (req) {
  try {
    return (contentType.parse(req).parameters.charset || '').toLowerCase()
  } catch (e) {
    return undefined
  }
}

/**
 * Count the number of parameters, stopping once limit reached
 *
 * @param {string} body
 * @param {number} limit
 * @api private
 */

function parameterCount (body, limit) {
  var count = 0
  var index = 0

  while ((index = body.indexOf('&', index)) !== -1) {
    count++
    index++

    if (count === limit) {
      return undefined
    }
  }

  return count
}

/**
 * Get parser for module name dynamically.
 *
 * @param {string} name
 * @return {function}
 * @api private
 */

function parser (name) {
  var mod = parsers[name]

  if (mod !== undefined) {
    return mod.parse
  }

  // this uses a switch for static require analysis
  switch (name) {
    case 'qs':
      mod = require('qs')
      break
    case 'querystring':
      mod = require('querystring')
      break
  }

  // store to prevent invoking require()
  parsers[name] = mod

  return mod.parse
}

/**
 * Get the simple query parser.
 *
 * @param {object} options
 */

function simpleparser (options) {
  var parameterLimit = options.parameterLimit !== undefined
    ? options.parameterLimit
    : 1000
  var parse = parser('querystring')

  if (isNaN(parameterLimit) || parameterLimit < 1) {
    throw new TypeError('option parameterLimit must be a positive number')
  }

  if (isFinite(parameterLimit)) {
    parameterLimit = parameterLimit | 0
  }

  return function queryparse (body) {
    var paramCount = parameterCount(body, parameterLimit)

    if (paramCount === undefined) {
      debug('too many parameters')
      throw createError(413, 'too many parameters', {
        type: 'parameters.too.many'
      })
    }

    debug('parse urlencoding')
    return parse(body, undefined, undefined, { maxKeys: parameterLimit })
  }
}

/**
 * Get the simple type checker.
 *
 * @param {string} type
 * @return {function}
 */

function typeChecker (type) {
  return function checkType (req) {
    return Boolean(typeis(req, type))
  }
}
56 node_modules/body-parser/package.json generated vendored Normal file
@@ -0,0 +1,56 @@
{
  "name": "body-parser",
  "description": "Node.js body parsing middleware",
  "version": "1.20.2",
  "contributors": [
    "Douglas Christopher Wilson <doug@somethingdoug.com>",
    "Jonathan Ong <me@jongleberry.com> (http://jongleberry.com)"
  ],
  "license": "MIT",
  "repository": "expressjs/body-parser",
  "dependencies": {
    "bytes": "3.1.2",
    "content-type": "~1.0.5",
    "debug": "2.6.9",
    "depd": "2.0.0",
    "destroy": "1.2.0",
    "http-errors": "2.0.0",
    "iconv-lite": "0.4.24",
    "on-finished": "2.4.1",
    "qs": "6.11.0",
    "raw-body": "2.5.2",
    "type-is": "~1.6.18",
    "unpipe": "1.0.0"
  },
  "devDependencies": {
    "eslint": "8.34.0",
    "eslint-config-standard": "14.1.1",
    "eslint-plugin-import": "2.27.5",
    "eslint-plugin-markdown": "3.0.0",
    "eslint-plugin-node": "11.1.0",
    "eslint-plugin-promise": "6.1.1",
    "eslint-plugin-standard": "4.1.0",
    "methods": "1.1.2",
    "mocha": "10.2.0",
    "nyc": "15.1.0",
    "safe-buffer": "5.2.1",
    "supertest": "6.3.3"
  },
  "files": [
    "lib/",
    "LICENSE",
    "HISTORY.md",
    "SECURITY.md",
    "index.js"
  ],
  "engines": {
    "node": ">= 0.8",
    "npm": "1.2.8000 || >= 1.4.16"
  },
  "scripts": {
    "lint": "eslint .",
    "test": "mocha --require test/support/env --reporter spec --check-leaks --bail test/",
    "test-ci": "nyc --reporter=lcov --reporter=text npm test",
    "test-cov": "nyc --reporter=html --reporter=text npm test"
  }
}
21 node_modules/brace-expansion/LICENSE generated vendored Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2013 Julian Gruber <julian@juliangruber.com>

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
129 node_modules/brace-expansion/README.md generated vendored Normal file
@@ -0,0 +1,129 @@
# brace-expansion

[Brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html),
as known from sh/bash, in JavaScript.

[![build status](https://secure.travis-ci.org/juliangruber/brace-expansion.svg)](http://travis-ci.org/juliangruber/brace-expansion)
[![downloads](https://img.shields.io/npm/dm/brace-expansion.svg)](https://www.npmjs.org/package/brace-expansion)
[![Greenkeeper badge](https://badges.greenkeeper.io/juliangruber/brace-expansion.svg)](https://greenkeeper.io/)

[![testling badge](https://ci.testling.com/juliangruber/brace-expansion.png)](https://ci.testling.com/juliangruber/brace-expansion)

## Example

```js
var expand = require('brace-expansion');

expand('file-{a,b,c}.jpg')
// => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg']

expand('-v{,,}')
// => ['-v', '-v', '-v']

expand('file{0..2}.jpg')
// => ['file0.jpg', 'file1.jpg', 'file2.jpg']

expand('file-{a..c}.jpg')
// => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg']

expand('file{2..0}.jpg')
// => ['file2.jpg', 'file1.jpg', 'file0.jpg']

expand('file{0..4..2}.jpg')
// => ['file0.jpg', 'file2.jpg', 'file4.jpg']

expand('file-{a..e..2}.jpg')
// => ['file-a.jpg', 'file-c.jpg', 'file-e.jpg']

expand('file{00..10..5}.jpg')
// => ['file00.jpg', 'file05.jpg', 'file10.jpg']

expand('{{A..C},{a..c}}')
// => ['A', 'B', 'C', 'a', 'b', 'c']

expand('ppp{,config,oe{,conf}}')
// => ['ppp', 'pppconfig', 'pppoe', 'pppoeconf']
```

## API

```js
var expand = require('brace-expansion');
```

### var expanded = expand(str)

Return an array of all possible and valid expansions of `str`. If none are
found, `[str]` is returned.

Valid expansions are:

```js
/^(.*,)+(.+)?$/
// {a,b,...}
```

A comma separated list of options, like `{a,b}` or `{a,{b,c}}` or `{,a,}`.

```js
/^-?\d+\.\.-?\d+(\.\.-?\d+)?$/
// {x..y[..incr]}
```

A numeric sequence from `x` to `y` inclusive, with optional increment.
If `x` or `y` start with a leading `0`, all the numbers will be padded
to have equal length. Negative numbers and backwards iteration work too.

```js
/^[a-zA-Z]\.\.[a-zA-Z](\.\.-?\d+)?$/
// {x..y[..incr]}
```

An alphabetic sequence from `x` to `y` inclusive, with optional increment.
`x` and `y` must be exactly one character, and if given, `incr` must be a
number.

For compatibility reasons, the string `${` is not eligible for brace expansion.

## Installation

With [npm](https://npmjs.org) do:

```bash
npm install brace-expansion
```

## Contributors

- [Julian Gruber](https://github.com/juliangruber)
- [Isaac Z. Schlueter](https://github.com/isaacs)

## Sponsors

This module is proudly supported by my [Sponsors](https://github.com/juliangruber/sponsors)!

Do you want to support modules like this to improve their quality, stability and weigh in on new features? Then please consider donating to my [Patreon](https://www.patreon.com/juliangruber). Not sure how much of my modules you're using? Try [feross/thanks](https://github.com/feross/thanks)!

## License

(MIT)

Copyright (c) 2013 Julian Gruber <julian@juliangruber.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
201 node_modules/brace-expansion/index.js generated vendored Normal file
@@ -0,0 +1,201 @@
var concatMap = require('concat-map');
var balanced = require('balanced-match');

module.exports = expandTop;

var escSlash = '\0SLASH'+Math.random()+'\0';
var escOpen = '\0OPEN'+Math.random()+'\0';
var escClose = '\0CLOSE'+Math.random()+'\0';
var escComma = '\0COMMA'+Math.random()+'\0';
var escPeriod = '\0PERIOD'+Math.random()+'\0';

function numeric(str) {
  return parseInt(str, 10) == str
    ? parseInt(str, 10)
    : str.charCodeAt(0);
}

function escapeBraces(str) {
  return str.split('\\\\').join(escSlash)
            .split('\\{').join(escOpen)
            .split('\\}').join(escClose)
            .split('\\,').join(escComma)
            .split('\\.').join(escPeriod);
}

function unescapeBraces(str) {
  return str.split(escSlash).join('\\')
            .split(escOpen).join('{')
            .split(escClose).join('}')
            .split(escComma).join(',')
            .split(escPeriod).join('.');
}


// Basically just str.split(","), but handling cases
// where we have nested braced sections, which should be
// treated as individual members, like {a,{b,c},d}
function parseCommaParts(str) {
  if (!str)
    return [''];

  var parts = [];
  var m = balanced('{', '}', str);

  if (!m)
    return str.split(',');

  var pre = m.pre;
  var body = m.body;
  var post = m.post;
  var p = pre.split(',');

  p[p.length-1] += '{' + body + '}';
  var postParts = parseCommaParts(post);
  if (post.length) {
    p[p.length-1] += postParts.shift();
    p.push.apply(p, postParts);
  }

  parts.push.apply(parts, p);

  return parts;
}

function expandTop(str) {
  if (!str)
    return [];

  // I don't know why Bash 4.3 does this, but it does.
  // Anything starting with {} will have the first two bytes preserved
  // but *only* at the top level, so {},a}b will not expand to anything,
  // but a{},b}c will be expanded to [a}c,abc].
  // One could argue that this is a bug in Bash, but since the goal of
  // this module is to match Bash's rules, we escape a leading {}
  if (str.substr(0, 2) === '{}') {
    str = '\\{\\}' + str.substr(2);
  }

  return expand(escapeBraces(str), true).map(unescapeBraces);
}

function identity(e) {
  return e;
}

function embrace(str) {
  return '{' + str + '}';
}
function isPadded(el) {
  return /^-?0\d/.test(el);
}

function lte(i, y) {
  return i <= y;
}
function gte(i, y) {
  return i >= y;
}

function expand(str, isTop) {
  var expansions = [];

  var m = balanced('{', '}', str);
  if (!m || /\$$/.test(m.pre)) return [str];

  var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body);
  var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body);
  var isSequence = isNumericSequence || isAlphaSequence;
  var isOptions = m.body.indexOf(',') >= 0;
  if (!isSequence && !isOptions) {
    // {a},b}
    if (m.post.match(/,.*\}/)) {
      str = m.pre + '{' + m.body + escClose + m.post;
      return expand(str);
    }
    return [str];
  }

  var n;
  if (isSequence) {
    n = m.body.split(/\.\./);
  } else {
    n = parseCommaParts(m.body);
    if (n.length === 1) {
      // x{{a,b}}y ==> x{a}y x{b}y
      n = expand(n[0], false).map(embrace);
      if (n.length === 1) {
        var post = m.post.length
          ? expand(m.post, false)
          : [''];
        return post.map(function(p) {
          return m.pre + n[0] + p;
        });
      }
    }
  }

  // at this point, n is the parts, and we know it's not a comma set
  // with a single entry.

  // no need to expand pre, since it is guaranteed to be free of brace-sets
  var pre = m.pre;
  var post = m.post.length
    ? expand(m.post, false)
    : [''];

  var N;

  if (isSequence) {
    var x = numeric(n[0]);
    var y = numeric(n[1]);
    var width = Math.max(n[0].length, n[1].length)
    var incr = n.length == 3
      ? Math.abs(numeric(n[2]))
      : 1;
    var test = lte;
    var reverse = y < x;
    if (reverse) {
      incr *= -1;
      test = gte;
    }
    var pad = n.some(isPadded);

    N = [];

    for (var i = x; test(i, y); i += incr) {
      var c;
      if (isAlphaSequence) {
        c = String.fromCharCode(i);
        if (c === '\\')
          c = '';
      } else {
        c = String(i);
        if (pad) {
          var need = width - c.length;
          if (need > 0) {
            var z = new Array(need + 1).join('0');
            if (i < 0)
              c = '-' + z + c.slice(1);
            else
              c = z + c;
          }
        }
      }
      N.push(c);
    }
  } else {
    N = concatMap(n, function(el) { return expand(el, false) });
  }

  for (var j = 0; j < N.length; j++) {
    for (var k = 0; k < post.length; k++) {
      var expansion = pre + N[j] + post[k];
      if (!isTop || isSequence || expansion)
        expansions.push(expansion);
    }
  }

  return expansions;
}
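The alphabetic-sequence branch of `expand()` above walks character codes between the two endpoints, flipping the comparison and increment sign for backwards ranges like `{c..a}`. A minimal standalone sketch of just that branch (illustrative only, not part of the module):

```javascript
// Illustrative sketch (assumption: mirrors the alpha-sequence loop in
// expand(), extracted from the module for clarity). Endpoints must be
// single characters; incr defaults to 1 and its sign follows direction.
function alphaRange(x, y, incr) {
  var from = x.charCodeAt(0);
  var to = y.charCodeAt(0);
  var step = Math.abs(incr || 1);
  if (from > to) step = -step; // backwards iteration, like {c..a}

  var out = [];
  for (var i = from; step > 0 ? i <= to : i >= to; i += step) {
    out.push(String.fromCharCode(i));
  }
  return out;
}

console.log(alphaRange('a', 'e', 2)); // [ 'a', 'c', 'e' ]
```

The real `expand()` additionally drops a literal `\` if it falls inside the range and interleaves each element with the expanded `pre`/`post` segments.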
47 node_modules/brace-expansion/package.json generated vendored Normal file
@@ -0,0 +1,47 @@
{
  "name": "brace-expansion",
  "description": "Brace expansion as known from sh/bash",
  "version": "1.1.11",
  "repository": {
    "type": "git",
    "url": "git://github.com/juliangruber/brace-expansion.git"
  },
  "homepage": "https://github.com/juliangruber/brace-expansion",
  "main": "index.js",
  "scripts": {
    "test": "tape test/*.js",
    "gentest": "bash test/generate.sh",
    "bench": "matcha test/perf/bench.js"
  },
  "dependencies": {
    "balanced-match": "^1.0.0",
    "concat-map": "0.0.1"
  },
  "devDependencies": {
    "matcha": "^0.7.0",
    "tape": "^4.6.0"
  },
  "keywords": [],
  "author": {
    "name": "Julian Gruber",
    "email": "mail@juliangruber.com",
    "url": "http://juliangruber.com"
  },
  "license": "MIT",
  "testling": {
    "files": "test/*.js",
    "browsers": [
      "ie/8..latest",
      "firefox/20..latest",
      "firefox/nightly",
      "chrome/25..latest",
      "chrome/canary",
      "opera/12..latest",
      "opera/next",
      "safari/5.1..latest",
      "ipad/6.0..latest",
      "iphone/6.0..latest",
      "android-browser/4.2..latest"
    ]
  }
}
97 node_modules/bytes/History.md generated vendored Normal file
@@ -0,0 +1,97 @@
3.1.2 / 2022-01-27
==================

  * Fix return value for un-parsable strings

3.1.1 / 2021-11-15
==================

  * Fix "thousandsSeparator" incorrectly formatting the fractional part

3.1.0 / 2019-01-22
==================

  * Add petabyte (`pb`) support

3.0.0 / 2017-08-31
==================

  * Change "kB" to "KB" in format output
  * Remove support for Node.js 0.6
  * Remove support for ComponentJS

2.5.0 / 2017-03-24
==================

  * Add option "unit"

2.4.0 / 2016-06-01
==================

  * Add option "unitSeparator"

2.3.0 / 2016-02-15
==================

  * Drop partial bytes on all parsed units
  * Fix non-finite numbers to `.format` to return `null`
  * Fix parsing byte string that looks like hex
  * perf: hoist regular expressions

2.2.0 / 2015-11-13
==================

  * add option "decimalPlaces"
  * add option "fixedDecimals"

2.1.0 / 2015-05-21
==================

  * add `.format` export
  * add `.parse` export

2.0.2 / 2015-05-20
==================

  * remove map recreation
  * remove unnecessary object construction

2.0.1 / 2015-05-07
==================

  * fix browserify require
  * remove node.extend dependency

2.0.0 / 2015-04-12
==================

  * add option "case"
  * add option "thousandsSeparator"
  * return "null" on invalid parse input
  * support proper round-trip: bytes(bytes(num)) === num
  * units no longer case sensitive when parsing

1.0.0 / 2014-05-05
==================

  * add negative support. fixes #6

0.3.0 / 2014-03-19
==================

  * added terabyte support

0.2.1 / 2013-04-01
==================

  * add .component

0.2.0 / 2012-10-28
==================

  * bytes(200).should.eql('200b')

0.1.0 / 2012-07-04
==================

  * add bytes to string conversion [yields]
23 node_modules/bytes/LICENSE generated vendored Normal file
@@ -0,0 +1,23 @@
(The MIT License)

Copyright (c) 2012-2014 TJ Holowaychuk <tj@vision-media.ca>
Copyright (c) 2015 Jed Watson <jed.watson@me.com>

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
152
node_modules/bytes/Readme.md
generated
vendored
Normal file
152
node_modules/bytes/Readme.md
generated
vendored
Normal file
|
|
@ -0,0 +1,152 @@
|
|||
# Bytes utility
|
||||
|
||||
[![NPM Version][npm-image]][npm-url]
|
||||
[![NPM Downloads][downloads-image]][downloads-url]
|
||||
[![Build Status][ci-image]][ci-url]
|
||||
[![Test Coverage][coveralls-image]][coveralls-url]
|
||||
|
||||
Utility to parse a string bytes (ex: `1TB`) to bytes (`1099511627776`) and vice-versa.
|
||||
|
||||
## Installation
|
||||
|
||||
This is a [Node.js](https://nodejs.org/en/) module available through the
|
||||
[npm registry](https://www.npmjs.com/). Installation is done using the
|
||||
[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally):
|
||||
|
||||
```bash
|
||||
$ npm install bytes
|
||||
```
|
||||
|
||||
## Usage
|
||||
|
||||
```js
|
||||
var bytes = require('bytes');
|
||||
```
|
||||
|
||||
#### bytes(number|string value, [options]): number|string|null
|
||||
|
||||
Default export function. Delegates to either `bytes.format` or `bytes.parse` based on the type of `value`.
|
||||
|
||||
**Arguments**
|
||||
|
||||
| Name | Type | Description |
|
||||
|---------|----------|--------------------|
|
||||
| value | `number`|`string` | Number value to format or string value to parse |
|
||||
| options | `Object` | Conversion options for `format` |
|
||||
|
||||
**Returns**
|
||||
|
||||
| Name | Type | Description |
|
||||
|---------|------------------|-------------------------------------------------|
|
||||
| results | `string`|`number`|`null` | Return null upon error. Numeric value in bytes, or string value otherwise. |
|
||||
|
||||
**Example**
|
||||
|
||||
```js
|
||||
bytes(1024);
|
||||
// output: '1KB'
|
||||
|
||||
bytes('1KB');
|
||||
// output: 1024
|
||||
```
|
||||
|
||||
#### bytes.format(number value, [options]): string|null
|
||||
|
||||
Format the given value in bytes into a string. If the value is negative, it is kept as such. If it is a float, it is
|
||||
rounded.
|
||||
|
||||
**Arguments**
|
||||
|
||||
| Name | Type | Description |
|
||||
|---------|----------|--------------------|
|
||||
| value | `number` | Value in bytes |
|
||||
| options | `Object` | Conversion options |
|
||||
|
||||
**Options**
|
||||
|
||||
| Property | Type | Description |
|
||||
|-------------------|--------|-----------------------------------------------------------------------------------------|
|
||||
| decimalPlaces | `number`|`null` | Maximum number of decimal places to include in output. Default value to `2`. |
|
||||
| fixedDecimals | `boolean`|`null` | Whether to always display the maximum number of decimal places. Default value to `false` |
|
||||
| thousandsSeparator | `string`|`null` | Example of values: `' '`, `','` and `'.'`... Default value to `''`. |
|
||||
| unit | `string`|`null` | The unit in which the result will be returned (B/KB/MB/GB/TB). Default value to `''` (which means auto detect). |
|
||||
| unitSeparator | `string`|`null` | Separator to use between number and unit. Default value to `''`. |
|
||||
|
||||
**Returns**
|
||||
|
||||
| Name | Type | Description |
|
||||
|---------|------------------|-------------------------------------------------|
|
||||
| results | `string`|`null` | Return null upon error. String value otherwise. |
|
||||
|
||||
**Example**

```js
bytes.format(1024);
// output: '1KB'

bytes.format(1000);
// output: '1000B'

bytes.format(1000, {thousandsSeparator: ' '});
// output: '1 000B'

bytes.format(1024 * 1.7, {decimalPlaces: 0});
// output: '2KB'

bytes.format(1024, {unitSeparator: ' '});
// output: '1 KB'
```
||||
#### bytes.parse(string|number value): number|null

Parse the string value into an integer in bytes. If no unit is given, or `value` is a number, it is assumed the value is in bytes.

Supported units and abbreviations are as follows and are case-insensitive:

* `b` for bytes
* `kb` for kilobytes
* `mb` for megabytes
* `gb` for gigabytes
* `tb` for terabytes
* `pb` for petabytes

The units are in powers of two, not ten. This means 1kb = 1024b according to this parser.
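Since the scaling is binary, the multipliers line up with JavaScript bit shifts — a standalone sketch of the arithmetic, not part of the package API:

```javascript
// Binary unit multipliers, matching this parser's 1kb = 1024b convention.
// Bit shifts cover b through gb; tb and pb exceed the 32-bit shift range,
// so Math.pow is used for them instead.
const KB = 1 << 10;           // 1024
const MB = 1 << 20;           // 1048576
const GB = 1 << 30;           // 1073741824
const TB = Math.pow(1024, 4); // 1099511627776

console.log(MB === KB * KB);   // true
console.log(TB === GB * 1024); // true
```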
**Arguments**

| Name  | Type               | Description                          |
|-------|--------------------|--------------------------------------|
| value | `string`\|`number` | String to parse, or number in bytes. |

**Returns**

| Name    | Type             | Description                                       |
|---------|------------------|---------------------------------------------------|
| results | `number`\|`null` | Return null upon error. Value in bytes otherwise. |

**Example**

```js
bytes.parse('1KB');
// output: 1024

bytes.parse('1024');
// output: 1024

bytes.parse(1024);
// output: 1024
```
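Note that `parse` floors the product rather than rounding it, so fractional inputs truncate downward. A minimal standalone re-derivation of that behavior (`parseBytes` is a hypothetical helper written to mirror the vendored `index.js` below, not the npm package's API):

```javascript
// Mirrors the parse logic from the vendored bytes index.js; the unit map is
// trimmed to b/kb/mb for brevity, and parseBytes is a name made up for this sketch.
const parseRegExp = /^((-|\+)?(\d+(?:\.\d+)?)) *(kb|mb|gb|tb|pb)$/i;
const map = { b: 1, kb: 1 << 10, mb: 1 << 20 };

function parseBytes(val) {
  const results = parseRegExp.exec(val);
  const floatValue = results ? parseFloat(results[1]) : parseInt(val, 10);
  const unit = results ? results[4].toLowerCase() : 'b';
  return Number.isNaN(floatValue) ? null : Math.floor(map[unit] * floatValue);
}

console.log(parseBytes('1.5kb'));    // 1536
console.log(parseBytes('1.0005KB')); // 1024 (1024.512 floored, not rounded)
```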
## License

[MIT](LICENSE)

[ci-image]: https://badgen.net/github/checks/visionmedia/bytes.js/master?label=ci
[ci-url]: https://github.com/visionmedia/bytes.js/actions?query=workflow%3Aci
[coveralls-image]: https://badgen.net/coveralls/c/github/visionmedia/bytes.js/master
[coveralls-url]: https://coveralls.io/r/visionmedia/bytes.js?branch=master
[downloads-image]: https://badgen.net/npm/dm/bytes
[downloads-url]: https://npmjs.org/package/bytes
[npm-image]: https://badgen.net/npm/v/bytes
[npm-url]: https://npmjs.org/package/bytes
170 node_modules/bytes/index.js generated vendored Normal file
@@ -0,0 +1,170 @@
/*!
 * bytes
 * Copyright(c) 2012-2014 TJ Holowaychuk
 * Copyright(c) 2015 Jed Watson
 * MIT Licensed
 */

'use strict';

/**
 * Module exports.
 * @public
 */

module.exports = bytes;
module.exports.format = format;
module.exports.parse = parse;

/**
 * Module variables.
 * @private
 */

var formatThousandsRegExp = /\B(?=(\d{3})+(?!\d))/g;

var formatDecimalsRegExp = /(?:\.0*|(\.[^0]+)0+)$/;

var map = {
  b:  1,
  kb: 1 << 10,
  mb: 1 << 20,
  gb: 1 << 30,
  tb: Math.pow(1024, 4),
  pb: Math.pow(1024, 5),
};

var parseRegExp = /^((-|\+)?(\d+(?:\.\d+)?)) *(kb|mb|gb|tb|pb)$/i;

/**
 * Convert the given value in bytes into a string, or parse the given string
 * into an integer in bytes.
 *
 * @param {string|number} value
 * @param {{
 *  case: [string],
 *  decimalPlaces: [number]
 *  fixedDecimals: [boolean]
 *  thousandsSeparator: [string]
 *  unitSeparator: [string]
 *  }} [options] bytes options.
 *
 * @returns {string|number|null}
 */

function bytes(value, options) {
  if (typeof value === 'string') {
    return parse(value);
  }

  if (typeof value === 'number') {
    return format(value, options);
  }

  return null;
}

/**
 * Format the given value in bytes into a string.
 *
 * If the value is negative, it is kept as such. If it is a float,
 * it is rounded.
 *
 * @param {number} value
 * @param {object} [options]
 * @param {number} [options.decimalPlaces=2]
 * @param {number} [options.fixedDecimals=false]
 * @param {string} [options.thousandsSeparator=]
 * @param {string} [options.unit=]
 * @param {string} [options.unitSeparator=]
 *
 * @returns {string|null}
 * @public
 */

function format(value, options) {
  if (!Number.isFinite(value)) {
    return null;
  }

  var mag = Math.abs(value);
  var thousandsSeparator = (options && options.thousandsSeparator) || '';
  var unitSeparator = (options && options.unitSeparator) || '';
  var decimalPlaces = (options && options.decimalPlaces !== undefined) ? options.decimalPlaces : 2;
  var fixedDecimals = Boolean(options && options.fixedDecimals);
  var unit = (options && options.unit) || '';

  if (!unit || !map[unit.toLowerCase()]) {
    if (mag >= map.pb) {
      unit = 'PB';
    } else if (mag >= map.tb) {
      unit = 'TB';
    } else if (mag >= map.gb) {
      unit = 'GB';
    } else if (mag >= map.mb) {
      unit = 'MB';
    } else if (mag >= map.kb) {
      unit = 'KB';
    } else {
      unit = 'B';
    }
  }

  var val = value / map[unit.toLowerCase()];
  var str = val.toFixed(decimalPlaces);

  if (!fixedDecimals) {
    str = str.replace(formatDecimalsRegExp, '$1');
  }

  if (thousandsSeparator) {
    str = str.split('.').map(function (s, i) {
      return i === 0
        ? s.replace(formatThousandsRegExp, thousandsSeparator)
        : s
    }).join('.');
  }

  return str + unitSeparator + unit;
}

/**
 * Parse the string value into an integer in bytes.
 *
 * If no unit is given, it is assumed the value is in bytes.
 *
 * @param {number|string} val
 *
 * @returns {number|null}
 * @public
 */

function parse(val) {
  if (typeof val === 'number' && !isNaN(val)) {
    return val;
  }

  if (typeof val !== 'string') {
    return null;
  }

  // Test if the string passed is valid
  var results = parseRegExp.exec(val);
  var floatValue;
  var unit = 'b';

  if (!results) {
    // Nothing could be extracted from the given string
    floatValue = parseInt(val, 10);
    unit = 'b';
  } else {
    // Retrieve the value and the unit
    floatValue = parseFloat(results[1]);
    unit = results[4].toLowerCase();
  }

  if (isNaN(floatValue)) {
    return null;
  }

  return Math.floor(map[unit] * floatValue);
}
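The `formatThousandsRegExp` used by `format()` above works on its own: the lookahead matches each position that is followed by a whole multiple of three digits, so the replacement inserts the separator without consuming any characters. A quick standalone check:

```javascript
// Zero-width match before every trailing group of three digits in the integer part.
const formatThousandsRegExp = /\B(?=(\d{3})+(?!\d))/g;

console.log('1000'.replace(formatThousandsRegExp, ' '));    // '1 000'
console.log('1234567'.replace(formatThousandsRegExp, ',')); // '1,234,567'
```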
42 node_modules/bytes/package.json generated vendored Normal file
@@ -0,0 +1,42 @@
{
  "name": "bytes",
  "description": "Utility to parse a string bytes to bytes and vice-versa",
  "version": "3.1.2",
  "author": "TJ Holowaychuk <tj@vision-media.ca> (http://tjholowaychuk.com)",
  "contributors": [
    "Jed Watson <jed.watson@me.com>",
    "Théo FIDRY <theo.fidry@gmail.com>"
  ],
  "license": "MIT",
  "keywords": [
    "byte",
    "bytes",
    "utility",
    "parse",
    "parser",
    "convert",
    "converter"
  ],
  "repository": "visionmedia/bytes.js",
  "devDependencies": {
    "eslint": "7.32.0",
    "eslint-plugin-markdown": "2.2.1",
    "mocha": "9.2.0",
    "nyc": "15.1.0"
  },
  "files": [
    "History.md",
    "LICENSE",
    "Readme.md",
    "index.js"
  ],
  "engines": {
    "node": ">= 0.8"
  },
  "scripts": {
    "lint": "eslint .",
    "test": "mocha --check-leaks --reporter spec",
    "test-ci": "nyc --reporter=lcov --reporter=text npm test",
    "test-cov": "nyc --reporter=html --reporter=text npm test"
  }
}
1 node_modules/call-bind/.eslintignore generated vendored Normal file
@@ -0,0 +1 @@
coverage/
16 node_modules/call-bind/.eslintrc generated vendored Normal file
@@ -0,0 +1,16 @@
{
  "root": true,

  "extends": "@ljharb",

  "rules": {
    "func-name-matching": 0,
    "id-length": 0,
    "new-cap": [2, {
      "capIsNewExceptions": [
        "GetIntrinsic",
      ],
    }],
    "no-magic-numbers": 0,
  },
}
6 .github/FUNDING.yml → node_modules/call-bind/.github/FUNDING.yml generated vendored
@@ -1,10 +1,10 @@
 # These are supported funding model platforms
 
-github: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]
-patreon: holyunblockerlts
+github: [ljharb]
+patreon: # Replace with a single Patreon username
 open_collective: # Replace with a single Open Collective username
 ko_fi: # Replace with a single Ko-fi username
-tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
+tidelift: npm/call-bind
 community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
 liberapay: # Replace with a single Liberapay username
 issuehunt: # Replace with a single IssueHunt username
9 node_modules/call-bind/.nycrc generated vendored Normal file
@@ -0,0 +1,9 @@
{
  "all": true,
  "check-coverage": false,
  "reporter": ["text-summary", "text", "html", "json"],
  "exclude": [
    "coverage",
    "test"
  ]
}
93 node_modules/call-bind/CHANGELOG.md generated vendored Normal file
@@ -0,0 +1,93 @@
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [v1.0.7](https://github.com/ljharb/call-bind/compare/v1.0.6...v1.0.7) - 2024-02-12

### Commits

- [Refactor] use `es-define-property` [`09b76a0`](https://github.com/ljharb/call-bind/commit/09b76a01634440461d44a80c9924ec4b500f3b03)
- [Deps] update `get-intrinsic`, `set-function-length` [`ad5136d`](https://github.com/ljharb/call-bind/commit/ad5136ddda2a45c590959829ad3dce0c9f4e3590)

## [v1.0.6](https://github.com/ljharb/call-bind/compare/v1.0.5...v1.0.6) - 2024-02-05

### Commits

- [Dev Deps] update `aud`, `npmignore`, `tape` [`d564d5c`](https://github.com/ljharb/call-bind/commit/d564d5ce3e06a19df4d499c77f8d1a9da44e77aa)
- [Deps] update `get-intrinsic`, `set-function-length` [`cfc2bdc`](https://github.com/ljharb/call-bind/commit/cfc2bdca7b633df0e0e689e6b637f668f1c6792e)
- [Refactor] use `es-errors`, so things that only need those do not need `get-intrinsic` [`64cd289`](https://github.com/ljharb/call-bind/commit/64cd289ae5862c250a4ca80aa8d461047c166af5)
- [meta] add missing `engines.node` [`32a4038`](https://github.com/ljharb/call-bind/commit/32a4038857b62179f7f9b7b3df2c5260036be582)

## [v1.0.5](https://github.com/ljharb/call-bind/compare/v1.0.4...v1.0.5) - 2023-10-19

### Commits

- [Fix] throw an error on non-functions as early as possible [`f262408`](https://github.com/ljharb/call-bind/commit/f262408f822c840fbc268080f3ad7c429611066d)
- [Deps] update `set-function-length` [`3fff271`](https://github.com/ljharb/call-bind/commit/3fff27145a1e3a76a5b74f1d7c3c43d0fa3b9871)

## [v1.0.4](https://github.com/ljharb/call-bind/compare/v1.0.3...v1.0.4) - 2023-10-19

## [v1.0.3](https://github.com/ljharb/call-bind/compare/v1.0.2...v1.0.3) - 2023-10-19

### Commits

- [actions] reuse common workflows [`a994df6`](https://github.com/ljharb/call-bind/commit/a994df69f401f4bf735a4ccd77029b85d1549453)
- [meta] use `npmignore` to autogenerate an npmignore file [`eef3ef2`](https://github.com/ljharb/call-bind/commit/eef3ef21e1f002790837fedb8af2679c761fbdf5)
- [readme] flesh out content [`1845ccf`](https://github.com/ljharb/call-bind/commit/1845ccfd9976a607884cfc7157c93192cc16cf22)
- [actions] use `node/install` instead of `node/run`; use `codecov` action [`5b47d53`](https://github.com/ljharb/call-bind/commit/5b47d53d2fd74af5ea0a44f1d51e503cd42f7a90)
- [Refactor] use `set-function-length` [`a0e165c`](https://github.com/ljharb/call-bind/commit/a0e165c5dc61db781cbc919b586b1c2b8da0b150)
- [Dev Deps] update `@ljharb/eslint-config`, `aud`, `tape` [`9c50103`](https://github.com/ljharb/call-bind/commit/9c50103f44137279a817317cf6cc421a658f85b4)
- [meta] simplify "exports" [`019c6d0`](https://github.com/ljharb/call-bind/commit/019c6d06b0e1246ceed8e579f57e44441cbbf6d9)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `aud`, `auto-changelog`, `safe-publish-latest`, `tape` [`23bd718`](https://github.com/ljharb/call-bind/commit/23bd718a288d3b03042062b4ef5153b3cea83f11)
- [actions] update codecov uploader [`62552d7`](https://github.com/ljharb/call-bind/commit/62552d79cc79e05825e99aaba134ae5b37f33da5)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `aud`, `auto-changelog`, `tape` [`ec81665`](https://github.com/ljharb/call-bind/commit/ec81665b300f87eabff597afdc8b8092adfa7afd)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `safe-publish-latest`, `tape` [`35d67fc`](https://github.com/ljharb/call-bind/commit/35d67fcea883e686650f736f61da5ddca2592de8)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `aud`, `tape` [`0266d8d`](https://github.com/ljharb/call-bind/commit/0266d8d2a45086a922db366d0c2932fa463662ff)
- [Dev Deps] update `@ljharb/eslint-config`, `aud`, `tape` [`43a5b28`](https://github.com/ljharb/call-bind/commit/43a5b28a444e710e1bbf92adb8afb5cf7523a223)
- [Deps] update `define-data-property`, `function-bind`, `get-intrinsic` [`780eb36`](https://github.com/ljharb/call-bind/commit/780eb36552514f8cc99c70821ce698697c2726a5)
- [Dev Deps] update `aud`, `tape` [`90d50ad`](https://github.com/ljharb/call-bind/commit/90d50ad03b061e0268b3380b0065fcaec183dc05)
- [meta] use `prepublishOnly` script for npm 7+ [`44c5433`](https://github.com/ljharb/call-bind/commit/44c5433b7980e02b4870007046407cf6fc543329)
- [Deps] update `get-intrinsic` [`86bfbfc`](https://github.com/ljharb/call-bind/commit/86bfbfcf34afdc6eabc93ce3d408548d0e27d958)
- [Deps] update `get-intrinsic` [`5c53354`](https://github.com/ljharb/call-bind/commit/5c5335489be0294c18cd7a8bb6e08226ee019ff5)
- [actions] update checkout action [`4c393a8`](https://github.com/ljharb/call-bind/commit/4c393a8173b3c8e5b30d5b3297b3b94d48bf87f3)
- [Deps] update `get-intrinsic` [`4e70bde`](https://github.com/ljharb/call-bind/commit/4e70bdec0626acb11616d66250fc14565e716e91)
- [Deps] update `get-intrinsic` [`55ae803`](https://github.com/ljharb/call-bind/commit/55ae803a920bd93c369cd798c20de31f91e9fc60)

## [v1.0.2](https://github.com/ljharb/call-bind/compare/v1.0.1...v1.0.2) - 2021-01-11

### Commits

- [Fix] properly include the receiver in the bound length [`dbae7bc`](https://github.com/ljharb/call-bind/commit/dbae7bc676c079a0d33c0a43e9ef92cb7b01345d)

## [v1.0.1](https://github.com/ljharb/call-bind/compare/v1.0.0...v1.0.1) - 2021-01-08

### Commits

- [Tests] migrate tests to Github Actions [`b6db284`](https://github.com/ljharb/call-bind/commit/b6db284c36f8ccd195b88a6764fe84b7223a0da1)
- [meta] do not publish github action workflow files [`ec7fe46`](https://github.com/ljharb/call-bind/commit/ec7fe46e60cfa4764ee943d2755f5e5a366e578e)
- [Fix] preserve original function’s length when possible [`adbceaa`](https://github.com/ljharb/call-bind/commit/adbceaa3cac4b41ea78bb19d7ccdbaaf7e0bdadb)
- [Tests] gather coverage data on every job [`d69e23c`](https://github.com/ljharb/call-bind/commit/d69e23cc65f101ba1d4c19bb07fa8eb0ec624be8)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `aud`, `tape` [`2fd3586`](https://github.com/ljharb/call-bind/commit/2fd3586c5d47b335364c14293114c6b625ae1f71)
- [Deps] update `get-intrinsic` [`f23e931`](https://github.com/ljharb/call-bind/commit/f23e9318cc271c2add8bb38cfded85ee7baf8eee)
- [Deps] update `get-intrinsic` [`72d9f44`](https://github.com/ljharb/call-bind/commit/72d9f44e184465ba8dd3fb48260bbcff234985f2)
- [meta] fix FUNDING.yml [`e723573`](https://github.com/ljharb/call-bind/commit/e723573438c5a68dcec31fb5d96ea6b7e4a93be8)
- [eslint] ignore coverage output [`15e76d2`](https://github.com/ljharb/call-bind/commit/15e76d28a5f43e504696401e5b31ebb78ee1b532)
- [meta] add Automatic Rebase and Require Allow Edits workflows [`8fa4dab`](https://github.com/ljharb/call-bind/commit/8fa4dabb23ba3dd7bb92c9571c1241c08b56e4b6)

## v1.0.0 - 2020-10-30

### Commits

- Initial commit [`306cf98`](https://github.com/ljharb/call-bind/commit/306cf98c7ec9e7ef66b653ec152277ac1381eb50)
- Tests [`e10d0bb`](https://github.com/ljharb/call-bind/commit/e10d0bbdadc7a10ecedc9a1c035112d3e368b8df)
- Implementation [`43852ed`](https://github.com/ljharb/call-bind/commit/43852eda0f187327b7fad2423ca972149a52bd65)
- npm init [`408f860`](https://github.com/ljharb/call-bind/commit/408f860b773a2f610805fd3613d0d71bac1b6249)
- [meta] add Automatic Rebase and Require Allow Edits workflows [`fb349b2`](https://github.com/ljharb/call-bind/commit/fb349b2e48defbec8b5ec8a8395cc8f69f220b13)
- [meta] add `auto-changelog` [`c4001fc`](https://github.com/ljharb/call-bind/commit/c4001fc43031799ef908211c98d3b0fb2b60fde4)
- [meta] add "funding"; create `FUNDING.yml` [`d4d6d29`](https://github.com/ljharb/call-bind/commit/d4d6d2974a14bc2e98830468eda7fe6d6a776717)
- [Tests] add `npm run lint` [`dedfb98`](https://github.com/ljharb/call-bind/commit/dedfb98bd0ecefb08ddb9a94061bd10cde4332af)
- Only apps should have lockfiles [`54ac776`](https://github.com/ljharb/call-bind/commit/54ac77653db45a7361dc153d2f478e743f110650)
- [meta] add `safe-publish-latest` [`9ea8e43`](https://github.com/ljharb/call-bind/commit/9ea8e435b950ce9b705559cd651039f9bf40140f)
21 node_modules/call-bind/LICENSE generated vendored Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2020 Jordan Harband

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Some files were not shown because too many files have changed in this diff.