Compare commits

...

86 Commits

Author SHA1 Message Date
Trivernis 0f2bd16d28
Merge pull request #23 from Trivernis/develop
Develop
3 years ago
trivernis 3eb11e03e8
Add chromium requirement to README
closes #21

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 659ca55e2c
Merge branch 'main' into develop 3 years ago
trivernis 7da9f5b338
Bump version
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 83734aa530
Merge pull request #22 from silentbat/main
add style.css to init
3 years ago
trivernis c7694d39e0
Change path of temporary file to be absolute
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 0495855d91
Fix problems with creating temp file for pdf rendering
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis cdbfe8c195
Update dependencies
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 0e608e255a
Add fetch feature to headless_chromium dep
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
silentbat 995a3b0582
add style.css to init
when calling `snekdown init`, you should get a default (empty) style.css file now
3 years ago
trivernis 8181119f4a
Update README badge links
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 0fda78bb96
Merge pull request #20 from Trivernis/develop
Reference anchor and improved README
3 years ago
Trivernis 3c9f41ce8c Update issue templates 3 years ago
trivernis 6ebc7cc766
Update Copyright in Source files
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 1e46274003
Add reference anchors and update README
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis e1e63cc35a
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis fd23a49e49
Merge pull request #19 from Trivernis/develop
Fix windows lineending
3 years ago
trivernis f709465eba
Fix windows lineending
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 31c4711881
Merge pull request #18 from Trivernis/develop
Fix mathjax include
3 years ago
trivernis 4c0d4c560c
Fix mathjax include
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis ccad1547ae
Merge pull request #17 from Trivernis/develop
Changes and fixes to whitespace behaviour
3 years ago
trivernis e50d73a880
Fix quote text alignment
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis d5df0b6f05
Change whitespace behaviour
A single linebreak will be ignored in plain text while a double line
break will be converted to a single linebreak. All following
linebreaks are taken as-is and rendered as normal linebreaks.
(Fixes #13)

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
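A minimal illustration of the behaviour this commit message describes, using ordinary document text:

```md
A single linebreak
inside plain text is ignored, so these two lines join into one.

A double linebreak, like the one above this line, is rendered as a single linebreak.
```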
trivernis 661a6e5a85
Add flag to print to stdout
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 673a7527d6
Merge pull request #15 from Trivernis/develop
Fixes to monospace and block elements
3 years ago
trivernis cac1cbe8d6
Bump version
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis e99b80ecf0
Fix block documents being ignored at EOF
Fixes that block elements are being ignored when written as the last
element of a file (Fixes #14)

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 8cf6dd33b7
Fix escapes in monospace and progressbar
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis d93fe6a57b
Merge pull request #11 from Trivernis/develop
HTML Metadata
3 years ago
trivernis d60e6aabd4
Merge branch 'develop' of github.com:/Trivernis/snekdown into develop 3 years ago
trivernis 8747c8c41a
Add metadata embedding in html
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis e24cc4b0f1
Merge pull request #9 from Trivernis/develop
Develop
4 years ago
Trivernis 796dc6ae34
Merge pull request #10 from Trivernis/actions
Change build to build for all features
4 years ago
trivernis 2e53e9e603
Change build to build for all features
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis f6e4bb86da
Hotfix pdf renderer
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 374c385f06
Increment Version
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
Trivernis d1e71e0204
Merge pull request #8 from Trivernis/develop
New Config and Themes
4 years ago
trivernis 245c908410
Fix nested anchors and placeholders
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis d0186cc90e
Fix toc not using plain text
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
Trivernis 4db1f27315
Merge pull request #7 from Trivernis/feature/themes
Feature/themes
4 years ago
trivernis d82c43e5a3
Add magic dark theme and rename formatting config to style
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis faa8e57ffa
Update Theme
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 097eae5f4e
Add theme config option
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
Trivernis 6b9a30c571
Merge pull request #6 from Trivernis/feature/hierarchical-config
Feature/hierarchical config
4 years ago
trivernis f2bfcd66a9
Merge branch 'main' of github.com:/Trivernis/snekdown into main 4 years ago
Trivernis a98a512b18
Merge pull request #5 from Trivernis/actions
Change build action to also trigger on develop
4 years ago
trivernis 1bfe95d08a
Change build action to also trigger on develop
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis ce311853cc
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis d84b0d86dd
Change settings format
Settings are now stored in a specific format defined in the settings
module. The Manifest.toml file will be loaded by default. Further
config files can be imported with the default file import syntax.

Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
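A hedged sketch of the workflow described above: Manifest.toml is picked up automatically, while further config files use the regular file import syntax. The file name and the `type=config` value below are assumptions based on the import types listed in the README diff further down:

```md
<[extra-config.toml][type=config]
```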
trivernis 31a4abff39
Add option to configure watch debounce time
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis b21a6ddb5d
Increment version
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
Trivernis 1b086ebdc1
Merge pull request #4 from Trivernis/feature/image-processing
Feature/image processing
4 years ago
trivernis 19bf960d74
Merge branch 'main' into feature/image-processing 4 years ago
Trivernis 9ebf7c1882
Merge pull request #3 from Trivernis/actions
Add build task for PRs
4 years ago
trivernis d4ae239c8a
Add build task for PRs
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 2b07fc39b2
Add huerotate option to image
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 9fb6664a63
Add inverting images and progressbar for image processing
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis b9cf095cfa
Add image filters
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 63ea60b10a
Add conversion of images when configured
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 3acfe9c6e2
Update Readme
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 53c4818f9d
Change input syntax and add cache storage handler
- input syntax changed to <subcommand> <options...>
- Added clear-cache subcommand
- CacheStorage now handles the caching of files

Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
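A rough sketch of the new `<subcommand> <options...>` call style; the file names are placeholders, `render` and `watch` appear in the README further down, and `clear-cache` is the subcommand added in this commit:

```sh
snekdown render document.md document.html
snekdown watch document.md document.html
snekdown clear-cache
```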
trivernis d42821a6eb
Fix image href pointing to non existent files
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis b379fcbea9
Fix default value for display-header-footer
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 293fb52aaa
Fix compile warning on pdf config constants
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis f2b154b7c3
Expose chromium pdf options to config
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
Trivernis 169e062d29
Merge pull request #2 from Trivernis/actions
Actions
4 years ago
trivernis 64910c031c
Add LICENSE file to task
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 93c1bc9c1d
Add mingw setup to build-and-release
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis b59a5af2d1
Add runs-on attribute
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis cbfe631ce8
Fix syntax error
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 14862990f2
Add build-and-release task and rust-toolchain file
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 9d1f6d0ac9
Update dependencies
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 944a9e03fc
Increment version
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis b599bf5e37
Fix warnings and update Readme
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis e5d021a571
Hide pdf behind features flag
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 2f16c69c39
Add chromium based pdf support
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 72d0e0a215
Resolve conflict between glossary and striked text parsing
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 04e1e30fef
Fix MathJax Integration
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis b14ba1b647
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 1a4ec92aff
Add SmartArrows
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
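A small example of the smart-arrow input sequences, taken from the arrow token constants visible in the tokens diff further down; how they are rendered in the output is not shown in this compare view:

```md
--> <-- <--> ==> <== <==>
```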
trivernis 0db3c62c57
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 6922320841
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis e8cdbc3b06
Add glossary implementation
Glossary entries can be defined with `~KEY` for the
short form and `~~KEY` for the long form.
If a glossary entry is referenced for the first time it will always be
rendered as the long form.
Glossary entries can be defined in a toml file (default is glossary.toml)
similar to bibliography.

Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
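For illustration, a minimal sketch of the glossary syntax described above; the `HTTP` key is hypothetical and the exact schema of glossary.toml is not part of this compare view:

```md
The ~~HTTP reference is the first one, so the long form is rendered.
Later references such as ~HTTP render the short form.
```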
trivernis 9424d04c37
Add option to turn off embedding of images
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis 2dba1025e8
Fix --no-cache to also not write to the cache dir
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago
trivernis dc884890d4
Add download cache and refactor parser creation
Signed-off-by: trivernis <trivernis@protonmail.com>
4 years ago

@ -0,0 +1,31 @@
---
name: Bug report
about: Create a report to help us improve
title: "[BUG]"
labels: bug
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Write...
2. Render...
3. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. Arch Linux]
- Architecture: [e.g. x86_64, ARM]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: "[FEATURE]"
labels: enhancement
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

@ -0,0 +1,52 @@
name: "Build and Release"
on:
push:
tags:
- "v*"
workflow_dispatch:
jobs:
build-release:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up toolchain
uses: actions-rs/toolchain@v1
with:
toolchain: nightly
override: true
- name: Set up MinGW
uses: egor-tensin/setup-mingw@v1
with:
platform: x64
- name: Cache cargo builds
uses: actions/cache@v2
with:
path: |
target
~/.cargo/
key: ${{ runner.os }}-cargo-${{ hashFiles('Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Build Release
uses: actions-rs/cargo@v1
with:
use-cross: false
command: build
args: --release --all-features -Zmultitarget --target x86_64-unknown-linux-gnu --target x86_64-pc-windows-gnu
- name: Move binaries
run: mv target/x86_64-unknown-linux-gnu/release/snekdown target/snekdown-linux-x86_64 && mv target/x86_64-pc-windows-gnu/release/snekdown.exe target/snekdown-windows-x86_64.exe
- name: Upload artifacts
uses: actions/upload-artifact@v2
with:
name: snekdown
path: target/snekdown*
- name: publish release
uses: "marvinpinto/action-automatic-releases@latest"
with:
repo_token: "${{ secrets.GITHUB_TOKEN }}"
prerelease: false
files: |
LICENSE
target/snekdown*

@ -0,0 +1,42 @@
name: Build and Test
on:
push:
branches: [ main, develop ]
pull_request:
branches: [ main, develop ]
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Cache build data
uses: actions/cache@v2
with:
path: |
target
~/.cargo/
key: ${{ runner.os }}-cargo-${{ hashFiles('Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Build
run: cargo build --verbose --all-features
- name: Run tests
run: cargo test --verbose --all-features
- name: Test init
run: cargo run -- init
- name: Test HTML
run: cargo run -- render README.md README.html --format html
- name: Test PDF
run: cargo run --all-features -- render README.md README.pdf --format pdf

Cargo.lock (generated): 2845 lines changed

File diff suppressed because it is too large.

@ -1,9 +1,9 @@
[package] [package]
name = "snekdown" name = "snekdown"
version = "0.26.6" version = "0.33.4"
authors = ["trivernis <trivernis@protonmail.com>"] authors = ["trivernis <trivernis@protonmail.com>"]
edition = "2018" edition = "2018"
license-file = "LICENSE" license = "GPL-3.0"
readme = "README.md" readme = "README.md"
description = "A parser for the custom snekdown markdown syntax" description = "A parser for the custom snekdown markdown syntax"
repository = "https://github.com/Trivernis/snekdown" repository = "https://github.com/Trivernis/snekdown"
@ -16,10 +16,14 @@ crate-type = ["lib"]
name = "snekdown" name = "snekdown"
path = "src/main.rs" path = "src/main.rs"
[features]
pdf = ["headless_chrome", "failure"]
[dependencies] [dependencies]
charred = "0.3.3" charred = "0.3.6"
asciimath-rs = "0.5.7" asciimath-rs = "0.5.7"
bibliographix = "0.5.0" bibliographix = "0.6.0"
crossbeam-utils = "0.7.2" crossbeam-utils = "0.7.2"
structopt = "0.3.14" structopt = "0.3.14"
minify = "1.1.1" minify = "1.1.1"
@ -32,9 +36,8 @@ colored = "1.9.3"
gh-emoji = "1.0.3" gh-emoji = "1.0.3"
notify = "4.0.12" notify = "4.0.12"
toml = "0.5.6" toml = "0.5.6"
serde ="1.0.111" serde = { version = "1.0.111", features = ["serde_derive"] }
serde_derive = "1.0.111" reqwest = { version = "0.10", features = ["blocking"] }
reqwest = {version = "0.10", features=["blocking"]}
mime_guess = "2.0.3" mime_guess = "2.0.3"
mime = "0.3.16" mime = "0.3.16"
base64 = "0.12.3" base64 = "0.12.3"
@ -42,4 +45,13 @@ rayon = "1.3.1"
maplit = "1.0.2" maplit = "1.0.2"
log = "0.4.11" log = "0.4.11"
env_logger = "0.7.1" env_logger = "0.7.1"
indicatif = "0.15.0" indicatif = "0.15.0"
platform-dirs = "0.2.0"
image = "0.23.12"
parking_lot = "0.11.1"
sha2 = "0.9.2"
config = "0.10.1"
rsass = "0.16.0"
headless_chrome = { version = "0.9.0", optional = true, features = ["fetch"] }
failure = { version = "0.1.8", optional = true }

@ -1,263 +1,101 @@
# ![](https://i.imgur.com/FpdXqiT.png) Snekdown - More than just Markdown ![](https://img.shields.io/discord/729250668162056313) <p align="center">
<img src="https://i.imgur.com/FpdXqiT.png">
</p>
<h1 align="center">Snekdown</h1>
<p align="center">
<i>More than just Markdown</i>
</p>
<p align="center">
<a href="https://github.com/Trivernis/snekdown/actions">
<img src="https://img.shields.io/github/workflow/status/trivernis/snekdown/Build%20and%20Test/main?style=for-the-badge">
</a>
<a href="https://crates.io/crates/snekdown">
<img src="https://img.shields.io/crates/v/snekdown?style=for-the-badge">
</a>
<a href="https://aur.archlinux.org/packages/snekdown">
<img src="https://img.shields.io/aur/version/snekdown?style=for-the-badge">
</a>
<a href="https://discord.gg/vGAXW9nxUv">
<img src="https://img.shields.io/discord/729250668162056313?style=for-the-badge">
</a>
<br/>
<br/>
<a href="https://trivernis.net/snekdown/">Documentation</a> |
<a href="https://github.com/Trivernis/snekdown/releases">Releases</a>
</p>
- - -
## Description
This projects goal is to implement a fast markdown parser with an extended syntax fitted This projects goal is to implement a fast markdown parser with an extended syntax fitted
for my needs. for my needs.
## Usage ## Core Features
```
USAGE:
snekdown [OPTIONS] <input> <output> [SUBCOMMAND]
FLAGS:
-h, --help Prints help information
-V, --version Prints version information
OPTIONS:
-f, --format <format> the output format [default: html]
ARGS:
<input> Path to the input file
<output> Path for the output file
SUBCOMMANDS:
help Prints this message or the help of the given subcommand(s)
render Default. Parse and render the document
watch Watch the document and its imports and render on change
```
## Syntax
### Images
```md
Simple Syntax
!(url)
Extended syntax with a description
![description](url)
Extended syntax with metadata to specify the size
![description](url)[metadata]
Extended syntax with metadata and no description
!(url)[metadata]
```
When generating the html file the images are base64 embedded.
### Quotes
```md
Simple (default) Syntax
> This is a quote
Multiline
> This is a
> Multiline Quote
Quote with metadata (e.g. Author)
[author=Trivernis year=2020 display='{{author}} - {{year}}']> This is a quote with metadata
```
### Imports
Imports can be used to import a different document to be attached to the main document.
Imports are parsed via multithreading.
```md
<[path]
<[document.md]
<[style.css][type=stylesheet]
```
The parser differentiates four different types of imported files.
- `document` - The default import which is just another snekdown document
- `stylesheet` - CSS Stylesheets that are inclued when rendering
- `bibliography` - A file including bibliography
- `config`/`manifest` - A config file that contains metadata
If no type is provided the parser guesses the type of file from the extension.
### Tables
Tables MUST start with a pipe character `|`
```md
Standalone header:
| header | header | header
Header with rows
| header | header | header
|--------|--------|-------
| row | row | row
```
### Placeholders
Placeholders can be used to insert special elements in a specific place.
Placeholders are always case insensitive.
```md - Imports
Insert the table of contents - Bibliography & Glossary
[[TOC]] - AsciiMath
- Placeholders
- Advanced Images
Insert the current date ## Prerequisites
[[date]]
Insert the current time - Google Chrome/Chromium (for PDF rendering)
[[time]]
```
### Metadata
Additional metadata can be provided for some elements.
```md
String value
[key = value]
String value
[key = "String value"]
Integer value
[key = 123]
Float value
[key = 1.23]
Boolean
[key]
Boolean
[key = false]
Placeholder
[key = [[placeholder]]]
```
Metadata can also be defined in a separate toml file with simple key-value pairs. ## Installation
Example:
```toml
# bibliography.bib.toml
[Meta]
author = "Snek"
published = "2020"
test-key = ["test value", "test value 2"]
[imports]
ignored-imports = ["style.css"] # those files won't get imported
included-stylesheets = ["style2.css"] # stylesheets that should be included
included-configs = [] # other metadata files that should be included
included-bibliography = ["nextbib.toml"]# bibliography that should be included
```
The `[Section]` keys are not relevant as the structure gets flattened before the values are read. ### Binaries
You can download prebuilt binaries on the [Releases](https://github.com/Trivernis/snekdown/releases) Page.
#### Usage
``` ### Arch Linux
Hide a section (including subsections) in the TOC
#[toc-hidden] Section
Set the size of an image Snekdown is available in [the AUR](https://aur.archlinux.org/packages/snekdown).
!(url)[width = 42% height=auto]
Set the source of a quote
[author=Me date=[[date]] display="{{author}} - {{date}}"]> It's me
Set options for placeholders
[[toc]][ordered]
```
### Centered Text ### Cargo
``` You need a working rust installation, for example by using [rustup](http://rustup.rs).
|| These two lines
|| are centered
```
### Inline ```sh
cargo install snekdown
```md
*Italic*
**Bold**
~Striked~
_Underlined_
^Superscript^
`Monospace`
:Emoji:
§[#0C0]Colored text§[] §[red] red §[]
``` ```
## Bibliography With pdf rendering
Bibliography entries can be defined and referenced anywhere in the document.
Definition: ```sh
```md cargo install snekdown --features pdf
[SD_BOOK]:[type=book, author=Snek, title = "Snekdown Book" date="20.08.2020", publisher=Snek]
[SD_GITHUB]: https://github.com/trivernis/snekdown
``` ```
Usage:
```
There is a book about snekdown[^book] and a github repo[^github].
```
Entries can also be defined in a separate toml file with the following data layout: ## Usage
```toml
# snekdown.toml
[BIB_KEY]
key = "value"
[SD_BOOK] Use `snekdown help` and `snekdown <subcommand> --help` for more information.
type = "book"
author = "Snek"
title = "Snekdown Book"
date = "20.08.2020"
publisher = "Snek"
[SD_GITHUB] ### Rendering
type = "website"
url = "https://github.com/trivernis/snekdown"
```
The valid types for entries and required fields can be found on in the [bibliographix README](https://github.com/Trivernis/bibliographix#bibliography-types-and-fields). `snekdown render <input> <output>`
Bibliography entries are not rendered. To render a list of used bibliography insert the ### Watching
`bib` placeholder at the place you want it to be rendered.
`snekdown watch <input> <output>`
## Math
Snekdown allows the embedding of [AsciiMath](http://asciimath.org/): ## Editors
The AsciiMath parser is provided in the [asciimath-rs](https://github.com/Trivernis/asciimath-rs) crate
```
inline math $$ a^2 + b^2 = c^2 $$
Block Math I've created a [VisualStudio Code extension](https://marketplace.visualstudio.com/items?itemName=trivernis.snekdown) for Snekdown.
$$$ This extension provides a preview of snekdown files, exports and other commands similar to the
A = [[1, 2],[3,4]] cli. The source code can be found [here](https://github.com/Trivernis/snekdown-vscode-extension).
$$$
```
The expression get's converted into MathML which is then converted by MathJax when loaded in
the browser.
## Roadmap ## Roadmap
The end goal is to have a markup language with features similar to LaTeX. The end goal is to have a markup language with features similar to LaTeX.
### Short Term
- [x] Checkboxes - [x] Checkboxes
- [x] Emojis (\:emoji:) - [x] Emojis (\:emoji:)
- [x] Colors - [x] Colors
@ -265,12 +103,23 @@ The end goal is to have a markup language with features similar to LaTeX.
- [x] Metadata files - [x] Metadata files
- [x] Bibliography - [x] Bibliography
- [x] Math - [x] Math
- [ ] Text sizes - [x] Glossary
- [ ] Title pages - [x] Chromium based pdf rendering
- [ ] Glossary - [x] Custom Stylesheets
- [x] Smart arrows
- [ ] Cross References - [ ] Cross References
- [ ] Figures - [ ] Figures
- [ ] EPUB Rendering (PDF is too hard) - [ ] EPUB Rendering
- [ ] Custom Elements via templates (50%) - [ ] Text sizes
- [x] Custom Stylesheets - [ ] Title pages
- [ ] Smart arrows
### Long Term
- Rewrite of the whole parsing process
- Custom Elements via templates
## License
This project is licensed under GPL 3.0. See LICENSE for more information.

@ -0,0 +1,6 @@
[toolchain]
channel = "nightly"
targets = [
"x86_64-unknown-linux-gnu",
"x86_64-pc-windows-gnu",
]

@ -1,18 +1,28 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod tokens; pub mod tokens;
use crate::format::PlaceholderTemplate; use crate::format::PlaceholderTemplate;
use crate::references::configuration::{ConfigRefEntry, Configuration, Value}; use crate::references::glossary::{GlossaryManager, GlossaryReference};
use crate::references::placeholders::ProcessPlaceholders; use crate::references::placeholders::ProcessPlaceholders;
use crate::references::templates::{Template, TemplateVariable}; use crate::references::templates::{Template, TemplateVariable};
use crate::settings::Settings;
use crate::utils::downloads::{DownloadManager, PendingDownload}; use crate::utils::downloads::{DownloadManager, PendingDownload};
use crate::utils::image_converting::{ImageConverter, PendingImage};
use asciimath_rs::elements::special::Expression; use asciimath_rs::elements::special::Expression;
use bibliographix::bib_manager::BibManager; use bibliographix::bib_manager::BibManager;
use bibliographix::bibliography::bibliography_entry::BibliographyEntryReference; use bibliographix::bibliography::bibliography_entry::BibliographyEntryReference;
use bibliographix::references::bib_reference::BibRefAnchor; use bibliographix::references::bib_reference::BibRefAnchor;
use image::ImageFormat;
use mime::Mime;
use parking_lot::Mutex;
use std::collections::HashMap; use std::collections::HashMap;
use std::iter::FromIterator;
use std::sync::atomic::{AtomicBool, Ordering}; use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{Arc, Mutex, RwLock}; use std::sync::{Arc, RwLock};
pub const SECTION: &str = "section"; pub const SECTION: &str = "section";
pub const PARAGRAPH: &str = "paragraph"; pub const PARAGRAPH: &str = "paragraph";
@ -69,10 +79,12 @@ pub struct Document {
pub(crate) is_root: bool, pub(crate) is_root: bool,
pub(crate) path: Option<String>, pub(crate) path: Option<String>,
pub(crate) placeholders: Vec<Arc<RwLock<Placeholder>>>, pub(crate) placeholders: Vec<Arc<RwLock<Placeholder>>>,
pub config: Configuration, pub config: Arc<Mutex<Settings>>,
pub bibliography: BibManager, pub bibliography: BibManager,
pub downloads: Arc<Mutex<DownloadManager>>, pub downloads: Arc<Mutex<DownloadManager>>,
pub images: Arc<Mutex<ImageConverter>>,
pub stylesheets: Vec<Arc<Mutex<PendingDownload>>>, pub stylesheets: Vec<Arc<Mutex<PendingDownload>>>,
pub glossary: Arc<Mutex<GlossaryManager>>,
} }
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
@ -177,9 +189,12 @@ pub enum Inline {
Colored(Colored), Colored(Colored),
Math(Math), Math(Math),
BibReference(Arc<RwLock<BibReference>>), BibReference(Arc<RwLock<BibReference>>),
GlossaryReference(Arc<Mutex<GlossaryReference>>),
TemplateVar(Arc<RwLock<TemplateVariable>>), TemplateVar(Arc<RwLock<TemplateVariable>>),
CharacterCode(CharacterCode), CharacterCode(CharacterCode),
LineBreak, LineBreak,
Arrow(Arrow),
Anchor(Anchor),
} }
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
@ -232,7 +247,7 @@ pub struct Url {
pub struct Image { pub struct Image {
pub(crate) url: Url, pub(crate) url: Url,
pub(crate) metadata: Option<InlineMetadata>, pub(crate) metadata: Option<InlineMetadata>,
pub(crate) download: Arc<Mutex<PendingDownload>>, pub(crate) image_data: Arc<Mutex<PendingImage>>,
} }
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
@ -244,7 +259,7 @@ pub struct Placeholder {
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
pub struct RefLink { pub struct RefLink {
pub(crate) description: Box<Line>, pub(crate) description: TextLine,
pub(crate) reference: String, pub(crate) reference: String,
} }
@ -286,36 +301,48 @@ pub struct CharacterCode {
pub(crate) code: String, pub(crate) code: String,
} }
#[derive(Clone, Debug)]
pub enum Arrow {
RightArrow,
LeftArrow,
LeftRightArrow,
BigRightArrow,
BigLeftArrow,
BigLeftRightArrow,
}
// implementations // implementations
impl Document { impl Document {
pub fn new(is_root: bool) -> Self { /// Creates a new parent document
pub fn new() -> Self {
Self { Self {
elements: Vec::new(), elements: Vec::new(),
is_root, is_root: true,
path: None, path: None,
placeholders: Vec::new(), placeholders: Vec::new(),
config: Configuration::default(), config: Arc::new(Mutex::new(Settings::default())),
bibliography: BibManager::new(), bibliography: BibManager::new(),
stylesheets: Vec::new(), stylesheets: Vec::new(),
downloads: Arc::new(Mutex::new(DownloadManager::new())), downloads: Arc::new(Mutex::new(DownloadManager::new())),
images: Arc::new(Mutex::new(ImageConverter::new())),
glossary: Arc::new(Mutex::new(GlossaryManager::new())),
} }
} }
pub fn new_with_manager( /// Creates a new child document
is_root: bool, pub fn create_child(&self) -> Self {
bibliography: BibManager,
downloads: Arc<Mutex<DownloadManager>>,
) -> Self {
Self { Self {
elements: Vec::new(), elements: Vec::new(),
is_root, is_root: false,
path: None, path: None,
placeholders: Vec::new(), placeholders: Vec::new(),
config: Configuration::default(), config: self.config.clone(),
bibliography, bibliography: self.bibliography.create_child(),
stylesheets: Vec::new(), stylesheets: Vec::new(),
downloads, downloads: Arc::clone(&self.downloads),
images: Arc::clone(&self.images),
glossary: Arc::clone(&self.glossary),
} }
} }
@ -332,7 +359,7 @@ impl Document {
list.ordered = ordered; list.ordered = ordered;
self.elements.iter().for_each(|e| match e { self.elements.iter().for_each(|e| match e {
Block::Section(sec) => { Block::Section(sec) => {
if !sec.get_hide_in_toc() { if !sec.is_hidden_in_toc() {
let mut item = let mut item =
ListItem::new(Line::RefLink(sec.header.get_anchor()), 1, ordered); ListItem::new(Line::RefLink(sec.header.get_anchor()), 1, ordered);
item.children.append(&mut sec.get_toc_list(ordered).items); item.children.append(&mut sec.get_toc_list(ordered).items);
@ -413,9 +440,42 @@ impl Document {
if self.is_root { if self.is_root {
self.process_definitions(); self.process_definitions();
self.bibliography.assign_entries_to_references(); self.bibliography.assign_entries_to_references();
self.glossary.lock().assign_entries_to_references();
self.process_placeholders(); self.process_placeholders();
self.process_media();
} }
} }
fn process_media(&self) {
let downloads = Arc::clone(&self.downloads);
if self.config.lock().features.embed_external {
downloads.lock().download_all();
}
if let Some(s) = &self.config.lock().images.format {
if let Some(format) = ImageFormat::from_extension(s) {
self.images.lock().set_target_format(format);
}
}
let mut image_width = 0;
let mut image_height = 0;
if let Some(i) = self.config.lock().images.max_width {
image_width = i;
image_height = i;
}
if let Some(i) = self.config.lock().images.max_height {
image_height = i;
if image_width <= 0 {
image_width = i;
}
}
if image_width > 0 && image_height > 0 {
self.images
.lock()
.set_target_size((image_width as u32, image_height as u32));
}
self.images.lock().convert_all();
}
} }
impl Section { impl Section {
@ -433,9 +493,10 @@ impl Section {
pub fn get_toc_list(&self, ordered: bool) -> List { pub fn get_toc_list(&self, ordered: bool) -> List {
let mut list = List::new(); let mut list = List::new();
self.elements.iter().for_each(|e| { self.elements.iter().for_each(|e| {
if let Block::Section(sec) = e { if let Block::Section(sec) = e {
if !sec.get_hide_in_toc() { if !sec.is_hidden_in_toc() {
let mut item = let mut item =
ListItem::new(Line::RefLink(sec.header.get_anchor()), 1, ordered); ListItem::new(Line::RefLink(sec.header.get_anchor()), 1, ordered);
item.children.append(&mut sec.get_toc_list(ordered).items); item.children.append(&mut sec.get_toc_list(ordered).items);
@ -447,7 +508,7 @@ impl Section {
list list
} }
pub(crate) fn get_hide_in_toc(&self) -> bool { pub(crate) fn is_hidden_in_toc(&self) -> bool {
if let Some(meta) = &self.metadata { if let Some(meta) = &self.metadata {
meta.get_bool("toc-hidden") meta.get_bool("toc-hidden")
} else { } else {
@ -503,7 +564,7 @@ impl Header {
pub fn get_anchor(&self) -> RefLink { pub fn get_anchor(&self) -> RefLink {
RefLink { RefLink {
description: Box::new(self.line.clone()), description: self.line.as_raw_text().as_plain_line(),
reference: self.anchor.clone(), reference: self.anchor.clone(),
} }
} }
@ -559,6 +620,16 @@ impl TextLine {
pub fn add_subtext(&mut self, subtext: Inline) { pub fn add_subtext(&mut self, subtext: Inline) {
self.subtext.push(subtext) self.subtext.push(subtext)
} }
pub fn as_plain_line(&self) -> TextLine {
TextLine {
subtext: self
.subtext
.iter()
.map(|s| Inline::Plain(s.as_plain_text()))
.collect(),
}
}
} }
impl Table { impl Table {
@ -601,6 +672,15 @@ impl Quote {
pub fn add_text(&mut self, text: TextLine) { pub fn add_text(&mut self, text: TextLine) {
self.text.push(text) self.text.push(text)
} }
/// Strips a single linebreak from the end of the quote
pub fn strip_linebreak(&mut self) {
if let Some(last) = self.text.last_mut() {
if let Some(Inline::LineBreak) = last.subtext.last() {
last.subtext.pop();
}
}
}
} }
impl ImportAnchor { impl ImportAnchor {
@ -636,6 +716,8 @@ impl Placeholder {
pub trait Metadata { pub trait Metadata {
fn get_bool(&self, key: &str) -> bool; fn get_bool(&self, key: &str) -> bool;
fn get_string(&self, key: &str) -> Option<String>; fn get_string(&self, key: &str) -> Option<String>;
fn get_float(&self, key: &str) -> Option<f64>;
fn get_integer(&self, key: &str) -> Option<i64>;
fn get_string_map(&self) -> HashMap<String, String>; fn get_string_map(&self) -> HashMap<String, String>;
} }
@ -656,6 +738,24 @@ impl Metadata for InlineMetadata {
} }
} }
fn get_float(&self, key: &str) -> Option<f64> {
if let Some(MetadataValue::Float(f)) = self.data.get(key) {
Some(*f)
} else if let Some(MetadataValue::Integer(i)) = self.data.get(key) {
Some(*i as f64)
} else {
None
}
}
fn get_integer(&self, key: &str) -> Option<i64> {
if let Some(MetadataValue::Integer(i)) = self.data.get(key) {
Some(*i)
} else {
None
}
}
fn get_string_map(&self) -> HashMap<String, String> { fn get_string_map(&self) -> HashMap<String, String> {
let mut string_map = HashMap::new(); let mut string_map = HashMap::new();
for (k, v) in &self.data { for (k, v) in &self.data {
@ -672,26 +772,17 @@ impl Metadata for InlineMetadata {
} }
} }
impl Into<HashMap<String, Value>> for InlineMetadata {
fn into(self) -> HashMap<String, Value> {
HashMap::from_iter(self.data.iter().filter_map(|(k, v)| match v {
MetadataValue::String(s) => Some((k.clone(), Value::String(s.clone()))),
MetadataValue::Bool(b) => Some((k.clone(), Value::Bool(*b))),
MetadataValue::Integer(i) => Some((k.clone(), Value::Integer(*i))),
MetadataValue::Float(f) => Some((k.clone(), Value::Float(*f))),
MetadataValue::Template(t) => Some((k.clone(), Value::Template(t.clone()))),
_ => None,
}))
}
}
impl Image { impl Image {
pub fn get_content(&self) -> Option<Vec<u8>> { pub fn get_content(&self) -> Option<Vec<u8>> {
let mut data = None; let mut data = None;
std::mem::swap(&mut data, &mut self.download.lock().unwrap().data); std::mem::swap(&mut data, &mut self.image_data.lock().data);
data data
} }
pub fn get_mime_type(&self) -> Mime {
self.image_data.lock().mime.clone()
}
} }
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
@ -704,15 +795,11 @@ pub struct BibEntry {
pub struct BibReference { pub struct BibReference {
pub(crate) key: String, pub(crate) key: String,
pub(crate) entry_anchor: Arc<Mutex<BibRefAnchor>>, pub(crate) entry_anchor: Arc<Mutex<BibRefAnchor>>,
pub(crate) display: Option<ConfigRefEntry>, pub(crate) display: Option<String>,
} }
impl BibReference { impl BibReference {
pub fn new( pub fn new(key: String, display: Option<String>, anchor: Arc<Mutex<BibRefAnchor>>) -> Self {
key: String,
display: Option<ConfigRefEntry>,
anchor: Arc<Mutex<BibRefAnchor>>,
) -> Self {
Self { Self {
key: key.to_string(), key: key.to_string(),
display, display,
@ -721,12 +808,11 @@ impl BibReference {
} }
pub(crate) fn get_formatted(&self) -> String { pub(crate) fn get_formatted(&self) -> String {
if let Some(entry) = &self.entry_anchor.lock().unwrap().entry { if let Some(entry) = &self.entry_anchor.lock().entry {
let entry = entry.lock().unwrap(); let entry = entry.lock();
if let Some(display) = &self.display { if let Some(display) = &self.display {
let display = display.read().unwrap(); let mut template = PlaceholderTemplate::new(display.clone());
let mut template = PlaceholderTemplate::new(display.get().as_string());
let mut value_map = HashMap::new(); let mut value_map = HashMap::new();
value_map.insert("key".to_string(), entry.key()); value_map.insert("key".to_string(), entry.key());
@ -755,3 +841,71 @@ impl MetadataValue {
} }
} }
} }
impl Line {
pub fn as_raw_text(&self) -> TextLine {
match self {
Line::Text(t) => t.clone(),
Line::Ruler(_) => TextLine::new(),
Line::RefLink(r) => r.description.clone(),
Line::Anchor(a) => a.inner.as_raw_text().as_plain_line(),
Line::Centered(c) => c.line.clone(),
Line::BibEntry(_) => TextLine::new(),
}
}
}
impl Inline {
pub fn as_plain_text(&self) -> PlainText {
match self {
Inline::Plain(p) => p.clone(),
Inline::Bold(b) => b.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Italic(i) => i.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Underlined(u) => u.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Striked(s) => s.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Monospace(m) => PlainText {
value: m.value.clone(),
},
Inline::Superscript(s) => s.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Colored(c) => c.value.as_plain_text(),
_ => PlainText {
value: String::new(),
},
}
}
}

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
#![allow(unused)] #![allow(unused)]
pub(crate) const BACKSLASH: char = '\\'; pub(crate) const BACKSLASH: char = '\\';
@ -33,7 +39,9 @@ pub(crate) const L_BRACE: char = '}';
pub(crate) const PERCENT: char = '%'; pub(crate) const PERCENT: char = '%';
pub(crate) const COMMA: char = ','; pub(crate) const COMMA: char = ',';
pub(crate) const MATH: char = '$'; pub(crate) const MATH: char = '$';
pub(crate) const DOLLAR: char = '$';
pub(crate) const AMPERSAND: char = '&'; pub(crate) const AMPERSAND: char = '&';
pub(crate) const QUESTION_MARK: char = '?';
// aliases // aliases
@ -69,16 +77,39 @@ pub(crate) const TEMPLATE: char = PERCENT;
pub(crate) const ITALIC: char = ASTERISK; pub(crate) const ITALIC: char = ASTERISK;
pub(crate) const MONOSPACE: char = BACKTICK; pub(crate) const MONOSPACE: char = BACKTICK;
pub(crate) const STRIKED: char = TILDE; pub(crate) const STRIKED: &'static [char] = &[TILDE, TILDE];
pub(crate) const UNDERLINED: char = UNDERSCR; pub(crate) const UNDERLINED: char = UNDERSCR;
pub(crate) const SUPER: char = UP; pub(crate) const SUPER: char = UP;
pub(crate) const EMOJI: char = COLON; pub(crate) const EMOJI: char = COLON;
pub(crate) const MATH_INLINE: &'static [char] = &[MATH, MATH]; pub(crate) const MATH_INLINE: &'static [char] = &[MATH, MATH];
pub(crate) const BOLD: [char; 2] = [ASTERISK, ASTERISK]; pub(crate) const BOLD: &'static [char] = &[ASTERISK, ASTERISK];
pub(crate) const CHARACTER_START: char = AMPERSAND; pub(crate) const CHARACTER_START: char = AMPERSAND;
pub(crate) const CHARACTER_STOP: char = SEMICOLON; pub(crate) const CHARACTER_STOP: char = SEMICOLON;
pub(crate) const GLOSSARY_REF_START: char = TILDE;
// Reference Anchors
pub(crate) const ANCHOR_START: &'static [char] = &[R_BRACKET, QUESTION_MARK];
pub(crate) const ANCHOR_STOP: char = L_BRACKET;
// References
pub(crate) const REF_START: &'static [char] = &[R_BRACKET, DOLLAR];
pub(crate) const REF_STOP: char = L_BRACKET;
pub(crate) const REF_DESC_START: char = R_PARENTH;
pub(crate) const REF_DESC_STOP: char = L_PARENTH;
// Arrows
pub(crate) const A_RIGHT_ARROW: &'static [char] = &['-', '-', '>'];
pub(crate) const A_LEFT_ARROW: &'static [char] = &['<', '-', '-'];
pub(crate) const A_LEFT_RIGHT_ARROW: &'static [char] = &['<', '-', '-', '>'];
pub(crate) const A_BIG_RIGHT_ARROW: &'static [char] = &['=', '=', '>'];
pub(crate) const A_BIG_LEFT_ARROW: &'static [char] = &['<', '=', '='];
pub(crate) const A_BIG_LEFT_RIGHT_ARROW: &'static [char] = &['<', '=', '=', '>'];
// groups // groups
pub(crate) const QUOTES: [char; 2] = [SINGLE_QUOTE, DOUBLE_QUOTE]; pub(crate) const QUOTES: [char; 2] = [SINGLE_QUOTE, DOUBLE_QUOTE];
@ -111,11 +142,22 @@ pub(crate) const INLINE_SPECIAL_CHARS: &'static [char] = &[
MATH, MATH,
]; ];
pub(crate) const INLINE_SPECIAL_SEQUENCES: &'static [&'static [char]] = &[
A_BIG_LEFT_RIGHT_ARROW,
A_BIG_LEFT_ARROW,
A_BIG_RIGHT_ARROW,
A_RIGHT_ARROW,
A_LEFT_ARROW,
A_LEFT_RIGHT_ARROW,
ANCHOR_START,
REF_START,
];
pub(crate) const LIST_SPECIAL_CHARS: [char; 14] = [ pub(crate) const LIST_SPECIAL_CHARS: [char; 14] = [
MINUS, PLUS, ASTERISK, O, '1', '2', '3', '4', '5', '6', '7', '8', '9', '0', MINUS, PLUS, ASTERISK, O, '1', '2', '3', '4', '5', '6', '7', '8', '9', '0',
]; ];
pub(crate) const WHITESPACE: [char; 4] = [' ', '\t', '\r', '\n']; pub(crate) const WHITESPACE: &[char] = &[' ', '\t', '\r', '\n'];
pub(crate) const INLINE_WHITESPACE: [char; 3] = [' ', '\t', '\r']; pub(crate) const INLINE_WHITESPACE: [char; 3] = [' ', '\t', '\r'];
// sequences // sequences

@ -0,0 +1,167 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
body {
background-color: $body-background;
overflow-x: hidden;
color: $primary-color;
word-break: break-word;
}
.content {
font-family: "Fira Sans", "Noto Sans", SansSerif, sans-serif;
width: 100vh;
max-width: calc(100% - 4rem);
padding: 2rem;
margin: auto;
background-color: $background-color;
}
h1 {
font-size: 2.2rem;
}
h2 {
font-size: 1.8rem;
}
h3 {
font-size: 1.4rem;
}
h4 {
font-size: 1rem;
}
h5 {
font-size: 0.8rem;
}
h6 {
font-size: 0.4rem;
}
img {
max-width: 100%;
max-height: 100vh;
height: auto;
}
code {
color: $primary-color;
pre {
font-family: "Fira Code", "Mono", monospace;
padding: 0.8em 0.2em;
background-color: $code-background !important;
border-radius: 0.25em;
overflow: auto;
}
&.inlineCode {
font-family: "Fira Code", monospace;
border-radius: 0.1em;
background-color: $code-background;
padding: 0 0.1em;
}
}
.tableWrapper {
overflow-x: auto;
width: 100%;
& > table {
margin: auto;
}
}
table {
border-collapse: collapse;
tr {
&:nth-child(odd) {
background-color: $table-background-alt;
}
&:nth-child(1) {
background-color: $table-background-alt;
font-weight: bold;
border-bottom: 1px solid invert($background-color)
}
}
}
table td, table th {
border-left: 1px solid invert($background-color);
padding: 0.2em 0.5em;
}
table tr td:first-child, table tr th:first-child {
border-left: none;
}
blockquote {
margin-left: 0;
padding-top: 0.2em;
padding-bottom: 0.2em;
background-color: rgba(0, 0, 0, 0);
}
a {
color: $secondary-color;
}
.quote {
border-left: 0.3em solid $quote-background-alt;
border-radius: 0.2em;
padding-left: 1em;
margin-left: 0;
background-color: $quote-background;
.metadata {
font-style: italic;
padding-left: 0.5em;
color: $primary-variant-1;
}
}
.figure {
width: 100%;
display: block;
text-align: center;
.imageDescription {
display: block;
color: $primary-variant-1;
font-style: italic;
}
}
.centered {
text-align: center;
}
.glossaryReference {
text-decoration: none;
color: inherit;
border-bottom: 1px dotted $primary-color;
}
.arrow {
font-family: "Fira Code", "Mono", monospace;
}
@media print {
.content > section > section, .content > section > section {
page-break-inside: avoid;
}
body {
background-color: $background-color !important;
}
}

@ -0,0 +1,20 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #1e1d2c;
$background-color-variant-1: lighten($background-color, 7%);
$background-color-variant-2: lighten($background-color, 14%);
$background-color-variant-3: lighten($background-color, 21%);
$primary-color: #EEE;
$primary-variant-1: darken($primary-color, 14%);
$secondary-color: #3aa7df;
$body-background: darken($background-color, 5%);
$code-background: lighten($background-color, 5%);
$table-background-alt: $background-color-variant-2;
$quote-background: lighten($background-color-variant-1, 3%);
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,20 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: darken(#2b303b, 8%);
$background-color-variant-1: lighten($background-color, 7%);
$background-color-variant-2: lighten($background-color, 14%);
$background-color-variant-3: lighten($background-color, 21%);
$primary-color: #EEE;
$primary-variant-1: darken($primary-color, 14%);
$secondary-color: #3aa7df;
$body-background: darken($background-color, 5%);
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: lighten($background-color-variant-1, 3%);
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: darken(#002b36, 5%);
$background-color-variant-1: lighten($background-color, 7%);
$background-color-variant-2: lighten($background-color, 14%);
$background-color-variant-3: lighten($background-color, 21%);
$primary-color: #EEE;
$primary-variant-1: darken($primary-color, 14%);
$secondary-color: #0096c9;
$body-background: darken($background-color, 3%);
$code-background: $background-color-variant-1;
$table-background-alt: lighten($background-color, 10%);
$quote-background: lighten($background-color-variant-1, 3%);
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #FFF;
$background-color-variant-1: darken($background-color, 7%);
$background-color-variant-2: darken($background-color, 14%);
$background-color-variant-3: darken($background-color, 21%);
$primary-color: #000;
$primary-variant-1: lighten($primary-color, 14%);
$secondary-color: #00286a;
$body-background: $background-color-variant-1;
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: $background-color-variant-2;
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #FFF;
$background-color-variant-1: darken($background-color, 7%);
$background-color-variant-2: darken($background-color, 14%);
$background-color-variant-3: darken($background-color, 21%);
$primary-color: #112;
$primary-variant-1: lighten($primary-color, 14%);
$secondary-color: #00348e;
$body-background: $background-color-variant-1;
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: $background-color-variant-2;
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #fff8f0;
$background-color-variant-1: darken($background-color, 4%);
$background-color-variant-2: darken($background-color, 8%);
$background-color-variant-3: darken($background-color, 12%);
$primary-color: #112;
$primary-variant-1: lighten($primary-color, 14%);
$secondary-color: #2b61be;
$body-background: $background-color-variant-1;
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: $background-color-variant-2;
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,9 @@
<!--
~ Snekdown - Custom Markdown flavour and parser
~ Copyright (C) 2021 Trivernis
~ See LICENSE for more information.
-->
<div style="font-size: 10px; text-align: center; width: 100%;">
<span class="pageNumber"></span>/<span class="totalPages"></span>
</div>

@ -0,0 +1,134 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::Document;
use crate::format::chromium_pdf::result::{PdfRenderingError, PdfRenderingResult};
use crate::format::html::html_writer::HTMLWriter;
use crate::format::html::to_html::ToHtml;
use crate::settings::Settings;
use crate::utils::caching::CacheStorage;
use bibliographix::Mutex;
use headless_chrome::protocol::page::PrintToPdfOptions;
use headless_chrome::{Browser, Tab};
use std::env;
use std::fs;
use std::fs::OpenOptions;
use std::io::BufWriter;
use std::path::PathBuf;
use std::sync::Arc;
use std::thread;
use std::time::{Duration, Instant};
pub mod result;
/// Renders the document to pdf and returns the resulting bytes
pub fn render_to_pdf(document: Document) -> PdfRenderingResult<Vec<u8>> {
let cache = CacheStorage::new();
let mut file_path = PathBuf::from("tmp-document.html");
file_path = cache.get_file_path(&file_path);
if !file_path.parent().map(|p| p.exists()).unwrap_or(false) {
file_path = env::current_dir()?;
file_path.push(PathBuf::from(".tmp-document.html"))
}
let config = document.config.clone();
let mathjax = config.lock().features.include_mathjax;
let handle = thread::spawn({
let file_path = file_path.clone();
move || {
log::info!("Rendering html...");
let writer = BufWriter::new(
OpenOptions::new()
.create(true)
.write(true)
.truncate(true)
.open(file_path)?,
);
let mut html_writer =
HTMLWriter::new(Box::new(writer), document.config.lock().style.theme.clone());
document.to_html(&mut html_writer)?;
log::info!("Successfully rendered temporary html file!");
html_writer.flush()
}
});
let browser = Browser::default()?;
let tab = browser.wait_for_initial_tab()?;
handle.join().unwrap()?;
tab.navigate_to(format!("file:///{}", file_path.to_string_lossy()).as_str())?;
tab.wait_until_navigated()?;
if mathjax {
wait_for_mathjax(&tab, Duration::from_secs(60))?;
}
log::info!("Rendering pdf...");
let result = tab.print_to_pdf(Some(get_pdf_options(config)))?;
log::info!("Removing temporary html...");
fs::remove_file(file_path)?;
Ok(result)
}
/// Waits for mathjax to be finished
fn wait_for_mathjax(tab: &Tab, timeout: Duration) -> PdfRenderingResult<()> {
let start = Instant::now();
log::debug!("Waiting for mathjax...");
loop {
let result = tab
.evaluate(
"\
if (window.MathJax)\
!!window.MathJax;\
else \
false;\
",
true,
)?
.value;
match result {
Some(value) => {
if value.is_boolean() && value.as_bool().unwrap() == true {
break;
} else {
if start.elapsed() >= timeout {
return Err(PdfRenderingError::Timeout);
}
thread::sleep(Duration::from_millis(10))
}
}
None => {
if start.elapsed() >= timeout {
return Err(PdfRenderingError::Timeout);
}
thread::sleep(Duration::from_millis(10))
}
}
}
Ok(())
}
fn get_pdf_options(config: Arc<Mutex<Settings>>) -> PrintToPdfOptions {
let config = config.lock().pdf.clone();
PrintToPdfOptions {
landscape: None,
display_header_footer: Some(config.display_header_footer),
print_background: Some(true),
scale: Some(config.page_scale),
paper_width: config.page_width,
paper_height: config.page_height,
margin_top: config.margin.top,
margin_bottom: config.margin.bottom,
margin_left: config.margin.left,
margin_right: config.margin.right,
page_ranges: None,
ignore_invalid_page_ranges: None,
header_template: config.header_template,
footer_template: config.footer_template,
prefer_css_page_size: None,
}
}

@ -0,0 +1,44 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use std::error::Error;
use std::fmt::{self, Display};
use std::io;
pub type PdfRenderingResult<T> = Result<T, PdfRenderingError>;
#[derive(Debug)]
pub enum PdfRenderingError {
IoError(io::Error),
ChromiumError(failure::Error),
Timeout,
HtmlRenderingError,
}
impl Display for PdfRenderingError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
PdfRenderingError::IoError(e) => write!(f, "IO Error: {}", e),
PdfRenderingError::Timeout => write!(f, "Rendering timed out"),
PdfRenderingError::ChromiumError(e) => write!(f, "Chromium Error: {}", e),
PdfRenderingError::HtmlRenderingError => write!(f, "Failed to render html"),
}
}
}
impl Error for PdfRenderingError {}
impl From<failure::Error> for PdfRenderingError {
fn from(other: failure::Error) -> Self {
Self::ChromiumError(other)
}
}
impl From<io::Error> for PdfRenderingError {
fn from(other: io::Error) -> Self {
Self::IoError(other)
}
}

@ -1,130 +0,0 @@
body {
background-color: #DDD;
overflow-x: hidden;
color: #000;
word-break: break-word;
}
.content {
font-family: "Fira Sans", "Noto Sans", SansSerif, sans-serif;
width: 100vh;
max-width: calc(100% - 4rem);
padding: 2rem;
margin: auto;
background-color: #FFF;
}
h1 {
font-size: 2.2rem;
}
h2 {
font-size: 1.8rem;
}
h3 {
font-size: 1.4rem;
}
h4 {
font-size: 1rem;
}
h5 {
font-size: 0.8rem;
}
h6 {
font-size: 0.4rem;
}
img {
max-width: 100%;
max-height: 100vh;
height: auto;
}
code {
color: #000;
}
code pre {
font-family: "Fira Code", monospace;
padding: 0.8em 0.2em;
background-color: #EEE !important;
border-radius: 0.25em;
}
code.inlineCode {
font-family: "Fira Code", monospace;
border-radius: 0.1em;
background-color: #EEE;
padding: 0 0.1em
}
.tableWrapper {
overflow-x: auto;
width: 100%;
}
.tableWrapper > table {
margin: auto;
}
table {
border-collapse: collapse;
}
table tr:nth-child(odd) {
background-color: #DDD;
}
table tr:nth-child(1) {
background-color: white;
font-weight: bold;
border-bottom: 1px solid black;
}
table td, table th {
border-left: 1px solid black;
padding: 0.2em 0.5em
}
table tr td:first-child, table tr th:first-child {
border-left: none;
}
blockquote {
margin-left: 0;
background-color: rgba(0, 0, 0, 0);
}
.quote {
border-left: 0.3em solid gray;
border-radius: 0.2em;
padding-left: 1em;
margin-left: 0;
background-color: #EEE;
}
.quote .metadata {
font-style: italic;
padding-left: 0.5em;
color: #444
}
.figure {
width: 100%;
display: block;
text-align: center;
}
.figure .imageDescription {
display: block;
color: #444;
font-style: italic;
}
.centered {
text-align: center;
}

@ -1,14 +1,22 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::settings::style_settings::Theme;
use std::io; use std::io;
use std::io::Write; use std::io::Write;
pub struct HTMLWriter { pub struct HTMLWriter {
inner: Box<dyn Write>, inner: Box<dyn Write>,
theme: Theme,
} }
impl HTMLWriter { impl HTMLWriter {
/// Creates a new writer /// Creates a new writer
pub fn new(inner: Box<dyn Write>) -> Self { pub fn new(inner: Box<dyn Write>, theme: Theme) -> Self {
Self { inner } Self { inner, theme }
} }
/// Writes a raw string /// Writes a raw string
@ -30,4 +38,9 @@ impl HTMLWriter {
pub fn flush(&mut self) -> io::Result<()> { pub fn flush(&mut self) -> io::Result<()> {
self.inner.flush() self.inner.flush()
} }
/// Return the theme of the html writer
pub fn get_theme(&mut self) -> Theme {
self.theme.clone()
}
} }

@ -1,2 +1,8 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod html_writer; pub mod html_writer;
pub mod to_html; pub mod to_html;

@ -1,16 +1,20 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::*; use crate::elements::*;
use crate::format::html::html_writer::HTMLWriter; use crate::format::html::html_writer::HTMLWriter;
use crate::format::style::{get_code_theme_for_theme, get_css_for_theme};
use crate::format::PlaceholderTemplate; use crate::format::PlaceholderTemplate;
use crate::references::configuration::keys::META_LANG; use crate::references::glossary::{GlossaryDisplay, GlossaryReference};
use crate::references::templates::{Template, TemplateVariable}; use crate::references::templates::{Template, TemplateVariable};
use asciimath_rs::format::mathml::ToMathML; use asciimath_rs::format::mathml::ToMathML;
use htmlescape::encode_attribute; use htmlescape::encode_attribute;
use minify::html::minify; use minify::html::minify;
use std::io; use std::io;
use std::sync::Arc;
use syntect::highlighting::ThemeSet;
use syntect::html::highlighted_html_for_string; use syntect::html::highlighted_html_for_string;
use syntect::parsing::SyntaxSet;
const MATHJAX_URL: &str = "https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js"; const MATHJAX_URL: &str = "https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js";
@ -62,6 +66,9 @@ impl ToHtml for Inline {
Inline::Math(m) => m.to_html(writer), Inline::Math(m) => m.to_html(writer),
Inline::LineBreak => writer.write("<br>".to_string()), Inline::LineBreak => writer.write("<br>".to_string()),
Inline::CharacterCode(code) => code.to_html(writer), Inline::CharacterCode(code) => code.to_html(writer),
Inline::GlossaryReference(gloss) => gloss.lock().to_html(writer),
Inline::Arrow(a) => a.to_html(writer),
Inline::Anchor(a) => a.to_html(writer),
} }
} }
} }
@ -98,47 +105,78 @@ impl ToHtml for MetadataValue {
impl ToHtml for Document { impl ToHtml for Document {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> { fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
let downloads = Arc::clone(&self.downloads);
let mathjax = downloads
.lock()
.unwrap()
.add_download(MATHJAX_URL.to_string());
downloads.lock().unwrap().download_all();
let path = if let Some(path) = &self.path { let path = if let Some(path) = &self.path {
format!("path=\"{}\"", encode_attribute(path.as_str())) format!("path=\"{}\"", encode_attribute(path.as_str()))
} else { } else {
"".to_string() "".to_string()
}; };
if self.is_root { if self.is_root {
let language = self
    .config
    .get_entry(META_LANG)
    .map(|e| e.get().as_string())
    .unwrap_or("en".to_string());
let style = minify(std::include_str!("assets/style.css"));
writer.write("<!DOCTYPE html>".to_string())?;
writer.write("<html lang=\"".to_string())?;
writer.write_attribute(language)?;
writer.write("\"><head ".to_string())?;
writer.write(path)?;
writer.write("/>".to_string())?;
writer.write("<meta charset=\"UTF-8\">".to_string())?;
if let Some(data) = std::mem::replace(&mut mathjax.lock().unwrap().data, None) {
    writer.write("<script id=\"MathJax-script\">".to_string())?;
    writer.write_escaped(minify(String::from_utf8(data).unwrap().as_str()))?;
    writer.write("</script>".to_string())?;
}

let metadata = self.config.lock().metadata.clone();
let style = minify(get_css_for_theme(writer.get_theme()).as_str());
writer.write("<!DOCTYPE html>".to_string())?;
writer.write("<html lang=\"".to_string())?;
writer.write_attribute(metadata.language)?;
writer.write("\"><head>".to_string())?;
writer.write("<meta charset=\"UTF-8\">".to_string())?;
if let Some(author) = metadata.author {
    writer.write("<meta name=\"author\" content=\"".to_string())?;
    writer.write_attribute(author)?;
    writer.write("\">".to_string())?;
}
if let Some(title) = metadata.title {
    writer.write("<title>".to_string())?;
    writer.write_escaped(title.clone())?;
    writer.write("</title>".to_string())?;
    writer.write("<meta name=\"title\" content=\"".to_string())?;
    writer.write_attribute(title)?;
    writer.write("\">".to_string())?;
}
if let Some(description) = metadata.description {
    writer.write("<meta name=\"description\" content=\"".to_string())?;
    writer.write_attribute(description)?;
    writer.write("\">".to_string())?;
}
if !metadata.keywords.is_empty() {
writer.write("<meta name=\"keywords\" content=\"".to_string())?;
writer.write_attribute(
metadata
.keywords
.iter()
.fold("".to_string(), |a, b| format!("{}, {}", a, b))
.trim_start_matches(", ")
.to_string(),
)?;
writer.write("\">".to_string())?;
}
writer.write("<style>".to_string())?; writer.write("<style>".to_string())?;
writer.write(style)?; writer.write(style)?;
writer.write("</style>".to_string())?; writer.write("</style>".to_string())?;
if self.config.lock().features.include_mathjax {
writer.write(format!(
"<script id=\"MathJax-script\" type=\"text/javascript\" async src={}></script>",
MATHJAX_URL
))?;
}
for stylesheet in &self.stylesheets { for stylesheet in &self.stylesheets {
let mut stylesheet = stylesheet.lock().unwrap(); let mut stylesheet = stylesheet.lock();
let data = std::mem::replace(&mut stylesheet.data, None); let data = std::mem::replace(&mut stylesheet.data, None);
if let Some(data) = data { if let Some(data) = data {
writer.write("<style>".to_string())?; writer.write("<style>".to_string())?;
writer.write(minify(String::from_utf8(data).unwrap().as_str()))?; writer.write(minify(String::from_utf8(data).unwrap().as_str()))?;
writer.write("</style>".to_string())?; writer.write("</style>".to_string())?;
} else {
writer.write("<link rel=\"stylesheet\" href=\"".to_string())?;
writer.write_attribute(stylesheet.path.clone())?;
writer.write("\">".to_string())?;
} }
} }
writer.write("</head><body><div class=\"content\">".to_string())?; writer.write("</head><body><div class=\"content\">".to_string())?;
@ -219,7 +257,7 @@ impl ToHtml for Paragraph {
} }
if self.elements.len() > 1 { if self.elements.len() > 1 {
for element in &self.elements[1..] { for element in &self.elements[1..] {
writer.write("<br/>".to_string())?; writer.write(" ".to_string())?;
element.to_html(writer)?; element.to_html(writer)?;
} }
} }
@ -314,19 +352,19 @@ impl ToHtml for Cell {
impl ToHtml for CodeBlock { impl ToHtml for CodeBlock {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> { fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
writer.write("<div><code".to_string())?; writer.write("<div><code".to_string())?;
if self.language.len() > 0 { if self.language.len() > 0 {
writer.write(" lang=\"".to_string())?; writer.write(" lang=\"".to_string())?;
writer.write_attribute(self.language.clone())?; writer.write_attribute(self.language.clone())?;
writer.write("\">".to_string())?; writer.write("\">".to_string())?;
lazy_static::lazy_static! { static ref PS: SyntaxSet = SyntaxSet::load_defaults_nonewlines(); } let (theme, syntax_set) = get_code_theme_for_theme(writer.get_theme());
lazy_static::lazy_static! { static ref TS: ThemeSet = ThemeSet::load_defaults(); }
if let Some(syntax) = PS.find_syntax_by_token(self.language.as_str()) { if let Some(syntax) = syntax_set.find_syntax_by_token(self.language.as_str()) {
writer.write(highlighted_html_for_string( writer.write(highlighted_html_for_string(
self.code.as_str(), self.code.as_str(),
&PS, &syntax_set,
syntax, syntax,
&TS.themes["InspiredGitHub"], &theme,
))?; ))?;
} else { } else {
writer.write("<pre>".to_string())?; writer.write("<pre>".to_string())?;
@ -380,7 +418,7 @@ impl ToHtml for Image {
let mut style = String::new(); let mut style = String::new();
let url = if let Some(content) = self.get_content() { let url = if let Some(content) = self.get_content() {
let mime_type = mime_guess::from_path(&self.url.url).first_or(mime::IMAGE_PNG); let mime_type = self.get_mime_type();
format!( format!(
"data:{};base64,{}", "data:{};base64,{}",
mime_type.to_string(), mime_type.to_string(),
@ -399,7 +437,7 @@ impl ToHtml for Image {
} }
if let Some(description) = self.url.description.clone() { if let Some(description) = self.url.description.clone() {
writer.write("<div class=\"figure\"><a href=\"".to_string())?; writer.write("<div class=\"figure\"><a href=\"".to_string())?;
writer.write_attribute(self.url.url.clone())?; writer.write_attribute(url.clone())?;
writer.write("\"><img src=\"".to_string())?; writer.write("\"><img src=\"".to_string())?;
writer.write(url)?; writer.write(url)?;
writer.write("\" style=\"".to_string())?; writer.write("\" style=\"".to_string())?;
@ -643,3 +681,39 @@ impl ToHtml for Anchor {
writer.write("</div>".to_string()) writer.write("</div>".to_string())
} }
} }
impl ToHtml for GlossaryReference {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
if let Some(entry) = &self.entry {
let entry = entry.lock();
writer.write("<a class=\"glossaryReference\" href=\"#".to_string())?;
writer.write_attribute(self.short.clone())?;
writer.write("\">".to_string())?;
match self.display {
GlossaryDisplay::Short => writer.write_escaped(entry.short.clone())?,
GlossaryDisplay::Long => writer.write_escaped(entry.long.clone())?,
}
writer.write("</a>".to_string())?;
} else {
writer.write_escaped(format!("~{}", self.short.clone()))?;
}
Ok(())
}
}
impl ToHtml for Arrow {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
writer.write("<span class=\"arrow\">".to_string())?;
match self {
Arrow::RightArrow => writer.write("&xrarr;".to_string()),
Arrow::LeftArrow => writer.write("&xlarr;".to_string()),
Arrow::LeftRightArrow => writer.write("&xharr;".to_string()),
Arrow::BigRightArrow => writer.write("&xrArr;".to_string()),
Arrow::BigLeftArrow => writer.write("&xlArr;".to_string()),
Arrow::BigLeftRightArrow => writer.write("&xhArr;".to_string()),
}?;
writer.write("</span>".to_string())
}
}
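A small usage sketch of the new Arrow renderer, assuming Arrow is exported from snekdown::elements and the ToHtml trait is imported the way main.rs does below (illustrative only, not taken from the repository):

use snekdown::elements::Arrow;
use snekdown::format::html::html_writer::HTMLWriter;
use snekdown::format::html::to_html::ToHtml;
use snekdown::settings::style_settings::Theme;

fn arrow_to_html() -> std::io::Result<()> {
    let mut writer = HTMLWriter::new(Box::new(Vec::new()), Theme::GitHub);
    // Renders a <span class="arrow"> wrapping the &xrArr; entity, per the impl above.
    Arrow::BigRightArrow.to_html(&mut writer)?;
    writer.flush()
}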

@ -1,7 +1,16 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use regex::Regex; use regex::Regex;
use std::collections::HashMap; use std::collections::HashMap;
#[cfg(feature = "pdf")]
pub mod chromium_pdf;
pub mod html; pub mod html;
pub mod style;
pub struct PlaceholderTemplate { pub struct PlaceholderTemplate {
value: String, value: String,

@ -0,0 +1,61 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::settings::style_settings::Theme;
use std::time::Instant;
use syntect::highlighting::ThemeSet;
use syntect::parsing::SyntaxSet;
/// Returns the css of a theme compiled from sass
pub fn get_css_for_theme(theme: Theme) -> String {
let start = Instant::now();
let vars = match theme {
Theme::GitHub => include_str!("assets/light-github.scss"),
Theme::SolarizedDark => include_str!("assets/dark-solarized.scss"),
Theme::SolarizedLight => include_str!("assets/light-solarized.scss"),
Theme::OceanDark => include_str!("assets/dark-ocean.scss"),
Theme::OceanLight => include_str!("assets/light-ocean.scss"),
Theme::MagicDark => include_str!("assets/dark-magic.scss"),
};
let style = format!("{}\n{}", vars, include_str!("assets/base.scss"));
let css = compile_sass(&*style);
log::debug!("Compiled style in {} ms", start.elapsed().as_millis());
css
}
/// Returns the syntax theme for a given theme
pub fn get_code_theme_for_theme(theme: Theme) -> (syntect::highlighting::Theme, SyntaxSet) {
lazy_static::lazy_static! { static ref PS: SyntaxSet = SyntaxSet::load_defaults_nonewlines(); }
lazy_static::lazy_static! { static ref TS: ThemeSet = ThemeSet::load_defaults(); }
let theme = match theme {
Theme::GitHub => "InspiredGitHub",
Theme::SolarizedDark => "Solarized (dark)",
Theme::SolarizedLight => "Solarized (light)",
Theme::OceanDark => "base16-ocean.dark",
Theme::OceanLight => "base16-ocean.light",
Theme::MagicDark => "base16-ocean.dark",
};
return (TS.themes[theme].clone(), PS.clone());
}
fn compile_sass(sass: &str) -> String {
String::from_utf8(
rsass::compile_scss(
sass.as_bytes(),
rsass::output::Format {
style: rsass::output::Style::Compressed,
precision: 5,
},
)
.unwrap(),
)
.unwrap()
}
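The two helpers above are what to_html() reaches through writer.get_theme(); a hedged sketch of calling them directly, assuming the pub mod style path added in the next hunk:

use snekdown::format::style::{get_code_theme_for_theme, get_css_for_theme};
use snekdown::settings::style_settings::Theme;

fn preview_theme_assets() {
    // Theme-specific SCSS variables plus base.scss, compiled and compressed by rsass.
    let css: String = get_css_for_theme(Theme::OceanDark);
    // Matching syntect highlighting theme and the default (no-newline) syntax set.
    let (code_theme, syntax_set) = get_code_theme_for_theme(Theme::OceanDark);
    println!(
        "{} bytes of css, {} syntaxes, code theme {:?}",
        css.len(),
        syntax_set.syntaxes().len(),
        code_theme.name
    );
}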

@ -1,7 +1,14 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod elements; pub mod elements;
pub mod format; pub mod format;
pub mod parser; pub mod parser;
pub mod references; pub mod references;
pub mod settings;
pub mod utils; pub mod utils;
pub use parser::Parser; pub use parser::Parser;

@ -1,43 +1,79 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use colored::Colorize; use colored::Colorize;
use env_logger::Env; use env_logger::Env;
use log::{Level, LevelFilter}; use log::{Level, LevelFilter};
use notify::{watcher, RecursiveMode, Watcher}; use notify::{watcher, RecursiveMode, Watcher};
use snekdown::elements::Document;
use snekdown::format::html::html_writer::HTMLWriter; use snekdown::format::html::html_writer::HTMLWriter;
use snekdown::format::html::to_html::ToHtml; use snekdown::format::html::to_html::ToHtml;
use snekdown::parser::ParserOptions;
use snekdown::settings::Settings;
use snekdown::utils::caching::CacheStorage;
use snekdown::Parser; use snekdown::Parser;
use std::fs::OpenOptions; use std::fs::{File, OpenOptions};
use std::io::BufWriter; use std::io::{stdout, BufWriter, Write};
use std::path::PathBuf; use std::path::PathBuf;
use std::process::exit;
use std::sync::mpsc::channel; use std::sync::mpsc::channel;
use std::time::{Duration, Instant}; use std::time::{Duration, Instant};
use structopt::StructOpt; use structopt::StructOpt;
#[derive(StructOpt, Debug)] #[derive(StructOpt, Debug, Clone)]
struct Opt { struct Opt {
#[structopt(subcommand)]
sub_command: SubCommand,
}
#[derive(StructOpt, Debug, Clone)]
#[structopt()]
enum SubCommand {
/// Watch the document and its imports and render on change.
Watch(WatchOptions),
/// Parse and render the document.
Render(RenderOptions),
/// Initializes the project with default settings
Init,
/// Clears the cache directory
ClearCache,
}
#[derive(StructOpt, Debug, Clone)]
#[structopt()]
struct RenderOptions {
/// Path to the input file /// Path to the input file
#[structopt(parse(from_os_str))] #[structopt(parse(from_os_str))]
input: PathBuf, input: PathBuf,
/// Path for the output file /// Path for the output file
#[structopt(parse(from_os_str))] #[structopt(parse(from_os_str))]
output: PathBuf, output: Option<PathBuf>,
/// If the output should be written to stdout instead of the output file
#[structopt(long = "stdout")]
stdout: bool,
/// the output format /// the output format
#[structopt(short, long, default_value = "html")] #[structopt(short, long, default_value = "html")]
format: String, format: String,
#[structopt(subcommand)]
sub_command: Option<SubCommand>,
} }
#[derive(StructOpt, Debug)]
#[structopt()]
enum SubCommand {
    /// Watch the document and its imports and render on change.
    Watch,
    /// Default. Parse and render the document.
    Render,
}

#[derive(StructOpt, Debug, Clone)]
#[structopt()]
struct WatchOptions {
    /// The amount of time in milliseconds to wait after changes before rendering
    #[structopt(long, default_value = "500")]
    debounce: u64,
    #[structopt(flatten)]
    render_options: RenderOptions,
}
fn main() { fn main() {
@ -48,7 +84,6 @@ fn main() {
.filter_module("mio", LevelFilter::Warn) .filter_module("mio", LevelFilter::Warn)
.filter_module("want", LevelFilter::Warn) .filter_module("want", LevelFilter::Warn)
.format(|buf, record| { .format(|buf, record| {
use std::io::Write;
let color = get_level_style(record.level()); let color = get_level_style(record.level());
writeln!( writeln!(
buf, buf,
@ -63,19 +98,17 @@ fn main() {
) )
}) })
.init(); .init();
if !opt.input.exists() {
log::error!(
"The input file {} could not be found",
opt.input.to_str().unwrap()
);
return;
}
match &opt.sub_command { match &opt.sub_command {
Some(SubCommand::Render) | None => { SubCommand::Render(opt) => {
let _ = render(&opt); let _ = render(&opt);
} }
Some(SubCommand::Watch) => watch(&opt), SubCommand::Watch(opt) => watch(&opt),
SubCommand::ClearCache => {
let cache = CacheStorage::new();
cache.clear().expect("Failed to clear cache");
}
SubCommand::Init => init(),
}; };
} }
@ -89,17 +122,47 @@ fn get_level_style(level: Level) -> colored::Color {
} }
} }
fn init() {
let settings = Settings::default();
let settings_string = toml::to_string_pretty(&settings).unwrap();
let manifest_path = PathBuf::from("Manifest.toml");
let bibliography_path = PathBuf::from("Bibliography.toml");
let glossary_path = PathBuf::from("Glossary.toml");
let css_path = PathBuf::from("style.css");
if !manifest_path.exists() {
let mut file = OpenOptions::new()
.create(true)
.write(true)
.truncate(true)
.open("Manifest.toml")
.unwrap();
file.write_all(settings_string.as_bytes()).unwrap();
file.flush().unwrap();
}
if !bibliography_path.exists() {
File::create("Bibliography.toml".to_string()).unwrap();
}
if !glossary_path.exists() {
File::create("Glossary.toml".to_string()).unwrap();
}
if !css_path.exists() {
File::create("style.css".to_string()).unwrap();
}
}
/// Watches a file with all of its imports and renders on change /// Watches a file with all of its imports and renders on change
fn watch(opt: &Opt) { fn watch(opt: &WatchOptions) {
let parser = render(opt); let parser = render(&opt.render_options);
let (tx, rx) = channel(); let (tx, rx) = channel();
let mut watcher = watcher(tx, Duration::from_millis(250)).unwrap(); let mut watcher = watcher(tx, Duration::from_millis(opt.debounce)).unwrap();
for path in parser.get_paths() { for path in parser.get_paths() {
watcher.watch(path, RecursiveMode::NonRecursive).unwrap(); watcher.watch(path, RecursiveMode::NonRecursive).unwrap();
} }
while let Ok(_) = rx.recv() { while let Ok(_) = rx.recv() {
println!("---"); println!("---");
let parser = render(opt); let parser = render(&opt.render_options);
for path in parser.get_paths() { for path in parser.get_paths() {
watcher.watch(path, RecursiveMode::NonRecursive).unwrap(); watcher.watch(path, RecursiveMode::NonRecursive).unwrap();
} }
@ -107,31 +170,76 @@ fn watch(opt: &Opt) {
} }
/// Renders the document to the output path /// Renders the document to the output path
fn render(opt: &Opt) -> Parser { fn render(opt: &RenderOptions) -> Parser {
if !opt.input.exists() {
log::error!(
"The input file {} could not be found",
opt.input.to_str().unwrap()
);
exit(1)
}
let start = Instant::now(); let start = Instant::now();
let mut parser = Parser::new_from_file(opt.input.clone()).unwrap();
let mut parser = Parser::with_defaults(ParserOptions::default().add_path(opt.input.clone()));
let document = parser.parse(); let document = parser.parse();
log::info!("Parsing took: {:?}", start.elapsed()); log::info!("Parsing + Processing took: {:?}", start.elapsed());
let start_render = Instant::now(); let start_render = Instant::now();
let file = OpenOptions::new()
    .read(true)
    .write(true)
    .truncate(true)
    .create(true)
    .open(&opt.output)
    .unwrap();
let writer = BufWriter::new(file);
match opt.format.as_str() {
    "html" => {
        let mut writer = HTMLWriter::new(Box::new(writer));
        document.to_html(&mut writer).unwrap();
        writer.flush().unwrap();
    }
    _ => log::error!("Unknown format {}", opt.format),
}
log::info!("Rendering took: {:?}", start_render.elapsed());
log::info!("Total: {:?}", start.elapsed());

if let Some(output) = &opt.output {
    let file = OpenOptions::new()
        .read(true)
        .write(true)
        .truncate(true)
        .create(true)
        .open(output)
        .unwrap();

    render_format(opt, document, BufWriter::new(file));
} else {
    if !opt.stdout {
        log::error!("No output file specified");
        exit(1)
    }
    render_format(opt, document, BufWriter::new(stdout()));
}

log::info!("Rendering took: {:?}", start_render.elapsed());
log::info!("Total: {:?}", start.elapsed());
parser parser
} }
#[cfg(not(feature = "pdf"))]
fn render_format<W: Write + 'static>(opt: &RenderOptions, document: Document, writer: W) {
match opt.format.as_str() {
"html" => render_html(document, writer),
_ => log::error!("Unknown format {}", opt.format),
}
}
#[cfg(feature = "pdf")]
fn render_format<W: Write + 'static>(opt: &RenderOptions, document: Document, writer: W) {
match opt.format.as_str() {
"html" => render_html(document, writer),
"pdf" => render_pdf(document, writer),
_ => log::error!("Unknown format {}", opt.format),
}
}
fn render_html<W: Write + 'static>(document: Document, writer: W) {
let mut writer = HTMLWriter::new(Box::new(writer), document.config.lock().style.theme.clone());
document.to_html(&mut writer).unwrap();
writer.flush().unwrap();
}
#[cfg(feature = "pdf")]
fn render_pdf<W: Write + 'static>(document: Document, mut writer: W) {
use snekdown::format::chromium_pdf::render_to_pdf;
let result = render_to_pdf(document).expect("Failed to render pdf!");
writer.write_all(&result).unwrap();
writer.flush().unwrap();
}
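A hypothetical test-style sketch of the reworked CLI; it assumes the code sits inside main.rs where Opt and SubCommand are visible, and that structopt derives the usual kebab-case subcommand names:

use structopt::StructOpt;

fn parse_render_invocation() {
    // Equivalent to `snekdown render input.md output.html` on the command line.
    let opt = Opt::from_iter(vec!["snekdown", "render", "input.md", "output.html"]);
    match opt.sub_command {
        // --format defaults to "html"; --stdout switches output to standard out.
        SubCommand::Render(render) => assert_eq!(render.format, "html"),
        _ => unreachable!("expected the render subcommand"),
    }
}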

@ -1,7 +1,13 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use super::ParseResult; use super::ParseResult;
use crate::elements::tokens::*; use crate::elements::tokens::*;
use crate::elements::{ use crate::elements::{
Block, CodeBlock, Import, List, ListItem, MathBlock, Paragraph, Quote, Section, Table, Block, CodeBlock, Import, List, ListItem, MathBlock, Metadata, Paragraph, Quote, Section, Table,
}; };
use crate::parser::inline::ParseInline; use crate::parser::inline::ParseInline;
use crate::parser::line::ParseLine; use crate::parser::line::ParseLine;
@ -26,7 +32,7 @@ impl ParseBlock for Parser {
fn parse_block(&mut self) -> ParseResult<Block> { fn parse_block(&mut self) -> ParseResult<Block> {
if let Some(section) = self.section_return { if let Some(section) = self.section_return {
if section <= self.section_nesting && (self.section_nesting > 0) { if section <= self.section_nesting && (self.section_nesting > 0) {
return Err(self.ctm.assert_error(None)); return Err(self.ctm.assert_error(None).into());
} else { } else {
self.section_return = None; self.section_return = None;
} }
@ -35,7 +41,7 @@ impl ParseBlock for Parser {
log::trace!("Block::Section"); log::trace!("Block::Section");
Block::Section(section) Block::Section(section)
} else if let Some(_) = self.section_return { } else if let Some(_) = self.section_return {
return Err(self.ctm.err()); return Err(self.ctm.err().into());
} else if let Ok(list) = self.parse_list() { } else if let Ok(list) = self.parse_list() {
log::trace!("Block::List"); log::trace!("Block::List");
Block::List(list) Block::List(list)
@ -60,7 +66,7 @@ impl ParseBlock for Parser {
Block::Null Block::Null
} }
} else if let Some(_) = self.section_return { } else if let Some(_) = self.section_return {
return Err(self.ctm.err()); return Err(self.ctm.err().into());
} else if let Ok(pholder) = self.parse_placeholder() { } else if let Ok(pholder) = self.parse_placeholder() {
log::trace!("Block::Placeholder"); log::trace!("Block::Placeholder");
Block::Placeholder(pholder) Block::Placeholder(pholder)
@ -68,7 +74,7 @@ impl ParseBlock for Parser {
log::trace!("Block::Paragraph"); log::trace!("Block::Paragraph");
Block::Paragraph(paragraph) Block::Paragraph(paragraph)
} else { } else {
return Err(self.ctm.err()); return Err(self.ctm.err().into());
}; };
Ok(token) Ok(token)
@ -78,6 +84,7 @@ impl ParseBlock for Parser {
fn parse_section(&mut self) -> ParseResult<Section> { fn parse_section(&mut self) -> ParseResult<Section> {
let start_index = self.ctm.get_index(); let start_index = self.ctm.get_index();
self.ctm.seek_whitespace(); self.ctm.seek_whitespace();
if self.ctm.check_char(&HASH) { if self.ctm.check_char(&HASH) {
let mut size = 1; let mut size = 1;
while let Some(_) = self.ctm.next_char() { while let Some(_) = self.ctm.next_char() {
@ -94,13 +101,15 @@ impl ParseBlock for Parser {
if size <= self.section_nesting { if size <= self.section_nesting {
self.section_return = Some(size); self.section_return = Some(size);
} }
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
self.ctm.seek_any(&INLINE_WHITESPACE)?; self.ctm.seek_any(&INLINE_WHITESPACE)?;
let mut header = self.parse_header()?; let mut header = self.parse_header()?;
header.size = size; header.size = size;
self.section_nesting = size; self.section_nesting = size;
self.sections.push(size); self.sections.push(size);
self.section_anchors.push(header.anchor.clone());
let mut section = Section::new(header); let mut section = Section::new(header);
section.metadata = metadata; section.metadata = metadata;
self.ctm.seek_whitespace(); self.ctm.seek_whitespace();
@ -110,6 +119,7 @@ impl ParseBlock for Parser {
} }
self.sections.pop(); self.sections.pop();
self.section_anchors.pop();
if let Some(sec) = self.sections.last() { if let Some(sec) = self.sections.last() {
self.section_nesting = *sec self.section_nesting = *sec
} else { } else {
@ -117,7 +127,7 @@ impl ParseBlock for Parser {
} }
Ok(section) Ok(section)
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
@ -131,8 +141,9 @@ impl ParseBlock for Parser {
let language = self.ctm.get_string_until_any(&[LB], &[])?; let language = self.ctm.get_string_until_any(&[LB], &[])?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
let text = self.ctm.get_string_until_sequence(&[&SQ_CODE_BLOCK], &[])?; let text = self.ctm.get_string_until_sequence(&[&SQ_CODE_BLOCK], &[])?;
for _ in 0..2 { for _ in 0..2 {
self.ctm.seek_one()?; self.ctm.try_seek();
} }
Ok(CodeBlock { Ok(CodeBlock {
@ -149,7 +160,7 @@ impl ParseBlock for Parser {
self.ctm.seek_one()?; self.ctm.seek_one()?;
let text = self.ctm.get_string_until_sequence(&[SQ_MATH], &[])?; let text = self.ctm.get_string_until_sequence(&[SQ_MATH], &[])?;
for _ in 0..1 { for _ in 0..1 {
self.ctm.seek_one()?; self.ctm.try_seek();
} }
Ok(MathBlock { Ok(MathBlock {
expression: asciimath_rs::parse(text), expression: asciimath_rs::parse(text),
@ -160,6 +171,7 @@ impl ParseBlock for Parser {
fn parse_quote(&mut self) -> ParseResult<Quote> { fn parse_quote(&mut self) -> ParseResult<Quote> {
let start_index = self.ctm.get_index(); let start_index = self.ctm.get_index();
self.ctm.seek_whitespace(); self.ctm.seek_whitespace();
let metadata = if let Ok(meta) = self.parse_inline_metadata() { let metadata = if let Ok(meta) = self.parse_inline_metadata() {
Some(meta) Some(meta)
} else { } else {
@ -167,7 +179,7 @@ impl ParseBlock for Parser {
}; };
if self.ctm.check_char(&META_CLOSE) { if self.ctm.check_char(&META_CLOSE) {
if self.ctm.next_char() == None { if self.ctm.next_char() == None {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
let mut quote = Quote::new(metadata); let mut quote = Quote::new(metadata);
@ -185,8 +197,10 @@ impl ParseBlock for Parser {
break; break;
} }
} }
quote.strip_linebreak();
if quote.text.len() == 0 { if quote.text.len() == 0 {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
Ok(quote) Ok(quote)
@ -194,11 +208,12 @@ impl ParseBlock for Parser {
/// Parses a paragraph /// Parses a paragraph
fn parse_paragraph(&mut self) -> ParseResult<Paragraph> { fn parse_paragraph(&mut self) -> ParseResult<Paragraph> {
self.ctm.seek_whitespace();
let mut paragraph = Paragraph::new(); let mut paragraph = Paragraph::new();
while let Ok(token) = self.parse_line() {
paragraph.add_element(token); while let Ok(element) = self.parse_line() {
paragraph.add_element(element);
let start_index = self.ctm.get_index(); let start_index = self.ctm.get_index();
if self.ctm.check_any_sequence(&BLOCK_SPECIAL_CHARS) if self.ctm.check_any_sequence(&BLOCK_SPECIAL_CHARS)
|| self.ctm.check_any(&self.block_break_at) || self.ctm.check_any(&self.block_break_at)
{ {
@ -213,7 +228,7 @@ impl ParseBlock for Parser {
if paragraph.elements.len() > 0 { if paragraph.elements.len() > 0 {
Ok(paragraph) Ok(paragraph)
} else { } else {
Err(self.ctm.err()) Err(self.ctm.err().into())
} }
} }
@ -227,6 +242,7 @@ impl ParseBlock for Parser {
let ordered = self.ctm.get_current().is_numeric(); let ordered = self.ctm.get_current().is_numeric();
list.ordered = ordered; list.ordered = ordered;
let mut list_hierarchy: Vec<ListItem> = Vec::new(); let mut list_hierarchy: Vec<ListItem> = Vec::new();
while let Ok(mut item) = self.parse_list_item() { while let Ok(mut item) = self.parse_list_item() {
while let Some(parent_item) = list_hierarchy.pop() { while let Some(parent_item) = list_hierarchy.pop() {
if parent_item.level < item.level { if parent_item.level < item.level {
@ -273,7 +289,7 @@ impl ParseBlock for Parser {
if list.items.len() > 0 { if list.items.len() > 0 {
Ok(list) Ok(list)
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
@ -285,6 +301,7 @@ impl ParseBlock for Parser {
} }
let seek_index = self.ctm.get_index(); let seek_index = self.ctm.get_index();
let mut table = Table::new(header); let mut table = Table::new(header);
while let Ok(_) = self.ctm.seek_one() { while let Ok(_) = self.ctm.seek_one() {
self.ctm.seek_any(&INLINE_WHITESPACE)?; self.ctm.seek_any(&INLINE_WHITESPACE)?;
if !self.ctm.check_any(&[MINUS, PIPE]) || self.ctm.check_char(&LB) { if !self.ctm.check_any(&[MINUS, PIPE]) || self.ctm.check_char(&LB) {
@ -319,7 +336,7 @@ impl ParseBlock for Parser {
path.push(character); path.push(character);
} }
if self.ctm.check_char(&LB) || path.is_empty() { if self.ctm.check_char(&LB) || path.is_empty() {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
if self.ctm.check_char(&IMPORT_CLOSE) { if self.ctm.check_char(&IMPORT_CLOSE) {
self.ctm.seek_one()?; self.ctm.seek_one()?;
@ -328,22 +345,20 @@ impl ParseBlock for Parser {
if self.section_nesting > 0 { if self.section_nesting > 0 {
self.section_return = Some(0); self.section_return = Some(0);
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
let metadata = self let metadata = self
.parse_inline_metadata() .parse_inline_metadata()
.ok() .ok()
.map(|m| m.into()) .map(|m| m.get_string_map())
.unwrap_or(HashMap::new()); .unwrap_or(HashMap::new());
self.ctm.seek_whitespace();
match self.import(path.clone(), &metadata) { match self.import(path.clone(), &metadata) {
ImportType::Document(Ok(anchor)) => Ok(Some(Import { path, anchor })), ImportType::Document(Ok(anchor)) => Ok(Some(Import { path, anchor })),
ImportType::Stylesheet(_) => Ok(None), ImportType::Stylesheet(_) => Ok(None),
ImportType::Bibliography(_) => Ok(None), ImportType::Bibliography(_) => Ok(None),
ImportType::Manifest(_) => Ok(None), ImportType::Manifest(_) => Ok(None),
_ => Err(self.ctm.err()), _ => Err(self.ctm.err().into()),
} }
} }
} }

@ -1,13 +1,23 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use super::{ParseError, ParseResult}; use super::{ParseError, ParseResult};
use crate::elements::tokens::*; use crate::elements::tokens::*;
use crate::elements::BibReference; use crate::elements::BibReference;
use crate::elements::*; use crate::elements::*;
use crate::parser::block::ParseBlock; use crate::parser::block::ParseBlock;
use crate::references::configuration::keys::BIB_REF_DISPLAY; use crate::references::glossary::GlossaryDisplay;
use crate::references::glossary::GlossaryReference;
use crate::references::templates::{GetTemplateVariables, Template, TemplateVariable}; use crate::references::templates::{GetTemplateVariables, Template, TemplateVariable};
use crate::utils::parsing::remove_single_backlslash;
use crate::Parser; use crate::Parser;
use bibliographix::references::bib_reference::BibRef; use bibliographix::references::bib_reference::BibRef;
use parking_lot::Mutex;
use std::collections::HashMap; use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::{Arc, RwLock}; use std::sync::{Arc, RwLock};
pub(crate) trait ParseInline { pub(crate) trait ParseInline {
@ -27,12 +37,15 @@ pub(crate) trait ParseInline {
fn parse_colored(&mut self) -> ParseResult<Colored>; fn parse_colored(&mut self) -> ParseResult<Colored>;
fn parse_bibref(&mut self) -> ParseResult<Arc<RwLock<BibReference>>>; fn parse_bibref(&mut self) -> ParseResult<Arc<RwLock<BibReference>>>;
fn parse_template_variable(&mut self) -> ParseResult<Arc<RwLock<TemplateVariable>>>; fn parse_template_variable(&mut self) -> ParseResult<Arc<RwLock<TemplateVariable>>>;
fn parse_glossary_reference(&mut self) -> ParseResult<Arc<Mutex<GlossaryReference>>>;
fn parse_plain(&mut self) -> ParseResult<PlainText>; fn parse_plain(&mut self) -> ParseResult<PlainText>;
fn parse_inline_metadata(&mut self) -> ParseResult<InlineMetadata>; fn parse_inline_metadata(&mut self) -> ParseResult<InlineMetadata>;
fn parse_metadata_pair(&mut self) -> ParseResult<(String, MetadataValue)>; fn parse_metadata_pair(&mut self) -> ParseResult<(String, MetadataValue)>;
fn parse_placeholder(&mut self) -> ParseResult<Arc<RwLock<Placeholder>>>; fn parse_placeholder(&mut self) -> ParseResult<Arc<RwLock<Placeholder>>>;
fn parse_template(&mut self) -> ParseResult<Template>; fn parse_template(&mut self) -> ParseResult<Template>;
fn parse_character_code(&mut self) -> ParseResult<CharacterCode>; fn parse_character_code(&mut self) -> ParseResult<CharacterCode>;
fn parse_arrow(&mut self) -> ParseResult<Arrow>;
fn parse_anchor(&mut self) -> ParseResult<Anchor>;
} }
impl ParseInline for Parser { impl ParseInline for Parser {
@ -42,11 +55,12 @@ impl ParseInline for Parser {
self.ctm.assert_char(surrounding, Some(start_index))?; self.ctm.assert_char(surrounding, Some(start_index))?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
let mut inline = vec![self.parse_inline()?]; let mut inline = vec![self.parse_inline()?];
while !self.ctm.check_char(surrounding) { while !self.ctm.check_char(surrounding) {
if let Ok(result) = self.parse_inline() { if let Ok(result) = self.parse_inline() {
inline.push(result) inline.push(result)
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
if !self.ctm.check_eof() { if !self.ctm.check_eof() {
@ -65,18 +79,18 @@ impl ParseInline for Parser {
} }
} }
if self.ctm.check_char(&PIPE) || self.ctm.check_char(&LB) { if self.ctm.check_char(&PIPE) || self.ctm.check_char(&LB) {
Err(self.ctm.err()) Err(self.ctm.err().into())
} else if self.ctm.check_eof() { } else if self.ctm.check_eof() {
log::trace!("EOF"); log::trace!("EOF");
Err(self.ctm.err()) Err(self.ctm.err().into())
} else if let Ok(image) = self.parse_image() { } else if let Ok(image) = self.parse_image() {
log::trace!("Inline::Image"); log::trace!("Inline::Image {:?}", image);
Ok(Inline::Image(image)) Ok(Inline::Image(image))
} else if let Ok(url) = self.parse_url(false) { } else if let Ok(url) = self.parse_url(false) {
log::trace!("Inline::Url"); log::trace!("Inline::Url {:?}", url);
Ok(Inline::Url(url)) Ok(Inline::Url(url))
} else if let Ok(pholder) = self.parse_placeholder() { } else if let Ok(pholder) = self.parse_placeholder() {
log::trace!("Inline::Placeholder"); log::trace!("Inline::Placeholder {:?}", pholder);
Ok(Inline::Placeholder(pholder)) Ok(Inline::Placeholder(pholder))
} else if let Ok(bold) = self.parse_bold() { } else if let Ok(bold) = self.parse_bold() {
log::trace!("Inline::Bold"); log::trace!("Inline::Bold");
@ -88,35 +102,45 @@ impl ParseInline for Parser {
log::trace!("Inline::Underlined"); log::trace!("Inline::Underlined");
Ok(Inline::Underlined(under)) Ok(Inline::Underlined(under))
} else if let Ok(mono) = self.parse_monospace() { } else if let Ok(mono) = self.parse_monospace() {
log::trace!("Inline::Monospace"); log::trace!("Inline::Monospace {}", mono.value);
Ok(Inline::Monospace(mono)) Ok(Inline::Monospace(mono))
} else if let Ok(striked) = self.parse_striked() { } else if let Ok(striked) = self.parse_striked() {
log::trace!("Inline::Striked"); log::trace!("Inline::Striked");
Ok(Inline::Striked(striked)) Ok(Inline::Striked(striked))
} else if let Ok(gloss) = self.parse_glossary_reference() {
log::trace!("Inline::GlossaryReference {}", gloss.lock().short);
Ok(Inline::GlossaryReference(gloss))
} else if let Ok(superscript) = self.parse_superscript() { } else if let Ok(superscript) = self.parse_superscript() {
log::trace!("Inline::Superscript"); log::trace!("Inline::Superscript");
Ok(Inline::Superscript(superscript)) Ok(Inline::Superscript(superscript))
} else if let Ok(checkbox) = self.parse_checkbox() { } else if let Ok(checkbox) = self.parse_checkbox() {
log::trace!("Inline::Checkbox"); log::trace!("Inline::Checkbox {}", checkbox.value);
Ok(Inline::Checkbox(checkbox)) Ok(Inline::Checkbox(checkbox))
} else if let Ok(emoji) = self.parse_emoji() { } else if let Ok(emoji) = self.parse_emoji() {
log::trace!("Inline::Emoji"); log::trace!("Inline::Emoji {} -> {}", emoji.name, emoji.value);
Ok(Inline::Emoji(emoji)) Ok(Inline::Emoji(emoji))
} else if let Ok(colored) = self.parse_colored() { } else if let Ok(colored) = self.parse_colored() {
log::trace!("Inline::Colored"); log::trace!("Inline::Colored");
Ok(Inline::Colored(colored)) Ok(Inline::Colored(colored))
} else if let Ok(bibref) = self.parse_bibref() { } else if let Ok(bibref) = self.parse_bibref() {
log::trace!("Inline::BibReference"); log::trace!("Inline::BibReference {:?}", bibref);
Ok(Inline::BibReference(bibref)) Ok(Inline::BibReference(bibref))
} else if let Ok(math) = self.parse_math() { } else if let Ok(math) = self.parse_math() {
log::trace!("Inline::Math"); log::trace!("Inline::Math");
Ok(Inline::Math(math)) Ok(Inline::Math(math))
} else if let Ok(char_code) = self.parse_character_code() { } else if let Ok(char_code) = self.parse_character_code() {
log::trace!("Inline::Character Code"); log::trace!("Inline::CharacterCode {}", char_code.code);
Ok(Inline::CharacterCode(char_code)) Ok(Inline::CharacterCode(char_code))
} else if let Ok(arrow) = self.parse_arrow() {
log::trace!("Inline::Arrow {:?}", arrow);
Ok(Inline::Arrow(arrow))
} else if let Ok(anchor) = self.parse_anchor() {
log::trace!("Inline::Anchor {:?}", anchor);
Ok(Inline::Anchor(anchor))
} else { } else {
log::trace!("Inline::Plain");
Ok(Inline::Plain(self.parse_plain()?))

let plain = self.parse_plain()?;
log::trace!("Inline::Plain {}", plain.value);
Ok(Inline::Plain(plain))
} }
} }
@ -127,19 +151,27 @@ impl ParseInline for Parser {
self.ctm.seek_one()?; self.ctm.seek_one()?;
if let Ok(url) = self.parse_url(true) { if let Ok(url) = self.parse_url(true) {
let metadata = if let Ok(meta) = self.parse_inline_metadata() {
    Some(meta)
} else {
    None
};

let metadata = self.parse_inline_metadata().ok();
let path = url.url.clone(); let path = url.url.clone();
let pending_image = self
.options
.document
.images
.lock()
.add_image(PathBuf::from(path));
if let Some(meta) = &metadata {
pending_image.lock().assign_from_meta(meta)
}
Ok(Image { Ok(Image {
url, url,
metadata, metadata,
download: self.document.downloads.lock().unwrap().add_download(path), image_data: pending_image,
}) })
} else { } else {
Err(self.ctm.rewind_with_error(start_index)) Err(self.ctm.rewind_with_error(start_index).into())
} }
} }
@ -164,7 +196,7 @@ impl ParseInline for Parser {
self.inline_break_at.pop(); self.inline_break_at.pop();
self.ctm.seek_one()?; self.ctm.seek_one()?;
} else if !short_syntax { } else if !short_syntax {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
self.ctm.assert_char(&URL_OPEN, Some(start_index))?; self.ctm.assert_char(&URL_OPEN, Some(start_index))?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
@ -197,7 +229,7 @@ impl ParseInline for Parser {
} else if self.ctm.check_char(&SPACE) { } else if self.ctm.check_char(&SPACE) {
false false
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
}; };
self.ctm.seek_one()?; self.ctm.seek_one()?;
self.ctm.assert_char(&CHECK_CLOSE, Some(start_index))?; self.ctm.assert_char(&CHECK_CLOSE, Some(start_index))?;
@ -212,11 +244,12 @@ impl ParseInline for Parser {
self.ctm.assert_sequence(&BOLD, Some(start_index))?; self.ctm.assert_sequence(&BOLD, Some(start_index))?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
let mut inline = vec![self.parse_inline()?]; let mut inline = vec![self.parse_inline()?];
while !self.ctm.check_sequence(&BOLD) { while !self.ctm.check_sequence(&BOLD) {
if let Ok(result) = self.parse_inline() { if let Ok(result) = self.parse_inline() {
inline.push(result); inline.push(result);
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
self.ctm.seek_one()?; self.ctm.seek_one()?;
@ -231,9 +264,27 @@ impl ParseInline for Parser {
} }
fn parse_striked(&mut self) -> ParseResult<StrikedText> { fn parse_striked(&mut self) -> ParseResult<StrikedText> {
Ok(StrikedText { let start_index = self.ctm.get_index();
value: self.parse_surrounded(&STRIKED)?, self.ctm.assert_sequence(&STRIKED, Some(start_index))?;
}) self.ctm.seek_one()?;
let mut inline = vec![self.parse_inline()?];
while !self.ctm.check_sequence(&STRIKED) {
if let Ok(result) = self.parse_inline() {
inline.push(result);
} else {
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
self.ctm.rewind(self.ctm.get_index() - STRIKED.len());
if self.ctm.check_any(WHITESPACE) {
return Err(self.ctm.rewind_with_error(start_index).into());
}
for _ in 0..(STRIKED.len() + 1) {
self.ctm.seek_one()?;
}
Ok(StrikedText { value: inline })
} }
fn parse_math(&mut self) -> ParseResult<Math> { fn parse_math(&mut self) -> ParseResult<Math> {
@ -255,9 +306,10 @@ impl ParseInline for Parser {
let start_index = self.ctm.get_index(); let start_index = self.ctm.get_index();
self.ctm.assert_char(&BACKTICK, Some(start_index))?; self.ctm.assert_char(&BACKTICK, Some(start_index))?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
let content = self let mut content =
.ctm self.ctm
.get_string_until_any_or_rewind(&[BACKTICK, LB], &[], start_index)?; .get_string_until_any_or_rewind(&[BACKTICK, LB], &[], start_index)?;
content = remove_single_backlslash(content);
self.ctm.assert_char(&BACKTICK, Some(start_index))?; self.ctm.assert_char(&BACKTICK, Some(start_index))?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
@ -291,7 +343,7 @@ impl ParseInline for Parser {
name, name,
}) })
} else { } else {
Err(self.ctm.rewind_with_error(start_index)) Err(self.ctm.rewind_with_error(start_index).into())
} }
} }
@ -308,7 +360,7 @@ impl ParseInline for Parser {
)?; )?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
if color.is_empty() { if color.is_empty() {
return Err(self.ctm.err()); return Err(self.ctm.err().into());
} }
Ok(Colored { Ok(Colored {
value: Box::new(self.parse_inline()?), value: Box::new(self.parse_inline()?),
@ -328,14 +380,22 @@ impl ParseInline for Parser {
let bib_ref = BibRef::new(key.clone()); let bib_ref = BibRef::new(key.clone());
let ref_entry = Arc::new(RwLock::new(BibReference::new( let ref_entry = Arc::new(RwLock::new(BibReference::new(
key, key,
self.document.config.get_ref_entry(BIB_REF_DISPLAY), Some(
self.options
.document
.config
.lock()
.style
.bib_ref_display
.clone(),
),
bib_ref.anchor(), bib_ref.anchor(),
))); )));
self.document self.options
.document
.bibliography .bibliography
.root_ref_anchor() .root_ref_anchor()
.lock() .lock()
.unwrap()
.insert(bib_ref); .insert(bib_ref);
Ok(ref_entry) Ok(ref_entry)
@ -366,10 +426,43 @@ impl ParseInline for Parser {
}))) })))
} }
/// Parses a reference to a glossary entry
fn parse_glossary_reference(&mut self) -> ParseResult<Arc<Mutex<GlossaryReference>>> {
let start_index = self.ctm.get_index();
self.ctm
.assert_char(&GLOSSARY_REF_START, Some(start_index))?;
self.ctm.seek_one()?;
let display = if self.ctm.check_char(&GLOSSARY_REF_START) {
self.ctm.seek_one()?;
GlossaryDisplay::Long
} else {
GlossaryDisplay::Short
};
let mut key =
self.ctm
.get_string_until_any_or_rewind(&WHITESPACE, &[TILDE], start_index)?;
if key.is_empty() {
return Err(self.ctm.rewind_with_error(start_index).into());
}
while !key.is_empty() && !key.chars().last().unwrap().is_alphabetic() {
self.ctm.rewind(self.ctm.get_index() - 1);
key = key[..key.len() - 1].to_string();
}
let reference = GlossaryReference::with_display(key, display);
Ok(self
.options
.document
.glossary
.lock()
.add_reference(reference))
}
/// parses plain text as a string until it encounters an unescaped special inline char /// parses plain text as a string until it encounters an unescaped special inline char
fn parse_plain(&mut self) -> ParseResult<PlainText> { fn parse_plain(&mut self) -> ParseResult<PlainText> {
if self.ctm.check_char(&LB) { if self.ctm.check_char(&LB) {
return Err(self.ctm.err()); return Err(self.ctm.err().into());
} }
let mut characters = String::new(); let mut characters = String::new();
if !self.ctm.check_char(&SPECIAL_ESCAPE) { if !self.ctm.check_char(&SPECIAL_ESCAPE) {
@ -377,10 +470,13 @@ impl ParseInline for Parser {
} }
while let Some(ch) = self.ctm.next_char() { while let Some(ch) = self.ctm.next_char() {
let index = self.ctm.get_index();
if self.ctm.check_any(&INLINE_SPECIAL_CHARS) if self.ctm.check_any(&INLINE_SPECIAL_CHARS)
|| self.ctm.check_any(&self.inline_break_at) || self.ctm.check_any(&self.inline_break_at)
|| self.ctm.check_any_sequence(&INLINE_SPECIAL_SEQUENCES)
|| (self.parse_variables && self.ctm.check_char(&TEMP_VAR_OPEN)) || (self.parse_variables && self.ctm.check_char(&TEMP_VAR_OPEN))
{ {
self.ctm.rewind(index);
break; break;
} }
if !self.ctm.check_char(&SPECIAL_ESCAPE) { if !self.ctm.check_char(&SPECIAL_ESCAPE) {
@ -391,7 +487,7 @@ impl ParseInline for Parser {
if characters.len() > 0 { if characters.len() > 0 {
Ok(PlainText { value: characters }) Ok(PlainText { value: characters })
} else { } else {
Err(self.ctm.err()) Err(self.ctm.err().into())
} }
} }
@ -415,7 +511,7 @@ impl ParseInline for Parser {
if values.len() == 0 { if values.len() == 0 {
// if there was a linebreak (the metadata wasn't closed) or there is no inner data // if there was a linebreak (the metadata wasn't closed) or there is no inner data
// return an error // return an error
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
Ok(InlineMetadata { data: values }) Ok(InlineMetadata { data: values })
@ -430,6 +526,7 @@ impl ParseInline for Parser {
self.ctm.seek_any(&INLINE_WHITESPACE)?; self.ctm.seek_any(&INLINE_WHITESPACE)?;
let mut value = MetadataValue::Bool(true); let mut value = MetadataValue::Bool(true);
if self.ctm.check_char(&EQ) { if self.ctm.check_char(&EQ) {
self.ctm.seek_one()?; self.ctm.seek_one()?;
self.ctm.seek_any(&INLINE_WHITESPACE)?; self.ctm.seek_any(&INLINE_WHITESPACE)?;
@ -490,18 +587,18 @@ impl ParseInline for Parser {
{ {
name_str name_str
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
}; };
self.ctm.seek_one()?;

let metadata = if let Ok(meta) = self.parse_inline_metadata() {
    Some(meta)
} else {
    None
};

if !self.ctm.check_eof() {
    self.ctm.seek_one()?;
}
let metadata = self.parse_inline_metadata().ok();
let placeholder = Arc::new(RwLock::new(Placeholder::new(name, metadata))); let placeholder = Arc::new(RwLock::new(Placeholder::new(name, metadata)));
self.document.add_placeholder(Arc::clone(&placeholder)); self.options
.document
.add_placeholder(Arc::clone(&placeholder));
Ok(placeholder) Ok(placeholder)
} }
@ -514,7 +611,7 @@ impl ParseInline for Parser {
self.ctm.seek_one()?; self.ctm.seek_one()?;
if self.ctm.check_char(&TEMPLATE) { if self.ctm.check_char(&TEMPLATE) {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
let mut elements = Vec::new(); let mut elements = Vec::new();
@ -566,4 +663,49 @@ impl ParseInline for Parser {
Ok(CharacterCode { code }) Ok(CharacterCode { code })
} }
/// Parses an arrow
fn parse_arrow(&mut self) -> ParseResult<Arrow> {
if !self.options.document.config.lock().features.smart_arrows {
Err(self.ctm.err().into())
} else if self.ctm.check_sequence(A_LEFT_RIGHT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::LeftRightArrow)
} else if self.ctm.check_sequence(A_RIGHT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::RightArrow)
} else if self.ctm.check_sequence(A_LEFT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::LeftArrow)
} else if self.ctm.check_sequence(A_BIG_LEFT_RIGHT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::BigLeftRightArrow)
} else if self.ctm.check_sequence(A_BIG_RIGHT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::BigRightArrow)
} else if self.ctm.check_sequence(A_BIG_LEFT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::BigLeftArrow)
} else {
Err(self.ctm.err().into())
}
}
/// Parses an anchor element
fn parse_anchor(&mut self) -> ParseResult<Anchor> {
let start_index = self.ctm.get_index();
self.ctm.assert_sequence(&ANCHOR_START, Some(start_index))?;
self.ctm.seek_one()?;
let key = self.ctm.get_string_until_any_or_rewind(
&[ANCHOR_STOP],
&INLINE_WHITESPACE,
start_index,
)?;
self.ctm.try_seek();
Ok(Anchor {
inner: Box::new(Line::Text(TextLine::new())),
key,
})
}
} }

@ -1,7 +1,14 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use super::ParseResult; use super::ParseResult;
use crate::elements::tokens::*; use crate::elements::tokens::*;
use crate::elements::Inline::LineBreak;
use crate::elements::{BibEntry, Metadata}; use crate::elements::{BibEntry, Metadata};
use crate::elements::{Cell, Centered, Header, Inline, Line, ListItem, Row, Ruler, TextLine}; use crate::elements::{Cell, Centered, Header, Line, ListItem, Row, Ruler, TextLine};
use crate::parser::inline::ParseInline; use crate::parser::inline::ParseInline;
use crate::Parser; use crate::Parser;
use bibliographix::bibliography::bibliography_entry::BibliographyEntry; use bibliographix::bibliography::bibliography_entry::BibliographyEntry;
@ -16,6 +23,7 @@ pub(crate) trait ParseLine {
fn parse_row(&mut self) -> ParseResult<Row>; fn parse_row(&mut self) -> ParseResult<Row>;
fn parse_centered(&mut self) -> ParseResult<Centered>; fn parse_centered(&mut self) -> ParseResult<Centered>;
fn parse_ruler(&mut self) -> ParseResult<Ruler>; fn parse_ruler(&mut self) -> ParseResult<Ruler>;
fn parse_paragraph_break(&mut self) -> ParseResult<TextLine>;
fn parse_text_line(&mut self) -> ParseResult<TextLine>; fn parse_text_line(&mut self) -> ParseResult<TextLine>;
fn parse_bib_entry(&mut self) -> ParseResult<BibEntry>; fn parse_bib_entry(&mut self) -> ParseResult<BibEntry>;
} }
@ -25,7 +33,7 @@ impl ParseLine for Parser {
fn parse_line(&mut self) -> ParseResult<Line> { fn parse_line(&mut self) -> ParseResult<Line> {
if self.ctm.check_eof() { if self.ctm.check_eof() {
log::trace!("EOF"); log::trace!("EOF");
Err(self.ctm.err()) Err(self.ctm.err().into())
} else { } else {
if let Ok(ruler) = self.parse_ruler() { if let Ok(ruler) = self.parse_ruler() {
log::trace!("Line::Ruler"); log::trace!("Line::Ruler");
@ -36,11 +44,14 @@ impl ParseLine for Parser {
} else if let Ok(bib) = self.parse_bib_entry() { } else if let Ok(bib) = self.parse_bib_entry() {
log::trace!("Line::BibEntry"); log::trace!("Line::BibEntry");
Ok(Line::BibEntry(bib)) Ok(Line::BibEntry(bib))
} else if let Ok(text) = self.parse_paragraph_break() {
log::trace!("Line::LineBreak");
Ok(Line::Text(text))
} else if let Ok(text) = self.parse_text_line() { } else if let Ok(text) = self.parse_text_line() {
log::trace!("Line::Text"); log::trace!("Line::Text");
Ok(Line::Text(text)) Ok(Line::Text(text))
} else { } else {
Err(self.ctm.err()) Err(self.ctm.err().into())
} }
} }
} }
@ -53,6 +64,9 @@ impl ParseLine for Parser {
self.ctm.get_text()[start_index..self.ctm.get_index()] self.ctm.get_text()[start_index..self.ctm.get_index()]
.iter() .iter()
.for_each(|e| anchor.push(*e)); .for_each(|e| anchor.push(*e));
if let Some(last) = self.section_anchors.last() {
anchor = format!("{}-{}", last, anchor);
}
anchor.retain(|c| !c.is_whitespace()); anchor.retain(|c| !c.is_whitespace());
log::trace!("Line::Header"); log::trace!("Line::Header");
Ok(Header::new(line, anchor)) Ok(Header::new(line, anchor))
@ -76,11 +90,11 @@ impl ParseLine for Parser {
} }
if !self.ctm.check_any(&INLINE_WHITESPACE) { if !self.ctm.check_any(&INLINE_WHITESPACE) {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
self.ctm.seek_any(&INLINE_WHITESPACE)?; self.ctm.seek_any(&INLINE_WHITESPACE)?;
if self.ctm.check_char(&MINUS) { if self.ctm.check_char(&MINUS) {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
let item = ListItem::new(self.parse_line()?, level as u16, ordered); let item = ListItem::new(self.parse_line()?, level as u16, ordered);
@ -96,7 +110,7 @@ impl ParseLine for Parser {
self.ctm.assert_char(&PIPE, Some(start_index))?; self.ctm.assert_char(&PIPE, Some(start_index))?;
self.ctm.seek_one()?; self.ctm.seek_one()?;
if self.ctm.check_char(&PIPE) { if self.ctm.check_char(&PIPE) {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
self.inline_break_at.push(PIPE); self.inline_break_at.push(PIPE);
@ -134,7 +148,7 @@ impl ParseLine for Parser {
log::trace!("Line::TableRow"); log::trace!("Line::TableRow");
Ok(row) Ok(row)
} else { } else {
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
@ -163,6 +177,8 @@ impl ParseLine for Parser {
/// Parses a line of text /// Parses a line of text
fn parse_text_line(&mut self) -> ParseResult<TextLine> { fn parse_text_line(&mut self) -> ParseResult<TextLine> {
let mut text = TextLine::new(); let mut text = TextLine::new();
let start_index = self.ctm.get_index();
while let Ok(subtext) = self.parse_inline() { while let Ok(subtext) = self.parse_inline() {
text.add_subtext(subtext); text.add_subtext(subtext);
if self.ctm.check_eof() || self.ctm.check_any(&self.inline_break_at) { if self.ctm.check_eof() || self.ctm.check_any(&self.inline_break_at) {
@ -170,21 +186,36 @@ impl ParseLine for Parser {
} }
} }
// add a linebreak when encountering \n\n
if self.ctm.check_char(&LB) {
    if let Ok(_) = self.ctm.seek_one() {
        if self.ctm.check_char(&LB) {
            text.add_subtext(Inline::LineBreak)
        }
    }
}

if self.ctm.check_char(&LB) {
    self.ctm.try_seek();

    if self.ctm.check_char(&LB) {
        text.add_subtext(LineBreak);
        self.ctm.try_seek();
    }
}
if text.subtext.len() > 0 { if text.subtext.len() > 0 {
Ok(text) Ok(text)
} else { } else {
Err(self.ctm.err()) Err(self.ctm.rewind_with_error(start_index).into())
} }
} }
/// Parses a paragraph break
fn parse_paragraph_break(&mut self) -> ParseResult<TextLine> {
let start_index = self.ctm.get_index();
self.ctm.assert_char(&LB, Some(start_index))?;
self.ctm.seek_one()?;
let mut line = TextLine::new();
line.subtext.push(LineBreak);
Ok(line)
}
fn parse_bib_entry(&mut self) -> ParseResult<BibEntry> { fn parse_bib_entry(&mut self) -> ParseResult<BibEntry> {
let start_index = self.ctm.get_index(); let start_index = self.ctm.get_index();
self.ctm.seek_any(&INLINE_WHITESPACE)?; self.ctm.seek_any(&INLINE_WHITESPACE)?;
@ -211,7 +242,7 @@ impl ParseLine for Parser {
msg, msg,
self.get_position_string() self.get_position_string()
); );
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
} else { } else {
@ -232,25 +263,26 @@ impl ParseLine for Parser {
msg, msg,
self.get_position_string() self.get_position_string()
); );
return Err(self.ctm.rewind_with_error(start_index)); return Err(self.ctm.rewind_with_error(start_index).into());
} }
} }
}; };
self.ctm.seek_whitespace();
self.document self.options
.document
.bibliography .bibliography
.entry_dictionary() .entry_dictionary()
.lock() .lock()
.unwrap()
.insert(entry); .insert(entry);
Ok(BibEntry { Ok(BibEntry {
entry: self entry: self
.options
.document .document
.bibliography .bibliography
.entry_dictionary() .entry_dictionary()
.lock() .lock()
.unwrap()
.get(&key) .get(&key)
.unwrap(), .unwrap(),
key, key,

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub(crate) mod block; pub(crate) mod block;
pub(crate) mod inline; pub(crate) mod inline;
pub(crate) mod line; pub(crate) mod line;
@ -5,176 +11,153 @@ pub(crate) mod line;
use self::block::ParseBlock; use self::block::ParseBlock;
use crate::elements::tokens::LB; use crate::elements::tokens::LB;
use crate::elements::{Document, ImportAnchor}; use crate::elements::{Document, ImportAnchor};
use crate::references::configuration::keys::{
    IMP_BIBLIOGRAPHY, IMP_CONFIGS, IMP_IGNORE, IMP_STYLESHEETS,
};
use crate::references::configuration::{Configuration, Value};
use crate::utils::downloads::DownloadManager;
use bibliographix::bib_manager::BibManager;
use charred::tapemachine::{CharTapeMachine, TapeError, TapeResult};

use crate::settings::SettingsError;
use charred::tapemachine::{CharTapeMachine, TapeError};
use crossbeam_utils::sync::WaitGroup; use crossbeam_utils::sync::WaitGroup;
use regex::Regex; use regex::Regex;
use std::collections::HashMap; use std::collections::HashMap;
use std::fmt;
use std::fs::{read_to_string, File}; use std::fs::{read_to_string, File};
use std::io; use std::io::{self, BufReader};
use std::io::{BufRead, BufReader, Cursor};
use std::path::PathBuf; use std::path::PathBuf;
use std::sync::{Arc, Mutex, RwLock}; use std::sync::{Arc, Mutex, RwLock};
use std::thread; use std::thread;
pub type ParseResult<T> = TapeResult<T>; pub type ParseResult<T> = Result<T, ParseError>;
pub type ParseError = TapeError;
#[derive(Debug)]
pub enum ParseError {
TapeError(TapeError),
SettingsError(SettingsError),
IoError(io::Error),
}
impl fmt::Display for ParseError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
ParseError::TapeError(e) => write!(f, "{}", e),
ParseError::SettingsError(e) => write!(f, "{}", e),
ParseError::IoError(e) => write!(f, "IO Error: {}", e),
}
}
}
impl From<TapeError> for ParseError {
fn from(e: TapeError) -> Self {
Self::TapeError(e)
}
}
impl From<SettingsError> for ParseError {
fn from(e: SettingsError) -> Self {
Self::SettingsError(e)
}
}
impl From<io::Error> for ParseError {
fn from(e: io::Error) -> Self {
Self::IoError(e)
}
}
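A minimal sketch of why these From impls matter: any function returning ParseResult<T> can use the ? operator on a TapeError, SettingsError or io::Error and get a ParseError back automatically. The helper below is hypothetical and assumes ParseResult from this module is in scope.

fn read_manifest_text(path: &std::path::Path) -> ParseResult<String> {
    // read_to_string returns io::Result<String>;
    // ? converts the error through From<io::Error> for ParseError.
    let text = std::fs::read_to_string(path)?;
    Ok(text)
}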
#[derive(Clone, Debug)]
pub struct ParserOptions {
pub path: Option<PathBuf>,
pub paths: Arc<Mutex<Vec<PathBuf>>>,
pub document: Document,
pub is_child: bool,
}
impl Default for ParserOptions {
fn default() -> Self {
Self {
path: None,
paths: Arc::new(Mutex::new(Vec::new())),
document: Document::new(),
is_child: false,
}
}
}
impl ParserOptions {
/// Adds a path to the parser options
pub fn add_path(mut self, path: PathBuf) -> Self {
self.path = Some(path.clone());
self.paths.lock().unwrap().push(path);
        self
    }
}

const DEFAULT_IMPORTS: &'static [(&str, &str)] = &[
    ("snekdown.toml", "manifest"),
    ("manifest.toml", "manifest"),
    ("bibliography.toml", "bibliography"),
    ("bibliography2.bib.toml", "bibliography"),
    ("style.css", "stylesheet"),
];
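A minimal usage sketch of ParserOptions (module path assumed): add_path records the file both as the parser's own path and in the shared list used for duplicate-import detection.

use snekdown::parser::ParserOptions; // module path assumed
use std::path::PathBuf;

fn main() {
    let options = ParserOptions::default().add_path(PathBuf::from("document.md"));
    // The shared `paths` list now contains the added path once.
    assert_eq!(options.paths.lock().unwrap().len(), 1);
}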
pub struct Parser {
    pub(crate) options: ParserOptions,
    pub(crate) ctm: CharTapeMachine,
    section_nesting: u8,
    sections: Vec<u8>,
    section_anchors: Vec<String>,
    section_return: Option<u8>,
    wg: WaitGroup,
    pub(crate) block_break_at: Vec<char>,
    pub(crate) inline_break_at: Vec<char>,
    pub(crate) parse_variables: bool,
}
impl Parser {
    /// Creates a new parser with the default values given
    pub fn with_defaults(options: ParserOptions) -> Self {
        let text = if let Some(path) = &options.path {
            let mut text = read_to_string(&path).unwrap();
            text = text.replace("\r\n", "\n");
            if text.chars().last() != Some('\n') {
                text.push('\n');
            }

            text
        } else {
            "".to_string()
        };
Parser::create(
path,
Arc::new(Mutex::new(Vec::new())),
false,
Box::new(Cursor::new(text_bytes.to_vec())),
BibManager::new(),
Arc::new(Mutex::new(DownloadManager::new())),
)
}
/// Creates a child parser from string text
pub fn child(
text: String,
path: PathBuf,
paths: Arc<Mutex<Vec<PathBuf>>>,
bib_manager: BibManager,
download_manager: Arc<Mutex<DownloadManager>>,
) -> Self {
let text_bytes = text.as_bytes();
Self::create(
Some(PathBuf::from(path)),
paths,
true,
Box::new(Cursor::new(text_bytes.to_vec())),
bib_manager,
download_manager,
)
}
/// Creates a child parser from a file
pub fn child_from_file(
path: PathBuf,
paths: Arc<Mutex<Vec<PathBuf>>>,
bib_manager: BibManager,
download_manager: Arc<Mutex<DownloadManager>>,
) -> Result<Self, io::Error> {
let f = File::open(&path)?;
Ok(Self::create(
Some(PathBuf::from(path)),
paths,
true,
Box::new(BufReader::new(f)),
bib_manager,
download_manager,
))
}
fn create(
path: Option<PathBuf>,
paths: Arc<Mutex<Vec<PathBuf>>>,
is_child: bool,
mut reader: Box<dyn BufRead>,
bib_manager: BibManager,
download_manager: Arc<Mutex<DownloadManager>>,
) -> Self {
if let Some(path) = path.clone() {
paths.lock().unwrap().push(path.clone())
}
let mut text = String::new();
reader
.read_to_string(&mut text)
.expect("Failed to read file");
if text.chars().last() != Some('\n') {
text.push('\n');
}
let document = Document::new_with_manager(!is_child, bib_manager, download_manager);
        Self {
            options,
            sections: Vec::new(),
            section_anchors: Vec::new(),
            section_nesting: 0,
            section_return: None,
            wg: WaitGroup::new(),
            ctm: CharTapeMachine::new(text.chars().collect()),
            inline_break_at: Vec::new(),
            block_break_at: Vec::new(),
            parse_variables: false,
        }
    }

    /// Creates a new child parser
    fn create_child(&self, path: PathBuf) -> Self {
        let mut options = self.options.clone().add_path(path.clone());
        options.document = self.options.document.create_child();
        options.document.path = Some(path.to_str().unwrap().to_string());
        options.is_child = true;

        Self::with_defaults(options)
    }
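A sketch of driving the parser through the new options-based constructor; module paths are assumptions, not confirmed exports.

use snekdown::parser::{Parser, ParserOptions}; // paths assumed
use std::path::PathBuf;

fn main() {
    let options = ParserOptions::default().add_path(PathBuf::from("document.md"));
    let mut parser = Parser::with_defaults(options);
    // parse() consumes the tape and returns the finished Document.
    let document = parser.parse();
    println!("parsed {} top-level blocks", document.elements.len());
}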
    /// Returns a string of the current position in the file
    pub(crate) fn get_position_string(&self) -> String {
        let char_index = self.ctm.get_index();
        self.get_position_string_for_index(char_index)
    }

    /// Returns a string of the given index position in the file
    fn get_position_string_for_index(&self, char_index: usize) -> String {
        let text = self.ctm.get_text();
        let mut text_unil = text[..char_index].to_vec();
        let line_number = text_unil.iter().filter(|c| c == &&LB).count();
        text_unil.reverse();
        let mut inline_pos = 0;

        while inline_pos < text_unil.len() && text_unil[inline_pos] != LB {
            inline_pos += 1;
        }
        if let Some(path) = &self.options.path {
            format!("{}:{}:{}", path.to_str().unwrap(), line_number, inline_pos)
        } else {
            format!("{}:{}", line_number, inline_pos)
@ -186,7 +169,7 @@ impl Parser {
        let mut path = PathBuf::from(path);
        if !path.is_absolute() {
            if let Some(selfpath) = &self.options.path {
                if let Some(dir) = selfpath.parent() {
                    path = PathBuf::new().join(dir).join(path);
                }
@ -204,33 +187,15 @@ impl Parser {
                path.to_str().unwrap(),
                self.get_position_string(),
            );
            return Err(self.ctm.assert_error(None).into());
}
{
let mut paths = self.paths.lock().unwrap();
if paths.iter().find(|item| **item == path) != None {
log::warn!(
"Import of \"{}\" failed: Already imported.\n\t--> {}\n",
path.to_str().unwrap(),
self.get_position_string(),
);
return Err(self.ctm.assert_error(None));
}
paths.push(path.clone());
        }
        let anchor = Arc::new(RwLock::new(ImportAnchor::new()));
        let anchor_clone = Arc::clone(&anchor);
        let wg = self.wg.clone();
        let mut child_parser = self.create_child(path.clone());

        let _ = thread::spawn(move || {
            let document = child_parser.parse();
            anchor_clone.write().unwrap().set_document(document);
            drop(wg);
@ -242,7 +207,8 @@ impl Parser {
    /// Imports a bibliography toml file
    fn import_bib(&mut self, path: PathBuf) -> ParseResult<()> {
        let f = File::open(path).map_err(|_| self.ctm.err())?;
        self.options
            .document
            .bibliography
            .read_bib_file(&mut BufReader::new(f))
            .map_err(|_| self.ctm.err())?;
@ -252,15 +218,15 @@ impl Parser {
    /// Returns the text of an imported text file
    fn import_text_file(&self, path: PathBuf) -> ParseResult<String> {
        read_to_string(path).map_err(ParseError::from)
    }

    fn import_stylesheet(&mut self, path: PathBuf) -> ParseResult<()> {
        self.options.document.stylesheets.push(
            self.options
                .document
                .downloads
                .lock()
                .add_download(path.to_str().unwrap().to_string()),
        );
@ -268,17 +234,32 @@ impl Parser {
    }

    fn import_manifest(&mut self, path: PathBuf) -> ParseResult<()> {
        self.options
            .document
            .config
            .lock()
            .merge(path)
            .map_err(ParseError::from)
    }

    /// Imports a glossary
    fn import_glossary(&self, path: PathBuf) -> ParseResult<()> {
        let contents = self.import_text_file(path)?;
        let value = contents
            .parse::<toml::Value>()
            .map_err(|_| self.ctm.err())?;
        self.options
            .document
            .glossary
            .lock()
            .assign_from_toml(value)
            .unwrap_or_else(|e| log::error!("{}", e));
        Ok(())
    }
    /// Imports a path
    fn import(&mut self, path: String, args: &HashMap<String, String>) -> ImportType {
        log::debug!(
            "Importing file {}\n\t--> {}\n",
            path,
@ -297,22 +278,24 @@ impl Parser {
            .file_name()
            .and_then(|f| Some(f.to_str().unwrap().to_string()))
        {
            let ignore = &self.options.document.config.lock().imports.ignored_imports;
            if ignore.contains(&fname) {
                return ImportType::None;
            }
        }
        {
            let mut paths = self.options.paths.lock().unwrap();
            if paths.iter().find(|item| **item == path).is_some() {
                log::warn!(
                    "Import of \"{}\" failed: Already imported.\n\t--> {}\n",
                    path.to_str().unwrap(),
                    self.get_position_string(),
                );
                return ImportType::None;
            }
            paths.push(path.clone());
        }

        match args.get("type").cloned() {
            Some(s) if s == "stylesheet".to_string() => {
                ImportType::Stylesheet(self.import_stylesheet(path))
            }
@ -325,6 +308,9 @@ impl Parser {
            Some(s) if s == "manifest".to_string() || s == "config" => {
                ImportType::Manifest(self.import_manifest(path))
            }
Some(s) if s == "glossary".to_string() => {
ImportType::Glossary(self.import_glossary(path))
}
            _ => {
                lazy_static::lazy_static! {
                    static ref BIB_NAME: Regex = Regex::new(r".*\.bib\.toml$").unwrap();
@ -345,7 +331,7 @@ impl Parser {
    /// parses the given text into a document
    pub fn parse(&mut self) -> Document {
        self.options.document.path = if let Some(path) = &self.options.path {
            Some(path.canonicalize().unwrap().to_str().unwrap().to_string())
        } else {
            None
@ -353,12 +339,23 @@ impl Parser {
        while !self.ctm.check_eof() {
            match self.parse_block() {
                Ok(block) => self.options.document.add_element(block),
                Err(err) => {
                    if self.ctm.check_eof() {
                        break;
                    }
                    match err {
                        ParseError::TapeError(t) => {
                            log::error!(
                                "Parse Error: {}\n\t--> {}\n",
                                t,
                                self.get_position_string_for_index(t.get_index())
                            )
                        }
                        _ => {
                            log::error!("{}", err)
                        }
                    }
                    break;
                }
            }
@ -366,63 +363,47 @@ impl Parser {
        let wg = self.wg.clone();
        self.wg = WaitGroup::new();

        if !self.options.is_child {
            self.import(
                "Manifest.toml".to_string(),
                &maplit::hashmap! {"type".to_string() => "manifest".to_string()},
            );
        }
        wg.wait();
        if !self.options.is_child {
            self.import_from_config();
        }
        self.options.document.post_process();
        let document = std::mem::replace(&mut self.options.document, Document::new());

        document
    }

    pub fn get_paths(&self) -> Vec<PathBuf> {
        self.options.paths.lock().unwrap().clone()
    }
    /// Imports files from the configs import values
    fn import_from_config(&mut self) {
        let config = Arc::clone(&self.options.document.config);

        let mut stylesheets = config.lock().imports.included_stylesheets.clone();
        let args = maplit::hashmap! {"type".to_string() => "stylesheet".to_string()};
        while let Some(s) = stylesheets.pop() {
            self.import(s, &args);
        }

        let mut bibliography = config.lock().imports.included_bibliography.clone();
        let args = maplit::hashmap! {"type".to_string() => "bibliography".to_string()};
        while let Some(s) = bibliography.pop() {
            self.import(s, &args);
        }

        let mut glossaries = config.lock().imports.included_glossaries.clone();
        let args = maplit::hashmap! {"type".to_string() =>"glossary".to_string()};
        while let Some(s) = glossaries.pop() {
            self.import(s, &args);
        }
    }
}
@ -432,5 +413,6 @@ pub(crate) enum ImportType {
    Stylesheet(ParseResult<()>),
    Bibliography(ParseResult<()>),
    Manifest(ParseResult<()>),
    Glossary(ParseResult<()>),
    None,
}

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{Anchor, BoldText, ItalicText, Line, List, ListItem, PlainText, TextLine};
use crate::elements::{Inline, Url};
use bibliographix::bibliography::bib_types::article::Article;
@ -17,51 +23,12 @@ use bibliographix::bibliography::bibliography_entry::{
    BibliographyEntry, BibliographyEntryReference,
};
macro_rules! plain_text {
($e:expr) => {
Inline::Plain(PlainText { value: $e })
};
}
macro_rules! bold_text {
($e:expr) => {
Inline::Bold(BoldText {
value: vec![Inline::Plain(PlainText { value: $e })],
})
};
}
macro_rules! italic_text {
($e:expr) => {
Inline::Italic(ItalicText {
value: vec![Inline::Plain(PlainText { value: $e })],
})
};
}
macro_rules! url_text {
($e:expr) => {
Inline::Url(Url {
url: $e,
description: None,
})
};
}
macro_rules! list_item {
($e:expr, $k:expr) => {
ListItem::new(
Line::Anchor(Anchor {
inner: Box::new(Line::Text($e)),
key: $k,
}),
0,
true,
)
};
}
const DATE_FORMAT: &str = "%d.%m.%Y";
use crate::bold_text;
use crate::italic_text;
use crate::list_item;
use crate::plain_text;
use crate::url_text;
/// Creates a list from a list of bib items
pub fn create_bib_list(entries: Vec<BibliographyEntryReference>) -> List {
@ -72,7 +39,6 @@ pub fn create_bib_list(entries: Vec<BibliographyEntryReference>) -> List {
    for entry in entries {
        entry
            .lock()
            .raw_fields
            .insert("ord".to_string(), count.to_string());
        list.add_item(get_item_for_entry(entry));
@ -84,7 +50,7 @@ pub fn create_bib_list(entries: Vec<BibliographyEntryReference>) -> List {
/// Returns the list item for a bib entry
fn get_item_for_entry(entry: BibliographyEntryReference) -> ListItem {
    let entry = entry.lock();
    match &entry.bib_type {
        BibliographyType::Article(a) => get_item_for_article(&*entry, a),

@ -1,7 +0,0 @@
pub const BIB_REF_DISPLAY: &str = "bib-ref-display";
pub const META_LANG: &str = "language";
pub const IMP_IGNORE: &str = "ignored-imports";
pub const IMP_STYLESHEETS: &str = "included-stylesheets";
pub const IMP_CONFIGS: &str = "included-configs";
pub const IMP_BIBLIOGRAPHY: &str = "included-bibliography";

@ -1,156 +0,0 @@
use crate::elements::MetadataValue;
use crate::references::configuration::keys::{BIB_REF_DISPLAY, META_LANG};
use crate::references::templates::Template;
use serde::export::TryFrom;
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
pub(crate) mod keys;
#[derive(Clone, Debug)]
pub enum Value {
String(String),
Bool(bool),
Float(f64),
Integer(i64),
Template(Template),
Array(Vec<Value>),
}
#[derive(Clone, Debug)]
pub struct ConfigEntry {
inner: Value,
}
pub type ConfigRefEntry = Arc<RwLock<ConfigEntry>>;
#[derive(Clone, Debug)]
pub struct Configuration {
config: Arc<RwLock<HashMap<String, ConfigRefEntry>>>,
}
impl Value {
pub fn as_string(&self) -> String {
match self {
Value::String(string) => string.clone(),
Value::Integer(int) => format!("{}", int),
Value::Float(f) => format!("{:02}", f),
Value::Bool(b) => format!("{}", b),
Value::Array(a) => a.iter().fold("".to_string(), |a, b| {
format!("{} \"{}\"", a, b.as_string())
}),
_ => "".to_string(),
}
}
}
impl ConfigEntry {
pub fn new(value: Value) -> Self {
Self { inner: value }
}
pub fn set(&mut self, value: Value) {
self.inner = value;
}
pub fn get(&self) -> &Value {
&self.inner
}
}
impl Default for Configuration {
fn default() -> Self {
let mut self_config = Self::new();
self_config.set(BIB_REF_DISPLAY, Value::String("{{number}}".to_string()));
self_config.set(META_LANG, Value::String("en".to_string()));
self_config
}
}
impl Configuration {
pub fn new() -> Self {
Self {
config: Arc::new(RwLock::new(HashMap::new())),
}
}
/// returns the value of a config entry
pub fn get_entry(&self, key: &str) -> Option<ConfigEntry> {
let config = self.config.read().unwrap();
if let Some(entry) = config.get(key) {
let value = entry.read().unwrap();
Some(value.clone())
} else {
None
}
}
/// returns a config entry that is a reference to a value
pub fn get_ref_entry(&self, key: &str) -> Option<ConfigRefEntry> {
let config = self.config.read().unwrap();
if let Some(entry) = config.get(&key.to_string()) {
Some(Arc::clone(entry))
} else {
None
}
}
/// Sets a config parameter
pub fn set(&mut self, key: &str, value: Value) {
let mut config = self.config.write().unwrap();
if let Some(entry) = config.get(&key.to_string()) {
entry.write().unwrap().set(value)
} else {
config.insert(
key.to_string(),
Arc::new(RwLock::new(ConfigEntry::new(value))),
);
}
}
/// Sets a config value based on a metadata value
pub fn set_from_meta(&mut self, key: &str, value: MetadataValue) {
match value {
MetadataValue::String(string) => self.set(key, Value::String(string)),
MetadataValue::Bool(bool) => self.set(key, Value::Bool(bool)),
MetadataValue::Float(f) => self.set(key, Value::Float(f)),
MetadataValue::Integer(i) => self.set(key, Value::Integer(i)),
MetadataValue::Template(t) => self.set(key, Value::Template(t)),
_ => {}
}
}
pub fn set_from_toml(&mut self, value: &toml::Value) -> Option<()> {
let table = value.as_table().cloned()?;
table.iter().for_each(|(k, v)| {
match v {
toml::Value::Table(_) => self.set_from_toml(v).unwrap_or(()),
_ => self.set(k, Value::try_from(v.clone()).unwrap()),
};
});
Some(())
}
}
impl TryFrom<toml::Value> for Value {
type Error = ();
fn try_from(value: toml::Value) -> Result<Self, Self::Error> {
match value {
toml::Value::Table(_) => Err(()),
toml::Value::Float(f) => Ok(Value::Float(f)),
toml::Value::Integer(i) => Ok(Value::Integer(i)),
toml::Value::String(s) => Ok(Value::String(s)),
toml::Value::Boolean(b) => Ok(Value::Bool(b)),
toml::Value::Datetime(dt) => Ok(Value::String(dt.to_string())),
toml::Value::Array(a) => Ok(Value::Array(
a.iter()
.cloned()
.filter_map(|e| Value::try_from(e).ok())
.collect::<Vec<Value>>(),
)),
}
}
}

@ -0,0 +1,195 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{
Anchor, BoldText, Inline, ItalicText, Line, List, ListItem, PlainText, TextLine,
};
use parking_lot::Mutex;
use std::cmp::Ordering;
use std::collections::HashMap;
use std::sync::Arc;
use crate::bold_text;
use crate::italic_text;
use crate::plain_text;
const K_LONG: &str = "long";
const K_DESCRIPTION: &str = "description";
/// A glossary manager responsible for handling glossary entries and references to those entries
#[derive(Clone, Debug)]
pub struct GlossaryManager {
entries: HashMap<String, Arc<Mutex<GlossaryEntry>>>,
references: Vec<Arc<Mutex<GlossaryReference>>>,
}
/// A single glossary entry
#[derive(Clone, Debug)]
pub struct GlossaryEntry {
pub short: String,
pub long: String,
pub description: String,
pub is_assigned: bool,
}
/// A single glossary reference
#[derive(Clone, Debug)]
pub struct GlossaryReference {
pub short: String,
pub display: GlossaryDisplay,
pub entry: Option<Arc<Mutex<GlossaryEntry>>>,
}
/// A glossary display value that determines which value
/// of a glossary entry will be rendered
#[derive(Clone, Debug)]
pub enum GlossaryDisplay {
Short,
Long,
}
impl GlossaryManager {
/// Creates a new glossary manager
pub fn new() -> Self {
Self {
entries: HashMap::new(),
references: Vec::new(),
}
}
/// Adds a new glossary entry to the manager
pub fn add_entry(&mut self, entry: GlossaryEntry) -> Arc<Mutex<GlossaryEntry>> {
let key = entry.short.clone();
let entry = Arc::new(Mutex::new(entry));
self.entries.insert(key.clone(), Arc::clone(&entry));
log::debug!("Added glossary entry {}", key);
entry
}
/// Adds a new glossary reference to the manager
pub fn add_reference(&mut self, reference: GlossaryReference) -> Arc<Mutex<GlossaryReference>> {
let reference = Arc::new(Mutex::new(reference));
self.references.push(Arc::clone(&reference));
reference
}
/// Assigns glossary entries from toml
pub fn assign_from_toml(&mut self, value: toml::Value) -> Result<(), String> {
let table = value.as_table().ok_or("Failed to parse toml".to_string())?;
log::debug!("Assigning glossary entries from toml...");
for (key, value) in table {
let long = value.get(K_LONG).and_then(|l| l.as_str());
let description = value.get(K_DESCRIPTION).and_then(|d| d.as_str());
if let Some(long) = long {
if let Some(description) = description {
let entry = GlossaryEntry {
description: description.to_string(),
long: long.to_string(),
short: key.clone(),
is_assigned: false,
};
self.add_entry(entry);
} else {
log::warn!(
"Failed to parse glossary entry {}: Missing field '{}'",
key,
K_DESCRIPTION
);
}
} else {
log::warn!(
"Failed to parse glossary entry {}: Missing field '{}'",
key,
K_LONG
);
}
}
Ok(())
}
/// Assigns entries to references
pub fn assign_entries_to_references(&self) {
for reference in &self.references {
let mut reference = reference.lock();
if let Some(entry) = self.entries.get(&reference.short) {
reference.entry = Some(Arc::clone(entry));
let mut entry = entry.lock();
if !entry.is_assigned {
entry.is_assigned = true;
reference.display = GlossaryDisplay::Long;
}
}
}
}
/// Creates a sorted glossary list from the glossary entries
pub fn create_glossary_list(&self) -> List {
let mut list = List::new();
let mut entries = self
.entries
.values()
.filter(|e| e.lock().is_assigned)
.cloned()
.collect::<Vec<Arc<Mutex<GlossaryEntry>>>>();
entries.sort_by(|a, b| {
let a = a.lock();
let b = b.lock();
if a.short > b.short {
Ordering::Greater
} else if a.short < b.short {
Ordering::Less
} else {
Ordering::Equal
}
});
for entry in &entries {
let entry = entry.lock();
let mut line = TextLine::new();
line.subtext.push(bold_text!(entry.short.clone()));
line.subtext.push(plain_text!(" - ".to_string()));
line.subtext.push(italic_text!(entry.long.clone()));
line.subtext.push(plain_text!(" - ".to_string()));
line.subtext.push(plain_text!(entry.description.clone()));
list.add_item(ListItem::new(
Line::Anchor(Anchor {
inner: Box::new(Line::Text(line)),
key: entry.short.clone(),
}),
0,
false,
));
}
list
}
}
impl GlossaryReference {
/// Creates a new glossary reference
pub fn new(key: String) -> Self {
Self {
short: key,
display: GlossaryDisplay::Short,
entry: None,
}
}
/// Creates a new glossary reference with a given display parameter
pub fn with_display(key: String, display: GlossaryDisplay) -> Self {
Self {
short: key,
display,
entry: None,
}
}
}
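A minimal usage sketch of the glossary manager; the module path and the external visibility of these types are assumptions.

use snekdown::references::glossary::{GlossaryDisplay, GlossaryManager, GlossaryReference}; // path assumed

fn main() {
    let mut manager = GlossaryManager::new();
    // Entries come from a toml table: [SHORT] with `long` and `description` fields.
    let toml: toml::Value = r#"
        [HTML]
        long = "Hypertext Markup Language"
        description = "The markup language of the web"
    "#
    .parse()
    .unwrap();
    manager.assign_from_toml(toml).unwrap();

    // The first reference to an entry is switched to the long display form.
    manager.add_reference(GlossaryReference::with_display(
        "HTML".to_string(),
        GlossaryDisplay::Short,
    ));
    manager.assign_entries_to_references();

    // Only entries that were actually referenced end up in the rendered list.
    let list = manager.create_glossary_list();
    assert_eq!(list.items.len(), 1);
}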

@ -1,4 +1,10 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod bibliography;
pub mod glossary;
pub mod placeholders;
pub mod templates;

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::*;
use crate::references::bibliography::create_bib_list;
use chrono::prelude::*;
@ -31,9 +37,12 @@ const S_VALUE: &str = "value";
const P_TOC: &str = "toc";
const P_BIB: &str = "bib";
const P_GLS: &str = "gls";
const P_DATE: &str = "date";
const P_TIME: &str = "time";
const P_DATETIME: &str = "datetime";
const P_AUTHOR: &str = "author";
const P_TITLE: &str = "title";
impl ProcessPlaceholders for Document {
    /// parses all placeholders and assigns values to them
@ -52,6 +61,9 @@ impl ProcessPlaceholders for Document {
                P_BIB => pholder.set_value(block!(Block::List(create_bib_list(
                    self.bibliography.get_entry_list_by_occurrence()
                )))),
P_GLS => pholder.set_value(block!(Block::List(
self.glossary.lock().create_glossary_list()
))),
                P_DATE => pholder.set_value(inline!(Inline::Plain(PlainText {
                    value: get_date_string()
                }))),
@ -61,10 +73,24 @@ impl ProcessPlaceholders for Document {
                P_DATETIME => pholder.set_value(inline!(Inline::Plain(PlainText {
                    value: format!("{} {}", get_date_string(), get_time_string())
                }))),
P_AUTHOR => {
if let Some(value) = self.config.lock().metadata.author.clone() {
pholder.set_value(inline!(Inline::Plain(PlainText { value })))
}
}
P_TITLE => {
if let Some(value) = self.config.lock().metadata.title.clone() {
pholder.set_value(inline!(Inline::Plain(PlainText { value })))
}
}
                _ => {
                    if let Some(value) = self
                        .config
                        .lock()
                        .custom_attributes
                        .get(pholder.name.to_lowercase().as_str())
                        .cloned()
                    {
                        pholder.set_value(inline!(Inline::Plain(PlainText { value })))
                    }
                }
@ -90,7 +116,7 @@ impl ProcessPlaceholders for Document {
            })));
            if let Some(meta) = &pholder.metadata {
                if let Some(value) = meta.data.get(S_VALUE) {
                    self.config.lock().set_from_meta(key, value.clone())
                }
            }
        }

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{Block, Element, Inline, Line, ListItem};
use std::collections::HashMap;
use std::sync::{Arc, RwLock};

@ -0,0 +1,24 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct FeatureSettings {
pub embed_external: bool,
pub smart_arrows: bool,
pub include_mathjax: bool,
}
impl Default for FeatureSettings {
fn default() -> Self {
Self {
embed_external: true,
smart_arrows: true,
include_mathjax: true,
}
}
}

@ -0,0 +1,24 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct ImageSettings {
pub format: Option<String>,
pub max_width: Option<u32>,
pub max_height: Option<u32>,
}
impl Default for ImageSettings {
fn default() -> Self {
Self {
format: None,
max_height: None,
max_width: None,
}
}
}

@ -0,0 +1,26 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct ImportSettings {
pub ignored_imports: Vec<String>,
pub included_stylesheets: Vec<String>,
pub included_bibliography: Vec<String>,
pub included_glossaries: Vec<String>,
}
impl Default for ImportSettings {
fn default() -> Self {
Self {
ignored_imports: Vec::with_capacity(0),
included_stylesheets: vec!["style.css".to_string()],
included_bibliography: vec!["Bibliography.toml".to_string()],
included_glossaries: vec!["Glossary.toml".to_string()],
}
}
}

@ -0,0 +1,28 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct MetadataSettings {
pub title: Option<String>,
pub author: Option<String>,
pub description: Option<String>,
pub keywords: Vec<String>,
pub language: String,
}
impl Default for MetadataSettings {
fn default() -> Self {
Self {
title: None,
author: None,
description: None,
keywords: Vec::new(),
language: "en".to_string(),
}
}
}

@ -0,0 +1,131 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{Metadata, MetadataValue};
use crate::settings::feature_settings::FeatureSettings;
use crate::settings::image_settings::ImageSettings;
use crate::settings::import_settings::ImportSettings;
use crate::settings::metadata_settings::MetadataSettings;
use crate::settings::pdf_settings::PDFSettings;
use crate::settings::style_settings::StyleSettings;
use config::{ConfigError, Source};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::error::Error;
use std::fmt::{self, Display};
use std::io;
use std::mem;
use std::path::PathBuf;
pub mod feature_settings;
pub mod image_settings;
pub mod import_settings;
pub mod metadata_settings;
pub mod pdf_settings;
pub mod style_settings;
pub type SettingsResult<T> = Result<T, SettingsError>;
#[derive(Debug)]
pub enum SettingsError {
IoError(io::Error),
ConfigError(ConfigError),
TomlError(toml::ser::Error),
}
impl Display for SettingsError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
Self::IoError(e) => write!(f, "IO Error: {}", e),
Self::ConfigError(e) => write!(f, "Config Error: {}", e),
Self::TomlError(e) => write!(f, "Toml Error: {}", e),
}
}
}
impl Error for SettingsError {}
impl From<io::Error> for SettingsError {
fn from(e: io::Error) -> Self {
Self::IoError(e)
}
}
impl From<ConfigError> for SettingsError {
fn from(e: ConfigError) -> Self {
Self::ConfigError(e)
}
}
impl From<toml::ser::Error> for SettingsError {
fn from(e: toml::ser::Error) -> Self {
Self::TomlError(e)
}
}
#[derive(Serialize, Deserialize, Clone, Debug, Default)]
pub struct Settings {
pub metadata: MetadataSettings,
pub features: FeatureSettings,
pub imports: ImportSettings,
pub pdf: PDFSettings,
pub images: ImageSettings,
pub style: StyleSettings,
pub custom_attributes: HashMap<String, String>,
}
impl Source for Settings {
fn clone_into_box(&self) -> Box<dyn Source + Send + Sync> {
Box::new(self.clone())
}
fn collect(&self) -> Result<HashMap<String, config::Value>, config::ConfigError> {
let source_str =
toml::to_string(&self).map_err(|e| config::ConfigError::Foreign(Box::new(e)))?;
let result = toml::de::from_str(&source_str)
.map_err(|e| config::ConfigError::Foreign(Box::new(e)))?;
Ok(result)
}
}
impl Settings {
/// Loads the settings from the specified path
pub fn load(path: PathBuf) -> SettingsResult<Self> {
let mut settings = config::Config::default();
settings
.merge(Self::default())?
.merge(config::File::from(path))?;
let settings: Self = settings.try_into()?;
Ok(settings)
}
/// Merges the current settings with the settings from the given path
/// returning updated settings
pub fn merge(&mut self, path: PathBuf) -> SettingsResult<()> {
let mut settings = config::Config::default();
settings
.merge(self.clone())?
.merge(config::File::from(path))?;
let mut settings: Self = settings.try_into()?;
mem::swap(self, &mut settings); // replace the old settings with the new ones
Ok(())
}
pub fn append_metadata<M: Metadata>(&mut self, metadata: M) {
let entries = metadata.get_string_map();
for (key, value) in entries {
self.custom_attributes.insert(key, value);
}
}
pub fn set_from_meta(&mut self, key: &str, value: MetadataValue) {
self.custom_attributes
.insert(key.to_string(), value.to_string());
}
}
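A sketch of loading and merging the new Settings; the crate-level module path is an assumption.

use snekdown::settings::Settings; // module path assumed
use std::path::PathBuf;

fn main() {
    // Defaults first, then anything found in the manifest overrides them.
    let mut settings = Settings::default();
    if let Err(e) = settings.merge(PathBuf::from("Manifest.toml")) {
        eprintln!("falling back to defaults: {}", e);
    }
    println!("language: {}", settings.metadata.language);
}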

@ -0,0 +1,54 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PDFSettings {
pub display_header_footer: bool,
pub header_template: Option<String>,
pub footer_template: Option<String>,
pub page_height: Option<f32>,
pub page_width: Option<f32>,
pub page_scale: f32,
pub margin: PDFMarginSettings,
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PDFMarginSettings {
pub top: Option<f32>,
pub bottom: Option<f32>,
pub left: Option<f32>,
pub right: Option<f32>,
}
impl Default for PDFMarginSettings {
fn default() -> Self {
Self {
top: Some(0.5),
bottom: Some(0.5),
left: None,
right: None,
}
}
}
impl Default for PDFSettings {
fn default() -> Self {
Self {
display_header_footer: true,
header_template: Some("<div></div>".to_string()),
footer_template: Some(
include_str!("../format/chromium_pdf/assets/default-footer-template.html")
.to_string(),
),
page_height: None,
page_width: None,
page_scale: 1.0,
margin: Default::default(),
}
}
}

@ -0,0 +1,32 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct StyleSettings {
pub bib_ref_display: String,
pub theme: Theme,
}
impl Default for StyleSettings {
fn default() -> Self {
Self {
bib_ref_display: "{{number}}".to_string(),
theme: Theme::GitHub,
}
}
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub enum Theme {
GitHub,
SolarizedDark,
SolarizedLight,
OceanDark,
OceanLight,
MagicDark,
}

@ -0,0 +1,69 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use platform_dirs::{AppDirs, AppUI};
use sha2::Digest;
use std::fs;
use std::io;
use std::path::PathBuf;
#[derive(Clone, Debug)]
pub struct CacheStorage {
location: PathBuf,
}
impl CacheStorage {
pub fn new() -> Self {
lazy_static::lazy_static! {
static ref APP_DIRS: AppDirs = AppDirs::new(Some("snekdown"), AppUI::CommandLine).unwrap();
}
Self {
location: APP_DIRS.cache_dir.clone(),
}
}
/// Returns the cache path for a given file
pub fn get_file_path(&self, path: &PathBuf) -> PathBuf {
let mut hasher = sha2::Sha256::default();
hasher.update(path.to_string_lossy().as_bytes());
let mut file_name = PathBuf::from(format!("{:x}", hasher.finalize()));
if let Some(extension) = path.extension() {
file_name.set_extension(extension);
}
log::trace!("Cache path is {:?}", path);
return self.location.join(PathBuf::from(file_name));
}
/// Returns if the given file exists in the cache
pub fn has_file(&self, path: &PathBuf) -> bool {
let cache_path = self.get_file_path(path);
cache_path.exists()
}
    /// Reads the corresponding cache file
pub fn read(&self, path: &PathBuf) -> io::Result<Vec<u8>> {
let cache_path = self.get_file_path(path);
fs::read(cache_path)
}
    /// Writes into the corresponding cache file
pub fn write<R: AsRef<[u8]>>(&self, path: &PathBuf, contents: R) -> io::Result<()> {
let cache_path = self.get_file_path(path);
fs::write(cache_path, contents)
}
/// Clears the cache directory by deleting and recreating it
pub fn clear(&self) -> io::Result<()> {
fs::remove_dir_all(&self.location)?;
fs::create_dir(&self.location)
}
}
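A small round-trip sketch for CacheStorage; the module path is assumed and the write may fail if the cache directory does not exist yet.

use snekdown::utils::caching::CacheStorage; // module path assumed
use std::path::PathBuf;

fn main() -> std::io::Result<()> {
    let cache = CacheStorage::new();
    // The key is hashed into a file name inside the platform cache directory.
    let key = PathBuf::from("https://example.com/style.css");
    if !cache.has_file(&key) {
        cache.write(&key, b"body { margin: 0; }".to_vec())?;
    }
    let cached = cache.read(&key)?;
    assert!(!cached.is_empty());
    Ok(())
}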

@ -1,8 +1,16 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::utils::caching::CacheStorage;
use indicatif::{ProgressBar, ProgressStyle};
use parking_lot::Mutex;
use rayon::prelude::*;
use std::fs::read;
use std::path::PathBuf;
use std::sync::Arc;
/// A manager for downloading urls in parallel
#[derive(Clone, Debug)]
@ -20,7 +28,8 @@ impl DownloadManager {
    /// Adds a new pending download
    pub fn add_download(&mut self, path: String) -> Arc<Mutex<PendingDownload>> {
        let download = PendingDownload::new(path.clone());
        let pending = Arc::new(Mutex::new(download));
        self.downloads.push(Arc::clone(&pending));
        log::debug!("Added download {}", path);
log::debug!("Added download {}", path); log::debug!("Added download {}", path);
@ -30,7 +39,7 @@ impl DownloadManager {
    /// Downloads all download entries
    pub fn download_all(&self) {
        let pb = Arc::new(Mutex::new(ProgressBar::new(self.downloads.len() as u64)));
        pb.lock().set_style(
            ProgressStyle::default_bar()
                .template("Fetching Embeds: [{bar:40.cyan/blue}]")
                .progress_chars("=> "),
@ -38,10 +47,10 @@ impl DownloadManager {
        let pb_cloned = Arc::clone(&pb);
        self.downloads.par_iter().for_each_with(pb_cloned, |pb, d| {
            d.lock().download();
            pb.lock().inc(1);
        });
        pb.lock().finish_and_clear();
    }
}
@ -51,11 +60,18 @@ impl DownloadManager {
pub struct PendingDownload {
    pub(crate) path: String,
    pub(crate) data: Option<Vec<u8>>,
    pub(crate) use_cache: bool,
    cache: CacheStorage,
}

impl PendingDownload {
    pub fn new(path: String) -> Self {
        Self {
            path,
            data: None,
            use_cache: true,
            cache: CacheStorage::new(),
        }
    }

    /// Downloads the file and writes the content to the content field
@ -66,19 +82,52 @@ impl PendingDownload {
    /// Reads the file's content or downloads it if it doesn't exist in the filesystem
    fn read_content(&self) -> Option<Vec<u8>> {
        let path = PathBuf::from(&self.path);

        if path.exists() {
            read(path).ok()
        } else if let Some(contents) = self.read_from_cache() {
            log::debug!("Read {} from cache.", self.path.clone());
            Some(contents)
        } else {
            if let Some(data) = self.download_content() {
                self.store_to_cache(&data);
                Some(data)
            } else {
                None
            }
        }
    }
/// Stores the data to a cache file to retrieve it later
fn store_to_cache(&self, data: &Vec<u8>) {
if self.use_cache {
let path = PathBuf::from(&self.path);
self.cache
.write(&path, data.clone())
.unwrap_or_else(|_| log::warn!("Failed to write file to cache: {}", self.path));
}
}
fn read_from_cache(&self) -> Option<Vec<u8>> {
let path = PathBuf::from(&self.path);
if self.cache.has_file(&path) && self.use_cache {
self.cache.read(&path).ok()
} else {
None
        }
    }
    /// Downloads the content from the given url
    fn download_content(&self) -> Option<Vec<u8>> {
        download_path(self.path.clone())
    }
}
pub fn download_path(path: String) -> Option<Vec<u8>> {
reqwest::blocking::get(&path)
.ok()
.map(|c| c.bytes())
.and_then(|b| b.ok())
.map(|b| b.to_vec())
}
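A sketch of queuing and running downloads; the module path is assumed, DownloadManager::new() is taken from the removed constructor code above, and the downloaded bytes stay on the crate-internal pending handles.

use snekdown::utils::downloads::DownloadManager; // module path assumed

fn fetch_embeds() {
    let mut manager = DownloadManager::new();
    let _pending = manager.add_download("https://example.com/style.css".to_string());
    // Runs all pending downloads in parallel via rayon; results are cached on disk.
    manager.download_all();
}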

@ -0,0 +1,249 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::Metadata;
use crate::utils::caching::CacheStorage;
use crate::utils::downloads::download_path;
use image::imageops::FilterType;
use image::io::Reader as ImageReader;
use image::{GenericImageView, ImageFormat, ImageResult};
use indicatif::{ProgressBar, ProgressStyle};
use mime::Mime;
use parking_lot::Mutex;
use rayon::prelude::*;
use std::io;
use std::io::Cursor;
use std::path::PathBuf;
use std::sync::Arc;
#[derive(Clone, Debug)]
pub struct ImageConverter {
images: Vec<Arc<Mutex<PendingImage>>>,
target_format: Option<ImageFormat>,
target_size: Option<(u32, u32)>,
}
impl ImageConverter {
pub fn new() -> Self {
Self {
images: Vec::new(),
target_format: None,
target_size: None,
}
}
pub fn set_target_size(&mut self, target_size: (u32, u32)) {
self.target_size = Some(target_size)
}
pub fn set_target_format(&mut self, target_format: ImageFormat) {
self.target_format = Some(target_format);
}
/// Adds an image to convert
pub fn add_image(&mut self, path: PathBuf) -> Arc<Mutex<PendingImage>> {
let image = Arc::new(Mutex::new(PendingImage::new(path)));
self.images.push(image.clone());
image
}
/// Converts all images
pub fn convert_all(&mut self) {
let pb = Arc::new(Mutex::new(ProgressBar::new(self.images.len() as u64)));
pb.lock().set_style(
ProgressStyle::default_bar()
.template("Processing images: [{bar:40.cyan/blue}]")
.progress_chars("=> "),
);
self.images.par_iter().for_each(|image| {
let mut image = image.lock();
if let Err(e) = image.convert(self.target_format.clone(), self.target_size.clone()) {
log::error!("Failed to embed image {:?}: {}", image.path, e)
}
pb.lock().tick();
});
pb.lock().finish_and_clear();
}
}
#[derive(Clone, Debug)]
pub struct PendingImage {
pub path: PathBuf,
pub data: Option<Vec<u8>>,
cache: CacheStorage,
pub mime: Mime,
brightness: Option<i32>,
contrast: Option<f32>,
huerotate: Option<i32>,
grayscale: bool,
invert: bool,
}
impl PendingImage {
pub fn new(path: PathBuf) -> Self {
let mime = get_mime(&path);
Self {
path,
data: None,
cache: CacheStorage::new(),
mime,
brightness: None,
contrast: None,
grayscale: false,
invert: false,
huerotate: None,
}
}
pub fn assign_from_meta<M: Metadata>(&mut self, meta: &M) {
if let Some(brightness) = meta.get_integer("brightness") {
self.brightness = Some(brightness as i32);
}
if let Some(contrast) = meta.get_float("contrast") {
self.contrast = Some(contrast as f32);
}
if let Some(huerotate) = meta.get_float("huerotate") {
self.huerotate = Some(huerotate as i32);
}
self.grayscale = meta.get_bool("grayscale");
self.invert = meta.get_bool("invert");
}
/// Converts the image to the specified target format (specified by target_extension)
pub fn convert(
&mut self,
target_format: Option<ImageFormat>,
target_size: Option<(u32, u32)>,
) -> ImageResult<()> {
let format = target_format
.or_else(|| {
self.path
.extension()
.and_then(|extension| ImageFormat::from_extension(extension))
})
.unwrap_or(ImageFormat::Png);
let output_path = self.get_output_path(format, target_size);
self.mime = get_mime(&output_path);
if self.cache.has_file(&output_path) {
self.data = Some(self.cache.read(&output_path)?)
} else {
self.convert_image(format, target_size)?;
if let Some(data) = &self.data {
self.cache.write(&output_path, data)?;
}
}
Ok(())
}
/// Converts the image
fn convert_image(
&mut self,
format: ImageFormat,
target_size: Option<(u32, u32)>,
) -> ImageResult<()> {
let mut image = ImageReader::open(self.get_path()?)?.decode()?;
if let Some((width, height)) = target_size {
let dimensions = image.dimensions();
if dimensions.0 > width || dimensions.1 > height {
image = image.resize(width, height, FilterType::Lanczos3);
}
}
if let Some(brightness) = self.brightness {
image = image.brighten(brightness);
}
if let Some(contrast) = self.contrast {
image = image.adjust_contrast(contrast);
}
if let Some(rotate) = self.huerotate {
image = image.huerotate(rotate);
}
if self.grayscale {
image = image.grayscale();
}
if self.invert {
image.invert();
}
let data = Vec::new();
let mut writer = Cursor::new(data);
image.write_to(&mut writer, format)?;
self.data = Some(writer.into_inner());
Ok(())
}
/// Returns the path of the file
fn get_path(&self) -> io::Result<PathBuf> {
if !self.path.exists() {
if self.cache.has_file(&self.path) {
return Ok(self.cache.get_file_path(&self.path));
}
if let Some(data) = download_path(self.path.to_string_lossy().to_string()) {
self.cache.write(&self.path, data)?;
return Ok(self.cache.get_file_path(&self.path));
}
}
Ok(self.path.clone())
}
/// Returns the output file name after converting the image
fn get_output_path(
&self,
target_format: ImageFormat,
target_size: Option<(u32, u32)>,
) -> PathBuf {
let mut path = self.path.clone();
let mut file_name = path.file_stem().unwrap().to_string_lossy().to_string();
let extension = target_format.extensions_str()[0];
let type_name = format!("{:?}", target_format);
if let Some(target_size) = target_size {
file_name += &*format!("-w{}-h{}", target_size.0, target_size.1);
}
if let Some(b) = self.brightness {
file_name += &*format!("-b{}", b);
}
if let Some(c) = self.contrast {
file_name += &*format!("-c{}", c);
}
if let Some(h) = self.huerotate {
file_name += &*format!("-h{}", h);
}
file_name += &*format!("{}-{}", self.invert, self.grayscale);
file_name += format!("-{}", type_name).as_str();
path.set_file_name(file_name);
path.set_extension(extension);
path
}
}
fn get_mime(path: &PathBuf) -> Mime {
let mime = mime_guess::from_ext(
path.clone()
.extension()
.and_then(|e| e.to_str())
.unwrap_or("png"),
)
.first()
.unwrap_or(mime::IMAGE_PNG);
mime
}
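A usage sketch of the image converter; the module path is assumed, and the referenced figure.png is a placeholder file name.

use image::ImageFormat;
use snekdown::utils::image_converting::ImageConverter; // module path assumed
use std::path::PathBuf;

fn main() {
    let mut converter = ImageConverter::new();
    converter.set_target_format(ImageFormat::Jpeg);
    converter.set_target_size((800, 600));

    let pending = converter.add_image(PathBuf::from("figure.png"));
    converter.convert_all();

    // After conversion, data and mime type are available on the pending image.
    let image = pending.lock();
    let size = image.data.as_ref().map(|d| d.len()).unwrap_or(0);
    println!("{} -> {} bytes", image.mime, size);
}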

@ -0,0 +1,54 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
#[macro_export]
macro_rules! plain_text {
($e:expr) => {
Inline::Plain(PlainText { value: $e })
};
}
#[macro_export]
macro_rules! bold_text {
($e:expr) => {
Inline::Bold(BoldText {
value: vec![Inline::Plain(PlainText { value: $e })],
})
};
}
#[macro_export]
macro_rules! italic_text {
($e:expr) => {
Inline::Italic(ItalicText {
value: vec![Inline::Plain(PlainText { value: $e })],
})
};
}
#[macro_export]
macro_rules! url_text {
($e:expr) => {
Inline::Url(Url {
url: $e,
description: None,
})
};
}
#[macro_export]
macro_rules! list_item {
($e:expr, $k:expr) => {
ListItem::new(
Line::Anchor(Anchor {
inner: Box::new(Line::Text($e)),
key: $k,
}),
0,
true,
)
};
}
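A short sketch of the exported element macros; the elements module path is an assumption.

use snekdown::elements::{BoldText, Inline, PlainText}; // module path assumed
use snekdown::{bold_text, plain_text};

fn main() {
    // The macros expand to Inline variants wrapping PlainText values.
    let plain = plain_text!("hello".to_string());
    let bold = bold_text!("world".to_string());
    assert!(matches!(plain, Inline::Plain(_)));
    assert!(matches!(bold, Inline::Bold(_)));
}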

@ -1,2 +1,11 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod caching;
pub mod downloads;
pub mod image_converting;
pub mod macros;
pub mod parsing;

@ -1,6 +1,21 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use regex::Regex;
#[macro_export]
macro_rules! parse {
    ($str:expr) => {
        Parser::new($str.to_string(), None).parse()
    };
}
/// Removes a single backslash from the given content
pub(crate) fn remove_single_backlslash<S: ToString>(content: S) -> String {
let content = content.to_string();
lazy_static::lazy_static! {static ref R: Regex = Regex::new(r"\\(?P<c>[^\\])").unwrap();}
R.replace_all(&*content, "$c").to_string()
}
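Since the helper is pub(crate), a usage example only compiles inside the crate; a sketch of a unit test that could sit next to it:

#[cfg(test)]
mod tests {
    use super::remove_single_backlslash;

    #[test]
    fn it_removes_escape_backslashes() {
        // Single escape backslashes are stripped, keeping the escaped character.
        assert_eq!(remove_single_backlslash(r"\*escaped\*"), "*escaped*");
    }
}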

@ -1,97 +0,0 @@
use snekdown::parse;
use snekdown::parser::elements::Block;
use snekdown::Parser;
macro_rules! count_block_elements {
($document:expr, $filter:expr) => {
$document
.elements
.iter()
.filter($filter)
.collect::<Vec<&Block>>()
.len()
};
}
#[test]
fn it_inits() {
let _ = Parser::new("".to_string(), None);
}
#[test]
fn it_parses_sections() {
let document = parse!("# Section\n## Subsection\n# Section");
assert_eq!(
count_block_elements!(document, |e| if let Block::Section(_) = e {
true
} else {
false
}),
2
)
}
#[test]
fn it_parses_tables() {
let document = parse!("|header|header|\n|---|---|\n|col|col|");
assert_eq!(
count_block_elements!(document, |e| if let Block::Table(_) = e {
true
} else {
false
}),
1
)
}
#[test]
fn it_parses_paragraphs() {
let document = parse!("**Bold***Italic*_Underline_`Monospace`^super^~strike~");
assert_eq!(
count_block_elements!(document, |e| if let Block::Paragraph(_) = e {
true
} else {
false
}),
1
)
}
#[test]
fn it_parses_lists() {
let document = parse!("- item1\n- item2\n\n* item\n+ item\n\no item\n1. item");
assert_eq!(
count_block_elements!(document, |e| if let Block::List(l) = e {
l.items.len() == 2
} else {
false
}),
3
)
}
#[test]
fn it_parses_code_blocks() {
let document = parse!("```\ncode\n```\n```rust\ncode\n``````");
assert_eq!(
count_block_elements!(document, |e| if let Block::CodeBlock(_) = e {
true
} else {
false
}),
2
)
}
#[test]
fn it_parses_quotes() {
let document = parse!("> quote\n\n[meta]> quote\n>hm");
assert_eq!(
count_block_elements!(document, |e| if let Block::Quote(_) = e {
true
} else {
false
}),
2
)
}