Compare commits

...

63 Commits

Author SHA1 Message Date
Trivernis 0f2bd16d28
Merge pull request #23 from Trivernis/develop
Develop
3 years ago
trivernis 3eb11e03e8
Add chromium requirement to README
closes #21

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 659ca55e2c
Merge branch 'main' into develop 3 years ago
trivernis 7da9f5b338
Bump version
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 83734aa530
Merge pull request #22 from silentbat/main
add style.css to init
3 years ago
trivernis c7694d39e0
Change path of temporary file to be absolute
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 0495855d91
Fix problems with creating temp file for pdf rendering
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis cdbfe8c195
Update dependencies
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 0e608e255a
Add fetch feature to headless_chromium dep
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
silentbat 995a3b0582
add style.css to init
When calling `snekdown init`, you now get a default (empty) style.css file.
3 years ago
trivernis 8181119f4a
Update README badge links
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 0fda78bb96
Merge pull request #20 from Trivernis/develop
Reference anchor and improved README
3 years ago
Trivernis 3c9f41ce8c Update issue templates 3 years ago
trivernis 6ebc7cc766
Update Copyright in Source files
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 1e46274003
Add reference anchors and update README
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis e1e63cc35a
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis fd23a49e49
Merge pull request #19 from Trivernis/develop
Fix windows lineending
3 years ago
trivernis f709465eba
Fix windows lineending
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 31c4711881
Merge pull request #18 from Trivernis/develop
Fix mathjax include
3 years ago
trivernis 4c0d4c560c
Fix mathjax include
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis ccad1547ae
Merge pull request #17 from Trivernis/develop
Changes and fixes to whitespace behaviour
3 years ago
trivernis e50d73a880
Fix quote text alignment
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis d5df0b6f05
Change whitespace behaviour
A single line break will be ignored in plain text, while a double line
break will be converted to a single line break. All following
line breaks are taken as-is and rendered as normal line breaks.
(Fixes #13)

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 661a6e5a85
Add flag to print to stdout
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 673a7527d6
Merge pull request #15 from Trivernis/develop
Fixes to monospace and block elements
3 years ago
trivernis cac1cbe8d6
Bump version
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis e99b80ecf0
Fix block documents being ignored at EOF
Fixes block elements being ignored when written as the last
element of a file (Fixes #14)

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 8cf6dd33b7
Fix escapes in monospace and progressbar
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis d93fe6a57b
Merge pull request #11 from Trivernis/develop
HTML Metadata
3 years ago
trivernis d60e6aabd4
Merge branch 'develop' of github.com:/Trivernis/snekdown into develop 3 years ago
trivernis 8747c8c41a
Add metadata embedding in html
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis e24cc4b0f1
Merge pull request #9 from Trivernis/develop
Develop
3 years ago
Trivernis 796dc6ae34
Merge pull request #10 from Trivernis/actions
Change build to build for all features
3 years ago
trivernis 2e53e9e603
Change build to build for all features
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis f6e4bb86da
Hotfix pdf renderer
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 374c385f06
Increment Version
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis d1e71e0204
Merge pull request #8 from Trivernis/develop
New Config and Themes
3 years ago
trivernis 245c908410
Fix nested anchors and placeholders
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis d0186cc90e
Fix toc not using plain text
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 4db1f27315
Merge pull request #7 from Trivernis/feature/themes
Feature/themes
3 years ago
trivernis d82c43e5a3
Add magic dark theme and rename formatting config to style
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis faa8e57ffa
Update Theme
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 097eae5f4e
Add theme config option
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 6b9a30c571
Merge pull request #6 from Trivernis/feature/hierarchical-config
Feature/hierarchical config
3 years ago
trivernis f2bfcd66a9
Merge branch 'main' of github.com:/Trivernis/snekdown into main 3 years ago
Trivernis a98a512b18
Merge pull request #5 from Trivernis/actions
Change build action to also trigger on develop
3 years ago
trivernis 1bfe95d08a
Change build action to also trigger on develop
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis ce311853cc
Update README
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis d84b0d86dd
Change settings format
Settings are now stored in a specific format defined in the settings
module. The Manifest.toml file will be loaded by default. Further
config files can be imported with the default file import syntax.

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 31a4abff39
Add option to configure watch debounce time
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis b21a6ddb5d
Increment version
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
Trivernis 1b086ebdc1
Merge pull request #4 from Trivernis/feature/image-processing
Feature/image processing
3 years ago
trivernis 19bf960d74
Merge branch 'main' into feature/image-processing 3 years ago
Trivernis 9ebf7c1882
Merge pull request #3 from Trivernis/actions
Add build task for PRs
3 years ago
trivernis d4ae239c8a
Add build task for PRs
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 2b07fc39b2
Add huerotate option to image
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 9fb6664a63
Add inverting images and progressbar for image processing
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis b9cf095cfa
Add image filters
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 63ea60b10a
Add conversion of images when configured
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 3acfe9c6e2
Update Readme
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis 53c4818f9d
Change input syntax and add cache storage handler
- input syntax changed to <subcommand> <options...>
- Added clear-cache subcommand
- CacheStorage now handles the caching of files

Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis d42821a6eb
Fix image href pointing to non existent files
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago
trivernis b379fcbea9
Fix default value for display-header-footer
Signed-off-by: trivernis <trivernis@protonmail.com>
3 years ago

@ -0,0 +1,31 @@
---
name: Bug report
about: Create a report to help us improve
title: "[BUG]"
labels: bug
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Write...
2. Render...
3. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. Arch Linux]
- Architecture: [e.g. x86_64, ARM]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: "[FEATURE]"
labels: enhancement
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

@ -0,0 +1,42 @@
name: Build and Test

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main, develop ]

env:
  CARGO_TERM_COLOR: always

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Cache build data
        uses: actions/cache@v2
        with:
          path: |
            target
            ~/.cargo/
          key: ${{ runner.os }}-cargo-${{ hashFiles('Cargo.lock') }}
          restore-keys: |
            ${{ runner.os }}-cargo-
      - name: Build
        run: cargo build --verbose --all-features
      - name: Run tests
        run: cargo test --verbose --all-features
      - name: Test init
        run: cargo run -- init
      - name: Test HTML
        run: cargo run -- render README.md README.html --format html
      - name: Test PDF
        run: cargo run --all-features -- render README.md README.pdf --format pdf

Cargo.lock (generated): 1229 changed lines

File diff suppressed because it is too large.

@ -1,9 +1,9 @@
[package]
name = "snekdown"
version = "0.30.3"
version = "0.33.4"
authors = ["trivernis <trivernis@protonmail.com>"]
edition = "2018"
license-file = "LICENSE"
license = "GPL-3.0"
readme = "README.md"
description = "A parser for the custom snekdown markdown syntax"
repository = "https://github.com/Trivernis/snekdown"
@ -21,9 +21,9 @@ path = "src/main.rs"
pdf = ["headless_chrome", "failure"]
[dependencies]
charred = "0.3.3"
charred = "0.3.6"
asciimath-rs = "0.5.7"
bibliographix = "0.5.0"
bibliographix = "0.6.0"
crossbeam-utils = "0.7.2"
structopt = "0.3.14"
minify = "1.1.1"
@ -36,9 +36,8 @@ colored = "1.9.3"
gh-emoji = "1.0.3"
notify = "4.0.12"
toml = "0.5.6"
serde ="1.0.111"
serde_derive = "1.0.111"
reqwest = {version = "0.10", features=["blocking"]}
serde = { version = "1.0.111", features = ["serde_derive"] }
reqwest = { version = "0.10", features = ["blocking"] }
mime_guess = "2.0.3"
mime = "0.3.16"
base64 = "0.12.3"
@ -48,6 +47,11 @@ log = "0.4.11"
env_logger = "0.7.1"
indicatif = "0.15.0"
platform-dirs = "0.2.0"
image = "0.23.12"
parking_lot = "0.11.1"
sha2 = "0.9.2"
config = "0.10.1"
rsass = "0.16.0"
headless_chrome = {version = "0.9.0", optional = true}
failure = {version = "0.1.8", optional = true}
headless_chrome = { version = "0.9.0", optional = true, features = ["fetch"] }
failure = { version = "0.1.8", optional = true }

@ -1,368 +1,101 @@
# ![](https://i.imgur.com/FpdXqiT.png) Snekdown - More than just Markdown ![](https://img.shields.io/discord/729250668162056313)
<p align="center">
<img src="https://i.imgur.com/FpdXqiT.png">
</p>
<h1 align="center">Snekdown</h1>
<p align="center">
<i>More than just Markdown</i>
</p>
<p align="center">
<a href="https://github.com/Trivernis/snekdown/actions">
<img src="https://img.shields.io/github/workflow/status/trivernis/snekdown/Build%20and%20Test/main?style=for-the-badge">
</a>
<a href="https://crates.io/crates/snekdown">
<img src="https://img.shields.io/crates/v/snekdown?style=for-the-badge">
</a>
<a href="https://aur.archlinux.org/packages/snekdown">
<img src="https://img.shields.io/aur/version/snekdown?style=for-the-badge">
</a>
<a href="https://discord.gg/vGAXW9nxUv">
<img src="https://img.shields.io/discord/729250668162056313?style=for-the-badge">
</a>
<br/>
<br/>
<a href="https://trivernis.net/snekdown/">Documentation</a> |
<a href="https://github.com/Trivernis/snekdown/releases">Releases</a>
</p>
- - -
## Description
This project's goal is to implement a fast markdown parser with an extended syntax tailored
to my needs.
## Installation
You need a working rust installation, for example by using [rustup](http://rustup.rs).
```sh
cargo install snekdown
```
With pdf rendering
```sh
cargo install snekdown --features pdf
```
## Usage
```
USAGE:
snekdown [FLAGS] [OPTIONS] <input> <output> [SUBCOMMAND]
FLAGS:
-h, --help Prints help information
--no-cache Don't use the cache
-V, --version Prints version information
OPTIONS:
-f, --format <format> the output format [default: html]
ARGS:
<input> Path to the input file
<output> Path for the output file
SUBCOMMANDS:
help Prints this message or the help of the given subcommand(s)
render Default. Parse and render the document
watch Watch the document and its imports and render on change
```
## Syntax
### Images
```md
Simple Syntax
!(url)
Extended syntax with a description
![description](url)
Extended syntax with metadata to specify the size
![description](url)[metadata]
Extended syntax with metadata and no description
!(url)[metadata]
```
When generating the HTML file, images are embedded as base64. To turn off this behaviour,
set the config parameter `embed-external` to `false`.
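For reference, the HTML output builds a data URI roughly like the sketch below. It is simplified from the renderer code further down in this diff; the helper name is made up and the `base64::encode` call is an assumption based on the `base64` dependency in Cargo.toml:
```rust
use mime::Mime;

/// Builds a `data:` URI so the image bytes can be embedded directly
/// into the generated HTML instead of being referenced by path.
fn to_data_uri(mime_type: &Mime, content: &[u8]) -> String {
    format!("data:{};base64,{}", mime_type, base64::encode(content))
}
```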
### Quotes
```md
Simple (default) Syntax
> This is a quote
Multiline
> This is a
> Multiline Quote
Quote with metadata (e.g. Author)
[author=Trivernis year=2020 display='{{author}} - {{year}}']> This is a quote with metadata
```
### Imports
Imports can be used to attach a different document to the main document.
Imports are parsed in parallel on multiple threads.
```md
<[path]
<[document.md]
<[style.css][type=stylesheet]
```
The parser distinguishes four types of imported files.
- `document` - The default import, which is just another snekdown document
- `stylesheet` - CSS stylesheets that are included when rendering
- `bibliography` - A file containing bibliography entries
- `config`/`manifest` - A config file that contains metadata
If no type is provided, the parser guesses the file type from the extension.
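A rough sketch of that extension-based guess; the enum, the function, and the exact extension mapping below are assumptions for illustration, not the parser's real names:
```rust
/// Hypothetical import kinds matching the list above.
enum ImportType {
    Document,
    Stylesheet,
    Bibliography,
    Config,
}

/// Guesses the import type from the file name when no explicit type
/// is given in the import's metadata.
fn guess_import_type(path: &str) -> ImportType {
    if path.ends_with(".css") || path.ends_with(".scss") {
        ImportType::Stylesheet
    } else if path.ends_with(".bib.toml") {
        ImportType::Bibliography
    } else if path.ends_with(".toml") {
        ImportType::Config
    } else {
        ImportType::Document
    }
}
```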
### Tables
Tables MUST start with a pipe character `|`
```md
Standalone header:
| header | header | header
Header with rows
| header | header | header
|--------|--------|-------
| row | row | row
```
### Placeholders
Placeholders can be used to insert special elements in a specific place.
Placeholders are always case insensitive.
```md
Insert the table of contents
[[TOC]]
Insert the current date
[[date]]
Insert the current time
[[time]]
```
### Metadata
Additional metadata can be provided for some elements.
```md
String value
[key = value]
String value
[key = "String value"]
Integer value
[key = 123]
Float value
[key = 1.23]
## Core Features
Boolean
[key]
- Imports
- Bibliography & Glossary
- AsciiMath
- Placeholders
- Advanced Images
Boolean
[key = false]
## Prerequisites
Placeholder
[key = [[placeholder]]]
```
Metadata can also be defined in a separate toml file with simple key-value pairs.
Example:
```toml
# bibliography.bib.toml
author = "Snek"
published = "2020"
test-key = ["test value", "test value 2"]
# those files won't get imported
ignored-imports = ["style.css"]
# stylesheets that should be included
included-stylesheets = ["style2.css"]
# other metadata files that should be included
included-configs = []
# bibliography that should be included
included-bibliography = ["mybib.toml"]
# glossary that should be included
included-glossary = ["myglossary.toml"]
# if external sources (images, stylesheets, MathJax)
# should be embedded into the document (default: true)
embed-external = true
# If SmartArrows should be used (default: true)
smart-arrows = true
# Includes a MathJax script tag in the document to render MathML in chromium.
# (default: true)
include-math-jax = true
### PDF Options - needs the pdf feature enabled ###
# If the header and footer of the pdf should be displayed (default: true)
pdf-display-header-footer = true
# PDF header template of each page (default: empty)
pdf-header-template = "<div><span class='title'></span></div>"
# PDF footer template of each page (default: see chromium_pdf assets)
pdf-footer-template = "<div><span class='pageNumber'></span></div>"
# Top margin of the pdf. Should be between 0 and 1. (default: 1.0)
pdf-margin-top = 1
# Bottom margin of the pdf. Should be between 0 and 1. (default: 1.0)
pdf-margin-bottom = 1
# Left margin of the pdf. Should be between 0 and 1.
pdf-margin-left = 0
# Right margin of the pdf. Should be between 0 and 1.
pdf-margin-right = 0
# Page height of the pdf
pdf-page-height = 100
# Page width of the pdf
pdf-page-width = 80
# The scale at which the website is rendered into pdf.
pdf-page-scale = 1.0
```
The `[Section]` keys are not relevant as the structure gets flattened before the values are read.
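As a rough illustration of that flattening, assuming the values end up in a flat key/value map (the helper below is not the project's code):
```rust
use std::collections::HashMap;
use toml::Value;

/// Collects every leaf key/value pair while ignoring which table it
/// sits in, mirroring the "sections are flattened" behaviour above.
fn flatten_into(value: &Value, out: &mut HashMap<String, Value>) {
    if let Value::Table(table) = value {
        for (key, inner) in table {
            if inner.is_table() {
                flatten_into(inner, out);
            } else {
                out.insert(key.clone(), inner.clone());
            }
        }
    }
}
```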
#### Usage
```
Hide a section (including subsections) in the TOC
#[toc-hidden] Section
Set the size of an image
!(url)[width = 42% height=auto]
Set the source of a quote
[author=Me date=[[date]] display="{{author}} - {{date}}"]> It's me
Set options for placeholders
[[toc]][ordered]
```
- Google Chrome/Chromium (for PDF rendering)
### Centered Text
```
|| These two lines
|| are centered
```
## Installation
### Inline
```md
*Italic*
**Bold**
~~Striked~~
_Underlined_
^Superscript^
`Monospace`
:Emoji:
§[#0C0]Colored text§[] §[red] red §[]
```
### Binaries
## Bibliography
You can download prebuilt binaries on the [Releases](https://github.com/Trivernis/snekdown/releases) Page.
Bibliography entries can be defined and referenced anywhere in the document.
Definition:
```md
[SD_BOOK]:[type=book, author=Snek, title = "Snekdown Book" date="20.08.2020", publisher=Snek]
[SD_GITHUB]: https://github.com/trivernis/snekdown
```
### Arch Linux
Usage:
```
There is a book about snekdown[^book] and a github repo[^github].
```
Snekdown is available in [the AUR](https://aur.archlinux.org/packages/snekdown).
Entries can also be defined in a separate toml file with the following data layout:
```toml
# snekdown.toml
[BIB_KEY]
key = "value"
### Cargo
[SD_BOOK]
type = "book"
author = "Snek"
title = "Snekdown Book"
date = "20.08.2020"
publisher = "Snek"
You need a working rust installation, for example by using [rustup](http://rustup.rs).
[SD_GITHUB]
type = "website"
url = "https://github.com/trivernis/snekdown"
```sh
cargo install snekdown
```
The valid types for entries and their required fields can be found in the [bibliographix README](https://github.com/Trivernis/bibliographix#bibliography-types-and-fields).
Bibliography entry definitions are not rendered themselves. To render a list of the used bibliography entries, insert the
`bib` placeholder at the place where you want it to be rendered.
## Glossary
Glossary entries are defined in a `glossary.toml` file or any other toml file
that is imported as type `glossary`.
Glossary entry definitions have to follow this structure:
```toml
[SHORT]
long = "Long Form"
description = "The description of the entry"
With pdf rendering
# Example
[HTML]
long = "Hypertext Markup Language"
description = "The markup language of the web"
```sh
cargo install snekdown --features pdf
```
Those glossary entries can be referenced in the snekdown file as follows:
```md
~HTML is widely used for websites.
The format ~HTML is not considered a programming language by some definitions.
~~HTML
```
## Usage
The first occurrence of the glossary entry (`~HTML`) always uses the long form.
The second will always be the short form. The long form can be enforced by using two
tildes (`~~HTML`).
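A minimal sketch of that selection rule; the names are assumptions loosely inspired by the `GlossaryDisplay` import visible in this diff, not the crate's actual logic:
```rust
/// Hypothetical display form for a glossary reference.
enum DisplayForm {
    Long,
    Short,
}

/// First use renders the long form, later uses render the short form,
/// and a double tilde always forces the long form.
fn form_for(is_first_occurrence: bool, forced_long: bool) -> DisplayForm {
    if forced_long || is_first_occurrence {
        DisplayForm::Long
    } else {
        DisplayForm::Short
    }
}
```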
Use `snekdown help` and `snekdown <subcommand> --help` for more information.
## Math
### Rendering
Snekdown allows the embedding of [AsciiMath](http://asciimath.org/):
The AsciiMath parser is provided in the [asciimath-rs](https://github.com/Trivernis/asciimath-rs) crate
`snekdown render <input> <output>`
```
inline math $$ a^2 + b^2 = c^2 $$
### Watching
Block Math
$$$
A = [[1, 2],[3,4]]
$$$
```
`snekdown watch <input> <output>`
The expression gets converted into MathML, which is then rendered by MathJax when the document is loaded in
the browser.
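As a hedged sketch of that pipeline, assuming asciimath-rs exposes a top-level `parse` function and the `ToMathML` trait shown in the imports of this diff:
```rust
use asciimath_rs::format::mathml::ToMathML;

fn main() {
    // Assumption: `asciimath_rs::parse` returns an expression tree that
    // implements `ToMathML`; only the import path is confirmed by this diff.
    let expression = asciimath_rs::parse("a^2 + b^2 = c^2".to_string());
    println!("{}", expression.to_mathml());
}
```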
## Smart Arrows
## Editors
Snekdown automatically renders the sequences `-->`, `==>`, `<--`, `<==`, `<-->`, `<==>` as
their respective unicode arrows (similar to [markdown-it-smartarrows](https://github.com/adam-p/markdown-it-smartarrows)).
This behavior can be turned off by setting the config parameter `smart-arrows` to `false`
(the config needs to be imported before the arrows are used for that to work).
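A minimal sketch of that replacement as a plain lookup (the real parser works on character sequences during tokenization; this function is only illustrative):
```rust
/// Maps the smart-arrow character sequences to their unicode arrows.
fn smart_arrow(sequence: &str) -> Option<char> {
    match sequence {
        "-->" => Some('→'),
        "==>" => Some('⇒'),
        "<--" => Some('←'),
        "<==" => Some('⇐'),
        "<-->" => Some('↔'),
        "<==>" => Some('⇔'),
        _ => None,
    }
}
```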
I've created a [VisualStudio Code extension](https://marketplace.visualstudio.com/items?itemName=trivernis.snekdown) for Snekdown.
This extension provides a preview of snekdown files, exports and other commands similar to the
cli. The source code can be found [here](https://github.com/Trivernis/snekdown-vscode-extension).
## Roadmap
The end goal is to have a markup language with features similar to LaTeX.
### Short Term
- [x] Checkboxes
- [x] Emojis (\:emoji:)
- [x] Colors
@ -374,9 +107,19 @@ The end goal is to have a markup language with features similar to LaTeX.
- [x] Chromium based pdf rendering
- [x] Custom Stylesheets
- [x] Smart arrows
- [ ] Custom Elements via templates (50%)
- [ ] Cross References
- [ ] Figures
- [ ] EPUB Rendering
- [ ] Text sizes
- [ ] Title pages
- [ ] Title pages
### Long Term
- Rewrite of the whole parsing process
- Custom Elements via templates
## License
This project is licensed under GPL 3.0. See LICENSE for more information.

@ -1,19 +1,28 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod tokens;
use crate::format::PlaceholderTemplate;
use crate::references::configuration::{ConfigRefEntry, Configuration, Value};
use crate::references::glossary::{GlossaryManager, GlossaryReference};
use crate::references::placeholders::ProcessPlaceholders;
use crate::references::templates::{Template, TemplateVariable};
use crate::settings::Settings;
use crate::utils::downloads::{DownloadManager, PendingDownload};
use crate::utils::image_converting::{ImageConverter, PendingImage};
use asciimath_rs::elements::special::Expression;
use bibliographix::bib_manager::BibManager;
use bibliographix::bibliography::bibliography_entry::BibliographyEntryReference;
use bibliographix::references::bib_reference::BibRefAnchor;
use image::ImageFormat;
use mime::Mime;
use parking_lot::Mutex;
use std::collections::HashMap;
use std::iter::FromIterator;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{Arc, Mutex, RwLock};
use std::sync::{Arc, RwLock};
pub const SECTION: &str = "section";
pub const PARAGRAPH: &str = "paragraph";
@ -70,9 +79,10 @@ pub struct Document {
pub(crate) is_root: bool,
pub(crate) path: Option<String>,
pub(crate) placeholders: Vec<Arc<RwLock<Placeholder>>>,
pub config: Configuration,
pub config: Arc<Mutex<Settings>>,
pub bibliography: BibManager,
pub downloads: Arc<Mutex<DownloadManager>>,
pub images: Arc<Mutex<ImageConverter>>,
pub stylesheets: Vec<Arc<Mutex<PendingDownload>>>,
pub glossary: Arc<Mutex<GlossaryManager>>,
}
@ -184,6 +194,7 @@ pub enum Inline {
CharacterCode(CharacterCode),
LineBreak,
Arrow(Arrow),
Anchor(Anchor),
}
#[derive(Clone, Debug)]
@ -236,7 +247,7 @@ pub struct Url {
pub struct Image {
pub(crate) url: Url,
pub(crate) metadata: Option<InlineMetadata>,
pub(crate) download: Arc<Mutex<PendingDownload>>,
pub(crate) image_data: Arc<Mutex<PendingImage>>,
}
#[derive(Clone, Debug)]
@ -248,7 +259,7 @@ pub struct Placeholder {
#[derive(Clone, Debug)]
pub struct RefLink {
pub(crate) description: Box<Line>,
pub(crate) description: TextLine,
pub(crate) reference: String,
}
@ -310,10 +321,11 @@ impl Document {
is_root: true,
path: None,
placeholders: Vec::new(),
config: Configuration::default(),
config: Arc::new(Mutex::new(Settings::default())),
bibliography: BibManager::new(),
stylesheets: Vec::new(),
downloads: Arc::new(Mutex::new(DownloadManager::new())),
images: Arc::new(Mutex::new(ImageConverter::new())),
glossary: Arc::new(Mutex::new(GlossaryManager::new())),
}
}
@ -329,6 +341,7 @@ impl Document {
bibliography: self.bibliography.create_child(),
stylesheets: Vec::new(),
downloads: Arc::clone(&self.downloads),
images: Arc::clone(&self.images),
glossary: Arc::clone(&self.glossary),
}
}
@ -346,7 +359,7 @@ impl Document {
list.ordered = ordered;
self.elements.iter().for_each(|e| match e {
Block::Section(sec) => {
if !sec.get_hide_in_toc() {
if !sec.is_hidden_in_toc() {
let mut item =
ListItem::new(Line::RefLink(sec.header.get_anchor()), 1, ordered);
item.children.append(&mut sec.get_toc_list(ordered).items);
@ -427,9 +440,41 @@ impl Document {
if self.is_root {
self.process_definitions();
self.bibliography.assign_entries_to_references();
self.glossary.lock().unwrap().assign_entries_to_references();
self.glossary.lock().assign_entries_to_references();
self.process_placeholders();
self.process_media();
}
}
fn process_media(&self) {
let downloads = Arc::clone(&self.downloads);
if self.config.lock().features.embed_external {
downloads.lock().download_all();
}
if let Some(s) = &self.config.lock().images.format {
if let Some(format) = ImageFormat::from_extension(s) {
self.images.lock().set_target_format(format);
}
}
let mut image_width = 0;
let mut image_height = 0;
if let Some(i) = self.config.lock().images.max_width {
image_width = i;
image_height = i;
}
if let Some(i) = self.config.lock().images.max_height {
image_height = i;
if image_width <= 0 {
image_width = i;
}
}
if image_width > 0 && image_height > 0 {
self.images
.lock()
.set_target_size((image_width as u32, image_height as u32));
}
self.images.lock().convert_all();
}
}
@ -448,9 +493,10 @@ impl Section {
pub fn get_toc_list(&self, ordered: bool) -> List {
let mut list = List::new();
self.elements.iter().for_each(|e| {
if let Block::Section(sec) = e {
if !sec.get_hide_in_toc() {
if !sec.is_hidden_in_toc() {
let mut item =
ListItem::new(Line::RefLink(sec.header.get_anchor()), 1, ordered);
item.children.append(&mut sec.get_toc_list(ordered).items);
@ -462,7 +508,7 @@ impl Section {
list
}
pub(crate) fn get_hide_in_toc(&self) -> bool {
pub(crate) fn is_hidden_in_toc(&self) -> bool {
if let Some(meta) = &self.metadata {
meta.get_bool("toc-hidden")
} else {
@ -518,7 +564,7 @@ impl Header {
pub fn get_anchor(&self) -> RefLink {
RefLink {
description: Box::new(self.line.clone()),
description: self.line.as_raw_text().as_plain_line(),
reference: self.anchor.clone(),
}
}
@ -574,6 +620,16 @@ impl TextLine {
pub fn add_subtext(&mut self, subtext: Inline) {
self.subtext.push(subtext)
}
pub fn as_plain_line(&self) -> TextLine {
TextLine {
subtext: self
.subtext
.iter()
.map(|s| Inline::Plain(s.as_plain_text()))
.collect(),
}
}
}
impl Table {
@ -616,6 +672,15 @@ impl Quote {
pub fn add_text(&mut self, text: TextLine) {
self.text.push(text)
}
/// Strips a single linebreak from the end of the quote
pub fn strip_linebreak(&mut self) {
if let Some(last) = self.text.last_mut() {
if let Some(Inline::LineBreak) = last.subtext.last() {
last.subtext.pop();
}
}
}
}
impl ImportAnchor {
@ -651,6 +716,8 @@ impl Placeholder {
pub trait Metadata {
fn get_bool(&self, key: &str) -> bool;
fn get_string(&self, key: &str) -> Option<String>;
fn get_float(&self, key: &str) -> Option<f64>;
fn get_integer(&self, key: &str) -> Option<i64>;
fn get_string_map(&self) -> HashMap<String, String>;
}
@ -671,6 +738,24 @@ impl Metadata for InlineMetadata {
}
}
fn get_float(&self, key: &str) -> Option<f64> {
if let Some(MetadataValue::Float(f)) = self.data.get(key) {
Some(*f)
} else if let Some(MetadataValue::Integer(i)) = self.data.get(key) {
Some(*i as f64)
} else {
None
}
}
fn get_integer(&self, key: &str) -> Option<i64> {
if let Some(MetadataValue::Integer(i)) = self.data.get(key) {
Some(*i)
} else {
None
}
}
fn get_string_map(&self) -> HashMap<String, String> {
let mut string_map = HashMap::new();
for (k, v) in &self.data {
@ -687,26 +772,17 @@ impl Metadata for InlineMetadata {
}
}
impl Into<HashMap<String, Value>> for InlineMetadata {
fn into(self) -> HashMap<String, Value> {
HashMap::from_iter(self.data.iter().filter_map(|(k, v)| match v {
MetadataValue::String(s) => Some((k.clone(), Value::String(s.clone()))),
MetadataValue::Bool(b) => Some((k.clone(), Value::Bool(*b))),
MetadataValue::Integer(i) => Some((k.clone(), Value::Integer(*i))),
MetadataValue::Float(f) => Some((k.clone(), Value::Float(*f))),
MetadataValue::Template(t) => Some((k.clone(), Value::Template(t.clone()))),
_ => None,
}))
}
}
impl Image {
pub fn get_content(&self) -> Option<Vec<u8>> {
let mut data = None;
std::mem::swap(&mut data, &mut self.download.lock().unwrap().data);
std::mem::swap(&mut data, &mut self.image_data.lock().data);
data
}
pub fn get_mime_type(&self) -> Mime {
self.image_data.lock().mime.clone()
}
}
#[derive(Clone, Debug)]
@ -719,15 +795,11 @@ pub struct BibEntry {
pub struct BibReference {
pub(crate) key: String,
pub(crate) entry_anchor: Arc<Mutex<BibRefAnchor>>,
pub(crate) display: Option<ConfigRefEntry>,
pub(crate) display: Option<String>,
}
impl BibReference {
pub fn new(
key: String,
display: Option<ConfigRefEntry>,
anchor: Arc<Mutex<BibRefAnchor>>,
) -> Self {
pub fn new(key: String, display: Option<String>, anchor: Arc<Mutex<BibRefAnchor>>) -> Self {
Self {
key: key.to_string(),
display,
@ -736,12 +808,11 @@ impl BibReference {
}
pub(crate) fn get_formatted(&self) -> String {
if let Some(entry) = &self.entry_anchor.lock().unwrap().entry {
let entry = entry.lock().unwrap();
if let Some(entry) = &self.entry_anchor.lock().entry {
let entry = entry.lock();
if let Some(display) = &self.display {
let display = display.read().unwrap();
let mut template = PlaceholderTemplate::new(display.get().as_string());
let mut template = PlaceholderTemplate::new(display.clone());
let mut value_map = HashMap::new();
value_map.insert("key".to_string(), entry.key());
@ -770,3 +841,71 @@ impl MetadataValue {
}
}
}
impl Line {
pub fn as_raw_text(&self) -> TextLine {
match self {
Line::Text(t) => t.clone(),
Line::Ruler(_) => TextLine::new(),
Line::RefLink(r) => r.description.clone(),
Line::Anchor(a) => a.inner.as_raw_text().as_plain_line(),
Line::Centered(c) => c.line.clone(),
Line::BibEntry(_) => TextLine::new(),
}
}
}
impl Inline {
pub fn as_plain_text(&self) -> PlainText {
match self {
Inline::Plain(p) => p.clone(),
Inline::Bold(b) => b.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Italic(i) => i.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Underlined(u) => u.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Striked(s) => s.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Monospace(m) => PlainText {
value: m.value.clone(),
},
Inline::Superscript(s) => s.value.iter().fold(
PlainText {
value: String::new(),
},
|a, b| PlainText {
value: format!("{} {}", a.value, b.as_plain_text().value),
},
),
Inline::Colored(c) => c.value.as_plain_text(),
_ => PlainText {
value: String::new(),
},
}
}
}

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
#![allow(unused)]
pub(crate) const BACKSLASH: char = '\\';
@ -33,6 +39,7 @@ pub(crate) const L_BRACE: char = '}';
pub(crate) const PERCENT: char = '%';
pub(crate) const COMMA: char = ',';
pub(crate) const MATH: char = '$';
pub(crate) const DOLLAR: char = '$';
pub(crate) const AMPERSAND: char = '&';
pub(crate) const QUESTION_MARK: char = '?';
@ -82,6 +89,18 @@ pub(crate) const CHARACTER_STOP: char = SEMICOLON;
pub(crate) const GLOSSARY_REF_START: char = TILDE;
// Reference Anchors
pub(crate) const ANCHOR_START: &'static [char] = &[R_BRACKET, QUESTION_MARK];
pub(crate) const ANCHOR_STOP: char = L_BRACKET;
// References
pub(crate) const REF_START: &'static [char] = &[R_BRACKET, DOLLAR];
pub(crate) const REF_STOP: char = L_BRACKET;
pub(crate) const REF_DESC_START: char = R_PARENTH;
pub(crate) const REF_DESC_STOP: char = L_PARENTH;
// Arrows
pub(crate) const A_RIGHT_ARROW: &'static [char] = &['-', '-', '>'];
@ -130,6 +149,8 @@ pub(crate) const INLINE_SPECIAL_SEQUENCES: &'static [&'static [char]] = &[
A_RIGHT_ARROW,
A_LEFT_ARROW,
A_LEFT_RIGHT_ARROW,
ANCHOR_START,
REF_START,
];
pub(crate) const LIST_SPECIAL_CHARS: [char; 14] = [

@ -0,0 +1,167 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
body {
background-color: $body-background;
overflow-x: hidden;
color: $primary-color;
word-break: break-word;
}
.content {
font-family: "Fira Sans", "Noto Sans", SansSerif, sans-serif;
width: 100vh;
max-width: calc(100% - 4rem);
padding: 2rem;
margin: auto;
background-color: $background-color;
}
h1 {
font-size: 2.2rem;
}
h2 {
font-size: 1.8rem;
}
h3 {
font-size: 1.4rem;
}
h4 {
font-size: 1rem;
}
h5 {
font-size: 0.8rem;
}
h6 {
font-size: 0.4rem;
}
img {
max-width: 100%;
max-height: 100vh;
height: auto;
}
code {
color: $primary-color;
pre {
font-family: "Fira Code", "Mono", monospace;
padding: 0.8em 0.2em;
background-color: $code-background !important;
border-radius: 0.25em;
overflow: auto;
}
&.inlineCode {
font-family: "Fira Code", monospace;
border-radius: 0.1em;
background-color: $code-background;
padding: 0 0.1em;
}
}
.tableWrapper {
overflow-x: auto;
width: 100%;
& > table {
margin: auto;
}
}
table {
border-collapse: collapse;
tr {
&:nth-child(odd) {
background-color: $table-background-alt;
}
&:nth-child(1) {
background-color: $table-background-alt;
font-weight: bold;
border-bottom: 1px solid invert($background-color)
}
}
}
table td, table th {
border-left: 1px solid invert($background-color);
padding: 0.2em 0.5em;
}
table tr td:first-child, table tr th:first-child {
border-left: none;
}
blockquote {
margin-left: 0;
padding-top: 0.2em;
padding-bottom: 0.2em;
background-color: rgba(0, 0, 0, 0);
}
a {
color: $secondary-color;
}
.quote {
border-left: 0.3em solid $quote-background-alt;
border-radius: 0.2em;
padding-left: 1em;
margin-left: 0;
background-color: $quote-background;
.metadata {
font-style: italic;
padding-left: 0.5em;
color: $primary-variant-1;
}
}
.figure {
width: 100%;
display: block;
text-align: center;
.imageDescription {
display: block;
color: $primary-variant-1;
font-style: italic;
}
}
.centered {
text-align: center;
}
.glossaryReference {
text-decoration: none;
color: inherit;
border-bottom: 1px dotted $primary-color;
}
.arrow {
font-family: "Fira Code", "Mono", monospace;
}
@media print {
.content > section > section, .content > section > section {
page-break-inside: avoid;
}
body {
background-color: $background-color !important;
}
}

@ -0,0 +1,20 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #1e1d2c;
$background-color-variant-1: lighten($background-color, 7%);
$background-color-variant-2: lighten($background-color, 14%);
$background-color-variant-3: lighten($background-color, 21%);
$primary-color: #EEE;
$primary-variant-1: darken($primary-color, 14%);
$secondary-color: #3aa7df;
$body-background: darken($background-color, 5%);
$code-background: lighten($background-color, 5%);
$table-background-alt: $background-color-variant-2;
$quote-background: lighten($background-color-variant-1, 3%);
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,20 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: darken(#2b303b, 8%);
$background-color-variant-1: lighten($background-color, 7%);
$background-color-variant-2: lighten($background-color, 14%);
$background-color-variant-3: lighten($background-color, 21%);
$primary-color: #EEE;
$primary-variant-1: darken($primary-color, 14%);
$secondary-color: #3aa7df;
$body-background: darken($background-color, 5%);
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: lighten($background-color-variant-1, 3%);
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: darken(#002b36, 5%);
$background-color-variant-1: lighten($background-color, 7%);
$background-color-variant-2: lighten($background-color, 14%);
$background-color-variant-3: lighten($background-color, 21%);
$primary-color: #EEE;
$primary-variant-1: darken($primary-color, 14%);
$secondary-color: #0096c9;
$body-background: darken($background-color, 3%);
$code-background: $background-color-variant-1;
$table-background-alt: lighten($background-color, 10%);
$quote-background: lighten($background-color-variant-1, 3%);
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #FFF;
$background-color-variant-1: darken($background-color, 7%);
$background-color-variant-2: darken($background-color, 14%);
$background-color-variant-3: darken($background-color, 21%);
$primary-color: #000;
$primary-variant-1: lighten($primary-color, 14%);
$secondary-color: #00286a;
$body-background: $background-color-variant-1;
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: $background-color-variant-2;
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #FFF;
$background-color-variant-1: darken($background-color, 7%);
$background-color-variant-2: darken($background-color, 14%);
$background-color-variant-3: darken($background-color, 21%);
$primary-color: #112;
$primary-variant-1: lighten($primary-color, 14%);
$secondary-color: #00348e;
$body-background: $background-color-variant-1;
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: $background-color-variant-2;
$quote-background-alt: $background-color-variant-3;

@ -0,0 +1,19 @@
/*!
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
$background-color: #fff8f0;
$background-color-variant-1: darken($background-color, 4%);
$background-color-variant-2: darken($background-color, 8%);
$background-color-variant-3: darken($background-color, 12%);
$primary-color: #112;
$primary-variant-1: lighten($primary-color, 14%);
$secondary-color: #2b61be;
$body-background: $background-color-variant-1;
$code-background: $background-color-variant-1;
$table-background-alt: $background-color-variant-2;
$quote-background: $background-color-variant-2;
$quote-background-alt: $background-color-variant-3;

@ -1,3 +1,9 @@
<!--
~ Snekdown - Custom Markdown flavour and parser
~ Copyright (C) 2021 Trivernis
~ See LICENSE for more information.
-->
<div style="font-size: 10px; text-align: center; width: 100%;">
<span class="pageNumber"></span>/<span class="totalPages"></span>
</div>

@ -1,20 +1,24 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::Document;
use crate::format::chromium_pdf::result::{PdfRenderingError, PdfRenderingResult};
use crate::format::html::html_writer::HTMLWriter;
use crate::format::html::to_html::ToHtml;
use crate::references::configuration::keys::{
INCLUDE_MATHJAX, PDF_DISPLAY_HEADER_FOOTER, PDF_FOOTER_TEMPLATE, PDF_HEADER_TEMPLATE,
PDF_MARGIN_BOTTOM, PDF_MARGIN_LEFT, PDF_MARGIN_RIGHT, PDF_MARGIN_TOP, PDF_PAGE_HEIGHT,
PDF_PAGE_SCALE, PDF_PAGE_WIDTH,
};
use crate::references::configuration::Configuration;
use crate::utils::downloads::get_cached_path;
use crate::settings::Settings;
use crate::utils::caching::CacheStorage;
use bibliographix::Mutex;
use headless_chrome::protocol::page::PrintToPdfOptions;
use headless_chrome::{Browser, LaunchOptionsBuilder, Tab};
use headless_chrome::{Browser, Tab};
use std::env;
use std::fs;
use std::fs::OpenOptions;
use std::io::BufWriter;
use std::path::PathBuf;
use std::sync::Arc;
use std::thread;
use std::time::{Duration, Instant};
@ -22,16 +26,17 @@ pub mod result;
/// Renders the document to pdf and returns the resulting bytes
pub fn render_to_pdf(document: Document) -> PdfRenderingResult<Vec<u8>> {
let mut file_path = PathBuf::from(format!("tmp-document.html"));
file_path = get_cached_path(file_path).with_extension("html");
let mut mathjax = false;
let cache = CacheStorage::new();
let mut file_path = PathBuf::from("tmp-document.html");
file_path = cache.get_file_path(&file_path);
if let Some(entry) = document.config.get_entry(INCLUDE_MATHJAX) {
if entry.get().as_bool() == Some(true) {
mathjax = true;
}
if !file_path.parent().map(|p| p.exists()).unwrap_or(false) {
file_path = env::current_dir()?;
file_path.push(PathBuf::from(".tmp-document.html"))
}
let config = document.config.clone();
let mathjax = config.lock().features.include_mathjax;
let handle = thread::spawn({
let file_path = file_path.clone();
@ -44,14 +49,15 @@ pub fn render_to_pdf(document: Document) -> PdfRenderingResult<Vec<u8>> {
.truncate(true)
.open(file_path)?,
);
let mut html_writer = HTMLWriter::new(Box::new(writer));
let mut html_writer =
HTMLWriter::new(Box::new(writer), document.config.lock().style.theme.clone());
document.to_html(&mut html_writer)?;
log::info!("Successfully rendered temporary html file!");
html_writer.flush()
}
});
let browser = Browser::new(LaunchOptionsBuilder::default().build().unwrap())?;
let browser = Browser::default()?;
let tab = browser.wait_for_initial_tab()?;
handle.join().unwrap()?;
tab.navigate_to(format!("file:///{}", file_path.to_string_lossy()).as_str())?;
@ -106,49 +112,23 @@ fn wait_for_mathjax(tab: &Tab, timeout: Duration) -> PdfRenderingResult<()> {
Ok(())
}
fn get_pdf_options(config: Configuration) -> PrintToPdfOptions {
fn get_pdf_options(config: Arc<Mutex<Settings>>) -> PrintToPdfOptions {
let config = config.lock().pdf.clone();
PrintToPdfOptions {
landscape: None,
display_header_footer: config
.get_entry(PDF_DISPLAY_HEADER_FOOTER)
.and_then(|value| value.get().as_bool()),
display_header_footer: Some(config.display_header_footer),
print_background: Some(true),
scale: config
.get_entry(PDF_PAGE_SCALE)
.and_then(|value| value.get().as_float())
.map(|value| value as f32),
paper_width: config
.get_entry(PDF_PAGE_WIDTH)
.and_then(|value| value.get().as_float())
.map(|value| value as f32),
paper_height: config
.get_entry(PDF_PAGE_HEIGHT)
.and_then(|value| value.get().as_float())
.map(|value| value as f32),
margin_top: config
.get_entry(PDF_MARGIN_TOP)
.and_then(|value| value.get().as_float())
.map(|f| f as f32),
margin_bottom: config
.get_entry(PDF_MARGIN_BOTTOM)
.and_then(|value| value.get().as_float())
.map(|f| f as f32),
margin_left: config
.get_entry(PDF_MARGIN_LEFT)
.and_then(|value| value.get().as_float())
.map(|f| f as f32),
margin_right: config
.get_entry(PDF_MARGIN_RIGHT)
.and_then(|value| value.get().as_float())
.map(|f| f as f32),
scale: Some(config.page_scale),
paper_width: config.page_width,
paper_height: config.page_height,
margin_top: config.margin.top,
margin_bottom: config.margin.bottom,
margin_left: config.margin.left,
margin_right: config.margin.right,
page_ranges: None,
ignore_invalid_page_ranges: None,
header_template: config
.get_entry(PDF_HEADER_TEMPLATE)
.map(|value| value.get().as_string()),
footer_template: config
.get_entry(PDF_FOOTER_TEMPLATE)
.map(|value| value.get().as_string()),
header_template: config.header_template,
footer_template: config.footer_template,
prefer_css_page_size: None,
}
}

@ -1,5 +1,11 @@
use serde::export::fmt::{self, Display};
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use std::error::Error;
use std::fmt::{self, Display};
use std::io;
pub type PdfRenderingResult<T> = Result<T, PdfRenderingError>;

@ -1,149 +0,0 @@
body {
background-color: #DDD;
overflow-x: hidden;
color: #000;
word-break: break-word;
}
.content {
font-family: "Fira Sans", "Noto Sans", SansSerif, sans-serif;
width: 100vh;
max-width: calc(100% - 4rem);
padding: 2rem;
margin: auto;
background-color: #FFF;
}
h1 {
font-size: 2.2rem;
}
h2 {
font-size: 1.8rem;
}
h3 {
font-size: 1.4rem;
}
h4 {
font-size: 1rem;
}
h5 {
font-size: 0.8rem;
}
h6 {
font-size: 0.4rem;
}
img {
max-width: 100%;
max-height: 100vh;
height: auto;
}
code {
color: #000;
}
code pre {
font-family: "Fira Code", "Mono", monospace;
padding: 0.8em 0.2em;
background-color: #EEE !important;
border-radius: 0.25em;
}
code.inlineCode {
font-family: "Fira Code", monospace;
border-radius: 0.1em;
background-color: #EEE;
padding: 0 0.1em
}
.tableWrapper {
overflow-x: auto;
width: 100%;
}
.tableWrapper > table {
margin: auto;
}
table {
border-collapse: collapse;
}
table tr:nth-child(odd) {
background-color: #DDD;
}
table tr:nth-child(1) {
background-color: white;
font-weight: bold;
border-bottom: 1px solid black;
}
table td, table th {
border-left: 1px solid black;
padding: 0.2em 0.5em
}
table tr td:first-child, table tr th:first-child {
border-left: none;
}
blockquote {
margin-left: 0;
background-color: rgba(0, 0, 0, 0);
}
.quote {
border-left: 0.3em solid gray;
border-radius: 0.2em;
padding-left: 1em;
margin-left: 0;
background-color: #EEE;
}
.quote .metadata {
font-style: italic;
padding-left: 0.5em;
color: #444
}
.figure {
width: 100%;
display: block;
text-align: center;
}
.figure .imageDescription {
display: block;
color: #444;
font-style: italic;
}
.centered {
text-align: center;
}
.glossaryReference {
text-decoration: none;
color: inherit;
border-bottom: 1px dotted #000;
}
.arrow {
font-family: "Fira Code", "Mono", monospace;
}
@media print {
.content > section > section, .content > section > section {
page-break-inside: avoid;
}
body {
background-color: white !important;
}
}

@ -1,14 +1,22 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::settings::style_settings::Theme;
use std::io;
use std::io::Write;
pub struct HTMLWriter {
inner: Box<dyn Write>,
theme: Theme,
}
impl HTMLWriter {
/// Creates a new writer
pub fn new(inner: Box<dyn Write>) -> Self {
Self { inner }
pub fn new(inner: Box<dyn Write>, theme: Theme) -> Self {
Self { inner, theme }
}
/// Writes a raw string
@ -30,4 +38,9 @@ impl HTMLWriter {
pub fn flush(&mut self) -> io::Result<()> {
self.inner.flush()
}
/// Return the theme of the html writer
pub fn get_theme(&mut self) -> Theme {
self.theme.clone()
}
}

@ -1,2 +1,8 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod html_writer;
pub mod to_html;

@ -1,18 +1,20 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::*;
use crate::format::html::html_writer::HTMLWriter;
use crate::format::style::{get_code_theme_for_theme, get_css_for_theme};
use crate::format::PlaceholderTemplate;
use crate::references::configuration::keys::{EMBED_EXTERNAL, INCLUDE_MATHJAX, META_LANG};
use crate::references::configuration::Value;
use crate::references::glossary::{GlossaryDisplay, GlossaryReference};
use crate::references::templates::{Template, TemplateVariable};
use asciimath_rs::format::mathml::ToMathML;
use htmlescape::encode_attribute;
use minify::html::minify;
use std::io;
use std::sync::Arc;
use syntect::highlighting::ThemeSet;
use syntect::html::highlighted_html_for_string;
use syntect::parsing::SyntaxSet;
const MATHJAX_URL: &str = "https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js";
@ -64,8 +66,9 @@ impl ToHtml for Inline {
Inline::Math(m) => m.to_html(writer),
Inline::LineBreak => writer.write("<br>".to_string()),
Inline::CharacterCode(code) => code.to_html(writer),
Inline::GlossaryReference(gloss) => gloss.lock().unwrap().to_html(writer),
Inline::GlossaryReference(gloss) => gloss.lock().to_html(writer),
Inline::Arrow(a) => a.to_html(writer),
Inline::Anchor(a) => a.to_html(writer),
}
}
}
@ -102,18 +105,6 @@ impl ToHtml for MetadataValue {
impl ToHtml for Document {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
let downloads = Arc::clone(&self.downloads);
if let Some(Value::Bool(embed)) = self
.config
.get_entry(EMBED_EXTERNAL)
.map(|e| e.get().clone())
{
if embed {
downloads.lock().unwrap().download_all();
}
} else {
downloads.lock().unwrap().download_all();
}
let path = if let Some(path) = &self.path {
format!("path=\"{}\"", encode_attribute(path.as_str()))
} else {
@ -121,38 +112,64 @@ impl ToHtml for Document {
};
if self.is_root {
let language = self
.config
.get_entry(META_LANG)
.map(|e| e.get().as_string())
.unwrap_or("en".to_string());
let style = minify(std::include_str!("assets/style.css"));
let metadata = self.config.lock().metadata.clone();
let style = minify(get_css_for_theme(writer.get_theme()).as_str());
writer.write("<!DOCTYPE html>".to_string())?;
writer.write("<html lang=\"".to_string())?;
writer.write_attribute(language)?;
writer.write("\"><head ".to_string())?;
writer.write(path)?;
writer.write("/>".to_string())?;
writer.write_attribute(metadata.language)?;
writer.write("\"><head>".to_string())?;
writer.write("<meta charset=\"UTF-8\">".to_string())?;
if let Some(author) = metadata.author {
writer.write("<meta name=\"author\" content=\"".to_string())?;
writer.write_attribute(author)?;
writer.write("\">".to_string())?;
}
if let Some(title) = metadata.title {
writer.write("<title>".to_string())?;
writer.write_escaped(title.clone())?;
writer.write("</title>".to_string())?;
writer.write("<meta name=\"title\" content=\"".to_string())?;
writer.write_attribute(title)?;
writer.write("\">".to_string())?;
}
if let Some(description) = metadata.description {
writer.write("<meta name=\"description\" content=\"".to_string())?;
writer.write_attribute(description)?;
writer.write("\">".to_string())?;
}
if !metadata.keywords.is_empty() {
writer.write("<meta name=\"keywords\" content=\"".to_string())?;
writer.write_attribute(
metadata
.keywords
.iter()
.fold("".to_string(), |a, b| format!("{}, {}", a, b))
.trim_start_matches(", ")
.to_string(),
)?;
writer.write("\">".to_string())?;
}
writer.write("<style>".to_string())?;
writer.write(style)?;
writer.write("</style>".to_string())?;
if self.config.lock().features.include_mathjax {
writer.write(format!(
"<script id=\"MathJax-script\" type=\"text/javascript\" async src={}></script>",
MATHJAX_URL
))?;
}
for stylesheet in &self.stylesheets {
let mut stylesheet = stylesheet.lock().unwrap();
let mut stylesheet = stylesheet.lock();
let data = std::mem::replace(&mut stylesheet.data, None);
if let Some(data) = data {
if self
.config
.get_entry(INCLUDE_MATHJAX)
.and_then(|e| e.get().as_bool())
.unwrap_or(true)
{
writer.write(format!(
"<script id=\"MathJax-script\" type=\"text/javascript\" async src={}></script>",
MATHJAX_URL
))?;
}
writer.write("<style>".to_string())?;
writer.write(minify(String::from_utf8(data).unwrap().as_str()))?;
writer.write("</style>".to_string())?;
@ -240,7 +257,7 @@ impl ToHtml for Paragraph {
}
if self.elements.len() > 1 {
for element in &self.elements[1..] {
writer.write("<br/>".to_string())?;
writer.write(" ".to_string())?;
element.to_html(writer)?;
}
}
@ -335,19 +352,19 @@ impl ToHtml for Cell {
impl ToHtml for CodeBlock {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
writer.write("<div><code".to_string())?;
if self.language.len() > 0 {
writer.write(" lang=\"".to_string())?;
writer.write_attribute(self.language.clone())?;
writer.write("\">".to_string())?;
lazy_static::lazy_static! { static ref PS: SyntaxSet = SyntaxSet::load_defaults_nonewlines(); }
lazy_static::lazy_static! { static ref TS: ThemeSet = ThemeSet::load_defaults(); }
let (theme, syntax_set) = get_code_theme_for_theme(writer.get_theme());
if let Some(syntax) = PS.find_syntax_by_token(self.language.as_str()) {
if let Some(syntax) = syntax_set.find_syntax_by_token(self.language.as_str()) {
writer.write(highlighted_html_for_string(
self.code.as_str(),
&PS,
&syntax_set,
syntax,
&TS.themes["InspiredGitHub"],
&theme,
))?;
} else {
writer.write("<pre>".to_string())?;
@ -401,7 +418,7 @@ impl ToHtml for Image {
let mut style = String::new();
let url = if let Some(content) = self.get_content() {
let mime_type = mime_guess::from_path(&self.url.url).first_or(mime::IMAGE_PNG);
let mime_type = self.get_mime_type();
format!(
"data:{};base64,{}",
mime_type.to_string(),
@ -420,7 +437,7 @@ impl ToHtml for Image {
}
if let Some(description) = self.url.description.clone() {
writer.write("<div class=\"figure\"><a href=\"".to_string())?;
writer.write_attribute(self.url.url.clone())?;
writer.write_attribute(url.clone())?;
writer.write("\"><img src=\"".to_string())?;
writer.write(url)?;
writer.write("\" style=\"".to_string())?;
@ -668,7 +685,7 @@ impl ToHtml for Anchor {
impl ToHtml for GlossaryReference {
fn to_html(&self, writer: &mut HTMLWriter) -> io::Result<()> {
if let Some(entry) = &self.entry {
let entry = entry.lock().unwrap();
let entry = entry.lock();
writer.write("<a class=\"glossaryReference\" href=\"#".to_string())?;
writer.write_attribute(self.short.clone())?;
writer.write("\">".to_string())?;

@ -1,9 +1,16 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use regex::Regex;
use std::collections::HashMap;
#[cfg(feature = "pdf")]
pub mod chromium_pdf;
pub mod html;
pub mod style;
pub struct PlaceholderTemplate {
value: String,

@ -0,0 +1,61 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::settings::style_settings::Theme;
use std::time::Instant;
use syntect::highlighting::ThemeSet;
use syntect::parsing::SyntaxSet;
/// Returns the css of a theme compiled from sass
pub fn get_css_for_theme(theme: Theme) -> String {
let start = Instant::now();
let vars = match theme {
Theme::GitHub => include_str!("assets/light-github.scss"),
Theme::SolarizedDark => include_str!("assets/dark-solarized.scss"),
Theme::SolarizedLight => include_str!("assets/light-solarized.scss"),
Theme::OceanDark => include_str!("assets/dark-ocean.scss"),
Theme::OceanLight => include_str!("assets/light-ocean.scss"),
Theme::MagicDark => include_str!("assets/dark-magic.scss"),
};
let style = format!("{}\n{}", vars, include_str!("assets/base.scss"));
let css = compile_sass(&*style);
log::debug!("Compiled style in {} ms", start.elapsed().as_millis());
css
}
/// Returns the syntax theme for a given theme
pub fn get_code_theme_for_theme(theme: Theme) -> (syntect::highlighting::Theme, SyntaxSet) {
lazy_static::lazy_static! { static ref PS: SyntaxSet = SyntaxSet::load_defaults_nonewlines(); }
lazy_static::lazy_static! { static ref TS: ThemeSet = ThemeSet::load_defaults(); }
let theme = match theme {
Theme::GitHub => "InspiredGitHub",
Theme::SolarizedDark => "Solarized (dark)",
Theme::SolarizedLight => "Solarized (light)",
Theme::OceanDark => "base16-ocean.dark",
Theme::OceanLight => "base16-ocean.light",
Theme::MagicDark => "base16-ocean.dark",
};
return (TS.themes[theme].clone(), PS.clone());
}
fn compile_sass(sass: &str) -> String {
String::from_utf8(
rsass::compile_scss(
sass.as_bytes(),
rsass::output::Format {
style: rsass::output::Style::Compressed,
precision: 5,
},
)
.unwrap(),
)
.unwrap()
}

@ -1,7 +1,14 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod elements;
pub mod format;
pub mod parser;
pub mod references;
pub mod settings;
pub mod utils;
pub use parser::Parser;

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use colored::Colorize;
use env_logger::Env;
use log::{Level, LevelFilter};
@ -6,44 +12,68 @@ use snekdown::elements::Document;
use snekdown::format::html::html_writer::HTMLWriter;
use snekdown::format::html::to_html::ToHtml;
use snekdown::parser::ParserOptions;
use snekdown::settings::Settings;
use snekdown::utils::caching::CacheStorage;
use snekdown::Parser;
use std::fs::{File, OpenOptions};
use std::io::{BufWriter, Write};
use std::io::{stdout, BufWriter, Write};
use std::path::PathBuf;
use std::process::exit;
use std::sync::mpsc::channel;
use std::time::{Duration, Instant};
use structopt::StructOpt;
#[derive(StructOpt, Debug)]
#[derive(StructOpt, Debug, Clone)]
struct Opt {
#[structopt(subcommand)]
sub_command: SubCommand,
}
#[derive(StructOpt, Debug, Clone)]
#[structopt()]
enum SubCommand {
/// Watch the document and its imports and render on change.
Watch(WatchOptions),
/// Parse and render the document.
Render(RenderOptions),
/// Initializes the project with default settings
Init,
/// Clears the cache directory
ClearCache,
}
#[derive(StructOpt, Debug, Clone)]
#[structopt()]
struct RenderOptions {
/// Path to the input file
#[structopt(parse(from_os_str))]
input: PathBuf,
/// Path for the output file
#[structopt(parse(from_os_str))]
output: PathBuf,
output: Option<PathBuf>,
/// If the output should be written to stdout instead of the output file
#[structopt(long = "stdout")]
stdout: bool,
/// The output format
#[structopt(short, long, default_value = "html")]
format: String,
/// Don't use the cache
#[structopt(long)]
no_cache: bool,
#[structopt(subcommand)]
sub_command: Option<SubCommand>,
}
#[derive(StructOpt, Debug)]
#[derive(StructOpt, Debug, Clone)]
#[structopt()]
enum SubCommand {
/// Watch the document and its imports and render on change.
Watch,
struct WatchOptions {
/// The amount of time in milliseconds to wait after changes before rendering
#[structopt(long, default_value = "500")]
debounce: u64,
/// Default. Parse and render the document.
Render,
#[structopt(flatten)]
render_options: RenderOptions,
}
fn main() {
@ -68,19 +98,17 @@ fn main() {
)
})
.init();
if !opt.input.exists() {
log::error!(
"The input file {} could not be found",
opt.input.to_str().unwrap()
);
return;
}
match &opt.sub_command {
Some(SubCommand::Render) | None => {
SubCommand::Render(opt) => {
let _ = render(&opt);
}
Some(SubCommand::Watch) => watch(&opt),
SubCommand::Watch(opt) => watch(&opt),
SubCommand::ClearCache => {
let cache = CacheStorage::new();
cache.clear().expect("Failed to clear cache");
}
SubCommand::Init => init(),
};
}
@ -94,17 +122,47 @@ fn get_level_style(level: Level) -> colored::Color {
}
}
fn init() {
let settings = Settings::default();
let settings_string = toml::to_string_pretty(&settings).unwrap();
let manifest_path = PathBuf::from("Manifest.toml");
let bibliography_path = PathBuf::from("Bibliography.toml");
let glossary_path = PathBuf::from("Glossary.toml");
let css_path = PathBuf::from("style.css");
if !manifest_path.exists() {
let mut file = OpenOptions::new()
.create(true)
.write(true)
.truncate(true)
.open("Manifest.toml")
.unwrap();
file.write_all(settings_string.as_bytes()).unwrap();
file.flush().unwrap();
}
if !bibliography_path.exists() {
File::create("Bibliography.toml".to_string()).unwrap();
}
if !glossary_path.exists() {
File::create("Glossary.toml".to_string()).unwrap();
}
if !css_path.exists() {
File::create("style.css".to_string()).unwrap();
}
}
/// Watches a file with all of its imports and renders on change
fn watch(opt: &Opt) {
let parser = render(opt);
fn watch(opt: &WatchOptions) {
let parser = render(&opt.render_options);
let (tx, rx) = channel();
let mut watcher = watcher(tx, Duration::from_millis(250)).unwrap();
let mut watcher = watcher(tx, Duration::from_millis(opt.debounce)).unwrap();
for path in parser.get_paths() {
watcher.watch(path, RecursiveMode::NonRecursive).unwrap();
}
while let Ok(_) = rx.recv() {
println!("---");
let parser = render(opt);
let parser = render(&opt.render_options);
for path in parser.get_paths() {
watcher.watch(path, RecursiveMode::NonRecursive).unwrap();
}
@ -112,37 +170,50 @@ fn watch(opt: &Opt) {
}
/// Renders the document to the output path
fn render(opt: &Opt) -> Parser {
fn render(opt: &RenderOptions) -> Parser {
if !opt.input.exists() {
log::error!(
"The input file {} could not be found",
opt.input.to_str().unwrap()
);
exit(1)
}
let start = Instant::now();
let mut parser = Parser::with_defaults(
ParserOptions::default()
.add_path(opt.input.clone())
.use_cache(!opt.no_cache),
);
let mut parser = Parser::with_defaults(ParserOptions::default().add_path(opt.input.clone()));
let document = parser.parse();
log::info!("Parsing took: {:?}", start.elapsed());
log::info!("Parsing + Processing took: {:?}", start.elapsed());
let start_render = Instant::now();
let file = OpenOptions::new()
.read(true)
.write(true)
.truncate(true)
.create(true)
.open(&opt.output)
.unwrap();
let writer = BufWriter::new(file);
if let Some(output) = &opt.output {
let file = OpenOptions::new()
.read(true)
.write(true)
.truncate(true)
.create(true)
.open(output)
.unwrap();
render_format(opt, document, BufWriter::new(file));
} else {
if !opt.stdout {
log::error!("No output file specified");
exit(1)
}
render_format(opt, document, BufWriter::new(stdout()));
}
render_format(opt, document, writer);
log::info!("Rendering took: {:?}", start_render.elapsed());
log::info!("Total: {:?}", start.elapsed());
log::info!("Rendering took: {:?}", start_render.elapsed());
log::info!("Total: {:?}", start.elapsed());
parser
}
#[cfg(not(feature = "pdf"))]
fn render_format(opt: &Opt, document: Document, writer: BufWriter<File>) {
fn render_format<W: Write + 'static>(opt: &RenderOptions, document: Document, writer: W) {
match opt.format.as_str() {
"html" => render_html(document, writer),
_ => log::error!("Unknown format {}", opt.format),
@ -150,7 +221,7 @@ fn render_format(opt: &Opt, document: Document, writer: BufWriter<File>) {
}
#[cfg(feature = "pdf")]
fn render_format(opt: &Opt, document: Document, writer: BufWriter<File>) {
fn render_format<W: Write + 'static>(opt: &RenderOptions, document: Document, writer: W) {
match opt.format.as_str() {
"html" => render_html(document, writer),
"pdf" => render_pdf(document, writer),
@ -158,14 +229,14 @@ fn render_format(opt: &Opt, document: Document, writer: BufWriter<File>) {
}
}
fn render_html(document: Document, writer: BufWriter<File>) {
let mut writer = HTMLWriter::new(Box::new(writer));
fn render_html<W: Write + 'static>(document: Document, writer: W) {
let mut writer = HTMLWriter::new(Box::new(writer), document.config.lock().style.theme.clone());
document.to_html(&mut writer).unwrap();
writer.flush().unwrap();
}
#[cfg(feature = "pdf")]
fn render_pdf(document: Document, mut writer: BufWriter<File>) {
fn render_pdf<W: Write + 'static>(document: Document, mut writer: W) {
use snekdown::format::chromium_pdf::render_to_pdf;
let result = render_to_pdf(document).expect("Failed to render pdf!");
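
The switch from BufWriter<File> to a generic W: Write + 'static is what enables the new --stdout flag; a standalone sketch of the same pattern (illustrative code, not taken from the repository):

use std::io::{stdout, Write};

fn write_output<W: Write>(mut writer: W) -> std::io::Result<()> {
    // Any Write implementor works: a file, stdout or an in-memory buffer.
    writer.write_all(b"<html>...</html>")?;
    writer.flush()
}

fn main() -> std::io::Result<()> {
    write_output(stdout())?;
    write_output(Vec::new())?;
    Ok(())
}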

@ -1,7 +1,13 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use super::ParseResult;
use crate::elements::tokens::*;
use crate::elements::{
Block, CodeBlock, Import, List, ListItem, MathBlock, Paragraph, Quote, Section, Table,
Block, CodeBlock, Import, List, ListItem, MathBlock, Metadata, Paragraph, Quote, Section, Table,
};
use crate::parser::inline::ParseInline;
use crate::parser::line::ParseLine;
@ -26,7 +32,7 @@ impl ParseBlock for Parser {
fn parse_block(&mut self) -> ParseResult<Block> {
if let Some(section) = self.section_return {
if section <= self.section_nesting && (self.section_nesting > 0) {
return Err(self.ctm.assert_error(None));
return Err(self.ctm.assert_error(None).into());
} else {
self.section_return = None;
}
@ -35,7 +41,7 @@ impl ParseBlock for Parser {
log::trace!("Block::Section");
Block::Section(section)
} else if let Some(_) = self.section_return {
return Err(self.ctm.err());
return Err(self.ctm.err().into());
} else if let Ok(list) = self.parse_list() {
log::trace!("Block::List");
Block::List(list)
@ -60,7 +66,7 @@ impl ParseBlock for Parser {
Block::Null
}
} else if let Some(_) = self.section_return {
return Err(self.ctm.err());
return Err(self.ctm.err().into());
} else if let Ok(pholder) = self.parse_placeholder() {
log::trace!("Block::Placeholder");
Block::Placeholder(pholder)
@ -68,7 +74,7 @@ impl ParseBlock for Parser {
log::trace!("Block::Paragraph");
Block::Paragraph(paragraph)
} else {
return Err(self.ctm.err());
return Err(self.ctm.err().into());
};
Ok(token)
@ -78,6 +84,7 @@ impl ParseBlock for Parser {
fn parse_section(&mut self) -> ParseResult<Section> {
let start_index = self.ctm.get_index();
self.ctm.seek_whitespace();
if self.ctm.check_char(&HASH) {
let mut size = 1;
while let Some(_) = self.ctm.next_char() {
@ -94,13 +101,15 @@ impl ParseBlock for Parser {
if size <= self.section_nesting {
self.section_return = Some(size);
}
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
self.ctm.seek_any(&INLINE_WHITESPACE)?;
let mut header = self.parse_header()?;
header.size = size;
self.section_nesting = size;
self.sections.push(size);
self.section_anchors.push(header.anchor.clone());
let mut section = Section::new(header);
section.metadata = metadata;
self.ctm.seek_whitespace();
@ -110,6 +119,7 @@ impl ParseBlock for Parser {
}
self.sections.pop();
self.section_anchors.pop();
if let Some(sec) = self.sections.last() {
self.section_nesting = *sec
} else {
@ -117,7 +127,7 @@ impl ParseBlock for Parser {
}
Ok(section)
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
@ -131,8 +141,9 @@ impl ParseBlock for Parser {
let language = self.ctm.get_string_until_any(&[LB], &[])?;
self.ctm.seek_one()?;
let text = self.ctm.get_string_until_sequence(&[&SQ_CODE_BLOCK], &[])?;
for _ in 0..2 {
self.ctm.seek_one()?;
self.ctm.try_seek();
}
Ok(CodeBlock {
@ -149,7 +160,7 @@ impl ParseBlock for Parser {
self.ctm.seek_one()?;
let text = self.ctm.get_string_until_sequence(&[SQ_MATH], &[])?;
for _ in 0..1 {
self.ctm.seek_one()?;
self.ctm.try_seek();
}
Ok(MathBlock {
expression: asciimath_rs::parse(text),
@ -160,6 +171,7 @@ impl ParseBlock for Parser {
fn parse_quote(&mut self) -> ParseResult<Quote> {
let start_index = self.ctm.get_index();
self.ctm.seek_whitespace();
let metadata = if let Ok(meta) = self.parse_inline_metadata() {
Some(meta)
} else {
@ -167,7 +179,7 @@ impl ParseBlock for Parser {
};
if self.ctm.check_char(&META_CLOSE) {
if self.ctm.next_char() == None {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
let mut quote = Quote::new(metadata);
@ -185,8 +197,10 @@ impl ParseBlock for Parser {
break;
}
}
quote.strip_linebreak();
if quote.text.len() == 0 {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
Ok(quote)
@ -194,11 +208,12 @@ impl ParseBlock for Parser {
/// Parses a paragraph
fn parse_paragraph(&mut self) -> ParseResult<Paragraph> {
self.ctm.seek_whitespace();
let mut paragraph = Paragraph::new();
while let Ok(token) = self.parse_line() {
paragraph.add_element(token);
while let Ok(element) = self.parse_line() {
paragraph.add_element(element);
let start_index = self.ctm.get_index();
if self.ctm.check_any_sequence(&BLOCK_SPECIAL_CHARS)
|| self.ctm.check_any(&self.block_break_at)
{
@ -213,7 +228,7 @@ impl ParseBlock for Parser {
if paragraph.elements.len() > 0 {
Ok(paragraph)
} else {
Err(self.ctm.err())
Err(self.ctm.err().into())
}
}
@ -227,6 +242,7 @@ impl ParseBlock for Parser {
let ordered = self.ctm.get_current().is_numeric();
list.ordered = ordered;
let mut list_hierarchy: Vec<ListItem> = Vec::new();
while let Ok(mut item) = self.parse_list_item() {
while let Some(parent_item) = list_hierarchy.pop() {
if parent_item.level < item.level {
@ -273,7 +289,7 @@ impl ParseBlock for Parser {
if list.items.len() > 0 {
Ok(list)
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
@ -285,6 +301,7 @@ impl ParseBlock for Parser {
}
let seek_index = self.ctm.get_index();
let mut table = Table::new(header);
while let Ok(_) = self.ctm.seek_one() {
self.ctm.seek_any(&INLINE_WHITESPACE)?;
if !self.ctm.check_any(&[MINUS, PIPE]) || self.ctm.check_char(&LB) {
@ -319,7 +336,7 @@ impl ParseBlock for Parser {
path.push(character);
}
if self.ctm.check_char(&LB) || path.is_empty() {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
if self.ctm.check_char(&IMPORT_CLOSE) {
self.ctm.seek_one()?;
@ -328,22 +345,20 @@ impl ParseBlock for Parser {
if self.section_nesting > 0 {
self.section_return = Some(0);
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
let metadata = self
.parse_inline_metadata()
.ok()
.map(|m| m.into())
.map(|m| m.get_string_map())
.unwrap_or(HashMap::new());
self.ctm.seek_whitespace();
match self.import(path.clone(), &metadata) {
ImportType::Document(Ok(anchor)) => Ok(Some(Import { path, anchor })),
ImportType::Stylesheet(_) => Ok(None),
ImportType::Bibliography(_) => Ok(None),
ImportType::Manifest(_) => Ok(None),
_ => Err(self.ctm.err()),
_ => Err(self.ctm.err().into()),
}
}
}

@ -1,16 +1,24 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use super::{ParseError, ParseResult};
use crate::elements::tokens::*;
use crate::elements::BibReference;
use crate::elements::*;
use crate::parser::block::ParseBlock;
use crate::references::configuration::keys::{BIB_REF_DISPLAY, SMART_ARROWS};
use crate::references::glossary::GlossaryDisplay;
use crate::references::glossary::GlossaryReference;
use crate::references::templates::{GetTemplateVariables, Template, TemplateVariable};
use crate::utils::parsing::remove_single_backlslash;
use crate::Parser;
use bibliographix::references::bib_reference::BibRef;
use parking_lot::Mutex;
use std::collections::HashMap;
use std::sync::{Arc, Mutex, RwLock};
use std::path::PathBuf;
use std::sync::{Arc, RwLock};
pub(crate) trait ParseInline {
fn parse_surrounded(&mut self, surrounding: &char) -> ParseResult<Vec<Inline>>;
@ -37,6 +45,7 @@ pub(crate) trait ParseInline {
fn parse_template(&mut self) -> ParseResult<Template>;
fn parse_character_code(&mut self) -> ParseResult<CharacterCode>;
fn parse_arrow(&mut self) -> ParseResult<Arrow>;
fn parse_anchor(&mut self) -> ParseResult<Anchor>;
}
impl ParseInline for Parser {
@ -46,11 +55,12 @@ impl ParseInline for Parser {
self.ctm.assert_char(surrounding, Some(start_index))?;
self.ctm.seek_one()?;
let mut inline = vec![self.parse_inline()?];
while !self.ctm.check_char(surrounding) {
if let Ok(result) = self.parse_inline() {
inline.push(result)
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
if !self.ctm.check_eof() {
@ -69,10 +79,10 @@ impl ParseInline for Parser {
}
}
if self.ctm.check_char(&PIPE) || self.ctm.check_char(&LB) {
Err(self.ctm.err())
Err(self.ctm.err().into())
} else if self.ctm.check_eof() {
log::trace!("EOF");
Err(self.ctm.err())
Err(self.ctm.err().into())
} else if let Ok(image) = self.parse_image() {
log::trace!("Inline::Image {:?}", image);
Ok(Inline::Image(image))
@ -98,7 +108,7 @@ impl ParseInline for Parser {
log::trace!("Inline::Striked");
Ok(Inline::Striked(striked))
} else if let Ok(gloss) = self.parse_glossary_reference() {
log::trace!("Inline::GlossaryReference {}", gloss.lock().unwrap().short);
log::trace!("Inline::GlossaryReference {}", gloss.lock().short);
Ok(Inline::GlossaryReference(gloss))
} else if let Ok(superscript) = self.parse_superscript() {
log::trace!("Inline::Superscript");
@ -124,6 +134,9 @@ impl ParseInline for Parser {
} else if let Ok(arrow) = self.parse_arrow() {
log::trace!("Inline::Arrow {:?}", arrow);
Ok(Inline::Arrow(arrow))
} else if let Ok(anchor) = self.parse_anchor() {
log::trace!("Inline::Anchor {:?}", anchor);
Ok(Inline::Anchor(anchor))
} else {
let plain = self.parse_plain()?;
log::trace!("Inline::Plain {}", plain.value);
@ -138,25 +151,27 @@ impl ParseInline for Parser {
self.ctm.seek_one()?;
if let Ok(url) = self.parse_url(true) {
let metadata = if let Ok(meta) = self.parse_inline_metadata() {
Some(meta)
} else {
None
};
let metadata = self.parse_inline_metadata().ok();
let path = url.url.clone();
let pending_image = self
.options
.document
.images
.lock()
.add_image(PathBuf::from(path));
if let Some(meta) = &metadata {
pending_image.lock().assign_from_meta(meta)
}
Ok(Image {
url,
metadata,
download: self
.options
.document
.downloads
.lock()
.unwrap()
.add_download(path),
image_data: pending_image,
})
} else {
Err(self.ctm.rewind_with_error(start_index))
Err(self.ctm.rewind_with_error(start_index).into())
}
}
@ -181,7 +196,7 @@ impl ParseInline for Parser {
self.inline_break_at.pop();
self.ctm.seek_one()?;
} else if !short_syntax {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
self.ctm.assert_char(&URL_OPEN, Some(start_index))?;
self.ctm.seek_one()?;
@ -214,7 +229,7 @@ impl ParseInline for Parser {
} else if self.ctm.check_char(&SPACE) {
false
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
};
self.ctm.seek_one()?;
self.ctm.assert_char(&CHECK_CLOSE, Some(start_index))?;
@ -229,11 +244,12 @@ impl ParseInline for Parser {
self.ctm.assert_sequence(&BOLD, Some(start_index))?;
self.ctm.seek_one()?;
let mut inline = vec![self.parse_inline()?];
while !self.ctm.check_sequence(&BOLD) {
if let Ok(result) = self.parse_inline() {
inline.push(result);
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
self.ctm.seek_one()?;
@ -257,12 +273,12 @@ impl ParseInline for Parser {
if let Ok(result) = self.parse_inline() {
inline.push(result);
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
self.ctm.rewind(self.ctm.get_index() - STRIKED.len());
if self.ctm.check_any(WHITESPACE) {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
for _ in 0..(STRIKED.len() + 1) {
self.ctm.seek_one()?;
@ -290,9 +306,10 @@ impl ParseInline for Parser {
let start_index = self.ctm.get_index();
self.ctm.assert_char(&BACKTICK, Some(start_index))?;
self.ctm.seek_one()?;
let content = self
.ctm
.get_string_until_any_or_rewind(&[BACKTICK, LB], &[], start_index)?;
let mut content =
self.ctm
.get_string_until_any_or_rewind(&[BACKTICK, LB], &[], start_index)?;
content = remove_single_backlslash(content);
self.ctm.assert_char(&BACKTICK, Some(start_index))?;
self.ctm.seek_one()?;
@ -326,7 +343,7 @@ impl ParseInline for Parser {
name,
})
} else {
Err(self.ctm.rewind_with_error(start_index))
Err(self.ctm.rewind_with_error(start_index).into())
}
}
@ -343,7 +360,7 @@ impl ParseInline for Parser {
)?;
self.ctm.seek_one()?;
if color.is_empty() {
return Err(self.ctm.err());
return Err(self.ctm.err().into());
}
Ok(Colored {
value: Box::new(self.parse_inline()?),
@ -363,7 +380,15 @@ impl ParseInline for Parser {
let bib_ref = BibRef::new(key.clone());
let ref_entry = Arc::new(RwLock::new(BibReference::new(
key,
self.options.document.config.get_ref_entry(BIB_REF_DISPLAY),
Some(
self.options
.document
.config
.lock()
.style
.bib_ref_display
.clone(),
),
bib_ref.anchor(),
)));
self.options
@ -371,7 +396,6 @@ impl ParseInline for Parser {
.bibliography
.root_ref_anchor()
.lock()
.unwrap()
.insert(bib_ref);
Ok(ref_entry)
@ -419,7 +443,7 @@ impl ParseInline for Parser {
self.ctm
.get_string_until_any_or_rewind(&WHITESPACE, &[TILDE], start_index)?;
if key.is_empty() {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
while !key.is_empty() && !key.chars().last().unwrap().is_alphabetic() {
self.ctm.rewind(self.ctm.get_index() - 1);
@ -432,14 +456,13 @@ impl ParseInline for Parser {
.document
.glossary
.lock()
.unwrap()
.add_reference(reference))
}
/// parses plain text as a string until it encounters an unescaped special inline char
fn parse_plain(&mut self) -> ParseResult<PlainText> {
if self.ctm.check_char(&LB) {
return Err(self.ctm.err());
return Err(self.ctm.err().into());
}
let mut characters = String::new();
if !self.ctm.check_char(&SPECIAL_ESCAPE) {
@ -464,7 +487,7 @@ impl ParseInline for Parser {
if characters.len() > 0 {
Ok(PlainText { value: characters })
} else {
Err(self.ctm.err())
Err(self.ctm.err().into())
}
}
@ -488,7 +511,7 @@ impl ParseInline for Parser {
if values.len() == 0 {
// if there was a linebreak (the metadata wasn't closed) or there is no inner data
// return an error
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
Ok(InlineMetadata { data: values })
@ -503,6 +526,7 @@ impl ParseInline for Parser {
self.ctm.seek_any(&INLINE_WHITESPACE)?;
let mut value = MetadataValue::Bool(true);
if self.ctm.check_char(&EQ) {
self.ctm.seek_one()?;
self.ctm.seek_any(&INLINE_WHITESPACE)?;
@ -563,7 +587,7 @@ impl ParseInline for Parser {
{
name_str
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
};
if !self.ctm.check_eof() {
self.ctm.seek_one()?;
@ -587,7 +611,7 @@ impl ParseInline for Parser {
self.ctm.seek_one()?;
if self.ctm.check_char(&TEMPLATE) {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
let mut elements = Vec::new();
@ -642,15 +666,8 @@ impl ParseInline for Parser {
/// Parses an arrow
fn parse_arrow(&mut self) -> ParseResult<Arrow> {
if !self
.options
.document
.config
.get_entry(SMART_ARROWS)
.and_then(|e| e.get().as_bool())
.unwrap_or(true)
{
Err(self.ctm.err())
if !self.options.document.config.lock().features.smart_arrows {
Err(self.ctm.err().into())
} else if self.ctm.check_sequence(A_LEFT_RIGHT_ARROW) {
self.ctm.seek_one()?;
Ok(Arrow::LeftRightArrow)
@ -670,7 +687,25 @@ impl ParseInline for Parser {
self.ctm.seek_one()?;
Ok(Arrow::BigLeftArrow)
} else {
Err(self.ctm.err())
Err(self.ctm.err().into())
}
}
/// Parses an anchor element
fn parse_anchor(&mut self) -> ParseResult<Anchor> {
let start_index = self.ctm.get_index();
self.ctm.assert_sequence(&ANCHOR_START, Some(start_index))?;
self.ctm.seek_one()?;
let key = self.ctm.get_string_until_any_or_rewind(
&[ANCHOR_STOP],
&INLINE_WHITESPACE,
start_index,
)?;
self.ctm.try_seek();
Ok(Anchor {
inner: Box::new(Line::Text(TextLine::new())),
key,
})
}
}

@ -1,7 +1,14 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use super::ParseResult;
use crate::elements::tokens::*;
use crate::elements::Inline::LineBreak;
use crate::elements::{BibEntry, Metadata};
use crate::elements::{Cell, Centered, Header, Inline, Line, ListItem, Row, Ruler, TextLine};
use crate::elements::{Cell, Centered, Header, Line, ListItem, Row, Ruler, TextLine};
use crate::parser::inline::ParseInline;
use crate::Parser;
use bibliographix::bibliography::bibliography_entry::BibliographyEntry;
@ -16,6 +23,7 @@ pub(crate) trait ParseLine {
fn parse_row(&mut self) -> ParseResult<Row>;
fn parse_centered(&mut self) -> ParseResult<Centered>;
fn parse_ruler(&mut self) -> ParseResult<Ruler>;
fn parse_paragraph_break(&mut self) -> ParseResult<TextLine>;
fn parse_text_line(&mut self) -> ParseResult<TextLine>;
fn parse_bib_entry(&mut self) -> ParseResult<BibEntry>;
}
@ -25,7 +33,7 @@ impl ParseLine for Parser {
fn parse_line(&mut self) -> ParseResult<Line> {
if self.ctm.check_eof() {
log::trace!("EOF");
Err(self.ctm.err())
Err(self.ctm.err().into())
} else {
if let Ok(ruler) = self.parse_ruler() {
log::trace!("Line::Ruler");
@ -36,11 +44,14 @@ impl ParseLine for Parser {
} else if let Ok(bib) = self.parse_bib_entry() {
log::trace!("Line::BibEntry");
Ok(Line::BibEntry(bib))
} else if let Ok(text) = self.parse_paragraph_break() {
log::trace!("Line::LineBreak");
Ok(Line::Text(text))
} else if let Ok(text) = self.parse_text_line() {
log::trace!("Line::Text");
Ok(Line::Text(text))
} else {
Err(self.ctm.err())
Err(self.ctm.err().into())
}
}
}
@ -53,6 +64,9 @@ impl ParseLine for Parser {
self.ctm.get_text()[start_index..self.ctm.get_index()]
.iter()
.for_each(|e| anchor.push(*e));
if let Some(last) = self.section_anchors.last() {
anchor = format!("{}-{}", last, anchor);
}
anchor.retain(|c| !c.is_whitespace());
log::trace!("Line::Header");
Ok(Header::new(line, anchor))
@ -76,11 +90,11 @@ impl ParseLine for Parser {
}
if !self.ctm.check_any(&INLINE_WHITESPACE) {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
self.ctm.seek_any(&INLINE_WHITESPACE)?;
if self.ctm.check_char(&MINUS) {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
let item = ListItem::new(self.parse_line()?, level as u16, ordered);
@ -96,7 +110,7 @@ impl ParseLine for Parser {
self.ctm.assert_char(&PIPE, Some(start_index))?;
self.ctm.seek_one()?;
if self.ctm.check_char(&PIPE) {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
self.inline_break_at.push(PIPE);
@ -134,7 +148,7 @@ impl ParseLine for Parser {
log::trace!("Line::TableRow");
Ok(row)
} else {
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
@ -163,6 +177,8 @@ impl ParseLine for Parser {
/// Parses a line of text
fn parse_text_line(&mut self) -> ParseResult<TextLine> {
let mut text = TextLine::new();
let start_index = self.ctm.get_index();
while let Ok(subtext) = self.parse_inline() {
text.add_subtext(subtext);
if self.ctm.check_eof() || self.ctm.check_any(&self.inline_break_at) {
@ -170,21 +186,36 @@ impl ParseLine for Parser {
}
}
// add a linebreak when encountering \n\n
if self.ctm.check_char(&LB) {
if let Ok(_) = self.ctm.seek_one() {
if self.ctm.check_char(&LB) {
text.add_subtext(Inline::LineBreak)
}
self.ctm.try_seek();
if self.ctm.check_char(&LB) {
text.add_subtext(LineBreak);
self.ctm.try_seek();
}
}
if text.subtext.len() > 0 {
Ok(text)
} else {
Err(self.ctm.err())
Err(self.ctm.rewind_with_error(start_index).into())
}
}
/// Parses a paragraph break
fn parse_paragraph_break(&mut self) -> ParseResult<TextLine> {
let start_index = self.ctm.get_index();
self.ctm.assert_char(&LB, Some(start_index))?;
self.ctm.seek_one()?;
let mut line = TextLine::new();
line.subtext.push(LineBreak);
Ok(line)
}
fn parse_bib_entry(&mut self) -> ParseResult<BibEntry> {
let start_index = self.ctm.get_index();
self.ctm.seek_any(&INLINE_WHITESPACE)?;
@ -211,7 +242,7 @@ impl ParseLine for Parser {
msg,
self.get_position_string()
);
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
} else {
@ -232,17 +263,17 @@ impl ParseLine for Parser {
msg,
self.get_position_string()
);
return Err(self.ctm.rewind_with_error(start_index));
return Err(self.ctm.rewind_with_error(start_index).into());
}
}
};
self.ctm.seek_whitespace();
self.options
.document
.bibliography
.entry_dictionary()
.lock()
.unwrap()
.insert(entry);
Ok(BibEntry {
@ -252,7 +283,6 @@ impl ParseLine for Parser {
.bibliography
.entry_dictionary()
.lock()
.unwrap()
.get(&key)
.unwrap(),
key,

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub(crate) mod block;
pub(crate) mod inline;
pub(crate) mod line;
@ -5,31 +11,54 @@ pub(crate) mod line;
use self::block::ParseBlock;
use crate::elements::tokens::LB;
use crate::elements::{Document, ImportAnchor};
use crate::references::configuration::keys::{
IMP_BIBLIOGRAPHY, IMP_CONFIGS, IMP_GLOSSARY, IMP_IGNORE, IMP_STYLESHEETS,
};
use crate::references::configuration::Value;
use charred::tapemachine::{CharTapeMachine, TapeError, TapeResult};
use crate::settings::SettingsError;
use charred::tapemachine::{CharTapeMachine, TapeError};
use crossbeam_utils::sync::WaitGroup;
use regex::Regex;
use std::collections::HashMap;
use std::fmt;
use std::fs::{read_to_string, File};
use std::io::BufReader;
use std::io::{self, BufReader};
use std::path::PathBuf;
use std::sync::{Arc, Mutex, RwLock};
use std::thread;
pub type ParseResult<T> = TapeResult<T>;
pub type ParseError = TapeError;
pub type ParseResult<T> = Result<T, ParseError>;
#[derive(Debug)]
pub enum ParseError {
TapeError(TapeError),
SettingsError(SettingsError),
IoError(io::Error),
}
impl fmt::Display for ParseError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
ParseError::TapeError(e) => write!(f, "{}", e),
ParseError::SettingsError(e) => write!(f, "{}", e),
ParseError::IoError(e) => write!(f, "IO Error: {}", e),
}
}
}
impl From<TapeError> for ParseError {
fn from(e: TapeError) -> Self {
Self::TapeError(e)
}
}
impl From<SettingsError> for ParseError {
fn from(e: SettingsError) -> Self {
Self::SettingsError(e)
}
}
const DEFAULT_IMPORTS: &'static [(&str, &str)] = &[
("snekdown.toml", "manifest"),
("manifest.toml", "manifest"),
("bibliography.toml", "bibliography"),
("bibliography2.bib.toml", "bibliography"),
("glossary.toml", "glossary"),
("style.css", "stylesheet"),
];
impl From<io::Error> for ParseError {
fn from(e: io::Error) -> Self {
Self::IoError(e)
}
}
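
The numerous .into() calls in the parser hunks rely on these From conversions; a compact standalone sketch of the pattern (the type names here are stand-ins, not the crate's real imports):

use std::io;

#[derive(Debug)]
struct TapeError; // stand-in for charred::tapemachine::TapeError

#[derive(Debug)]
enum ParseError {
    TapeError(TapeError),
    IoError(io::Error),
}

impl From<TapeError> for ParseError {
    fn from(e: TapeError) -> Self {
        Self::TapeError(e)
    }
}

impl From<io::Error> for ParseError {
    fn from(e: io::Error) -> Self {
        Self::IoError(e)
    }
}

fn read_source(path: &str) -> Result<String, ParseError> {
    // `?` converts the io::Error into a ParseError via From,
    // just like `.into()` lifts TapeError results inside the parser.
    Ok(std::fs::read_to_string(path)?)
}

fn main() {
    let _ = read_source("missing.md");
    let _: ParseError = TapeError.into();
}
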
#[derive(Clone, Debug)]
pub struct ParserOptions {
@ -58,13 +87,6 @@ impl ParserOptions {
self
}
/// If external sources should be cached after being downloaded
pub fn use_cache(self, value: bool) -> Self {
self.document.downloads.lock().unwrap().use_cache = value;
self
}
}
pub struct Parser {
@ -72,6 +94,7 @@ pub struct Parser {
pub(crate) ctm: CharTapeMachine,
section_nesting: u8,
sections: Vec<u8>,
section_anchors: Vec<String>,
section_return: Option<u8>,
wg: WaitGroup,
pub(crate) block_break_at: Vec<char>,
@ -84,6 +107,7 @@ impl Parser {
pub fn with_defaults(options: ParserOptions) -> Self {
let text = if let Some(path) = &options.path {
let mut text = read_to_string(&path).unwrap();
text = text.replace("\r\n", "\n");
if text.chars().last() != Some('\n') {
text.push('\n');
}
@ -95,6 +119,7 @@ impl Parser {
Self {
options,
sections: Vec::new(),
section_anchors: Vec::new(),
section_nesting: 0,
section_return: None,
wg: WaitGroup::new(),
@ -118,13 +143,18 @@ impl Parser {
/// Returns a string of the current position in the file
pub(crate) fn get_position_string(&self) -> String {
let char_index = self.ctm.get_index();
self.get_position_string_for_index(char_index)
}
/// Returns a string of the given index position in the file
fn get_position_string_for_index(&self, char_index: usize) -> String {
let text = self.ctm.get_text();
let mut text_unil = text[..char_index].to_vec();
let line_number = text_unil.iter().filter(|c| c == &&LB).count();
text_unil.reverse();
let mut inline_pos = 0;
while text_unil[inline_pos] != LB {
while inline_pos < text_unil.len() && text_unil[inline_pos] != LB {
inline_pos += 1;
}
if let Some(path) = &self.options.path {
@ -157,27 +187,15 @@ impl Parser {
path.to_str().unwrap(),
self.get_position_string(),
);
return Err(self.ctm.assert_error(None));
}
{
let mut paths = self.options.paths.lock().unwrap();
if paths.iter().find(|item| **item == path) != None {
log::warn!(
"Import of \"{}\" failed: Already imported.\n\t--> {}\n",
path.to_str().unwrap(),
self.get_position_string(),
);
return Err(self.ctm.assert_error(None));
}
paths.push(path.clone());
return Err(self.ctm.assert_error(None).into());
}
let anchor = Arc::new(RwLock::new(ImportAnchor::new()));
let anchor_clone = Arc::clone(&anchor);
let wg = self.wg.clone();
let mut chid_parser = self.create_child(path.clone());
let mut child_parser = self.create_child(path.clone());
let _ = thread::spawn(move || {
let document = chid_parser.parse();
let document = child_parser.parse();
anchor_clone.write().unwrap().set_document(document);
drop(wg);
@ -200,7 +218,7 @@ impl Parser {
/// Returns the text of an imported text file
fn import_text_file(&self, path: PathBuf) -> ParseResult<String> {
read_to_string(path).map_err(|_| self.ctm.err())
read_to_string(path).map_err(ParseError::from)
}
fn import_stylesheet(&mut self, path: PathBuf) -> ParseResult<()> {
@ -209,7 +227,6 @@ impl Parser {
.document
.downloads
.lock()
.unwrap()
.add_download(path.to_str().unwrap().to_string()),
);
@ -217,13 +234,12 @@ impl Parser {
}
fn import_manifest(&mut self, path: PathBuf) -> ParseResult<()> {
let contents = self.import_text_file(path)?;
let value = contents
.parse::<toml::Value>()
.map_err(|_| self.ctm.err())?;
self.options.document.config.set_from_toml(&value);
Ok(())
self.options
.document
.config
.lock()
.merge(path)
.map_err(ParseError::from)
}
/// Imports a glossary
@ -236,7 +252,6 @@ impl Parser {
.document
.glossary
.lock()
.unwrap()
.assign_from_toml(value)
.unwrap_or_else(|e| log::error!("{}", e));
@ -244,7 +259,7 @@ impl Parser {
}
/// Imports a path
fn import(&mut self, path: String, args: &HashMap<String, Value>) -> ImportType {
fn import(&mut self, path: String, args: &HashMap<String, String>) -> ImportType {
log::debug!(
"Importing file {}\n\t--> {}\n",
path,
@ -263,23 +278,24 @@ impl Parser {
.file_name()
.and_then(|f| Some(f.to_str().unwrap().to_string()))
{
if let Some(Value::Array(ignore)) = self
.options
.document
.config
.get_entry(IMP_IGNORE)
.and_then(|e| Some(e.get().clone()))
{
let ignore = ignore
.iter()
.map(|v| v.as_string())
.collect::<Vec<String>>();
if ignore.contains(&fname) {
return ImportType::None;
}
let ignore = &self.options.document.config.lock().imports.ignored_imports;
if ignore.contains(&fname) {
return ImportType::None;
}
}
match args.get("type").map(|e| e.as_string().to_lowercase()) {
{
let mut paths = self.options.paths.lock().unwrap();
if paths.iter().find(|item| **item == path).is_some() {
log::warn!(
"Import of \"{}\" failed: Already imported.\n\t--> {}\n",
path.to_str().unwrap(),
self.get_position_string(),
);
return ImportType::None;
}
paths.push(path.clone());
}
match args.get("type").cloned() {
Some(s) if s == "stylesheet".to_string() => {
ImportType::Stylesheet(self.import_stylesheet(path))
}
@ -328,7 +344,18 @@ impl Parser {
if self.ctm.check_eof() {
break;
}
eprintln!("{}", err);
match err {
ParseError::TapeError(t) => {
log::error!(
"Parse Error: {}\n\t--> {}\n",
t,
self.get_position_string_for_index(t.get_index())
)
}
_ => {
log::error!("{}", err)
}
}
break;
}
}
@ -337,14 +364,10 @@ impl Parser {
let wg = self.wg.clone();
self.wg = WaitGroup::new();
if !self.options.is_child {
for (path, file_type) in DEFAULT_IMPORTS {
if self.transform_path(path.to_string()).exists() {
self.import(
path.to_string(),
&maplit::hashmap! {"type".to_string() => Value::String(file_type.to_string())},
);
}
}
self.import(
"Manifest.toml".to_string(),
&maplit::hashmap! {"type".to_string() => "manifest".to_string()},
);
}
wg.wait();
if !self.options.is_child {
@ -362,57 +385,25 @@ impl Parser {
/// Imports files from the config's import values
fn import_from_config(&mut self) {
if let Some(Value::Array(mut imp)) = self
.options
.document
.config
.get_entry(IMP_STYLESHEETS)
.and_then(|e| Some(e.get().clone()))
{
let args =
maplit::hashmap! {"type".to_string() => Value::String("stylesheet".to_string())};
while let Some(Value::String(s)) = imp.pop() {
self.import(s, &args);
}
}
if let Some(Value::Array(mut imp)) = self
.options
.document
.config
.get_entry(IMP_CONFIGS)
.and_then(|e| Some(e.get().clone()))
{
let args = maplit::hashmap! {"type".to_string() => Value::String("config".to_string())};
while let Some(Value::String(s)) = imp.pop() {
self.import(s, &args);
}
let config = Arc::clone(&self.options.document.config);
let mut stylesheets = config.lock().imports.included_stylesheets.clone();
let args = maplit::hashmap! {"type".to_string() => "stylesheet".to_string()};
while let Some(s) = stylesheets.pop() {
self.import(s, &args);
}
if let Some(Value::Array(mut imp)) = self
.options
.document
.config
.get_entry(IMP_BIBLIOGRAPHY)
.and_then(|e| Some(e.get().clone()))
{
let args =
maplit::hashmap! {"type".to_string() => Value::String("bibliography".to_string())};
while let Some(Value::String(s)) = imp.pop() {
self.import(s, &args);
}
let mut bibliography = config.lock().imports.included_bibliography.clone();
let args = maplit::hashmap! {"type".to_string() => "bibliography".to_string()};
while let Some(s) = bibliography.pop() {
self.import(s, &args);
}
if let Some(Value::Array(mut imp)) = self
.options
.document
.config
.get_entry(IMP_GLOSSARY)
.and_then(|e| Some(e.get().clone()))
{
let args =
maplit::hashmap! {"type".to_string() => Value::String("glossary".to_string())};
while let Some(Value::String(s)) = imp.pop() {
self.import(s, &args);
}
let mut glossaries = config.lock().imports.included_glossaries.clone();
let args = maplit::hashmap! {"type".to_string() =>"glossary".to_string()};
while let Some(s) = glossaries.pop() {
self.import(s, &args);
}
}
}

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{Anchor, BoldText, ItalicText, Line, List, ListItem, PlainText, TextLine};
use crate::elements::{Inline, Url};
use bibliographix::bibliography::bib_types::article::Article;
@ -33,7 +39,6 @@ pub fn create_bib_list(entries: Vec<BibliographyEntryReference>) -> List {
for entry in entries {
entry
.lock()
.unwrap()
.raw_fields
.insert("ord".to_string(), count.to_string());
list.add_item(get_item_for_entry(entry));
@ -45,7 +50,7 @@ pub fn create_bib_list(entries: Vec<BibliographyEntryReference>) -> List {
/// Returns the list item for a bib entry
fn get_item_for_entry(entry: BibliographyEntryReference) -> ListItem {
let entry = entry.lock().unwrap();
let entry = entry.lock();
match &entry.bib_type {
BibliographyType::Article(a) => get_item_for_article(&*entry, a),

@ -1,26 +0,0 @@
#![allow(unused)]
pub const BIB_REF_DISPLAY: &str = "bib-ref-display";
pub const META_LANG: &str = "language";
// import and include options
pub const IMP_IGNORE: &str = "ignored-imports";
pub const IMP_STYLESHEETS: &str = "included-stylesheets";
pub const IMP_CONFIGS: &str = "included-configs";
pub const IMP_BIBLIOGRAPHY: &str = "included-bibliography";
pub const IMP_GLOSSARY: &str = "included-glossary";
pub const EMBED_EXTERNAL: &str = "embed-external";
pub const SMART_ARROWS: &str = "smart-arrows";
pub const INCLUDE_MATHJAX: &str = "include-math-jax";
// PDF options
pub const PDF_DISPLAY_HEADER_FOOTER: &str = "pfd-display-header-footer";
pub const PDF_HEADER_TEMPLATE: &str = "pdf-header-template";
pub const PDF_FOOTER_TEMPLATE: &str = "pdf-footer-template";
pub const PDF_MARGIN_TOP: &str = "pdf-margin-top";
pub const PDF_MARGIN_BOTTOM: &str = "pdf-margin-bottom";
pub const PDF_MARGIN_LEFT: &str = "pdf-margin-left";
pub const PDF_MARGIN_RIGHT: &str = "pdf-margin-right";
pub const PDF_PAGE_HEIGHT: &str = "pdf-page-height";
pub const PDF_PAGE_WIDTH: &str = "pdf-page-width";
pub const PDF_PAGE_SCALE: &str = "pdf-page-scale";

@ -1,184 +0,0 @@
use crate::elements::MetadataValue;
use crate::references::configuration::keys::{
BIB_REF_DISPLAY, META_LANG, PDF_FOOTER_TEMPLATE, PDF_HEADER_TEMPLATE, PDF_MARGIN_BOTTOM,
PDF_MARGIN_TOP,
};
use crate::references::templates::Template;
use serde::export::TryFrom;
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
pub(crate) mod keys;
#[derive(Clone, Debug)]
pub enum Value {
String(String),
Bool(bool),
Float(f64),
Integer(i64),
Template(Template),
Array(Vec<Value>),
}
#[derive(Clone, Debug)]
pub struct ConfigEntry {
inner: Value,
}
pub type ConfigRefEntry = Arc<RwLock<ConfigEntry>>;
#[derive(Clone, Debug)]
pub struct Configuration {
config: Arc<RwLock<HashMap<String, ConfigRefEntry>>>,
}
impl Value {
pub fn as_string(&self) -> String {
match self {
Value::String(string) => string.clone(),
Value::Integer(int) => format!("{}", int),
Value::Float(f) => format!("{:02}", f),
Value::Bool(b) => format!("{}", b),
Value::Array(a) => a.iter().fold("".to_string(), |a, b| {
format!("{} \"{}\"", a, b.as_string())
}),
_ => "".to_string(),
}
}
/// Returns the bool value if the value is a boolean
pub fn as_bool(&self) -> Option<bool> {
match self {
Value::Bool(b) => Some(*b),
_ => None,
}
}
pub fn as_float(&self) -> Option<f64> {
match self {
Value::Float(v) => Some(*v),
_ => None,
}
}
}
impl ConfigEntry {
pub fn new(value: Value) -> Self {
Self { inner: value }
}
pub fn set(&mut self, value: Value) {
self.inner = value;
}
pub fn get(&self) -> &Value {
&self.inner
}
}
impl Default for Configuration {
fn default() -> Self {
let mut self_config = Self::new();
self_config.set(BIB_REF_DISPLAY, Value::String("{{number}}".to_string()));
self_config.set(META_LANG, Value::String("en".to_string()));
self_config.set(PDF_MARGIN_BOTTOM, Value::Float(0.5));
self_config.set(PDF_MARGIN_TOP, Value::Float(0.5));
self_config.set(PDF_HEADER_TEMPLATE, Value::String("<div/>".to_string()));
self_config.set(
PDF_FOOTER_TEMPLATE,
Value::String(
include_str!("../../format/chromium_pdf/assets/default-footer-template.html")
.to_string(),
),
);
self_config
}
}
impl Configuration {
pub fn new() -> Self {
Self {
config: Arc::new(RwLock::new(HashMap::new())),
}
}
/// returns the value of a config entry
pub fn get_entry(&self, key: &str) -> Option<ConfigEntry> {
let config = self.config.read().unwrap();
if let Some(entry) = config.get(key) {
let value = entry.read().unwrap();
Some(value.clone())
} else {
None
}
}
/// returns a config entry that is a reference to a value
pub fn get_ref_entry(&self, key: &str) -> Option<ConfigRefEntry> {
let config = self.config.read().unwrap();
if let Some(entry) = config.get(&key.to_string()) {
Some(Arc::clone(entry))
} else {
None
}
}
/// Sets a config parameter
pub fn set(&mut self, key: &str, value: Value) {
let mut config = self.config.write().unwrap();
if let Some(entry) = config.get(&key.to_string()) {
entry.write().unwrap().set(value)
} else {
config.insert(
key.to_string(),
Arc::new(RwLock::new(ConfigEntry::new(value))),
);
}
}
/// Sets a config value based on a metadata value
pub fn set_from_meta(&mut self, key: &str, value: MetadataValue) {
match value {
MetadataValue::String(string) => self.set(key, Value::String(string)),
MetadataValue::Bool(bool) => self.set(key, Value::Bool(bool)),
MetadataValue::Float(f) => self.set(key, Value::Float(f)),
MetadataValue::Integer(i) => self.set(key, Value::Integer(i)),
MetadataValue::Template(t) => self.set(key, Value::Template(t)),
_ => {}
}
}
pub fn set_from_toml(&mut self, value: &toml::Value) -> Option<()> {
let table = value.as_table().cloned()?;
table.iter().for_each(|(k, v)| {
match v {
toml::Value::Table(_) => self.set_from_toml(v).unwrap_or(()),
_ => self.set(k, Value::try_from(v.clone()).unwrap()),
};
});
Some(())
}
}
impl TryFrom<toml::Value> for Value {
type Error = ();
fn try_from(value: toml::Value) -> Result<Self, Self::Error> {
match value {
toml::Value::Table(_) => Err(()),
toml::Value::Float(f) => Ok(Value::Float(f)),
toml::Value::Integer(i) => Ok(Value::Integer(i)),
toml::Value::String(s) => Ok(Value::String(s)),
toml::Value::Boolean(b) => Ok(Value::Bool(b)),
toml::Value::Datetime(dt) => Ok(Value::String(dt.to_string())),
toml::Value::Array(a) => Ok(Value::Array(
a.iter()
.cloned()
.filter_map(|e| Value::try_from(e).ok())
.collect::<Vec<Value>>(),
)),
}
}
}

@ -1,9 +1,16 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{
Anchor, BoldText, Inline, ItalicText, Line, List, ListItem, PlainText, TextLine,
};
use parking_lot::Mutex;
use std::cmp::Ordering;
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use std::sync::Arc;
use crate::bold_text;
use crate::italic_text;
@ -110,11 +117,11 @@ impl GlossaryManager {
/// Assigns entries to references
pub fn assign_entries_to_references(&self) {
for reference in &self.references {
let mut reference = reference.lock().unwrap();
let mut reference = reference.lock();
if let Some(entry) = self.entries.get(&reference.short) {
reference.entry = Some(Arc::clone(entry));
let mut entry = entry.lock().unwrap();
let mut entry = entry.lock();
if !entry.is_assigned {
entry.is_assigned = true;
@ -130,13 +137,13 @@ impl GlossaryManager {
let mut entries = self
.entries
.values()
.filter(|e| e.lock().unwrap().is_assigned)
.filter(|e| e.lock().is_assigned)
.cloned()
.collect::<Vec<Arc<Mutex<GlossaryEntry>>>>();
entries.sort_by(|a, b| {
let a = a.lock().unwrap();
let b = b.lock().unwrap();
let a = a.lock();
let b = b.lock();
if a.short > b.short {
Ordering::Greater
} else if a.short < b.short {
@ -146,7 +153,7 @@ impl GlossaryManager {
}
});
for entry in &entries {
let entry = entry.lock().unwrap();
let entry = entry.lock();
let mut line = TextLine::new();
line.subtext.push(bold_text!(entry.short.clone()));
line.subtext.push(plain_text!(" - ".to_string()));

@ -1,5 +1,10 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod bibliography;
pub mod configuration;
pub mod glossary;
pub mod placeholders;
pub mod templates;

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::*;
use crate::references::bibliography::create_bib_list;
use chrono::prelude::*;
@ -35,6 +41,8 @@ const P_GLS: &str = "gls";
const P_DATE: &str = "date";
const P_TIME: &str = "time";
const P_DATETIME: &str = "datetime";
const P_AUTHOR: &str = "author";
const P_TITLE: &str = "title";
impl ProcessPlaceholders for Document {
/// parses all placeholders and assigns values to them
@ -54,7 +62,7 @@ impl ProcessPlaceholders for Document {
self.bibliography.get_entry_list_by_occurrence()
)))),
P_GLS => pholder.set_value(block!(Block::List(
self.glossary.lock().unwrap().create_glossary_list()
self.glossary.lock().create_glossary_list()
))),
P_DATE => pholder.set_value(inline!(Inline::Plain(PlainText {
value: get_date_string()
@ -65,10 +73,24 @@ impl ProcessPlaceholders for Document {
P_DATETIME => pholder.set_value(inline!(Inline::Plain(PlainText {
value: format!("{} {}", get_date_string(), get_time_string())
}))),
P_AUTHOR => {
if let Some(value) = self.config.lock().metadata.author.clone() {
pholder.set_value(inline!(Inline::Plain(PlainText { value })))
}
}
P_TITLE => {
if let Some(value) = self.config.lock().metadata.title.clone() {
pholder.set_value(inline!(Inline::Plain(PlainText { value })))
}
}
_ => {
if let Some(entry) = self.config.get_entry(pholder.name.to_lowercase().as_str())
if let Some(value) = self
.config
.lock()
.custom_attributes
.get(pholder.name.to_lowercase().as_str())
.cloned()
{
let value = entry.get().as_string();
pholder.set_value(inline!(Inline::Plain(PlainText { value })))
}
}
@ -94,7 +116,7 @@ impl ProcessPlaceholders for Document {
})));
if let Some(meta) = &pholder.metadata {
if let Some(value) = meta.data.get(S_VALUE) {
self.config.set_from_meta(key, value.clone())
self.config.lock().set_from_meta(key, value.clone())
}
}
}

@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{Block, Element, Inline, Line, ListItem};
use std::collections::HashMap;
use std::sync::{Arc, RwLock};

@ -0,0 +1,24 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct FeatureSettings {
pub embed_external: bool,
pub smart_arrows: bool,
pub include_mathjax: bool,
}
impl Default for FeatureSettings {
fn default() -> Self {
Self {
embed_external: true,
smart_arrows: true,
include_mathjax: true,
}
}
}

@ -0,0 +1,24 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct ImageSettings {
pub format: Option<String>,
pub max_width: Option<u32>,
pub max_height: Option<u32>,
}
impl Default for ImageSettings {
fn default() -> Self {
Self {
format: None,
max_height: None,
max_width: None,
}
}
}

@ -0,0 +1,26 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct ImportSettings {
pub ignored_imports: Vec<String>,
pub included_stylesheets: Vec<String>,
pub included_bibliography: Vec<String>,
pub included_glossaries: Vec<String>,
}
impl Default for ImportSettings {
fn default() -> Self {
Self {
ignored_imports: Vec::with_capacity(0),
included_stylesheets: vec!["style.css".to_string()],
included_bibliography: vec!["Bibliography.toml".to_string()],
included_glossaries: vec!["Glossary.toml".to_string()],
}
}
}

@ -0,0 +1,28 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct MetadataSettings {
pub title: Option<String>,
pub author: Option<String>,
pub description: Option<String>,
pub keywords: Vec<String>,
pub language: String,
}
impl Default for MetadataSettings {
fn default() -> Self {
Self {
title: None,
author: None,
description: None,
keywords: Vec::new(),
language: "en".to_string(),
}
}
}

@ -0,0 +1,131 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::{Metadata, MetadataValue};
use crate::settings::feature_settings::FeatureSettings;
use crate::settings::image_settings::ImageSettings;
use crate::settings::import_settings::ImportSettings;
use crate::settings::metadata_settings::MetadataSettings;
use crate::settings::pdf_settings::PDFSettings;
use crate::settings::style_settings::StyleSettings;
use config::{ConfigError, Source};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::error::Error;
use std::fmt::{self, Display};
use std::io;
use std::mem;
use std::path::PathBuf;
pub mod feature_settings;
pub mod image_settings;
pub mod import_settings;
pub mod metadata_settings;
pub mod pdf_settings;
pub mod style_settings;
pub type SettingsResult<T> = Result<T, SettingsError>;
#[derive(Debug)]
pub enum SettingsError {
IoError(io::Error),
ConfigError(ConfigError),
TomlError(toml::ser::Error),
}
impl Display for SettingsError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
Self::IoError(e) => write!(f, "IO Error: {}", e),
Self::ConfigError(e) => write!(f, "Config Error: {}", e),
Self::TomlError(e) => write!(f, "Toml Error: {}", e),
}
}
}
impl Error for SettingsError {}
impl From<io::Error> for SettingsError {
fn from(e: io::Error) -> Self {
Self::IoError(e)
}
}
impl From<ConfigError> for SettingsError {
fn from(e: ConfigError) -> Self {
Self::ConfigError(e)
}
}
impl From<toml::ser::Error> for SettingsError {
fn from(e: toml::ser::Error) -> Self {
Self::TomlError(e)
}
}
#[derive(Serialize, Deserialize, Clone, Debug, Default)]
pub struct Settings {
pub metadata: MetadataSettings,
pub features: FeatureSettings,
pub imports: ImportSettings,
pub pdf: PDFSettings,
pub images: ImageSettings,
pub style: StyleSettings,
pub custom_attributes: HashMap<String, String>,
}
impl Source for Settings {
fn clone_into_box(&self) -> Box<dyn Source + Send + Sync> {
Box::new(self.clone())
}
fn collect(&self) -> Result<HashMap<String, config::Value>, config::ConfigError> {
let source_str =
toml::to_string(&self).map_err(|e| config::ConfigError::Foreign(Box::new(e)))?;
let result = toml::de::from_str(&source_str)
.map_err(|e| config::ConfigError::Foreign(Box::new(e)))?;
Ok(result)
}
}
impl Settings {
/// Loads the settings from the specified path
pub fn load(path: PathBuf) -> SettingsResult<Self> {
let mut settings = config::Config::default();
settings
.merge(Self::default())?
.merge(config::File::from(path))?;
let settings: Self = settings.try_into()?;
Ok(settings)
}
/// Merges the current settings with the settings from the given path,
/// updating the current settings in place
pub fn merge(&mut self, path: PathBuf) -> SettingsResult<()> {
let mut settings = config::Config::default();
settings
.merge(self.clone())?
.merge(config::File::from(path))?;
let mut settings: Self = settings.try_into()?;
mem::swap(self, &mut settings); // replace the old settings with the new ones
Ok(())
}
pub fn append_metadata<M: Metadata>(&mut self, metadata: M) {
let entries = metadata.get_string_map();
for (key, value) in entries {
self.custom_attributes.insert(key, value);
}
}
pub fn set_from_meta(&mut self, key: &str, value: MetadataValue) {
self.custom_attributes
.insert(key.to_string(), value.to_string());
}
}
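
A minimal sketch of loading and layering these settings with the API shown above; the file name Manifest.toml mirrors the default created by snekdown init:

use snekdown::settings::Settings;
use std::path::PathBuf;

fn main() {
    // Start from the built-in defaults, then layer the project manifest on top,
    // mirroring what the parser does when it imports Manifest.toml.
    let mut settings = Settings::default();
    if let Err(e) = settings.merge(PathBuf::from("Manifest.toml")) {
        eprintln!("could not merge manifest: {}", e);
    }
    println!("theme: {:?}, language: {}", settings.style.theme, settings.metadata.language);
}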

@ -0,0 +1,54 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PDFSettings {
pub display_header_footer: bool,
pub header_template: Option<String>,
pub footer_template: Option<String>,
pub page_height: Option<f32>,
pub page_width: Option<f32>,
pub page_scale: f32,
pub margin: PDFMarginSettings,
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PDFMarginSettings {
pub top: Option<f32>,
pub bottom: Option<f32>,
pub left: Option<f32>,
pub right: Option<f32>,
}
impl Default for PDFMarginSettings {
fn default() -> Self {
Self {
top: Some(0.5),
bottom: Some(0.5),
left: None,
right: None,
}
}
}
impl Default for PDFSettings {
fn default() -> Self {
Self {
display_header_footer: true,
header_template: Some("<div></div>".to_string()),
footer_template: Some(
include_str!("../format/chromium_pdf/assets/default-footer-template.html")
.to_string(),
),
page_height: None,
page_width: None,
page_scale: 1.0,
margin: Default::default(),
}
}
}

@ -0,0 +1,32 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct StyleSettings {
pub bib_ref_display: String,
pub theme: Theme,
}
impl Default for StyleSettings {
fn default() -> Self {
Self {
bib_ref_display: "{{number}}".to_string(),
theme: Theme::GitHub,
}
}
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub enum Theme {
GitHub,
SolarizedDark,
SolarizedLight,
OceanDark,
OceanLight,
MagicDark,
}

@ -0,0 +1,69 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use platform_dirs::{AppDirs, AppUI};
use sha2::Digest;
use std::fs;
use std::io;
use std::path::PathBuf;
#[derive(Clone, Debug)]
pub struct CacheStorage {
location: PathBuf,
}
impl CacheStorage {
pub fn new() -> Self {
lazy_static::lazy_static! {
static ref APP_DIRS: AppDirs = AppDirs::new(Some("snekdown"), AppUI::CommandLine).unwrap();
}
Self {
location: APP_DIRS.cache_dir.clone(),
}
}
/// Returns the cache path for a given file
pub fn get_file_path(&self, path: &PathBuf) -> PathBuf {
let mut hasher = sha2::Sha256::default();
hasher.update(path.to_string_lossy().as_bytes());
let mut file_name = PathBuf::from(format!("{:x}", hasher.finalize()));
if let Some(extension) = path.extension() {
file_name.set_extension(extension);
}
let cache_path = self.location.join(file_name);
log::trace!("Cache path is {:?}", cache_path);
return cache_path;
}
/// Returns whether the given file exists in the cache
pub fn has_file(&self, path: &PathBuf) -> bool {
let cache_path = self.get_file_path(path);
cache_path.exists()
}
/// Reads the corresponding cache file
pub fn read(&self, path: &PathBuf) -> io::Result<Vec<u8>> {
let cache_path = self.get_file_path(path);
fs::read(cache_path)
}
/// Writes into the corresponding cache file
pub fn write<R: AsRef<[u8]>>(&self, path: &PathBuf, contents: R) -> io::Result<()> {
let cache_path = self.get_file_path(path);
fs::write(cache_path, contents)
}
/// Clears the cache directory by deleting and recreating it
pub fn clear(&self) -> io::Result<()> {
fs::remove_dir_all(&self.location)?;
fs::create_dir(&self.location)
}
}
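
A brief usage sketch for the new cache helper, using the path imported in main.rs above (snekdown::utils::caching::CacheStorage):

use snekdown::utils::caching::CacheStorage;
use std::path::PathBuf;

fn main() -> std::io::Result<()> {
    let cache = CacheStorage::new();
    let source = PathBuf::from("https://example.com/diagram.png");
    // The source path is hashed into a file name below the cache directory;
    // write the downloaded bytes once and read them back on later runs.
    if !cache.has_file(&source) {
        cache.write(&source, b"...downloaded bytes...")?;
    }
    let bytes = cache.read(&source)?;
    println!("{} cached bytes", bytes.len());
    Ok(())
}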

@ -1,18 +1,21 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::utils::caching::CacheStorage;
use indicatif::{ProgressBar, ProgressStyle};
use platform_dirs::{AppDirs, AppUI};
use parking_lot::Mutex;
use rayon::prelude::*;
use std::collections::hash_map::DefaultHasher;
use std::fs;
use std::fs::read;
use std::hash::{Hash, Hasher};
use std::path::PathBuf;
use std::sync::{Arc, Mutex};
use std::sync::Arc;
/// A manager for downloading URLs in parallel
#[derive(Clone, Debug)]
pub struct DownloadManager {
downloads: Vec<Arc<Mutex<PendingDownload>>>,
pub use_cache: bool,
}
impl DownloadManager {
@ -20,14 +23,12 @@ impl DownloadManager {
pub fn new() -> Self {
Self {
downloads: Vec::new(),
use_cache: true,
}
}
/// Adds a new pending download
pub fn add_download(&mut self, path: String) -> Arc<Mutex<PendingDownload>> {
let mut download = PendingDownload::new(path.clone());
download.use_cache = self.use_cache;
let download = PendingDownload::new(path.clone());
let pending = Arc::new(Mutex::new(download));
self.downloads.push(Arc::clone(&pending));
log::debug!("Added download {}", path);
@@ -38,7 +39,7 @@ impl DownloadManager {
/// Downloads all download entries
pub fn download_all(&self) {
let pb = Arc::new(Mutex::new(ProgressBar::new(self.downloads.len() as u64)));
pb.lock().unwrap().set_style(
pb.lock().set_style(
ProgressStyle::default_bar()
.template("Fetching Embeds: [{bar:40.cyan/blue}]")
.progress_chars("=> "),
@@ -46,10 +47,10 @@ impl DownloadManager {
let pb_cloned = Arc::clone(&pb);
self.downloads.par_iter().for_each_with(pb_cloned, |pb, d| {
d.lock().unwrap().download();
pb.lock().unwrap().inc(1);
d.lock().download();
pb.lock().inc(1);
});
pb.lock().unwrap().finish_and_clear();
pb.lock().finish_and_clear();
}
}
@@ -60,6 +61,7 @@ pub struct PendingDownload {
pub(crate) path: String,
pub(crate) data: Option<Vec<u8>>,
pub(crate) use_cache: bool,
cache: CacheStorage,
}
impl PendingDownload {
@@ -68,6 +70,7 @@ impl PendingDownload {
path,
data: None,
use_cache: true,
cache: CacheStorage::new(),
}
}
@@ -98,22 +101,18 @@ impl PendingDownload {
/// Stores the data to a cache file to retrieve it later
fn store_to_cache(&self, data: &Vec<u8>) {
if self.use_cache {
let cache_file = get_cached_path(PathBuf::from(&self.path));
log::debug!("Writing to cache {} -> {:?}", self.path, cache_file);
fs::write(&cache_file, data.clone()).unwrap_or_else(|_| {
log::warn!(
"Failed to write file to cache: {} -> {:?}",
self.path,
cache_file
)
});
let path = PathBuf::from(&self.path);
self.cache
.write(&path, data.clone())
.unwrap_or_else(|_| log::warn!("Failed to write file to cache: {}", self.path));
}
}
fn read_from_cache(&self) -> Option<Vec<u8>> {
let cache_path = get_cached_path(PathBuf::from(&self.path));
if cache_path.exists() && self.use_cache {
read(cache_path).ok()
let path = PathBuf::from(&self.path);
if self.cache.has_file(&path) && self.use_cache {
self.cache.read(&path).ok()
} else {
None
}
@@ -121,26 +120,14 @@ impl PendingDownload {
/// Downloads the content from the given URL
fn download_content(&self) -> Option<Vec<u8>> {
reqwest::blocking::get(&self.path)
.ok()
.map(|c| c.bytes())
.and_then(|b| b.ok())
.map(|b| b.to_vec())
download_path(self.path.clone())
}
}
pub fn get_cached_path(path: PathBuf) -> PathBuf {
lazy_static::lazy_static! {
static ref APP_DIRS: AppDirs = AppDirs::new(Some("snekdown"), AppUI::CommandLine).unwrap();
}
let mut hasher = DefaultHasher::new();
path.hash(&mut hasher);
let file_name = PathBuf::from(format!("{:x}", hasher.finish()));
if !APP_DIRS.cache_dir.is_dir() {
fs::create_dir(&APP_DIRS.cache_dir)
.unwrap_or_else(|_| log::warn!("Failed to create cache dir {:?}", APP_DIRS.cache_dir))
}
APP_DIRS.cache_dir.join(file_name)
pub fn download_path(path: String) -> Option<Vec<u8>> {
reqwest::blocking::get(&path)
.ok()
.map(|c| c.bytes())
.and_then(|b| b.ok())
.map(|b| b.to_vec())
}
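Typical use of the manager after this change: register downloads, fetch them in parallel, then read the results back out of the shared handles. A hedged sketch of crate-internal calling code (the URL is only an example, and `data` is `pub(crate)`):
fn download_example() -> Option<Vec<u8>> {
    let mut manager = DownloadManager::new();
    let handle = manager.add_download("https://example.com/image.png".to_string());
    // Fetches every registered download in parallel via rayon.
    manager.download_all();
    // parking_lot::Mutex::lock() does not return a Result, so no unwrap() here.
    handle.lock().data.clone()
}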

@@ -0,0 +1,249 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use crate::elements::Metadata;
use crate::utils::caching::CacheStorage;
use crate::utils::downloads::download_path;
use image::imageops::FilterType;
use image::io::Reader as ImageReader;
use image::{GenericImageView, ImageFormat, ImageResult};
use indicatif::{ProgressBar, ProgressStyle};
use mime::Mime;
use parking_lot::Mutex;
use rayon::prelude::*;
use std::io;
use std::io::Cursor;
use std::path::PathBuf;
use std::sync::Arc;
#[derive(Clone, Debug)]
pub struct ImageConverter {
images: Vec<Arc<Mutex<PendingImage>>>,
target_format: Option<ImageFormat>,
target_size: Option<(u32, u32)>,
}
impl ImageConverter {
pub fn new() -> Self {
Self {
images: Vec::new(),
target_format: None,
target_size: None,
}
}
pub fn set_target_size(&mut self, target_size: (u32, u32)) {
self.target_size = Some(target_size)
}
pub fn set_target_format(&mut self, target_format: ImageFormat) {
self.target_format = Some(target_format);
}
/// Adds an image to convert
pub fn add_image(&mut self, path: PathBuf) -> Arc<Mutex<PendingImage>> {
let image = Arc::new(Mutex::new(PendingImage::new(path)));
self.images.push(image.clone());
image
}
/// Converts all images
pub fn convert_all(&mut self) {
let pb = Arc::new(Mutex::new(ProgressBar::new(self.images.len() as u64)));
pb.lock().set_style(
ProgressStyle::default_bar()
.template("Processing images: [{bar:40.cyan/blue}]")
.progress_chars("=> "),
);
self.images.par_iter().for_each(|image| {
let mut image = image.lock();
if let Err(e) = image.convert(self.target_format.clone(), self.target_size.clone()) {
log::error!("Failed to embed image {:?}: {}", image.path, e)
}
pb.lock().tick();
});
pb.lock().finish_and_clear();
}
}
#[derive(Clone, Debug)]
pub struct PendingImage {
pub path: PathBuf,
pub data: Option<Vec<u8>>,
cache: CacheStorage,
pub mime: Mime,
brightness: Option<i32>,
contrast: Option<f32>,
huerotate: Option<i32>,
grayscale: bool,
invert: bool,
}
impl PendingImage {
pub fn new(path: PathBuf) -> Self {
let mime = get_mime(&path);
Self {
path,
data: None,
cache: CacheStorage::new(),
mime,
brightness: None,
contrast: None,
grayscale: false,
invert: false,
huerotate: None,
}
}
pub fn assign_from_meta<M: Metadata>(&mut self, meta: &M) {
if let Some(brightness) = meta.get_integer("brightness") {
self.brightness = Some(brightness as i32);
}
if let Some(contrast) = meta.get_float("contrast") {
self.contrast = Some(contrast as f32);
}
if let Some(huerotate) = meta.get_float("huerotate") {
self.huerotate = Some(huerotate as i32);
}
self.grayscale = meta.get_bool("grayscale");
self.invert = meta.get_bool("invert");
}
/// Converts the image to the specified target format and, if given, resizes it to the target size
pub fn convert(
&mut self,
target_format: Option<ImageFormat>,
target_size: Option<(u32, u32)>,
) -> ImageResult<()> {
let format = target_format
.or_else(|| {
self.path
.extension()
.and_then(|extension| ImageFormat::from_extension(extension))
})
.unwrap_or(ImageFormat::Png);
let output_path = self.get_output_path(format, target_size);
self.mime = get_mime(&output_path);
if self.cache.has_file(&output_path) {
self.data = Some(self.cache.read(&output_path)?)
} else {
self.convert_image(format, target_size)?;
if let Some(data) = &self.data {
self.cache.write(&output_path, data)?;
}
}
Ok(())
}
/// Converts the image
fn convert_image(
&mut self,
format: ImageFormat,
target_size: Option<(u32, u32)>,
) -> ImageResult<()> {
let mut image = ImageReader::open(self.get_path()?)?.decode()?;
if let Some((width, height)) = target_size {
let dimensions = image.dimensions();
if dimensions.0 > width || dimensions.1 > height {
image = image.resize(width, height, FilterType::Lanczos3);
}
}
if let Some(brightness) = self.brightness {
image = image.brighten(brightness);
}
if let Some(contrast) = self.contrast {
image = image.adjust_contrast(contrast);
}
if let Some(rotate) = self.huerotate {
image = image.huerotate(rotate);
}
if self.grayscale {
image = image.grayscale();
}
if self.invert {
image.invert();
}
let data = Vec::new();
let mut writer = Cursor::new(data);
image.write_to(&mut writer, format)?;
self.data = Some(writer.into_inner());
Ok(())
}
/// Returns the local path of the file, downloading it into the cache first if it does not exist locally
fn get_path(&self) -> io::Result<PathBuf> {
if !self.path.exists() {
if self.cache.has_file(&self.path) {
return Ok(self.cache.get_file_path(&self.path));
}
if let Some(data) = download_path(self.path.to_string_lossy().to_string()) {
self.cache.write(&self.path, data)?;
return Ok(self.cache.get_file_path(&self.path));
}
}
Ok(self.path.clone())
}
/// Returns the output path for the converted image, with the conversion options encoded in the file name
fn get_output_path(
&self,
target_format: ImageFormat,
target_size: Option<(u32, u32)>,
) -> PathBuf {
let mut path = self.path.clone();
let mut file_name = path.file_stem().unwrap().to_string_lossy().to_string();
let extension = target_format.extensions_str()[0];
let type_name = format!("{:?}", target_format);
if let Some(target_size) = target_size {
file_name += &*format!("-w{}-h{}", target_size.0, target_size.1);
}
if let Some(b) = self.brightness {
file_name += &*format!("-b{}", b);
}
if let Some(c) = self.contrast {
file_name += &*format!("-c{}", c);
}
if let Some(h) = self.huerotate {
file_name += &*format!("-h{}", h);
}
file_name += &*format!("{}-{}", self.invert, self.grayscale);
file_name += format!("-{}", type_name).as_str();
path.set_file_name(file_name);
path.set_extension(extension);
path
}
}
fn get_mime(path: &PathBuf) -> Mime {
let mime = mime_guess::from_ext(
path.clone()
.extension()
.and_then(|e| e.to_str())
.unwrap_or("png"),
)
.first()
.unwrap_or(mime::IMAGE_PNG);
mime
}
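Putting the converter together, roughly as a render step might drive it. This is hypothetical calling code: the format, size, and path are only examples, and it assumes the imports at the top of this file are in scope.
fn convert_images_example() {
    let mut converter = ImageConverter::new();
    converter.set_target_format(ImageFormat::Jpeg);
    converter.set_target_size((800, 600));
    let handle = converter.add_image(PathBuf::from("assets/photo.png"));
    // Converts (and caches) every registered image in parallel.
    converter.convert_all();
    if let Some(data) = &handle.lock().data {
        log::info!("converted image is {} bytes", data.len());
    }
}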

@@ -1,3 +1,9 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
#[macro_export]
macro_rules! plain_text {
($e:expr) => {

@@ -1,3 +1,11 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
pub mod caching;
pub mod downloads;
pub mod image_converting;
pub mod macros;
pub mod parsing;

@@ -1,6 +1,21 @@
/*
* Snekdown - Custom Markdown flavour and parser
* Copyright (C) 2021 Trivernis
* See LICENSE for more information.
*/
use regex::Regex;
#[macro_export]
macro_rules! parse {
($str:expr) => {
Parser::new($str.to_string(), None).parse()
};
}
/// Removes single escaping backslashes from the given content
pub(crate) fn remove_single_backlslash<S: ToString>(content: S) -> String {
let content = content.to_string();
lazy_static::lazy_static! {static ref R: Regex = Regex::new(r"\\(?P<c>[^\\])").unwrap();}
R.replace_all(&*content, "$c").to_string()
}
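A sketch of the behaviour of `remove_single_backlslash`, written as a test that could sit next to the function; the inputs are only examples.
#[test]
fn removes_escaping_backslashes() {
    // A backslash in front of a non-backslash character is dropped,
    // the escaped character itself is kept.
    assert_eq!(remove_single_backlslash(r"\*escaped\*"), "*escaped*");
    // A trailing backslash escapes nothing and is left untouched.
    assert_eq!(remove_single_backlslash(r"trailing\"), r"trailing\");
}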
