Practical Usage

Install, compile, benchmark, publish.

The point of this repository is to make the EML construction usable as software: a Python package, a compiler, a benchmark harness, a plotting pipeline, and a documentation site that can be pushed directly to GitHub Pages.

Installation

uv venv .venv --clear
uv sync --dev

The project is designed around a uv-managed virtual environment; uv sync --dev installs both runtime and developer dependencies into it.

Python API

import numpy as np
import eml

x = np.linspace(-1.0, 1.0, 5)
print(eml.sin(x))
print(eml.log1p(0.5))
print(eml.degrees(eml.pi / 2))
print(eml.compile_source("sin(x) + sqrt(1 + x**2)"))
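To make the compile_source idea concrete without the eml package installed, here is a minimal self-contained sketch of what compiling an expression string into a callable can look like. This is not the package's actual implementation (eml builds an EML construction tree); the function name and restricted-namespace approach here are illustrative assumptions.

```python
import math

def compile_expression(source, variable="x"):
    """Compile an expression string into a callable.

    Sketch only: the real eml.compile_source constructs an EML tree.
    Here we validate the string with Python's own compiler and
    evaluate it against a restricted math namespace.
    """
    # Expose math functions/constants (sin, sqrt, pi, ...) by name.
    allowed = {name: getattr(math, name)
               for name in dir(math) if not name.startswith("_")}
    code = compile(source, "<eml-sketch>", "eval")
    def fn(x):
        # Empty __builtins__ keeps the evaluation namespace restricted.
        return eval(code, {"__builtins__": {}}, {**allowed, variable: x})
    return fn

f = compile_expression("sin(x) + sqrt(1 + x**2)")
print(f(0.0))  # sin(0) + sqrt(1) = 1.0
```

The real compiler additionally reports the construction steps, which is what the --show modes below expose.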

Compile CLI

uv run eml-compile "sin(x) + sqrt(1 + x**2)"
uv run eml-compile "Sin[x] + Log[2, x]" --show all --assign x=2
uv run eml-compile "sin(x) + sqrt(1 + x**2)" --show summary
uv run eml-compile "sin(x) + sqrt(1 + x**2)" --show mermaid --output docs/assets/tree.mmd
uv run eml-compile "sin(x) + sqrt(1 + x**2)" --show dot --output docs/assets/tree.dot

Supported output modes are source, tree, stats, summary, mermaid, dot, json, and all.
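A --show option like this is naturally a dispatch table from mode name to renderer. The sketch below shows the shape of that dispatch; the renderer names and their output strings are hypothetical, not the CLI's actual internals.

```python
def render_source(expr):
    # Hypothetical renderer: echo the normalized source.
    return f"source: {expr}"

def render_summary(expr):
    # Hypothetical renderer: a one-line summary.
    return f"summary: 1 expression ({len(expr)} chars)"

# Dispatch table; the real CLI registers one renderer per mode.
RENDERERS = {"source": render_source, "summary": render_summary}

def show(expr, mode):
    if mode == "all":
        return "\n".join(render(expr) for render in RENDERERS.values())
    try:
        return RENDERERS[mode](expr)
    except KeyError:
        raise SystemExit(f"unknown --show mode: {mode!r}")

print(show("sin(x)", "summary"))  # summary: 1 expression (6 chars)
```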

Benchmark CLI

uv run eml-benchmark
uv run eml-benchmark --cases exp log sin --sample-count 8192 --repeats 7 --output benchmark_results/report.json
uv run eml-benchmark --output-dir benchmark_results/full_suite --plot-dir benchmark_results/full_suite/plots
uv run eml-benchmark-plot --bundle-dir benchmark_results/full_suite --output-dir benchmark_results/full_suite/plots

The benchmark tooling writes structured JSON and CSV reports, grouped summary tables, and PNG plots that can be published directly under docs/.
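The core loop of such a harness is small: time each case over a sample set, repeat, and emit a structured record. The sketch below assumes field names (case, sample_count, repeats, median_seconds) that may differ from the actual report schema.

```python
import json
import math
import statistics
import time

def benchmark(fn, samples, repeats=5):
    """Time fn over samples several times and return one JSON-ready
    record. Field names are assumptions, not eml-benchmark's schema."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        for x in samples:
            fn(x)
        timings.append(time.perf_counter() - start)
    return {
        "case": fn.__name__,
        "sample_count": len(samples),
        "repeats": repeats,
        "median_seconds": statistics.median(timings),
    }

samples = [i / 1024 for i in range(1024)]
report = [benchmark(f, samples) for f in (math.exp, math.log1p, math.sin)]
print(json.dumps(report, indent=2))
```

The CSV and grouped-summary outputs are derived from the same per-case records.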

Parser conveniences

The compiler accepts Python-style syntax, Wolfram-style syntax, and inputs that use the np., numpy., or math. prefixes.
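The normalization these conveniences require can be sketched in a few lines. This is an illustrative simplification, not the compiler's actual parser: it strips the module prefixes and lowers single-level Wolfram-style heads.

```python
import re

def normalize(source):
    """Normalize the accepted input dialects to Python-style syntax.

    Sketch only: handles flat expressions, not nested Wolfram heads
    like Log[2, Sin[x]].
    """
    # Strip module prefixes: np.sin(x) -> sin(x), math.sqrt -> sqrt.
    source = re.sub(r"\b(?:np|numpy|math)\.", "", source)
    # Lower Wolfram-style heads: Sin[x] -> sin(x), Log[2, x] -> log(2, x).
    source = re.sub(
        r"\b([A-Z][A-Za-z0-9]*)\[([^\[\]]*)\]",
        lambda m: f"{m.group(1).lower()}({m.group(2)})",
        source,
    )
    return source

print(normalize("np.sin(x) + math.sqrt(1 + x**2)"))  # sin(x) + sqrt(1 + x**2)
print(normalize("Sin[x] + Log[2, x]"))               # sin(x) + log(2, x)
```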

Engineering scope

This project adds tests, plotting, Pages-ready docs, CLI entry points, and a library interface intended for repeated use rather than one-off experimentation.

Why not just wrap NumPy?

Because the point here is to preserve the constructive EML chain. The project uses NumPy as a numeric backend, but not as a shortcut for the higher-level math itself.

Implementation notes page

For the branch choices, complex-domain rationale, and stability tradeoffs, see the dedicated Implementation page.

Why publish under docs/?

It keeps the site deployable through GitHub Pages without an extra documentation build step and makes benchmark artifacts part of the published repository state.

GitHub Pages deployment

The repository includes a GitHub Actions workflow that tests the package and deploys the docs/ directory to GitHub Pages on pushes to the default branch.

.github/workflows/deploy-pages.yml

The workflow is intentionally simple because the site is already static. There is no extra documentation build phase beyond committing the generated assets under docs/assets/.

What the workflow does

  • Installs uv and syncs the project environment.
  • Runs the full test suite.
  • Uploads the docs/ directory as the Pages artifact.
  • Deploys the artifact to GitHub Pages on pushes to the main branch.
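The steps above translate into a workflow roughly shaped like this. This is a sketch, not the repository's exact deploy-pages.yml: job and step names are assumptions, though the listed actions (checkout, setup-uv, configure-pages, upload-pages-artifact, deploy-pages) are real.

```yaml
# Sketch of .github/workflows/deploy-pages.yml; names are illustrative.
name: deploy-pages
on:
  push:
    branches: [main]
permissions:
  contents: read
  pages: write
  id-token: write
jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      - run: uv sync --dev
      - run: uv run pytest
      - uses: actions/configure-pages@v5
        with:
          enablement: true   # ask GitHub to enable Pages if needed
      - uses: actions/upload-pages-artifact@v3
        with:
          path: docs
      - uses: actions/deploy-pages@v4
```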

The workflow asks actions/configure-pages to enable GitHub Pages automatically. If the repository has never used Pages before and the workflow still lacks permission to enable it, a maintainer can do the one-time setup in repository settings by choosing GitHub Actions as the Pages source.