Compare commits

14 Commits

| SHA1 |
|------|
| ca986633b1 |
| 19e9bba3cc |
| 0d75c08f6a |
| 6a2f207d8d |
| b79bb06130 |
| 61c5b322dd |
| 3e95ad76f9 |
| 695e206157 |
| 15faada234 |
| 9484857d9a |
| 7075b3f52b |
| f6933978e9 |
| 684195a04f |
| a8c59b7cf0 |
@@ -5,6 +5,7 @@
 /jupyterhub_volumes/caddy
 /jupyterhub_volumes/course/data/Genbank
 /jupyterhub_volumes/web/
+/jupyterhub_volumes/builder
 /**/.DS_Store
 /web_src/**/*.RData
 /web_src/**/*.pdf
@@ -17,3 +18,4 @@ ncbitaxo_*
 Readme_files
 Readme.html
 tmp.*
+reserve
@@ -2,253 +2,328 @@

## Intended use

This project packages the MetabarcodingSchool training lab into one reproducible bundle. You get Python, R, and Bash kernels, a Quarto-built course website, and preconfigured admin/student accounts, so onboarding a class is a single command instead of a day of setup. Everything runs locally on a single machine, student work persists between sessions, and `./start-jupyterhub.sh` takes care of pulling images, rendering the site, preparing volumes, and bringing JupyterHub up at `http://localhost:8888`.

## Prerequisites (with quick checks)

You only need **Docker and Docker Compose** on the machine that will host the lab. All other tools (Quarto, Hugo, Python, R) are provided via a builder Docker image and do not need to be installed on your system.

- macOS: install [OrbStack](https://orbstack.dev/) (recommended) or Docker Desktop; both ship Docker Engine and Compose.
- Linux: install Docker Engine and the Compose plugin (`sudo apt install docker.io docker-compose-plugin`) or use Docker's official packages.
- Windows: install Docker Desktop with the WSL2 backend enabled.

Verify from a terminal:

```bash
docker --version
docker compose version   # or: docker-compose --version
```

## Three operating modes

`./start-jupyterhub.sh` has three modes that control how Docker images are obtained:

| Mode | Flag | Description |
|------|------|-------------|
| **Pull** (default) | *(none)* | Pull pre-built images from the registry and start |
| **Local build** | `--local-build` | Build images locally on your machine and start (no push) |
| **Publish** | `--publish` | Build multi-arch images (amd64 + arm64), push to registry, then start |

### Pull mode — default, fastest

```bash
./start-jupyterhub.sh
```

Downloads the three pre-built images from `registry.metabarcoding.org/metabarschool/`:

- `obijupyterhub-builder:latest`
- `obijupyterhub-hub:latest`
- `obijupyterhub-student:latest`

This is what instructors should use in class. No compilation, no wait.
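The pull step amounts to fetching each image from the registry. A minimal sketch of the equivalent manual commands, printed rather than executed here since the script's exact internal logic is an assumption:

```shell
# Image names as listed above; printing the commands keeps this
# sketch runnable without a Docker daemon.
REGISTRY="registry.metabarcoding.org/metabarschool"
for img in obijupyterhub-builder obijupyterhub-hub obijupyterhub-student; do
    echo "docker pull ${REGISTRY}/${img}:latest"
done
```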
### Local build mode — for development

```bash
./start-jupyterhub.sh --local-build
```

Builds all three images locally using the Dockerfiles in `obijupyterhub/`. Rebuilt images stay on your machine and are not pushed to the registry. Additional flags apply only in this mode:

| Flag | Effect |
|------|--------|
| `--no-build` / `--offline` | Skip all image operations, use whatever is already local |
| `--force-rebuild` | Rebuild all images without Docker cache |
| `--rebuild-builder` | Force rebuild the builder image only |
| `--rebuild-student` | Force rebuild the student image only |
| `--rebuild-hub` | Force rebuild the JupyterHub image only |

`--rebuild-*` and `--force-rebuild` imply `--local-build` automatically.

### Publish mode — for maintainers

```bash
./start-jupyterhub.sh --publish
```

Builds all three images for both `linux/amd64` and `linux/arm64` using `docker buildx`, then pushes them to the registry tagged with both `:latest` and the version from `version.txt`. Requires write access to the registry and `docker buildx` with a `docker-container` driver.

**Before publishing a new version**, bump `version.txt` at the project root:

```
0.2.0
```
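A release then boils down to editing that file and rerunning the script. A sketch, where the version number is only an example:

```shell
# Write the new version and sanity-check the format before publishing.
echo "0.2.1" > version.txt
grep -Eq '^[0-9]+\.[0-9]+\.[0-9]+$' version.txt || { echo "bad version"; exit 1; }
# ./start-jupyterhub.sh --publish   # then build and push (requires registry access)
```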
## Actions (all modes)

These flags work alongside any mode:

| Flag | Effect |
|------|--------|
| `--stop-server` | Stop the stack and remove student containers, then exit |
| `--update-lectures` | Rebuild the course website only (no Docker stop/start) |
| `--update-obidoc` | Rebuild the obidoc documentation only (no Docker stop/start) |
| `--build-obidoc` | Force rebuild of obidoc documentation on next full start |

## Installation and first run

1. Clone the project:

```bash
git clone https://forge.metabarcoding.org/MetabarcodingSchool/OBIJupyterHub.git
cd OBIJupyterHub
```

2. Repository structure:

```
OBIJupyterHub/
├── start-jupyterhub.sh          single entry point
├── version.txt                  current image version number
├── obijupyterhub/
│   ├── docker-compose.yml
│   ├── Dockerfile               student image
│   ├── Dockerfile.hub           JupyterHub image
│   ├── Dockerfile.builder       builder image (Quarto, Hugo, R, Python)
│   └── jupyterhub_config.py
├── jupyterhub_volumes/          data persisted on the host
│   ├── builder/R_packages/      R package cache for building lectures
│   ├── course/                  read-only for students (notebooks, data, bin)
│   ├── shared/                  shared read/write space for everyone
│   ├── users/                   per-user persistent data
│   └── web/                     rendered course website
├── tools/
│   ├── install_quarto_deps.R    automatic R dependency detection and install
│   └── install_packages.sh      install shared R packages into course/
└── web_src/                     Quarto sources for the course website
```

3. (Optional) place course materials in `jupyterhub_volumes/course/` before first run.

4. Start everything:

```bash
./start-jupyterhub.sh                 # pulls images from registry (recommended)
# or
./start-jupyterhub.sh --local-build   # builds locally
```

5. Access JupyterHub at `http://localhost:8888`.

6. Stop when done:

```bash
./start-jupyterhub.sh --stop-server
# or from obijupyterhub/
docker-compose down
```

## How the builder image works

The `obijupyterhub-builder` image contains Quarto, Hugo, R, and Python — you do not need any of these on your host. The script runs this image as a temporary container to:

- detect R package dependencies from your `.qmd` files (scans `library()`, `require()`, and `remotes::install_git/github()` calls using base R — no external package required)
- install missing R packages into `jupyterhub_volumes/builder/R_packages/` (cached between runs)
- render the Quarto website from `web_src/`
- generate PDF galleries and `pages.json`
- (optionally) build the obidoc documentation with Hugo
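The dependency scan described above can be approximated with a plain pattern match over the sources. A hedged sketch of the idea (the real implementation uses base R and may differ in detail):

```shell
# Create a throwaway .qmd and extract the package names it references.
tmp=$(mktemp -d)
cat > "$tmp/demo.qmd" <<'EOF'
library(vegan)
require(ade4)
EOF
pkgs=$(grep -rhoE '(library|require)\(([A-Za-z0-9.]+)\)' "$tmp" \
    | sed -E 's/.*\(([A-Za-z0-9.]+)\)/\1/' | sort -u)
echo "$pkgs"   # one package name per line
rm -rf "$tmp"
```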
### R package caching

Packages are cached in `jupyterhub_volumes/builder/R_packages/`:

- **First build**: all packages used in your `.qmd` files are detected and installed (may take a while).
- **Subsequent builds**: only new packages are installed, making builds much faster.
- **Non-CRAN packages**: packages installed via `remotes::install_git()` or `remotes::install_github()` in your `.qmd` files are detected and pre-installed automatically before rendering.
- **Clear the cache**: delete `jupyterhub_volumes/builder/R_packages/` to force a full reinstall.
## OBITools documentation (obidoc)

The OBITools4 documentation is built from the [`obitools4-doc`](https://github.com/metabarcoding/obitools4-doc) repository using Hugo and served as a static site at `http://localhost:8888/obidoc/`.

### How it works

The builder container clones the repository (with all submodules), runs `hugo build`, and writes the generated HTML into `jupyterhub_volumes/web/obidoc/`. Caddy then serves these files directly — no special routing is needed.
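The sequence can be sketched as the commands below. This is a dry-run illustration (each command is printed, not executed), and the exact flags used by the builder are assumptions:

```shell
run() { echo "+ $*"; }   # print each step instead of executing it
run git clone --recurse-submodules https://github.com/metabarcoding/obitools4-doc.git
run hugo build
run cp -r public/. jupyterhub_volumes/web/obidoc/
```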
### First installation

The documentation is built automatically on the first full start if `jupyterhub_volumes/web/obidoc/` is empty:

```bash
./start-jupyterhub.sh
```

To force a build even if the directory is already populated, use `--build-obidoc` during a full start:

```bash
./start-jupyterhub.sh --build-obidoc
```

### Updating the documentation

To rebuild the documentation without stopping the running stack:

```bash
./start-jupyterhub.sh --update-obidoc
```

This pulls the latest version of the builder image (or uses the local one with `--local-build`), reclones the `obitools4-doc` repository, rebuilds the site, and replaces the contents of `jupyterhub_volumes/web/obidoc/`. The JupyterHub stack keeps running throughout.

### Removing the documentation

To remove the built documentation (e.g. to free disk space or force a clean rebuild):

```bash
rm -rf jupyterhub_volumes/web/obidoc/*
```

The next `./start-jupyterhub.sh` will rebuild it automatically.
## Managing course and student data

Each student lands in `/home/jovyan/work/` with three areas:

```
work/
├── [student files]     personal workspace (persistent)
├── R_packages/         personal R packages (writable by student)
├── shared/             shared space (read/write, all students)
└── course/             course files (read-only)
    ├── R_packages/     shared R packages installed by the instructor
    ├── bin/            shared executables (added to PATH)
    └── [course materials]
```

On the host, place course files in `jupyterhub_volumes/course/`, collaborative files in `jupyterhub_volumes/shared/`, and collect student work from `jupyterhub_volumes/users/`.

### Installing shared R packages (instructor)

```bash
tools/install_packages.sh reshape2 plotly knitr
```

### Installing personal R packages (students)

```r
install.packages('mypackage')   # installs into work/R_packages/
```

### Loading packages (students)

```r
library(reshape2)   # searches: work/R_packages/ → work/course/R_packages/ → system
```
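The search order above comes from R's standard library-path variables. A sketch of how the student image is presumably configured (the paths are taken from this README; the mechanism itself is standard R behaviour):

```shell
# Personal library first, instructor library second, system paths last.
export R_LIBS_USER=/home/jovyan/work/R_packages
export R_LIBS_SITE=/home/jovyan/work/course/R_packages
echo "search order: $R_LIBS_USER -> $R_LIBS_SITE -> system libraries"
```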
## User accounts

Defaults are set in `obijupyterhub/docker-compose.yml`:

| Account | Username | Password |
|---------|----------|----------|
| Admin | `admin` | `admin2025` |
| Students | any | `metabar2025` |

Change `JUPYTERHUB_ADMIN_PASSWORD` and `JUPYTERHUB_PASSWORD` in the compose file, then rerun `./start-jupyterhub.sh`.

To restrict access to a predefined list, edit `jupyterhub_config.py`:

```python
c.Authenticator.allowed_users = {'student1', 'student2', 'student3'}
```

## Customising the images

All image customisations require a rebuild. Use `--local-build` (or the targeted `--rebuild-*` flag) to apply changes locally, or `--publish` to push them to the registry.
### Add R packages baked into the student image

Edit `obijupyterhub/Dockerfile` (before `USER ${NB_UID}`):

```dockerfile
RUN R -e "install.packages(c('your_package'), repos='http://cran.rstudio.com/')"
```

Then rebuild:

```bash
./start-jupyterhub.sh --rebuild-student
```

### Add Python packages

Edit `obijupyterhub/Dockerfile` (before `USER ${NB_UID}`):

```dockerfile
RUN pip install numpy pandas matplotlib seaborn
```

Then rebuild:

```bash
./start-jupyterhub.sh --rebuild-student
```

### Change the listening port

In `obijupyterhub/docker-compose.yml`:

```yaml
ports:
  - "8001:80"   # accessible at http://localhost:8001
```
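When changing the mapping, it helps to confirm the new host port is actually free. A small hedged check, using `nc` if it is available:

```shell
# Probe the candidate host port; a refused connection means it is free.
port=8001
if command -v nc >/dev/null && nc -z localhost "$port" 2>/dev/null; then
    echo "port $port is already in use"
else
    echo "port $port looks free"
fi
```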
## Troubleshooting

**Docker daemon unavailable**: make sure OrbStack / Docker Desktop / the daemon is running.

**Student containers do not start**: run `docker-compose logs jupyterhub` from `obijupyterhub/` and confirm the student image is present:

```bash
docker images | grep obijupyterhub-student
```

**Port conflict**: change the published port in `docker-compose.yml`.

**Registry pull fails**: check your network, or fall back to a local build:

```bash
./start-jupyterhub.sh --local-build
```

**Start from scratch**:

```bash
./start-jupyterhub.sh --stop-server

cd obijupyterhub
docker-compose down -v
docker rmi jupyterhub-hub jupyterhub-student obijupyterhub-builder 2>/dev/null || true
docker rmi registry.metabarcoding.org/metabarschool/obijupyterhub-hub:latest \
           registry.metabarcoding.org/metabarschool/obijupyterhub-student:latest \
           registry.metabarcoding.org/metabarschool/obijupyterhub-builder:latest 2>/dev/null || true
cd ..

rm -rf jupyterhub_volumes/builder/R_packages   # clear R package cache

./start-jupyterhub.sh   # pull fresh images and start
```
+52 -18
@@ -19,37 +19,63 @@ RUN TEMP=. curl -L https://raw.githubusercontent.com/metabarcoding/obitools4/mas
    && cp $HOME/obitools-build/bin/* /usr/local/bin
RUN ls -l /usr/local/bin

# ---------- Stage 2: final image ----------
FROM jupyter/base-notebook:latest

USER root

# Install only the runtime dependencies (without build-essential)
RUN apt-get update && apt-get install -y --no-install-recommends \
    # R and base dependencies
    r-base \
    r-base-dev \
    libcurl4-openssl-dev \
    libssl-dev \
    libxml2-dev \
    libicu-dev \
    zlib1g-dev \
    # Fonts and graphics rendering (required for ggplot2, ragg, etc.)
    libharfbuzz-dev \
    libfribidi-dev \
    libfontconfig1-dev \
    libfreetype6-dev \
    libpng-dev \
    libtiff5-dev \
    libjpeg-dev \
    pandoc \
    # Build tools and version management
    libgit2-dev \
    cmake \
    # System utilities already present in the previous Dockerfile
    curl \
    wget \
    git \
    vim \
    nano \
    less \
    gdebi-core \
    ripgrep \
    # For generating PDFs/reports from R Markdown / Jupyter
    texlive-xetex \
    texlive-luatex \
    texlive-fonts-recommended \
    texlive-fonts-extra \
    texlive-latex-extra \
    texlive-plain-generic \
    lmodern \
    fonts-lmodern \
    librsvg2-bin \
    cm-super \
    # Ruby (if needed for anything else)
    ruby \
    ruby-dev \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

# Install R and packages
COPY install_R_packages.R /tmp/install_R_packages.R
RUN Rscript /tmp/install_R_packages.R --no-save --no-restore && \
    rm -rf /tmp/Rtmp* /tmp/install_R_packages.R

# Install the other tools
RUN pip install --no-cache-dir bash_kernel csvkit && \
@@ -57,11 +83,19 @@ RUN pip install --no-cache-dir bash_kernel csvkit && \

RUN gem install youplot

# Install Quarto (multi-arch)
RUN ARCH=$(dpkg --print-architecture) && \
    QUARTO_VERSION="1.8.27" && \
    wget https://github.com/quarto-dev/quarto-cli/releases/download/v${QUARTO_VERSION}/quarto-${QUARTO_VERSION}-linux-${ARCH}.deb && \
    gdebi --non-interactive quarto-${QUARTO_VERSION}-linux-${ARCH}.deb && \
    rm quarto-${QUARTO_VERSION}-linux-${ARCH}.deb

# Set permissions for Jupyter user
RUN mkdir -p /home/${NB_USER}/.local/share/jupyter && \
    chown -R ${NB_UID}:${NB_GID} /home/${NB_USER}

# Copy only the csvlens binary from the builder
COPY --from=rust-builder /home/jovyan/.cargo/bin/csvlens /usr/local/bin/
COPY --from=rust-builder /usr/local/bin/* /usr/local/bin/
|||||||
@@ -0,0 +1,73 @@
|
|||||||
|
# Dockerfile.builder
|
||||||
|
# Image containing all tools needed to prepare the OBIJupyterHub stack
|
||||||
|
# This allows the host system to only require Docker to be installed
|
||||||
|
|
||||||
|
FROM ubuntu:24.04
|
||||||
|
|
||||||
|
LABEL maintainer="OBIJupyterHub"
|
||||||
|
LABEL description="Builder image for OBIJupyterHub preparation tasks"
|
||||||
|
|
||||||
|
# Avoid interactive prompts during package installation
|
||||||
|
ENV DEBIAN_FRONTEND=noninteractive
|
||||||
|
ENV TZ=Etc/UTC
|
||||||
|
|
||||||
|
# Install base dependencies and R
|
||||||
|
RUN apt-get update \
|
||||||
|
&& apt-get install -y --no-install-recommends \
|
||||||
|
ca-certificates \
|
||||||
|
curl \
|
||||||
|
wget \
|
||||||
|
git \
|
||||||
|
rsync \
|
||||||
|
python3 \
|
||||||
|
r-base \
|
||||||
|
r-base-dev \
|
||||||
|
libcurl4-openssl-dev \
|
||||||
|
libssl-dev \
|
||||||
|
libxml2-dev \
|
||||||
|
libfontconfig1-dev \
|
||||||
|
libharfbuzz-dev \
|
||||||
|
libfribidi-dev \
|
||||||
|
libfreetype6-dev \
|
||||||
|
libpng-dev \
|
||||||
|
libtiff5-dev \
|
||||||
|
libjpeg-dev \
|
||||||
|
libuv1-dev \
|
||||||
|
golang-go \
|
||||||
|
&& apt-get clean \
|
||||||
|
&& rm -rf /var/lib/apt/lists/*
|
||||||
|
|
||||||
|
# Install the attachment package in a separate location (not overwritten by volume mount)
|
||||||
|
# This ensures attachment is always available even when site-library is mounted as a volume
|
||||||
|
ENV R_LIBS_BUILDER=/opt/R/builder-packages
|
||||||
|
RUN mkdir -p ${R_LIBS_BUILDER} \
|
||||||
|
&& R -e "install.packages('attachment', lib='${R_LIBS_BUILDER}', repos='https://cloud.r-project.org/')"
|
||||||
|
|
||||||
|
# Install Hugo (extended version for SCSS support)
|
||||||
|
# Detect architecture and download appropriate binary
|
||||||
|
ARG HUGO_VERSION=0.159.2
|
||||||
|
RUN ARCH=$(dpkg --print-architecture) \
|
||||||
|
&& case "$ARCH" in \
|
||||||
|
amd64) HUGO_ARCH="amd64" ;; \
|
||||||
|
arm64) HUGO_ARCH="arm64" ;; \
|
||||||
|
*) echo "Unsupported architecture: $ARCH" && exit 1 ;; \
|
||||||
|
esac \
|
||||||
|
&& curl -fsSL "https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_linux-${HUGO_ARCH}.tar.gz" \
|
||||||
|
| tar -xz -C /usr/local/bin hugo \
|
||||||
|
&& chmod +x /usr/local/bin/hugo
|
||||||
|
|
||||||
|
# Install Quarto from the official tarball.
|
||||||
|
# Using tar.gz instead of .deb avoids dpkg and is more reliable in cross-arch
|
||||||
|
# (QEMU) builds where GitHub downloads are slower and more prone to transient errors.
|
||||||
|
ARG QUARTO_VERSION=1.6.42
|
||||||
|
RUN ARCH=$(dpkg --print-architecture) \
|
||||||
|
&& curl -fsSL --retry 5 --retry-delay 10 \
|
||||||
|
"https://github.com/quarto-dev/quarto-cli/releases/download/v${QUARTO_VERSION}/quarto-${QUARTO_VERSION}-linux-${ARCH}.tar.gz" \
|
||||||
|
| tar -xz -C /opt \
|
||||||
|
&& ln -s "/opt/quarto-${QUARTO_VERSION}/bin/quarto" /usr/local/bin/quarto
|
||||||
|
|
||||||
|
# Create working directory
|
||||||
|
WORKDIR /workspace
|
||||||
|
|
||||||
|
# Default command
|
||||||
|
CMD ["/bin/bash"]
|
||||||
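The Hugo install step above maps the output of `dpkg --print-architecture` onto the suffix used in Hugo's release artifacts. That mapping can be sketched as a standalone shell function (the name `hugo_arch` is illustrative, not part of the repo):

```shell
# Map a dpkg architecture string to the Hugo release suffix,
# mirroring the case statement in Dockerfile.builder above.
hugo_arch() {
    case "$1" in
        amd64) echo "amd64" ;;
        arm64) echo "arm64" ;;
        *) echo "Unsupported architecture: $1" >&2; return 1 ;;
    esac
}

hugo_arch amd64   # -> amd64
hugo_arch arm64   # -> arm64
```

Any other architecture (e.g. `riscv64`) fails the build early, which is preferable to downloading a tarball that does not exist.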
@@ -1,11 +1,8 @@
 services:
   jupyterhub:
-    build:
-      context: .
-      dockerfile: Dockerfile.hub
     container_name: jupyterhub
     hostname: jupyterhub
-    image: jupyterhub-hub:latest
+    image: ${HUB_IMAGE:-registry.metabarcoding.org/metabarschool/obijupyterhub-hub:latest}
     expose:
       - "8000"
     volumes:
@@ -21,6 +18,8 @@ services:
       - jupyterhub-network
     restart: unless-stopped
     environment:
+      # Docker image used for student containers (read by jupyterhub_config.py)
+      STUDENT_IMAGE: ${STUDENT_IMAGE:-registry.metabarcoding.org/metabarschool/obijupyterhub-student:latest}
       # Shared password for all students
      JUPYTERHUB_PASSWORD: metabar2025
       # Admin password (for installing R packages)
@@ -0,0 +1,43 @@
+#!/usr/bin/env Rscript
+
+# Install pak (itself as a binary if possible)
+install.packages("pak", repos = sprintf("https://r-lib.github.io/p/pak/stable/%s/%s/%s", .Platform$pkgType, R.Version()$os, R.Version()$arch))
+pak::pkg_install("cli")
+
+# Automatic system detection and binary installation of all packages
+pak::pkg_install(c(
+  "IRkernel",
+  "tidyverse",
+  "vegan",
+  "ade4",
+  "BiocManager",
+  "remotes",
+  "igraph",
+  "Rdpack"
+))
+
+# ------------------------------------------------------------
+# Bioconductor packages (always via BiocManager)
+# ------------------------------------------------------------
+pak::pkg_install("bioc::biomformat")
+
+# ------------------------------------------------------------
+# Packages from GitHub / git repositories
+# ------------------------------------------------------------
+pak::pkg_install("metabaRfactory/metabaR")
+pak::pkg_install("git::https://forge.metabarcoding.org/obitools/ROBIUtils.git")
+pak::pkg_install("git::https://forge.metabarcoding.org/obitools/ROBITaxonomy.git")
+pak::pkg_install("git::https://forge.metabarcoding.org/obitools/ROBITools.git")
+pak::pkg_install("git::https://forge.metabarcoding.org/MetabarcodingSchool/biodiversity-metrics.git")
+
+# ------------------------------------------------------------
+# Install the Jupyter kernel for IRkernel
+# ------------------------------------------------------------
+# If running as root -> system-wide install, otherwise -> per-user install
+if (Sys.info()["user"] == "root") {
+  IRkernel::installspec(user = FALSE)
+} else {
+  IRkernel::installspec(user = TRUE)
+}
+
+cat("\n✅ All R packages were installed successfully.\n")
@@ -14,7 +14,10 @@ VOLUMES_BASE_PATH = '/volumes/users'  # Path as seen from JupyterHub container (
 HOST_VOLUMES_PATH = os.environ.get('HOST_VOLUMES_PATH', '/volumes')  # Real path on host machine (parent dir)

 # Docker image to use for student containers
-c.DockerSpawner.image = 'jupyterhub-student:latest'
+c.DockerSpawner.image = os.environ.get(
+    'STUDENT_IMAGE',
+    'registry.metabarcoding.org/metabarschool/obijupyterhub-student:latest'
+)

 # Docker network (create with: docker network create jupyterhub-network)
 c.DockerSpawner.network_name = 'jupyterhub-network'
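Both sides of this change rely on the same precedence: docker-compose expands `${STUDENT_IMAGE:-...}` and `jupyterhub_config.py` calls `os.environ.get('STUDENT_IMAGE', ...)`, so an environment value wins and the registry image is only a fallback. A minimal shell sketch of that fallback:

```shell
# Same fallback as ${STUDENT_IMAGE:-...} in docker-compose.yml:
# use the environment value when set, else the registry default.
DEFAULT_IMAGE="registry.metabarcoding.org/metabarschool/obijupyterhub-student:latest"
STUDENT_IMAGE="${STUDENT_IMAGE:-$DEFAULT_IMAGE}"
echo "Student image: $STUDENT_IMAGE"
```

Exporting `STUDENT_IMAGE=my-image:dev` before starting the stack therefore overrides the image in both the compose file and the spawner configuration.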
+398 −92
@@ -1,45 +1,84 @@
 #!/bin/bash

 # JupyterHub startup script for labs
-# Usage: ./start-jupyterhub.sh [--no-build|--offline] [--force-rebuild] [--stop-server] [--update-lectures] [--build-obidoc]
+#
+# Modes (mutually exclusive):
+#   (default)      Pull images from registry and start
+#   --local-build  Build images locally and start (no push)
+#   --publish      Build multi-arch images, push to registry, and start
+#
+# Usage: ./start-jupyterhub.sh [mode] [options]

 set -e

 SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
 DOCKER_DIR="${SCRIPT_DIR}/obijupyterhub/"

+REGISTRY="registry.metabarcoding.org/metabarschool"
+PLATFORMS="linux/amd64,linux/arm64"
+BUILDX_BUILDER_NAME="obijupyterhub-buildx"
+
 # Colors for display
 GREEN='\033[0;32m'
 BLUE='\033[0;34m'
 YELLOW='\033[1;33m'
-NC='\033[0m' # No Color
+NC='\033[0m'

+# Operating mode
+LOCAL_BUILD=false
+PUBLISH=false
+
+# Build options (meaningful in --local-build mode)
 NO_BUILD=false
 FORCE_REBUILD=false
+REBUILD_BUILDER=false
+REBUILD_STUDENT=false
+REBUILD_HUB=false
+
+# Actions
 STOP_SERVER=false
 UPDATE_LECTURES=false
+UPDATE_OBIDOC=false
 BUILD_OBIDOC=false

 usage() {
     cat <<EOF
-Usage: ./start-jupyterhub.sh [options]
+Usage: ./start-jupyterhub.sh [mode] [options]

-Options:
-  --no-build | --offline   Skip Docker image builds (use existing images)
-  --force-rebuild          Rebuild images without cache
+Modes (mutually exclusive, default is pull-from-registry):
+  --local-build            Build images locally and start (no push to registry)
+  --publish                Build multi-arch images, push to registry, and start
+
+Build options (--local-build only):
+  --no-build | --offline   Skip all image operations (use existing local images)
+  --force-rebuild          Rebuild all local images without cache
+  --rebuild-builder        Force rebuild the builder image only
+  --rebuild-student        Force rebuild the student image only
+  --rebuild-hub            Force rebuild the JupyterHub image only
+
+Actions:
   --stop-server            Stop the stack and remove student containers, then exit
   --update-lectures        Rebuild the course website only (no Docker stop/start)
-  --build-obidoc           Force rebuild of obidoc documentation
+  --update-obidoc          Rebuild the obidoc documentation only (no Docker stop/start)
+  --build-obidoc           Force rebuild of obidoc documentation on next full start
   -h, --help               Show this help
EOF
 }

+dockercompose=$(which docker-compose 2>/dev/null || echo 'docker compose')
+
 while [[ $# -gt 0 ]]; do
     case "$1" in
+        --local-build) LOCAL_BUILD=true ;;
+        --publish) PUBLISH=true ;;
         --no-build|--offline) NO_BUILD=true ;;
-        --force-rebuild) FORCE_REBUILD=true ;;
+        --force-rebuild) FORCE_REBUILD=true; LOCAL_BUILD=true ;;
+        --rebuild-builder) REBUILD_BUILDER=true; LOCAL_BUILD=true ;;
+        --rebuild-student) REBUILD_STUDENT=true; LOCAL_BUILD=true ;;
+        --rebuild-hub) REBUILD_HUB=true; LOCAL_BUILD=true ;;
         --stop-server) STOP_SERVER=true ;;
         --update-lectures) UPDATE_LECTURES=true ;;
+        --update-obidoc) UPDATE_OBIDOC=true ;;
         --build-obidoc) BUILD_OBIDOC=true ;;
         -h|--help) usage; exit 0 ;;
         *) echo "Unknown option: $1" >&2; usage; exit 1 ;;
@@ -47,67 +86,282 @@ while [[ $# -gt 0 ]]; do
     shift
 done

+if $LOCAL_BUILD && $PUBLISH; then
+    echo "Error: --local-build and --publish cannot be used together" >&2
+    exit 1
+fi
 if $STOP_SERVER && $UPDATE_LECTURES; then
-    echo "❌ --stop-server and --update-lectures cannot be used together" >&2
+    echo "Error: --stop-server and --update-lectures cannot be used together" >&2
     exit 1
 fi
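The flag handling above can be reduced to a small sketch showing how the `--rebuild-*` and `--force-rebuild` flags imply `--local-build`, and how the mode conflict is detected (`parse_mode` is a hypothetical helper, not in the script):

```shell
# Minimal sketch of the mode logic from the option parser:
# rebuild flags imply --local-build, and --local-build
# conflicts with --publish.
parse_mode() {
    local LOCAL_BUILD=false PUBLISH=false
    for arg in "$@"; do
        case "$arg" in
            --local-build) LOCAL_BUILD=true ;;
            --publish) PUBLISH=true ;;
            --force-rebuild|--rebuild-builder|--rebuild-student|--rebuild-hub)
                LOCAL_BUILD=true ;;
        esac
    done
    if $LOCAL_BUILD && $PUBLISH; then
        echo "conflict"; return 1
    fi
    if $PUBLISH; then echo "publish"
    elif $LOCAL_BUILD; then echo "local-build"
    else echo "pull"
    fi
}

parse_mode                    # -> pull (the registry default)
parse_mode --rebuild-student  # -> local-build
```

This matches the script's behavior that asking to rebuild any single image only makes sense when images are built locally, so those flags switch the mode automatically.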

-echo "🚀 Starting JupyterHub for Lab"
-echo "=============================="
-echo ""
-
-echo -e "${BLUE}🔨 Building the volume directories...${NC}"
-pushd "${SCRIPT_DIR}/jupyterhub_volumes" >/dev/null
-mkdir -p caddy
-mkdir -p course/bin
-mkdir -p course/R_packages
-mkdir -p jupyterhub
-mkdir -p shared
-mkdir -p users
-mkdir -p web/obidoc
-popd >/dev/null
-
-pushd "${DOCKER_DIR}" >/dev/null
-
-# Check we're in the right directory
-if [ ! -f "Dockerfile" ] || [ ! -f "docker-compose.yml" ]; then
-    echo "❌ Error: Run this script from the jupyterhub-tp/ directory"
-    exit 1
-fi
-
-stop_stack() {
-    echo -e "${BLUE}📦 Stopping existing containers...${NC}"
-    docker-compose down 2>/dev/null || true
-
-    echo -e "${BLUE}🧹 Cleaning up student containers...${NC}"
-    docker ps -aq --filter name=jupyter- | xargs -r docker rm -f 2>/dev/null || true
-}
+# ---------------------------------------------------------------------------
+# Image name helpers
+# ---------------------------------------------------------------------------
+
+local_image_name() {
+    case "$1" in
+        hub) echo "jupyterhub-hub:latest" ;;
+        student) echo "jupyterhub-student:latest" ;;
+        builder) echo "obijupyterhub-builder:latest" ;;
+    esac
+}
+
+registry_image_name() {
+    echo "${REGISTRY}/obijupyterhub-$1:${2:-latest}"
+}
+
+dockerfile_for() {
+    case "$1" in
+        hub) echo "Dockerfile.hub" ;;
+        student) echo "Dockerfile" ;;
+        builder) echo "Dockerfile.builder" ;;
+    esac
+}
+
+read_version() {
+    local vfile="${SCRIPT_DIR}/version.txt"
+    if [ ! -f "$vfile" ]; then
+        echo "Error: version.txt not found at ${vfile}" >&2
+        exit 1
+    fi
+    tr -d '[:space:]' < "$vfile"
+}
+
+# Set image names based on mode
+if $LOCAL_BUILD; then
+    BUILDER_IMAGE=$(local_image_name builder)
+    HUB_IMAGE=$(local_image_name hub)
+    STUDENT_IMAGE=$(local_image_name student)
+else
+    BUILDER_IMAGE=$(registry_image_name builder)
+    HUB_IMAGE=$(registry_image_name hub)
+    STUDENT_IMAGE=$(registry_image_name student)
+fi
+
+# ---------------------------------------------------------------------------
+# Utility
+# ---------------------------------------------------------------------------
+
+get_file_timestamp() {
+    local file="$1"
+    case "$(uname -s)" in
+        Linux) stat -c %Y "$file" ;;
+        Darwin) stat -f %m "$file" ;;
+        *) echo "Unsupported system" >&2; return 1 ;;
+    esac
+}
+
+check_if_image_needs_rebuild() {
+    local image_name="$1"
+    local dockerfile="$2"
+    local force="${3:-false}"
+
+    echo -e "${BLUE}Checking image ${image_name}...${NC}"
+
+    if ! docker image inspect "$image_name" >/dev/null 2>&1; then
+        echo -e "${YELLOW}Docker image ${image_name} doesn't exist.${NC}"
+        return 0
+    fi
+
+    if $FORCE_REBUILD || $force; then
+        echo -e "${YELLOW}Docker image build is forced.${NC}"
+        return 0
+    fi
+
+    if [ -f "$dockerfile" ]; then
+        local dockerfile_mtime
+        dockerfile_mtime=$(get_file_timestamp "$dockerfile" 2>/dev/null || echo 0)
+        local image_created
+        image_created=$(docker image inspect "$image_name" --format='{{.Created}}' 2>/dev/null \
+            | sed -E 's/\.[0-9]+//' \
+            | (read d; if [[ "$(uname -s)" == "Darwin" ]]; then date -ju -f "%Y-%m-%dT%H:%M:%S" "${d%Z}" +%s; else date -d "$d" +%s; fi) 2>/dev/null || echo 0)
+
+        echo -e "${BLUE}Docker image ${image_name} created at: ${image_created}.${NC}"
+        echo -e "${BLUE}Docker file ${dockerfile} modified at: ${dockerfile_mtime}.${NC}"
+
+        if [ "$dockerfile_mtime" -gt "$image_created" ]; then
+            echo -e "${YELLOW}Dockerfile is newer than image, rebuild needed${NC}"
+            return 0
+        fi
+    fi
+
+    return 1
+}
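The decision at the heart of `check_if_image_needs_rebuild` boils down to an integer comparison of two epoch timestamps, with `0` standing in for "unknown". A reduced sketch (`needs_rebuild` is an illustrative name, not in the script):

```shell
# Reduced form of the rebuild decision: rebuild when the Dockerfile's
# epoch mtime is strictly newer than the image's creation time.
needs_rebuild() {
    local dockerfile_mtime="$1"
    local image_created="$2"
    if [ "$dockerfile_mtime" -gt "$image_created" ]; then
        echo "rebuild"
    else
        echo "keep"
    fi
}

needs_rebuild 1700000100 1700000000   # Dockerfile newer -> rebuild
needs_rebuild 1700000000 1700000100   # image newer -> keep
```

Because a failed timestamp lookup yields `0`, an unknown Dockerfile mtime never triggers a rebuild on its own, while an unknown image creation time makes any real mtime look newer.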

+# ---------------------------------------------------------------------------
+# Builder image (local-build mode)
+# ---------------------------------------------------------------------------
+
+build_builder_image() {
+    if check_if_image_needs_rebuild "$(local_image_name builder)" "Dockerfile.builder" "$REBUILD_BUILDER"; then
+        local build_flag=()
+        if $FORCE_REBUILD || $REBUILD_BUILDER; then build_flag+=(--no-cache); fi
+        echo ""
+        echo -e "${BLUE}Building builder image...${NC}"
+        docker build "${build_flag[@]}" -t "$(local_image_name builder)" -f Dockerfile.builder .
+    else
+        echo -e "${BLUE}Builder image is up to date, skipping build.${NC}"
+    fi
+}
+
+# ---------------------------------------------------------------------------
+# Student + Hub images (local-build mode)
+# ---------------------------------------------------------------------------
+
 build_images() {
     if $NO_BUILD; then
-        echo -e "${YELLOW}⏭️ Skipping image builds (offline/no-build mode).${NC}"
+        echo -e "${YELLOW}Skipping image builds (offline/no-build mode).${NC}"
         return
     fi

-    local build_flag=()
-    if $FORCE_REBUILD; then
-        build_flag+=(--no-cache)
-    fi
-
-    echo ""
-    echo -e "${BLUE}🔨 Building student image...${NC}"
-    docker build "${build_flag[@]}" -t jupyterhub-student:latest -f Dockerfile .
-
-    echo ""
-    echo -e "${BLUE}🔨 Building JupyterHub image...${NC}"
-    docker build "${build_flag[@]}" -t jupyterhub-hub:latest -f Dockerfile.hub .
+    if check_if_image_needs_rebuild "$(local_image_name student)" "Dockerfile" "$REBUILD_STUDENT"; then
+        local student_flag=()
+        if $FORCE_REBUILD || $REBUILD_STUDENT; then student_flag+=(--no-cache); fi
+        echo ""
+        echo -e "${BLUE}Building student image...${NC}"
+        docker build "${student_flag[@]}" -t "$(local_image_name student)" -f Dockerfile .
+    else
+        echo -e "${BLUE}Student image is up to date, skipping build.${NC}"
+    fi
+
+    if check_if_image_needs_rebuild "$(local_image_name hub)" "Dockerfile.hub" "$REBUILD_HUB"; then
+        local hub_flag=()
+        if $FORCE_REBUILD || $REBUILD_HUB; then hub_flag+=(--no-cache); fi
+        echo ""
+        echo -e "${BLUE}Building JupyterHub image...${NC}"
+        docker build "${hub_flag[@]}" -t "$(local_image_name hub)" -f Dockerfile.hub .
+    else
+        echo -e "${BLUE}JupyterHub image is up to date, skipping build.${NC}"
+    fi
 }

+# ---------------------------------------------------------------------------
+# Pull images from registry (default mode)
+# ---------------------------------------------------------------------------
+
+pull_images() {
+    if $NO_BUILD; then
+        echo -e "${YELLOW}Skipping image pull (offline/no-build mode).${NC}"
+        return
+    fi
+    echo ""
+    echo -e "${BLUE}Pulling images from registry...${NC}"
+    docker pull "$BUILDER_IMAGE"
+    docker pull "$HUB_IMAGE"
+    docker pull "$STUDENT_IMAGE"
+}
+
+# ---------------------------------------------------------------------------
+# Multi-arch build + push to registry (--publish mode)
+# ---------------------------------------------------------------------------
+
+ensure_buildx_builder() {
+    docker buildx inspect "$BUILDX_BUILDER_NAME" >/dev/null 2>&1 \
+        || docker buildx create --name "$BUILDX_BUILDER_NAME" --driver docker-container --bootstrap
+}
+
+publish_images() {
+    local version
+    version=$(read_version)
+
+    # docker buildx --push uses Docker's own credential store, independent of
+    # skopeo. Prompt once before the (long) build so the user isn't surprised
+    # by an auth failure at the very end.
+    local registry_host="${REGISTRY%%/*}"
+    echo -e "${BLUE}Authenticating to ${registry_host} (required to push)...${NC}"
+    docker login "$registry_host" || {
+        echo "Error: authentication to ${registry_host} failed." >&2
+        echo "Run: docker login ${registry_host}" >&2
+        exit 1
+    }
+
+    echo ""
+    echo -e "${BLUE}Publishing images (version ${version}) to ${REGISTRY}${NC}"
+    echo -e "${BLUE}Platforms: ${PLATFORMS}${NC}"
+
+    ensure_buildx_builder
+
+    local names=(builder student hub)
+    local dockerfiles=(Dockerfile.builder Dockerfile Dockerfile.hub)
+
+    for i in "${!names[@]}"; do
+        local name="${names[$i]}"
+        local df="${dockerfiles[$i]}"
+        local remote="${REGISTRY}/obijupyterhub-${name}"
+
+        echo ""
+        echo -e "${BLUE}Building and pushing ${name} image...${NC}"
+        docker buildx build \
+            --builder "$BUILDX_BUILDER_NAME" \
+            --platform "$PLATFORMS" \
+            --tag "${remote}:latest" \
+            --tag "${remote}:${version}" \
+            --file "${df}" \
+            --push \
+            .
+        echo -e "${GREEN}  ${remote}:latest${NC}"
+        echo -e "${GREEN}  ${remote}:${version}${NC}"
+    done
+
+    echo ""
+    echo -e "${GREEN}All images published (version ${version}).${NC}"
+}
+
+# ---------------------------------------------------------------------------
+# Builder container (for website / docs)
+# ---------------------------------------------------------------------------
+
+run_in_builder() {
+    docker run --rm \
+        -v "${SCRIPT_DIR}:/workspace" \
+        -v "${SCRIPT_DIR}/jupyterhub_volumes/builder/R_packages:/usr/local/lib/R/site-library" \
+        -e "R_LIBS=/opt/R/builder-packages:/usr/local/lib/R/site-library" \
+        -w /workspace \
+        "$BUILDER_IMAGE" \
+        bash -c "$1"
+}
+
+# ---------------------------------------------------------------------------
+# Stack management
+# ---------------------------------------------------------------------------
+
+stop_stack() {
+    echo -e "${BLUE}Stopping existing containers...${NC}"
+    HUB_IMAGE="$HUB_IMAGE" STUDENT_IMAGE="$STUDENT_IMAGE" \
+        ${dockercompose} down 2>/dev/null || true
+
+    echo -e "${BLUE}Cleaning up student containers...${NC}"
+    docker ps -aq --filter name=jupyter- | xargs -r docker rm -f 2>/dev/null || true
+}
+
+build_website() {
+    echo ""
+    echo -e "${BLUE}Building web site (in builder container)...${NC}"
+    run_in_builder '
+        set -e
+        echo "-> Detecting and installing R dependencies..."
+        Rscript /workspace/tools/install_quarto_deps.R /workspace/web_src

+        echo "-> Rendering Quarto site..."
+        cd /workspace/web_src
+        quarto render
+        find . -name "*.pdf" -print | while read pdfname; do
+            dest="/workspace/jupyterhub_volumes/web/pages/${pdfname}"
+            dirdest=$(dirname "$dest")
+            mkdir -p "$dirdest"
+            cp "$pdfname" "$dest"
+        done
+        python3 /workspace/tools/generate_pdf_galleries.py
+        python3 /workspace/tools/generate_pages_json.py
+    '
 }

 build_obidoc() {
     local dest="${SCRIPT_DIR}/jupyterhub_volumes/web/obidoc"

     if $NO_BUILD; then
-        echo -e "${YELLOW}⏭️ Skipping obidoc build in offline/no-build mode.${NC}"
+        echo -e "${YELLOW}Skipping obidoc build in offline/no-build mode.${NC}"
         return
     fi

@@ -119,73 +373,64 @@ build_obidoc() {
     fi

     if ! $needs_build; then
-        echo -e "${BLUE}ℹ️ obidoc already present; skipping rebuild (use --build-obidoc to force).${NC}"
+        echo -e "${BLUE}obidoc already present; skipping rebuild (use --build-obidoc to force).${NC}"
         return
     fi

     echo ""
-    echo -e "${BLUE}🔨 Building obidoc documentation...${NC}"
-    BUILD_DIR=$(mktemp -d -p .)
-    pushd "$BUILD_DIR" >/dev/null
-    git clone --recurse-submodules \
-        --remote-submodules \
-        -j 8 \
-        https://github.com/metabarcoding/obitools4-doc.git
-    pushd obitools4-doc >/dev/null
-    hugo -D build --baseURL "/obidoc/"
-    mkdir -p "$dest"
-    rm -rf "${dest:?}/"*
-    mv public/* "$dest"
-    popd >/dev/null
-    popd >/dev/null
-    rm -rf
+    echo -e "${BLUE}Building obidoc documentation (in builder container)...${NC}"
+    run_in_builder '
+        set -e
+        BUILD_DIR=$(mktemp -d)
+        cd "$BUILD_DIR"
+        git clone --recurse-submodules \
+            --remote-submodules \
+            -j 8 \
+            https://github.com/metabarcoding/obitools4-doc.git
+        cd obitools4-doc
+        hugo --gc --minify --buildDrafts --baseURL "/obidoc/"
+        mkdir -p /workspace/jupyterhub_volumes/web/obidoc
+        rm -rf /workspace/jupyterhub_volumes/web/obidoc/*
+        mv public/* /workspace/jupyterhub_volumes/web/obidoc/
+        cd /
+        rm -rf "$BUILD_DIR"
+    '
 }

-build_website() {
-    echo ""
-    echo -e "${BLUE}🔨 Building web site...${NC}"
-    pushd ../web_src >/dev/null
-    quarto render
-    find . -name '*.pdf' -print \
-        | while read pdfname ; do
-            dest="../jupyterhub_volumes/web/pages/${pdfname}"
-            dirdest=$(dirname "$dest")
-            mkdir -p "$dirdest"
-            echo "cp '${pdfname}' '${dest}'"
-        done \
-        | bash
-    python3 ../tools/generate_pdf_galleries.py
-    python3 ../tools/generate_pages_json.py
-    popd >/dev/null
-}
-
 start_stack() {
     echo ""
-    echo -e "${BLUE}🚀 Starting JupyterHub...${NC}"
-    docker-compose up -d --remove-orphans
+    echo -e "${BLUE}Starting JupyterHub...${NC}"
+    HUB_IMAGE="$HUB_IMAGE" STUDENT_IMAGE="$STUDENT_IMAGE" \
+        ${dockercompose} up -d --remove-orphans

     echo ""
-    echo -e "${YELLOW}⏳ Waiting for JupyterHub to start...${NC}"
+    echo -e "${YELLOW}Waiting for JupyterHub to start...${NC}"
     sleep 3
 }

 print_success() {
     if docker ps | grep -q jupyterhub; then
+        local version
+        version=$(read_version 2>/dev/null || echo "?")
         echo ""
-        echo -e "${GREEN}✅ JupyterHub is running!${NC}"
+        echo -e "${GREEN}JupyterHub is running! (version ${version})${NC}"
         echo ""
-        echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
-        echo -e "${GREEN}🌐 JupyterHub available at: http://localhost:8888${NC}"
-        echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+        echo "-------------------------------------------"
+        echo -e "${GREEN}JupyterHub available at: http://localhost:8888${NC}"
+        echo "-------------------------------------------"
         echo ""
-        echo "📝 Password: metabar2025"
-        echo "👥 Students can connect with any username"
+        echo "Images in use:"
+        echo "  Hub:     ${HUB_IMAGE}"
+        echo "  Student: ${STUDENT_IMAGE}"
         echo ""
-        echo "🔑 Admin account:"
+        echo "Password: metabar2025"
+        echo "Students can connect with any username"
+        echo ""
+        echo "Admin account:"
         echo "  Username: admin"
         echo "  Password: admin2025"
         echo ""
-        echo "📂 Each student will have access to:"
+        echo "Each student will have access to:"
         echo "  - work/ : personal workspace (everything saved)"
         echo "  - work/R_packages/ : personal R packages (writable)"
         echo "  - work/shared/ : shared workspace"
@@ -193,17 +438,49 @@ print_success() {
         echo "  - work/course/R_packages/ : shared R packages by prof (read-only)"
         echo "  - work/course/bin/ : shared executables (in PATH)"
         echo ""
-        echo "🔍 To view logs: docker-compose logs -f jupyterhub"
-        echo "🛑 To stop: docker-compose down"
+        echo "To view logs: ${dockercompose} logs -f jupyterhub"
+        echo "To stop: ${dockercompose} down"
         echo ""
     else
         echo ""
-        echo -e "${YELLOW}⚠️ JupyterHub container doesn't seem to be starting${NC}"
-        echo "Check logs with: docker-compose logs jupyterhub"
+        echo -e "${YELLOW}JupyterHub container doesn't seem to be starting${NC}"
+        echo "Check logs with: ${dockercompose} logs jupyterhub"
         exit 1
     fi
 }

+# ---------------------------------------------------------------------------
+# Setup volume directories
+# ---------------------------------------------------------------------------
+
+echo "Starting JupyterHub for Lab"
+echo "=============================="
+echo ""
+
+echo -e "${BLUE}Building the volume directories...${NC}"
+pushd "${SCRIPT_DIR}/jupyterhub_volumes" >/dev/null
+mkdir -p caddy/data
+mkdir -p caddy/config
+mkdir -p course/bin
+mkdir -p course/R_packages
+mkdir -p jupyterhub
+mkdir -p shared
+mkdir -p users
+mkdir -p web/obidoc
+mkdir -p builder/R_packages
+popd >/dev/null
+
+pushd "${DOCKER_DIR}" >/dev/null
+
+if [ ! -f "Dockerfile" ] || [ ! -f "docker-compose.yml" ]; then
+    echo "Error: Run this script from the OBIJupyterHub directory"
+    exit 1
+fi
+
+# ---------------------------------------------------------------------------
+# Main flow
+# ---------------------------------------------------------------------------
+
 if $STOP_SERVER; then
     stop_stack
     popd >/dev/null
@@ -211,13 +488,42 @@ if $STOP_SERVER; then
 fi

 if $UPDATE_LECTURES; then
+    if $LOCAL_BUILD; then
+        build_builder_image
+    elif ! $NO_BUILD; then
+        docker pull "$BUILDER_IMAGE" 2>/dev/null \
+            || echo -e "${YELLOW}Could not pull builder image, using local cache.${NC}"
+    fi
     build_website
     popd >/dev/null
     exit 0
 fi

+if $UPDATE_OBIDOC; then
+    if $LOCAL_BUILD; then
+        build_builder_image
+    elif ! $NO_BUILD; then
+        docker pull "$BUILDER_IMAGE" 2>/dev/null \
+            || echo -e "${YELLOW}Could not pull builder image, using local cache.${NC}"
+    fi
+    BUILD_OBIDOC=true
+    build_obidoc
+    popd >/dev/null
+    exit 0
+fi

 stop_stack
-build_images
+
+if $PUBLISH; then
+    publish_images
+    pull_images   # pull the freshly published images into the local daemon
+elif $LOCAL_BUILD; then
+    build_builder_image
+    build_images
+else
+    pull_images   # default: pull from registry
+fi
+
 build_website
 build_obidoc
 start_stack
|
|||||||
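The `docker pull … || echo` lines above use a small fallback idiom: a failed remote pull degrades to a warning instead of aborting, so the stack can still start from a locally cached image. A minimal standalone sketch (the `fetch_or_warn` helper is hypothetical, not part of the script; `"$@"` stands in for the actual `docker pull` invocation):

```shell
# Hypothetical helper illustrating the pull-with-fallback idiom above.
# The wrapped command's stderr is silenced, exactly as in the script.
fetch_or_warn() {
    "$@" 2>/dev/null || echo "Could not pull image, using local cache."
}

fetch_or_warn false   # a failing pull prints the warning and continues
fetch_or_warn true    # a successful pull prints nothing
```

Because `||` swallows the non-zero exit status, the overall function returns success either way, which keeps a `set -e` script from terminating on a transient registry outage.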
@@ -0,0 +1,142 @@
#!/usr/bin/env Rscript
# Script to dynamically detect and install R dependencies from Quarto files.
# Scans library()/require() calls and remotes::install_git/github() calls.

args <- commandArgs(trailingOnly = TRUE)
quarto_dir <- if (length(args) > 0) args[1] else "."

target_lib <- "/usr/local/lib/R/site-library"

cat("Scanning Quarto files in:", quarto_dir, "\n")
cat("Target library:", target_lib, "\n")

qmd_files <- list.files(
  path = quarto_dir,
  pattern = "\\.qmd$",
  recursive = TRUE,
  full.names = TRUE
)

if (length(qmd_files) == 0) {
  cat("No .qmd files found.\n")
  quit(status = 0)
}

cat("Found", length(qmd_files), "Quarto files\n")

# Extract package names from library()/require() calls
extract_cran_packages <- function(files) {
  pattern <- "(?:library|require)\\s*\\(\\s*['\"]?([A-Za-z0-9._]+)['\"]?"
  pkgs <- character(0)
  for (f in files) {
    lines <- tryCatch(readLines(f, warn = FALSE), error = function(e) character(0))
    m <- regmatches(lines, gregexpr(pattern, lines, perl = TRUE))
    hits <- unlist(m)
    if (length(hits) > 0) {
      extracted <- sub(
        "(?:library|require)\\s*\\(\\s*['\"]?([A-Za-z0-9._]+)['\"]?.*",
        "\\1", hits, perl = TRUE
      )
      pkgs <- c(pkgs, extracted)
    }
  }
  unique(pkgs)
}

# Extract git/github URLs from remotes::install_git/github() calls
extract_git_packages <- function(files) {
  # Matches remotes::install_git('url') or remotes::install_github('user/repo')
  pattern <- "remotes::install_(git|github)\\s*\\(\\s*['\"]([^'\"]+)['\"]"
  result <- list()
  for (f in files) {
    lines <- tryCatch(readLines(f, warn = FALSE), error = function(e) character(0))
    text <- paste(lines, collapse = "\n")
    m <- gregexpr(pattern, text, perl = TRUE)
    hits <- regmatches(text, m)[[1]]
    for (hit in hits) {
      type <- sub("remotes::install_(git|github).*", "\\1", hit, perl = TRUE)
      url <- sub("remotes::install_(?:git|github)\\s*\\(\\s*['\"]([^'\"]+)['\"].*",
                 "\\1", hit, perl = TRUE)
      result[[length(result) + 1]] <- list(type = type, url = url)
    }
  }
  result
}

cran_deps <- extract_cran_packages(qmd_files)
git_deps <- extract_git_packages(qmd_files)

# Quarto's implicit runtime dependencies — must be in target_lib (the persistent
# volume), not just somewhere in libPaths, because Quarto spawns its own R session.
quarto_required <- c("rmarkdown", "knitr")
if (length(git_deps) > 0) quarto_required <- c(quarto_required, "remotes")

cat("\nDetected CRAN packages:\n")
cat(paste(" -", unique(c(quarto_required, cran_deps)), collapse = "\n"), "\n")

if (length(git_deps) > 0) {
  cat("\nDetected git/github packages:\n")
  for (d in git_deps) cat(" -", d$type, ":", d$url, "\n")
}
cat("\n")

# --- Install CRAN packages ---

base_pkgs <- rownames(installed.packages(priority = "base"))

# quarto_required: check only in target_lib so they are guaranteed to be there
installed_in_target <- rownames(installed.packages(lib.loc = target_lib))
quarto_missing <- setdiff(quarto_required, c(base_pkgs, installed_in_target))

# other deps: check anywhere in libPaths (they just need to be loadable)
cran_deps <- setdiff(cran_deps, c(base_pkgs, quarto_required))
installed <- rownames(installed.packages())
to_install <- unique(c(quarto_missing, setdiff(cran_deps, installed)))

if (length(to_install) == 0) {
  cat("All CRAN packages already installed.\n")
} else {
  cat("Installing CRAN packages:", paste(to_install, collapse = ", "), "\n\n")
  failed <- character(0)
  for (pkg in to_install) {
    result <- tryCatch({
      withCallingHandlers(
        install.packages(pkg, lib = target_lib, repos = "https://cloud.r-project.org/",
                         dependencies = TRUE, quiet = FALSE),
        warning = function(w) {
          if (grepl("not available", conditionMessage(w))) invokeRestart("muffleWarning")
        }
      )
      if (!requireNamespace(pkg, quietly = TRUE)) "unavailable" else "ok"
    }, error = function(e) "error")

    if (result %in% c("unavailable", "error")) {
      cat(" [SKIP]", pkg, "- not available on CRAN\n")
      failed <- c(failed, pkg)
    } else {
      cat(" [OK]", pkg, "\n")
    }
  }
  if (length(failed) > 0)
    cat("\nNot installed (not on CRAN):", paste(failed, collapse = ", "), "\n")
}

# --- Install git/github packages ---

if (length(git_deps) > 0) {
  cat("\nInstalling git/github packages...\n")
  for (d in git_deps) {
    tryCatch({
      if (d$type == "git") {
        remotes::install_git(d$url, lib = target_lib, upgrade = "never")
      } else {
        remotes::install_github(d$url, lib = target_lib, upgrade = "never")
      }
      cat(" [OK]", d$url, "\n")
    }, error = function(e) {
      cat(" [FAIL]", d$url, "-", conditionMessage(e), "\n")
    })
  }
}

cat("\nDependency installation complete.\n")
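The scanning half of this script can be reproduced outside R for a quick sanity check. A sketch using `grep`/`sed` on a throwaway file (the demo `.qmd` and its package names are made up; the regex mirrors the `library()`/`require()` pattern used by `extract_cran_packages()` above):

```shell
# Build a throwaway .qmd and list the packages it references, deduplicated,
# mirroring the scan performed by extract_cran_packages() in the R script.
tmp=$(mktemp -d)
printf 'library(vegan)\nrequire("tidyverse")\nlibrary(vegan)\n' > "$tmp/demo.qmd"
grep -rhoE "(library|require)\(['\"]?[A-Za-z0-9._]+" "$tmp" --include='*.qmd' \
    | sed -E "s/^(library|require)\(['\"]?//" \
    | sort -u        # prints: tidyverse, vegan
rm -rf "$tmp"
```

Like the R version, this only sees literal `library()`/`require()` calls; packages loaded indirectly (e.g. via `pacman` or `box`) would be missed by both.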
@@ -0,0 +1 @@
0.1.0
@@ -16,9 +16,18 @@ editor: visual

 ```{r setup, include=FALSE}
 library(knitr)
+library(Rdpack)
 library(tidyverse)
-library(kableExtra)
+library(gt)
 library(latex2exp)
+
+# Install MetabarSchool if not available
+if (!requireNamespace("MetabarSchool", quietly = TRUE)) {
+  if (!requireNamespace("remotes", quietly = TRUE)) {
+    install.packages("remotes", dependencies = TRUE)
+  }
+  remotes::install_git('https://forge.metabarcoding.org/MetabarcodingSchool/biodiversity-metrics.git')
+}
 library(MetabarSchool)

 opts_chunk$set(echo = FALSE,
@@ -49,7 +58,7 @@ install.packages("devtools",dependencies = TRUE)
 Then you can install *MetabarSchool*

 ```{r eval=FALSE, echo=TRUE}
-devtools::install_git("https://git.metabarcoding.org/MetabarcodingSchool/biodiversity-metrics.git")
+remotes::install_git('https://forge.metabarcoding.org/MetabarcodingSchool/biodiversity-metrics.git')
 ```

 You will also need the *vegan* package
@@ -68,11 +77,15 @@ A 16 plants mock community
 data("plants.16")
 x = cbind(` ` =seq_len(nrow(plants.16)),plants.16)
 x$`Relative aboundance`=paste0('1/',1/x$dilution)
-knitr::kable(x[,-(4:5)],
-             format = "html",
-             row.names = FALSE,
-             align = "rlrr") %>%
-  kable_styling(position = "center")
+x[,-(4:5)] %>%
+  gt() %>%
+  cols_align(align = "center", columns = 1) %>%
+  cols_align(align = "left", columns = 2) %>%
+  cols_align(align = "right", columns = c(3, 4)) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 ## The experiment {.flexbox .vcenter}
@@ -100,11 +113,14 @@ data("positive.motus")
 - `positive.count` read count matrix $`r nrow(positive.count)` \; PCRs \; \times \; `r ncol(positive.count)` \; MOTUs$

 ```{r}
-knitr::kable(positive.count[1:5,1:5],
-             format="html",
-             align = 'rc') %>%
-  kable_styling(position = "center") %>%
-  row_spec(0, angle = -45)
+as.data.frame(positive.count[1:5,1:5]) %>%
+  gt() %>%
+  cols_align(align = "right", columns = 1) %>%
+  cols_align(align = "center", columns = 2:ncol(positive.count[1:5,1:5])) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 <br>
@@ -126,10 +142,14 @@ data("positive.motus")
 - `positive.samples` a `r nrow(positive.samples)` rows `data.frame` of `r ncol(positive.samples)` columns describing each PCR

 ```{r}
-knitr::kable(head(positive.samples,n=3),
-             format="html",
-             align = 'rc') %>%
-  kable_styling(position = "center")
+head(positive.samples,n=3) %>%
+  gt() %>%
+  cols_align(align = "right", columns = 1) %>%
+  cols_align(align = "center", columns = 2:ncol(head(positive.samples,n=3))) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 <br>
@@ -151,10 +171,16 @@ data("positive.motus")
 - `positive.motus` : a `r nrow(positive.motus)` rows `data.frame` of `r ncol(positive.motus)` columns describing each MOTU

 ```{r}
-knitr::kable(head(positive.motus,n=3),
-             format = "html",
-             align = 'rlrc') %>%
-  kable_styling(position = "center")
+head(positive.motus,n=3) %>%
+  gt() %>%
+  cols_align(align = "right", columns = 1) %>%
+  cols_align(align = "left", columns = 2) %>%
+  cols_align(align = "right", columns = 3) %>%
+  cols_align(align = "center", columns = 4) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 <br>
@@ -172,10 +198,17 @@ table(colSums(positive.count) == 1)
 ```

 ```{r}
-kable(t(table(colSums(positive.count) == 1)),
-      format = "html") %>%
-  kable_styling(position = "center") %>%
-  row_spec(0, align = 'c')
+as.data.frame(t(table(colSums(positive.count) == 1))) %>%
+  gt() %>%
+  cols_align(align = "center", columns = everything()) %>%
+  tab_style(
+    style = cell_text(align = "center"),
+    locations = cells_column_labels()
+  ) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 <br>
@@ -360,10 +393,13 @@ knitr::include_graphics("figures/alpha_diversity.svg")
 E1 = c(A=0.25,B=0.25,C=0.25,D=0.25,E=0,F=0,G=0)
 E2 = c(A=0.55,B=0.07,C=0.02,D=0.17,E=0.07,F=0.07,G=0.03)
 environments = t(data.frame(`Environment 1` = E1,`Environment 2` = E2))
-kable(environments,
-      format="html",
-      align = 'rr') %>%
-  kable_styling(position = "center")
+as.data.frame(environments) %>%
+  gt() %>%
+  cols_align(align = "right", columns = everything()) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 ## Richness {.flexbox .vcenter}
@@ -379,10 +415,13 @@ S = rowSums(environments > 0)
 ```

 ```{r}
-kable(data.frame(S=S),
-      format="html",
-      align = 'rr') %>%
-  kable_styling(position = "center")
+data.frame(S=S) %>%
+  gt() %>%
+  cols_align(align = "right", columns = everything()) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 ## Gini-Simpson's index {.smaller}
@@ -414,10 +453,13 @@ GS = 1 - rowSums(environments^2)
 ```

 ```{r}
-kable(data.frame(`Gini-Simpson`=GS),
-      format="html",
-      align = 'rr') %>%
-  kable_styling(position = "center")
+data.frame(`Gini-Simpson`=GS) %>%
+  gt() %>%
+  cols_align(align = "right", columns = everything()) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 ## Shannon entropy {.smaller}
@@ -443,10 +485,13 @@ H = - rowSums(environments * log(environments),na.rm = TRUE)
 ```

 ```{r}
-kable(data.frame(`Shannon index`=H),
-      format="html",
-      align = 'rr') %>%
-  kable_styling(position = "center")
+data.frame(`Shannon index`=H) %>%
+  gt() %>%
+  cols_align(align = "right", columns = everything()) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 ## Hill's number {.smaller}
@@ -476,10 +521,17 @@ D2 = exp(- rowSums(environments * log(environments),na.rm = TRUE))
 ```

 ```{r}
-kable(data.frame(`Hill Numbers`=D2),
-      format="html",
-      align = 'rr') %>%
-  kable_styling(position = "center")
+data.frame(`Hill Numbers` = D2) %>%
+  gt() %>%
+  cols_align(align = "center") %>%
+  tab_style(
+    style = cell_text(weight = "bold"),
+    locations = cells_column_labels()
+  ) %>%
+  tab_options(
+    table.align = "center",
+    heading.align = "center"
+  )
 ```

 ## Generalized logaritmic function {.smaller}
@@ -4,6 +4,9 @@ project:
   post-render:
     - scripts/copy-to-web.sh

+execute:
+  freeze: auto
+
 format:
   html:
     toc: false