1 change: 0 additions & 1 deletion .coveragerc
Original file line number Diff line number Diff line change
@@ -8,4 +8,3 @@ exclude_lines =
if __name__ == .__main__.:
pragma: no cover
show_missing = True

1 change: 0 additions & 1 deletion .darglint
@@ -1,4 +1,3 @@
[darglint]
ignore=DAR402
docstring_style=google

1 change: 1 addition & 0 deletions .gitattributes
@@ -0,0 +1 @@
simba_ml/_version.py export-subst
3 changes: 2 additions & 1 deletion .gitignore
@@ -24,4 +24,5 @@ dev.py
wandb/
lightning_logs/
*.ipynb
man
man
.pre-commit-config.yaml
2 changes: 1 addition & 1 deletion .readthedocs.yml
@@ -13,7 +13,7 @@ build:

sphinx:
configuration: docs/source/conf.py

python:
install:
- method: pip
2 changes: 1 addition & 1 deletion .vscode/settings.json
@@ -41,4 +41,4 @@
"--convention=google",
],
"python.linting.mypyEnabled": true,
}
}
2 changes: 1 addition & 1 deletion Dockerfile
@@ -16,4 +16,4 @@ RUN pip install -r /tmp/dev_requirements.txt
RUN pip install -r /tmp/docs_requirements.txt
RUN rm /tmp/requirements.txt
RUN rm /tmp/dev_requirements.txt
RUN rm /tmp/docs_requirements.txt
RUN rm /tmp/docs_requirements.txt
File renamed without changes.
2 changes: 1 addition & 1 deletion MANIFEST.in
@@ -1,2 +1,2 @@
include versioneer.py
include simba_ml/_version.py
include simba_ml/_version.py
2 changes: 1 addition & 1 deletion Makefile
@@ -18,7 +18,7 @@ lint: simba_ml
pycodestyle --max-line-length=88 --ignore E203,W503 --select W504 simba_ml
pylint simba_ml
pydocstyle --convention=google simba_ml
sourcery review simba_ml --check
sourcery review simba_ml --check
mypy --pretty simba_ml/ --disable-error-code import --disable-error-code no-any-return --strict
find simba_ml ! -iwholename "simba_ml\/\_version\.py" -name "*.py" | xargs darglint -v 2
black simba_ml --check --exclude _version.py
@@ -41,45 +41,45 @@ The following code shows how to integrate a model that always predicts zero:
>>> import dataclasses
>>> import numpy as np
>>> import numpy.typing as npt
>>>
>>>
>>> from simba_ml.prediction.time_series.models import model
>>> from simba_ml.prediction.time_series.models import factory
>>>
>>>
>>>
>>>
>>> @dataclasses.dataclass
... class ZeroPredictorConfig(model.ModelConfig):
... """Defines the configuration for the ZeroPredictor."""
... name: str = "Zero Predictor"
...
...
...
...
>>> class ZeroPredictor(model.Model):
... """Defines a model that always predicts zero."""
...
...
... def __init__(self, input_length: int, output_length: int, config: ZeroPredictorConfig):
... """Inits the `ZeroPredictor`.
...
...
... Args:
... input_length: the length of the input data.
... output_length: the length of the output data.
... config: the config for the model
... """
... super().__init__(input_length, output_length, config)
...
...
... def set_seed(self, seed: int) -> None:
... """Sets the seed for the model. For this model, this is not required."""
... """Sets the seed for the model. For this model, this is not required."""
... pass
...
...
... def train(self, train: list[npt.NDArray[np.float64]], val: list[npt.NDArray[np.float64]]) -> None:
... pass
...
...
... def predict(self, data: npt.NDArray[np.float64]) -> npt.NDArray[np.float64]:
... self.validate_prediction_input(data)
... return np.full((data.shape[0], self.output_length, data.shape[2]), 0.0)
...
...
...
...
>>> def register() -> None:
... factory.register(
... "ZeroPredictor",
... ZeroPredictorConfig,
... ZeroPredictor
... )
... )
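The shape logic in the ``predict`` method above can be checked in isolation. The snippet below is a standalone sketch using only NumPy (no simba_ml imports); the concrete array sizes are made up for illustration:

```python
import numpy as np

# Input shaped (batch, input_length, features), as the ZeroPredictor
# above expects; the sizes here are arbitrary.
output_length = 4
data = np.random.rand(8, 10, 3)  # batch=8, input_length=10, features=3

# Same expression the ZeroPredictor's predict() uses: a zero array that
# keeps the batch and feature dimensions of the input and substitutes
# the configured output length for the time dimension.
prediction = np.full((data.shape[0], output_length, data.shape[2]), 0.0)
print(prediction.shape)  # (8, 4, 3)
```

This mirrors the contract every model in the factory must satisfy: prediction shape ``(batch, output_length, features)`` for input shape ``(batch, input_length, features)``.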
@@ -24,4 +24,4 @@ Include Own Model for Transfer Learning
Besides the provided models, SimbaML allows for the effortless integration of other machine learning models, for example, from PyTorch Lightning and Keras.

.. note::
Before applying your own model to the transfer learning pipeline, make sure that the model's weights are not reset when the train() function is called a second time. This is, for example, the case for Scikit-learn models, which refit from scratch on every call.
Before applying your own model to the transfer learning pipeline, make sure that the model's weights are not reset when the train() function is called a second time. This is, for example, the case for Scikit-learn models, which refit from scratch on every call.
@@ -1,11 +1,11 @@
Time-Series Prediction Pipelines
================================

To enable scalable and easy-to-run machine learning experiments on time-series data, SimbaML offers multiple pipelines covering data pre-processing, training, and evaluation of ML models.

.. toctree::
:maxdepth: 2

synthetic_data_pipeline
mixed_data_pipeline
transfer_learning_pipeline
transfer_learning_pipeline
@@ -15,4 +15,4 @@ All provided machine learning pipelines of SimbaML can be configured based on co
Start Mixed Data Pipeline
--------------------------

$ simba_ml start-prediction mixed_data --config-path mixed_data_pipeline.toml
$ simba_ml start-prediction mixed_data --config-path mixed_data_pipeline.toml
@@ -93,4 +93,4 @@ plugins = [
# make sure to specify the right project and entity
# [logging]
# project = "your-wandb-project"
# entity = "your-wandb-entity"
# entity = "your-wandb-entity"
@@ -17,4 +17,4 @@ All provided machine learning pipelines of SimbaML can be configured based on co
Start Synthetic Data Pipeline
-----------------------------

$ simba_ml start-prediction synthetic_data --config-path synthetic_data_pipeline.toml
$ simba_ml start-prediction synthetic_data --config-path synthetic_data_pipeline.toml
@@ -91,4 +91,4 @@ plugins = [
# make sure to specify the right project and entity
# [logging]
# project = "your-wandb-project"
# entity = "your-wandb-entity"
# entity = "your-wandb-entity"
@@ -17,4 +17,4 @@ This way, users can change the models that are going to be trained, their hyperp
Start Pipeline
--------------

$ simba_ml start-prediction transfer_learning --config-path transfer_learning_pipeline.toml
$ simba_ml start-prediction transfer_learning --config-path transfer_learning_pipeline.toml
@@ -46,4 +46,4 @@ plugins = [
# make sure to specify the right project and entity
# [logging]
# project = "your-wandb-project"
# entity = "your-wandb-entity"
# entity = "your-wandb-entity"
4 changes: 2 additions & 2 deletions docs/source/Usage/Machine-Learning/index.rst
@@ -24,10 +24,10 @@ Pipelines

.. toctree::
:maxdepth: 2

Time-Series-Prediction/Pipelines/index

Steady State Prediction
-----------------------

Coming soon!
Coming soon!
2 changes: 1 addition & 1 deletion docs/source/Usage/Simulation/create_complex_config.rst
@@ -46,4 +46,4 @@ Create the SystemModel object
--------------------------------
Save it in a variable called ``sm``.

>>> sm = system_model.SystemModel(name, specieses, kinetic_parameters, deriv=deriv, deriv_noiser=derivative_noiser, noiser=noiser)
>>> sm = system_model.SystemModel(name, specieses, kinetic_parameters, deriv=deriv, deriv_noiser=derivative_noiser, noiser=noiser)
2 changes: 1 addition & 1 deletion docs/source/Usage/Simulation/index.rst
@@ -9,4 +9,4 @@ data-generation with SimbaML based on the required configuration files.

create_config
create_complex_config
run_data_generation
run_data_generation
4 changes: 2 additions & 2 deletions docs/source/Usage/Simulation/run_data_generation.rst
@@ -5,7 +5,7 @@ To run the data generation, you need to execute the following command:

$ simba_ml generate-data [generator] --config-module [config_module] --output-dir [output_dir]

The generator is the name of the generator to use.
The generator is the name of the generator to use.
Run `simba_ml generate-data --help` to see the list of available generators.
The config_module is the path of the module that contains the `SystemModel` for the generator.
The output_dir is the directory where the generated data will be stored.
The output_dir is the directory where the generated data will be stored.
146 changes: 140 additions & 6 deletions docs/source/Usage/cli.rst
@@ -1,12 +1,146 @@
Using SimbaML CLI
==================

SimbaML provides a CLI with multiple commands.
SimbaML provides a modern CLI for SBML parsing, BioModels integration, and data generation.

To get a list of all available commands, run:
Installation
------------

$ simba_ml --help
After installing SimbaML, the ``simba-ml`` command will be available:

To get help on a specific command, run:

$ simba_ml <command> --help
$ simba-ml --help

For detailed help on any command, use:

$ simba-ml <command> --help

SBML Parsing
------------

Parse and analyze SBML model files locally.

Basic Usage
^^^^^^^^^^^

$ simba-ml sbml parse <path-to-sbml-file>

This command will:
- Detect SBML Level and Version
- Parse the model structure (species, reactions, parameters, compartments)
- Analyze species types (dynamic vs boundary conditions)
- Display ODE readiness assessment
- Show sample species and reactions
- Display model description

The parser validates:
- SBML file format and compliance
- Presence of kinetic laws for ODE simulation
- Model connectivity and network structure

Options
^^^^^^^

- ``--verbose, -v``: Show detailed parsing information
- ``--species-limit, -s INTEGER``: Number of species to display (default: 5)
- ``--reactions-limit, -r INTEGER``: Number of reactions to display (default: 5)
- ``--export {csv}``: Export model data (currently, CSV is the only supported format)
- ``--output-dir, -o PATH``: Output directory for exports (default: ./sbml_exports)
- ``--quiet, -q``: Suppress visual output (JSON output only)

Examples
^^^^^^^^

Parse a local SBML file:

$ simba-ml sbml parse Garde2020.xml

Parse with verbose output and custom display limits:

$ simba-ml sbml parse model.xml --verbose --species-limit 10 --reactions-limit 10

Export model data to CSV format:

$ simba-ml sbml parse model.xml --export csv --output-dir ./exported_data

Get JSON output (quiet mode, useful for scripts):

$ simba-ml sbml parse model.xml --quiet
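
The quiet mode pairs naturally with scripting. The sketch below shows one way to post-process such JSON output; note that the field names used (``name``, ``species``, ``reactions``) are assumptions for illustration, not a documented schema:

```python
import json

def summarize_model(report: dict) -> str:
    """One-line summary of a parsed SBML report.

    The keys used here ("name", "species", "reactions") are assumed,
    not taken from a documented schema; adapt them to the actual
    --quiet output of your simba-ml version.
    """
    return (
        f"{report.get('name', 'unknown model')}: "
        f"{len(report.get('species', []))} species, "
        f"{len(report.get('reactions', []))} reactions"
    )

# Stand-in for the output of: simba-ml sbml parse model.xml --quiet
raw = '{"name": "SIR", "species": ["S", "I", "R"], "reactions": ["infect", "recover"]}'
print(summarize_model(json.loads(raw)))  # SIR: 3 species, 2 reactions
```

In practice, the JSON would come from the command's stdout (e.g. via ``subprocess.run`` with ``capture_output=True``) rather than a hard-coded string.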

BioModels Integration
---------------------

Search and download SBML models from the `BioModels Database <https://www.ebi.ac.uk/biomodels/>`_.

Search for Models
^^^^^^^^^^^^^^^^^

$ simba-ml biomodels search <query> [--limit <number>]

The search command queries the BioModels REST API and displays:
- Model ID (e.g., BIOMD0000000505)
- Model name
- Format (SBML)

Search examples:

# Search for SIR models (limit 3 results)
$ simba-ml biomodels search "SIR" --limit 3

# Search for oscillation models (default 10 results)
$ simba-ml biomodels search "oscillation"

# Search for cancer models
$ simba-ml biomodels search "cancer"

Download Models
^^^^^^^^^^^^^^^

$ simba-ml biomodels download <model-id> [--output-dir <path>]

Downloads a specific BioModels model and saves it as an SBML XML file.

Options:

- ``--output-dir, -o PATH``: Directory to save the model (default: ./biomodels_downloads)

Download examples:

# Download a specific model
$ simba-ml biomodels download BIOMD0000000505

# Download to a custom directory
$ simba-ml biomodels download BIOMD0000000505 --output-dir ./my_models

Complete Workflow
-----------------

Here's a typical workflow for finding and analyzing a model:

1. **Search for models of interest:**

$ simba-ml biomodels search "SIR"

2. **Download a model:**

$ simba-ml biomodels download BIOMD0000000982

3. **Parse and analyze the downloaded model:**

$ simba-ml sbml parse BIOMD0000000982_url.xml

4. **Export data for machine learning:**

$ simba-ml sbml parse BIOMD0000000982_url.xml --export csv --output-dir ./sir_data

5. **Get JSON output for programmatic use:**

$ simba-ml sbml parse BIOMD0000000982_url.xml --quiet

Legacy CLI
----------

For backward compatibility, the legacy CLI interface is still available:

$ python -m simba_ml.cli <command>

The modern CLI (``simba-ml``) is recommended for new workflows as it provides better formatting and improved user experience.
4 changes: 2 additions & 2 deletions docs/source/about/acknowledgements.rst
@@ -1,9 +1,9 @@
Acknowledgements
================

Thanks to
Thanks to

- Katharina Baum
- Pascal Iversen
- Simon Witzke
- Bernhard Renard
- Bernhard Renard
2 changes: 1 addition & 1 deletion docs/source/about/authors.rst
@@ -4,4 +4,4 @@ Authors
- Lukas Drews
- Benedict Heyder
- Maximilian Kleissl
- Julian Zabbarov
- Julian Zabbarov