qbicsoftware/data-manager-app

Data Manager

Data Manager - A web-based multi-omics data management platform for the biomedical life sciences that enables FAIR-compliant data access.



Style Guide

Please find the style guide in its own repository.

Requirements Documentation & Workflow

This project follows a structured requirements-driven development process to ensure traceability from user needs through implementation. All new features and changes must be backed by documented requirements.

Requirement IDs

All requirements follow a domain-based identifier scheme:

<DOMAIN>-<TYPE>-<NN>
  • DOMAIN — functional area (e.g., API, PROJECT, SAMPLE, MEASUREMENT, DATA, FAIR, AUTH)
  • TYPE — requirement category:
    • R = Functional requirement (system capability)
    • NFR = Non-functional requirement (quality attribute)
    • C = Constraint (architectural boundary)
  • NN — zero-padded sequential number (e.g., 01, 02, 10)
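As an illustration, the identifier scheme can be checked mechanically with a regular expression. The pattern below is a documentation sketch only, not part of the project's tooling:

```python
import re

# Illustrative pattern for the <DOMAIN>-<TYPE>-<NN> scheme described above;
# DOMAIN is uppercase, TYPE is R, NFR, or C, and NN is zero-padded.
REQUIREMENT_ID = re.compile(r"^[A-Z]+-(R|NFR|C)-\d{2,}$")

REQUIREMENT_ID.match("API-R-01") is not None     # True
REQUIREMENT_ID.match("DATA-NFR-01") is not None  # True
REQUIREMENT_ID.match("sample-r-1") is not None   # False: lowercase, not zero-padded
```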

Examples

ID           Type            Description
API-R-01     Functional      System provides authenticated API access
SAMPLE-R-02  Functional      Users can register samples in bulk
DATA-NFR-01  Non-functional  Data retrieval must respond within 2 seconds
FAIR-C-01    Constraint      FAIR export must use RO-Crate v1.1.1 format

Where Requirements Live

All requirements are documented in docs/requirements.md, which is the authoritative registry. Each requirement includes:

  • ID — stable identifier (never renumbered)
  • Statement — clear capability description
  • Rationale — why this requirement exists (strategic, regulatory, stakeholder-driven)
  • Source (optional) — link to PRD section, regulatory reference, or stakeholder request

Example

API-R-01 The system shall provide authenticated API access to project metadata.

Rationale:
Enables integration with partner laboratories and automated analysis pipelines.

Source:
PRD §2.3 Partner Integration

Workflow: Stories and Tasks

Contributors use two issue types, defined in .github/ISSUE_TEMPLATE/:

Story (story.yml)

Use when implementing user-facing functionality. A Story describes value and workflow context.

Required fields:

  • At least one functional requirement reference (R-<NN>)
  • User story in the format: "As a [role], I want [goal], so that [benefit]"
  • Acceptance criteria in Given / When / Then format

Rules:

  • Stories must reference existing requirements.
  • If the story requires a new capability not covered by any existing requirement, you must create or update the requirement first (see "Adding New Capabilities" below).

Task (task.yml)

Use for concrete technical implementation work, always linked to a parent Story.

Required fields:

  • Parent Story reference (e.g., #123)
  • At least one requirement reference (R-<NN> or NFR-<NN>)
  • Description of what needs to be implemented
  • Technical notes (design hints, relevant constraints or ADRs)

Rules:

  • Every Task must have a parent Story.
  • Tasks must not redefine acceptance criteria from the Story.

Story → Task → PR Traceability

Requirement (docs/requirements.md)
    ↓
Story (GitHub issue, type: story.yml)
    ↓ (may have multiple tasks)
Task (GitHub issue, type: task.yml)
    ↓ (implements)
Pull Request (references task and requirement IDs)

Adding New Capabilities

Golden rule: Before implementing any new feature or changing existing behavior, ensure it is covered by a requirement in docs/requirements.md.

If you discover that a new capability is needed but no requirement exists:

  1. Update docs/requirements.md first

    • Add a new requirement ID following the scheme above
    • Include Statement, Rationale, and Source
    • Create a Pull Request for the requirement change alone (do not bundle with implementation changes)
    • Include a changelog entry in the PR description listing added/modified requirement IDs and rationale
    • Wait for maintainer approval
  2. Then create the Story

    • Reference the new requirement ID
    • Link to the requirements PR if helpful
  3. Then create the Task(s)

    • Reference the Story
    • Reference the requirement ID(s)

Contributing: Checklist

When contributing code:

  • Is this change covered by an existing requirement in docs/requirements.md?
  • If it introduces new capability, have you updated requirements first (via PR)?
  • Does your PR reference the GitHub issue (Story or Task) it implements?
  • Does your PR list the requirement IDs addressed?

Verify build signature

To mitigate the risk of supply chain attacks, we include signature bundles for every artifact (stable and snapshot), generated with sigstore.

We also run PGP verification during our CI builds for transitive dependencies to raise red flags for unexpected signature changes.

You can use cosign to verify the signature of our artifacts, e.g., for a development build:

cosign verify-blob \
  --bundle user-interface-1.11.0-20250924.124400-2.jar.sigstore.json \
  --certificate-oidc-issuer https://token.actions.githubusercontent.com \
  --certificate-identity 'https://github.com/qbicsoftware/data-manager-app/.github/workflows/nexus-publish-snapshots.yml@refs/heads/development' \
  user-interface-1.11.0-20250924.124400-2.jar

If for some reason you cannot verify the signature, please create an issue on GitHub.

How to Run

This application is built with Maven. After setting the required configuration, it can be run by typing mvnw (Windows) or ./mvnw (Mac & Linux) on the command line and opening http://localhost:8080 in your browser.

You can also import the project to your IDE of choice as you would with any Maven project. Read more on how to import Vaadin projects to different IDEs (Eclipse, IntelliJ IDEA, NetBeans, and VS Code).


Maven Profiles

To have all beans available for running the data manager, you need to set a profile during compilation. If you plan to deploy it in production or with external systems integrated, use the production profile.

For local development you can use the development profile. This profile disables the integrations with the TIB service, OpenBis and the ROR repository and replaces them with mocks.

Configuration

Java Version

This application requires Java 17 to be built and run.

Environment Variables

The following environment variables contain the user database connection information:

Environment variable  Description
USER_DB_URL           The database host address
USER_DB_USER_NAME     The database user name
USER_DB_USER_PW       The database password

The application properties file could look like the following:

spring.datasource.url=${USER_DB_URL:localhost}
spring.datasource.username=${USER_DB_USER_NAME:myusername}
spring.datasource.password=${USER_DB_USER_PW:astrongpassphrase!}

To change the port or the driver, these can be set as environment variables as well. Both are preset with default values and are therefore not mandatory to set.

Environment variable  Description
DM_PORT               The application port
USER_DB_DRIVER        The database driver

server.port=${DM_PORT:8080}
spring.datasource.driver-class-name=${USER_DB_DRIVER:com.mysql.cj.jdbc.Driver}

As the application needs to send emails, you have to configure an SMTP server as well.

Environment variable  Description
MAIL_HOST             The SMTP server host (e.g. smtp.gmail.com)
MAIL_PASSWORD         The password to authenticate against the mail server
MAIL_USERNAME         The username to authenticate against the mail server
MAIL_PORT             The port to use for the SMTP connection

spring.mail.username=${MAIL_USERNAME}
spring.mail.password=${MAIL_PASSWORD}
spring.mail.host=${MAIL_HOST:smtp.gmail.com}
spring.mail.port=${MAIL_PORT:587}

For user email confirmation, a specific endpoint and context path (for example, if the app runs in a different context than the root path) are addressed. This endpoint can be configured using the following properties:

Environment variable          Description
DM_SERVICE_HOST               The server address (if behind a proxy, the proxy domain name)
DM_HOST_PROTOCOL              The server protocol (http or https)
DM_SERVICE_PORT               The server port (-1 for default)
DM_SERVICE_CONTEXT_PATH       The service context path of the application (empty for default)
EMAIL_CONFIRMATION_PARAMETER  The name of the parameter to which to pass the confirmation token
EMAIL_CONFIRMATION_ENDPOINT   The endpoint for the email confirmation entry
PASSWORD_RESET_ENDPOINT       The endpoint for the password reset entry
PASSWORD_RESET_PARAMETER      The name for the password reset query parameter in the URL

Generated email confirmation links will look like localhost:8080/login?confirm-email=<token> with the default configuration.

# global service route configuration for mail interaction requests
service.host.name=${DM_SERVICE_HOST:localhost}
service.host.protocol=${DM_HOST_PROTOCOL:https}
service.host.port=${DM_SERVICE_PORT:-1}
# Set the context path, for example if your app runs behind a proxy
server.servlet.context-path=${DM_SERVICE_CONTEXT_PATH:}
# route for mail confirmation consumption
email-confirmation-endpoint=${EMAIL_CONFIRMATION_ENDPOINT:login}
email-confirmation-parameter=${EMAIL_CONFIRMATION_PARAMETER:confirm-email}
# route for password reset
password-reset-endpoint=${PASSWORD_RESET_ENDPOINT:new-password}
password-reset-parameter=${PASSWORD_RESET_PARAMETER:user-id}
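The route properties above determine how the confirmation link is assembled. As a hypothetical sketch (the function and parameter names are illustrative and do not mirror the application's actual code):

```python
# Illustrative assembly of a confirmation link from the route properties above.
# Names here are hypothetical, not taken from the application's source.
def confirmation_link(protocol: str, host: str, port: int,
                      context_path: str, endpoint: str,
                      parameter: str, token: str) -> str:
    # A port of -1 means "use the protocol's default port", so it is omitted.
    authority = host if port == -1 else f"{host}:{port}"
    return f"{protocol}://{authority}{context_path}/{endpoint}?{parameter}={token}"

# With the local-testing defaults (host localhost, port 8080, empty context path):
confirmation_link("http", "localhost", 8080, "", "login", "confirm-email", "<token>")
# → "http://localhost:8080/login?confirm-email=<token>"
```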

Since the application retrieves experimental design values from a list of defined vocabularies, a connection to the datasource containing this information is necessary:

Environment variable    Description
OPENBIS_DATASOURCE_URL  The vocabulary database host API address
OPENBIS_USER_NAME       The vocabulary database user name
OPENBIS_USER_PASSWORD   The vocabulary database password

The application properties file could look like the following:

openbis.user.name=${OPENBIS_USER_NAME:openbis-username}
openbis.user.password=${OPENBIS_USER_PASSWORD:openbis-password}
openbis.datasource.url=${OPENBIS_DATASOURCE_URL:openbis-url}

Properties

The environment variables can either be set in the runtime configuration of your IDE or directly in the application properties file:

server.port=${DM_PORT:8080}
logging.level.org.atmosphere=warn
spring.mustache.check-template-location=false
# Launch the default browser when starting the application in development mode
vaadin.launch-browser=true
# To improve the performance during development.
# For more information https://vaadin.com/docs/flow/spring/tutorial-spring-configuration.html#special-configuration-parameters
vaadin.whitelisted-packages=com.vaadin,org.vaadin,dev.hilla,life.qbic
# Database setup configuration
spring.datasource.url=${USER_DB_URL:localhost}
spring.datasource.driver-class-name=${USER_DB_DRIVER:com.mysql.cj.jdbc.Driver}
spring.datasource.username=${USER_DB_USER_NAME:myusername}
spring.datasource.password=${USER_DB_USER_PW:astrongpassphrase!}
spring.jpa.hibernate.ddl-auto=update
# mail configuration
spring.mail.username=${MAIL_USERNAME}
spring.mail.password=${MAIL_PASSWORD}
spring.mail.host=${MAIL_HOST:smtp.gmail.com}
spring.mail.default-encoding=UTF-8
spring.mail.port=${MAIL_PORT:587}
# global service route configuration for mail interaction requests
service.host.name=${DM_SERVICE_HOST:localhost}
service.host.protocol=${DM_HOST_PROTOCOL:https}
service.host.port=${DM_SERVICE_PORT:-1}
# Set the context path, for example if your app runs behind a proxy
server.servlet.context-path=${DM_SERVICE_CONTEXT_PATH:}
# route for mail confirmation consumption
email-confirmation-endpoint=${EMAIL_CONFIRMATION_ENDPOINT:login}
email-confirmation-parameter=${EMAIL_CONFIRMATION_PARAMETER:confirm-email}
# route for password reset
password-reset-endpoint=${PASSWORD_RESET_ENDPOINT:new-password}
password-reset-parameter=${PASSWORD_RESET_PARAMETER:user-id}
# openbis-client credentials
openbis.user.name=${OPENBIS_USER_NAME:openbis-username}
openbis.user.password=${OPENBIS_USER_PASSWORD:openbis-password}
openbis.datasource.url=${OPENBIS_DATASOURCE_URL:openbis-url}

Secret handling

Data Manager uses a custom vault concept for storing application secrets that builds upon a Java keystore at its core.

We currently use the PKCS12 store type and encrypt every secret with AES.

Therefore, in order to run the application, a keystore must be set up beforehand.

Setup keystore

Before the Java keystore can be referenced in Data Manager's configuration, it has to be created in the first place.

You will need keytool for this step.

Start the setup with a dummy entry to create a keystore file in PKCS12 format:

 keytool -genkeypair -alias dummy -keyalg RSA -keysize 2048 -keystore keystore.p12 -storetype PKCS12 -storepass mysecretpassword -dname "CN=Dummy, OU=Test, O=Company, L=City, ST=State, C=US"

This secures the keystore with the password mysecretpassword. Change it to a strong password that only you have access to. The entropy of your password is calculated as follows:

$$ H = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i) \times n, $$

$$ \begin{aligned} \text{where } & P(x_i) \text{ is the probability of character } x_i, \\ & n \text{ is the total length of the password}. \end{aligned} $$

The application will fail starting, if the total entropy is below 100.
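As a minimal illustration of the formula above (not the application's actual implementation), the check can be sketched like this:

```python
import math
from collections import Counter

def password_entropy(password: str) -> float:
    """Shannon entropy per character, multiplied by the password length."""
    n = len(password)
    counts = Counter(password)
    per_char = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return per_char * n

password_entropy("aaaaaaaa")          # 0.0 — a single repeated character carries no entropy
password_entropy("mysecretpassword")  # well below 100, so it would be rejected
```

This also shows why the dummy password from the keytool example above must not be reused as the real keystore password.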

Now remove the dummy entry to obtain a truly empty keystore:

 keytool -delete -alias dummy -keystore keystore.p12 -storetype PKCS12 -storepass mysecretpassword

Verify:

keytool -list -keystore keystore.p12 -storetype PKCS12 -storepass mysecretpassword

which should show something like:

Keystore type: PKCS12
Keystore provider: SUN

Your keystore contains 0 entries

Configure vault in Data Manager

You need these three properties configured properly to operate the vault within the app. The vault key is read from the local environment variable DATAMANAGER_VAULT_KEY; this mechanism will be hardened further in future releases.

For the secret entries themselves, define a strong secret in the DATAMANAGER_VAULT_ENTRY_PASSWORD environment variable. The same strength requirements apply here as well.

# Sets the environment variable that contains the vault key
qbic.security.vault.key.env=DATAMANAGER_VAULT_KEY
# Since it will be a PKCS12 file, let the file end with *.p12
qbic.security.vault.path=${DATAMANAGER_VAULT_PATH:keystore.p12}
# The password used for encrypting entries in the vault. Must be longer than 20 characters.
qbic.security.vault.entry.password.env=DATAMANAGER_VAULT_ENTRY_PASSWORD

Local testing

The default configuration of the app binds to port 8080 on the system's localhost.
After starting the application, it will be accessible at http://localhost:8080 in a browser of your choice.

Production Deployment

To create a production build, call mvnw clean package -Pproduction (Windows), or ./mvnw clean package -Pproduction (Mac & Linux).
This will build a JAR file with all the dependencies and front-end resources, ready to be deployed. The file can be found in the target folder after the build completes:

|-target
|---datamanager-{version}.jar
|---...

Once the JAR file is built, you can run it using java -jar target/datamanager-{version}.jar

How To Use

This guide showcases the features of the data manager application.

User Login

After startup, the data manager application will redirect the user to the login screen, hosted by default at

http://localhost:8080/login

This view enables the user to log in to an already existing account by providing the required credentials.


Additionally, in this screen the user can request a password reset for their account if necessary.
The user will then be contacted via the provided email address with the steps necessary to perform a password reset.
Finally, the user is able to switch to the registration screen by clicking on the register button or the registration link.

User Registration

This view is accessible by clicking on the register button or the registration link in the Login Screen. It is hosted by default at:

http://localhost:8080/register

This view enables the user to register a new account by providing the required credentials:


After successful registration the user will be contacted via the provided email address with the steps necessary to authenticate the generated account.

Application overview

Project structure

The project is composed of a multi-module Maven structure divided into a domain and a webapp module. The domain module hosts the business logic for user and data management.

Examples include:

  • UserRegistrationService.java in src/main/java/life/qbic/apps/datamanager/services/ contains the application service used to register users for the user management domain context.
  • policies package in src/main/java/domain/usermanagement/ contains the business logic to validate provided user information.
  • repository folder in src/main/java/domain/usermanagement/ contains the connection logic between the application and the user database.

In contrast, the webapp module hosts the frontend components and services provided in the application.

Examples include:

  • MainLayout.java in src/main/java/views/ contains the navigation setup (i.e., the side/top bar and the main menu). This setup uses App Layout.
  • views package in src/main/java contains the server-side Java views of your application.
  • views folder in frontend/ contains the client-side JavaScript views of your application.
  • themes folder in frontend/ contains the custom CSS styles.

Vaadin Framework

This application employs the frontend components released in version 23 of the Vaadin framework.

Additional Information

License

This work is licensed under the MIT license.

This work uses the Vaadin Framework, which is licensed under Apache 2.0.

The University of Tübingen logo is a registered trademark and the copyright is owned by the University of Tübingen.
