AI Model Marketplace
The Paramus AI Marketplace provides a curated portfolio of academic chemistry models and HPC components, enabling IT and scientific teams to integrate advanced AI into large-scale computational workflows.

These models are one-click installable programs that follow the Paramus AI Runtime Specification.
TransPolymer
Free (MIT)
Predicts polymer properties using transformer-based deep learning models trained on polymer structure–property datasets. Designed for inverse design and polymer informatics workflows.
562 MB
Xu, C.; Wang, Y.; Barati Farimani, A.
“TransPolymer: a Transformer-based language model for polymer property predictions.” npj Computational Materials 9, 64 (2023).
DOI:10.1038/s41524-023-01009-9
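A minimal sketch of how a transformer-based property predictor such as TransPolymer could be queried once installed locally; the checkpoint path, the single-value regression head, and the SMILES input below are illustrative assumptions, not part of the official TransPolymer release.

```python
# Hypothetical example: checkpoint directory and regression head are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "./transpolymer-checkpoint"  # hypothetical local checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=1)
model.eval()

# Polymer repeat unit written as a SMILES-like string (illustrative only)
repeat_unit = "[*]CC([*])c1ccccc1"  # polystyrene
inputs = tokenizer(repeat_unit, return_tensors="pt")

with torch.no_grad():
    value = model(**inputs).logits.item()
print(f"Predicted property value: {value:.3f}")
```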
PolyNC
Free (Apache 2.0)
A unified natural & chemical language model (text-to-text) for predicting polymer properties (multi-task: regression + classification).
1.02 GB
Qiu, H.; Liu, L.; Qiu, X.; Dai, X.; Ji, X.; Sun, Z.-Y.
“PolyNC: a natural and chemical language model for unified polymer properties prediction.” Chemical Science (2024).
DOI: 10.1039/D3SC05079C
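A hedged sketch of the text-to-text usage pattern PolyNC describes (task prompt in, property value out); the checkpoint path and prompt format are hypothetical placeholders, not the published interface.

```python
# Hypothetical example: checkpoint path and task-prefix prompt are placeholders.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "./polync-checkpoint"  # hypothetical local checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Natural-language task prefix plus a polymer SMILES string
prompt = "Predict Tg: [*]CC([*])c1ccccc1"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```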
PolyTAO
Free (Apache 2.0)
A Transformer-Assisted Oriented pretrained model for on-demand polymer generation (conditional generative LLM). Generates polymers with 15 predefined fundamental properties. Achieves ~99.3 % chemical validity in top-1 generation.
Qiu, H.; Sun, Z.-Y.
“On-Demand Reverse Design of Polymers with PolyTAO.” npj Computational Materials 10, 273 (2024). DOI:10.1038/s41524-024-01466-5
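A sketch of on-demand conditional generation: target property values are encoded into a prompt and candidate polymers are sampled from the model. The checkpoint path and the property-tag prompt format are assumptions for illustration only.

```python
# Hypothetical example: checkpoint path and property-tag prompt format are placeholders.
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "./polytao-checkpoint"  # hypothetical local checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Encode target properties as a conditioning prefix (format is illustrative)
prompt = "<Tg=380K><density=1.05g/cm3> "
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```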
Polyply
Free (Apache 2.0)
Generates parameters and coordinates for atomistic & coarse-grained polymer MD simulations (force-field and topology agnostic).
Grünewald, F.; Alessandri, R.; Kroon, P. C.; Monticelli, L.; Souza, P. C. T.; Marrink, S. J. “Polyply: a python suite for facilitating simulations of (bio-) macromolecules and nanomaterials.” Nature Communications 13, 68 (2022). DOI:10.1038/s41467-021-27627-4
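Polyply is driven from the command line; the sketch below wraps two of its subcommands (gen_params and gen_coords) from Python. The specific flags, library, and sequence names are illustrative and should be checked against `polyply --help` for the installed version.

```python
# Illustrative wrapper around the polyply CLI; verify flags for your installed version.
import subprocess

# Generate an .itp topology for a PEO chain from a bundled force-field library
subprocess.run(
    ["polyply", "gen_params", "-lib", "martini3",
     "-seq", "PEO:50", "-name", "PEO", "-o", "PEO.itp"],
    check=True,
)

# Build starting coordinates for the system described in topol.top
subprocess.run(
    ["polyply", "gen_coords", "-p", "topol.top",
     "-o", "system.gro", "-name", "PEO_melt", "-box", "10", "10", "10"],
    check=True,
)
```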
wD-MPNN
Free (MIT)
Weighted Directed Message Passing Neural Network (wD-MPNN) adapted for polymer property prediction – represents polymer ensembles as graphs, integrates monomer stoichiometry, chain architecture, and degree of polymerization.
Aldeghi, M.; Coley, C. W.
“A graph representation of molecular ensembles for polymer property prediction.” Chemical Science 13, 10486–10498 (2022). DOI:10.1039/D2SC02839E
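The core idea of the weighted graph representation can be illustrated with a small data structure: monomers carry stoichiometric fractions and directed edges carry connection probabilities. The dictionary layout below is a simplification for illustration, not the authors' data format.

```python
# Simplified illustration of a polymer-ensemble graph with stoichiometry and
# edge weights; this dict layout is not the authors' data format.
copolymer = {
    # Repeat units with their stoichiometric fractions
    "monomers": {
        "A": {"smiles": "[*]CC([*])C(=O)OC", "fraction": 0.6},
        "B": {"smiles": "[*]CC([*])c1ccccc1", "fraction": 0.4},
    },
    # Directed edges weighted by the probability that unit X is followed by unit Y
    # (here: a statistical/random copolymer)
    "edges": {
        ("A", "A"): 0.6, ("A", "B"): 0.4,
        ("B", "A"): 0.6, ("B", "B"): 0.4,
    },
    "degree_of_polymerization": 100,
}

# A weighted message-passing network propagates messages along each edge scaled
# by its weight and pools node embeddings weighted by the stoichiometric fractions.
print(copolymer["monomers"]["A"]["fraction"])
```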
Reference Model
Free
This model serves as a reference implementation of the Paramus Light AI Model Specification 1.0.
High Performance Computing (HPC) Applications
HPC enables scalable simulation, modeling, and analysis of chemical systems. In quantum chemistry (QC), HPC is crucial for performing accurate electronic structure calculations at high theory levels, enabling reliable predictions for molecular design and reactivity.

ORCA
Academic free, paid commercial
A versatile quantum chemistry program supporting DFT, ab initio, and semi-empirical methods. ORCA enables accurate calculations of molecular structures, spectra, and reaction mechanisms and serves as a backend for AI-assisted computational chemistry workflows in PARAMUS.
Neese, F.
The ORCA program system. WIREs Comput. Mol. Sci. 12, e1606 (2022).
DOI:10.1002/wcms.1606
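A minimal sketch of driving an ORCA calculation from Python: write a standard ORCA input file and invoke the `orca` binary via subprocess. It assumes the binary is on PATH; for parallel runs ORCA expects to be called with its full installation path.

```python
# Assumes the `orca` executable is available on PATH; input follows standard ORCA syntax.
import subprocess
from pathlib import Path

inp = """\
! B3LYP def2-SVP Opt
* xyz 0 1
O   0.000000   0.000000   0.000000
H   0.000000   0.000000   0.960000
H   0.930000   0.000000  -0.240000
*
"""
Path("water.inp").write_text(inp)

# ORCA prints results to stdout; redirect them into a log file
with open("water.out", "w") as log:
    subprocess.run(["orca", "water.inp"], stdout=log, check=True)
```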
LAMMPS
Academic, non-commercial use only
A massively parallel classical molecular dynamics engine for atomistic and coarse-grained simulations of materials and polymers, supporting a wide range of force fields and interatomic potentials and serving as a simulation backend for structure–property studies in polymer informatics workflows.
Plimpton, S.
“Fast Parallel Algorithms for Short-Range Molecular Dynamics.” J. Comput. Phys. 117, 1–19 (1995).
DOI:10.1006/jcph.1995.1039
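A short sketch of running a LAMMPS simulation through its Python interface (requires LAMMPS built as a shared library with the `lammps` Python package installed); the input-script name is illustrative.

```python
# Requires LAMMPS built as a shared library with its Python package installed;
# "in.polymer_melt" is an illustrative input-script name.
from lammps import lammps

lmp = lammps()
lmp.file("in.polymer_melt")   # run a conventional LAMMPS input script
lmp.command("run 10000")      # extend the trajectory by 10,000 timesteps
print(f"System contains {lmp.get_natoms()} atoms")
lmp.close()
```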
PSI4
Free (LGPL 3)
A modular quantum chemistry suite supporting HF, DFT, MP2, coupled cluster (CC), SAPT, and related methods.
Smith, D. G. A. et al.
“Psi4 1.4: Open-Source Software for High-Throughput Quantum Chemistry.” J. Chem. Phys. 152, 184108 (2020).
DOI:10.1063/5.0006002
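Psi4 also exposes a native Python API; the example below computes an MP2 single-point energy for water (method, basis, and geometry chosen purely for illustration).

```python
# MP2/cc-pVDZ single-point energy for water; method and basis chosen for illustration.
import psi4

psi4.set_memory("2 GB")
water = psi4.geometry("""
0 1
O  0.000000  0.000000  0.000000
H  0.000000  0.757000  0.587000
H  0.000000 -0.757000  0.587000
""")
energy = psi4.energy("mp2/cc-pvdz", molecule=water)
print(f"MP2/cc-pVDZ energy: {energy:.6f} Eh")
```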
Large Language Model (LLM) Local Runtime
Running large language models locally enables full data privacy and offline inference without dependency on external servers.

llama 3.2:1b
Llama 3.2 Community License (commercial use permitted with specified attribution)
Provides lightweight natural language understanding and generation suitable for embedded and edge devices. Its compact size enables efficient inference on consumer-grade GPUs and even high-performance CPUs. Despite its small footprint, it supports multilingual text processing, basic reasoning, and customizable fine-tuning for specialized tasks.
Meta AI. Llama 3.2 1B (1.23-billion-parameter multilingual language model), released 25 September 2024, https://huggingface.co/meta-llama/Llama-3.2-1B
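A minimal local-inference sketch using the Hugging Face checkpoint cited above; access to the gated meta-llama/Llama-3.2-1B repository and acceptance of the Llama 3.2 Community License are assumed, and the generation settings are illustrative.

```python
# Assumes access to the gated meta-llama/Llama-3.2-1B repository on Hugging Face;
# generation settings are illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B",
    device_map="auto",  # falls back to CPU if no GPU is available
)
prompt = "Summarize the role of HPC in quantum chemistry in one sentence."
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```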
Monetize Without Losing Control
Paramus.ai provides a secure AI model marketplace that enables external vendors to monetize their models with full cost carry-over and transparent revenue models. Intellectual property remains fully protected.
Empower Your Models
Paramus acts only as a distribution and licensing platform. Vendors gain access to a qualified R&D audience across academia and industry without operational overhead.
