# OmTest

**Repository Path**: jungle/omtest

## Basic Information

- **Project Name**: OmTest
- **Description**: A small test framework for network devices.
- **Primary Language**: Python
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2026-02-01
- **Last Updated**: 2026-03-22

## Categories & Tags

**Categories**: Uncategorized
**Tags**: Testing

## README

Py-Tester
=========

Small test framework that discovers `TestSuite` objects from `testcases/`.

Quick start
-----------

- Configure the device connection in `config.json` (repo root), or override it with CLI args.
- Add test modules in `testcases/`; each module should expose `TestSuite` objects at module level.
- Run:

```powershell
python main.py
```

CLI precedence and config
-------------------------

- The framework loads device info from `config.json` by default.
- Command-line options override `config.json` values.

Supported flags:

- `--config` / `-c`: path to the config file
- `--host`, `--port`, `--username`, `--password`
- `--prompt`, `--login-prompt`, `--passwd-prompt`
- `--dry-run`: do not open real network connections (uses `DummyDevice`)
- `--filter` / `-f`: comma-separated case-id patterns (supports the `*` wildcard). Example: `--filter tc_*` runs cases whose `case_id` matches the pattern.
- `--level`: levels to run; accepts a comma-separated list and ranges. Examples: `--level 1` or `--level 1-2,4`.
- `--report`: output format, `plain` or `json` (default `plain`).
- `--report-file`: write the report to the given file path instead of stdout.
- `--cases-dir`: package name or directory path where the test cases live. Default: `testcases` (project subdirectory). Example: `--cases-dir ./my_tests` or `--cases-dir my_tests_pkg`.

Test module authoring (summary)
-------------------------------

- Define `TestCase` and `TestSuite` objects at module scope; the framework auto-discovers them.
- Use module-level hook functions if needed:
  - `suite_setup(dev)` / `suite_teardown(dev)`: run once per suite
  - `setup(dev)` / `teardown(dev)`: run for each case in the module
- Ensure `suite_id` is unique.

Examples and more details: see `testcases/README.md`.

Examples
--------

- Run a dry run with a JSON report saved to `out.json`:

```powershell
python -m omtest --dry-run --report json --report-file out.json
```

- Run only level-2 tests and filter case ids that start with `tc_`:

```powershell
python -m omtest --host 192.0.2.10 --level 2 --filter tc_*
```

Capabilities
------------

Tests can declare required device capabilities with the `requires_capabilities` decorator exported by the package. If the running device does not provide the required capabilities, the test case is skipped. Example usage in a test module:

```python
from omtest.capabilities import requires_capabilities

@requires_capabilities('ftp', 'judge')
def test_judge_with_ftp(dev):
    # This test runs only if the device reports both 'ftp' and 'judge' capabilities.
    ...
```

Devices should implement `has_capability(cap)` or expose a `capabilities` set. The `DummyDevice` used with `--dry-run` includes a small set of capabilities for demonstration.

Configuration-driven capability checks
--------------------------------------

You can instruct the test runner to probe a device for capabilities at connect time by adding a `capability` mapping under the `device` section of `config.json`. The mapping keys are capability names and each value is a list of checks. Each check may be an object with `cmd` and `expect`, or a two-element array `[cmd, expect]`. The `expect` entry can be a string or a list of strings; when a list is provided, at least one element must match the command output. All checks defined for a capability must pass (logical AND) for the capability to be considered present.
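The check semantics just described (AND across a capability's checks, OR within an `expect` list, substring matching) can be sketched as follows. This is a hypothetical illustration, not the framework's actual implementation; the helper names `check_passes`, `capability_present`, and `run_cmd` are made up for the example.

```python
# Hypothetical sketch of the capability-check semantics described above.
# Not omtest's real code; see the framework's capability manager for the
# actual logic.

def check_passes(output, expect=None):
    """Evaluate one check against a command's output.

    No `expect`: any non-empty output counts as success.
    String `expect`: substring match.
    List `expect`: at least one element must match (logical OR).
    """
    if expect is None:
        return bool(output)
    if isinstance(expect, str):
        expect = [expect]
    return any(e in output for e in expect)


def capability_present(checks, run_cmd):
    """A capability is present only if ALL of its checks pass (logical AND).

    `run_cmd` is assumed to execute a command on the device and return
    its textual output. Each check is either a {"cmd", "expect"} object
    or a two-element [cmd, expect] array, mirroring the config format.
    """
    for check in checks:
        if isinstance(check, dict):
            cmd, expect = check["cmd"], check.get("expect")
        else:  # two-element array form: [cmd, expect]
            cmd, expect = check
        if not check_passes(run_cmd(cmd), expect):
            return False
    return True
```

For instance, a `judge` capability defined as `[["display judge brief", "Judge Mode : ACM"]]` is present only when that substring appears in the output of `display judge brief`.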
Example `config.json` snippet:

```json
{
  "device": {
    "host": "192.0.2.10",
    "port": 23,
    "username": "admin",
    "password": "secret",
    "capability": {
      "ftp": [
        { "cmd": "display version", "expect": "FTP server" }
      ],
      "judge": [
        ["display judge brief", "Judge Mode : ACM"]
      ],
      "local-user": [
        { "cmd": "display user", "expect": ["root", "admin"] }
      ]
    }
  }
}
```

Notes:

- If `expect` is omitted, any non-empty command output is treated as success for that check.
- Matching is substring-based. If you need regex matching or more complex logic, extend `pytester.devices.CapacityManager.collect` accordingly.
- In `--dry-run` mode, `DummyDevice` provides some example capabilities; capability checks are not executed against real network devices.

Build & Publish
---------------

Build source and wheel distributions locally:

```bash
python -m pip install --upgrade build wheel twine
python -m build
```

Check the produced files in the `dist/` directory. To upload to PyPI (Test PyPI recommended first):

```bash
# upload to Test PyPI
python -m twine upload --repository testpypi dist/*

# upload to PyPI
python -m twine upload dist/*
```

Before uploading, update the `setup.cfg` metadata (author, email, url, version) as appropriate.

Notes:

- Version single-source: the package reads `version = attr: omtest.__version__` from `setup.cfg`. To bump the package version, update the `__version__` variable in `omtest/__init__.py`.
- After updating the version, run the build commands above and upload the new distribution.
- Add any runtime dependencies to `setup.cfg` under `install_requires`.
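As a closing illustration, the `--filter` wildcard and `--level` range syntax described in the CLI section can be parsed roughly as sketched below. This is a hypothetical sketch using the standard library's `fnmatch`; the function names are illustrative and are not part of omtest's actual API.

```python
# Hypothetical sketch of --filter / --level parsing as described in the
# CLI section; not omtest's real internals.
from fnmatch import fnmatch


def parse_levels(spec):
    """Parse a --level spec like '1-2,4' into the set {1, 2, 4}."""
    levels = set()
    for part in spec.split(","):
        if "-" in part:
            lo, hi = part.split("-", 1)
            levels.update(range(int(lo), int(hi) + 1))
        else:
            levels.add(int(part))
    return levels


def case_matches(case_id, filter_spec):
    """True if case_id matches any comma-separated pattern ('*' wildcard)."""
    return any(fnmatch(case_id, pat) for pat in filter_spec.split(","))
```

Under this reading, `--level 1-2,4 --filter tc_*` would select cases at levels 1, 2, or 4 whose `case_id` begins with `tc_`.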