78 changes: 78 additions & 0 deletions README.md
@@ -14,6 +14,7 @@ system debug.
- ['run-plugins' sub command](#run-plugins-sub-command)
- ['gen-plugin-config' sub command](#gen-plugin-config-sub-command)
- ['compare-runs' subcommand](#compare-runs-subcommand)
- ['show-redfish-oem-allowable' subcommand](#show-redfish-oem-allowable-subcommand)
- ['summary' sub command](#summary-sub-command)
- [Configs](#configs)
- [Global args](#global-args)
@@ -116,6 +117,8 @@ node-scraper --sys-name <remote_host> --sys-location REMOTE --connection-config

##### Example: connection_config.json

In-band (SSH) connection:

```json
{
"InBandConnectionManager": {
@@ -128,6 +131,24 @@ node-scraper --sys-name <remote_host> --sys-location REMOTE --connection-config
}
```

Redfish (BMC) connection for Redfish-only plugins:

```json
{
"RedfishConnectionManager": {
"host": "bmc.example.com",
"port": 443,
"username": "admin",
"password": "secret",
"use_https": true,
"verify_ssl": true,
"api_root": "redfish/v1"
}
}
```

- `api_root` (optional): Redfish API root path. Defaults to `redfish/v1`; override it when your BMC exposes a different API version path.
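As an illustration of how the connection fields above combine, here is a small helper (not part of node-scraper; the function name is invented) that builds the Redfish base URL from `host`, `port`, `use_https`, and `api_root`:

```python
def redfish_base_url(
    host: str,
    port: int = 443,
    use_https: bool = True,
    api_root: str = "redfish/v1",
) -> str:
    """Build the base URL for Redfish requests from connection-config fields."""
    scheme = "https" if use_https else "http"
    # Omit the port when it is the default for the chosen scheme.
    default_port = 443 if use_https else 80
    netloc = host if port == default_port else f"{host}:{port}"
    return f"{scheme}://{netloc}/{api_root.strip('/')}"
```

With the example config above, this yields `https://bmc.example.com/redfish/v1`.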

**Notes:**
- If using SSH keys, specify `key_filename` instead of `password`.
- The remote user must have permissions to run the requested plugins and access required files. If needed, use the `--skip-sudo` argument to skip plugins requiring sudo.
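For SSH key authentication, the same connection config carries `key_filename` in place of `password`. A sketch (field values, and the `hostname`/`username` field names, are illustrative assumptions here):

```json
{
    "InBandConnectionManager": {
        "hostname": "remote.example.com",
        "username": "admin",
        "key_filename": "/home/admin/.ssh/id_ed25519"
    }
}
```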
@@ -319,6 +340,63 @@ node-scraper compare-runs path1 path2 --include-plugins DmesgPlugin --dont-trunc

You can pass multiple plugin names to `--skip-plugins` or `--include-plugins`.

#### **'show-redfish-oem-allowable' subcommand**
The `show-redfish-oem-allowable` subcommand fetches the list of OEM diagnostic types supported by your BMC (from the Redfish LogService `OEMDiagnosticDataType@Redfish.AllowableValues`). Use it to discover which types you can put in `oem_diagnostic_types_allowable` and `oem_diagnostic_types` in the Redfish OEM diag plugin config.

**Requirements:** A Redfish connection config (same as for RedfishOemDiagPlugin).

**Command:**
```sh
node-scraper --connection-config connection-config.json show-redfish-oem-allowable --log-service-path "redfish/v1/Systems/UBB/LogServices/DiagLogs"
```

Output is a JSON array of allowable type names (e.g. `["Dmesg", "JournalControl", "AllLogs", ...]`). Copy that list into your plugin config’s `oem_diagnostic_types_allowable` to keep it in sync with what your BMC supports.
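For reference, this is roughly how such a list can be read out of a raw Redfish LogService resource. The sketch below is illustrative, not node-scraper's implementation: the function name and sample payload are invented, while the action and annotation names follow the standard Redfish `LogService.CollectDiagnosticData` pattern quoted above.

```python
def extract_oem_allowable(log_service: dict) -> list[str]:
    """Return the OEM diagnostic allowable values from a parsed LogService resource."""
    # Redfish advertises the permitted values for an action parameter via the
    # "<Param>@Redfish.AllowableValues" annotation on the action object.
    actions = log_service.get("Actions", {})
    collect = actions.get("#LogService.CollectDiagnosticData", {})
    return collect.get("OEMDiagnosticDataType@Redfish.AllowableValues", [])
```

Given a GET of the LogService path, `extract_oem_allowable(response_json)` returns the same list the subcommand prints, or `[]` if the BMC does not advertise any OEM types.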

**Redfish OEM diag plugin config example**

Use a plugin config that points at your LogService and lists the types to collect. Logs are written under the run log path (see `--log-path`).

```json
{
"name": "Redfish OEM diagnostic logs",
"desc": "Collect OEM diagnostic logs from Redfish LogService. Requires Redfish connection config.",
"global_args": {},
"plugins": {
"RedfishOemDiagPlugin": {
"collection_args": {
"log_service_path": "redfish/v1/Systems/UBB/LogServices/DiagLogs",
"oem_diagnostic_types_allowable": [
"JournalControl",
"AllLogs",
...
],
"oem_diagnostic_types": ["JournalControl", "AllLogs"],
"task_timeout_s": 600
},
"analysis_args": {
"require_all_success": false
}
}
},
"result_collators": {}
}
```

- **`log_service_path`**: Redfish path to the LogService (e.g. DiagLogs). Must match your system (e.g. `UBB` vs. another system id).
- **`oem_diagnostic_types_allowable`**: Full list of types the BMC supports (from `show-redfish-oem-allowable` or vendor docs).
- **`oem_diagnostic_types`**: Subset of types to collect on each run (e.g. `["JournalControl", "AllLogs"]`).
- **`task_timeout_s`**: Max seconds to wait per collection task.
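The fields above imply two easy-to-check invariants: every requested type must appear in the allowable list, and the timeout must be positive. A minimal pre-run sanity check, assuming only the field names shown in the example config (the function itself is not part of node-scraper):

```python
def check_oem_diag_config(cfg: dict) -> list[str]:
    """Return a list of problems found in the plugin's collection_args."""
    problems = []
    args = cfg["plugins"]["RedfishOemDiagPlugin"]["collection_args"]
    allowable = set(args.get("oem_diagnostic_types_allowable", []))
    requested = set(args.get("oem_diagnostic_types", []))
    # Every type selected for collection must be advertised by the BMC.
    unknown = requested - allowable
    if unknown:
        problems.append(f"not in allowable list: {sorted(unknown)}")
    if args.get("task_timeout_s", 0) <= 0:
        problems.append("task_timeout_s must be a positive number of seconds")
    return problems
```

An empty return value means the config is internally consistent; any strings returned describe what to fix before running the plugin.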

**How to use**

1. **Discover allowable types** (optional): run `show-redfish-oem-allowable` and paste the output into `oem_diagnostic_types_allowable` in your plugin config.
2. **Set `oem_diagnostic_types`** to the list you want to collect (e.g. `["JournalControl", "AllLogs"]`).
3. **Run the plugin** with a Redfish connection config and your plugin config:
```sh
node-scraper --connection-config connection-config.json --plugin-config plugin_config_redfish_oem_diag.json run-plugins RedfishOemDiagPlugin
```
4. Use **`--log-path`** to choose where run logs (and OEM diag archives) are written.

#### **'summary' sub command**
The 'summary' subcommand combines results from multiple runs of node-scraper into a single summary.csv file. Sample run:
4 changes: 4 additions & 0 deletions nodescraper/base/__init__.py
@@ -25,10 +25,14 @@
###############################################################################
from .inbandcollectortask import InBandDataCollector
from .inbanddataplugin import InBandDataPlugin
from .oobanddataplugin import OOBandDataPlugin
from .redfishcollectortask import RedfishDataCollector
from .regexanalyzer import RegexAnalyzer

__all__ = [
"InBandDataCollector",
"InBandDataPlugin",
"OOBandDataPlugin",
"RedfishDataCollector",
"RegexAnalyzer",
]
137 changes: 1 addition & 136 deletions nodescraper/base/inbanddataplugin.py
@@ -23,16 +23,11 @@
# SOFTWARE.
#
###############################################################################
import json
import os
from pathlib import Path
from typing import Any, Generic, Optional
from typing import Generic

from nodescraper.connection.inband import InBandConnectionManager, SSHConnectionParams
from nodescraper.generictypes import TAnalyzeArg, TCollectArg, TDataModel
from nodescraper.interfaces import DataPlugin
from nodescraper.models import DataModel
from nodescraper.utils import pascal_to_snake


class InBandDataPlugin(
@@ -42,133 +37,3 @@ class InBandDataPlugin(
"""Base class for in band plugins."""

CONNECTION_TYPE = InBandConnectionManager

@classmethod
def find_datamodel_path_in_run(cls, run_path: str) -> Optional[str]:
"""Find this plugin's collector datamodel file under a scraper run directory.

Args:
run_path: Path to a scraper log run directory (e.g. scraper_logs_*).

Returns:
Absolute path to the datamodel file, or None if not found.
"""
run_path = os.path.abspath(run_path)
if not os.path.isdir(run_path):
return None
collector_cls = getattr(cls, "COLLECTOR", None)
data_model_cls = getattr(cls, "DATA_MODEL", None)
if not collector_cls or not data_model_cls:
return None
collector_dir = os.path.join(
run_path,
pascal_to_snake(cls.__name__),
pascal_to_snake(collector_cls.__name__),
)
if not os.path.isdir(collector_dir):
return None
result_path = os.path.join(collector_dir, "result.json")
if not os.path.isfile(result_path):
return None
try:
res_payload = json.loads(Path(result_path).read_text(encoding="utf-8"))
if res_payload.get("parent") != cls.__name__:
return None
except (json.JSONDecodeError, OSError):
return None
want_json = data_model_cls.__name__.lower() + ".json"
for fname in os.listdir(collector_dir):
low = fname.lower()
if low.endswith("datamodel.json") or low == want_json:
return os.path.join(collector_dir, fname)
if low.endswith(".log"):
return os.path.join(collector_dir, fname)
return None

@classmethod
def load_datamodel_from_path(cls, dm_path: str) -> Optional[TDataModel]:
"""Load this plugin's DATA_MODEL from a file path (JSON or .log).

Args:
dm_path: Path to datamodel JSON or to a .log file (if DATA_MODEL
implements import_model for that format).

Returns:
Instance of DATA_MODEL or None if load fails.
"""
dm_path = os.path.abspath(dm_path)
if not os.path.isfile(dm_path):
return None
data_model_cls = getattr(cls, "DATA_MODEL", None)
if not data_model_cls:
return None
try:
if dm_path.lower().endswith(".log"):
import_model = getattr(data_model_cls, "import_model", None)
if not callable(import_model):
return None
base_import = getattr(DataModel.import_model, "__func__", DataModel.import_model)
if getattr(import_model, "__func__", import_model) is base_import:
return None
return import_model(dm_path)
with open(dm_path, encoding="utf-8") as f:
data = json.load(f)
return data_model_cls.model_validate(data)
except (json.JSONDecodeError, OSError, Exception):
return None

@classmethod
def get_extracted_errors(cls, data_model: DataModel) -> Optional[list[str]]:
"""Compute extracted errors from datamodel for compare-runs (in memory only).

Args:
data_model: Loaded DATA_MODEL instance.

Returns:
Sorted list of error match strings, or None if not applicable.
"""
get_content = getattr(data_model, "get_compare_content", None)
if not callable(get_content):
return None
try:
content = get_content()
except Exception:
return None
if not isinstance(content, str):
return None
analyzer_cls = getattr(cls, "ANALYZER", None)
if not analyzer_cls:
return None
get_matches = getattr(analyzer_cls, "get_error_matches", None)
if not callable(get_matches):
return None
try:
matches = get_matches(content)
return sorted(matches) if matches is not None else None
except Exception:
return None

@classmethod
def load_run_data(cls, run_path: str) -> Optional[dict[str, Any]]:
"""Load this plugin's run data from a scraper run directory for comparison.

Args:
run_path: Path to a scraper log run directory or to a datamodel file.

Returns:
Dict suitable for diffing with another run, or None if not found.
"""
run_path = os.path.abspath(run_path)
if not os.path.exists(run_path):
return None
dm_path = run_path if os.path.isfile(run_path) else cls.find_datamodel_path_in_run(run_path)
if not dm_path:
return None
data_model = cls.load_datamodel_from_path(dm_path)
if data_model is None:
return None
out = data_model.model_dump(mode="json")
extracted = cls.get_extracted_errors(data_model)
if extracted is not None:
out["extracted_errors"] = extracted
return out
48 changes: 48 additions & 0 deletions nodescraper/base/oobanddataplugin.py
@@ -0,0 +1,48 @@
###############################################################################
#
# MIT License
#
# Copyright (c) 2026 Advanced Micro Devices, Inc.
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
#
###############################################################################
from typing import Generic

from nodescraper.connection.redfish import (
RedfishConnectionManager,
RedfishConnectionParams,
)
from nodescraper.generictypes import TAnalyzeArg, TCollectArg, TDataModel
from nodescraper.interfaces import DataPlugin


class OOBandDataPlugin(
DataPlugin[
RedfishConnectionManager,
RedfishConnectionParams,
TDataModel,
TCollectArg,
TAnalyzeArg,
],
Generic[TDataModel, TCollectArg, TAnalyzeArg],
):
"""Base class for out-of-band (OOB) plugins that use Redfish connection."""

CONNECTION_TYPE = RedfishConnectionManager