Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings.
📝 Walkthrough

Centralized notification section metadata was added, and a broad camelCase migration was applied across schema, migrations, DB init, server models, APIs, front-end configs, plugins, reporting/notification generation, and tests to use unified camelCase field names and data-driven SECTION_* constants.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant Scheduler as Scheduler / Cron
    participant Reporter as server/messaging/reporting.py
    participant Meta as server/messaging/notification_sections.py
    participant DB as Database
    participant Renderer as notification_instance.py
    participant Publisher as Publisher (email/webhook/etc.)
    rect rgba(200,200,255,0.5)
        Scheduler->>Reporter: trigger notification run
        Reporter->>Meta: load SECTION_ORDER, SQL_TEMPLATES, DATETIME_FIELDS, SECTION_TITLES
        Reporter->>DB: execute SQL_TEMPLATES[section] (with formatted condition)
        DB-->>Reporter: result rows
        Reporter->>Renderer: supply rows + section metadata
        Renderer->>Meta: resolve template placeholders per section
        Renderer->>Publisher: enqueue/send formatted notifications
        Publisher-->>DB: upsert Notifications, clear/set pending flags
    end
```
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@server/messaging/notification_sections.py`:
- Around line 97-110: Add documentation clarifying behavior when both "events"
and "down_reconnected" notification sections are enabled: explain that "events"
(the SQL query under the "events" key) includes eve_EventType IN ('Down
Reconnected', ...) and the "down_reconnected" section filters exclusively on
eve_EventType = 'Down Reconnected', which will produce duplicate alerts if both
are enabled; update the notification_sections mapping comment or the relevant
docs to state whether users should avoid enabling both simultaneously or that
duplicate notifications are expected/intentional for different formatting
contexts, referencing the "events" and "down_reconnected" section names and the
'Down Reconnected' eve_EventType.
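The duplicate-alert overlap described above is easy to see with a toy table in SQLite. This is a minimal sketch using only the two columns named in the comment; the real Events schema and event-type list differ:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Events (eve_MAC TEXT, eve_EventType TEXT)")
con.executemany(
    "INSERT INTO Events VALUES (?, ?)",
    [("aa:bb", "Down Reconnected"), ("cc:dd", "New Device")],
)

# "events"-style query: a broad IN (...) filter that includes reconnects.
events = con.execute(
    "SELECT eve_MAC FROM Events "
    "WHERE eve_EventType IN ('Down Reconnected', 'New Device')"
).fetchall()

# "down_reconnected"-style query: filters exclusively on reconnects.
reconnected = con.execute(
    "SELECT eve_MAC FROM Events WHERE eve_EventType = 'Down Reconnected'"
).fetchall()

# The reconnect row matches both sections, so enabling both sections
# would alert on the same event twice.
overlap = set(events) & set(reconnected)
print(overlap)  # {('aa:bb',)}
```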
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: c301461a-3d0a-4437-87dd-b32d1f7fde57
📒 Files selected for processing (7)
docs/NOTIFICATION_TEMPLATES.md
front/plugins/notification_processing/config.json
server/messaging/notification_sections.py
server/messaging/reporting.py
server/models/notification_instance.py
test/backend/test_notification_templates.py
test/integration/integration_test.py
- Updated test cases to reflect new column names (eve_MAC -> eveMac, eve_DateTime -> eveDateTime, etc.) across various test files.
- Modified SQL table definitions in the database cleanup and migration tests to use camelCase naming conventions.
- Implemented migration tests to ensure legacy column names are correctly renamed to camelCase equivalents.
- Ensured that existing data is preserved during the migration process and that views referencing old column names are dropped before renaming.
- Verified that the migration function is idempotent, allowing for safe re-execution without data loss.
Actionable comments posted: 18
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (4)
front/plugins/website_monitor/config.json (1)
535-593: ⚠️ Potential issue | 🟠 Major
Add `ordeable` to the multi-select `elementOptions` for compatibility.
Both `WATCH` and `REPORT_ON` define multi-select UI elements but omit the legacy `ordeable` key used by the frontend behavior.
💡 Proposed patch

```diff
 "function": "WATCH",
 "type": {
   "dataType": "array",
   "elements": [
     {
       "elementType": "select",
-      "elementOptions": [{ "multiple": "true" }],
+      "elementOptions": [{ "multiple": "true" }, { "ordeable": "true" }],
       "transformers": []
     }
   ]
 },
@@
 "function": "REPORT_ON",
 "type": {
   "dataType": "array",
   "elements": [
     {
       "elementType": "select",
-      "elementOptions": [{ "multiple": "true" }],
+      "elementOptions": [{ "multiple": "true" }, { "ordeable": "true" }],
       "transformers": []
     }
   ]
 },
```

Based on learnings: In NetAlertX plugin config.json files, keep the property name spelled as `ordeable` (not `orderable`) in `elementOptions` for frontend compatibility.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/website_monitor/config.json` around lines 535 - 593, The multi-select configurations for the WATCH and REPORT_ON fields omit the legacy "ordeable" flag causing frontend compatibility issues; update the elementOptions arrays inside the WATCH and REPORT_ON definitions to include the key "ordeable" (set to "true") alongside the existing "multiple": "true" so the UI recognizes these as orderable multi-selects (modify the elementOptions objects referenced in the WATCH and REPORT_ON entries).
front/plugins/db_cleanup/script.py (1)
133-137: ⚠️ Potential issue | 🔴 Critical
Remove `PARTITION BY "index"` from the window function: it prevents trimming because partitioning by the PRIMARY KEY makes `ROW_NUMBER()` always 1.
When `"index"` is `PRIMARY KEY AUTOINCREMENT` (unique), `PARTITION BY "index"` creates a separate partition for each row, so `ROW_NUMBER()` resets to 1 in each partition. This means `row_num <= {histCount}` keeps all rows instead of trimming to the most recent `histCount` records.
Fix (applies to lines 133 and 150)

```diff
- ROW_NUMBER() OVER(PARTITION BY "index" ORDER BY dateTimeCreated DESC) AS row_num
+ ROW_NUMBER() OVER(ORDER BY dateTimeCreated DESC) AS row_num
```

Affects both Notifications (lines 133–137) and AppEvents (lines 150–154).
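The failure mode is easy to reproduce with an in-memory SQLite table. A minimal sketch (table and column names are illustrative, not the plugin's actual schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    'CREATE TABLE Notifications ("index" INTEGER PRIMARY KEY AUTOINCREMENT,'
    ' dateTimeCreated TEXT)'
)
con.executemany(
    "INSERT INTO Notifications (dateTimeCreated) VALUES (?)",
    [(f"2024-01-0{i}",) for i in range(1, 6)],
)

# Partitioning by the unique primary key: every row is rank 1 in its own
# partition, so "row_num <= N" never deletes anything.
partitioned = con.execute('''
    SELECT ROW_NUMBER() OVER(PARTITION BY "index" ORDER BY dateTimeCreated DESC)
    FROM Notifications''').fetchall()
assert all(r[0] == 1 for r in partitioned)

# Without PARTITION BY, rows are ranked across the whole table, so
# trimming to the most recent N rows works as intended.
ranked = con.execute('''
    SELECT ROW_NUMBER() OVER(ORDER BY dateTimeCreated DESC)
    FROM Notifications''').fetchall()
assert sorted(r[0] for r in ranked) == [1, 2, 3, 4, 5]
```

(Window functions require SQLite 3.25+, which ships with any recent Python.)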
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/db_cleanup/script.py` around lines 133 - 137, The window function currently uses PARTITION BY "index" which makes ROW_NUMBER() always 1 for unique PRIMARY KEY values and prevents trimming; remove PARTITION BY "index" from the ROW_NUMBER() call in both queries that build ranked_objects for Notifications and AppEvents so the ORDER BY dateTimeCreated DESC applies across the whole table and row_num <= {histCount} correctly keeps only the most recent histCount rows (update the two occurrences that produce ROW_NUMBER() OVER(PARTITION BY "index" ORDER BY dateTimeCreated DESC) AS row_num).
server/models/plugin_object_instance.py (1)
86-95: ⚠️ Potential issue | 🟡 Minor
The `field` parameter in `updateField()` uses unsafe string interpolation, but current callers pass trusted values.
While the code directly interpolates `field` into the SQL statement (line 93: `f"UPDATE Plugins_Objects SET {field}=? WHERE objectGuid=?"`), the field values originate from workflows.json, a server-side configuration file loaded by `WorkflowManager.load_workflows()`. This mitigates the immediate injection risk, but the code is still vulnerable if field values ever come from untrusted sources. Consider using a whitelist of allowed field names to validate the input before interpolation, or refactor to use parameterized queries for column names (though most SQL databases don't support parameters for identifiers).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@server/models/plugin_object_instance.py` around lines 86 - 95, The updateField method interpolates the column name directly into SQL (in updateField / _execute for table Plugins_Objects), which is unsafe; add a strict whitelist of allowed column names (e.g., a module-level or class-level set like ALLOWED_PLUGIN_OBJECT_FIELDS) and validate the incoming field against that set before building the query, raising ValueError if not allowed; keep the existing exists(ObjectGUID) check and then call _execute with the parameterized values only (value, ObjectGUID) after validation; optionally document that valid field names are sourced from workflows.json / WorkflowManager.load_workflows() to keep callers aligned.
server/workflows/triggers.py (1)
42-51: ⚠️ Potential issue | 🔴 Critical
Parameterize the `objectGuid` query and handle missing rows.
Line 45 interpolates `event["objectGuid"]` directly into SQL, which is injection-prone. Also, line 51 assumes at least one row and can crash on missing objects.
💡 Suggested fix

```diff
-        query = f"""
-                SELECT * FROM
-                {db_table}
-                WHERE {refField} = '{event["objectGuid"]}'
-            """
+        query = f"""
+                SELECT * FROM
+                {db_table}
+                WHERE {refField} = ?
+            """
         mylog("debug", [query])
-        result = db.sql.execute(query).fetchall()
-        self.object = result[0]
+        self.object = db.sql.execute(query, (event["objectGuid"],)).fetchone()
+        if self.object is None:
+            mylog("debug", f"[WF] No object found for {db_table}.{refField}={event['objectGuid']}")
```
Verify each finding against the current code and only fix it if needed. In `@server/workflows/triggers.py` around lines 42 - 51, The SQL built in triggers.py currently interpolates event["objectGuid"] into query and assumes a row exists; change the db call to use a parameterized query (pass the objectGuid as a bound parameter to db.sql.execute instead of string formatting the query built with db_table and refField) and after fetching rows from db.sql.execute(...).fetchall() check for empty result before accessing result[0] — handle the missing row by setting self.object = None or raising a clear exception/logging an error. Ensure you keep db_table and refField usage intact but do not concatenate user data into the SQL string.
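The same pattern in isolation, using Python's sqlite3. This is a sketch: the table, fields, and helper name stand in for the workflow object lookup and are not the project's real API:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Devices (devGUID TEXT, devName TEXT)")
con.execute("INSERT INTO Devices VALUES ('abc-123', 'printer')")

def get_object(con, db_table, ref_field, object_guid):
    # Table and column names come from trusted config, so they may be
    # interpolated; the GUID is event data, so it is bound as a parameter.
    query = f"SELECT * FROM {db_table} WHERE {ref_field} = ?"
    row = con.execute(query, (object_guid,)).fetchone()
    if row is None:
        # A missing object is reported instead of raising IndexError.
        print(f"No object found for {db_table}.{ref_field}={object_guid}")
    return row

assert get_object(con, "Devices", "devGUID", "abc-123") == ("abc-123", "printer")
assert get_object(con, "Devices", "devGUID", "missing") is None
```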
🧹 Nitpick comments (9)
front/plugins/ddns_update/config.json (1)
604-629: Consider renaming the "Dummy" column for consistency.
The `Dummy` column at line 605 retains PascalCase while all other columns were migrated to camelCase. This may be intentional given its special `mapped_to_column_data` usage, but could be worth aligning for consistency.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/ddns_update/config.json` around lines 604 - 629, Rename the "Dummy" column key to use camelCase (e.g., "dummy") to match the rest of the schema: update the "column": "Dummy" entry in the config object that has mapped_to_column_data, css_classes, show, type, default_value, options, localized and name (the "Scan method" / "Método de escaneo" / "Scanmethode" block); also search for and update any references to "Dummy" elsewhere (code or other config keys) so mapped_to_column_data and any consumers of that column still resolve correctly.
front/plugins/db_cleanup/script.py (1)
94-94: Parameterize retention-window SQL instead of f-strings.
These queries are likely safe today, but binding parameters is cleaner and avoids string-built SQL risks/noise.
🔧 Proposed hardening diff
```diff
-    sql = f"""DELETE FROM Events WHERE eveDateTime <= date('now', '-{str(DAYS_TO_KEEP_EVENTS)} day')"""
+    sql = "DELETE FROM Events WHERE eveDateTime <= date('now', ?)"
     mylog("verbose", [f"[{pluginName}] SQL : {sql}"])
-    cursor.execute(sql)
+    cursor.execute(sql, (f"-{DAYS_TO_KEEP_EVENTS} day",))
@@
-    sql = f"""DELETE FROM Sessions WHERE sesDateTimeConnection <= date('now', '-{str(DAYS_TO_KEEP_EVENTS)} day')"""
+    sql = "DELETE FROM Sessions WHERE sesDateTimeConnection <= date('now', ?)"
     mylog("verbose", [f"[{pluginName}] SQL : {sql}"])
-    cursor.execute(sql)
+    cursor.execute(sql, (f"-{DAYS_TO_KEEP_EVENTS} day",))
```

Also applies to: 103-103
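The bound-modifier form works because SQLite's date() accepts its modifier as an ordinary string argument, so it can be passed as a parameter. A standalone sketch (table and values are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Events (eveDateTime TEXT)")
con.executemany(
    "INSERT INTO Events VALUES (?)",
    [("2000-01-01 10:00:00",), ("2999-01-01 10:00:00",)],
)

days_to_keep = 30
# The '-30 day' modifier is passed as a bound parameter, so no value is
# ever spliced into the SQL text.
con.execute(
    "DELETE FROM Events WHERE eveDateTime <= date('now', ?)",
    (f"-{days_to_keep} day",),
)
remaining = con.execute("SELECT eveDateTime FROM Events").fetchall()
print(remaining)  # only the far-future row survives
```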
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/db_cleanup/script.py` at line 94, Replace the f-string SQL that injects DAYS_TO_KEEP_EVENTS with a parameterized query: compute the cutoff datetime in Python (e.g., cutoff = (datetime.utcnow() - timedelta(days=int(DAYS_TO_KEEP_EVENTS))).isoformat() or the DB's expected format) and then execute "DELETE FROM Events WHERE eveDateTime <= ?" (or a named placeholder) passing cutoff as the bound parameter; update both the sql assignment (variable sql) and the corresponding execute call so DAYS_TO_KEEP_EVENTS is never string-injected into the SQL and the table/column (Events, eveDateTime) remain referenced as before.
front/php/templates/language/lang.php (1)
27-27: Add transitional fallback for plugin language DB column names.
Line 27 currently assumes camelCase-only columns. If an upgraded instance still has legacy column names, this can raise undefined-index notices and skip SQL overrides.
♻️ Suggested defensive mapping
```diff
 while ($row = $result->fetchArray(SQLITE3_ASSOC)) {
-    $strings[$row['stringKey']] = $row['stringValue'];
+    $key = $row['stringKey'] ?? $row['String_Key'] ?? null;
+    if ($key === null) {
+        continue;
+    }
+    $strings[$key] = $row['stringValue'] ?? $row['String_Value'] ?? '';
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/php/templates/language/lang.php` at line 27, Replace the direct $row['stringKey']/['stringValue'] access with a defensive mapping that checks for camelCase first and falls back to legacy snake_case column names to avoid undefined-index notices; e.g., derive $key = $row['stringKey'] ?? $row['string_key'] and $value = $row['stringValue'] ?? $row['string_value'], skip or continue when $key is null, then set $strings[$key] = $value (use isset/?? to avoid notices).
scripts/db_cleanup/db_cleanup.py (1)
36-43: camelCase column name updates look correct.
The MAC-based queries correctly use the new column names (`eveMac`, `devMac`, `objectPrimaryId`, `objectSecondaryId`).
However, the static analyzer flagged SQL injection concerns. While this is an admin CLI tool, user input is interpolated directly into SQL strings. Consider using parameterized queries if this script might be adapted for broader use.
♻️ Example of parameterized approach (if future-proofing is desired)
Instead of direct interpolation, you could refactor to pass parameters separately if migrating to a Python SQLite connection:
```python
# Example pattern (would require refactoring run_sqlite_command)
query = "DELETE FROM Events WHERE eveMac = ?"
params = (mac,)
```

For the current subprocess/docker approach, consider at minimum validating MAC format with a regex before use.
Also applies to: 56-63
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/db_cleanup/db_cleanup.py` around lines 36 - 43, The SQL strings in tables_checks (and the similar block around lines 56-63) directly interpolate the mac variable, exposing SQL injection risk; either refactor the code path that runs these queries (e.g., run_sqlite_command or whatever executor handles these strings) to accept parameterized queries and pass mac as a parameter, or—if changing the executor isn't possible now—validate/sanitize mac first using a strict MAC regex before building the query strings (apply the same change to the block referenced at 56-63); update references to tables_checks and the corresponding executor so all queries use the chosen safe approach.
server/models/plugin_object_instance.py (1)
64-68: Code pattern risk: `matchedColumn` should be parameterized rather than interpolated.
The SQL interpolation of `matchedColumn` is a code smell, but currently safe: the only caller (`getOpenPorts` in device_instance.py) hardcodes `matchedColumn='objectPrimaryId'`, and user input (`matchedKey`) is properly parameterized. To prevent future misuse, refactor the method to use a whitelist of allowed columns or pass column names as safe identifiers rather than through string interpolation.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@server/models/plugin_object_instance.py` around lines 64 - 68, getByField currently interpolates the matchedColumn into the SQL string which is risky; change it to validate matchedColumn against a whitelist of allowed column names (e.g., allowlist like ["objectPrimaryId","otherAllowedColumn"]) or accept only a predefined enum/constant, then build the query using the validated identifier and continue to parameterize matchedKey; update getByField (and callers such as getOpenPorts in device_instance.py) to pass only allowed column names or constants and raise an error if an invalid column is supplied so no untrusted string can be injected into the SQL.
server/scan/session_events.py (1)
174-176: Convert all five event inserts to parameterized SQL to eliminate f-string interpolation.
`startTime` is internal (from `timeNowUTC()`), but parameterized queries follow security best practices and remove injection-shaped patterns that trigger tools like S608. This applies to lines 174–176, 188–190, 207–209, 224–226, and 238–240.
♻️ Suggested pattern

```diff
-    sql.execute(f"""INSERT OR IGNORE INTO Events (eveMac, eveIp, eveDateTime,
-                                                  eveEventType, eveAdditionalInfo,
-                                                  evePendingAlertEmail)
-                    SELECT devMac, devLastIP, '{startTime}', 'Device Down', '', 1
+    sql.execute("""INSERT OR IGNORE INTO Events (eveMac, eveIp, eveDateTime,
+                                                 eveEventType, eveAdditionalInfo,
+                                                 evePendingAlertEmail)
+                   SELECT devMac, devLastIP, ?, 'Device Down', '', 1
                     FROM DevicesView
                     WHERE devAlertDown != 0
                       AND devCanSleep = 0
                       AND devPresentLastScan = 1
                       AND NOT EXISTS (SELECT 1 FROM CurrentScan
                                       WHERE devMac = scanMac
-                    ) """)
+                    ) """, (startTime,))
```
Verify each finding against the current code and only fix it if needed. In `@server/scan/session_events.py` around lines 174 - 176, Replace the f-string SQL INSERTs that call sql.execute(...) into the Events table with parameterized queries: build the SQL string with placeholders (e.g., INSERT OR IGNORE INTO Events (eveMac, eveIp, eveDateTime, eveEventType, eveAdditionalInfo, evePendingAlertEmail) VALUES (?, ?, ?, ?, ?, ?)) and pass a tuple of the corresponding variables (eveMac, eveIp, startTime, eveEventType, eveAdditionalInfo, evePendingAlertEmail) as the second argument to sql.execute; do this for all five insert locations that currently use f-strings so startTime (from timeNowUTC()) and the other event fields are passed as parameters rather than interpolated.
docs/PLUGINS_DEV_UI_COMPONENTS.md (1)
19-42: Consider a full consistency pass for remaining legacy field names.
This file now mixes updated camelCase examples with older identifiers in other sections, which may blur the current data contract for plugin authors.
Also applies to: 67-67, 102-102, 120-120, 138-138, 156-156, 210-210, 255-255, 289-289, 313-313, 325-325, 350-350, 379-379, 408-408, 442-442, 450-450, 504-504, 548-548, 558-558, 577-577, 592-592
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/PLUGINS_DEV_UI_COMPONENTS.md` around lines 19 - 42, Run a full consistency pass in PLUGINS_DEV_UI_COMPONENTS.md to replace legacy snake_case field names with the canonical camelCase names used elsewhere: update keys like mapped_to_column → mappedToColumn, mapped_to_column_data → mappedToColumnData, css_classes → cssClasses, default_value → defaultValue, options_params → optionsParams (keep column, show, options, localized, name as-is), and normalize example values such as device_mac → deviceMac and ensure objectPrimaryId and devMac are used consistently across all examples and the Properties table so examples, code snippets, and descriptions all reference the same camelCase data contract.
server/messaging/reporting.py (1)
197-209: Intentional broad exception handling for resilience.
The static analysis flags the broad `Exception` catch on line 203, but this appears intentional: the fallback ensures notification processing continues even if condition building fails for a specific section. The fallback on lines 205-208 provides safe defaults.
Consider logging the exception type for better debugging:
🔧 Optional: Log exception type for debugging

```diff
 except Exception as e:
-    mylog("verbose", [f"[Notification] Error building condition for {section}: ", e])
+    mylog("verbose", [f"[Notification] Error building condition for {section} ({type(e).__name__}): ", e])
     fallback_vars = {"condition": ""}
```
Verify each finding against the current code and only fix it if needed. In `@server/messaging/reporting.py` around lines 197 - 209, The catch-all Exception handler around building sqlQuery in the try/except (where format_vars, template, and sqlQuery are used) currently swallows errors; update the except block to include the exception details in the log by passing e (and optionally type(e)) to mylog so you record the error type/message when the fallback_vars path is taken (keep the existing fallback behavior for section == "down_devices" and assignment of sqlQuery and parameters). Ensure logging uses the same mylog call site to avoid changing control flow and retains the resilient fallback logic.
server/plugin.py (1)
750-753: Pre-existing SQL injection risk via string concatenation.
The static analysis tool correctly flags this as a potential SQL injection vector. While `pluginPref` comes from the plugin configuration dictionary (not direct user input), using parameterized queries is the safer pattern.
♻️ Recommended fix using parameterized query

```diff
 # Create plugin objects from existing database entries
 plugObjectsArr = db.get_sql_array(
-    "SELECT * FROM Plugins_Objects where plugin = '" + str(pluginPref) + "'"
+    "SELECT * FROM Plugins_Objects WHERE plugin = ?",
+    (pluginPref,)
 )
```

Note: This requires `get_sql_array` to support parameterized queries. If it doesn't currently, consider adding that capability or using a different query method.
Verify each finding against the current code and only fix it if needed. In `@server/plugin.py` around lines 750 - 753, The query building in plugObjectsArr uses string concatenation with pluginPref which creates an SQL injection risk; update the call to db.get_sql_array to use a parameterized query (e.g., "SELECT * FROM Plugins_Objects WHERE plugin = %s" with pluginPref passed as a parameter) or, if get_sql_array does not support parameters, extend get_sql_array to accept query parameters or switch to a provided parameterized query method (ensuring the symbol plugObjectsArr still receives the result and the table name Plugins_Objects and variable pluginPref are used correctly).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/NOTIFICATION_TEMPLATES.md`:
- Around line 17-27: The fenced code blocks in the notification template
examples are missing a language identifier (triggering markdownlint MD040);
update the opening fences for the examples shown (the "🆕 New devices" example
and the later syntax example that currently starts with ``` ) to use a language
token such as ```text so both blocks become ```text ... ```; this will satisfy
MD040 and preserve the block contents.
In `@front/deviceDetailsEvents.php`:
- Around line 41-47: The SQL built into rawSql (the SELECT using
eveDateTime/eveEventType/eveIp/eveAdditionalInfo) is vulnerable because it
interpolates mac, start, end and hideConnectionsStr directly; change this to a
parameterized/prepared statement (e.g., PDO or mysqli prepared statements) and
bind mac, start, end, and hideConnectionsStr as parameters instead of
concatenating into the query, or strictly validate/escape those inputs before
use; locate the code that constructs rawSql and replace it with a prepared query
using bound parameters for eveMac, eveDateTime BETWEEN start/end, and the
hideConnections flag (reference variables: rawSql, mac, start, end,
hideConnectionsStr, and the Events table/eve* columns).
In `@front/js/cache.js`:
- Line 307: The init flag usage for string caching is inconsistent: the success
path sets cacheStrings_v2 but the fast-path guard and the failure path still
reference cacheStrings, causing repeated reloads and bad tracking; update all
checks and failure handling to consistently use cacheStrings_v2 (replace
references to cacheStrings in the fast-path guard and failure branches), ensure
setCache(CACHE_KEYS.langString(...), langString.stringValue) remains unchanged,
and verify any logic that toggles or reads the init flag (cacheStrings or
cacheStrings_v2) uses cacheStrings_v2 everywhere (including the code around the
block that processes langString and the error/failure handling).
In `@front/plugins/internet_ip/config.json`:
- Line 361: Update the German description string to correctly reflect that
watchedValue4 maps to scanType instead of being unused: locate the JSON entry
containing "watchedValue1"..."watchedValue4" and change the sentence for
<code>watchedValue4</code> from "ist nicht in Verwendung" to a phrase indicating
it represents the <code>scanType</code> (e.g., "repräsentiert
<code>scanType</code>"). Keep the rest of the list and formatting intact and
ensure encoding/escaping remains valid.
In `@front/plugins/internet_speedtest/config.json`:
- Line 647: The Spanish translation for the REPORT_ON message is malformed
around the inline code token `watchedValueN`; update the JSON value (the
"string" for the REPORT_ON entry) to a clear phrasing that correctly wraps the
code token and reads naturally, e.g. replace the broken fragment "seleccionó
<code>watchedValueN Las columnas </code> cambiaron." with a corrected phrase
such as "cuando se seleccionó <code>watchedValueN</code> y las columnas
cambiaron" or "cuando las columnas <code>watchedValueN</code> cambian" so the
`<code>watchedValueN</code>` token is isolated and the sentence is grammatical.
In `@front/plugins/nmap_scan/config.json`:
- Line 622: The Spanish text for the "string" value contains malformed
wording/markup: the fragment "<code>watchedValueN Las columnas </code>" should
be rewritten so the code tag only wraps the identifier and the sentence reads
naturally; update the "string" value to use "<code>watchedValueN</code>" and
reorder to something like "las columnas de <code>watchedValueN</code> cambiaron"
(or "las columnas <code>watchedValueN</code> cambiaron") so the markup and
Spanish grammar are correct.
- Around line 582-587: The WATCH description currently reverses watchedValue1
and watchedValue2 relative to the column definitions: update the English and
Spanish "string" entries so they state that watchedValue1 is State (e.g., open
or closed) and watchedValue2 is Service (e.g., http, ssh) to match the column
definitions referenced by watchedValue1 and watchedValue2; ensure both
language_code "string" values are changed consistently.
In `@front/plugins/snmp_discovery/config.json`:
- Line 691: The Spanish WATCH description string contains a malformed closing
code tag "< /code>" for watchedValue3 which breaks HTML rendering; update the
JSON value (the "string" for the WATCH description) to replace "< /code>" with a
proper "</code>" so the list item reads "<code>watchedValue3</code> no se
utiliza" and verify similar tags (e.g., other <code> elements) have no extra
spaces.
In `@front/plugins/website_monitor/config.json`:
- Around line 608-612: The Spanish localization string for language_code "es_es"
has malformed markup and word order around the watchedValueN snippet (currently
"<code>watchedValueN Las columnas </code>"); update the "string" value so the
<code> tag wraps only "watchedValueN" and the Spanish grammar is correct—for
example: "significa que las columnas seleccionadas <code>watchedValueN</code>
cambiaron." Locate the JSON entry with language_code "es_es" and replace the
faulty fragment accordingly, preserving surrounding text and markup.
In `@front/report.php`:
- Around line 46-48: The default format string passed to updateData is still
uppercase ('HTML') which no longer matches the lowercase switch cases and causes
an empty initial render; update all hardcoded format arguments to use lowercase
(e.g., change updateData('HTML', -1) to updateData('html', -1)) and similarly
fix other occurrences in the file (notably the calls around the earlier block
referenced at 103-112) so the format keys match the switch/case handling in
updateData.
In `@server/db/db_upgrade.py`:
- Line 820: The call to sql.execute in db_upgrade.py uses an unnecessary
f-string: replace sql.execute(f"SELECT name FROM sqlite_master WHERE
type='table' AND name?", (table,)) with a plain string literal in the
sql.execute call so the query is not an f-string; keep the parameter tuple
(table,) as-is to continue using parameterized SQL (locate the sql.execute call
shown in the diff).
- Around line 796-807: The current check in db_upgrade.py that returns early
when "eveMac" exists can prematurely stop remaining renames if a prior run
partially migrated tables; instead, change the logic in the Events probe so that
seeing "eveMac" only logs and does not return from the migration routine (i.e.,
remove the early "return True" after mylog), and adjust the subsequent condition
so you only skip the camelCase migration when neither "eveMac" nor "eve_MAC"
exist (keep the verbose log for missing/unrecognised schema). Update references
around sql, events_cols, and the "eveMac"/"eve_MAC" checks so the rest of the
migration process still runs when partial migrations are detected.
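The column probe behind this can be sketched with PRAGMA table_info, which is how SQLite exposes a table's current column names. This is a simplified model of the check, not the actual db_upgrade.py code:

```python
import sqlite3

def column_names(con, table):
    # PRAGMA table_info returns one row per column; the name is field 1.
    # The table name must be trusted: PRAGMAs cannot be parameterized.
    return {row[1] for row in con.execute(f"PRAGMA table_info({table})")}

con = sqlite3.connect(":memory:")
# Simulate a partially migrated database where Events is already renamed.
con.execute("CREATE TABLE Events (eveMac TEXT, eveDateTime TEXT)")

cols = column_names(con, "Events")
if "eveMac" in cols:
    # Log and continue: other tables may still need renaming.
    print("Events already camelCase; skip its rename but keep migrating")
elif "eve_MAC" in cols:
    print("Legacy schema detected; rename Events columns")
else:
    print("Unrecognised schema; skip camelCase migration")
```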
In `@server/initialise.py`:
- Around line 866-893: The ses_* legacy mapping in the replacements dictionary
only covers a few session fields (e.g. r"\bses_MAC\b": "sesMac") and misses
other legacy names like ses_DateTimeConnection and ses_EventTypeConnection,
causing migrations to skip those columns; update the same replacements dict (the
block containing entries such as r"\bses_MAC\b" and r"\bses_AdditionalInfo\b")
to include explicit mappings for all known legacy ses_ keys (e.g.
r"\bses_DateTimeConnection\b": "sesDateTimeConnection",
r"\bses_EventTypeConnection\b": "sesEventTypeConnection", plus any other ses_*
variants used historically), or replace the static entries with a single
regex-based rule that converts r"\bses_(\w+)\b" into the lowerCamel target;
modify the mapping in initialise.py where those r"\bses_..." entries live so
legacy session columns are rewritten during migration.
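The single regex-based rule suggested above can be sketched as follows, assuming every legacy name is `ses_` followed by an already-CamelCase remainder (as in ses_DateTimeConnection), with ses_MAC as the one irregular case:

```python
import re

def camelise_legacy_ses(sql: str) -> str:
    # ses_MAC is special (all caps); everything else keeps its remainder:
    # ses_DateTimeConnection -> sesDateTimeConnection.
    special = {"ses_MAC": "sesMac"}

    def repl(m: re.Match) -> str:
        return special.get(m.group(0), "ses" + m.group(1))

    return re.sub(r"\bses_(\w+)\b", repl, sql)

assert camelise_legacy_ses(
    "SELECT ses_MAC, ses_DateTimeConnection FROM Sessions"
) == "SELECT sesMac, sesDateTimeConnection FROM Sessions"
assert camelise_legacy_ses("ses_EventTypeConnection") == "sesEventTypeConnection"
assert camelise_legacy_ses("ses_AdditionalInfo") == "sesAdditionalInfo"
```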
In `@server/models/event_instance.py`:
- Around line 166-167: The deletion cutoff uses date('now', ?) which strips time
and causes missed rows; update the SQL in event_instance.py that builds the sql
variable (the DELETE FROM Events WHERE eveDateTime ...) to use datetime('now',
?) instead of date('now', ?) so eveDateTime timestamps are compared against a
full timestamp; keep the same parameter passed to cur.execute (the f"-{days}
days" argument) and run the updated cur.execute(sql, [f"-{days} days"]) call to
apply the correct time-aware cutoff.
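The difference between the two modifiers is visible directly in SQLite: date() truncates to the day while datetime() keeps the time of day, so a date()-based cutoff compares full timestamps against a bare 'YYYY-MM-DD' string and can skip rows from the cutoff day itself. A minimal sketch:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Both expressions share the same 'now' within a single statement.
d, dt = con.execute(
    "SELECT date('now', '-30 days'), datetime('now', '-30 days')"
).fetchone()
assert dt.startswith(d)  # same day, but datetime() carries HH:MM:SS

# An eveDateTime stamped just after midnight on the cutoff day sorts
# after the bare date string, so "eveDateTime <= date('now', ?)" never
# deletes it; a datetime() cutoff compares at full precision instead.
sample = d + " 00:00:01"
assert sample > d
```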
In `@server/scan/name_resolution.py`:
- Around line 27-28: The SQL queries in name_resolution.py interpolate variables
directly (plugin, pMAC, pIP) into the string for the Plugins_Objects SELECT
statements, which is injection-prone; update the code to use parameterized
queries (use your DB API's placeholders, e.g., ? or %s) and pass (plugin, pMAC)
and (plugin, pIP) as parameters to cursor.execute instead of f-strings, and
apply the same change to the other occurrence around lines 40-41 so both SELECTs
use safe parameter binding.
In `@test/backend/test_notification_templates.py`:
- Around line 241-269: The test uses legacy underscore field names; update
test_down_reconnected_template to use camelCase to match production/output from
notification_sections.py and construct_notifications: change keys in the
reconnected dict (eve_MAC -> eveMac, eve_DateTime -> eveDateTime, eve_IP ->
eveIp, eve_EventType -> eveEventType) and update the columns list and the
NTfPRCS_TEXT_TEMPLATE_down_reconnected placeholder(s) to use {eveMac} and
{eveDateTime} so json_data and construct_notifications operate on the same
camelCase fields as the SQL/template output.
- Around line 291-292: Update the template placeholder in the test data for
"NTFPRCS_TEXT_TEMPLATE_new_devices" to use the camelCase form; replace the
underscore-style {eve_MAC} with {eveMAC} so the template string reads "{devName}
({eveMAC})" and remains consistent with the camelCase migration used elsewhere
in the codebase.
In `@test/db/test_db_cleanup.py`:
- Around line 77-79: The DELETE query currently injects the variable days
directly into the SQL f-string; instead compute the cutoff timestamp in Python
(e.g., cutoff = now - timedelta(days=days)) and pass it as a SQL parameter to
the DELETE statement (replace the date('now', '-{days} day') fragment with a
parameter placeholder and bind the computed cutoff). Update the test code where
the SQL string is built (the block containing the "DELETE FROM Sessions WHERE
sesDateTimeConnection <= ..." string and the execute call) to use a
parameterized query and bind the cutoff value rather than interpolating days.
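The compute-then-bind approach described above can be sketched like this (schema reduced to the one column the test exercises):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE Sessions (sesDateTimeConnection TEXT)")

fmt = "%Y-%m-%d %H:%M:%S"
now = datetime.now(timezone.utc)
old = (now - timedelta(days=40)).strftime(fmt)
recent = (now - timedelta(days=5)).strftime(fmt)
cur.executemany("INSERT INTO Sessions VALUES (?)", [(old,), (recent,)])

days = 30
# Compute the cutoff in Python and bind it; `days` never touches the SQL string
cutoff = (now - timedelta(days=days)).strftime(fmt)
cur.execute("DELETE FROM Sessions WHERE sesDateTimeConnection <= ?", (cutoff,))
deleted = cur.rowcount
print(deleted)  # 1
```

Because the cutoff is a plain bound value, linters such as Ruff S608 no longer flag the statement, and the query plan is identical to the interpolated version.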
---
Outside diff comments:
In `@front/plugins/db_cleanup/script.py`:
- Around line 133-137: The window function currently uses PARTITION BY "index"
which makes ROW_NUMBER() always 1 for unique PRIMARY KEY values and prevents
trimming; remove PARTITION BY "index" from the ROW_NUMBER() call in both queries
that build ranked_objects for Notifications and AppEvents so the ORDER BY
dateTimeCreated DESC applies across the whole table and row_num <= {histCount}
correctly keeps only the most recent histCount rows (update the two occurrences
that produce ROW_NUMBER() OVER(PARTITION BY "index" ORDER BY dateTimeCreated
DESC) AS row_num).
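The corrected window query can be sketched in-memory as below; the `"index"`/`dateTimeCreated` names and the `ranked_objects` subquery alias come from the comment, while the five-row fixture is illustrative (requires SQLite >= 3.25 for window functions, which modern Python bundles):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute('CREATE TABLE Notifications ("index" INTEGER PRIMARY KEY, dateTimeCreated TEXT)')
cur.executemany(
    'INSERT INTO Notifications ("index", dateTimeCreated) VALUES (?, ?)',
    [(i, f"2025-01-{i:02d} 00:00:00") for i in range(1, 6)],
)

hist_count = 2
# No PARTITION BY: rank all rows in one window so only the newest N survive.
# (PARTITION BY "index" would give every row rank 1 and delete nothing.)
cur.execute(
    """
    DELETE FROM Notifications WHERE "index" NOT IN (
        SELECT "index" FROM (
            SELECT "index",
                   ROW_NUMBER() OVER (ORDER BY dateTimeCreated DESC) AS row_num
            FROM Notifications
        ) ranked_objects
        WHERE row_num <= ?
    )
    """,
    (hist_count,),
)
remaining = [r[0] for r in cur.execute('SELECT "index" FROM Notifications ORDER BY "index"')]
print(remaining)  # [4, 5]
```

The same shape applies to the AppEvents query; only the table name changes.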
In `@front/plugins/website_monitor/config.json`:
- Around line 535-593: The multi-select configurations for the WATCH and
REPORT_ON fields omit the legacy "ordeable" flag causing frontend compatibility
issues; update the elementOptions arrays inside the WATCH and REPORT_ON
definitions to include the key "ordeable" (set to "true") alongside the existing
"multiple": "true" so the UI recognizes these as orderable multi-selects (modify
the elementOptions objects referenced in the WATCH and REPORT_ON entries).
In `@server/models/plugin_object_instance.py`:
- Around line 86-95: The updateField method interpolates the column name
directly into SQL (in updateField / _execute for table Plugins_Objects), which
is unsafe; add a strict whitelist of allowed column names (e.g., a module-level
or class-level set like ALLOWED_PLUGIN_OBJECT_FIELDS) and validate the incoming
field against that set before building the query, raising ValueError if not
allowed; keep the existing exists(ObjectGUID) check and then call _execute with
the parameterized values only (value, ObjectGUID) after validation; optionally
document that valid field names are sourced from workflows.json /
WorkflowManager.load_workflows() to keep callers aligned.
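A minimal sketch of the whitelist pattern; the field set here is a placeholder (per the comment, the real list would be sourced from workflows.json / WorkflowManager.load_workflows()):

```python
import sqlite3

# Placeholder whitelist; real values would come from workflows.json
ALLOWED_PLUGIN_OBJECT_FIELDS = {"watchedValue1", "watchedValue2", "userData", "foreignKey"}

def update_field(cur, object_guid: str, field: str, value: str) -> None:
    """Update one Plugins_Objects column, refusing non-whitelisted identifiers."""
    if field not in ALLOWED_PLUGIN_OBJECT_FIELDS:
        raise ValueError(f"Refusing to update non-whitelisted column: {field!r}")
    # Identifier is now known-safe; the value is still parameterized
    cur.execute(
        f"UPDATE Plugins_Objects SET {field} = ? WHERE objectGuid = ?",
        (value, object_guid),
    )

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute(
    "CREATE TABLE Plugins_Objects (objectGuid TEXT, watchedValue1 TEXT, "
    "watchedValue2 TEXT, userData TEXT, foreignKey TEXT)"
)
cur.execute("INSERT INTO Plugins_Objects (objectGuid, watchedValue1) VALUES ('guid-1', 'old')")

update_field(cur, "guid-1", "watchedValue1", "new")
updated = cur.execute(
    "SELECT watchedValue1 FROM Plugins_Objects WHERE objectGuid = 'guid-1'"
).fetchone()[0]

try:
    # Injection attempt through the identifier position is rejected
    update_field(cur, "guid-1", "watchedValue1 = 'x', userData", "pwn")
    blocked = False
except ValueError:
    blocked = True
```

Because column names cannot be bound as parameters in SQL, a strict allowlist is the standard way to make the identifier position safe while keeping values parameterized.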
In `@server/workflows/triggers.py`:
- Around line 42-51: The SQL built in triggers.py currently interpolates
event["objectGuid"] into query and assumes a row exists; change the db call to
use a parameterized query (pass the objectGuid as a bound parameter to
db.sql.execute instead of string formatting the query built with db_table and
refField) and after fetching rows from db.sql.execute(...).fetchall() check for
empty result before accessing result[0] — handle the missing row by setting
self.object = None or raising a clear exception/logging an error. Ensure you
keep db_table and refField usage intact but do not concatenate user data into
the SQL string.
---
Nitpick comments:
In `@docs/PLUGINS_DEV_UI_COMPONENTS.md`:
- Around line 19-42: Run a full consistency pass in PLUGINS_DEV_UI_COMPONENTS.md
to replace legacy snake_case field names with the canonical camelCase names used
elsewhere: update keys like mapped_to_column → mappedToColumn,
mapped_to_column_data → mappedToColumnData, css_classes → cssClasses,
default_value → defaultValue, options_params → optionsParams (keep column, show,
options, localized, name as-is), and normalize example values such as device_mac
→ deviceMac and ensure objectPrimaryId and devMac are used consistently across
all examples and the Properties table so examples, code snippets, and
descriptions all reference the same camelCase data contract.
In `@front/php/templates/language/lang.php`:
- Line 27: Replace the direct $row['stringKey']/['stringValue'] access with a
defensive mapping that checks for camelCase first and falls back to legacy
snake_case column names to avoid undefined-index notices; e.g., derive $key =
$row['stringKey'] ?? $row['string_key'] and $value = $row['stringValue'] ??
$row['string_value'], skip or continue when $key is null, then set
$strings[$key] = $value (use isset/?? to avoid notices).
In `@front/plugins/db_cleanup/script.py`:
- Line 94: Replace the f-string SQL that injects DAYS_TO_KEEP_EVENTS with a
parameterized query: compute the cutoff datetime in Python (e.g., cutoff =
(datetime.utcnow() - timedelta(days=int(DAYS_TO_KEEP_EVENTS))).isoformat() or
the DB's expected format) and then execute "DELETE FROM Events WHERE eveDateTime
<= ?" (or a named placeholder) passing cutoff as the bound parameter; update
both the sql assignment (variable sql) and the corresponding execute call so
DAYS_TO_KEEP_EVENTS is never string-injected into the SQL and the table/column
(Events, eveDateTime) remain referenced as before.
In `@front/plugins/ddns_update/config.json`:
- Around line 604-629: Rename the "Dummy" column key to use camelCase (e.g.,
"dummy") to match the rest of the schema: update the "column": "Dummy" entry in
the config object that has mapped_to_column_data, css_classes, show, type,
default_value, options, localized and name (the "Scan method" / "Método de
escaneo" / "Scanmethode" block); also search for and update any references to
"Dummy" elsewhere (code or other config keys) so mapped_to_column_data and any
consumers of that column still resolve correctly.
In `@scripts/db_cleanup/db_cleanup.py`:
- Around line 36-43: The SQL strings in tables_checks (and the similar block
around lines 56-63) directly interpolate the mac variable, exposing SQL
injection risk; either refactor the code path that runs these queries (e.g.,
run_sqlite_command or whatever executor handles these strings) to accept
parameterized queries and pass mac as a parameter, or—if changing the executor
isn’t possible now—validate/sanitize mac first using a strict MAC regex before
building the query strings (apply the same change to the block referenced at
56-63); update references to tables_checks and the corresponding executor so all
queries use the chosen safe approach.
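If the executor cannot be parameterized right away, the fallback described above is a strict MAC check before any string is built; a sketch using the same colon-separated MAC shape referenced elsewhere in this review:

```python
import re

# Six colon-separated hex octets, nothing more
MAC_RE = re.compile(r"^([0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}$")

def is_valid_mac(mac: str) -> bool:
    """Return True only for a well-formed aa:bb:cc:dd:ee:ff style MAC."""
    return bool(MAC_RE.fullmatch(mac))

print(is_valid_mac("aa:bb:cc:dd:ee:ff"))            # True
print(is_valid_mac("aa:bb:cc:dd:ee:ff' OR 1=1--"))  # False
```

Validating before query construction turns the interpolated `mac` into a closed vocabulary, which removes the injection vector even though the SQL is still built as a string.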
In `@server/messaging/reporting.py`:
- Around line 197-209: The catch-all Exception handler around building sqlQuery
in the try/except (where format_vars, template, and sqlQuery are used) currently
swallows errors; update the except block to include the exception details in the
log by passing e (and optionally type(e)) to mylog so you record the error
type/message when the fallback_vars path is taken (keep the existing fallback
behavior for section == "down_devices" and assignment of sqlQuery and
parameters). Ensure logging uses the same mylog call site to avoid changing
control flow and retains the resilient fallback logic.
In `@server/models/plugin_object_instance.py`:
- Around line 64-68: getByField currently interpolates the matchedColumn into
the SQL string which is risky; change it to validate matchedColumn against a
whitelist of allowed column names (e.g., allowlist like
["objectPrimaryId","otherAllowedColumn"]) or accept only a predefined
enum/constant, then build the query using the validated identifier and continue
to parameterize matchedKey; update getByField (and callers such as getOpenPorts
in device_instance.py) to pass only allowed column names or constants and raise
an error if an invalid column is supplied so no untrusted string can be injected
into the SQL.
In `@server/plugin.py`:
- Around line 750-753: The query building in plugObjectsArr uses string
concatenation with pluginPref which creates an SQL injection risk; update the
call to db.get_sql_array to use a parameterized query (e.g., "SELECT * FROM
Plugins_Objects WHERE plugin = %s" with pluginPref passed as a parameter) or, if
get_sql_array does not support parameters, extend get_sql_array to accept query
parameters or switch to a provided parameterized query method (ensuring the
symbol plugObjectsArr still receives the result and the table name
Plugins_Objects and variable pluginPref are used correctly).
In `@server/scan/session_events.py`:
- Around line 174-176: Replace the f-string SQL INSERTs that call
sql.execute(...) into the Events table with parameterized queries: build the SQL
string with placeholders (e.g., INSERT OR IGNORE INTO Events (eveMac, eveIp,
eveDateTime, eveEventType, eveAdditionalInfo, evePendingAlertEmail) VALUES (?,
?, ?, ?, ?, ?)) and pass a tuple of the corresponding variables (eveMac, eveIp,
startTime, eveEventType, eveAdditionalInfo, evePendingAlertEmail) as the second
argument to sql.execute; do this for all five insert locations that currently
use f-strings so startTime (from timeNowUTC()) and the other event fields are
passed as parameters rather than interpolated.
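The parameterized INSERT shape from the comment can be sketched as follows (column list taken from the comment; the row values are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute(
    "CREATE TABLE Events (eveMac TEXT, eveIp TEXT, eveDateTime TEXT, "
    "eveEventType TEXT, eveAdditionalInfo TEXT, evePendingAlertEmail INTEGER)"
)

event = ("aa:bb:cc:dd:ee:ff", "192.168.1.42", "2025-01-15 10:30:00",
         "Connected", "", 1)
# All six fields travel as bound parameters, never inside the SQL text
cur.execute(
    "INSERT OR IGNORE INTO Events "
    "(eveMac, eveIp, eveDateTime, eveEventType, eveAdditionalInfo, evePendingAlertEmail) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    event,
)
inserted = cur.execute("SELECT eveEventType FROM Events").fetchone()
print(inserted)  # ('Connected',)
```

Each of the five call sites mentioned above only needs its own tuple; the SQL string itself becomes a shared constant.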
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 1482862e-4cd2-4edf-bc53-b5958752ec1d
📒 Files selected for processing (109)
- .github/skills/code-standards/SKILL.md
- docs/API_EVENTS.md
- docs/API_SESSIONS.md
- docs/DEBUG_PLUGINS.md
- docs/NOTIFICATIONS.md
- docs/NOTIFICATION_TEMPLATES.md
- docs/PLUGINS_DEV.md
- docs/PLUGINS_DEV_CONFIG.md
- docs/PLUGINS_DEV_DATASOURCES.md
- docs/PLUGINS_DEV_DATA_CONTRACT.md
- docs/PLUGINS_DEV_QUICK_START.md
- docs/PLUGINS_DEV_UI_COMPONENTS.md
- front/appEventsCore.php
- front/deviceDetailsEvents.php
- front/deviceDetailsSessions.php
- front/js/app-init.js
- front/js/cache.js
- front/multiEditCore.php
- front/php/components/graph_online_history.php
- front/php/templates/language/lang.php
- front/plugins/__template/config.json
- front/plugins/__template/rename_me.py
- front/plugins/_publisher_apprise/config.json
- front/plugins/_publisher_email/config.json
- front/plugins/_publisher_mqtt/config.json
- front/plugins/_publisher_mqtt/mqtt.py
- front/plugins/_publisher_ntfy/config.json
- front/plugins/_publisher_pushover/config.json
- front/plugins/_publisher_pushsafer/config.json
- front/plugins/_publisher_telegram/config.json
- front/plugins/_publisher_webhook/config.json
- front/plugins/adguard_import/config.json
- front/plugins/arp_scan/config.json
- front/plugins/asuswrt_import/config.json
- front/plugins/avahi_scan/config.json
- front/plugins/db_cleanup/script.py
- front/plugins/ddns_update/config.json
- front/plugins/dhcp_leases/ASUS_ROUTERS.md
- front/plugins/dhcp_leases/config.json
- front/plugins/dhcp_servers/config.json
- front/plugins/dig_scan/config.json
- front/plugins/freebox/config.json
- front/plugins/icmp_scan/config.json
- front/plugins/internet_ip/config.json
- front/plugins/internet_speedtest/README.md
- front/plugins/internet_speedtest/config.json
- front/plugins/ipneigh/config.json
- front/plugins/luci_import/config.json
- front/plugins/mikrotik_scan/config.json
- front/plugins/nbtscan_scan/config.json
- front/plugins/nmap_dev_scan/config.json
- front/plugins/nmap_scan/config.json
- front/plugins/notification_processing/README.md
- front/plugins/notification_processing/config.json
- front/plugins/nslookup_scan/config.json
- front/plugins/omada_sdn_imp/config.json
- front/plugins/omada_sdn_imp/omada_sdn.py
- front/plugins/omada_sdn_openapi/config.json
- front/plugins/pihole_api_scan/config.json
- front/plugins/pihole_scan/config.json
- front/plugins/snmp_discovery/config.json
- front/plugins/sync/config.json
- front/plugins/unifi_api_import/config.json
- front/plugins/unifi_import/config.json
- front/plugins/vendor_update/config.json
- front/plugins/wake_on_lan/config.json
- front/plugins/website_monitor/config.json
- front/pluginsCore.php
- front/report.php
- front/report_templates/webhook_json_sample.json
- scripts/db_cleanup/db_cleanup.py
- server/api_server/graphql_endpoint.py
- server/api_server/sessions_endpoint.py
- server/const.py
- server/database.py
- server/db/db_upgrade.py
- server/db/schema/app.sql
- server/db/sql_safe_builder.py
- server/initialise.py
- server/messaging/notification_sections.py
- server/messaging/reporting.py
- server/models/device_instance.py
- server/models/event_instance.py
- server/models/notification_instance.py
- server/models/plugin_object_instance.py
- server/plugin.py
- server/scan/device_handling.py
- server/scan/name_resolution.py
- server/scan/session_events.py
- server/utils/plugin_utils.py
- server/workflows/actions.py
- server/workflows/app_events.py
- server/workflows/manager.py
- server/workflows/triggers.py
- test/api_endpoints/test_events_endpoints.py
- test/api_endpoints/test_mcp_tools_endpoints.py
- test/api_endpoints/test_sessions_endpoints.py
- test/backend/sql_safe_builder.py
- test/backend/test_compound_conditions.py
- test/backend/test_notification_templates.py
- test/backend/test_safe_builder_unit.py
- test/backend/test_sql_injection_prevention.py
- test/backend/test_sql_security.py
- test/db/test_camelcase_migration.py
- test/db/test_db_cleanup.py
- test/db_test_helpers.py
- test/integration/integration_test.py
- test/scan/test_down_sleep_events.py
- test/scan/test_field_lock_scan_integration.py
✅ Files skipped from review due to trivial changes (8)
- front/plugins/notification_processing/README.md
- front/plugins/internet_speedtest/README.md
- docs/NOTIFICATIONS.md
- .github/skills/code-standards/SKILL.md
- front/plugins/omada_sdn_imp/omada_sdn.py
- docs/PLUGINS_DEV_QUICK_START.md
- front/plugins/__template/rename_me.py
- front/plugins/pihole_scan/config.json
🚧 Files skipped from review as they are similar to previous changes (1)
- test/integration/integration_test.py
````diff
 ```
 🆕 New devices
 ---------
-MAC: aa:bb:cc:dd:ee:ff
-Datetime: 2025-01-15 10:30:00
-IP: 192.168.1.42
-Event Type: New Device
-Device name: MyPhone
-Comments:
+devName: MyPhone
+eveMac: aa:bb:cc:dd:ee:ff
+devVendor: Apple
+eveIp: 192.168.1.42
+eveDateTime: 2025-01-15 10:30:00
+eveEventType: New Device
+devComments:
 ```
````
Add language identifiers to fenced code blocks.
Line 17 (and the touched syntax example block starting at Line 53) should specify a fenced code language to satisfy markdownlint MD040.
Suggested doc fix

````diff
-```
+```text
 🆕 New devices
 ---------
 devName: MyPhone
 eveMac: aa:bb:cc:dd:ee:ff
 devVendor: Apple
 eveIp: 192.168.1.42
 eveDateTime: 2025-01-15 10:30:00
 eveEventType: New Device
 devComments:
@@
-```
+```text
 {devName} ({eveMac}) connected at {eveDateTime}
````

Also applies to: 53-55
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)
[warning] 17-17: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/NOTIFICATION_TEMPLATES.md` around lines 17 - 27, The fenced code blocks
in the notification template examples are missing a language identifier
(triggering markdownlint MD040); update the opening fences for the examples
shown (the "🆕 New devices" example and the later syntax example that currently
starts with ``` ) to use a language token such as ```text so both blocks become
```text ... ```; this will satisfy MD040 and preserve the block contents.
```diff
 SELECT eveDateTime, eveEventType, eveIp, eveAdditionalInfo
 FROM Events
-WHERE eve_MAC = "${mac}"
-  AND eve_DateTime BETWEEN "${start}" AND "${end}"
+WHERE eveMac = "${mac}"
+  AND eveDateTime BETWEEN "${start}" AND "${end}"
   AND (
-    (eve_EventType NOT IN ("Connected", "Disconnected", "VOIDED - Connected", "VOIDED - Disconnected"))
+    (eveEventType NOT IN ("Connected", "Disconnected", "VOIDED - Connected", "VOIDED - Disconnected"))
     OR "${hideConnectionsStr}" = "false"
```
Raw SQL interpolation is still unsafe (SQL injection risk).
Building rawSql with direct string interpolation from runtime values (mac, start, end) is unsafe.
🛡️ Proposed hardening (immediate frontend mitigation)

```diff
+const macPattern = /^([0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}$/;
+if (!macPattern.test(mac)) {
+  console.error("Invalid MAC format");
+  hideSpinner();
+  return;
+}
+const sqlEscape = (v) => String(v).replace(/'/g, "''");
@@
-WHERE eveMac = "${mac}"
-  AND eveDateTime BETWEEN "${start}" AND "${end}"
+WHERE eveMac = '${sqlEscape(mac)}'
+  AND eveDateTime BETWEEN '${sqlEscape(start)}' AND '${sqlEscape(end)}'
@@
-OR "${hideConnectionsStr}" = "false"
+OR '${sqlEscape(hideConnectionsStr)}' = 'false'
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@front/deviceDetailsEvents.php` around lines 41 - 47, The SQL built into
rawSql (the SELECT using eveDateTime/eveEventType/eveIp/eveAdditionalInfo) is
vulnerable because it interpolates mac, start, end and hideConnectionsStr
directly; change this to a parameterized/prepared statement (e.g., PDO or mysqli
prepared statements) and bind mac, start, end, and hideConnectionsStr as
parameters instead of concatenating into the query, or strictly validate/escape
those inputs before use; locate the code that constructs rawSql and replace it
with a prepared query using bound parameters for eveMac, eveDateTime BETWEEN
start/end, and the hideConnections flag (reference variables: rawSql, mac,
start, end, hideConnectionsStr, and the Events table/eve* columns).
```diff
 f"DELETE FROM Sessions "
-f"WHERE ses_DateTimeConnection <= date('now', '-{days} day')"
+f"WHERE sesDateTimeConnection <= date('now', '-{days} day')"
 )
```
Parameterize the retention window in DELETE SQL.
On Line 77, the query interpolates days directly via f-string. Even in tests, this can trigger security/lint failures and is avoidable.
🔧 Proposed fix

```diff
 def _run_sessions_trim(cur, days: int) -> int:
     """Execute the exact DELETE used by db_cleanup and return rowcount."""
-    cur.execute(
-        f"DELETE FROM Sessions "
-        f"WHERE sesDateTimeConnection <= date('now', '-{days} day')"
-    )
+    days_int = int(days)
+    modifier = f"-{days_int} day"
+    cur.execute(
+        "DELETE FROM Sessions "
+        "WHERE sesDateTimeConnection <= date('now', ?)",
+        (modifier,),
+    )
     return cur.rowcount
```
🪛 Ruff (0.15.5)
[error] 77-78: Possible SQL injection vector through string-based query construction
(S608)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@test/db/test_db_cleanup.py` around lines 77 - 79, The DELETE query currently
injects the variable days directly into the SQL f-string; instead compute the
cutoff timestamp in Python (e.g., cutoff = now - timedelta(days=days)) and pass
it as a SQL parameter to the DELETE statement (replace the date('now', '-{days}
day') fragment with a parameter placeholder and bind the computed cutoff).
Update the test code where the SQL string is built (the block containing the
"DELETE FROM Sessions WHERE sesDateTimeConnection <= ..." string and the execute
call) to use a parameterized query and bind the cutoff value rather than
interpolating days.
Actionable comments posted: 5
♻️ Duplicate comments (4)
front/plugins/nmap_scan/config.json (2)
582-587: ⚠️ Potential issue | 🟡 Minor: WATCH description maps `watchedValue1`/`watchedValue2` in reverse order.

The column definitions show `watchedValue1` = "State" (Line 181) and `watchedValue2` = "Service" (Line 200), but the description text states the opposite. Both English (Line 582) and Spanish (Line 586) strings have this reversal.

Suggested text fix

```diff
-"string": "Send a notification if selected values change. Use <code>CTRL + Click</code> to select/deselect. <ul> <li><code>watchedValue1</code> is service type (e.g.: http, ssh)</li><li><code>watchedValue2</code> is Status (open or closed)</li><li><code>watchedValue3</code> unused </li><li><code>watchedValue4</code> unused </li></ul>"
+"string": "Send a notification if selected values change. Use <code>CTRL + Click</code> to select/deselect. <ul> <li><code>watchedValue1</code> is status (open or closed)</li><li><code>watchedValue2</code> is service type (e.g.: http, ssh)</li><li><code>watchedValue3</code> unused </li><li><code>watchedValue4</code> unused </li></ul>"
-"string": "Envíe una notificación si los valores seleccionados cambian. Utilice <code>CTRL + clic</code> para seleccionar/deseleccionar. <ul> <li><code>watchedValue1</code> es el tipo de servicio (p. ej., http, ssh)</li><li><code>watchedValue2</code> es el estado (abierto o cerrado)</li> <li><code>watchedValue3</code> no utilizado </li><li><code>watchedValue4</code> no utilizado </li></ul>"
+"string": "Envíe una notificación si los valores seleccionados cambian. Utilice <code>CTRL + clic</code> para seleccionar/deseleccionar. <ul> <li><code>watchedValue1</code> es el estado (abierto o cerrado)</li><li><code>watchedValue2</code> es el tipo de servicio (p. ej., http, ssh)</li> <li><code>watchedValue3</code> no utilizado </li><li><code>watchedValue4</code> no utilizado </li></ul>",
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/nmap_scan/config.json` around lines 582 - 587, The WATCH description currently swaps watchedValue1 and watchedValue2: update the English and Spanish "string" entries so watchedValue1 is described as the State (open/closed) and watchedValue2 as the Service (e.g., http, ssh) to match the column definitions; edit the two "string" values that mention <code>watchedValue1</code>/<code>watchedValue2</code> and swap their explanatory text (including both language_code "en_us" and "es_es" variants) so the mapping aligns with the column definitions.
622-622: ⚠️ Potential issue | 🟡 Minor: Spanish REPORT_ON text has malformed wording/markup.

The fragment `<code>watchedValueN Las columnas </code>` has incorrect markup (the code tag wraps Spanish text) and awkward grammar. The `<code>` tag should only wrap the identifier.

Suggested fix

```diff
-"string": "Envíe una notificación solo en estos estados. <code>new</code> significa que se descubrió un nuevo objeto único (combinación única de PrimaryId y SecondaryId). <code>watched-changed</code> significa que seleccionó <code>watchedValueN Las columnas </code> cambiaron."
+"string": "Envíe una notificación solo en estos estados. <code>new</code> significa que se descubrió un nuevo objeto único (combinación única de PrimaryId y SecondaryId). <code>watched-changed</code> significa que las columnas <code>watchedValueN</code> seleccionadas cambiaron.",
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/nmap_scan/config.json` at line 622, The Spanish REPORT_ON string value (the JSON "string" entry) contains malformed markup and awkward grammar: the <code> tag surrounds Spanish text instead of only the identifier watchedValueN and the sentence is unclear. Edit that value so the <code> tag wraps only the identifier watchedValueN (e.g., <code>watchedValueN</code>) and rephrase the clause to correct Spanish grammar (for example: "significa que las columnas <code>watchedValueN</code> cambiaron" or "cuando las columnas <code>watchedValueN</code> han cambiado"). Update the JSON "string" value accordingly in the same object so the code markup is correct and the sentence reads naturally.

front/plugins/snmp_discovery/config.json (1)
691-691: ⚠️ Potential issue | 🟡 Minor: Fix malformed HTML closing tag in Spanish WATCH description.

Line 691 still contains `<code>watchedValue3< /code>`, which breaks markup rendering in UI text.

✏️ Suggested fix

```diff
-"string": "Envíe una notificación ... <li><code>watchedValue3< /code> no se utiliza </li>..."
+"string": "Envíe una notificación ... <li><code>watchedValue3</code> no se utiliza </li>..."
```
Verify each finding against the current code and only fix it if needed. In `@front/plugins/snmp_discovery/config.json` at line 691, The Spanish WATCH description contains a malformed HTML closing tag "<code>watchedValue3< /code>" which breaks UI rendering; update the string to use a proper closing tag (change "< /code>" to "</code>") for the watchedValue3 fragment within the JSON "string" value so the markup is valid and renders correctly (look for the entry that mentions watchedValue1, watchedValue2, watchedValue3, watchedValue4).

server/db/db_upgrade.py (1)
796-806: ⚠️ Potential issue | 🟠 Major: Do not short-circuit migration based only on `Events.eveMac`.

Line 799 returns early if `eveMac` exists, which can permanently skip pending renames in other tables after partial migrations. This should continue scanning all mapped tables before deciding to skip.

🔧 Suggested fix

```diff
-    if "eveMac" in events_cols:
-        mylog("verbose", ["[db_upgrade] Schema already uses camelCase — skipping migration"])
-        return True
+    if "eveMac" in events_cols:
+        mylog("verbose", ["[db_upgrade] Events looks migrated; checking remaining tables"])
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@server/db/db_upgrade.py` around lines 796 - 806, The early return triggered when Events.eveMac exists should be removed and replaced with a full scan of all mapped tables so we don't short-circuit pending renames in other tables; specifically, change the logic around the PRAGMA for "Events" (the sql.execute('PRAGMA table_info("Events")') call that populates events_cols and checks for "eveMac" / "eve_MAC") to record that Events is already camelCase but continue processing other mapped tables, and only decide to skip the entire camelCase migration after verifying every mapped table either already uses camelCase or is missing/unrecognized; implement this by replacing the immediate return True when "eveMac" is found with a flag (e.g., events_is_camel=True) and removing the duplicate short-circuit so the later migration loop inspects all tables before returning.
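The flag-based full scan suggested for db_upgrade.py can be sketched like this; the `table_maps` shape (`{table: {legacy_col: camel_col}}`) is an assumption, not the repo's actual mapping structure:

```python
import sqlite3

def schema_needs_camelcase_migration(cur, table_maps) -> bool:
    """True if any mapped table still exposes a legacy snake_case column.

    Checks every table instead of short-circuiting on Events alone, so a
    partially migrated schema is still detected.
    """
    for table, col_map in table_maps.items():
        cur.execute(f'PRAGMA table_info("{table}")')
        cols = {row[1] for row in cur.fetchall()}  # row[1] is the column name
        if not cols:
            continue  # table missing/unrecognized; nothing to migrate here
        if any(legacy in cols for legacy in col_map):
            return True  # at least one table has pending renames
    return False

maps = {
    "Events": {"eve_MAC": "eveMac"},
    "Sessions": {"ses_DateTimeConnection": "sesDateTimeConnection"},
}

# Partial migration: Events already camelCase, Sessions still legacy
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE Events (eveMac TEXT)")
cur.execute("CREATE TABLE Sessions (ses_DateTimeConnection TEXT)")
partial_needs = schema_needs_camelcase_migration(cur, maps)  # True

# Fully migrated schema
con2 = sqlite3.connect(":memory:")
cur2 = con2.cursor()
cur2.execute("CREATE TABLE Events (eveMac TEXT)")
cur2.execute("CREATE TABLE Sessions (sesDateTimeConnection TEXT)")
done_needs = schema_needs_camelcase_migration(cur2, maps)  # False
```

An early `return True` on the first `eveMac` hit is exactly what this avoids: the Sessions table above would otherwise be skipped forever.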
🧹 Nitpick comments (1)
front/plugins/omada_sdn_imp/config.json (1)
443-443: Align REPORT_ON wording with new identifier naming (optional).

Line 443 still says `PrimaryId`/`SecondaryId`; consider matching the new `objectPrimaryId`/`objectSecondaryId` terms for consistency in UI/help text.

Suggested text tweak

```diff
-"string": "Send a notification only on these statuses. <code>new</code> means a new unique (unique combination of PrimaryId and SecondaryId) object was discovered. <code>watched-changed</code> means that selected <code>watchedValueN</code> columns changed."
+"string": "Send a notification only on these statuses. <code>new</code> means a new unique (unique combination of <code>objectPrimaryId</code> and <code>objectSecondaryId</code>) object was discovered. <code>watched-changed</code> means that selected <code>watchedValueN</code> columns changed."
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/omada_sdn_imp/config.json` at line 443, Update the help string for REPORT_ON to use the new identifier names: replace "PrimaryId" and "SecondaryId" with "objectPrimaryId" and "objectSecondaryId" so the UI/help text is consistent with the new naming; keep the rest of the sentence intact and retain references to "new" and "watched-changed" and "watchedValueN" to preserve meaning.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@front/plugins/dhcp_leases/config.json`:
- Line 794: Update the German localization strings that contain typos: in the
JSON value currently containing "Sende eine Benachrichtigung, wenn ein
ausgwählter Wert sich ändert..." replace the misspelled "ausgwählter" with
"ausgewählter" and check the other changed WATCH/REPORT strings (also noted
around the other entry at the same file, referenced by the other occurrence) for
the incorrect phrase "wurde erkennt" and correct it to the proper German (e.g.,
"wird erkannt" or the intended tense). Edit the affected JSON string values (the
localization "string" entries) to use the corrected German phrases so both
occurrences are fixed.
In `@front/plugins/vendor_update/config.json`:
- Line 381: The REPORT_ON description string currently wraps extra words inside
a <code> tag ("watchedValueN Las columnas"); update the config value so only the
identifier is code-formatted: change the snippet to use
<code>watchedValueN</code> las columnas (ensure correct spacing and casing),
inside the same "string" value for REPORT_ON so the UI displays only the
identifier in code formatting.
In `@front/plugins/website_monitor/config.json`:
- Around line 606-609: The English description string currently references
legacy column names PrimaryId and SecondaryId; update the text to use the
camelCase column names objectPrimaryId and objectSecondaryId (leave other
wording intact) so the "string" value under language_code "en_us" mentions
objectPrimaryId and objectSecondaryId instead of PrimaryId and SecondaryId.
In `@front/report.php`:
- Around line 103-114: In the switch that sets notificationData.innerHTML (cases
'html','json','text'), fix the typo in the 'text' case where the pre tag has
malformed readonly attribute `readonly"`; update the pre element string in the
case 'text' branch (the code assigning notificationData.innerHTML) to use a
correct readonly attribute (e.g., readonly="" or readonly) consistent with the
'json' case so the attribute is valid HTML.
In `@server/initialise.py`:
- Around line 856-897: The _column_replacements dict is missing several legacy
alias keys (e.g. variations like eve_PairEventRowID, ObjectPrimaryID,
ObjectSecondaryID and mixed-case ID/Id variants) so user-config references won't
be rewritten; update _column_replacements to include all legacy aliases used by
server/db/db_upgrade.py (add entries for eve_PairEventRowID,
eve_PairEventRowID/Rowid case variants,
ObjectPrimaryID/ObjectPrimaryId/Object_PrimaryID,
ObjectSecondaryID/ObjectSecondaryId/Object_SecondaryID, and any other migrated
names you see in db_upgrade.py) using the same regex-word-boundary pattern
(r"\b... \b") and map them to the existing canonical names (e.g. map all
eve_PairEventRowID/Rowid variants to "evePairEventRowid" and all
ObjectPrimary*/Object_Secondary* variants to
"objectPrimaryId"/"objectSecondaryId") so config rewrites match the DB upgrade
behavior.
---
Duplicate comments:
In `@front/plugins/nmap_scan/config.json`:
- Around line 582-587: The WATCH description currently swaps watchedValue1 and
watchedValue2: update the English and Spanish "string" entries so watchedValue1
is described as the State (open/closed) and watchedValue2 as the Service (e.g.,
http, ssh) to match the column definitions; edit the two "string" values that
mention <code>watchedValue1</code>/<code>watchedValue2</code> and swap their
explanatory text (including both language_code "en_us" and "es_es" variants) so
the mapping aligns with the column definitions.
- Line 622: The Spanish REPORT_ON string value (the JSON "string" entry)
contains malformed markup and awkward grammar: the <code> tag surrounds Spanish
text instead of only the identifier watchedValueN and the sentence is unclear.
Edit that value so the <code> tag wraps only the identifier watchedValueN (e.g.,
<code>watchedValueN</code>) and rephrase the clause to correct Spanish grammar
(for example: "significa que las columnas <code>watchedValueN</code> cambiaron"
or "cuando las columnas <code>watchedValueN</code> han cambiado"). Update the
JSON "string" value accordingly in the same object so the code markup is correct
and the sentence reads naturally.
In `@front/plugins/snmp_discovery/config.json`:
- Line 691: The Spanish WATCH description contains a malformed HTML closing tag
"<code>watchedValue3< /code>" which breaks UI rendering; update the string to
use a proper closing tag (change "< /code>" to "</code>") for the watchedValue3
fragment within the JSON "string" value so the markup is valid and renders
correctly (look for the entry that mentions watchedValue1, watchedValue2,
watchedValue3, watchedValue4).
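This class of broken markup can be caught across plugin configs with a whitespace-tolerant regex; the pattern and sample string below are a sketch, not code from the repository:

```python
import re

# Match a </code> close tag with stray whitespace, e.g. "< /code>" or "</ code>".
MALFORMED_CLOSE = re.compile(r"<\s+/\s*code\s*>|<\s*/\s+code\s*>")

def find_malformed(text: str) -> list[str]:
    """Return every malformed </code> occurrence in a localization string."""
    return MALFORMED_CLOSE.findall(text)

sample = 'The value <code>watchedValue3< /code> changed.'
print(find_malformed(sample))  # → ['< /code>']
```

Running a check like this over all `config.json` files would surface any remaining broken tags before they reach the UI.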
In `@server/db/db_upgrade.py`:
- Around line 796-806: The early return triggered when Events.eveMac exists
should be removed and replaced with a full scan of all mapped tables so we don't
short-circuit pending renames in other tables; specifically, change the logic
around the PRAGMA for "Events" (the sql.execute('PRAGMA table_info("Events")')
call that populates events_cols and checks for "eveMac" / "eve_MAC") to record
that Events is already camelCase but continue processing other mapped tables,
and only decide to skip the entire camelCase migration after verifying every
mapped table either already uses camelCase or is missing/unrecognized; implement
this by replacing the immediate return True when "eveMac" is found with a flag
(e.g., events_is_camel=True) and removing the duplicate short-circuit so the
later migration loop inspects all tables before returning.
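The flag-based approach this prompt asks for can be sketched as follows; `RENAME_MAP`, the table names, and the column pairs are illustrative stand-ins for the real mapping in db_upgrade.py:

```python
import sqlite3

# Hypothetical subset of the table -> (legacy column, camelCase column) map.
RENAME_MAP = {
    "Events": [("eve_MAC", "eveMac")],
    "Sessions": [("ses_MAC", "sesMac")],
    "Devices": [("dev_MAC", "devMac")],  # may be missing in older DBs
}

def tables_needing_migration(conn: sqlite3.Connection) -> list[str]:
    """Inspect every mapped table instead of short-circuiting on Events."""
    pending = []
    for table, renames in RENAME_MAP.items():
        cols = {row[1] for row in conn.execute(f'PRAGMA table_info("{table}")')}
        if not cols:
            continue  # missing/unrecognized table: nothing to migrate here
        if any(old in cols for old, _new in renames):
            pending.append(table)
    return pending

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "Events" (eveMac TEXT)')     # already camelCase
conn.execute('CREATE TABLE "Sessions" (ses_MAC TEXT)')  # still legacy
print(tables_needing_migration(conn))  # → ['Sessions']
```

With this shape, an already-migrated Events table no longer masks a legacy Sessions table, and the migration is skipped only when the pending list comes back empty.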
---
Nitpick comments:
In `@front/plugins/omada_sdn_imp/config.json`:
- Line 443: Update the help string for REPORT_ON to use the new identifier
names: replace "PrimaryId" and "SecondaryId" with "objectPrimaryId" and
"objectSecondaryId" so the UI/help text is consistent with the new naming; keep
the rest of the sentence intact and retain references to "new" and
"watched-changed" and "watchedValueN" to preserve meaning.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 9ae66b49-b9d5-42ad-afc2-661686b4d14a
📒 Files selected for processing (20)
- front/js/cache.js
- front/plugins/arp_scan/config.json
- front/plugins/ddns_update/config.json
- front/plugins/dhcp_leases/config.json
- front/plugins/dhcp_servers/config.json
- front/plugins/internet_ip/config.json
- front/plugins/internet_speedtest/config.json
- front/plugins/nmap_scan/config.json
- front/plugins/omada_sdn_imp/config.json
- front/plugins/omada_sdn_openapi/config.json
- front/plugins/pihole_scan/config.json
- front/plugins/snmp_discovery/config.json
- front/plugins/unifi_import/config.json
- front/plugins/vendor_update/config.json
- front/plugins/website_monitor/config.json
- front/report.php
- server/db/db_upgrade.py
- server/initialise.py
- server/scan/name_resolution.py
- test/backend/test_notification_templates.py
🚧 Files skipped from review as they are similar to previous changes (4)
- test/backend/test_notification_templates.py
- front/plugins/internet_ip/config.json
- front/plugins/dhcp_servers/config.json
- front/plugins/pihole_scan/config.json
{
  "language_code": "de_de",
- "string": "Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert. <code>STRG + klicken</code> zum aus-/abwählen. <ul> <li><code>Watched_Value1</code> ist der Aktivstatus</li><li><code>Watched_Value2</code> ist der Hostname</li><li><code>Watched_Value3</code> ist die Hardware</li><li><code>Watched_Value4</code> ist der Zustand </li></ul>"
+ "string": "Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert. <code>STRG + klicken</code> zum aus-/abwählen. <ul> <li><code>watchedValue1</code> ist der Aktivstatus</li><li><code>watchedValue2</code> ist der Hostname</li><li><code>watchedValue3</code> ist die Hardware</li><li><code>watchedValue4</code> ist der Zustand </li></ul>"
Fix German localization typos in updated WATCH/REPORT text.
There are user-facing grammar/spelling issues in the changed strings (ausgwählter and wurde erkennt) that should be corrected before release.
Suggested text fix
- "string": "Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert. <code>STRG + klicken</code> zum aus-/abwählen. <ul> <li><code>watchedValue1</code> ist der Aktivstatus</li><li><code>watchedValue2</code> ist der Hostname</li><li><code>watchedValue3</code> ist die Hardware</li><li><code>watchedValue4</code> ist der Zustand </li></ul>"
+ "string": "Sende eine Benachrichtigung, wenn ein ausgewählter Wert sich ändert. <code>STRG + klicken</code> zum Aus-/Abwählen. <ul> <li><code>watchedValue1</code> ist der Aktivstatus</li><li><code>watchedValue2</code> ist der Hostname</li><li><code>watchedValue3</code> ist die Hardware</li><li><code>watchedValue4</code> ist der Zustand </li></ul>"
- "string": "Benachrichtige nur bei diesen Status. <code>new</code> bedeutet ein neues eindeutiges (einzigartige Kombination aus PrimaryId und SecondaryId) Objekt wurde erkennt. <code>watched-changed</code> bedeutet eine ausgewählte <code>watchedValueN</code>-Spalte hat sich geändert."
+ "string": "Benachrichtige nur bei diesen Status. <code>new</code> bedeutet, dass ein neues eindeutiges (einzigartige Kombination aus PrimaryId und SecondaryId) Objekt erkannt wurde. <code>watched-changed</code> bedeutet, dass sich eine ausgewählte <code>watchedValueN</code>-Spalte geändert hat."

Also applies to: 843-843
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@front/plugins/dhcp_leases/config.json` at line 794, Update the German
localization strings that contain typos: in the JSON value currently containing
"Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert..." replace
the misspelled "ausgwählter" with "ausgewählter" and check the other changed
WATCH/REPORT strings (also noted around the other entry at the same file,
referenced by the other occurrence) for the incorrect phrase "wurde erkennt" and
correct it to the proper German (e.g., "wird erkannt" or the intended tense).
Edit the affected JSON string values (the localization "string" entries) to use
the corrected German phrases so both occurrences are fixed.
… nmap_scan, and snmp_discovery plugins
Actionable comments posted: 2
♻️ Duplicate comments (1)
front/plugins/website_monitor/config.json (1)
612-612: ⚠️ Potential issue | 🟡 Minor

Fix unresolved Spanish REPORT_ON text (legacy IDs + malformed <code> span).

Line 612 still contains legacy PrimaryId/SecondaryId and malformed markup (<code>watchedValueN Las columnas </code>), so localization remains inconsistent with the camelCase migration.

✏️ Suggested fix
- "string": "Envíe una notificación solo en estos estados. <code>new</code> significa que se descubrió un nuevo objeto único (combinación única de PrimaryId y SecondaryId). <code>watched-changed</code> significa que seleccionó <code>watchedValueN Las columnas </code> cambiaron."
+ "string": "Envíe una notificación solo en estos estados. <code>new</code> significa que se descubrió un nuevo objeto único (combinación única de <code>objectPrimaryId</code> y <code>objectSecondaryId</code>). <code>watched-changed</code> significa que las columnas seleccionadas <code>watchedValueN</code> cambiaron."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@front/plugins/website_monitor/config.json` at line 612, Update the Spanish REPORT_ON string value: replace legacy `PrimaryId`/`SecondaryId` with the camelCase `primaryId`/`secondaryId`, and correct the malformed code span so `watchedValueN` is enclosed in its own `<code>` tag and the rest of the sentence reads naturally (e.g., "...<code>watchedValueN</code> las columnas cambiaron" or better "...las columnas <code>watchedValueN</code> cambiaron"). Locate and edit the "string" entry containing the current text so it becomes consistent with camelCase identifiers and valid HTML markup.
🧹 Nitpick comments (1)
server/initialise.py (1)
901-930: Consider specifying explicit encoding for file operations.

The file operations at lines 905 and 922 don't specify an encoding, which could cause issues on systems where the default encoding isn't UTF-8 and the config contains non-ASCII characters.

This matches the existing pattern in renameSettings(), but both functions would benefit from explicit encoding for robustness.

💡 Suggested improvement
- with open(str(config_file), "r") as f:
+ with open(str(config_file), "r", encoding="utf-8") as f:
      for line in f:

  with (
-     open(str(config_file), "r") as original,
-     open(str(config_file) + "_temp", "w") as temp,
+     open(str(config_file), "r", encoding="utf-8") as original,
+     open(str(config_file) + "_temp", "w", encoding="utf-8") as temp,
  ):
Verify each finding against the current code and only fix it if needed. In `@server/initialise.py` around lines 901 - 930, The file reads/writes in renameColumnReferences currently use open() without an explicit encoding, which can break on non-UTF-8 systems; update all open(...) calls in renameColumnReferences (the initial read loop, the backup/temp write block where you open original and temp) to specify encoding='utf-8' (and errors='replace' if you want to be tolerant), leaving shutil.copy/shutil.move as-is; keep the existing logic using _column_replacements, mylog, and timeNowUTC unchanged.
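The explicit-encoding pattern recommended above can be sketched like this; `rewrite_config` is a hypothetical helper standing in for the actual `renameColumnReferences` logic:

```python
import os
import tempfile

def rewrite_config(path: str, transform) -> None:
    """Rewrite a config file line by line via a temp file, always in UTF-8."""
    with open(path, "r", encoding="utf-8") as original, \
         open(path + "_temp", "w", encoding="utf-8") as temp:
        for line in original:
            temp.write(transform(line))
    os.replace(path + "_temp", path)  # atomic on the same filesystem

tmp = tempfile.mkdtemp()
cfg = os.path.join(tmp, "app.conf")
with open(cfg, "w", encoding="utf-8") as f:
    f.write("name=Gerät\n")  # non-ASCII survives regardless of system locale

rewrite_config(cfg, str.upper)
with open(cfg, encoding="utf-8") as f:
    print(f.read())  # → NAME=GERÄT
```

Pinning the encoding makes the rewrite deterministic even on hosts where the default locale encoding is not UTF-8.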
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@front/plugins/vendor_update/config.json`:
- Line 336: The German localization in the vendor_update plugin config has
typos: update the JSON value for the "string" entry in the vendor_update config
to correct "ausgwählter" to "ausgewählter" and find the other faulty phrase
"wurde erkennt" (also in the vendor_update localization entries) and change it
to the grammatically correct "wurde erkannt"; ensure you update the exact
"string" fields in front/plugins/vendor_update/config.json so the user-facing
texts are corrected.
In `@server/initialise.py`:
- Around line 856-898: The _column_replacements mapping in initialise.py is
missing non-underscore legacy keys for ObjectPrimaryID/ObjectSecondaryID; add
regex entries for r"\bObjectPrimaryID\b": "objectPrimaryId" and
r"\bObjectSecondaryID\b": "objectSecondaryId" alongside the existing
r"\bObject_PrimaryID\b"/r"\bObject_SecondaryID\b" and
r"\bObject_PrimaryId\b"/r"\bObject_SecondaryId\b" so the migration logic in
db_upgrade.py (which expects these variants) will correctly normalize those
column names.
---
Duplicate comments:
In `@front/plugins/website_monitor/config.json`:
- Line 612: Update the Spanish REPORT_ON string value: replace legacy
`PrimaryId`/`SecondaryId` with the camelCase `primaryId`/`secondaryId`, and
correct the malformed code span so `watchedValueN` is enclosed in its own
`<code>` tag and the rest of the sentence reads naturally (e.g.,
"...<code>watchedValueN</code> las columnas cambiaron" or better "...las
columnas <code>watchedValueN</code> cambiaron"). Locate and edit the "string"
entry containing the current text so it becomes consistent with camelCase
identifiers and valid HTML markup.
---
Nitpick comments:
In `@server/initialise.py`:
- Around line 901-930: The file reads/writes in renameColumnReferences currently
use open() without an explicit encoding, which can break on non-UTF-8 systems;
update all open(...) calls in renameColumnReferences (the initial read loop, the
backup/temp write block where you open original and temp) to specify
encoding='utf-8' (and errors='replace' if you want to be tolerant), leaving
shutil.copy/shutil.move as-is; keep the existing logic using
_column_replacements, mylog, and timeNowUTC unchanged.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: f99c30e6-4037-47a3-85d0-1ecca90a7c2e
📒 Files selected for processing (4)
- front/plugins/vendor_update/config.json
- front/plugins/website_monitor/config.json
- front/report.php
- server/initialise.py
{
  "language_code": "de_de",
- "string": "Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert. <code>STRG + klicken</code> zum aus-/abwählen. <ul> <li><code>Watched_Value1</code> ist der Herstellername</li><li><code>Watched_Value2</code> ist der Gerätename</li><li><code>Watched_Value3</code> ist nicht in Verwendung </li><li><code>Watched_Value4</code> ist nicht in Verwendung </li></ul>"
+ "string": "Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert. <code>STRG + klicken</code> zum aus-/abwählen. <ul> <li><code>watchedValue1</code> ist der Herstellername</li><li><code>watchedValue2</code> ist der Gerätename</li><li><code>watchedValue3</code> ist nicht in Verwendung </li><li><code>watchedValue4</code> ist nicht in Verwendung </li></ul>"
Fix minor German localization typos in changed strings.
Line 336 (ausgwählter) and Line 385 (wurde erkennt) are grammatically incorrect in user-facing text.
✏️ Suggested patch
- "string": "Sende eine Benachrichtigung, wenn ein ausgwählter Wert sich ändert. <code>STRG + klicken</code> zum aus-/abwählen. <ul> <li><code>watchedValue1</code> ist der Herstellername</li><li><code>watchedValue2</code> ist der Gerätename</li><li><code>watchedValue3</code> ist nicht in Verwendung </li><li><code>watchedValue4</code> ist nicht in Verwendung </li></ul>"
+ "string": "Sende eine Benachrichtigung, wenn ein ausgewählter Wert sich ändert. <code>STRG + Klicken</code> zum Aus-/Abwählen. <ul> <li><code>watchedValue1</code> ist der Herstellername</li><li><code>watchedValue2</code> ist der Gerätename</li><li><code>watchedValue3</code> ist nicht in Verwendung </li><li><code>watchedValue4</code> ist nicht in Verwendung </li></ul>"
- "string": "Benachrichtige nur bei diesen Status. <code>new</code> bedeutet ein neues eindeutiges (einzigartige Kombination aus PrimaryId und SecondaryId) Objekt wurde erkennt. <code>watched-changed</code> bedeutet eine ausgewählte <code>watchedValueN</code>-Spalte hat sich geändert."
+ "string": "Benachrichtige nur bei diesen Status. <code>new</code> bedeutet, dass ein neues eindeutiges Objekt (einzigartige Kombination aus PrimaryId und SecondaryId) erkannt wurde. <code>watched-changed</code> bedeutet, dass sich eine ausgewählte <code>watchedValueN</code>-Spalte geändert hat."

Also applies to: 385-385
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@front/plugins/vendor_update/config.json` at line 336, The German localization
in the vendor_update plugin config has typos: update the JSON value for the
"string" entry in the vendor_update config to correct "ausgwählter" to
"ausgewählter" and find the other faulty phrase "wurde erkennt" (also in the
vendor_update localization entries) and change it to the grammatically correct
"wurde erkannt"; ensure you update the exact "string" fields in
front/plugins/vendor_update/config.json so the user-facing texts are corrected.
_column_replacements = {
    # Event columns
    r"\beve_MAC\b": "eveMac",
    r"\beve_IP\b": "eveIp",
    r"\beve_DateTime\b": "eveDateTime",
    r"\beve_EventType\b": "eveEventType",
    r"\beve_AdditionalInfo\b": "eveAdditionalInfo",
    r"\beve_PendingAlertEmail\b": "evePendingAlertEmail",
    r"\beve_PairEventRowid\b": "evePairEventRowid",
    r"\beve_PairEventRowID\b": "evePairEventRowid",
    # Session columns
    r"\bses_MAC\b": "sesMac",
    r"\bses_IP\b": "sesIp",
    r"\bses_DateTimeConnection\b": "sesDateTimeConnection",
    r"\bses_DateTimeDisconnection\b": "sesDateTimeDisconnection",
    r"\bses_EventTypeConnection\b": "sesEventTypeConnection",
    r"\bses_EventTypeDisconnection\b": "sesEventTypeDisconnection",
    r"\bses_StillConnected\b": "sesStillConnected",
    r"\bses_AdditionalInfo\b": "sesAdditionalInfo",
    # Plugin columns (templates + WATCH values)
    r"\bObject_PrimaryID\b": "objectPrimaryId",
    r"\bObject_PrimaryId\b": "objectPrimaryId",
    r"\bObject_SecondaryID\b": "objectSecondaryId",
    r"\bObject_SecondaryId\b": "objectSecondaryId",
    r"\bWatched_Value1\b": "watchedValue1",
    r"\bWatched_Value2\b": "watchedValue2",
    r"\bWatched_Value3\b": "watchedValue3",
    r"\bWatched_Value4\b": "watchedValue4",
    r"\bDateTimeChanged\b": "dateTimeChanged",
    r"\bDateTimeCreated\b": "dateTimeCreated",
    r"\bSyncHubNodeName\b": "syncHubNodeName",
    # Online_History (in case of API_CUSTOM_SQL)
    r"\bScan_Date\b": "scanDate",
    r"\bOnline_Devices\b": "onlineDevices",
    r"\bDown_Devices\b": "downDevices",
    r"\bAll_Devices\b": "allDevices",
    r"\bArchived_Devices\b": "archivedDevices",
    r"\bOffline_Devices\b": "offlineDevices",
    # Language strings (unlikely in user config but thorough)
    r"\bLanguage_Code\b": "languageCode",
    r"\bString_Key\b": "stringKey",
    r"\bString_Value\b": "stringValue",
}
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Search db_upgrade.py for all column rename mappings to ensure completeness
echo "=== Checking db_upgrade.py for ObjectPrimaryID/ObjectSecondaryID without underscore ==="
rg -n "ObjectPrimaryID|ObjectSecondaryID" --type py
echo ""
echo "=== Checking for ses_DateTimeOrder references ==="
rg -n "ses_DateTimeOrder|sesDateTimeOrder" --type py

Repository: netalertx/NetAlertX
Length of output: 721
Add missing non-underscore variants for ObjectPrimaryID and ObjectSecondaryID.
The _column_replacements dictionary is missing the legacy column names without underscores. The codebase in db_upgrade.py (lines 759-760) maps these variants:
- ObjectPrimaryID → objectPrimaryId
- ObjectSecondaryID → objectSecondaryId
Add these entries to ensure user configs using the non-underscore naming convention are properly migrated.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@server/initialise.py` around lines 856 - 898, The _column_replacements
mapping in initialise.py is missing non-underscore legacy keys for
ObjectPrimaryID/ObjectSecondaryID; add regex entries for r"\bObjectPrimaryID\b":
"objectPrimaryId" and r"\bObjectSecondaryID\b": "objectSecondaryId" alongside
the existing r"\bObject_PrimaryID\b"/r"\bObject_SecondaryID\b" and
r"\bObject_PrimaryId\b"/r"\bObject_SecondaryId\b" so the migration logic in
db_upgrade.py (which expects these variants) will correctly normalize those
column names.