
Add getting LLM interface to naively work with OpenAI. #553

Triggered via push on September 30, 2025, 05:59
Status: Failure
Total duration: 34m 56s
Artifacts: 2

build_app.yml

on: push
Matrix: build-win
release
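
The workflow file itself is not shown on this page. As rough orientation only, a skeleton consistent with the graph above (push trigger, a build-win matrix, and a downstream release job) might look like the sketch below; the runner images, matrix values, and steps are assumptions, not the contents of the real build_app.yml, and the run also includes a build-linux job (see the annotations below) that is omitted here.

```yaml
# Hypothetical skeleton only; the real build_app.yml will differ.
name: build_app

on: push

jobs:
  build-win:
    strategy:
      matrix:
        # Placeholder axis; the actual matrix entries are not visible on this page.
        toolchain: [MSVC-2022]
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build InterSpec
        run: echo "build commands go here"

  release:
    needs: build-win
    runs-on: ubuntu-latest
    steps:
      - name: Package release artifacts
        run: echo "packaging commands go here"
```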

Annotations

1 error and 5 warnings
Windows Latest MSVC-2022: Process completed with exit code 1.
Windows Latest MSVC-2022: Failed to save: Our services aren't available right now. We're working to restore all services as soon as possible. Please check back soon.
Windows Latest MSVC-2022: Failed to restore: Cache service responded with 400
Windows Latest MSVC-2022: Failed to save: Our services aren't available right now. We're working to restore all services as soon as possible. Please check back soon.
Windows Latest MSVC-2022: Failed to restore: Cache service responded with 400
build-linux: Failed to save: "/usr/bin/tar" failed with error: The process '/usr/bin/tar' failed with exit code 2

Artifacts

Produced during runtime
Name: InterSpec_app_Windows_WebView_latest
Size: 4.08 MB
Digest: sha256:3ba5e55f0622d109534d6f7bb57bb5f5c90b66b51e53ed05678102e92da1e986

Name: linux-build
Size: 124 MB
Digest: sha256:c0d71589b01f4bc9ae715d4fd2ce726d16d53251f67d192a1a05fa705435b01e
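
To spot-check a downloaded artifact against the digests listed above, a job along these lines could be appended to the workflow. It is entirely hypothetical: it assumes the artifact was uploaded earlier in the same run, and it only prints per-file checksums for manual comparison, since this page does not state exactly which file the listed sha256 covers.

```yaml
# Hypothetical job to add under "jobs:"; not part of build_app.yml.
check-windows-artifact:
  needs: build-win              # assumes build-win uploads the artifact
  runs-on: ubuntu-latest
  steps:
    - uses: actions/download-artifact@v4
      with:
        name: InterSpec_app_Windows_WebView_latest
        path: artifact
    - name: Print checksums for comparison with the run page
      shell: bash
      run: |
        echo "Expected digest: sha256:3ba5e55f0622d109534d6f7bb57bb5f5c90b66b51e53ed05678102e92da1e986"
        find artifact -type f -exec sha256sum {} \;
```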