Conversation

felvieira

refactor: simplify Docker Compose configuration for Coolify deployment

  • Remove multiple build stages to prevent deployment conflicts
  • Simplify to a single production profile
  • Adjust port configuration to 5173
  • Remove redundant build and start commands
  • Optimize docker-compose structure for better Coolify compatibility

@felvieira changed the title from "Coolify deploy configuration WORKING AND DONE" to "build: Coolify deploy configuration WORKING AND DONE" on Jan 31, 2025
@felvieira changed the title from "build: Coolify deploy configuration WORKING AND DONE" to "build: Coolify deploy configuration" on Jan 31, 2025
@felvieira changed the title from "build: Coolify deploy configuration" to "build: coolify deploy configuration" on Jan 31, 2025
@iaminawe

iaminawe commented Jan 31, 2025

Thanks for this Felipe, it's much appreciated. I tried to test it earlier by merging your pull request into my up-to-date fork of bolt.diy at https://github.com/iaminawe/bolt.diy and then deploying through Coolify. I added the relevant keys for Claude, OpenAI, Groq and OpenRouter, but when I deploy through Coolify I see the interface for a moment and then get an application error. I have been troubleshooting it for a few hours without success, and I was hoping you might be able to point me in the right direction as to what I could be missing. Thanks in advance for any assistance you can give.

@kingfish65

kingfish65 commented Jan 31, 2025

@iaminawe It seems to work for me. I ran the ToDo app from your link with one error; I clicked on "Ask Bolt" and it was fixed. I would not leave your API keys on a public-facing app, as somebody could really burn through your AI tokens.

@leex279 self-requested a review on January 31, 2025 08:10
@felvieira
Author

> I added relevant keys for Claude, openai, groq and open router but when I deploy through coolify I see the interface for a moment then I get an application error. [...]

Put your keys only in environment variables inside Coolify.
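
For reference, a minimal sketch (not part of this PR; the variable names are assumed to match bolt.diy's .env.example) of how the compose service can pick the keys up from Coolify's environment variables instead of hard-coding them anywhere in the repo:

services:
  app:
    environment:
      # values are injected from Coolify's "Environment Variables" panel for the resource
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - GROQ_API_KEY=${GROQ_API_KEY}
      - OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY}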

@iaminawe

Thank you both. I did add my keys through the environment variables in Coolify. I only left it open publicly temporarily because I was working on it, and as far as I could see it was not working. I will try it out in some other browsers and see if I can access it without the app crashing. Much appreciated.

- "host.docker.internal:host-gateway"
command: pnpm run dockerstart
profiles:
- prebuilt No newline at end of file

Missing newline at end of file.


# Install pnpm and wrangler
RUN corepack enable pnpm && \
    npm install -g wrangler


To address the problem in c88938c:

Suggested change:
-    npm install -g wrangler
+    npm install -g wrangler corepack@latest

@Arlington1985

In the long run, managing separate Coolify Dockerfiles and Compose files may not be a sustainable approach. It’s easy for them to become unsynchronized or overlooked over time. I’d love to hear the maintainers' thoughts on this.

@leex279
Collaborator

leex279 commented Feb 17, 2025

@Arlington1985 I see it the same way. We have blocked most additions like this and closed PRs that did similar things.

For Coolify, as it is just one file, we could in my view keep it until the Extension Library is in place; then it could be added there and would not live in the main bolt.diy repo.
See #935 (I don't know when we will/can launch this, but a lot of this can then be provided there instead of in the main "product". Maybe the extension library can also be community-driven; we will see.)

@Arlington1985

I was trying to run it on top of my own forked branch https://github.com/Arlington1985/bolt.diy/tree/coolify and was getting the following error:

#12 72.14 <--- Last few GCs --->
#12 72.14 [19:0x1df06f30]    69552 ms: Mark-Compact 1986.1 (2094.5) -> 1986.0 (2094.5) MB, 799.47 / 0.00 ms  (average mu = 0.130, current mu = 0.004) allocation failure; scavenge might not succeed
#12 72.14 [19:0x1df06f30]    70373 ms: Mark-Compact 1987.1 (2095.5) -> 1987.0 (2095.5) MB, 819.08 / 0.00 ms  (average mu = 0.070, current mu = 0.002) allocation failure; scavenge might not succeed
#12 72.14 <--- JS stacktrace --->
#12 72.14 FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
#12 72.14 ----- Native stack trace -----
#12 72.14  1: 0xb8ced1 node::OOMErrorHandler(char const*, v8::OOMDetails const&) [node]
#12 72.14  2: 0xf06460 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, v8::OOMDetails const&) [node]
#12 72.14  3: 0xf06747 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, v8::OOMDetails const&) [node]
#12 72.14  4: 0x11182e5  [node]
#12 72.14  5: 0x1118874 v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [node]
#12 72.14  6: 0x112f764 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::internal::GarbageCollectionReason, char const*) [node]
#12 72.15  7: 0x112ff7c v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [node]
#12 72.15  8: 0x1106281 v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
#12 72.15  9: 0x1107415 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
#12 72.15 10: 0x10e3b36 v8::internal::Factory::AllocateRaw(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [node]
#12 72.15 11: 0x10d5764 v8::internal::FactoryBase<v8::internal::Factory>::AllocateRawWithImmortalMap(int, v8::internal::AllocationType, v8::internal::Map, v8::internal::AllocationAlignment) [node]
#12 72.15 12: 0x10d7f46 v8::internal::FactoryBase<v8::internal::Factory>::NewRawOneByteString(int, v8::internal::AllocationType) [node]
#12 72.15 13: 0x10eef64 v8::internal::Factory::NewStringFromUtf8(v8::base::Vector<char const> const&, v8::internal::AllocationType) [node]
#12 72.15 14: 0xf18cf2 v8::String::NewFromUtf8(v8::Isolate*, char const*, v8::NewStringType, int) [node]
#12 72.15 15: 0xdf17e7  [node]
#12 72.15 16: 0xdf191f node::StringDecoder::DecodeData(v8::Isolate*, char const*, unsigned long*) [node]
#12 72.15 17: 0xdf1e5d  [node]
#12 72.15 18: 0xf6e88f v8::internal::FunctionCallbackArguments::Call(v8::internal::CallHandlerInfo) [node]
#12 72.15 19: 0xf6f0fd  [node]
#12 72.15 20: 0xf6f5c5 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) [node]
#12 72.16 21: 0x1979df6  [node]
#12 72.59 Aborted (core dumped)
#12 72.60  ELIFECYCLE  Command failed with exit code 134.
#12 ERROR: process "/bin/sh -c pnpm run build" did not complete successfully: exit code: 134
------
> [app 8/8] RUN pnpm run build:
[BuildKit re-prints the tail of the stack trace above]
------
failed to solve: process "/bin/sh -c pnpm run build" did not complete successfully: exit code: 134

Initially I tried on a Hetzner CX22 (2 vCPU, 4 GB RAM) and got the error, then tried on a CX32 (4 vCPU, 8 GB RAM) and got the same error again. Has anyone experienced the same issue?
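
The trace shows the V8 heap topping out around 2 GB even on hosts with more RAM, which points at Node's default old-space limit rather than the machine itself. A possible workaround, a sketch only and not part of this PR (the 4096 MB value is just an example), is to raise the limit in the build stage of the Dockerfile before the build step:

# Dockerfile build stage: raise Node's heap ceiling for the build
ENV NODE_OPTIONS=--max-old-space-size=4096
RUN pnpm run build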

@felvieira
Author

> I was trying to run it on top of my own forked branch https://github.com/Arlington1985/bolt.diy/tree/coolify and was getting "FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory" during pnpm run build. [...] Initially I tried on a Hetzner CX22 (2 vCPU, 4 GB RAM) and got the error, then tried on a CX32 (4 vCPU, 8 GB RAM) and got the same error again. Has anyone experienced the same issue?

My VPS has 12 GB of RAM and it deploys OK: https://bolt.felvieira.com.br/

@felvieira
Author

felvieira commented Feb 18, 2025 via email

@Arlington1985

Arlington1985 commented Feb 18, 2025

Just wondering: I can deploy it directly inside a 4 GB VM (DNS resolution doesn't work there, but I can reach it by public IP address and port, just like locally), so why would Coolify require so much more power?

@felvieira
Author

felvieira commented Feb 18, 2025 via email

@Arlington1985

> I don't know if the issue you report is because of this, but in my VM with 12 GB of RAM it works like a charm.


I tried with 16 GB and got the same error. Has anyone else already tried to spin it up?

@goldzulu

goldzulu commented Mar 9, 2025

I've tested this on [email protected] pulled directly from felvieira:coolify and managed to make it work. At first it was displaying a blank screen, and I found out that Caddy was refusing port 3000 (not sure if Traefik will behave the same).

I had to add a dynamic proxy configuration on the server itself in the Coolify dashboard:

boltdiy.caddy

yourfqdn.com {
    reverse_proxy app-xosg488804ockskoo8888os4-201618619730:5173
}

Replace app-xxxxx-xxxx with your own bolt.diy Docker instance name (find it with docker ps in a terminal).
It seems port 3000 has no effect (Caddy refused to connect on that port for some reason).

bolt.diy will then work just fine in Chrome, but not in Safari for some reason (it loads up and then spits out some JavaScript error). At least Chrome is working!

@felvieira perhaps update your coolify branch to stable or main (make sure it is bolt 0.0.7), then use your Dockerfile-coolify and docker-compose-coolify and see if you get the same error. I tried playing with Coolify resource limits but did not get far. I believe something in 0.0.7 uses a bit too much memory, or is buggy and causes errors that drain memory.

In my test, if I just copy Dockerfile-coolify and docker-compose-coolify.yaml onto the latest HEAD of main (or the stable branch, it doesn't matter), which is bolt 0.0.7, the deployment errors out of memory just like @Arlington1985 mentioned.

@kerbymart

Has anyone encountered this issue with Bolt on Coolify?

After a prompt, the models work, but the shell fails to spawn with the error:

Failed to spawn shell  
Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.

See complete logs

Logs also show a DNS lookup failure for host.docker.internal.

My setup:

  • Coolify runs on a server accessible via ZeroTier.
  • The server’s ZT IP is mapped on my Macbook:
    192.168.196.186  coolify.local  
    192.168.196.186  bolt-diy.coolify.local  
    
  • The "Domains for App" in Coolify is set to http://bolt-diy.coolify.local.
  • Deployed using [docker-compose-coolify.yaml]

Any ideas on what might be causing this?
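
One thing worth checking, although it is not something this PR configures: that SharedArrayBuffer message usually means the page is served without cross-origin isolation, which requires COOP/COEP headers on the response. A sketch for a Caddy site block, assuming a reverse proxy like the one described earlier in the thread (the host name and upstream below are placeholders):

bolt-diy.coolify.local {
    # enable crossOriginIsolated so SharedArrayBuffer can be transferred to the worker
    header Cross-Origin-Opener-Policy "same-origin"
    header Cross-Origin-Embedder-Policy "require-corp"
    reverse_proxy app-xxxxx-xxxx:5173
}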

@nemixe

nemixe commented Apr 6, 2025

Can anyone share their Coolify settings? I haven't gotten this running right.

I tried to run it with the custom start command docker compose --profile prebuilt up -d app-prebuild.

It succeeds, but it is not running on the domain I set, so the only way to access it is through my server's IP address and port.

@trigop

trigop commented Apr 14, 2025

I think it is not necessary to make such a big change.
To make it work in Coolify I just needed to add a profile to the docker compose command, and that makes everything work.

In my case I selected production with docker compose --profile production up -d and it worked correctly.
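
For context, this is roughly how a profile gates a service in the compose file (the service name and details below are illustrative, not the exact contents of docker-compose-coolify.yaml); with no profile selected, a plain docker compose up starts nothing:

services:
  app-production:            # illustrative service name
    build:
      context: .
      dockerfile: Dockerfile-coolify
    ports:
      - "5173:5173"
    profiles:
      - production           # started only when this profile is activated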

@clarcksp

Hello, I am trying to install bolt.diy on a Coolify server, version 4.0.0.
In my case I will only use the connection to the Ollama server that I have.

This is the excerpt of the error I am getting:

Starting deployment of felvieira/bolt.feldev:main-bcccw8occowgsosg84ows0w8 to localhost.
Preparing container with helper image: ghcr.io/coollabsio/coolify-helper:1.0.8.
[CMD]: docker stop --time=30 nwgw88g40cscos4o44kwcgwc
Error response from daemon: No such container: nwgw88g40cscos4o44kwcgwc
[CMD]: docker rm -f nwgw88g40cscos4o44kwcgwc
Error response from daemon: No such container: nwgw88g40cscos4o44kwcgwc
[CMD]: docker run -d --network coolify --name nwgw88g40cscos4o44kwcgwc --rm -v /var/run/docker.sock:/var/run/docker.sock ghcr.io/coollabsio/coolify-helper:1.0.8
de7323cea86e7ba00fc795e3b01816c139feba06ac2cc522ae1bd5e07989b686
[CMD]: docker exec nwgw88g40cscos4o44kwcgwc bash -c 'GIT_SSH_COMMAND="ssh -o ConnectTimeout=30 -p 22 -o Port=22 -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" git ls-remote https://github.com/felvieira/bolt.feldev main'
5e590aa refs/heads/main
Importing felvieira/bolt.feldev:main (commit sha HEAD) to /artifacts/nwgw88g40cscos4o44kwcgwc.
[CMD]: docker exec nwgw88g40cscos4o44kwcgwc bash -c 'git clone -b "main" https://github.com/felvieira/bolt.feldev /artifacts/nwgw88g40cscos4o44kwcgwc && cd /artifacts/nwgw88g40cscos4o44kwcgwc && sed -i "s#git@(.*):#https://\1/#g" /artifacts/nwgw88g40cscos4o44kwcgwc/.gitmodules || true && cd /artifacts/nwgw88g40cscos4o44kwcgwc && GIT_SSH_COMMAND="ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" git submodule update --init --recursive && cd /artifacts/nwgw88g40cscos4o44kwcgwc && GIT_SSH_COMMAND="ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" git lfs pull'
Cloning into '/artifacts/nwgw88g40cscos4o44kwcgwc'...
sed: /artifacts/nwgw88g40cscos4o44kwcgwc/.gitmodules: No such file or directory
[CMD]: docker exec nwgw88g40cscos4o44kwcgwc bash -c 'cd /artifacts/nwgw88g40cscos4o44kwcgwc && git log -1 5e590aa --pretty=%B'
Merge pull request #1748 from xKevIsDev/enhancements
feat: add inspector, design palette and redesign
Pulling & building required images.
Removing old containers.
Starting new application.
[CMD]: docker exec nwgw88g40cscos4o44kwcgwc bash -c 'SOURCE_COMMIT=5e590aa16ad09e8c5f6eb3f9a204ee2eea40ec26 COOLIFY_BRANCH=main docker compose --env-file /artifacts/nwgw88g40cscos4o44kwcgwc/.env --project-name i4ccwc0o0scossoo4w80ossc --project-directory /artifacts/nwgw88g40cscos4o44kwcgwc -f /artifacts/nwgw88g40cscos4o44kwcgwc/docker-compose.yaml up -d'
no service selected
exit status 1
Oops something is not okay, are you okay? 😢
no service selected
exit status 1
Gracefully shutting down build container: nwgw88g40cscos4o44kwcgwc
[CMD]: docker stop --time=30 nwgw88g40cscos4o44kwcgwc
nwgw88g40cscos4o44kwcgwc
[CMD]: docker rm -f nwgw88g40cscos4o44kwcgwc
Error response from daemon: No such container: nwgw88g40cscos4o44kwcgwc
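
"no service selected" is what docker compose reports when every service in the file sits behind a profile and none is activated; the generated command in the log above runs docker compose ... up -d with no --profile flag. A possible workaround (an untested sketch) is to activate the profile through an environment variable on the Coolify resource instead of editing the generated command:

# Compose honours this variable the same way as --profile;
# add it to the resource's environment variables in Coolify.
COMPOSE_PROFILES=production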

@Stijnus self-assigned this on Sep 24, 2025