
Conversation

@ahmadnaufal

No description provided.

```go
	EndTime time.Time
}

func (r *LineageResolver) pruneLineage(lineage *scheduler.JobLineageSummary, maxUpstreamsPerLevel, depth int) *scheduler.JobLineageSummary {
```


Possible to cover this scenario in a test case?

@sravankorumilli left a comment


Possible to cover this scenario in a test case

```go
func (r *LineageResolver) pruneLineage(lineage *scheduler.JobLineageSummary, maxUpstreamsPerLevel, depth int) *scheduler.JobLineageSummary {
	// base case: stop if max depth reached or number of upstreams is already within limit
	if depth > MaxLineageDepth || len(lineage.Upstreams) <= maxUpstreamsPerLevel {
		if depth > MaxLineageDepth {
```
Member


Can we avoid recursion here and use a stack data structure instead?

Recursion allocates stack frames, which are limited in number and can lead to a stack overflow. An explicit stack data structure is safer because it lives on heap memory.

```go
	// base case: stop if max depth reached or number of upstreams is already within limit
	if depth > MaxLineageDepth || len(lineage.Upstreams) <= maxUpstreamsPerLevel {
		if depth > MaxLineageDepth {
			return &scheduler.JobLineageSummary{
```
Member

@deryrahman commented on Oct 17, 2025


Base question: why do we need to prune the lineage at all? If a cycle is found, that should be fine, I guess, as long as we don't revisit the same job again.
