3 changes: 3 additions & 0 deletions .gitignore
@@ -2,3 +2,6 @@
node_modules
.venv
__pycache__
venv/
*.pyc
.env
@@ -5,25 +5,25 @@
* and the "product" is every number multiplied together
* so for example: [2, 3, 5] would return
* {
* "sum": 10, // 2 + 3 + 5
* "product": 30 // 2 * 3 * 5
* "sum": 10, // 2 + 3 + 5
* "product": 30 // 2 * 3 * 5
* }
*
* Time Complexity:
* Space Complexity:
* Optimal Time Complexity:
* Time Complexity: O(N)
* Space Complexity: O(1)
* Optimal Time Complexity: O(N)
*
* @param {Array<number>} numbers - Numbers to process
* @returns {Object} Object containing running total and product
*/
export function calculateSumAndProduct(numbers) {
let sum = 0;
for (const num of numbers) {
sum += num;
}

let product = 1;

// The complexity is reduced by combining the two sequential O(N) loops
// into a single O(N) loop, cutting the number of array traversals in half.
for (const num of numbers) {
sum += num;
product *= num;
}

@@ -32,3 +32,18 @@ export function calculateSumAndProduct(numbers) {
product: product,
};
}

/* Explanation:
The complexity is reduced by combining the two sequential O(N) loops
into a single O(N) loop, cutting the number of array traversals in half.

Complexity of Refactor
Time Complexity: O(N) (linear time).
The function performs only one pass over the N elements of the array,
which is the most efficient possible time complexity for this problem,
since every element must be read at least once.

Space Complexity: O(1) (constant space).
Only a fixed number of variables (sum, product, and num) are used,
regardless of the size of the input array.
*/
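
For illustration only, the same single-pass idea can be sketched in Python; the calculate_sum_and_product name below is hypothetical and not part of this PR.

# Illustrative sketch, not part of the PR: the same single-pass
# sum-and-product approach expressed in Python.
from typing import Dict, List


def calculate_sum_and_product(numbers: List[float]) -> Dict[str, float]:
    total = 0
    product = 1
    for num in numbers:  # one O(N) pass computes both aggregates
        total += num
        product *= num
    return {"sum": total, "product": product}


print(calculate_sum_and_product([2, 3, 5]))  # {'sum': 10, 'product': 30}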
48 changes: 42 additions & 6 deletions Sprint-1/JavaScript/findCommonItems/findCommonItems.js
@@ -1,14 +1,50 @@
/**
* Finds common items between two arrays.
*
* Time Complexity:
* Space Complexity:
* Optimal Time Complexity:
* Time Complexity: O(N + M)
* Space Complexity: O(M)
* Optimal Time Complexity: O(N + M)
*
* @param {Array} firstArray - First array to compare
* @param {Array} secondArray - Second array to compare
* @returns {Array} Array containing unique common items
*/
export const findCommonItems = (firstArray, secondArray) => [
...new Set(firstArray.filter((item) => secondArray.includes(item))),
];
export const findCommonItems = (firstArray, secondArray) => {
// 1. Create a Set from the second array. (O(M) time)
// This allows for O(1) average time lookups using the has() method,
// as per the typical performance characteristics of Set.
const secondSet = new Set(secondArray);

// 2. Filter the first array (O(N) time) using O(1) lookups.
// The filter uses Set.prototype.has() for O(1) average time complexity.
// The overall time complexity for the filter step becomes O(N * 1) = O(N).
const commonItems = firstArray.filter((item) => secondSet.has(item));

// 3. Use a final Set to guarantee unique results (O(N) time) and return an array.
return [...new Set(commonItems)];
};

/*
Explanation:
The function first creates a Set from the second array, which takes O(M) time,
where M is the length of the second array. This allows for O(1) average time
lookups when checking for common items.

Next, it filters the first array, which takes O(N) time, where N is the length
of the first array. Each lookup in the Set is O(1) on average, so the overall
time complexity for this step remains O(N).

Finally, to ensure that the result contains only unique items, a new Set is
created from the filtered results, which also takes O(N) time in the worst case.
Converting this Set back to an array is also O(N).

Overall, the total time complexity is O(N + M), which is optimal for this problem,
as each element from both arrays must be examined at least once.

Space Complexity:
The space complexity is O(M) due to storing the second array in a Set.
The additional space used for the result array is O(K), where K is the number
of unique common items; since K can never exceed min(N, M), it stays within the O(M) bound.

Source: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set
*/
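
The same Set-based pattern translates directly to Python, where set membership checks are likewise O(1) on average. The find_common_items Python helper below is a hypothetical sketch for comparison, not part of this PR.

# Illustrative sketch, not part of the PR: the same O(N + M) set-based
# intersection with unique results, expressed in Python.
from typing import List, Set, TypeVar

T = TypeVar("T")


def find_common_items(first: List[T], second: List[T]) -> List[T]:
    second_set = set(second)  # O(M) to build; O(1) average membership checks
    seen: Set[T] = set()
    common: List[T] = []
    for item in first:  # O(N) pass over the first array
        if item in second_set and item not in seen:
            seen.add(item)  # keep only the first occurrence of each match
            common.append(item)
    return common


print(find_common_items([1, 2, 2, 3], [2, 3, 4]))  # [2, 3]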
45 changes: 37 additions & 8 deletions Sprint-1/Python/has_pair_with_sum/has_pair_with_sum.py
@@ -1,18 +1,47 @@
from typing import List, TypeVar
from typing import List, TypeVar, Set

Number = TypeVar("Number", int, float)


def has_pair_with_sum(numbers: List[Number], target_sum: Number) -> bool:
"""
Find if there is a pair of numbers that sum to a target value.
This uses a hash set (Python's set) to achieve O(N) time complexity.

Time Complexity:
Space Complexity:
Optimal time complexity:
Time Complexity: O(N)
Space Complexity: O(N)
Optimal time complexity: O(N)
"""
for i in range(len(numbers)):
for j in range(i + 1, len(numbers)):
if numbers[i] + numbers[j] == target_sum:
return True
seen_numbers: Set[Number] = set()

# Iterate through the list once (O(N) time)
for current_num in numbers:
complement = target_sum - current_num

# Check if the required complement is already in the set (O(1) average time)
if complement in seen_numbers:
return True

# Add the current number to the set for future lookups
seen_numbers.add(current_num)

return False

'''
In the original version, the complexity was driven by the nested for loops:
the inner loop forced a check of every possible pair, giving O(N^2) time.
Instead, we can use a hash set (a set in Python) to store the numbers we've already seen.
This allows us to perform lookups in O(1) average time.

Complexity of Refactor
Time Complexity: O(N) (linear time).
The function performs a single pass over the N elements, with constant-time operations inside the loop.
This is the optimal complexity.

Space Complexity:
O(N) (linear space). We introduce the seen_numbers set, which, in the worst case,
will store up to N elements from the input list.

Resources: https://www.w3schools.com/python/ref_set_intersection.asp

'''
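
A short usage sketch of the refactored function; these example calls are illustrative and not taken from the PR's tests, and the import assumes the module name matches the file.

# Illustrative usage only, not part of the PR.
from has_pair_with_sum import has_pair_with_sum

print(has_pair_with_sum([1, 4, 7, 11], 15))  # True  (4 + 11 == 15)
print(has_pair_with_sum([1, 4, 7, 11], 20))  # False (no pair sums to 20)
print(has_pair_with_sum([], 5))              # False (an empty list has no pairs)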
21 changes: 21 additions & 0 deletions Sprint-1/Python/has_pair_with_sum/package-lock.json

Some generated files are not rendered by default.

5 changes: 5 additions & 0 deletions Sprint-1/Python/has_pair_with_sum/package.json
@@ -0,0 +1,5 @@
{
"dependencies": {
"pytest": "^1.0.0"
}
}
32 changes: 21 additions & 11 deletions Sprint-1/Python/remove_duplicates/remove_duplicates.py
@@ -1,25 +1,35 @@
from typing import List, Sequence, TypeVar
from typing import List, Sequence, TypeVar, Set

ItemType = TypeVar("ItemType")


def remove_duplicates(values: Sequence[ItemType]) -> List[ItemType]:
"""
Remove duplicate values from a sequence, preserving the order of the first occurrence of each value.
Refactored to use a set for O(1) average time lookups, achieving O(N) overall time complexity.

Time complexity:
Space complexity:
Optimal time complexity:
Time complexity: O(N)
Space complexity: O(N)
Optimal time complexity: O(N)
"""
unique_items = []
seen: Set[ItemType] = set()
unique_items: List[ItemType] = []

for value in values:
is_duplicate = False
for existing in unique_items:
if value == existing:
is_duplicate = True
break
if not is_duplicate:
if value not in seen:
seen.add(value)
unique_items.append(value)

return unique_items

"""
Explanation:
In the original version, the inner loop performed a linear scan through the unique_items list to check for duplicates, which was the source of the O(N^2) time complexity.

The only effective way to reduce the O(N^2) complexity for this problem is to replace the
linear search (for existing in unique_items) with an O(1) average-time lookup,
which requires a hash set (set in Python). Using a set to track seen items achieves O(N) time complexity, which is optimal.

Resource: https://www.w3schools.com/python/ref_set_intersection.asp

"""