## 1235. Maximum Profit in Job Scheduling

We have n jobs, where job i is scheduled to run from startTime[i] to endTime[i], yielding a profit of profit[i].

You're given the startTime, endTime, and profit arrays. Output the maximum total profit you can obtain from a subset of jobs such that no two jobs in the subset have overlapping time ranges.

If you choose a job that ends at time X, you can start another job that begins at time X.

Example 1:

Input: startTime = [1,2,3,3], endTime = [3,4,5,6], profit = [50,10,40,70]
Output: 120
Explanation: The subset chosen is the first and fourth job.
Time range [1-3] + [3-6]; we get a profit of 120 = 50 + 70.

Example 2:

Input: startTime = [1,2,3,4,6], endTime = [3,5,10,6,9], profit = [20,20,100,70,60]
Output: 150
Explanation: The subset chosen is the first, fourth and fifth job.
Profit obtained 150 = 20 + 70 + 60.

Example 3:

Input: startTime = [1,1,1], endTime = [2,3,4], profit = [5,6,4]
Output: 6

### Solution

Approach 1 (TLE): knapsack-style DP. Sort the jobs by start time, and let total[i][j] be the max total profit using the first i jobs up until time j.

Approach 2 (TLE): typical 2D DP -- let table[(i, j)] be the max profit obtainable within the time window [i, j].

Approach 3: monotonic time flow + binary search. Keep a list of (end_time, max_profit) pairs sorted by time; for each job, processed in order of end time, append a new pair if taking this job yields a higher profit than the current best. Time: O(N log N).
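The binary-search step in Approach 3 hinges on one detail: searching for [t, inf] with bisect_right lands just past the last entry whose time is <= t. A minimal illustration with hypothetical values:

```python
import bisect

# the profit list is kept sorted by time; each entry is [time, best_profit_by_then]
max_profit = [[0, 0], [3, 50], [5, 90]]

# best profit achievable by time t: bisect_right with [t, inf] returns the first
# position whose entry exceeds [t, inf], so index - 1 is the last entry with time <= t
t = 3
i = bisect.bisect_right(max_profit, [t, float('inf')]) - 1
print(max_profit[i])  # [3, 50]
```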

### Code (Python)

Approach 1:

```python
def jobScheduling(self, start_time, end_time, profit):
    # total[i][j]: max total profit using first i jobs up until time j
    # init: total[0][j] = 0, total[i][0] = 0
    # transition: total[i][j] = max(total[i-1][j], total[i][j-1],
    #                               profit[i-1] + total[i-1][start[i-1]] if end[i-1] <= j)
    # time: O(max(number of jobs * max end time, J log J -- J for number of jobs)); TLE
    jobs = sorted(zip(start_time, end_time, profit))
    total = [[0] * (max(end_time) + 1) for _ in range(len(jobs) + 1)]
    for i in range(1, len(total)):
        for j in range(1, len(total[0])):
            total[i][j] = max(total[i-1][j], total[i][j-1])
            if jobs[i-1][1] <= j:
                total[i][j] = max(total[i][j], jobs[i-1][2] + total[i-1][jobs[i-1][0]])
    return max(total[-1])
```

Approach 2:

```python
from collections import defaultdict
from typing import List

class Solution:
    def jobScheduling(self, startTime: List[int], endTime: List[int], profit: List[int]) -> int:
        # table[(start, end)]: max profit obtainable within the time window [start, end]
        table = defaultdict(int)
        for i in range(len(startTime)):
            # max guards against duplicate (start, end) pairs
            table[(startTime[i], endTime[i])] = max(table[(startTime[i], endTime[i])], profit[i])

        min_start, max_end = min(startTime), max(endTime)
        # widen the window one unit at a time, combining sub-windows split at every i
        for delta in range(1, max_end - min_start + 2):
            for start in range(min_start, max_end - delta + 1):
                end = start + delta
                max_val = 0
                for i in range(start, end + 1):
                    max_val = max(max_val, table[(start, i)] + table[(i, end)])
                table[(start, end)] = max_val

        return table[(min_start, max_end)]
```

Approach 3:

```python
import bisect

class Solution:
    def jobScheduling(self, start_time, end_time, profit):
        # https://leetcode.com/problems/maximum-profit-in-job-scheduling/discuss/409009/JavaC%2B%2BPython-DP-Solution
        # time progresses monotonically: keep a list of [end_time, max_profit] pairs
        # sorted by time, and for each job append a new pair if taking it yields
        # a higher profit than the current best
        jobs = sorted(zip(start_time, end_time, profit), key=lambda job: job[1])
        max_profit = [[0, 0]]
        for job in jobs:
            # find the max profit achievable before this job starts; searching for
            # [job[0], inf] makes bisect_right land just past the last entry with
            # time <= job[0] (bisect_left could stop at an entry with equal time)
            index = bisect.bisect_right(max_profit, [job[0], float('inf')]) - 1
            if max_profit[index][1] + job[2] > max_profit[-1][1]:
                max_profit.append([job[1], max_profit[index][1] + job[2]])

        return max_profit[-1][-1]
```
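As a quick sanity check, Approach 3 can be exercised against the three examples above. The driver below is hypothetical test scaffolding, with the solution repeated so the snippet runs on its own:

```python
import bisect

class Solution:
    def jobScheduling(self, start_time, end_time, profit):
        # same Approach 3 logic: [end_time, best_profit] pairs, sorted by time
        jobs = sorted(zip(start_time, end_time, profit), key=lambda job: job[1])
        max_profit = [[0, 0]]
        for start, end, gain in jobs:
            i = bisect.bisect_right(max_profit, [start, float('inf')]) - 1
            if max_profit[i][1] + gain > max_profit[-1][1]:
                max_profit.append([end, max_profit[i][1] + gain])
        return max_profit[-1][1]

s = Solution()
print(s.jobScheduling([1, 2, 3, 3], [3, 4, 5, 6], [50, 10, 40, 70]))              # 120
print(s.jobScheduling([1, 2, 3, 4, 6], [3, 5, 10, 6, 9], [20, 20, 100, 70, 60]))  # 150
print(s.jobScheduling([1, 1, 1], [2, 3, 4], [5, 6, 4]))                           # 6
```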
