There are numerous scheduling problems that involve minimizing lateness. Here we have a single resource that can process only one job at a time. Job j requires tj units of processing time and is due at time dj; if j starts at time sj it finishes at time fj=sj+tj. We define the lateness of job j as Lj=max{0, fj-dj}, and the maximum lateness of a schedule as L=max Lj over all jobs j. The goal is to find a schedule that minimizes L.
| Job | 1 | 2 | 3 | 4 | 5 | 6 |
|:---:|:-:|:-:|:-:|:-:|:-:|:-:|
| tj  | 3 | 2 | 1 | 4 | 3 | 2 |
| dj  | 6 | 8 | 9 | 9 | 10 | 11 |
One possible schedule processes the jobs in the order 3, 2, 5, 4, 1, 6 (the Lj row shows fj-dj, so negative values mean the job finishes early):

| Job  | 3 | 2 | 2 | 5 | 5 | 5 | 4 | 4 | 4 | 4 | 1 | 1 | 1 | 6 | 6 |
|:----:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
| Time | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |10 |11 |12 |13 |14 |15 |
| Lj   |-8 |   |-5 |   |   |-4 |   |   |   | 1 |   |   | 7 |   | 4 |
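To make the bookkeeping concrete, here is a small Python sketch that replays this schedule and prints each job's finish time and lateness; the dictionary representation of the jobs is a choice made here, not part of the original:

```python
# Processing times and due times for jobs 1..6, taken from the table above.
t = {1: 3, 2: 2, 3: 1, 4: 4, 5: 3, 6: 2}
d = {1: 6, 2: 8, 3: 9, 4: 9, 5: 10, 6: 11}

clock = 0
for j in [3, 2, 5, 4, 1, 6]:           # the order used in the schedule above
    sj, fj = clock, clock + t[j]       # job j runs in the interval [sj, fj]
    print(f"job {j}: finishes at {fj}, lateness {fj - d[j]}")
    clock = fj
# job 3: finishes at 1, lateness -8
# job 2: finishes at 3, lateness -5
# job 5: finishes at 6, lateness -4
# job 4: finishes at 10, lateness 1
# job 1: finishes at 13, lateness 7   <- maximum lateness L = 7
# job 6: finishes at 15, lateness 4
```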
This schedule has maximum lateness L=7, which is obviously not optimal. Let's look at some greedy strategies:
- Shortest processing time first: consider jobs in ascending order of processing time tj
- Earliest deadline first: consider jobs in ascending order of due time dj
- Smallest slack first: consider jobs in ascending order of slack dj-tj

It's easy to see that shortest processing time first is not optimal; a good counterexample is
| Job | 1 | 2 |
|:---:|:-:|:-:|
| tj  | 1 | 5 |
| dj  |10 | 5 |
Shortest processing time first runs job 1 first, so job 2 finishes at time 6 and is 1 unit late, whereas running job 2 first makes neither job late. The smallest slack strategy has similar problems:
| Job | 1 | 2 |
|:---:|:-:|:-:|
| tj  | 1 | 5 |
| dj  | 3 | 5 |
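Here the slacks are 2 and 0, so smallest slack first runs job 2 first and job 1 ends up 3 units late, while running job 1 first gives a maximum lateness of only 1. Both counterexamples can be checked with a few lines of Python; this is only a sketch, and the helper name `max_lateness` is chosen here, not something from the original:

```python
def max_lateness(order, t, d):
    """Run jobs back to back in the given order and return the maximum lateness."""
    clock, worst = 0, 0
    for j in order:
        clock += t[j]                     # fj = sj + tj
        worst = max(worst, clock - d[j])  # lateness is max{0, fj - dj}
    return worst

# Counterexample for shortest processing time first.
t1, d1 = {1: 1, 2: 5}, {1: 10, 2: 5}
print(max_lateness([1, 2], t1, d1))   # SPT order -> 1
print(max_lateness([2, 1], t1, d1))   # job 2 first -> 0

# Counterexample for smallest slack first (slacks are 2 and 0).
t2, d2 = {1: 1, 2: 5}, {1: 3, 2: 5}
print(max_lateness([2, 1], t2, d2))   # smallest slack order -> 3
print(max_lateness([1, 2], t2, d2))   # job 1 first -> 1
```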
The remaining strategy, earliest deadline first, looks promising, so we start with some pseudocode:
- Sort the `n` jobs by due time so that `d1<=d2<=...<=dn`
- Set `t=0`
- For `j=1` to `n`:
  - Assign job `j` to interval `[t,t+tj]`
  - set `sj=t` and `fj=t+tj`
  - set `t=t+tj`
- Return the intervals `[s1,f1],[s2,f2],...,[sn,fn]`
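A direct Python translation of this pseudocode might look like the following sketch; the function name `edf_schedule` and the dictionary-based job representation are choices made here, not part of the original:

```python
def edf_schedule(t, d):
    """Earliest deadline first: sort jobs by due time and run them back to back.

    t, d: dicts mapping job id -> processing time / due time.
    Returns the intervals [sj, fj] in schedule order and the maximum lateness.
    """
    intervals = {}
    clock, worst = 0, 0
    for j in sorted(d, key=d.get):    # d1 <= d2 <= ... <= dn
        sj, fj = clock, clock + t[j]  # assign job j to interval [t, t+tj]
        intervals[j] = (sj, fj)       # record sj and fj
        worst = max(worst, fj - d[j])
        clock = fj                    # t = t + tj
    return intervals, worst

t = {1: 3, 2: 2, 3: 1, 4: 4, 5: 3, 6: 2}
d = {1: 6, 2: 8, 3: 9, 4: 9, 5: 10, 6: 11}
intervals, L = edf_schedule(t, d)
print(intervals)  # {1: (0, 3), 2: (3, 5), 3: (5, 6), 4: (6, 10), 5: (10, 13), 6: (13, 15)}
print(L)          # 4
```

On the six-job instance from the start of the section this yields a maximum lateness of 4, compared to 7 for the arbitrary schedule above.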