Serving the Quantitative Finance Community

 
cdmurray80
Topic Author
Posts: 0
Joined: November 30th, 2005, 3:28 am

Machine Scheduling

November 27th, 2006, 1:58 am

[From an algorithms class with Gary Miller; does NOT require knowledge of algorithms.]

Your factory has M identical machines. There are N > M jobs to do. Each job must be scheduled on a single machine and cannot be split, and each job i requires L_i units of dedicated machine time. A machine can do only one job at a time. You must create a schedule of jobs to machines. Once you do, all machines start working on their scheduled jobs at time t=0. When the last machine finishes its last job, that is time t=T.

Definition: The WORKSPAN is the total amount of time to process all jobs, which is simply time T.

Your goal is to minimize the workspan. That is, you want to minimize the time between the start of the first job and the end of the last job.

Consider some algorithms for assigning jobs to machines:

GREEDY: Schedule the jobs in the order they are given, each on the first available machine.

SHORTEST-FIRST: Sort jobs by how long they take to complete, and assign jobs to machines from shortest to longest. Once any machine finishes a job, the next shortest job which isn't yet assigned is assigned to it.

LONGEST-FIRST: Sort jobs by how long they take to complete, and assign jobs to machines from longest to shortest. Once any machine finishes a job, the next longest job which isn't yet assigned is assigned to it.

OPT: The optimal algorithm for assigning jobs to machines.

Let T(SHORTEST-FIRST) be the workspan of the shortest-first schedule, T(LONGEST-FIRST) the workspan of the longest-first schedule, etc.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Here's an example if you don't get it; read on if you do.

You are given 3 machines and jobs with required machine times of [4 8 2 12 6 14 6].

Shortest-first would make the following schedule, where time goes horizontally to the right and * means a machine is still working on its last job:

M1: 2*6*****14************
M2: 4***8*******
M3: 6*****12**********

Workspan = 2+6+14 = 22, so T(SHORTEST-FIRST) = 22.

GREEDY would make the following schedule:

M1: 4***6*****6*****
M2: 8*******14************
M3: 2*12**********

Workspan = 8+14 = 22, so T(GREEDY) = 22.

Longest-first would make the following schedule:

M1: 14************4***
M2: 12**********6*****
M3: 8*******6*****2*

Workspan = 14+4 = 18, so T(LONGEST-FIRST) = 18.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

We are concerned with how much worse than OPT the SHORTEST-FIRST and LONGEST-FIRST algorithms are. We want to say T(LONGEST-FIRST) <= K * T(OPT), and T(SHORTEST-FIRST) <= K * T(OPT).

So, you are given M machines and a list of jobs with required machine times L_1 to L_N.

1) Prove TWO lower bounds for T(OPT) in terms of the given jobs and the number of machines. That is, prove that the optimal schedule must have a workspan of at least X.

2) Prove an UPPER bound of 2*T(OPT) for GREEDY (it will hold for SHORTEST-FIRST and LONGEST-FIRST too). That is, prove T(GREEDY) <= 2*T(OPT). Thus, in the worst case, LONGEST-FIRST, GREEDY, and SHORTEST-FIRST take at most twice as long as OPT.

3) Prove that, for any epsilon > 0, there exists a set of jobs with M machines where T(SHORTEST-FIRST) >= 2*T(OPT) - epsilon. Thus, shortest-first WILL take (essentially) twice as long as OPT for some pathological set of jobs. What is this set of jobs, qualitatively?

4) [Hard] Show that T(LONGEST-FIRST) <= 1.5 * T(OPT).

This can probably be googled; it's a well-known problem, but please let people who haven't seen it before take a crack at it first.
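All three heuristics are "assign the next job in some order to the first machine that frees up", so one helper covers them. Here is a minimal Python sketch (function names are my own) that keeps a heap of machine free-times and reproduces the workspans from the example above:

```python
import heapq

def list_schedule(jobs, m):
    """Assign jobs, in the order given, to the earliest-available of m
    machines; return the workspan (time the last machine finishes)."""
    free_at = [0] * m            # each entry: time that machine frees up
    heapq.heapify(free_at)
    for length in jobs:
        t = heapq.heappop(free_at)        # first available machine
        heapq.heappush(free_at, t + length)
    return max(free_at)

def greedy(jobs, m):
    return list_schedule(jobs, m)

def shortest_first(jobs, m):
    return list_schedule(sorted(jobs), m)

def longest_first(jobs, m):
    return list_schedule(sorted(jobs, reverse=True), m)

jobs = [4, 8, 2, 12, 6, 14, 6]
print(greedy(jobs, 3))          # 22
print(shortest_first(jobs, 3))  # 22
print(longest_first(jobs, 3))   # 18
```

When several machines free up at the same instant the heap breaks the tie arbitrarily, but since equal free-times are interchangeable, the resulting workspan is the same either way.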
 
cdmurray80

Machine Scheduling

November 29th, 2006, 1:30 am

Hint:

Define L_max = MAX_i(L_i). So L_max is the amount of machine time required by the job which needs the most machine time.

Define W_avg = (1/M) * SUM_i(L_i). W_avg is the total amount of machine time required, divided by the number of machines.
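Both quantities in the hint are lower bounds on T(OPT): some machine must run the longest job start to finish, and even a perfectly balanced schedule cannot finish SUM_i(L_i) units of work on M machines before W_avg. A small sketch computing them (function name is mine) on the example job list:

```python
def lower_bound(jobs, m):
    """max(L_max, W_avg), a lower bound on the optimal workspan.

    L_max: some machine must run the longest job from start to finish.
    W_avg: total machine time divided by m; even a perfectly balanced
    schedule needs this long to finish all the work.
    """
    l_max = max(jobs)
    w_avg = sum(jobs) / m
    return max(l_max, w_avg)

jobs = [4, 8, 2, 12, 6, 14, 6]
print(lower_bound(jobs, 3))   # max(14, 52/3) = 52/3, about 17.33
```

For the example, T(OPT) >= 52/3, and since the job lengths are integers the workspan is an integer, so T(OPT) >= 18; the longest-first schedule above achieves 18, so it is optimal there.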