Under this "worst case" assumption, the banker's algorithm
checks whether there is some order in which all processes
can be allowed to run to completion.
Assume the following data structures:
- available(resource-class) - a vector with one entry for each
  resource class indicating the number of units currently available.
- max(process, resource-class) - a matrix with one entry for each
  process/resource-class pair giving the upper bound on the number
  of units of that resource class the process may request.
- allocation(process, resource-class) - a matrix indicating the
  number of units of each resource class allocated to each process.
- need(process, resource-class) - a matrix obtained by subtracting
  corresponding entries of "allocation" from "max".
Note: assume all the quantities in these tables reflect the state
that would exist if the request being considered were satisfied.
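As a concrete illustration, these tables can be held in plain arrays
indexed by process and resource class. The following is a minimal
sketch in Python; the particular numbers are made up for the example,
and "max" is renamed max_claim only to avoid shadowing Python's
built-in max():

    # Five processes (rows) and three resource classes (columns).
    available  = [3, 3, 2]            # units of each class currently free

    max_claim  = [[7, 5, 3],          # max(process, resource-class)
                  [3, 2, 2],
                  [9, 0, 2],
                  [2, 2, 2],
                  [4, 3, 3]]

    allocation = [[0, 1, 0],          # allocation(process, resource-class)
                  [2, 0, 0],
                  [3, 0, 2],
                  [2, 1, 1],
                  [0, 0, 2]]

    # need = max - allocation, computed entry by entry
    need = [[max_claim[p][r] - allocation[p][r]
             for r in range(len(available))]
            for p in range(len(max_claim))]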
The idea is that if a process makes a request, we update
"available" and "allocation" as if the request were granted,
and then run the algorithm below. If the algorithm is able to
"complete" all processes, the new state is safe and the request
can be granted; otherwise the requesting process must wait.
(A code sketch of this grant-and-test step follows the algorithm.)
The algorithm is then:
repeat
    look for a process whose row of ``need'' is less than or
    equal to ``available'' (i.e. compare rows of ``need'' to
    the ``available'' vector, entry by entry)
    if such a process is found then
        add that process's row of ``allocation'' to ``available''
        delete the ``completed'' process
until no more processes can be completed
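The loop above translates almost directly into code. Here is a minimal
sketch of the safety check in Python, operating on the tables defined
earlier; the function name is_safe is an assumption for the example:

    def is_safe(available, allocation, need):
        # Return True if there is some order in which all processes
        # can run to completion from this state.
        work = available[:]                 # units that would be free
        finished = [False] * len(need)
        progress = True
        while progress:                     # the "repeat ... until" loop
            progress = False
            for p in range(len(need)):
                # A process can complete if every entry of its row of
                # "need" is <= the corresponding entry of "work".
                if not finished[p] and all(need[p][r] <= work[r]
                                           for r in range(len(work))):
                    # It completes and returns everything it holds.
                    for r in range(len(work)):
                        work[r] += allocation[p][r]
                    finished[p] = True      # "delete" the completed process
                    progress = True
        return all(finished)                # safe iff everyone completed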
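And the grant-and-test step described above might look like this;
request and its error handling are likewise assumptions for the sketch:

    def request(p, req, available, allocation, need):
        # Process p asks for req[r] more units of each resource class r.
        if any(req[r] > need[p][r] for r in range(len(req))):
            raise ValueError("process exceeded its declared maximum")
        if any(req[r] > available[r] for r in range(len(req))):
            return False                    # not enough free units; wait
        # Update the tables as if the request were granted ...
        for r in range(len(req)):
            available[r]     -= req[r]
            allocation[p][r] += req[r]
            need[p][r]       -= req[r]
        if is_safe(available, allocation, need):
            return True                     # safe: the grant stands
        # ... and roll back if the resulting state is unsafe.
        for r in range(len(req)):
            available[r]     += req[r]
            allocation[p][r] -= req[r]
            need[p][r]       += req[r]
        return False                        # unsafe: the process must wait

With the example tables above, request(1, [1, 0, 2], available,
allocation, need) succeeds: after the tentative grant, the processes
can still complete in the order P1, P3, P4, P2, P0.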